Kendall Jenner, is that you? Meta's New AI Characters

A few months ago, an Instagram account that looked like supermodel Kendall Jenner popped up on my feed and introduced herself as Billie. At first I was confused, wondering whether this was a new advertisement or brand deal Jenner was participating in. Wanting to know more, I clicked on the account and found that Billie was the newest digital AI model from Meta.

On September 27th, Meta unveiled that it had partnered with several celebrities to create generative AI versions of themselves for users to interact with. Meta has been paying celebrities millions of dollars for their likenesses in image, personality, and mannerisms to create interactive characters that users can engage with in real time. So far, these celebrities include Charli D'Amelio, Snoop Dogg, Paris Hilton, Tom Brady, Dwayne Johnson, and most notably, Kendall Jenner. The characters based on real-life celebrities have different names and are each marketed with a particular specialty for engagement. The character created in Kendall Jenner's likeness is called Billie and is marketed as "an older sister you can talk to, but who can't steal your clothes." Each character is branded for a particular purpose; Tom Brady's character, Bru, for example, is marketed as a confident sports debater you can chat with about games and fantasy drafts.

These AI characters from Meta got me thinking about other ways that AI can be used to interact with celebrities and potentially even fictional characters. These, for lack of a better word, bots are programmed with raw data and "learn" to generate statistically probable outputs when prompted. This means that these characters respond based on the data they are programmed with, and they can be developed to mimic celebrities or fictional characters, or even to invent new ones. In the case of Meta's AI characters, Meta obtained the celebrities' express permission to use their likenesses and mannerisms in its tech, but what about situations where this is done without permission?

Through some research, I found the website Character.AI, which allows users to create and query thousands of AI chatbots that represent real people and famous fictional characters. For example, on Character.AI you can chat digitally with Hermione Granger, who will tell you about "He Who Must Not Be Named" and all her friends at Hogwarts. After playing around a little, chatting with some of my favourite fictional characters to get a sense of the technology, I got to thinking about the intellectual property implications for these copyrighted characters developed by real authors and creators. Doesn't J.K. Rowling own the intellectual property rights to Hermione Granger and Hogwarts, while Warner Brothers Entertainment owns the production rights to the franchise? How can someone use the character of Hermione in this way? It would get especially complicated if the data being used to mold the character was not true to the intellectual property at hand. Sections 14.1 and 14.2 of the Copyright Act give creators the right to the integrity of their work. Even though the US has largely opted out of this kind of moral rights protection, it is still an interesting concern. What if someone developed a Hermione Granger on Character.AI that answered questions in a sexual way that infringed on the integrity of J.K. Rowling's work? Could she get an injunction or damages for such an infringement? It is an interesting question that sits at the cutting edge of the law right now.

Character.AI is not the only site straddling the line between copyrighted content and fair dealing with that material. As we have discussed in class, OpenAI's ChatGPT and Stability AI's open-source engine can impersonate fictional characters and even "write" stories in the style of your favourite authors, based on data scraping of millions of copyrighted and protected works. In the ongoing lawsuit Andersen v. Stability AI, three artists sued multiple generative AI platforms for using their original works, without a licence, to train AI in their styles, which allowed users to generate works derivative of the artists' existing, protected works without their permission.

I encourage you to take a look at Character.AI's chat functions and play around a little with them to see what you think. Do you think this infringes on existing copyright, and if so, what do you think should be done about it from a legal standpoint?

2 responses to "Kendall Jenner, is that you? Meta's New AI Characters"

  1. J

This is an interesting post, and I think that there are not only IP issues here, but also a privacy one (which extends a bit beyond what we learn in class).

    COPYRIGHT:

I imagine that depends on what the AI is reproducing. Take your Harry Potter example. If the AI is imitating Hermione Granger, then we have to ask if the AI imitation falls under an exception in s.29 of the Copyright Act. Of all the exceptions, it looks like FAIR DEALING is the only one that could apply (this is not a non-commercial use, or a private use, or a backup copy). So, if this AI Hermione is only reproducing exact scenes from the novels without any parody or satire, that is probably not fair dealing. Regarding your concern that they might make Hermione sexual, I suspect this would run afoul of s.28.1 and s.28.2 of the Act (Moral Rights).

If the AI is simply a reproduction of a celebrity (like Jenner), it can depend on what they say. If the AI is simply chatting as a normal person, that is not a copyright issue. However, if the AI is interacting with the user in a way that reproduces some unique aspect of the celebrity, that could be copyrightable. I am thinking of Gould Estate v. Stoddart Publishing. There, the court held that a private conversation with Gould was not copyrightable. However, had Gould been giving some form of formal address, like a lecture, it could have been copyrightable. If the AI is simulating Jenner simply chatting about the weather, that is comparable to Gould having a casual conversation, and so nothing copyrightable arises. If the AI quotes large sections of an interview with Jenner, or recreates scenes from the film "Burn Book," or something similar, then it could be copyrightable.

    PASSING OFF and TRADEMARK

I am not sure if the AI site would lead to confusion. If the AI site explicitly promotes itself as providing AI celebrities, it should not confuse the user as to whether or not they are talking to the actual celebrity. Also, there is no confusion where there is no competition. If the celebrities had their own platforms where they had conversations with people, then they would be in competition with the AI program. In such a case, the AI program might confuse and so misdirect some of the customers. If so, that would likely meet the elements of passing off and trademark violation. However, if there are no celebrities offering this commercial service, then there is no misdirection. I am not too familiar with this, but isn't there a service called Cameo, where people pay celebrities to record messages for them? If Jenner does this, then there could be misdirection by having AI Jenner do something similar.

    PERSONALITY RIGHTS

We don't really cover this in class, but another point of attack is through privacy law. In BC, the Privacy Act protects the "Personality Rights" of people:

    s.3(2) It is a tort, actionable without proof of damage, for a person to use the name or portrait of another for the purpose of advertising or promoting the sale of, or other trading in, property or services, unless that other, or a person entitled to consent on his or her behalf, consents to the use for that purpose.

    https://www.bclaws.gov.bc.ca/civix/document/id/complete/statreg/00_96373_01#section3

So, it sounds like, without consent, using AI to imitate someone for a commercial purpose is a tort in BC. I'm sure that this statute was intended for "real" celebrity impersonators, but I suspect that applying it to AI imitation falls within a purposive reading of the statute.

However, the courts in the Gould case did not apply this principle. They found that there was no violation of personality rights, because the publisher did not use Gould's works and photographs to sell the book; rather, Gould's material was the content of the book. If the AI version of Jenner were trying to sell you Pepsi, that would run afoul of the BC Privacy Act because the rendition is "for the purpose of advertising or promoting the sale of" a product. However, if the AI Jenner is simply acting like a real person, then it is not promoting anything. Both the decision in Gould and the plain reading of the BC Privacy Act should permit this.

In the USA, a few years ago, Ariana Grande sued the retail chain Forever 21 for using a look-alike in its advertisements. I think the case was settled or withdrawn, but it does cover various elements of IP and privacy law. The following is an interesting write-up about that lawsuit; I think the points there could apply to AI celebrities as well.

    https://uclawreview.org/2019/10/16/no-thank-u-next-ariana-grande-sues-forever-21/

  2. smacd223

Interesting read! The part on the Hermione Granger AI chatbot stuck out to me for a number of reasons. Firstly, I feel like a similar chatbot existed when I was growing up reading the Harry Potter books… specifically a "Tom Riddle's Diary" chatbot. Maybe it was the same one? It also echoed thoughts that I have been writing about in the piece I will be submitting soon to the course site, mainly the idea of moral rights in copyrighted works and how these characters might get used and interacted with in this context. I know from my own research for my assignment that many authors are highly protective of their characters and would object strongly both to their characters being interacted with in this way and to their written works being fed into an AI in order to produce this output. J.K. Rowling specifically (as you've pointed out) has mentioned ways she does not want to see her characters being used (see here: https://winteriscoming.net/2021/07/04/8-famous-authors-think-fanfiction-george-rr-martin-anne-rice-jrr-tolkien/5/), and so I definitely feel like this raises moral rights issues.