Images: Screenshots via Meta
Meta unveiled its AI chatbot assistant on Facebook Messenger and Instagram this week. Ostensibly, the chatbot acts like any other, helping users find information or providing companionship. But unlike a faceless program like ChatGPT or the clearly made-up digital avatars of Replika, Meta has given its chatbots the faces of specific celebrities.
What’s more, each chatbot specializes in its celebrity-image’s particular areas of interest and adopts components of their personality. You can talk with Tom Brady about sports or Charli D’Amelio about dance. Of course, you’re not actually chatting with them. Despite the similarities, you’re not even supposed to be. Because, bizarrely, each of these celebrities was assigned an entirely different name.
They’re sort of playing a character, but still, it’s clearly them. I started a chat with Padma Lakshmi’s “Lorena,” a travel expert. When I told her about my upcoming trip to Porto and Lisbon and my interest in food, she recommended specific dishes to try, followed by suggestions for day trips outside the cities. Throughout, a recording of Lakshmi stared at me through a small box in the corner of my screen as though we were on FaceTime. At times, she appeared disgusted. Still, Lorena’s tips were solid and didn’t feel entirely distinct from the sorts of things we might imagine Lakshmi saying on her own travel and food shows.
While many of the bots seem designed for some specific utility, others are more for comfort and entertainment. Kendall Jenner, for example, is Billie, a “ride-or-die older sister.” (Another star in a sibling role: MrBeast is “the big brother who will roast you—because he cares.”) Jenner is not Billie; Jenner is not your sister. Just as it’s clear you’re speaking to their image, it’s equally clear you’re not meant to believe you’re speaking with the actual celebrity. Rather, it’s who Kendall Jenner might be if she were actually your non-famous older sister, doling out advice on self-care and providing a space to vent.
Some of the AI bot options through Meta are even more conceptual. One utilizing Paris Hilton’s face is Amber, a crime-solving detective. Upon opening my chat with her, I was presented with three options: “Tell me about the victim,” “Who are the suspects?” and “Where did the crime take place?” Asking about the victim, I was told that she is a young mother named Sarah who has disappeared. As I asked more questions and gathered more details about this made-up case, Paris Hilton pouted, smiled, and looked off-camera in the corner of the screen.
It’s fun, I suppose, to imagine Paris Hilton playing this character, but… why her? Is Paris Hilton a true crime junkie? It’s not something she’s known for or, as far as I can tell, even a topic she’s ever talked about. So, who exactly is this for?
If Meta aims to normalize AI chatbots by making them feel familiar yet distinct, then it’s probably not the worst strategy. By renaming the celebrities and assigning them these specific characteristics, the company has basically embraced the uncanny nature that has long plagued AI. It’s so unreal—so unlike the actual people—that Meta isn’t even pretending they’re representational.
It is also, of course, something of a safety move: If one of these bots says something cancellable, or if a user forms a parasocial relationship that’s a bit too strong, it’s easy to point out that the bots aren’t really the celebrities. Still, the decision misses why exactly someone would want to use a chatbot of this nature in the first place: to get at some approximation of a specific person in their absence.
Also this week, porn star Riley Reid launched Clona, an AI service that allows fans to sext with an AI version of her, audio included. Lena the Plug has joined the platform as well. The point isn’t just to sext but to do so with a program that allows the user to imagine a particular individual on the other side of the screen. Most likely, platforms like Clona and Meta will soon advance their AI technology to feature visual representations of specific individuals capable of moving and speaking in real time. When that happens—or perhaps that future is already here—it’s hard to say which is less disturbing: pretending AI models are an extension of the person they appear to be, or a different “person” entirely.