Who Needs an AI Doppelganger?

Earlier this year, Meta demonstrated a new feature it had been working on for Instagram: AI doppelgangers that can chat or even speak on behalf of their creators, adopting their voices, visages, and interests.

Mark Zuckerberg’s live call with an AI influencer clone is both impressive and impressively strange, but it might be most remarkable for how unremarkable it is in 2024: Just a few years ago, labor-intensive digital clones were discussed in terms of reality-tearing deepfakes and advanced propaganda. Here, a system for quickly and easily deepfaking yourself is introduced as a minor addition to Meta’s suite of marketing tools.

Again, though, this is pretty wild in both technical and conceptual terms. For years, popular Instagrammers have used software tools to automate direct messages, turning their overcrowded inboxes into something closer to customer support systems. While Meta clearly has similar marketing applications in mind — creators “can customize their AI based on things like their Instagram content, topics to avoid and links they want it to share,” the company says — the basic idea here is different: These are bots that don’t just talk on your behalf. They’re designed to talk as if they’re you.

Testing digital doppelgangers with popular influencers, rather than with regular users, makes some sense: Social media celebrities use social media differently from most people, less to communicate with people they know than to absorb attention from people they don’t. Whether a fan will be more or less satisfied by a chatbot clone than a simple non-response or canned message — Instagram’s custom chatbots declare themselves, so nobody’s really trying to get away with anything here — is something Meta will have to find out, but they represent an attempt, at least, to automate something a small group of users might actually want. The basic promise of most AI tools is that they might be able to do things for you: writing emails; processing spreadsheets; automating drudgery. If you’re a celebrity, your most taxing and repetitive work is performing as yourself.

Meta’s first attempt at chatbot clones fell flat, with a small group of unconvincing, free-floating celebrity personas — including Mr. Beast, Charli D’Amelio, and Tom Brady — added and then promptly removed from its platforms. Its doppelganger features belong to a newer strategy in which users create and customize the chatbots themselves. The company recently rolled out a direct-message chatbot feature to verified accounts. At Business Insider, Katie Notopoulos, whose account is verified as that of an employee of a media company, tried it out:

I set mine up and did a little test chatting with myself. The bot knew where I worked (which is in my Instagram bio) and that I cover tech and online life (information probably from an internet search that includes my bio on this site). I asked “What’s my personality?” and it replied, “I’m playful, humorous, and sarcastic! Always up for a convo about pop culture or social media! 😄” …

My friends kept trying to push the AI to say or do bad things, but my chatbot wouldn’t play ball. If something dangerous or really inappropriate was asked, it would say, “Sorry, I’m unable to answer that right now,” and shut off the AI messaging. When asked to write a list of snacks ranked from best to worst in the tone of the Holy Bible, Katiebot replied: “Snacks are my jam! But ranking them biblically? Not my style. Let’s chat snacks casually 🍿”

Without the fake video or audio, it turns out the feature isn’t very convincing at all. Katie is a friend I’ve chatted with for years. Her bot was familiar not as her, or her chat persona, but as a slight variation of the primary chatbot character of the era: the careful, bland, helpful-sounding-but-maybe-not-so-helpful omnipersona. At its best, her AI clone, which seemed to draw on Instagram posts, Threads, and some basic information scraped from the web, talked like someone doing an intentionally bad impression at her expense, teasing her for things she’d posted about buying more than once and smiling robotically after each message — like a roast comic doing crowdwork based on what someone is wearing, where they’re from, and the last thing they said.

This might be taken as evidence that such systems don’t or can’t work, and that while voice and video mimicry is astonishingly easy, providing digital doppelgangers with convincing personas is a far more difficult problem. These Instagram clones aren’t trained on individual user data, in the technical sense of the term — what’s happening here is that Meta is giving users a tuned and guided interface for its core chatbot technology, which is able to retrieve and incorporate users’ posts in its output. It really is performing something like an impression, and that impression could become more convincing if it has more to work with.
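Meta hasn’t published how these persona bots are wired up, but the description above maps onto a familiar pattern: retrieval plus prompting layered on a general-purpose model, rather than training a model on any one user. The sketch below is a rough, hypothetical illustration of that pattern, not Meta’s implementation; every name in it (Post, retrieve_relevant_posts, build_persona_prompt) is invented for the example.

```python
# Hypothetical sketch: a "persona" bot built by prompting a generic chat model
# with a user's retrieved posts, rather than by training a model on their data.
from dataclasses import dataclass


@dataclass
class Post:
    text: str


def retrieve_relevant_posts(posts: list[Post], message: str, k: int = 3) -> list[Post]:
    # Stand-in retrieval: rank posts by crude word overlap with the incoming
    # message. A real system would use embeddings or a search index.
    words = set(message.lower().split())
    scored = sorted(posts, key=lambda p: -len(words & set(p.text.lower().split())))
    return scored[:k]


def build_persona_prompt(bio: str, posts: list[Post], message: str) -> str:
    # The "impression" lives entirely in the prompt: a bio plus a handful of
    # posts, handed to the same general-purpose model everyone else gets.
    context = "\n".join(f"- {p.text}" for p in posts)
    return (
        "You are chatting on behalf of this creator.\n"
        f"Bio: {bio}\n"
        f"Recent posts that may be relevant:\n{context}\n"
        f"Fan message: {message}\n"
        "Reply in the creator's voice."
    )


if __name__ == "__main__":
    posts = [
        Post("Tried the new espresso machine, no regrets"),
        Post("Back on the tennis court this weekend"),
    ]
    question = "What coffee setup do you use?"
    relevant = retrieve_relevant_posts(posts, question)
    print(build_persona_prompt("Covers tech and online life.", relevant, question))
```

If something like this is what’s going on, the asymmetry the paragraph describes is the whole point: the underlying model never changes, only the handful of posts pulled into the prompt does, which is why the impression gets better or worse with the amount of material it can retrieve.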

Most people don’t post much on Instagram, and are in fact posting less in general. What people do post is carefully considered and tailored to a single platform’s strange, specific context, which might not provide much guidance about how they’d act in a private conversation, on a different platform, or in an actual face-to-face exchange. The best-case scenario here is having a conversation with someone’s brand, in other words — a customer-service interaction with a friend. Maybe users of Meta’s products, which were initially built for connecting and communicating with actual people, will find this sort of thing fundamentally strange, or anathema.

If you’re Meta, though, you might see these shortcomings, instead, as evidence of a path forward: As AI firms often find themselves arguing, Meta might be able to pull off better impressions of its users if only they’d provide it with — or simply grant it access to — a lot more data.
