WASHINGTON (AP) — When disinformation researcher Wen-Ping Liu analyzed China's efforts to influence Taiwan's recent election using fake social media accounts, he noticed something unusual about the most successful profiles.
They were women, or at least that's what they looked like. Fake profiles that claimed to be female got more engagement, more eyeballs and more influence than the supposedly male ones.
“Pretending to be a woman is the easiest way to gain credibility,” said Liu, a researcher at Taiwan's Ministry of Justice.
Whether it is Chinese or Russian propaganda agencies, online scammers or AI chatbots, it pays to be a woman — proof that while technology may be getting more sophisticated, the human brain remains surprisingly easy to hack, thanks in part to age-old gender stereotypes that have migrated from the real world to the virtual one.
People have long assigned human characteristics like gender to inanimate objects — ships are one example — so it makes sense that human-like traits would make fake social media profiles or chatbots more attractive. However, questions about how these technologies may reflect and reinforce gender stereotypes are gaining attention as more AI-powered voice assistants and chatbots enter the market, blurring the lines between man (and woman) and machine.
“You want to inject some emotion and warmth and a very easy way to do that is to pick out a woman's face and voice,” said Sylvie Borau, a marketing professor and online researcher in Toulouse, France, whose work has found that Internet users prefer “female” bots and see them as more human than “male” ones.
People tend to see women as warmer, less threatening and nicer than men, Borau told The Associated Press. Meanwhile, men are often seen as more competent, although they are also more likely to be threatening or hostile. Because of this, many people may be, consciously or unconsciously, more willing to engage with a fake account that pretends to be a woman.
When OpenAI CEO Sam Altman was looking for a new voice for the ChatGPT AI program, he approached Scarlett Johansson, who said Altman told her that users would find her voice — which served as the voice of the AI assistant in the movie “Her” — “comforting.” Johansson rejected Altman's request and threatened to sue when the company opted for what she called a “strangely similar” voice. OpenAI put the new voice on hold.
Female profile pictures, especially those that show women with flawless skin, lush lips and wide eyes in revealing outfits, can be another online attraction for many men.
Users also treat bots differently based on perceived gender: Borau's research has found that “female” chatbots are more likely to receive sexual harassment and threats than “male” bots.
Female social media profiles receive on average more than three times as many views as male profiles, according to an analysis of more than 40,000 profiles conducted for the AP by Cyabra, an Israeli technology company that specializes in bot detection. Cyabra found that female profiles that claim to be younger get the most views.
“Creating a fake account and presenting it as a female will help the account reach more than presenting it as a male,” according to Cyabra's report.
Online influence campaigns organized by nations such as China and Russia have long used fake women spreading propaganda and misinformation. These campaigns often exploit people's views of women. Some appear as wise, educated grandmothers dispensing homely wisdom, while others impersonate conventionally attractive young women eager to talk politics with older men.
Last month, researchers at the firm NewsGuard found hundreds of fake accounts — some with AI-generated profile pictures — criticizing President Joe Biden. It happened after some Trump supporters began posting personal photos along with the announcement that they “won't be voting for Joe Biden.”
While many of the posts were genuine, more than 700 were from fake accounts. Most of the profiles claimed to be young women living in states like Illinois or Florida; one was called PatriotGal480. But many of the accounts used nearly identical language and had profile photos either generated by AI or stolen from other users. And while the researchers couldn't say for sure who was operating the fake accounts, they found dozens with links to nations like Russia and China.
X removed the accounts after NewsGuard contacted the platform.
A UN report suggested there is an even more obvious reason so many fake accounts and chatbots are women: they were created by men. The report, entitled “Are Robots Sexist?”, examined gender disparities in technology industries and concluded that greater diversity in AI programming and development could lead to fewer sexist stereotypes being built into these products.
For programmers who want to make their chatbots as human as possible, this creates a dilemma, Borau said: If they select a female persona, are they encouraging sexist views about real-life women?
“It's a vicious cycle,” Borau said. “Humanizing AI could dehumanize women.”