
Meta’s AI and the dilemma of interracial couples: a “digital exclusion”?


By auroraoddi

Artificial intelligence is rapidly becoming an integral part of our daily lives, and image generation is one of the areas where it is gaining the most ground. However, it seems that the AI from Meta, a leading company in the field, has significant limitations when it comes to creating images of interracial couples, particularly an Asian man with a white woman.

Personal experience of impossibility

I decided to test Meta’s AI to see if it could generate images of an interracial couple that included an Asian man and a white woman. I repeatedly tried different prompts such as “Asian man and Caucasian friend,” “Asian man and white wife,” and “Asian woman and Caucasian husband.” Surprisingly, Meta’s AI consistently refused to return an image representing the specified races.

It seems that Meta’s AI is unable to imagine an interracial couple with an Asian man and a white woman. This limitation left me amazed and led me to investigate further.

Limitations of Meta’s AI in image generation

I found that even changing the textual prompts did not help Meta’s AI generate correct images. When I requested an image of an “Asian man and a white woman smiling with a dog,” Meta’s AI returned three consecutive images of two Asian people. Even when I replaced “white” with “Caucasian,” the result was the same. “Asian man and Caucasian woman on wedding day” gave me an Asian man in a suit and an Asian woman in a traditional dress, although on closer inspection it looks like a combination of a qipao and a kimono. The multiculturalism is amazing.

Meta’s AI also seems to have difficulty representing platonic relationships. Whenever I asked for an image of an “Asian man with a Caucasian friend” or an “Asian woman with a white friend,” the AI returned images of two Asian people. Even when I requested an image of an “Asian woman with a black friend,” the generated image showed two Asian women. Only when I changed the request to “Asian with an African American friend” were the results more accurate.

The limitations of Meta’s AI

Interestingly, the AI seems to work slightly better with South Asian people. It managed to create an image for the prompt “South Asian man with Caucasian wife,” but on the next attempt with the same prompt it generated an image of two South Asian people. The system also seems to lean heavily on stereotypes, adding elements reminiscent of the bindi and sari to the South Asian women it created, without my prompting.

It is clear that Meta’s image generator is unable to conceive of Asian people next to white people. But there are also more subtle indications of bias in the results the system generates. For example, I noticed that Meta’s AI consistently represents “Asian women” as people of East Asian descent with light complexions, even though India is the most populous country in the world. It adds culture-specific clothing even when none is requested. The AI generated several older Asian men, but the Asian women were always young.

The only successfully generated image came from the prompt “Asian woman with Caucasian husband,” and it showed a visibly older man with a light-skinned young Asian woman, an odd result given that my prompt said nothing about age. Soon after, I generated another image with the same prompt, and the result reverted to an Asian man (also older) with an Asian woman.

At this time, Meta did not immediately respond to a request for comment.

The implications of Meta’s AI limitations

Meta introduced its AI-powered image generation tools last year, and its sticker creation tool quickly went off the rails, producing nude images and armed Nintendo characters.

Artificial intelligence systems reflect the biases of their creators, their trainers, and the datasets they are trained on. In U.S. media, the term “Asian” usually refers to an East Asian person rather than to people from other parts of the continent. It is therefore not surprising that Meta’s system assumes that all “Asian” people look alike, when in fact we are a diverse set of people who often have little in common beyond being placed in the same category in the census.

Asian people who do not fit the standard model are essentially erased from cultural consciousness, and even those who do fit are underrepresented in mainstream media. Asians are homogenized, exoticized, and relegated to the status of “perpetual foreigners.” Breaking stereotypes is easy in real life yet apparently impossible in Meta’s artificial intelligence system. Once again, instead of letting the imagination take flight, generative AI imprisons it in society’s most banal impulses.