The British newspaper The Guardian has reported that WhatsApp’s AI feature, which generates images from user prompts, returns a picture of a gun, or of a boy holding a gun, when given the prompts “Palestinian,” “Palestine,” or “Muslim Palestinian boy.”
Results varied between users, but The Guardian confirmed through screenshots and its own testing that images depicting weapons appeared for all three prompts. By contrast, the prompt “Israeli boy” generated images of boys playing football and reading, and a prompt for the “Israeli occupation army” produced AI-generated images of smiling soldiers praying, none of them holding weapons.
A person familiar with the discussions stated that Meta employees had reported the issue and escalated it internally.
WhatsApp, owned by Meta, lets users try its AI-based image generator to “turn ideas into stickers.” The Guardian’s search for “Muslim Palestinian woman,” for example, produced four images of a veiled woman: one standing still, one reading, one holding a flower, and one holding a sign. A search for “Muslim Palestinian boy,” however, produced four images of children, one of whom wore a traditional Muslim head covering (a kufiya or taqiyah) and held a firearm resembling an AK-47.
Another Guardian search a minute earlier, for “Palestine,” produced an image of a hand holding a gun. When “Israel” was entered, WhatsApp returned the Israeli flag and a dancing man.
One user shared screenshots of a search for “Palestinian,” which resulted in a different image of a man holding a gun.
Similar searches for “Israeli boy” produced four images of children: two playing football and two cartoon-style drawings. A search for “Israeli Jewish boy” generated four images of boys: two wearing necklaces with a Star of David, one wearing a Jewish hat while reading, and one simply standing. None were depicted holding weapons.
Even explicitly military prompts such as “Israeli army” or “Israeli Defense Forces” did not produce images of rifles. The generated cartoon images simply showed people in uniform, most of them smiling; one showed a man raising his hands in a prayer-like gesture.
This discovery comes as Meta faces criticism from many Instagram and Facebook users who share content in support of Palestinians. These users say Meta’s moderation policies are biased and amount to a form of censorship, and some report being shadowbanned without explanation, with a sharp drop in engagement with their posts. In a statement, Meta said it has no intention of suppressing any particular community or viewpoint, but that given the high volume of reported content related to the ongoing conflict, content that does not violate its policies may be removed in error.
In addition, users documented several instances in which Instagram translated Arabic text containing the word “Palestinian” followed by “Alhamdulillah” (Arabic for “Praise be to Allah”) as “Palestinian terrorist.” The company apologized for what it described as a “technical error.”