(Editor's Note: The story was first published on 3 November 2023 and is being republished on the one-year anniversary of the war that broke out between Hamas and Israel.)
Kobbi Shoshani, the Consul General of Israel to India, shared an image of a child wrapped in white cloth while using a smartphone, with the caption, "Don't believe Hamas' lies", insinuating that people are faking deaths in Gaza.
Bad actors spreading disinformation around the Israel-Hamas conflict have repeatedly pushed this narrative to create a false impression of "fake casualties" in Gaza. Simultaneously, Israelis on TikTok are creating and posting videos mocking Palestinian suffering and deaths.
US President Joe Biden fueled these narratives and conspiracy theories when he seemed to question the number of casualties coming from Gaza.
Human rights organisations and investigative reporters were quick to debunk Biden's claims, and the Biden administration later softened its stance after facing pushback from American Muslim bodies, but the damage, sadly, was done.
Social media feeds are filled with old and unrelated videos shared to dismiss the casualties in Gaza, call the rescue operations staged, create a falsely amplified sense of support from international lawmakers, and so on. At the same time, some people are creating AI-generated imagery to evoke emotional responses and gain "likes" and "views".
This story will take you through the most viral pieces of mis/disinformation on the casualties in Gaza.
On 28 October, Shoshani shared an image of a child using a smartphone while being wrapped in a white sheet, claiming that it showed an example of Gazans faking deaths.
However, the image dated back to October 2022 and showed a child wearing a costume for a Halloween competition in Thailand.
Similarly, an old video from Egypt was shared with the claim that it showed news organisation Al Jazeera preparing to film bodies, insinuating that people were faking deaths amid the ongoing conflict.
The video was shared by the X (formerly Twitter) premium subscriber MeghUpdates, a handle that has been called out multiple times for spreading disinformation.
Another set of images went viral with users claiming that the same injured Palestinian girl was shown being rescued by three different people.
But here's the twist. Not only were the pictures old, they were also from Syria and had no connection with the Israel-Hamas war. The images showed people rescuing a girl in Syria's Aleppo in 2016.
The Israeli Embassy in France shared a video of a man carrying a deceased child, with a caption claiming that the child was a plastic doll. The handle targeted Hamas, saying it intended to make people believe that a baby had died in Israeli strikes.
However, The Quint contacted the person who took the video, Moamen Al-Halabi, who confirmed the location as Al-Shifa Hospital in Gaza City. He added that the child, named Omar Al-Banna, had died at the age of four and a half.
Behind-the-scenes footage from 2017, showing Palestinian special effects makeup artist Mariam Saleh at work, was shared to claim that it showed Gazans faking injuries amid the ongoing war.
On 29 October, a video of model Bella Hadid was circulated with users claiming that it showed her expressing support for Israel.
Similarly, a video of Queen Rania Al Abdullah, the Queen Consort of Jordan, was also shared on social media platforms with the claim that she was seen supporting Israel amid the conflict.
However, The Quint found that both these videos were deepfakes, with audio altered using Artificial Intelligence (AI) tools.
Another AI-generated image showing football club Atlético de Madrid's fans holding up the Palestinian flag was shared on the internet as a real incident.
On 22 October, an X premium subscriber shared an image claiming to show a temporary tent city made specifically for displaced citizens by the Israeli government.
However, the image turned out to be AI-generated: we noticed several discrepancies and confirmed our findings by running it through AI image detection tools. Team WebQoof had earlier published a guide on how to identify AI images, which you can read here.
Two images were shared on social media platforms with a claim that they showed recent visuals of children being affected in the ongoing conflict.
However, both these images were AI-generated.
Watch this episode of The Quint's media literacy initiative Verify Kiya Kya? to understand how emotions are manipulated by bad actors spreading disinformation.
Social media users shared a video claiming that thousands of Israelis took to the streets to protest against Prime Minister Benjamin Netanyahu.
However, the video was altered. Two unrelated videos had been stitched together and peddled as footage of a massive Israeli protest.
On 17 October, a video of a Pakistani politician, Sarwat Fatima, extending support to Palestine while threatening to 'bomb' Israel was shared as a recent incident.
The video was also picked up by several news media outlets, including Firstpost, Asianet Newsable, and Navbharat Times.
However, the video turned out to be old and dated back to 2021.
Similarly, a video from 2021, which showed an Irish politician, Matt Carthy, introducing a bill to prevent Irish taxpayers' money from being invested in companies "that profit from Israel's illegal occupation and settlement expansion" was shared as recent.
As the war between Israel and Hamas continues, the flow of misinformation and false narratives refuses to die down. This places greater responsibility on readers to verify every piece of content they consume or share with others.
(Not convinced of a post or information you came across online and want it verified? Send us the details on WhatsApp at 9643651818, or e-mail it to us at webqoof@thequint.com and we'll fact-check it for you. You can also read all our fact-checked stories here.)
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)
Published: 03 Nov 2023,06:27 PM IST