Video games and false hostages: How Hamas-Israel war information is weaponised

Fact-checkers have identified dozens of fake and misleading videos about the Hamas-Israel war on social media, including footage from video games. They say information is being weaponised.

An image of a pro-Palestinian rally and another of a pro-Israel rally, separated by a silhouette of a man looking at a mobile phone.

Social media platform X is being accused of allowing misinformation to spread. Source: SBS News

Fact-checkers say they weren't surprised when disinformation and fake news started to surface after Hamas attacked southern Israel, which has responded by bombarding Gaza more intensively than ever before and depriving the enclave of food, water, power and aid.

Fact-checkers from French news agency AFP debunked videos of the initial Hamas attack, which turned out to be clips taken from the military simulation game Arma 3, clips from a concert in Tel Aviv, footage from Mexico and footage of paragliders in Egypt.

Before facts had been established about the attack by Palestinian group Hamas, social media accounts posted footage of hostages in Turkey in 2016, and photos of troops in Gaza in 2022, claiming they were from the recent attack, AFP said.
"People are searching around wanting to see what's happening, and that creates an information void, so if there's no quality information available, then whatever is out there will be sometimes just grabbed upon and amplified," RMIT CrossCheck director Dr Anne Kruger told SBS News.

As the conflict has continued, Israel and Hamas, which controls the Gaza Strip, have accused each other of targeting Gaza's al-Ahli hospital and denied responsibility.
"I think we just have to be very careful before we jump to conclusions," Kruger said.

"Sometimes there's a grain of truth in things, but it gets twisted, and if we can't categorically prove something, it doesn't mean that atrocities haven't happened."

She said it can take a long time, even "weeks", to make proper assessments of what happens in war zones.

Social media content is today’s war propaganda

Professor of communication at the University of Canberra, Mathieu O'Neil, said changes to X made since the company was bought by Elon Musk had resulted in an increase in questionable content on the platform.

"It used to be that you could have some legitimacy based on your expertise, and then you'd have a lot of attention because people would retweet you."

O'Neil said algorithm changes driven by profit meant people who paid for verification would get more exposure, while users would have less access to people with expertise.
A man in a collared shirt.
Professor of communication Mathieu O’Neil says changes to the social media platform X have led to an increase in the amount of misinformation being posted. Source: Supplied
O'Neil said that while in days gone by communiques may have been sent out to push a certain message in a conflict, social media is now stepping in as a form of war propaganda.

"Strategic actors are using information warfare tools and trying to shape the narrative," he said.

"Propaganda is a part of war, propaganda has always been used to paint the enemy in the worst possible light … the difference is that now there's a lot more uncertainty about who's doing it because of the anonymity of the accounts."
O'Neil acknowledged it was difficult to verify facts "when you have an immediate situation", and warned that unless people were willing to spend time checking the veracity of information, images or video, they should reconsider re-sharing content.

X under fire for misleading content

European Union Commissioner Thierry Breton last week shared a letter addressed to X owner Musk in which he urged prompt action on disinformation and warned that penalties may be imposed.
He said following the recent attacks by Hamas, the EU believed the platform was being used to disseminate illegal content and disinformation in the EU.

Breton's letter said "fake and manipulated images and facts" were circulating in the EU via X, including "repurposed old images of unrelated armed conflicts or military footage that actually originated from video games".

Who is behind such content?

Axel Bruns, a professor at the Digital Media Research Centre at the Queensland University of Technology, said there were likely a number of motivations for those creating and sharing such content.

He said some would do so to "push the agenda of one side or to say how great either Hamas or the Israeli Defence Forces are doing".

"Some of it might be to mislead the enemy on the other side or to build up morale on one side, or to mislead the broader public."
At the same time, he said there were also trolls who simply "get a kick" out of creating disinformation and having it shared.

Bruns said there were also countries not directly involved in the conflict that might see such situations as an opportunity to manipulate public perceptions for different reasons.

"It's known that and Saudi Arabia both have substantial cyber warfare groups that are doing this sort of stuff, to what extent they themselves have got involved in this particular conflict is not clear," he said.

What can be done about social media disinformation?

Bruns urged governments to put more pressure on companies such as X to take greater responsibility for the content posted on their platforms.
A man in a collared shirt wearing glasses.
Professor Axel Bruns says there are a number of reasons people create and share disinformation. Source: Supplied / QUT/Sonja de Sterke
"It is really important to push back against that sort of rhetoric that anything goes on these platforms, and ensure that they're not actively eroding society by hosting hateful, abusive content and misinformation," he said.

Bruns said it could also be difficult to prove the impact of a false or misleading social media post until time had passed and the post had already contributed to the spread of misinformation.
"But if it's the circulation of fake imagery, from the war in Israel now, then it might not be any one individual post showing that image, but it might be the sharing of that image across thousands of posts to thousands of accounts, that has an impact."

Bruns said proving the impact of X posts had only been made more difficult since Musk reduced access to API data.

"It's become a lot more difficult to prove the impact and prove the activity of coordination between multiple accounts because we simply can't get the data as well anymore," he said.

Bruns said the Community Notes feature, which was expanded after Musk took over the platform, was not sufficient to undo the potential damage of misinformation.

Published 21 October 2023 4:01pm
By Aleisha Orr, Madeleine Wedesweiler
Source: SBS News

