Is Meta culpable in Israel’s killing of tens of thousands of Palestinians?

Questions are being asked about whether WhatsApp groups' data was shared to train Israeli military AI.

Illustration shows the WhatsApp logo / Photo: Reuters

Every day, tens of millions of people use WhatsApp, the world's most popular messaging app, which is owned by Meta.

Some of its active users are in Gaza, where they must have created groups to exchange information and plan escape from constant Israeli military assaults, which have killed more than 34,000 people, most of them civilians.

There must be WhatsApp groups focused on finding the best way to get hold of meals, because the Palestinian enclave is facing a famine as Israel has blocked supplies of food and medicine to 2.3 million people.

Surely, there must be a WhatsApp group of overworked paramedics and doctors who have struggled to get hold of simple supplies such as oxygen cylinders for their patients. And just by virtue of being on that WhatsApp group, any one of them could have been the target of Israel’s deadly airstrikes.

Palestinians charge their mobile phones from a point powered by solar panels provided by Adel Shaheen, the owner of an electric appliances shop / Photo: Reuters

Lavender, the powerful AI system used by Israel to identify targets in Gaza, was probably trained on data gleaned from such WhatsApp groups, says Paul Biggar, a software engineer and the founder of Tech For Palestine.

The AI, programmed to look for Hamas members, also potentially targeted civilians simply for sharing a WhatsApp group with suspects.

“It is also important to note that the Lavender system is a misapplication of AI,” he tells TRT World.

While Lavender may suggest that people are in the same WhatsApp groups as suspects, that does not mean they are members of Hamas or were involved in any violence.

Displaced Palestinians shelter at a tent camp in Rafah / Photo: Reuters

"It is a 'pre-crime' system and should never be used without (a) thorough investigation of all suggested targets," Biggar says.

He explains that the targets are 'rubber stamped,' directly implicating Israel in the targeting of civilians.

"Each target named is a de facto civilian until proven otherwise - Lavender does not prove anything. This is just an attempt at "ethics washing" - using an AI system to accomplish a desired but immoral or illegal goal, and then blaming the AI," Biggar says.

"Anyone involved in this system is de facto targeting and killing civilians in direct contravention of international law and should be prosecuted appropriately."

Earlier this month, +972 Magazine reported that Israel is using Lavender to systematically single out and target tens of thousands of Palestinians, with "minimal human oversight and a lenient approach to casualties."

Since October 7, the Israeli military has killed at least 34,097 Palestinians, mostly women and children, and injured 76,980 people.

While the Israeli military used Lavender’s algorithm to select suspected Hamas members, it relied on another AI tool to pinpoint its targets in Gaza.

The Conversation says the second AI tool, 'grotesquely' named 'Where's Daddy?', is a geolocation tracking system used to follow suspected Hamas members to their homes before attacking them.

Biggar says Meta must answer whether it cooperated with the Israeli military and shared information from encrypted WhatsApp groups.

In an article, he wrote about how Israel's system of 'pre-crime' assessment allows the military to use AI to 'guess' who to kill, and then strike them at their family homes with the help of the tracking AI, Where's Daddy?

WhatsApp has denied sharing information with anyone. It told MEMO (Middle East Monitor) that there are no backdoors and that it does not provide "bulk" information to any government.

WhatsApp's parent company, Meta, formerly Facebook, insists that it carefully reviews, validates, and responds to law enforcement requests in accordance with applicable law and "internationally recognized standards, including human rights," and that it pledges to safeguard people's data.

Despite the rebuttal, Biggar insists there is more that the company should do.

"Certainly, Meta's response is not the sort of response we would expect from a company that was concerned about this. They should have announced a top-to-bottom audit of policy, personnel, and systems, to try and determine if their WhatsApp users were being targeted, and whether they were safe," he says.

Biggar says Meta should review its overall policy on Palestine considering what has happened on its other platforms in the past few months.

"It should include an audit of how Palestinian viewpoints are suppressed on Instagram and Facebook, as these are naturally linked. It should also include an audit of the involvement of former IDF (Israeli Defence Force) personnel at Meta, especially former Unit-8200 members, and especially Guy Rosen, their CISO, as well as all staff in Israel," he says.

Rosen, an Israeli, is the Chief Information Security Officer of Meta.

In recent weeks, Meta also introduced a new feature on Instagram that automatically limits users' exposure to so-called 'political' content, which critics say is censoring pro-Palestinian voices and content.

Beyond the devastation in Gaza, Biggar says advanced Israeli technology, including spyware, is "battle tested" on Palestinians and then sold to other countries.
