Why Google's new contract with Israel is 'unethical' and worrisome

Tech behemoths like Google and Amazon are becoming partners in Israel's digital surveillance, which seeks to suppress pro-Palestinian views online.

A $1.2bn cloud computing system built by Google and Amazon provides the Israeli government with advanced artificial intelligence and machine-learning tools
Reuters


Several groundbreaking investigations have revealed how Israeli authorities use surveillance to police and control Palestinians; one can only imagine the constant anxiety and deepening sense of alienation this creates among them.

For years, Palestinians have been subjected to multiple layers of surveillance to restrict freedom of expression, suppress Palestinian voices and discourage their autonomy. 

But this time around, Israeli surveillance capabilities have moved up a notch, and Google and Amazon have become their enablers.

In fact, the tech giants are helping on a grand scale.

Project Nimbus

Under a $1.2bn cloud computing contract, Google and Amazon provide the Israeli government and its military with advanced artificial intelligence and machine-learning tools through the Google Cloud Platform.

The contract, signed in May 2021, increases the country's use of digital surveillance in occupied Palestinian territories. 

Nimbus training documents mention that the new cloud would give Israel capabilities for face detection, automated image categorization, object tracking, and even sentiment analysis, which purports to evaluate the emotional content of pictures, speech, and writing.

AutoML, a Google AI tool offered through Nimbus, accelerates the process of developing a model specifically matched to a customer's needs using that customer's own data and designs.

Israel could use AutoML to harness Google's computational power and train new models on its own government data for just about any purpose.

The Intercept examined a Nimbus webinar in which a Q&A session following a presentation offered an example of AutoML's potential use and abuse.

The Google Cloud developers on the call said it would be possible to process data through Nimbus to determine if someone is lying. 

Imagine a world where AI is used to decide whether or not to arrest you.

And wait, there is more. 

The contract ensures continuity of service even if the tech giants come under pressure to boycott the country.

It also prevents Google from denying services to specific Israeli government entities, such as the Israeli army.

A Google employee, Ariel Koren, 28, faced retaliation for raising concerns about Nimbus. She has accused the tech giant of profiting from the systemic abuse of Palestinians and is quitting the company this week.

"I joined Google to promote technology that brings communities together and improves people's lives," Koren told The Intercept.

"Not service a government accused of the crime of apartheid by the world's two leading human rights organizations."

Google's AI principles state that the company won't use AI to harm people, create weapons, or conduct surveillance that violates internationally accepted norms.

Yet despite these principles, Google has taken part in similar projects in the past.

It has also "come under fire in the press for its retaliation against workers critical of unethical contracts," Gabriel Schubiner, a Google employee and Jewish Diaspora for Tech activist told The Jerusalem Post. 

In 2018, Google and the US Department of Defense collaborated on Project Maven, an AI project that could be used to increase the precision of drone strikes. However, Google decided not to extend the contract in response to employee protests.

In response to criticism from free speech activists and employees, Google also halted the development of Dragonfly, a censored search and news product built for China.

Palestinian activists, journalists, academics, and residents have long called attention to the degrading ways in which new surveillance techniques violate their right to privacy. 

In many ways, the Palestinian example demonstrates how technologies may be weaponized, mainly to force self-censorship and cooperation from people living in occupied regions or from minority groups that governments have flagged as dangerous or threatening.

“We need to ask ourselves: Do we want to give the nationalist armies of the world our technology?” Schubiner said in a testimony.

“Or do we need to stand by the original theory behind Google: that we can make money without doing evil?”

Route 6