Tehilla Shwartz Altshuler, Israel Democracy Institute, August 6, 2019
Republished with permission; read the full article at IDI.
Imagine a growing Israeli startup whose product is deepfake videos that are based on artificial intelligence and appear to be utterly authentic. The company’s marketing efforts, according to its website, are conducted by two departments — “consulting for corporations” and “consulting for governments and politicians.” In addition, “the company helps its customers uncover their opponents’ weak spots and make them go viral.”
Finally, imagine that the company describes its employees as “highly experienced men and women, graduates of elite units of the IDF intelligence branch and Israeli government intelligence agencies,” and that its technology is based on developments by those same security agencies. On top of all this, of course, its board of directors includes former heads of the Mossad and the Israeli General Security Service (Shin Bet), as well as retired senior army officers.
When you are done imagining this, it’s time to think about the private intelligence firm Black Cube. Various investigative reports published recently in the media in Israel and abroad paint a troubling picture — not because the company is violating the law, but because of its lack of ethics and internal moral code.
According to these reports, Black Cube does not work only for giant corporations that want to dig up incriminating information about their competitors; it also has contracts with foreign governments that seek to repress political opponents. It not only helps governments find those who are evading their financial obligations, but also harasses women who complain about crimes of sexual violence. Not only does it identify those who defame rival businesses, but it also frightens off regulators and watchdogs, human rights activists and journalists.
Black Cube, of course, is not alone in this. Have you ever heard of NSO, whose flagship product, Pegasus, can turn any cellphone into a mobile spying device? Or Glassbox and its product line? The list of such companies is long, and most of them are all but unknown. All of them are based on exploiting the skills, technology and professional culture created in the Israeli security establishment.
There is nothing new about former members of the Israeli defense and security agencies selling weapons and military know-how. But what has been added in recent years is the technology twist. Former high-ranking security officials and intelligence operatives, including from the renowned 8200 unit, strike out on their own. Some of them find employment in firms that break new ground, improve the world and better society; but others, in their greed, are willing to sell spyware and offensive cyber-weapons to dictators in Africa who need them to stamp out criticism and revolts.
This is also not a situation unique to Israel. Veterans of western security agencies worldwide face similar dilemmas once they retire from their careers in public service and seek their next professional challenge. The startup nation, however, is based to a large extent on veterans of Israel’s high-tech units in the defense establishment. While this association certainly brings honor, prestige, revenue and jobs to the Israeli economy, two issues resulting from this relationship need to be considered.
Technology can make the world a better place — or much worse.
The first relates to ethics. If anything is clear today in the world of technology, it is the need to include ethical concerns when developing, distributing, implementing and using technology. This is all the more important because in many domains there is no regulation or legislation to provide a clear definition of what may and may not be done. There is nothing intrinsic to technology that requires that it pursue only good ends. The mission of our generation is to ensure that technology works for our benefit and that it can help realize social ideals. The goal of these new technologies should not be to replicate power structures or other evils of the past.
The startup nation should focus on fighting crime and on advancing autonomous vehicles and healthcare. It should not be running extremist groups on Facebook, setting up “bot farms” and fake accounts, selling attackware and spyware, infringing on privacy or producing deepfake videos.
The second issue is the lack of transparency. Individuals and companies that have worked for, and sometimes still work with, the security establishment frequently operate behind a thick screen of concealment. These entities often evade answering challenging questions raised under the Israeli Freedom of Information law, and even resort to the military censor (a unique Israeli institution) to avoid such inquiries.
How can we know which privately developed technologies with security implications the government permits to be sold, and to whom? How can we know who intervenes when a European country arrests spies sent by a commercial firm, or when a Gulf state is targeted by an Israeli high-tech company? How can we know when these companies are serving the national interest and when only their own bottom line, and who gets to decide this, anyway? And what is the impact on the defense establishment itself of the migration of its stars directly from national service to high tech? What effect does this have on the state’s decision-making about which technologies to invest in, whom it trains and what it purchases?
Israel, and its tech business community, must carefully consider the negative ramifications of excelling in technology while disregarding moral and ethical questions. The “startup nation” must conduct extensive discussions on the crossroads between ethics and technology so as to endow the next generation with the strong moral compass necessary to navigate in this new world. The unanswered question at hand is how Israel, and similar western democracies, can grapple with the growing phenomenon of technological entities whose sole purpose is profit without any qualms about the moral implications of their products and services.
The article was published in TechCrunch.