
Without transparency, we can’t trust robot operators — especially in a pandemic


The coronavirus pandemic is dramatically increasing the need for robots in our already robot-reliant lives. For example, between March 2 and April 14, virtual urgent care visits at NYU Langone Health grew by 683 percent, and non-urgent virtual care visits grew by an unprecedented 4,345 percent in response to COVID-19.

If we continue along this trajectory, our day-to-day lives will become even more robot-intensive, with robots increasingly making or advising our decisions. This new reality will offer real benefits. But should we trust the operators of the intelligent robots that are becoming the fabric of our everyday lives? The answer is no, not unless their objective functions and their use of data are transparent to society: what is each objective function maximizing, and on the basis of what data? The alternative is a growing risk of data misuse and a further concentration of power in digital platforms.

Our “new normal” has an eerie similarity to the planet of Solaria in Isaac Asimov’s “The Naked Sun,” where phobia of “Earth germs” has virtually eliminated all physical interaction. Communication occurs via perfect holograms that are indistinguishable from reality. Touch, considered dirty, is reserved mostly for essential acts such as procreation. Humans live long, healthy lives free of disease, each served by 20,000 specialized robots. Necessary physical interaction requires, yes, six feet of separation.

A policeman from Earth is summoned by Solaria’s council to solve the planet’s first murder. He discovers that a clever human operator orchestrated multiple robots, without their collective realization, into enabling the murder, in violation of their fundamental law against harming humans. The culprit turns out to be an engineering genius at Solaria’s robot research center who harbored ambitions of invading other human worlds using Solarian robots.

Our current germ phobia bears a striking similarity to Solaria’s extreme form of social distancing. Curiously, however, Asimov was silent about who makes and owns the many robots on Solaria that appear, as if by magic, on demand to serve the humans. Is the entity that makes them regulated at all?

Our economy also bears a striking resemblance to Solaria’s robot economy. Millions of machines around us make billions of decisions for us every day. The “tech” component of the S&P 500, which accounts for roughly 20 percent of the index, greatly understates how machine-driven the economy has become: many industries are already essentially machines operated by humans. Banks, for example, like other internet platforms, are machines overseen and aided by humans, whose numbers continue to shrink even as the machines proliferate in number and function. Supply chains are run by robots. Mass transportation is largely machine-based, and individual navigation is headed in that direction. Even customer service, a human forte, is increasingly robot-based.

Economic growth at the moment is largely robotic. Since the last financial crisis, the increase in market value has come almost entirely from a handful of robotic giants. Since the market meltdown of late February and early March 2020, valuations of these companies, and of others whose robotic services serve the new virtual world, have skyrocketed even as the physical economy has cratered. In effect, markets are validating the cycle of data leading to knowledge leading to more power.

What is notable is that individual behavior is transparent to the robots, while the robots and their operators’ intentions are opaque to individuals. As the cost of acquiring data approaches zero, a rapidly growing “information asymmetry” opens up between the individuals who feed the algorithms and the operators who specify their objective functions, which remain shrouded from us and even from the government.

Consider the evidence. The decisions of robot operators to date have demonstrated an unbridled exploitation of personal data and of society’s “behavioral surplus.” This has produced serious unintended consequences, such as the Facebook data scandal surrounding the last U.S. presidential election. Robot operators might argue that individuals entered this Faustian bargain willingly, but research shows that people do not understand its implications. Economists note that asymmetries of information are a fact of everyday life, but gross imbalances can lead to unchecked power.

In a 2002 article, the Nobel Laureate Joseph Stiglitz warned that in such situations “the effectiveness of the check that can be provided by the citizenry is limited; without good information, the contestability of the political processes can be undermined.” Other research has similarly argued that in the absence of greater transparency, the deleterious effects of unequal access to information will continue and deepen.

In two short months, we have moved toward Solaria. We have become phobic about touch. We no longer trust that other people are safe, and we have grown wary of physical proximity. In the emerging era of increased dependence on robots, should we trust that their operators, such as Mark Zuckerberg, Jack Dorsey, Sundar Pichai, Satya Nadella and their successors, will design their machines for our larger social good? The answer is obvious: not unless their objective functions and use of data are made transparent to their users.

Vasant Dhar is a professor at New York University’s Stern School of Business and the director of the PhD program at the Center for Data Science.

