With big data collected in ways people do not know about, we need to rethink privacy

The recent congressional hearings on online privacy raised important questions about how individuals can have trust and confidence in the technology they use. Unfortunately, in the discussions about the need for a regulatory approach that gives individuals greater rights and more control over their privacy, many of the comments pointed toward outdated solutions that will no longer work.

Those approaches focus on providing notice to individuals about how their data will be used, and then giving them the choice of whether to provide the data and allow the company to use it for a specific purpose. This “notice and choice” method is, at best, a partial solution. Those of us who have been working on these issues for more than two decades encourage policymakers to rethink privacy — the topic of an Intel initiative.

The future of technology will allow for data collection from a variety of computing devices, including many that are not visible to individuals. Connected cars, the internet of things, internet-enabled security cameras, sensors in smart infrastructure such as roads, bridges and tunnels, and future health care devices will allow for the collection and use of data to remedy some of the world’s most challenging problems — climate change, traffic congestion, food shortages and treatments for cancer and chronic diseases. It is critical that we find methods that protect privacy while allowing for the innovative and ethical use of data.

Notice and choice will still play an important role in privacy, but as a means of adequately protecting individuals, the approach is incomplete for at least three reasons.

First, the sheer variety of data collection devices would place an impossible burden on people to review privacy policies in enough detail to exercise meaningful control over their data. Research on privacy policies shows that individuals do not read them. With data collection increasing in every area of people’s lives, it would be unfair to expect individuals to shoulder the burden of making every choice about how data relating to them will be used. Choice and control should be reserved for decisions such as whether to store location data or import contact lists.

Second, considerable amounts of personal data are collected in ways people do not know about, such as cameras capturing video in public places. These observational collections of data will increase as we embed sensors in homes, businesses, roads, parks and airports. Companies can and will create technologies to alert individuals to some of this collection, but that is considerably different from putting the burden on the person to control it.

Third, increasing amounts of sensitive personal data are provided by third parties or obtained from public records. An example is mentions on social media of other people’s health conditions, sexual orientation, religious beliefs, political opinions, race or ethnicity. There are important free speech implications to the availability of this data, and an individual has limited ability to control its collection and dissemination. Yet the use of such data has the potential to harm both the individual and society.

The increasing power of artificial intelligence and other big data tools allows aggregators of data to learn far more about individuals than one might think possible. It is not reasonable to expect individuals to fully exercise control over these profilers and data brokers.

Instead, there are opportunities to focus on accountability and use controls, along with the other Fair Information Practice Principles, which have guided the development of privacy legislation for more than 30 years. Accountability obligations require organizations to put the right people, policies and processes in place to effectively protect privacy. Requirements such as a designated privacy officer, adequate privacy compliance funding, robust employee training, internal risk assessments, and investigative reviews to ensure that partners and vendors behave properly are all part of demonstrating responsibility.

Similarly, legislative restrictions on the use of data are part of our landscape, with laws such as the Fair Credit Reporting Act. We can — and should — place new use restrictions into a comprehensive privacy law that improves the Federal Trade Commission’s ability to prevent individual and societal harm.

I have written often about how that approach can work, and how it applies to the future of technology. The world’s best companies should work with civil society and Congress to take these concepts and create a forward-leaning and comprehensive privacy law that is optimized for our country’s culture of innovation.

David A. Hoffman is associate general counsel and global privacy officer at Intel Corporation, where he founded the privacy team in 1999. He is also a senior lecturing fellow at his alma mater, Duke University School of Law.
