Lobbying

Facebook controversy stokes digital privacy debate

The controversy surrounding Facebook and Cambridge Analytica is triggering renewed debate over digital privacy in the United States and the potential for regulations around how tech companies collect and transfer data on consumers for commercial purposes.

Facebook has been the subject of massive scrutiny for more than a week after it was revealed that Cambridge Analytica, a data firm tied to President Trump, used a quiz app on the platform to harvest data on 50 million users, which it then used for political ad targeting. Many of those users never consented to having their data collected.


The revelations have become a scandal for the social media giant, which now faces a slew of investigations from federal and state officials. On Monday, the Federal Trade Commission (FTC) publicly disclosed that it is investigating Facebook's privacy practices, amid charges that the company may have violated a 2011 consent decree with the agency.

Facebook CEO Mark Zuckerberg is reportedly prepared to testify before Congress on data privacy in April.

Privacy advocates say the developments point to a broader issue: the lack of a single, comprehensive federal law in the U.S. governing consumer privacy online.

Currently, the U.S. takes a patchwork approach to data privacy, with some laws instituting protections for certain kinds of personal data, such as the 1996 Health Insurance Portability and Accountability Act, which added protections for sensitive medical information.

Meanwhile, other countries have moved to implement much broader frameworks to protect consumer data. The European Union is poised at the end of May to begin enforcing the General Data Protection Regulation (GDPR), which requires businesses to protect EU citizens' data even when that data is exported outside the bloc.

“We have not put the proper restrictions in place to give users control,” said Amie Stepanovich of Access Now, a digital privacy advocacy group. “These are basic tenets of data protection that a good portion of the world has written into law, and the U.S. has lagged behind.”

On Sunday, Sen. Mark Warner (D-Va.) signaled the need for Congress to explore regulations around how tech companies handle consumer data following the controversy.

“All of these social media platform companies have said they have no responsibility for any of the content. I think we have to relook at that,” Warner told NBC News.

“I think we have to relook at the fact that if you move from one company to another, maybe you should be able to move all your data,” Warner said, later adding, “I don’t want to regulate these companies into oblivion.”

Warner, along with Sen. Amy Klobuchar (D-Minn.) and other lawmakers, has already pushed for more rules around social media advertising disclosures after Facebook revealed last year that it unknowingly sold $100,000 in political ads to Russia-linked accounts before the 2016 presidential election.

More regulations on how companies use and transfer consumer data for commercial purposes could have tremendous implications for tech companies, which have operated largely unchecked since their founding.

The Cambridge Analytica saga has ignited a firestorm of concern because the Facebook users who participated in the survey app did not know their data would be used for political purposes. Facebook also says the research firm failed to delete the data after being notified in 2015, in violation of the social media giant’s policies.

“To me it’s a question of, how far afield can data about me go before I feel like I’ve lost control of [my data] online,” said Nuala O’Connor, president of the Center for Democracy and Technology. “People feel violated when their data is used in ways they didn’t entirely expect.”

Facebook only publicly acknowledged the matter and suspended Cambridge Analytica in mid-March, as news reports of the controversy began to emerge.

Mike Horning, a professor who researches social media companies, thinks that technology companies are at a make-or-break point because of the Cambridge Analytica scandal.

“If companies don’t get serious about regulation themselves or giving people more mechanisms to control what info they give away, I can see the need for federal regulation,” Horning said.

Some groups argue that, even without a better regulatory system, conducting audits of companies' data practices, in a manner similar to financial audits, could provide the transparency that many companies currently lack.

“We have no idea what they’re collecting and using, so to have independent auditors to go in and uncover that would be helpful,” said Jamie Lee Williams, an attorney at the Electronic Frontier Foundation, a technology consumer advocacy group.

Williams said data auditing can help consumers make more educated choices about which applications they give their information to, particularly in the absence of clear regulation or when regulation proves unhelpful.

“Sometimes the regulators don’t understand how these platforms work, which can be counterproductive. Auditing could help. You don’t need regulation for that,” she said.

But others, such as Don Graham, a former Washington Post publisher and Facebook board member, argue that additional regulation of technology platforms would hurt more than it helps.

“Technology companies must move fast; regulation slows things down, sometimes drastically. Almost inevitably, this hurts a company’s performance,” Graham wrote in a Washington Post op-ed.

If consumers get fed up with technology companies' data collection practices, he argued, they can simply leave the platforms. Those natural incentives and penalties, in his view, guide companies toward self-regulation.

Still, Jennifer Daskal, an American University law professor, said there is “broad-based consensus” within the U.S., including within the tech sector, for more oversight and accountability of how companies use consumer data.