
The Christchurch massacre was another internet-enabled atrocity



We are reminded every time we stream a movie, search Google or receive an email of the immense benefits that flow from the internet.

The latest mass killing in New Zealand is a reminder that these benefits are not unalloyed. Trolling, bullying, fake news, election manipulation, hate speech, and international and domestic terrorism have all become internet-enabled abuses, incited, propagated, sometimes organized and concealed by online activity. Calls to rein in such malign behavior have grown, so far to limited effect.

The ubiquity and the anonymity associated with the online world make fixing responsibility for abuses and designing remedies exceptionally difficult.

Who should be held accountable for abusive content online, the author or the publisher? That is, should it be the creator of the content or the hosting site? And what role should government play, if any, in regulating such activity?

The individual certainly bears responsibility for his or her work. Even in the United States, not all expression is free. Speech involving fraud, conspiracy to commit crimes and, under some limited circumstances, even simple encouragement to commit them is illegal, whether conducted online or in person.

It is forbidden, for instance, to post, transmit or even download child pornography. The originator, the transmitting site and even the recipient can each be prosecuted. But the anonymous nature of much social media authorship, particularly that of a transgressive nature, makes individual accountability difficult to identify and even harder to penalize. 

There are also civil penalties under U.S. law for slander, libel and defamation. In addition to the authors, print publishers and broadcasters can also be sued for carrying such material. Social media sites, however, generally cannot be sued for content their users post, having been granted an explicit exemption from such liability by Section 230 of the Communications Decency Act of 1996.

Congress could choose to remove that exemption. The government could also become more active in enforcement against various criminal forms of online behavior, as it has already with child pornography.

Taken very far, however, such enforcement could become expensive, intrusive and quite unpopular without becoming very effective.

There are other ways to encourage social media sites to exercise greater discretion regarding their content. Freedom of the press means freedom to carry or not carry whatever content the publisher chooses. And society can judge for-profit enterprises by the choices they make.

Freedom of speech does not carry with it a freedom to be published. Facebook is no more obliged to accept a posting than the New York Times is to print a submitted article. And so it is the social media companies themselves that bear the social responsibility for their content. 

Increasing internet sites' exposure to defamation lawsuits might inspire greater care on their part, and the threat of broader government regulation is already doing so. But users and advertisers can also have a major impact when they act in a concerted fashion, as they sometimes do.

For the market to provide a potent form of internet discipline, the public would need to hold social media companies morally and commercially responsible for the content they disseminate.

This means moving away from the conception of social media sites as passive transmitters of individual expressions, like the phone company, to see them as active moderators whose algorithms sort, organize, cull and display content in a calculated fashion for which they should be held responsible.

Exercising such editorial control presents a major challenge given the immense volume of material that flows through and remains on these sites, but it is ultimately a task only the social media companies themselves have the capacity to accomplish.

Although social media companies could rise to this steep challenge of their own volition, it is more likely that individuals, through their choices as voters and consumers, and businesses, through their choices as advertisers, will be the ones to nudge them in that direction.

James Dobbins is the former U.S. ambassador to the European Union. He holds the Distinguished Chair in Security and Diplomacy with the RAND Corporation.

