America shouldn’t follow German lead in tackling fake news problem
Facebook announced on Thursday that it was beginning a partnership with organizations including the Associated Press, ABC News, and Snopes to flag “fake news,” an action aimed at elevating the status and power of those organizations as arbiters of truth.
The move could cause the door to outright censorship – which is already ajar – to creak open even further.
I recently returned from Germany, where I spent three weeks in a start-up incubator aimed at introducing young companies to Berlin. I was enjoying a nice bratwurst and Glühwein at a Christmas market with a friend when the conversation turned to the election of Donald Trump. We discussed a number of political issues, but perhaps most striking in a German context was our discussion of freedom of expression.
In Germany, protections for freedom of expression are weak. What we consider free speech is often subject to criminalization. Section 185 of the criminal code punishes “insults.” Sections 186 and 187 address “gossip and malicious defamation,” and there are further prohibitions on disparagement of the Federal President, the state and its symbols, and foreign heads of state.
Some prohibitions, like the ban on displaying the swastika, are for the better. Others are rightly condemned by organizations like Human Rights Watch. In April, Chancellor Angela Merkel authorized the prosecution of a satirist who wrote a highly offensive poem about Turkish President Recep Tayyip Erdogan.
We should ask ourselves: could technology companies play this role in the United States?
Yes.
While our free speech protections are strong, the U.S. has a long history of elected officials and corporations tamping down ideas they don’t like. Tech and media companies now risk falling into that same category. Efforts to label sites as patently untruthful are attractive but could easily mutate into the censorship of political speech.
As Jeff Jarvis and John Borthwick wrote, “circular discussions about what is fake and what is truth and whose truth is more truthy masks the fact that there are things that can be done today.”
Infowars is an example of a site that is concerning in the “fake news” context. Many of its articles carry a flavor of truthiness but spread misinformation or disinformation.
“FACEBOOK, TWITTER BANNING FREE SPEECH TO FORM VIRTUAL ‘SUPERSTATE,’” said the headline of one December article. “Both Facebook and Twitter are encouraging users to give up their own national identities in favor of globalism,” it claimed.
The article also cites the Mises Institute, an organization categorized by the Southern Poverty Law Center as neo-Confederate in ideology.
Infowars recently published another article arguing that Hillary Clinton wore purple because she is active in George Soros’s “color revolution” against the Trump presidency. I’m colorblind – does that mean I can’t participate?
The issue is that Infowars passes off opinion as news. “Truthy” articles are far more dangerous than obviously ridiculous ones, like those claiming Bill Cosby was selected as Trump’s secretary of women’s issues.
Even more distressing, online disinformation led someone to bring a gun into the Comet Ping Pong pizza shop in Washington, D.C., this month. The incident shows that online falsehoods can catalyze real-world violence next week, next month, or next year.
Still, Germany serves as an example of what American institutions should avoid when it comes to speech policies. As someone new to the technology world with a background in political advocacy, I am concerned about companies like Google, Facebook, and Twitter becoming ever more powerful arbiters of truth. Their algorithms already filter political content according to our preferences.
Observers including Jarvis and Borthwick have provided ideas for censorship-free actions that companies, publishers, and media organizations could take today to move forward. Those include expansion of user-centered systems for verifying sources, offerings of additional content that is outside a user’s traditional “filter bubble,” the hiring of experienced journalists and editors to inform non-media companies on professional practices and credibility, and an investigation of the relationship between design choices and content visibility.
The suggestions are good ones. They encourage us to step out of echo chambers, rather than trying to regulate our way out.
Connecting people across all forms of distance in honest conversations is what I do every day. We should support solutions that reduce online falsehoods without large tech corporations increasing their editorial influence.
Maybe all we need is to talk to each other.
Jake Levin (@JakebLevin) is chief of staff at Shared Studios, an arts, media, and technology collective that creates physical portals to connect people across distance. The project currently has portals in 25 locations around the world, including Berlin’s Tempelhof Airport, Washington, D.C.’s Holocaust Museum, and Harsham IDP camp in Erbil, Iraq.