Facebook reports spike in violent content


The number of Facebook posts showing graphic violence increased over the first three months of this year, the social media company said in its first ever public release of such statistics.

For every 10,000 posts on Facebook, users posted roughly 22 to 27 pieces of content featuring violent images. In the last three months of 2017, that number was between 16 and 19.

The company did not explain the jump in such posts, but Alex Schultz, Facebook’s vice president of data analytics, said releasing the data is part of an effort to work with the community to improve Facebook.

“It’s an attempt to open up about how Facebook is doing at removing bad content from our site, so you can be the judge,” Schultz wrote in a post.

“And it’s designed to make it easy for scholars, policymakers and community groups to give us feedback so that we can do better over time.” 

The new numbers were included in Facebook’s quarterly transparency report, which also details how often posts that violate Facebook’s community standards appear on the platform.

Other categories it provides numbers for include content featuring nudity or sexual activity, terrorist propaganda, hate speech, spam and fake accounts.

In the categories where Facebook provided quarter-over-quarter comparisons, it reported an increase in offending posts. In every category except fake accounts, Facebook also reported taking more action to remove such content.

Facebook, for example, said it took action against 2.5 million hate speech posts in the first quarter of 2018, a 56 percent increase from the previous quarter, attributing the uptick to improved detection methods.

The company also said it took action against 837 million spam posts, 21 million posts featuring nudity or sexual activity and 1.9 million posts promoting terrorism, and that it disabled 583 million fake accounts.

