Welcome to Hillicon Valley, The Hill’s newsletter detailing all you need to know about the tech and cyber news from Capitol Hill to Silicon Valley. If you don’t already, be sure to sign up for our newsletter with this LINK.
Welcome! Follow our cyber reporter, Maggie Miller (@magmill95), and tech reporter, Chris Mills Rodrigo (@chrisismills), for more coverage.
FACEBOOK AUDIT HEAPS ON HURT: Facebook has made decisions in the last year that signal a “significant setback for civil rights,” according to an independent audit of the platform released Wednesday.
In the 89-page report, the final of three commissioned in 2018, Facebook was commended for making some progress since the beginning of the review, but auditors said the company has still not moved fast enough or devoted enough resources to deal with civil rights issues.
“In our view, Facebook has made notable progress in some areas, but it has not yet devoted enough resources or moved with sufficient speed to tackle the multitude of civil rights challenges that are before it,” the auditors wrote.
The report outlined “painful decisions over the last nine months with real world consequences,” including leaving up a post from President Trump with the phrase “when the looting starts, the shooting starts,” in response to protests over the Minneapolis police killing of George Floyd.
“After the company publicly left up the looting and shooting post, more than five political and merchandise ads have run on Facebook sending the same dangerous message that ‘looters’ and ‘Antifa terrorists’ can or should be shot by armed citizens,” the auditors wrote.
“The Auditors do not believe that Facebook is sufficiently attuned to the depth of concern on the issue of polarization and the way that the algorithms used by Facebook inadvertently fuel extreme and polarizing content.”
FACEBOOK TAKES DOWN STONE ACCOUNTS: Facebook said Wednesday that it removed a network of more than 100 pages and accounts associated with former Trump adviser Roger Stone after determining that they engaged in “coordinated inauthentic behavior” focused on audiences in the U.S.
Nathaniel Gleicher, Facebook’s head of security policy, said in a blog post that the network attempted to deceive the platform’s users on issues related to local politics in Florida, Stone and hacked materials released by WikiLeaks ahead of the 2016 presidential election. People behind the activity used fake accounts to pose as Florida residents and make it appear that the content was more popular than it was, among other things.
Stone, a former Republican operative, was sentenced to 40 months in prison in February after being convicted of lying to Congress, witness tampering and obstructing a legal proceeding. His charges stemmed from former special counsel Robert Mueller‘s investigation into Russian interference in the 2016 election.
The network linked to Stone and his associates consisted of 54 Facebook accounts, 54 Facebook pages and four accounts on Instagram, the photo-sharing app owned by Facebook. Several of the pages were also linked to the Proud Boys, a far-right group banned from Facebook in 2018.
The network analysis firm Graphika also released a report concluding that Stone’s personal accounts were part of the network, whose main purpose was to “amplify” other pages and Stone’s work. The report said activity among the accounts and pages suggested they engaged in coordinated harassment and incitement in some cases.
Stone’s personal Facebook and Instagram pages are no longer active.
Facebook said that the network was most active between 2015 and 2017 and that it had been largely dormant in the following years. The pages had cumulatively garnered about 260,000 followers on Facebook and about 61,500 on Instagram. The network spent more than $300,000 on advertising.
A sample of the content shared by the accounts included posts about Stone, 2016 Democratic presidential nominee Hillary Clinton and FBI investigations.
ELECTIONS OFFICIALS BEG FOR FUNDS: Top state and local election officials on Wednesday begged Congress to appropriate more election funding to address COVID-19 challenges ahead of November.
Congress sent states $400 million to address COVID-19 election concerns as part of the Coronavirus Aid, Relief, and Economic Security (CARES) Act, the stimulus package President Trump signed into law in March. Election officials testified during an Election Assistance Commission (EAC) summit on Wednesday that those funds were running out.
“It’s looking like I spent close to 60 percent of my CARES Act funding on the primary election,” Jared Dearing, the executive director of the Kentucky State Board of Elections, testified. “To put that in context, we are expecting turnout to go from 30 percent, which was a record high for a primary election, to as much as 70 percent.”
Dearing noted that only around 2 percent of ballots in Kentucky are typically cast through mail-in voting, but that number increased to 75 percent during the COVID-19 pandemic, a change he said would require further funds to address.
“Where we procure these funds and how much this is going to cost is incredibly concerning,” Dearing said.
Iowa Secretary of State Paul Pate (R) also testified in favor of the federal government sending more funds, but argued the funds should be sent with fewer strings attached.
“Clearly we welcome more resources, the goal here is we want more stable and consistent funding, because we have COVID, we may be facing COVID in the next elections,” Pate said.
Read more about the election challenges here.
TIKTOK UNDER PRESSURE: The Federal Trade Commission (FTC) and the Department of Justice (DOJ) are reportedly investigating whether TikTok, a Chinese social media app popular among teens, failed to comply with a 2019 agreement designed to protect children’s privacy.
Reuters reported on the federal probe Tuesday, citing two people interviewed in the investigation. The revelation comes as the short-form video platform also faces rising scrutiny from members of Congress and the Trump administration.
Secretary of State Mike Pompeo said Monday that the U.S. was exploring a ban of TikTok and other apps associated with China, citing concerns they have shared user data with the government in Beijing.
A staffer in a Massachusetts tech policy group and another person told Reuters that the FTC and DOJ conducted separate phone calls with them. The focus of the discussions was whether TikTok failed to comply with an agreement reached with the FTC in February 2019 regarding privacy for children under 13.
The Center for Digital Democracy, Campaign for a Commercial-Free Childhood and other groups reportedly asked the FTC in May to look into allegations that TikTok was not living up to the agreement, which required it to delete videos and personal information about children using the app.
“TikTok takes the issue of safety seriously for all our users, and we continue to further strengthen our safeguards and introduce new measures to protect young people on the app,” a company spokesperson told The Hill when asked for comment about the investigation.
“In the U.S., we accommodate users under 13 in a limited app experience that introduces additional safety and privacy protections designed specifically for a younger audience,” the spokesperson added.
The FTC declined to comment, and the Justice Department did not immediately return a request for comment from The Hill.
INFO ON DISINFO: Democrats on the House Energy and Commerce Committee are pressuring Twitter, Facebook, and Google to be more transparent about COVID-19 disinformation on their platforms, asking the tech giants to produce monthly reports on the issue.
In letters to the companies sent Wednesday, House Energy and Commerce Committee Chairman Frank Pallone (D-N.J.), along with subcommittee leaders Reps. Diana DeGette (D-Colo.), Mike Doyle (D-Pa.), and Jan Schakowsky (D-Ill.), detailed concerns that the “rise of false or misleading information” around the coronavirus could lead to real-world consequences.
“This disinformation has ranged from false statements about certain groups being immune from contracting the virus to unsubstantiated assertions about masks and vaccines,” the Democrats wrote. “This type of disinformation is dangerous and can affect the health and well-being of people who use this false information to make critical health decisions during this pandemic.”
The European Union last month requested that Twitter, Google, and Facebook produce monthly reports on disinformation seen around COVID-19 and how they were combatting this issue. The three companies told The Verge that they planned to comply with the request.
The lawmakers pointed to this decision in asking the companies to provide similar reports to the House Energy and Commerce Committee and to brief committee staff on disinformation concerns by July 22.
“Given the Committee’s jurisdiction over consumer protection and its ongoing oversight efforts around COVID-19 disinformation, we request that your company provide the Committee with monthly reports similar in scope to what you are providing the European Commission regarding your COVID-19 disinformation efforts as they relate to United States users of your platform,” the members wrote.
Read more about the request here.
Lighter click: Freshly created meme
An op-ed to chew on: Will Twitter make @RealDonaldTrump a one-term president?
NOTABLE LINKS FROM AROUND THE WEB:
Thousands of contracts highlight quiet ties between Big Tech and U.S. military (NBC News / April Glaser)
Twitter gave free rein for Jack Posobiec to publish antisemitic hate and disinformation (Southern Poverty Law Center / Michael Edison Hayden)
Virus-tracing apps are rife with problems. Governments are rushing to fix them. (New York Times / Natasha Singer)
The hidden trackers in your phone, explained (Recode / Sara Morrison)
Quibi hoped for 7.5 million subscribers in year one. An analyst says it’s at 72,000 (Protocol / Janko Roettgers)