Google reportedly warned its employees this week that at least one person has been diagnosed with measles at the company’s Silicon Valley headquarters.
The measles diagnosis, first reported by BuzzFeed News, comes as Google’s search engine and its video streaming platform, YouTube, face scrutiny for promoting anti-vaccine content.
The highly contagious disease has seen a resurgence in the U.S. this year after it was declared eliminated in the country almost 20 years ago. Experts have attributed the recent outbreak in part to anti-vaccine misinformation proliferating online on sites like YouTube, which is owned by Google.
“This note is just a precaution,” Google’s occupational medicine physician David Kaye wrote in an email to some employees, according to BuzzFeed, which obtained a copy of the message. “A Googler who was in Charleston 1295 [a Google office] on Thursday, April 4, has been diagnosed with measles.”
“We have been working with the Santa Clara County Public Health Department and they would like us to share this measles advisory, which contains information on measles, exposure risks and actions to be taken,” Kaye wrote.
Five Google employees told the news outlet that they had not received the email or been told about the measles exposure.
The current measles outbreak has sickened at least 555 people across almost two dozen states, and it has been linked to the rise of the anti-vaccination movement that promotes misinformation about the possible side effects of vaccines.
YouTube this year said it would ban channels that promote anti-vaccine content from running advertisements, amid pressure from lawmakers such as Rep. Adam Schiff (D-Calif.).
Schiff wrote in a February letter to the CEOs of Google and Facebook that he was concerned their platforms serve as conduits for spreading anti-vaccine misinformation.
The House Intelligence Committee chairman wrote that he was worried YouTube, Facebook and Instagram are “surfacing and recommending messages that discourage parents from vaccinating their children, a direct threat to public health, and reversing progress made in tackling vaccine-preventable diseases.”
YouTube announced earlier this year that it is tweaking its algorithms to stop recommending videos with misinformation to users, including videos that promote “anti-vax” conspiracy theories.