Tech companies step up fight against bad coronavirus info

Health officials pleaded with social media companies to stop the spread of hoaxes. They're listening.

Potentially dangerous coronavirus misinformation has spread from continent to continent like the pandemic itself, forcing the world’s largest tech companies to take unprecedented action to protect public health.

Facebook, Google and others have begun using algorithms, new rules and factual warnings to knock down harmful coronavirus conspiracy theories, questionable ads and unproven remedies that regularly crop up on their services — and which could jeopardize lives.

Health officials, critics and others who have long implored the tech companies to step up their response to viral falsehoods have welcomed the new effort, saying the platforms are now working faster than ever to scrub their sites of coronavirus misinformation.

“It was definitely, within the companies, a shift,” said Andy Pattison, manager of digital solutions for the World Health Organization, who for nearly two years has urged companies like Facebook to take more aggressive action against anti-vaccination misinformation.

Pattison said he and his team now directly flag misleading coronavirus information and, at times, lobby for it to be removed from Facebook, Google and Google’s YouTube service.

Last month, Iranian media reported more than 300 people had died and 1,000 were sickened in the country after ingesting methanol, a toxic alcohol rumored to be a remedy on social media. An Arizona man also died after taking chloroquine phosphate — a product that some mistake for the anti-malaria drug chloroquine, which President Donald Trump and conservative pundits have touted as a treatment for COVID-19. Health officials have warned the drug hasn’t been proven safe or effective as a virus therapy.

Days later, Twitter and Facebook began cracking down in unprecedented ways on posts promoting unverified treatments.

Twitter deleted a post by Trump’s personal attorney Rudy Giuliani that described hydroxychloroquine, a cousin to chloroquine, as “100 percent effective” against coronavirus. The company also removed a tweet from Fox News personality Laura Ingraham touting what she called the drug’s “promising results.”

Other widely shared claims that hydroxychloroquine cures COVID-19 live on. A conservative radio host’s tweet claiming that “ALL hospitals and health care workers are using it with total success” has been shared more than 12,000 times.

In what may be a first, Facebook removed a post from Brazilian President Jair Bolsonaro, who promoted hydroxychloroquine as “working in every place” to treat coronavirus. Twitter also removed an associated video.

Facebook has long resisted calls to fact check or remove false claims directly made by politicians, arguing the public should be able to see what their elected officials say. In this pandemic, however, the platforms have no choice but to rethink their rules around misinformation, said Dipayan Ghosh, co-director of the Platform Accountability Project at Harvard Kennedy School.

“The damage to society is clear cut: it’s death,” Ghosh said. “They don’t want to be held responsible in any way for perpetuating rumors that could lead directly to death.”

Other sites have also tightened their policies.

Last week, YouTube began removing videos that claimed coronavirus was caused by 5G wireless networks. Some of the videos had racked up hundreds of thousands of views. Google searches for “5G” and “coronavirus” now direct users to news videos debunking the theory.

Facebook-owned private messaging service WhatsApp has limited how many chats users can forward messages to, in an effort to slow the spread of COVID-19 misinformation. Since WhatsApp encrypts all messages, it can’t read them to determine whether they contain misinformation.

The pandemic has thrown up new challenges to content moderation. Early on, health considerations forced the contractors that employ human moderators to send most of them home, where for privacy reasons they couldn’t do their jobs. Facebook eventually shifted some of that work to in-house employees and leaned more heavily on artificial-intelligence programs. More recently, it has made new arrangements for contract moderators to do their jobs remotely.

Meanwhile, bogus ads for masks, hand sanitizer and unregulated blood tests for COVID-19 still appear on Facebook and Google. And one North Carolina man with 44,000 YouTube subscribers who complained that his videos promoting the 5G and coronavirus theory were removed is now using the platform to hawk $99 subscriptions to view his videos.

The tech platforms point out that even when their safeguards fail, they are putting facts about the virus from news outlets, fact checkers and health officials in front of their users.

Google “coronavirus” and you’ll be directed to your local health department. Search on Twitter for “coronavirus hoax” and you’ll get a link to the U.S. Centers for Disease Control and Prevention. Watch a coronavirus conspiracy theory video on YouTube and you’ll see a label promoting legitimate news outlets and COVID-19 information from the CDC hovering over it.

“There’s a lot of misinformation when there is a lack of good information,” said Pattison. “People will fill the void out of fear.”

__

Republished with permission of the Associated Press.

Associated Press


