Disentangling Disinformation: Not as Easy as it Looks - EFF

Body bags bearing the message that “disinformation kills” line the street today in front of Facebook’s Washington, D.C. headquarters. A group of protesters, affiliated with “The Real Facebook Oversight Board” (an organization that is, confusingly, not affiliated with Facebook or its Oversight Board), is urging Facebook’s shareholders to ban so-called misinformation “superspreaders”—that is, a small set of accounts deemed responsible for the majority of disinformation about the COVID-19 vaccines.

Disinformation about the vaccines is certainly contributing to their slow uptake in various parts of the U.S. and in other countries. This disinformation spreads through a variety of channels: local communities, family WhatsApp groups, Fox television hosts, and, yes, Facebook. The activists pushing for Facebook to remove these “superspreaders” are not wrong: Facebook does currently ban some COVID-19 mis- and disinformation, and urging the company to enforce its own rules more evenly is a tried-and-true tactic.

But while disinformation “superspreaders” are easy to identify by the sheer volume of disinformation they disseminate, tackling disinformation at a systemic level is not an easy task, and some of the policy proposals we’re seeing have us concerned. Here’s why.

1. Disinformation is not always simple to identify.

In the United States, it was only a few decades ago that the medical community deemed homosexuality a mental illness. It took serious activism and societal debate for the medical community to come to the understanding that it was not. Had Facebook been around then—and had we allowed it to be the arbiter of truth—that debate might never have flourished.

Here’s a more recent example: there is much debate within the contemporary medical community about the causes of myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS), a chronic illness whose definitive cause has yet to be determined—and which, just a few years ago, was thought by many not to be real. The Centers for Disease Control and Prevention (CDC) notes this and acknowledges that some healthcare providers may not take the illness seriously. Many sufferers of ME/CFS use platforms like Facebook and Twitter to discuss their illness and find community. If those platforms were to crack down on that discussion, deferring to the views of providers who deny the gravity of the illness, those who suffer from it would suffer all the more.

2. Tasking an authority with determining what counts as disinformation has serious downsides.

As the examples above show, authorities and society do not always agree on what is truthful—nor are authorities inherently correct.

In January 2021, the German newspaper Handelsblatt published a report claiming, on the basis of an anonymous government source, that the Oxford-AstraZeneca vaccine was not efficacious for older adults and that the German government’s vaccination scheme was therefore risky.

AstraZeneca denied the claims, and no evidence ever emerged that the vaccine was ineffective for older adults, but it didn’t matter: Handelsblatt’s reporting set off a chain of events that considerably damaged AstraZeneca’s reputation in Germany.

Finally, it’s worth pointing out that even the CDC itself—the authority tasked with providing information about COVID-19—has gotten a few things wrong, most recently in May 2021, when it lifted its recommendation that fully vaccinated people wear masks indoors, a move that was followed by a surge in COVID-19 cases. That shift was met with rigorous debate on social media, including from epidemiologists and sociologists—debate that was important for many individuals seeking to understand what was best for their health. Had Facebook relied on the CDC to guide its misinformation policy, that debate might well have been stifled.

3. Enforcing rules around disinformation is not an easy task.

We know that enforcing terms of service and community standards is a difficult task even for the best-resourced companies with the best of intentions. And if a well-respected, well-funded German newspaper, with its layers of editors, doesn’t always get the facts right, how can content moderators—who by all accounts are low-wage workers required to review a set quota of content per hour—be expected to do so? And more to the point, how can we expect automated technologies—which already make a staggering number of errors in moderation—to get it right?

The fact is, moderation is hard at any level and impossible at scale. Certainly, companies could do better when it comes to repeat offenders like the disinformation “superspreaders,” but the majority of content, spread across hundreds of languages and jurisdictions, will be much more difficult to moderate—and as with nearly every category of expression, plenty of good content will get caught in the net.
