As social media sites were flooded with misleading posts about vaccine safety, mask effectiveness, Covid-19’s origins and federal shutdowns, Biden officials urged platforms to pull down posts, delete accounts, and amplify correct information.

Now the Supreme Court could decide whether the government violated Americans’ First Amendment rights with those actions — and dictate a new era for what role, if any, officials can play in combating misinformation on social media.

The Supreme Court is set to hear arguments next month in a case that could have sweeping ramifications for federal health agencies’ communications in particular. Murthy v. Missouri alleges that federal officials coerced social media and search giants like Facebook, Twitter, YouTube, and Google to remove or downgrade posts that questioned vaccine safety, Covid’s origins, or shutdown measures. Lawyers for the Biden administration argue that officials made requests of the companies but never coerced them.

Government defenders say that if the Court limits the government’s power, it could hamstring agencies scrambling to achieve higher vaccination rates and other critical public health initiatives. Critics argue that federal public health officials — already in the throes of national distrust and apathy — never should have tried to remove misleading posts in the first place.

    • Venia Silente@lemm.ee · 10 months ago

      > I think things like taxes are more binary in terms of whether you lied. Either the numbers match or they don’t.

      Covid is quite binary too: either it exists and people are getting sick and dying (or have already died), or it doesn’t and everyone is faking it and the “dead” are pulling insurance scams. Sounds quite obvious and testable to me!

      > Covid misinformation is essentially down to who you think is telling the truth.

      It’s simple: the one telling the truth is the one closest to what we know to be true right now. In a decade or two, when that understanding changes, we adjust as we go. Back in the 50s every doctor said bacon was healthy and eggs were the devil; in the 60s it was the turn of milk and bacon, respectively; then in the 70s, eggs and cheese. And so on. Sometimes you just have to be able to operate, legally, on the information that is held as true and peer-reviewed at the time.

        • Venia Silente@lemm.ee · 9 months ago

          To me, honestly, it’s quite simple: like any other personal right, my “freedom of speech” ends where other people’s rights begin. If what I’m saying is both patently untrue and does net harm to society - whether because of what I’m saying or because of how I’m saying it - then it can’t be protected. It would run counter to the purpose of a State if it were.

          So, for your example of misinformation that we should punish people for: it’s the patently untrue stuff that leads to harm, such as “drinking bleach will immunize you from Covid!” (leads people to harm themselves), or “it’s because of the niggas / gays / asians / anything non-Christian living in your neighbourhood” (leads people to cause harm to others). Something like “Covid doesn’t exist”, while patently untrue, does not invite harm in any way that I see as proportionally punishable (but, for comparison, “let’s organize to evade vaccinations because Covid doesn’t exist” does invite harm to others, so it should be punishable).

          Then again, this all assumes it’s only about government prosecution. XKCD’s “show you the door” applies here for any private party who feels harmed by some nutjob announcing that Covid doesn’t exist and trying to convince my grandma to drink bleach over Instagram, and there’s no “but muh freedom of peach” complaint to make about that.