Wednesday, September 29, 2021

The Disinformation Funnel


Nathan Robinson wrote an interesting piece for Current Affairs arguing that 1. there is a problem of disinformation, and 2. the root of the problem is an asymmetry of access. In other words, it's easy for yahoos to write a blog post about a crazy claim (microchips in the vaccines!), but the ACTUAL SCIENCE debunking those bad claims is behind paywalls.

I like the idea but I don't think it totally explains the problem. First, the problem isn't just that the ACTUAL SCIENCE is behind a paywall; it's that it's written in a vernacular only a small percentage of people can comprehend, while a much larger percentage claim the ability to both comprehend and interpret it. So influencers can read the same study and draw different conclusions, whether they are good-faith idiots or bad-faith manipulators.

Second, I would argue the problem isn't one of information vs. disinformation, or of open vs. gated access. I think the problem is too much information, or, better yet, wicked problems.

In my town's Facebook group, there was a recent argument about whether students should have to wear masks. One antimask parent bolstered her argument by linking to a study that purported to show masks were ineffective. She did everything liberals accuse conservatives of failing to do: she linked to an ACTUAL SCIENCE study, not some yahoo's blog. The study made it into an academic journal, published by people who went through the rigorous process of defending their dissertations and earning their PhDs.

They made it through the funnel.

What funnel? I have yet to read Jonathan Rauch's The Constitution of Knowledge, but I assume it is an expansion of his essay of the same name, in which he uses a funnel as a metaphor for the production of knowledge through mechanisms like peer review.

"To protect the wide end of the funnel, we disallow censorship. We say: Alt-truth is never criminalized. At the same time, to protect the narrow end of the funnel, we regulate influence. We say: Alt-truth is always ignored. You can believe and say whatever you want. But if your beliefs don't check out, or if you don't submit them for checking, you can't expect anyone else to publish, care about, or even notice what you think."

So here is a parent using information that came out of the narrow end of the funnel to support her antimask beliefs. The next step for her interlocutors is usually to discredit the study, which means discrediting the funnel. This puts the counterargument in a bind.

For example, one parent started combing through the study, criticizing its small sample size and high p-values. It's like the critics of the Sokal Squared hoax who point out that the academic journals that published the hoax entries weren't SERIOUS JOURNALS. If that's true, then it's a BIG PROBLEM: allowing bad institutions to operate within the funnel discredits the whole funnel.

For me, I've been skeptical of mask studies because I know a randomized controlled trial had never been done. But a lot of Asian countries have been dealing with viral outbreaks for a lot longer than we have, and if they use masks, I trust that they work.

But then! What's this? A randomized controlled study out of Bangladesh!

Did I read the study? Of course not, but I trusted Lyman Stone's analysis because he seems to have earned the respect of ACTUAL SCIENCE people. 

Thank God this is settled ... oh, no. Please no. Ugh, fine, here is a long-ass blog post in which the author worries that, because of the Bangladesh study's statistical ambiguity, "there’s not much that can be deduced at all." I tried to read the whole post but it made my brain hurt. I looked up the author, hoping to discredit him. Turns out he's an associate professor at UC Berkeley. Goddamnit.

So what we have is people who passed through the credentialing funnel, interpreting knowledge that passed through the scholarship funnel, coming to different conclusions. Then everyday schmoes like me and the parents in my town's Facebook group can find funnel-approved arguments that support our priors.

I can't solve this problem on an institutional level. The Constitution of Knowledge needs to narrow its funnel or add another layer, something like replication or meta-analysis, and ignore anything that doesn't make it all the way to the bottom. I know I'm not smart enough, nor do I have the proper training, to comb through studies and distinguish the ACTUAL SCIENCE from the ones with poor methodology, small samples, p-hacking, or whatever other disqualifiers people can come up with.

Instead I put my trust in people. 

In a blog post, Richard Hanania wrote that "you should always take good intellectual habits over credentials." He has yet to expand on what he means by "good intellectual habits," but I think I know, and I hope one day to better define and test what I consider good intellectual habits in others. Here are a few things right off the bat:

  • Is well-read in numerous areas (more fox than hedgehog)
  • Gives his convictions with stoicism rather than "passionate intensity"
  • Admits when he is wrong
  • Considers tradeoffs
  • Praises people in his outgroup when they are right
  • Expresses uncertainty
  • Can be found saying something like "I don't know enough about that to have an opinion"
  • Is cited (positively) by people from opposing ideologies

I also think about a person's incentives. I think Emily Oster checks off many of these boxes, and even though her background is in economics, I trust her more than Anthony Fauci when it comes to COVID-19. I don't think Fauci is always wrong or completely useless, but his admitted early lie about the effectiveness of masks showed that his position is as much political as it is scientific, so I always have to weigh that against whatever he says in a way I don't with Oster.

But I don't know if these heuristics are enough. One worry is that the bulleted list is mostly a description of how I view myself, and I'm just giving extra weight to people who remind me of me. So I need some way of testing these heuristics objectively, and I think the next step might be something like prediction markets or public participation in the Good Judgment Project. I want to see how the people I trust to analyze information that has made it through the funnel fare when asked to use their expertise to make real-world forecasts.
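To make that concrete: one way forecasting projects grade people is the Brier score, the mean squared error between the probabilities you assign and what actually happens. Here's a toy sketch of what scoring my trusted pundits might look like. The forecasters, probabilities, and outcomes are all invented for illustration; nothing here is real data.

```python
# Toy sketch: grading two hypothetical forecasters with the Brier score,
# a standard accuracy measure for probabilistic forecasts.
# All names and numbers are made up for illustration.

def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and results.
    0.0 is perfect; always guessing 50% earns 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Probabilities each forecaster assigned to five yes/no questions...
confident_pundit = [0.95, 0.90, 0.99, 0.90, 0.95]
hedged_forecaster = [0.70, 0.60, 0.80, 0.40, 0.65]

# ...and how the questions actually resolved (1 = happened, 0 = didn't).
outcomes = [1, 0, 1, 0, 1]

print(brier_score(confident_pundit, outcomes))
print(brier_score(hedged_forecaster, outcomes))
```

In this made-up example, the pundit who "expresses uncertainty" beats the one who gives every take with passionate intensity, which is exactly the kind of check on my bulleted list I'd want a real track record to provide.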

But until then I think science is in a real bind and I don't see a credible way out.
