Friday, October 22, 2021

Updating My Priors: Short Takes IV

In my post "The Biggest Threat to Progress" I articulated what felt like an original idea. It's not. Matt Yglesias describes the idea here:

"One of Stimson’s big findings is that public opinion operates like a thermostat that acts to bring the political system into equilibrium, stopping it from moving too far to the left or the right. So while the sharply liberal Mood of the early 1960s set the stage for the Great Society, the actual enactment of the Great Society sent it in a rightward direction. The 12 years of Reagan-Bush governance later pushed the Mood steadily leftward."


I'm quite happy with my post "The Dam of Free Speech" (it even got a retweet from Conor Friedersdorf!). I think the dam/flood metaphor works better for what I was describing in my "On Worshipping and Compression" post. I drew my ideas from Arnold Kling's The Three Languages of Politics, but whereas he frames political views as axes (e.g., oppressor vs. oppressed), I liken the dynamic to a default setting and a conscious effort to fight against it.

So the flood is the bad thing that humanity will do if left unchecked; for classical liberals it's authoritarianism, for conservatives it's barbarism, and for progressives it's exploitation. The dam is the thing that needs to be built and maintained so that the flood does not destroy us. So it might be liberalism, military/police/Christianity, or regulation/antiracism. I think a lot of cultural conflict comes from people protecting their dam against people who can't see the water on the other side and assume the dam-builders are just acting selfishly.

Amanda Ripley wrote that every conflict has the surface-level argument ("vaccine mandates are government overreach!") and the understory, which is usually fear. It's important to get to the understory to move past the conflict. Maybe another way of saying this is to ask "What is your flood?"


In a recent podcast, Steven Pinker brings up the topic of Wikipedia, and just how good it is at producing knowledge and attracting editors who value objectivity and truth. It made me think of Parts I and II of my posts on the Disinformation Funnel. How is it that something that allows anyone to write and edit has not become a major source of disinformation?

Pinker mentions the code of ethics that all editors agree upon, similar to the journalistic oath I recommend in Part II. Plus there are added checks and balances that incentivize some equilibrium. But I think the other answer may be that Wikipedia just isn't that powerful. Despite being one of the most visited websites in the world, it's not something people use to prove a point. It's more useful than influential.

But maybe there is something there and the key to figuring out the disinformation problem is following the Wikipedia model.


In my post "Is Malcolm X Winning?" I talked about patents and how segregation cut African Americans off from the social capital in white communities, leading to a decrease in patents. Well, here's a cool analysis of a study about the impact Prohibition had on patents. Banning alcohol didn't just make drinking illegal; it also closed saloons, which turned out to be a massive resource for social infrastructure.


My post about the biggest threat to liberalism being voter backlash faces a new test. This story in The Atlantic argues that the Texas abortion near-ban will rally pro-choice voters to a Democratic victory in the midterm elections. If true, the biggest challenge to Republican dominance will have been pro-life Republicans pushing unpopular policies.


I also wrote about how Fox News and other fringe outlets became bottom feeders, picking up the stories that mainstream outlets passed on. Now the same thing is happening on the left. Rolling Stone claimed that a hospital was overwhelmed with ivermectin-poisoned patients, a story that got picked up by The Guardian, the BBC, Yahoo News, etc., even though it was totally bogus and no one ever checked up on it. (Although Scott Siskind demonstrates that the story gets even more complicated the more you dig.)

My original worry was that mainstream outlets would lose credibility by passing on stories that were true but popular with fringe right-wing crowds. Now they're passing off their own crap. Race to the Bottomfeeders.


My post about bottom feeders also relates to my worries about my children's screen time. Recently, when I've been on my phone too much, I force myself to put it down, and I've found that I often end up picking up my guitar instead. I also wrote a post about being a Xennial, meaning I remember the analog world even though I'm fully immersed in the current digital one. One way that manifests: when I'm staring at my phone for too long, part of me feels that this isn't right. I don't think my children will ever feel that way; staring at screens has always been the norm for them.

So if they don't feel guilt about screen time, and never feel the need to put their phone down, does that mean they will never pick up a guitar? I'd be interested to know if fewer children are taking music lessons today.

Tuesday, October 19, 2021

The Disinformation Funnel Part II: Solutions

Michael Jackson, leaning in

I work in fundraising and each year we put on a Giving Day where we challenge our donors to make as many gifts as possible in 24 hours. In a prep meeting, we were looking at donations by the hour and noticed some patterns; lots of gifts at 7 am, noon and 5 pm. We were trying to decide when to promote incentives during the day and the data led to two conclusions that split the department in half.

One half thought we should push out our incentives when giving was low, to try to shore up the moments when we are weak. The other half thought we should lean into what we do well and put the incentives out when giving is high.

In my last post, I described what I see as the problem of disinformation and Jonathan Rauch's "funnel." I didn't make any great suggestions other than to say I put my trust in certain people to interpret for me what comes out of the funnel. I think most people do some version of this. When people say something like "I'm not one of those sheep who follow Fauci. I do my own research," I think they mean they look for people they trust to interpret what comes out of the funnel.

This is the familiar pattern of the 21st century: we lose trust in institutions and place it in people. I totally agree with Yuval Levin that this is bad, but maybe the solution is to just lean into it.

PETA spent decades trying to shame people into giving up meat, doing things like throwing blood on ladies in mink coats. It didn't work because they couldn't solve the big problem, which is that hamburgers are delicious. Along comes Impossible Foods and decides to lean into the problem by creating a meatless burger that is also delicious. 

That is the challenge. How can we lean into the individuals that people already trust and sort out the grifters from the rogue geniuses? 

We need to create a funnel for public intellectuals.

Moral Accountability

Twitter has a blue checkmark that verifies people are who they say they are. How about a green checkmark for someone in the journalism/publishing industry who has taken the journalistic oath? Green-check holders can police one another and decide to rescind checkmarks. You're still allowed to tweet; you just lose that credential.

I'm looking to appropriate the Hippocratic Oath, but for media. The oath should be something agreed upon by heterodox journalists and they can hold one another accountable. People will still be able to choose whom they follow and whom they trust, but the green checkmark will lend more credibility. 

This doesn't involve occupational licensing, manipulative algorithms, or government oversight. Just the will of the industry to want to do better.

Forecasting on Record

In Scott Alexander's plan to improve the Republican party, he suggests they repeal all bans on prediction markets and give tax breaks for participating in them. 

"You'll get a decentralized, populist, credentialism-free, market-based alternative to expertise."

He then warns that if we continue to not trust experts, Republicans are at risk of not listening to people who are actually right. 

"Prediction markets - an un-biasable, decentralized form of aggregating expert and non-expert opinion correctly - are the only solution I can imagine working."

From there, we're looking at an expansion of the Good Judgment Project. Basically, influencers make their predictions public, and their followers can see for themselves how often they are right and how often they are full of shit. Anyone who refuses to participate should be considered unreliable.
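A hedged sketch of how such a public track record could be scored: the Brier score is a standard way to grade probabilistic forecasts (it's what the Good Judgment Project popularized); lower is better. The forecasters and numbers below are invented for illustration.

```python
# Toy sketch: grading public predictions with the Brier score.
# All data here is made up for illustration.

def brier_score(forecasts):
    """forecasts: list of (probability_assigned, outcome) pairs,
    where outcome is 1 if the event happened and 0 if it didn't.
    Returns the mean squared error of the probabilities (0 = perfect)."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# A forecaster who hedges honestly and is usually on the right side...
careful = [(0.7, 1), (0.3, 0), (0.9, 1), (0.6, 0)]
# ...versus a pundit who is always maximally confident and wrong half the time.
loud = [(1.0, 1), (1.0, 0), (1.0, 1), (1.0, 0)]

print(brier_score(careful))  # 0.1375
print(brier_score(loud))     # 0.5
```

The asymmetry is the point: confident bluster that's wrong half the time scores far worse than honest hedging, which is exactly the grifter/rogue-genius distinction the funnel needs to make.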

Arnold Kling has a similar idea with a draft. My only critique is that he judges people based on things he finds admirable, which will exclude thinkers outside his ingroup. Better to open the funnel and give everyone a chance to prove their trustworthiness.

The Spectrum of Approval

Scott has another idea, but this one isn't about discerning trustworthy individuals; rather, it's a possible fix to the scientific funnel problem. He suggests a way to improve the FDA: instead of giving the binary approved/not-approved stamp (and the occasional "emergency authorization"), give drugs a rating system.

"Grant drugs one-star, two-star, etc approvals. Maybe one-star would mean it seems grossly safe, the rats we gave it to didn’t die, but we can’t say anything more than that. Two-star means it’s almost certainly safe, but we have no idea about effectiveness. Three-star means some weak and controversial evidence for effectiveness, this is probably where aducanumab is right now. Four-star means that scientific consensus is lined up behind the idea that this is effective, this is probably where the COVID vaccines are right now. Five star is the really extreme one where you’re boasting that Zeus himself could not challenge the effectiveness of this drug - the level of certainty around MMR vaccine not causing autism or something like that."

I remember going into restaurants in North Carolina that displayed a placard with a grade from the health department, which I liked. My libertarian side hoped they wouldn't shut places down for failing but would make them post an F and let people decide if it was worth eating there.

The reason people like Scott like prediction markets is that there are penalties for being wrong: you literally lose money. There needs to be more of a penalty for misinformation. So instead of the FDA-style binary of declaring speech allowed or not allowed, give it a rating system where people are penalized via their rating when they're wrong.

So whatever we end up doing, it will only have a chance of success if we lean into where the trust already resides.