The UCI Paul Merage School of Business’s Center for Digital Transformation invited Professor Marshall Van Alstyne to speak on the issues of free speech and misinformation in the modern age.

CDT Leadership Series: Free Speech, Platforms & The Fake News Problem

June 14, 2022 • By Brian Nguyen

On April 21, 2022, Vijay Gurbaxani and the UCI Paul Merage School of Business’s Center for Digital Transformation (CDT) invited Professor Marshall Van Alstyne to speak on the complicated issue of navigating the line between free speech and misinformation in online spheres.

The event began with a short news clip that highlighted the demonstrations in Ukraine known as the “Russian Spring” and the key role that Facebook played in these anti-government protests. The clip ended with the quote: “with no Facebook, there’s no revolution.”

The mood set by this 2012 video clip was apt for the discussion to come, as it showed how differently people viewed free speech in online spaces then compared with today. Contextualizing the clip, Gurbaxani said: “We’ve gone from talking about democratization, not just of countries but of companies, to where we are today, where our worldview is nowhere close to being as positive as it was back then.”

Dealing With Disinformation

The main speaker, Marshall Van Alstyne, is a professor of management and information systems at Boston University and co-author of the book Platform Revolution.

Van Alstyne immediately began by agreeing with Gurbaxani’s earlier sentiments, saying that “we have almost a sea change, almost a complete opposite” of what social media once was. He attributes this change to the rapid rise in disinformation online, citing an MIT study that found that “falsehood diffused significantly faster, farther, deeper, broader than truth in all categories of information.”

The problem of disinformation is not one that Van Alstyne believes can be easily solved: “Not a single solution that I’m aware of changes the economics. At the moment it’s easier (more profitable) to produce fake news than the truth.” Other pitfalls he cited include technological arms races, the discrediting of fact-checkers, and failures to hold the right people accountable. Ultimately, he says, we are focused on the wrong things when it comes to combating social media disinformation.

Van Alstyne made it clear that centralized government action would not be enough, saying: “If we’re doing mechanism design, if we’re trying to write laws, laws are a terrible focus for the truth. You can’t own truth. You can’t be liable for truth. He can’t be dispossessed of truth.”

The Real Problem

Van Alstyne posits that the real problem with regulating fake news online does not lie in the disinformation itself, but in the externalities created by the relationships between disseminators of information, consumers, and advertisers.

He defines “externalities” as: “any spillover benefit or harm that happens to a third party in any one or two-way interaction. So, an externality is any harm or benefit that is not taking place among two given parties—it’s affecting someone else.”

He says: “Metaphorically, and one of the things that has happened is, one person’s free speech creates the spark, but it’s the amplification that turns a spark into this wildfire that burns down the whole neighborhood while Facebook sells ads.”

In the end, it is not the disinformation itself that is the problem, but the incentives that disseminators have to keep producing it. The negative externalities that are created make disinformation more profitable and more exploitable than the earnest truth.

So what can be done to remedy these externalities that, under the First Amendment right to free speech, cannot be regulated by a centralized governmental entity? How can we, as a people, incentivize a more truthful online space?

Proposed Solutions

Van Alstyne proposed a series of solutions that he thinks might better equip society to mitigate the spread of disinformation campaigns. The list is as follows:

  1. Enablement: Transparency in terms of who is funding misinformation paired with the ability for affected parties to respond to such misinformation.

  2. Reverse Amplification: If parties are caught lying in informational campaigns, then their exposure to the internet is limited and their ability to disseminate information in general is delayed. This puts the onus back on the author of the “fake news.”

  3. Pigouvian Taxes on Ads: Taxing ads progressively in order to shift companies’ focus away from ads and toward subscriptions, reducing the economic incentive for harmful ad campaigns.

  4. Pigouvian Tax on Externalities: Attempts to solve the weaknesses of the other solutions by taxing the externalities that come from posts instead of the information itself.

  5. Claim Guarantees: Allows claimants to guarantee their claims by posting a bond; the bond is returned if the claim proves true and forfeited if it proves false. This makes it economically cheaper to tell the truth than to lie.
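The economics behind the claim-guarantee idea can be illustrated with a minimal sketch. The function and numbers below are purely illustrative assumptions, not anything Van Alstyne specified: a claimant posts a bond that is returned if the claim is verified true and forfeited if it is false, so the expected cost of speaking falls as the claim's probability of being true rises.

```python
def expected_cost(bond: float, p_true: float) -> float:
    """Expected net cost of posting a bonded claim.

    The claimant recovers the bond with probability p_true
    (the claim is verified true) and forfeits it otherwise.
    """
    return bond * (1.0 - p_true)

# Illustrative numbers: a $100 bond.
honest_cost = expected_cost(bond=100.0, p_true=1.0)  # truthful claimant
liar_cost = expected_cost(bond=100.0, p_true=0.0)    # outright falsehood

print(honest_cost)  # 0.0  -- truth costs nothing in expectation
print(liar_cost)    # 100.0 -- lying forfeits the full bond
```

Under this toy model, honesty is free in expectation while falsehood carries the full cost of the bond, which is the sense in which the mechanism makes truth the cheaper strategy.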

The common denominator across all of Van Alstyne’s proposed solutions is that they must be decentralized, pressure bad actors economically, and scale. Summing up his points, Van Alstyne said: “An ‘honest ads market’ … forces liars to internalize their social costs or admit they are lying,” so that these bad actors have more incentive to do good than to do bad.

Van Alstyne’s solution to the fake news epidemic is a way of democratizing information dissemination, one that distances itself from both excessive government control and excessive private control. His most important point is that “we’ve lacked the institutions for dealing with externalities” and that “externalities cause market failures.”

In order to combat the onslaught of fake news and disinformation, society must combat the negative externalities that promote their amplification.