Date: 2/24/17

When you think about a propaganda army, what do you imagine? Personally, I picture a dark, windowless room with rows of government workers dressed in gray uniforms, folding pamphlets and rolling up giant posters in unison. It’s like a weird hybrid scene from the last Harry Potter book and every dystopian novel ever written. But today’s propaganda armies look a little different.

Imagine individuals hunkered over their laptops in their bedrooms and Internet cafés, never actually meeting. They post on social media sites, online forums, news websites you’ve never heard of – a distributed network of people, disseminating their ideology across the Internet. 

This week on Raw Data, we dive into the world of propaganda, fake news, and misinformation here in the U.S., as well as in China. The American constitutional right to free speech and a free press obviously differs from the situation in authoritarian China, but it turns out there are parallels between misleading political news in the Western world and what the Chinese government is doing to control the flow of information to its citizens. In both countries, social media feeds us an overwhelming amount of content, and it’s hard to know what’s true, as well as what information lies outside our filter bubbles. 

So what’s motivating the creators of fake news and propaganda? Why are we susceptible to believing it? And what are the dangers of having a misinformed public? 

***

The principle of a free press supports a “marketplace of ideas.” The concept is pretty simple: with an open forum for discussion, every idea can go out into the market on equal footing. The ideas that spread in public discourse are, in theory, the best ones, because they’ve withstood the test of public interest. This theory relies on people’s judgment to choose which ideas are worthy of adoption.

In today’s digital media market, however, algorithms are often making the value judgment of which ideas are worth spreading.

We don’t know the secret sauce behind Facebook’s News Feed algorithm, but it’s safe to assume that when people are sharing and engaging heavily with an article, the algorithm will place the story higher up on people’s feeds. That’s how we get clickbait – people try to game the algorithm so their content is propagated more widely on the web. And with fake news, the goal of reaching a huge audience doesn’t even have to be about spreading a particular ideology.
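To make those mechanics concrete, here’s a toy sketch of engagement-weighted ranking in Python. Facebook’s actual algorithm is proprietary, so the fields and weights below are invented purely for illustration:

```python
# Toy illustration of engagement-weighted feed ranking. Facebook's real
# News Feed algorithm is proprietary; the fields and weights here are
# invented for the example.

posts = [
    {"headline": "City council approves annual budget", "shares": 40, "clicks": 300},
    {"headline": "You won't BELIEVE what happened next", "shares": 900, "clicks": 12000},
]

def engagement_score(post):
    # Assume shares signal stronger engagement than clicks, so weight them more.
    return 5 * post["shares"] + post["clicks"]

# Rank purely by engagement: nothing here checks accuracy, so a
# shocking-but-false headline outranks a dull-but-true one.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["headline"])
```

The point of the sketch is what’s missing: truthfulness never enters the scoring function, so gaming it is just a matter of maximizing clicks and shares.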

“It's really about the economics,” said BuzzFeed media editor and fake news expert Craig Silverman. Craig led a popular investigation of around 140 fake U.S. election news sites operated out of a small town in Macedonia. He found that the Macedonians running the sites weren’t really invested in the outcome of America’s presidential election. They cared about ad revenue from Google. “It's really about the fact that they live in a place where a few hundred dollars U.S. a month is a pretty big thing.”

In many ways, we’re now living in a marketplace of misinformation. Shocking headlines are what people click on most often and what algorithms promote most, so newsmakers are incentivized to break stories that grab people’s attention – whether or not they’re 100% true.

But, as humans, we have a bias towards thinking that the things we read are true and should be believed.

“This is the scariest part of deception or propaganda,” says Jeff Hancock, a Stanford expert on deception and technology. “You're actually changing the perception of reality of the population.” 

Not every country has a free media market where economics can drive misinformation. In China, information is centrally controlled: the Chinese Communist Party holds a tight grip on the spectrum of ideas its citizens encounter.

Stanford Communication Professor Jennifer Pan shared her research on Chinese social media censorship with us. She said the government’s main goal is to avoid the risk of revolution by preventing citizens from using social media to organize protests or any form of collective action. If you never hear about any organized dissent or major challenges to state authority, you’re more inclined to go along with the status quo. 

Jen’s research shows that the Chinese Communist Party deletes social media posts that encourage collective action, and, what’s more, the government also strategically disseminates propaganda to drown out the moments when people might otherwise be taking to the streets.

This kind of censorship and misinformation can fly under the radar in the digital age, in part because there’s always a firehose of content to consume. It’s hard to tell what’s real and what’s fake, what’s trustworthy and what’s not, when it feels like you’re consuming information in an overcrowded marketplace of ideas.
 
“You wake up every morning, there's more than you can ever consume,” said Jen. “So why would you then seek out additional information?” 

Sound familiar? 

We were really struck by the ways in which our digital lifestyles enable the spread of misinformation across cultures. But we’re by no means saying the current situation in the U.S. is the same as in China – here, we still retain the democratic value of freedom of assembly, and social media helps people organize protests and collective action. However, we would all do well to view the information and news we see online with a more critical eye, and keep in mind the cocktail of political motivations, economic incentives, and invisible algorithmic decisions influencing our news feeds.

–Isha

Extra notes:
We highly recommend Craig Silverman's BuzzFeed article about the young purveyors of fake news in Macedonia.
Lots of great stuff has been written about Jennifer Pan's work – we like this one from Gizmodo.
Jeff Hancock has a TED talk, "The Future of Lying."
Also, our next episode will be about cryptocurrencies and the blockchain!

 


TRANSCRIPT

LESLIE CHANG: Hi everyone! We have a small announcement before we get started – our listener Chrysanthe tweeted at us a few weeks ago asking if we had transcripts available of our episodes. We do, and for easy access, we’ve uploaded all of them to our website, which is Raw Data Podcast DOT com. So if you’re looking for transcripts of our episodes, that’s where you can find them. Thank you to Chrysanthe for bringing that to our attention.

One other small thing to note: we have a language advisory for this episode. Someone says a bad word, so please be advised. Okay, here’s the show.

[Raw Data theme]

MICHAEL C. OSBORNE: Welcome to Raw Data. I’m Mike Osborne.

LC: And I’m Leslie Chang. Today’s episode: Propaganda Armies.

[theme continues]

MCO: As the title suggests, on today’s show we’ll be talking about modern day propaganda and the armies of people who create misinformation. But before we dive into it, we wanted to take a moment to test our truth-detection radars. And I think Leslie has a quiz for me?

LC: Yes I do. It's from the CBC, the Canadian Broadcasting Corporation. The quiz title is "Can You Spot the Fake News Headline?"

MCO: Okay.

LC: All you have to do is just listen and evaluate whether you think it's true or false.

MCO: So I just say "Fake" or "Real?"

LC: "Fake" or "Real," yeah.

MCO: Okay.

LC: Here's the first one. "Woman Murders College Roommate for Sending Too Many Candy Crush Requests."

MCO: Real.

LC: I like how serious you were when you said it. Wrong! This colorful story, which claimed a young woman beat her roommate to death with an industrial-sized bag of jellybeans, is not true.

MCO: Okay. It sounds reasonable, but okay.

LC: But it had ... It earned 438,599 Facebook engagements last year.

MCO: Wow, okay.

LC: Next question. "Canada Post Looks to Drones As Possible Future of Mail Delivery."

MCO: Real, got to be.

LC: Let's see. Yes, it is true.

MCO: That's such a good idea.

LC: Yeah, it's very Jeff Bezos.

MCO: Yeah, of course.

LC: Last one. "Florida Man Dies in Meth Lab Explosion After Lighting Farts on Fire."

MCO: No judgment against Florida, but I think that's real.

LC: Okay, let's see here. No, it was fake!

LC: [breath] Alright, I mean a lot of those stories are kinda funny or inconsequential, but fake news is a real and serious problem, because more and more it’s being used for political purposes. Ever since the 2016 American presidential election, a lot of people have been raising the alarm. The term “fake news” is getting used a lot, and that phrase may be imperfect, but whatever you want to call it – hoaxes, lies, alternative facts – at the end of the day, information that is demonstrably untrue is being presented as news and it’s spreading online like wildfire.

MCO: So today on the show, we wanted to look behind the scenes at the makers and mechanics – who are the people creating fake news, and what are their incentives? Craig Silverman was one of the first reporters to bring this issue to the surface shortly after Donald Trump was elected president.

CRAIG SILVERMAN: And I specialize in covering social and digital platforms, as well as online misinformation and fake news.

MCO: I'm sure most of our listeners are familiar with the story about the Macedonian teenagers creating fake news, but just so we get it, can you tell us the abbreviated version of what all happened, maybe starting with The Guardian piece and how you got turned on to it?

CS: Back in the late summer, early fall, I was talking to a really great researcher named Lawrence Alexander, who had previously done some great work looking at the Russian disinfo sphere online: LiveJournal blogs and websites and other things online that were part of disinformation efforts that could be traced back to Russia. As we were looking at the U.S. election at that time, we were interested to figure out, "Well, can we find any examples of similar types of websites that are focused on the U.S. election?"

LC: So Craig and Lawrence started looking around for websites outside of the US that focused on American politics. They found an unexpected cluster in Macedonia, and almost all of these websites were publishing articles and stories that were favorable towards Donald Trump. These posts were getting hundreds of thousands of comments, reactions, and shares on Facebook.

MCO: A few months earlier, The Guardian had also discovered a similar phenomenon, so Craig and Lawrence started to dig in more. They were able to confirm at least 140 US politics websites run out of the small Macedonian town of Veles, population about 45,000.

CS: As I looked at these sites more and more, from there, I realized that they were pushing information that was oftentimes misleading and sometimes just 100% false.

LC: The most popular articles from Macedonia included false stories you may have heard, like the one about the Pope endorsing Trump, or the one claiming Mike Pence said Michelle Obama was the most vulgar first lady we’ve ever had. Neither, of course, was actually true.

MCO: Craig reached out to the Macedonian site owners and talked to a bunch of them.

CS: One of the things that I found was they were often very young. Some of them were teenagers. They were all men that I spoke with. Some were in their 20s. What they said is that, basically, "This was just a good way to make money.”

LC: These enterprising young guys were making websites and getting them approved by Google AdSense, which places advertisements. Then, they would go on Facebook and drop links to their articles into partisan Facebook groups. Apparently they experimented with left-leaning content, especially stories that favored Bernie Sanders. But eventually they focused just on right-wing content, because nothing brought in the clicks and shares like pro-Trump articles. And the more views and shares they got, the more ad revenue they would rake in.

CS: Everyone that I interviewed and everyone who's ever been interviewed by The Guardian and by that Macedonian site, that's what they said: "We're just trying to find content that can earn us some money." Some of the people that I spoke to did say that, "Hey, I thought Trump would win," or some of them said, "Yeah, I kinda like Trump," but they said it was always a financial motive. And…it's really about the economics. It's really about the fact that they live in a place where a few hundred dollars U.S. a month is a pretty big thing.

MCO: While many Americans felt that the fate of the free world hung in the balance, these guys were just there to make a buck. As best as Craig was able to learn, they were not interested in swaying an election.

LC: The 2016 election was emotionally charged and there was a lot of noise in the news cycle. It should be noted that there were numerous instances during the campaign in which Trump’s advisors and Trump himself made statements and tweeted stories that were completely untrue.

MCO: But did fake news itself actually sway the election? As of now, we don’t know, and we’ll probably never know. Regardless, none of this would have been possible without the Facebook algorithm bumping this content to the top of people’s feeds. [pause] So, we were curious whether Craig was able to learn anything about the algorithm itself in the course of his reporting.

LC: We've been talking, Mike and myself, a little bit about Facebook's algorithm as almost making editorial decisions on behalf of their users. I'm wondering what you've uncovered?

MCO: What did you learn about the beast?

LC: Yeah.

CS: "The beast" is a good name for it. It absolutely is making editorial decisions and it absolutely has biases, as well. I think too often people think about an algorithm as something that operates completely without bias, compared to say a human editor, but it's programmed by humans first, of course. It's programmed to value certain things over other things, and that is bias.

And so, the first thing that's happening to game it is that people are writing really emotionally-driven headlines and using very striking thumbnail images, because in the news feed that's what you see first: You see the headline and you see the image. We're getting these really unhinged political headlines during the U.S. election because people who were strong Trump supporters or strong Clinton supporters would react to those.

LC: Facebook is an emotionally charged platform, and fake news creators used that to their advantage.

MCO: Now, when the first reports were coming out that misinformation was being spread on Facebook, the company’s reaction was dismissive. But recently, Facebook has taken a more active stance. They’ve made it easier for users to flag suspicious content, and they’ve also partnered with third-party fact-checking organizations like Politifact and Snopes. If at least two of the fact-checkers agree an article is misleading, Facebook will demote the story in the News Feed and add a label to the link.
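As described, the rule is essentially a threshold check. Here’s a minimal sketch of that logic – the function and label text below are our invention, not Facebook’s internal implementation, which isn’t public:

```python
# Hedged sketch of the reported two-fact-checker rule: if at least two
# partner fact-checkers rate a story misleading, demote and label it.
# Facebook's internal implementation is not public; this is illustrative.

def review_article(verdicts):
    # verdicts maps fact-checker name to rating, e.g. {"Snopes": "misleading"}
    disputed = sum(1 for rating in verdicts.values() if rating == "misleading")
    if disputed >= 2:
        return {"demote_in_feed": True, "label": "Disputed by third-party fact-checkers"}
    return {"demote_in_feed": False, "label": None}

print(review_article({"Politifact": "misleading", "Snopes": "misleading"}))
# {'demote_in_feed': True, 'label': 'Disputed by third-party fact-checkers'}
```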

LC: Facebook has gotten a lot of attention for propagating fake news, but the other tech giant that makes the spread of misinformation possible and profitable is Google and its advertising model. Ultimately, Google AdSense is the economic engine of the system. And it’s especially lucrative when the web traffic is coming from the US and other wealthy countries.

CS: So what we're seeing increasingly, and it's not just with politics, is that people in Eastern Europe and other parts of the world are creating English-language websites aimed at the United States, and Canada, and the U.K., because that's a more valuable audience for advertisers, and trying to get as much traffic as they can from them. For these folks, a few hundred or a few thousand dollars a month American is game-changing.

LC: Another important factor here is the fact that so many people now rely on social media as the starting point for finding their news. Pew Research and the Knight Foundation recently found that people are AS LIKELY to get news from social media as they are by going to actual news websites.

CS: I don't think it's an exaggeration to say Facebook's news feed, especially when it comes to information, is one of the most important things in the world right now. It determines what a lot of people encounter.

MCO: But it’s not just about encountering, it’s also about spreading and perpetuating that information – or misinformation. There may be cottage industries of fake news popping up in Eastern Europe, but that doesn’t totally get at the question of why certain stories go viral.

JEFF HANCOCK: Interestingly, it seems like with fake news, you have a person that's acting honestly, sharing something that's deceptive. Fake news was like that a lot of the time – I think people were sharing things that they really believed to be true and thought would be useful for others.

MCO: This is Jeff Hancock, a professor of Communication at Stanford. If you’re a longtime listener of our show, you may remember him from our season 1 episode, “All The News That’s Fit to Feed.”

JH: I liked that one.

LC: Oh, cool, thank you.

MCO: Jeff studies deception and trust in online environments – how do social media platforms or apps change how we lie to each other and trust one another?

LC: There’s an art to lying and deception. Maybe it’s tweaking small parts of a story, or maybe it’s just sharing information that’s consistent with popular assumptions. The lines between spin, deception, and manipulation are fuzzy territory. And online we are still getting used to a new set of norms.

JH: We've been talking a lot about a concept I thought you guys would like. It's epistemic vigilance. This is the notion that we're basically walking through life and information's really valuable for us. And at the same time, we have probably another process that's running that's basically saying, okay, is this information I'm getting truthful, valid? It's not like one or the other is better – information usually is good for us, a lot of benefit – but if we trust just blindly, then we're going to run into trouble.

LC: Think of epistemic vigilance as a process that’s always happening in your brain, like an app running in the background. Generally as humans our default setting is to trust people. But, over time and through experience, we learn that there’s a cost to bad information. We then adjust our attitudes about how we incorporate information from various sources.

JH: ...and now with the media's attention to this notion of fake news and trustworthiness, I think something's changed with our epistemic vigilance.

And unfortunately I think, we don't know what are the cues and signals to trust anymore. It feels like something's changed where well, man, if it was shared by 1200 people, that feels like it's something I should be able to trust, but I'm being told it's not.

MCO: It’s natural to think that people who grew up with the Internet – digital natives – would be better conditioned to evaluate what cues and signals to trust online. But, Jeff says that’s not the case. He pointed to a recent research paper from the Stanford School of Education.

JH: Here's this report. Basically says, hey, guess what, elementary, high school, and even early university kids have a very difficult time assessing the veracity and credibility of posts in social media. When I was in school, we had a course which was how do you believe something that you read. It was like, well, what's the source, and might they have a bias, and what kind of claims are they making ... right? We did that, didn't we? Okay, so maybe that's still happening, but the kids, us adults, old people – everyone was not applying it to things that occur in our online information environment. There's a good reason: it's really hard. Second reason: this is brand new; Facebook's news feed is maybe a decade old. How do we rethink teaching, educating people to basically employ epistemic vigilance in the social media age?

LC: Better education may sound like kind of an unsatisfying answer, but… it really could be part of the solution. In a society where we promote freedom of expression, we should get better at evaluating the information we see on the Internet.

MCO: But, of course, there are other places in the world where the Internet is NOT an open platform for anyone to express whatever they want. Places where the government censors its citizens and controls the flow of information. So what is the experience of the Internet in those places? How do people encounter and evaluate their online news?

LC: More on that after the break.

LC: Okay, back to the show.

MCO: The willingness of the Trump administration to downplay and dismiss verified facts has a lot of people alarmed. The problem of misinformation seems to be getting worse, and some journalists have gone so far as to draw comparisons with authoritarian countries.

LC: So Mike and I thought… why not take a look at how authoritarian regimes actually control information? Stanford Communication professor Jennifer Pan has done some very original work on this.

JENNIFER PAN: I'm interested in how non-democratic or authoritarian governments control the spread of information, especially in the digital age as more and more information is being recorded online.

LC: On our podcast we talk a lot about how our digital footprints reveal who we really are. In a way, Jen’s flipped the script. Her research turns the lens onto governments. [pause] What do the digital footprints of authoritarian regimes reveal about THEIR motivations, inner workings, and how they control the flow of information to their citizens?

MCO: Most of Jen’s current research is focused on China. The Chinese Communist Party controls everything from education to media content to what can be accessed on the Internet. And a lot of effort and resources are devoted to censorship.

JP: I would say that censorship in China is part of a broader effort to control the spread of information, ultimately to shape what people believe and what their preferences are. Those efforts are really multi-faceted and diverse, and it includes things like the education curriculum, which the propaganda department has control over from elementary school to college. It includes control of traditional media, for the most part, newspapers, television, radio. It's all state-controlled directly or indirectly. When it comes to the internet, people have heard of The Great Firewall, which is, if you're in China, there's certain websites like Twitter or Facebook, New York Times, that you can't access.

LC: The propaganda department shapes everyday life for Chinese citizens. And when it comes to censorship online, there’s more than just the Great Firewall. There’s also search filtering: when you go looking for something on a search engine, certain websites might not show up in your results. And there’s keyword filtering and blocking – say there’s a keyword that the Chinese government doesn’t want to show up on the Internet, like “banana milkshake.” I don’t know why they WOULD censor the words banana milkshake, but if you tried to make a post on social media with that phrase in it, you wouldn’t be able to submit it.
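For a sense of what that looks like mechanically, here’s a minimal sketch of keyword blocking at submission time. The real platforms’ blocklists and matching rules aren’t public, and the blocked phrase is just our joke example:

```python
# Minimal sketch of keyword filtering at post-submission time. Real
# blocklists and matching rules are not public; "banana milkshake" is
# just this episode's joke example.

BLOCKED_PHRASES = {"banana milkshake"}

def can_publish(post_text):
    text = post_text.lower()
    # Reject the post outright if it contains any blocked phrase.
    return not any(phrase in text for phrase in BLOCKED_PHRASES)

print(can_publish("Who wants a banana milkshake?"))   # False: blocked
print(can_publish("Nice weather in Beijing today"))   # True: allowed
```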

MCO: Jen and her colleagues did a study where they looked at a different kind of censorship called content filtering – so stuff that’s already appeared or been posted online, but then gets taken down, usually within a day. Jen actually kind of stumbled into this research project.

JP: Initially, it was not intended to be a study of censorship in China. We were just trying to collect data to develop an algorithm for automated text analysis in Chinese.

MCO: The dataset they were pulling together included a bunch of Chinese social media posts. Later, when they went back to the live websites, they discovered that some of the posts had gone missing. At that point, they realized that the missing content actually represented an incredibly valuable dataset itself. They now had a record of what the government had removed.
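The logic of that method is simple enough to sketch: compare what you archived against what’s still live, and the difference is the censored set. The data below is simulated, not the researchers’ actual pipeline:

```python
# Sketch of detecting content filtering by re-checking archived posts.
# The posts and the "live site" are simulated; this is not the study's code.

archived_posts = {
    "post1": "complaint about local bus fares",
    "post2": "call to gather outside the factory gates",
    "post3": "photo of lunch",
}

# Simulate revisiting the platform later: post2 no longer exists.
still_live = {"post1", "post3"}

def find_censored(archived, live_ids):
    # Anything archived earlier that has since vanished from the live
    # platform becomes the record of what was removed.
    return {pid: text for pid, text in archived.items() if pid not in live_ids}

print(find_censored(archived_posts, still_live))
# {'post2': 'call to gather outside the factory gates'}
```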

LC: So, they decided to do a little experiment to try and prod the censorship machine. First, the team set up a bunch of shill accounts on various Chinese social media platforms, including Sina Weibo and Tencent Weibo, which are like Chinese Twitter. They posted hundreds of messages to see what would get taken down, and what would be left up.

JP: When we looked and analyzed that data, what we found is that criticisms of the regime, its leaders, its policies were allowed and not censored. What was censored was, instead, discussions of on-the-ground protests and collective action.

LC: This was a surprising discovery. Contrary to common wisdom, the Chinese government seemed to be allowing a certain amount of free speech. What the Chinese government is NOT cool with is people congregating or taking to the streets in protest.

MCO: But Jen and her colleagues wanted to have more confidence in this finding, so they decided to go one step further. They developed their own social media site from scratch. This fake social media site never actually went live. They just wanted to figure out what the rules were for building a social media company inside China’s borders. While developing their site, they spoke to Chinese firms that install censoring technologies for a number of websites. They chatted with IT support, compliance officers, and so on.

JP: And so there was a lot of guidance for us on how should we censor if you are running a social media site in China.

LC: Is that engineering support and guidelines, does that come from the main propaganda office in China? Who's coming up with those?

JP: We were just talking to the commercial vendor, and they were providing guidelines.

LC: Which the government had given them, or...?

JP: That we're not sure about. I think what is interesting is that, if you create a website, you're not automatically given guidelines. There is some ambiguity, especially for smaller sites.

...Whereas it seems that some of the larger sites or larger internet content providers, like Tencent or Sina, probably have much closer communications directly with the government.

MCO: And Internet companies in China are probably going to play it safe, and censor stuff they think the government wouldn’t like – if they want to do business, they gotta follow what the party wants.

LC: Jen’s team basically reverse-engineered the way censorship happens in China. These experiments confirmed what the researchers had found before.

JP: I should be clear that it's about any sort of organized action outside of the control of the government. They do not have to be protesting the government. They do not have to be protesting in some way that is critical of the regime. They could be just a group of people who, for example, want to advocate for some new policy or new program. In that sense, they are not antithetical at all to the government. They're instead trying to work with the government, but it's any on-the-ground collective behavior. If that gets discussed a lot online, all discussions of that are deleted, regardless of whether the content is critical or supportive of the regime.

LC: And, if you think about it, the government’s strategy has a certain logic to it. If people can write posts criticizing their government, average Chinese citizens will feel like they have some right to have their voices heard. Jen told us it’s conceivable that the government even welcomes some criticism, because it can give them insights into the frustrations of the people. Still, their tolerance for free speech stops when it comes to taking collective action.

JP: In terms of what's really motivating this, when we think about authoritarian regimes, there are two threats. One is from elites, so that there's a coup, and the second is from the public, that there is a revolution. With new media and the internet, what that's facilitated is anyone can now be a broadcaster. Theoretically, we might think that, if anyone can be a broadcaster, that means there's more common knowledge of grievances, so I know that you don't like the regime, and you know that I don't like the regime. We're more able to coordinate and revolt. What we see in the China case is that there have been existential threats to CCP rule; perhaps the last one was Tiananmen Square. Another was the Falun Gong protests in the '90s. In both cases, there weren't just protests – there were protests that were coordinated across regions and across different types of people. What the censorship and propaganda effort in China seems to be aimed at is preventing the snowballing of these protests across geographies, across different types of people.

LC: So far we’ve been talking about censorship, but the control of information is about more than just taking down people’s posts and subtracting content.

MCO: The Chinese government is also actively involved in disseminating propaganda. One of the ways they do it online is through the infamous 50 Cent Party. According to popular belief, this is a propaganda army – people who are paid small amounts by the Chinese Communist Party to write pro-government posts.

JP: And the conventional wisdom is that what they're doing is to argue against critics of the regime. The word I think we use is antidisestablishmentarianism, if we want to use it. If there's someone online who's criticizing the Chinese government and its policies, then these are the people who are criticizing those critics to say that you're being unpatriotic. You are betraying your country.

LC: But this was mostly guesswork – no one outside of the Chinese government actually knew what the 50 Cent Party was... until a Chinese blogger leaked a trove of emails. These emails were sent in 2013 and 2014, and Jen and her team analyzed a subset of these communications between a county-level propaganda department and members of the 50 Cent Party.

MCO: Jen told us that it was a fairly meticulous process to organize these emails, and their methodology gets a little involved. But at the end of the day, they were able to learn several things. First, it turns out the people who are part of the 50 Cent Party are actually government employees.

JP: Especially in a single-party regime like China, the party is everywhere, and it's pervasive. It's embedded in government, and it goes down from the center to the province to the prefecture, to the county, to the township, to the village and neighborhood levels. There are already people in place at all these levels, and when we look at the astroturfing, it’s being done by these government employees.

MCO: Another of Jen’s findings is that, contrary to popular belief, the 50 Cent Party is not actually arguing with anti-government people. They mostly make what Jen calls cheerleading posts – these are positive, motivational, aspirational sentiments. And Jen says the timing of when the cheerleading happens strongly suggests a strategy of misdirection. They want to drown out discussion of collective action.

JP: It's not the case that there's a constant level of these 50 Cent posts that are being made. They're happening in much more coordinated time periods and instances. We see two examples where there was collective action, and these protests were reported nationally. Right after these protests happen, we see a burst of discussion of cheerleading, so nothing addressing or arguing or engaging with the controversy of the protest, instead putting out all sorts of positive content on everything else that you could think of.

So when we think about the strategy, and this is combining with what we found in the censorship papers, so much of this seems aimed at distracting the public from key issues of the day.

LC: Misdirection and distraction are not new tactics for governments that want to advance their agenda. But the nature of social media makes it an especially effective tool for that strategy.

JP: And I think that's because, when you're on social media in China, there's a huge amount of volume of content, so you wake up every morning. There's more than you can ever consume, so why would you then seek out additional information? You might very well perceive there to be plenty of information and no need to try to obtain additional information.

LC: In the course of our conversation with Jen, it really struck us that even though we were talking about China, a lot of this sounded familiar. Everything we’re experiencing in the US with fake news and algorithms determining what information we see – it’s actually not so different. In both countries, social media gives us an overwhelming amount of content, and it’s easy to be distracted. In this environment, it’s also hard to know what information we’re NOT seeing.

MCO: Jeff Hancock, who we heard from earlier, has been doing a lot of soul searching on this. Before the 2016 election, a major conclusion of his research was that the online environment actually doesn’t change how much people deceive each other. But when we talked to Jeff, it sounded to us like he’s now questioning that conclusion.

JH: And I think that I misjudged the problems that deception from politicians would cause. I thought of Trump, and I still do, as a kind of bullshitter, in which he's content to say things without regard for the truth. Which is different than saying he's a liar; I think those are slightly different things. I think Trump says things and it doesn't matter where he got the facts or whether it's true, it's whether it fits his message.

I think that's relatively accurate, but I think it underestimates the importance of that, in that authoritarian leaders get their power by shaping perceptions of reality.

MCO: A name that’s coming up in a lot of conversations lately, including our conversation with Jeff, is Hannah Arendt. She was a political theorist, famous for analyzing totalitarian states and the way they use propaganda and misinformation.

JH: So her point is that these lies are constructing, in a very real and powerful way, the perception of reality for the population. That's when it becomes scary. I think an example is that there are two to three million fraudulent votes. Actually now he's saying three to five million fraudulent votes, and there's no evidence of that. But for a decent and reasonable majority of people in the US, their president has said that there were three to five fraudulent votes-

MCO: Million votes.

JH: Million votes, and that is a very reasonable conclusion to come to if you're an everyday citizen and you're not, like, obsessed with finding out the truth of things or studying deception. That's where the scary power comes from, of using deception at the level of a White House political leader. Her analyses were of course of totalitarian regimes, authoritarian regimes, but she says this is the most scary part of deception or propaganda: you're actually changing the perception of reality of the population.

MCO: There’s no question that we’re in uncharted territory here. The path from political spin, to manipulating the facts, to outright lies can be a slippery slope. But Leslie and I want to be careful about drawing too many comparisons between our experiences in the US and the experiences in China. For starters, the Chinese system does not have a tradition of a free and independent press. However polarized America is right now, the freedom of information remains a shared principle.

LC: And even more important – in the US, we still have the freedom of assembly. Jennifer Pan’s research shows that in China, the government goes to great lengths to block protests and suppress collective action. But here, social media is extremely helpful for organizing political opposition and mobilizing people.

MCO: Still, the problem of fake news in the US is just one part of a much bigger issue: the erosion of trust across society. It’s extremely tempting to point the finger at technology and the forces of disruption for chipping away at our trust in institutions. At the same time, there are other ways in which we clearly trust the digital environment. So if we want to begin to lay a new foundation of trust, and harness the power of technology, maybe a good place to start is with the oldest form of social technology.

LC: Money…

[cash register sound effect]

[Fade up music: “Escaletor,” by DOC]

NATHANIEL POPPER: Bitcoin is, fundamentally, a kind of social technology, a social network.

CHRIS BURNISKE: What we have now with blockchain architecture is the digitization of that trust and the decentralization of that trust. That’s where the power of the revolution starts to become overwhelming.

LC: On the next episode of Raw Data, we bring it back to the blockchain... and Bitcoin.

LC: Thank you for listening to our show. If you like what we do, please share the podcast with your family and friends, your cool co-workers, your neighbors, your dogsitter, your social media followers. Every mention and share helps, and we really appreciate it.

MCO: Our podcast is produced by Leslie Chang and me, Mike Osborne. Our amazing intern Isha Salian is the Queen of GIFs. Thanks also to Allison Berke, Jackson Roach, and the rest of the Worldview team.

LC: Also, a special shoutout to my friend Eddie Hu for taking the time to talk with me about what it’s like to actually experience censorship in China.

MCO: We’ll provide a transcript of this episode on our website, and we’ll also post links to Craig Silverman’s articles on BuzzFeed about the Macedonian fake news creators. Our website is www DOT raw data podcast DOT com.

LC: Our show is a production of Worldview Stanford, and we receive additional support from the Stanford Cyber Initiative, whose mission is to produce research and frame debates on the future of cyber-social systems.

MCO: Again, please get in touch if you have any feedback or ideas for us. We’re on Twitter @rawdatapodcast, or you can email us – raw data podcast AT gmail DOT com. Thank you so much for listening, and we’ll be back soon TALKING ABOUT BLOCKCHAIN AND BITCOIN. It’s gonna be good.