Every day, people are bombarded by fake news, foreign propaganda, consumer scams, and other kinds of deliberate disinformation that make it hard to sort out fact from fiction. Alex Cole has been on the front lines of information wars around the world. He is the vice president of external affairs at Internews, an international nonprofit that believes everyone deserves trustworthy news and information. Internews has 30 offices worldwide and supports independent media in 100 countries, reaching millions of people.

 

At the global nonprofit IREX, he helped organizations in Ukraine and other countries fight back against the onslaught of disinformation from the Putin regime in Russia. Doug Hattaway sat down with Alex to talk about real solutions and practical steps organizations can take to fight one of the most important challenges facing the fields of communications and democracy today.

 

Doug Hattaway: Alex, let’s start with a story. What’s an example of a disinformation challenge that was met with a real-solution approach that made a difference, in your view?

 

Alex Cole: So let me give you two stories. I’m going to start with a historical one, Germany and the printing press. And then I’m going to fast-forward to WhatsApp in Zimbabwe. I’m going to start in Germany because I think disinformation has been around for a long time, and we continue to not learn the lessons of how to deal with it in the modern era. So, [in] 1400s Germany, [the] printing press was invented. What’s the book that’s printed the most?

 

Doug: The Bible.

 

Alex: Right. What’s the book that’s printed the second most, or competing with the Bible?

 

Doug: The dictionary?

 

Alex: The Hammer of Witches. It was about witchcraft and how women should be executed.

 

Doug: Whoa.

 

Alex: Right, and when this piece of terrible disinformation comes out, it ushers in two centuries of killing women, drowning them, burning them at the stake. The Vatican believed in witchcraft, but they thought this little piece of disinformation went too far, and they condemned it. So what happens after they condemn it? It starts selling like hotcakes. The demand for this book picks up.

 

This is how most people think of disinformation in the modern era. Somebody produces something that’s inaccurate or hate speech, and the first thing that one does to counter [it] is to condemn it, but usually that draws more attention to it and can amplify it, making the problem worse.

 

I’ll fast-forward to Zimbabwe and WhatsApp. [During] the COVID crisis you’ve got disinformation swimming everywhere, on WhatsApp in particular. [WhatsApp uses] end-to-end encryption. It’s very hard to interfere with mis- and disinformation [because] you can’t remove the content [from the platform]. So how do you deal with that? Do you try to condemn it? No, that just makes the problem worse.

 

We were working with a small media outlet that’s actually a WhatsApp-based outlet. They have 30,000 subscribers, all on WhatsApp. We did an experiment: rather than countering all these crazy rumors running around the community about COVID, what would happen if you just sent out packets of information each week on the general best practices for social distancing? And guess what, the group that got the “good information” [was] 30% more likely to follow social distancing practices. The lesson is to be more assertive with your message versus going after the disinformation and saying this thing is wrong.

 

Doug: Your example reminds me of scientific experiments on rapid response, which found that people who heard disinformation and then heard somebody trying to counter it were more likely to remember the original disinformation, not the attempt to correct it.

 

Alex: Exactly, and that’s the lesson that everybody keeps not learning. Since 2016 in America, right after [the] disinformation in the election, it’s like, oh, disinformation is this new problem. Good information and bad information [have] competed with each other since the dawn of time. This isn’t new. The toolkit for addressing it is really the tried-and-true practices that I think a lot of the world’s best communicators know.

 

Doug: This brings up an article you wrote for The Chronicle of Philanthropy that pointed to the need to “equip citizens to distinguish falsehoods from the truth.” Is that different from what we were just talking about here? How should people be thinking about that?

 

Alex: With information, you’ve got the production side, and you have the distribution side, which is the channel you are accessing it on—social media, TV. Then [there’s the] consumption side, [where we ask] how is the person on the receiving end of that information processing it in their head? On the consumption side, many citizens do not have the basic skills for telling the difference between what is true and what is false.

 

That is different now from the era of the printing press. We have a more complicated media environment with a billion websites. It’s probably more than a billion now, [so there are] so many different ways to access information, and more reasons you have to be a savvier consumer.

 

[There are many] different ways to be tricked, so you actually have to teach people, “Here’s what you’re interacting with, what you’re seeing on social media, and here’s how to critically assess whether something is true or false.”

 

And this is something that we really don’t do in the United States—there is no national media literacy curriculum. Even at the state level, it just doesn’t exist. And in the countries that teach critical [media] skills, specifically education on media consumption, the populations are much more immune to disinformation campaigns. In Finland and in France, for instance, media literacy classes are [a] standard part of the high school curriculum. Almost every modern-day election now suffers from disinformation, [and in] those particular countries it’s had the least impact on voting behavior, because their citizens just know how to look for the good stuff and ignore the bad stuff.

 

Doug: So what’s an example of something you learn in a media literacy class?

 

Alex: I would say one of the more interesting things, and this is something that your listeners could actually use: if you’re trying to make somebody a savvier media consumer, you teach them the basic skills, like cross-checking. You hear it in this outlet, so you cross-check it in another source. This is probably the most important, basic media literacy skill. But the question I usually get is, is that really going to help a person who’s deep in the conspiracy hole, or who’s just watching one news channel, sitting there passively taking in all this information? How do you get to that person?

 

There is a methodology for that. I think of it in the nutrition sense: ask people questions. What we do in our media literacy training is ask people questions about their media and information consumption diet. It’s like a food diet. You wouldn’t only eat ice cream. You need carbs and protein. You need all these things. [Question] whoever you’re trying to train about their sources of information.

 

Almost always, this conversation goes, “Oh yeah, I really only watch that one show. Oh yeah, maybe I should watch some others”—and [then you] just have a conversation with them about it.

 

You’re going to make them feel smarter by encouraging them to diversify their media diet. The [mistake] that people always make is calling their uncle an idiot, telling him he’s watching just that one, wrong news channel and should turn it off. You’re not going to convince somebody to change by calling them stupid or by attacking the channel itself. It’s very, very challenging to actually tell somebody a channel is [bad], especially if it’s one they’re consuming already, because they don’t want to admit that they’ve made a mistake in tuning in [to] it. You just want to encourage this person to diversify their diet.

 

Doug: I’ve wondered about that when thinking about [how to] approach people—pointing out mistakes and errors and [saying] you were fooled into believing something. It really puts people on the defensive rather than opening their eyes, and your approach is opening their eyes [while] giving them a sense of agency. That’s interesting.

 

Alex: Exactly, this is the communicator’s toolkit. Whether you’re coming at this as a journalist who’s trying to present the facts, or you’re coming at this as an advocate who’s actually trying to use a message to move somebody somewhere else, you’ve got to give your audience agency. You can’t alienate [that] audience if you want to move them or educate them.

 

Doug: So some of your advice for the advocacy organizations [reading] this conversation [includes] asking what channels their audiences are using, making sure you understand how they’re consuming media, and opening their eyes to the options if they’re subject to disinformation.

 

Alex: Yeah, and the first recommendation I would have for advocacy organizations is to take a really critical look at what kind of disinformation is in your space. I’ve seen advocacy groups who are in a space where [they’re] just overwhelmed by disinformation, and then [there are] others where there might only be a tiny bit, but they’re overreacting to it. Those are the ones who should ignore [the disinformation]. Focus on your message. So you need to be real with yourselves and ask, “How big of a disinformation problem [do] we really have?”

 

Doug: Right, don’t make a mountain out of a molehill, per your first point.

 

Alex: Absolutely, and then the other recommendation I have, which comes from working with journalists all around the world, is that journalists don’t want to be fooled either, and they don’t want to be part of the disinformation problem.

 

They have a challenging job because they’re running from issue to issue, and they’re not experts on everything, so they don’t always know when they might be inadvertently repeating a piece of disinformation or misinformation. And I think one of the best things advocacy groups can do, rather than just pitching the journalist on what to cover, is to come together with that journalist, agree on the objective language you want to use in your space, and coordinate so you don’t fall into the disinformation traps. I find that journalists are very receptive to that.

 

You have to be careful in that conversation and be authentic and real. You can’t try to push your own agenda too hard or you will alienate [the journalist]. But you can really help to mitigate the disinformation in your space simply by educating journalists about how to manage it.

 

Doug: So I’ve heard: educate your audience about media literacy and the options they have. Educate your journalists. What about social media, where a lot of people are getting information? What’s the tack to take there?

 

Alex: I think a lot of social media companies, to their credit, have experimented with “real-time intervention.” That’s the problem with social media. With Twitter, the misinformation goes away: you see it in the feed and then it’s gone. There’s no chance to rebut something or even catch that person again. People need to see that something is inaccurate in real time, or see the accurate message in real time.

 

Google and YouTube [have] been particularly effective at this. When there is a video that might have some misinformation, sometimes they take it down. Sometimes it’s not appropriate for them to take it down, but they’ll flag it so that, in real time, the consumer knows, “Oh, I should be critical about what I’m hearing in this.” So the platform can help tag those things in real time, or it can feed up the alternative point of view in another video: oh, you heard the conspiracy theorist—here’s Dr. Fauci coming up right after.

 

So I think that’s one of the most effective things on social media. When it comes to citizen engagement, you have to actually train people to understand how social media works. Most people don’t know that there’s this algorithm feeding stuff up based on their own habits and trying to addict them. And when you point that out to them—this is what we do in our trainings—they’re disturbed by it, because they don’t want to be tricked and fooled. So just raising consciousness about the idea that social media is trying to trick you puts people on the defensive. Once you’ve got them in that kind of skeptical mode, it’s kind of natural for people to go and fact-check things; it’s more about motivating them to do it.

 

Doug: Yeah, it’s reminding me of the brain research about awareness being the first step to disrupting habits. A lot of the concern and conversation about disinformation is about threats to democracy. I know you’ve done a lot of work in that world, and we’ve talked in the past about combating disinformation around elections and politics, and the lessons Americans can learn from other places that have been contending with the issue for a long time. So, knowing you can’t be too specific in some cases, what are the lessons learned around the world that we should be putting into practice here to protect our own democracy?

 

Alex: The crucible of disinformation [recently] has been Ukraine. It’s been the Kremlin’s disinformation testing ground for quite some time. And I think a lot of the interventions and things I was just talking about were really pioneered in Ukraine in response to Russian disinformation. Some of the best media literacy courses were forged in Ukraine in the mid-2010s, after the 2014 invasion of Crimea, because there was this massive flood [of disinformation], of this hybrid warfare, and we needed to rapidly train people on media literacy skills.

 

You can integrate media literacy into schools and colleges, but what about the adult population? That’s a whole lot of people, and we couldn’t leave them behind. One of the things that happened in Ukraine that is often not done elsewhere: we found the community influencers, trained them on media literacy, and then had them go and do trainings in their communities. We were training the trainers.

 

So we trained the librarians, and then they went to their library patrons, the senior citizens in their communities. We found different influencers. People care about their communities and don’t want them to be fooled. We had tens of thousands of people participating in this [in Ukraine]. So we shouldn’t forget the adults here in the United States. There is still opportunity. They too can be trained on media literacy.

 

The other [point] about disinformation is that we can’t take our media for granted. That is the single most important antidote to disinformation. Journalism is dying; thousands of news outlets are closing every single year all around the world. We need to support them here in the United States. We now have 200 news deserts, counties that don’t have newspapers at all, and that trend is global.

 

There’s the big-picture disinformation that infects national elections, but then there’s the local disinformation, the rumors that spread, and if you don’t have a newspaper there to counter them or to present the accurate facts about the school committee or whatever’s happening, that’s disinformation at the local level that’s hurting democracy.

 

Local is where it all starts. We can’t forget about investing in local journalism. One thing I always tell people when they ask how they can help: subscribe to your local paper, or give to NPR. I don’t mean to favor any particular media outlet here, but give to somebody, particularly the local outlets, because they are the ones going out of business right and left. They need support. They need the help of philanthropy as well. All around the world, media outlets survive on volunteers and philanthropy—in Africa, 60% run on philanthropy or volunteers, and only 40% can survive on ad revenue. So the lesson from the rest of the world is: we’re losing our news media, so we have to hold the line.

 

Doug: We have to get proactive about educating our communities. That’s what I’m hearing here—[there] is a need and [an] opportunity for organizations that serve some constituency. It’s a good investment to do media literacy work with that constituency, because it could help you forestall and address disinformation problems.

 

Alex: Absolutely. I know you work with a lot of community foundations, Doug. I think that if you are a community foundation or a local philanthropist, no matter what issue you’re trying to advance, you need a healthy information ecosystem in your community. You need good, high-quality journalists who are going to cover whatever advocacy campaign you’re running, or whatever quality information you’re trying to educate your citizenry on.

 

Community foundations should be adopting their local papers, which are going to go out of business if they don’t. You can try to push more ad revenue into these news outlets, but fundamentally, the big-tech players have just sucked up all of the advertising, and frankly [I] don’t really think there’s a way out of this without philanthropy.

 

Doug: Absolutely. Investing in the foundation of good information. Are there other areas for investment beyond media literacy and the news media itself?

 

Alex: On the business side, if you’re a philanthropist, you could also help to support tech innovations that are trying to drive money back into the news media. For instance, we have an initiative we just started called Ads for News, and the concept is providing lists of local media outlets to advertisers on a mass scale. This gets at one of the reasons all these news outlets are going out of business: they’re too small for large advertisers to plow money into.

 

The advertiser would rather just send all their money to Google and instantly reach people. If you can give them a list of 10,000 URLs of local media outlets and they can just advertise in them en masse, then they have a reason to remodel local news. It takes philanthropy to put that whole system together. You need nonprofits who are creating these lists. The advertisers are not going to just do this out of the goodness of their hearts.

 

So [there are] those sorts of initiatives for the philanthropist who likes to take the business-intervention approach. [There are] some more interesting things you can do there, beyond just adopting your local paper.

 

Doug: That’s interesting. So that’s an application of long-tail theory, the idea that got Amazon started: recognize that these outlets that seem small can, when you add them up, reach a huge audience. So what developments in the disinformation space are giving you hope? What innovations are coming that give you some hope about contending with this?

 

Alex: This is a hope and a fear: generative AI. I think this is going to be helpful for dealing with the disinformation problem in that, in many ways, it’s going to make the job of newsrooms easier. They’re going to be able to produce routine written stories much faster, because you can have a machine do that, and it frees them up to do higher-order journalism: fact-checking, investigative reporting, stories that can really get in depth, counter disinformation, and hold power to account.

 

So that’s the hopeful part. The scary part of generative AI is that it’s the next wave of disinformation. This is not new to the world. What worries me particularly is the video side. If you can make videos so sophisticated that people can’t tell them from a real video, we’ve got a problem. Just think of George Floyd. People’s eyes were opened up, including people who were not particularly inclined towards racial justice, because you couldn’t watch that nine-minute video and not say, we have a problem here.

 

We trust video right now because it’s very hard to fake. You couldn’t have faked that, right? You would have figured it out. When we enter the world where you could fake that, I worry more that people are going to just stop trusting everything and be able to say, oh, no, that didn’t really happen, versus the proactive side of disinformation where they’re trying to trick you. I’m actually more worried about people discounting the facts because they think that facts can be manufactured.

 

Doug: Because you can’t believe your eyes anymore.

 

Alex: Yes. I’m not a technologist, but I have heard that watermarking generated videos is a real thing and that it can be done fairly effectively. So I do think there is an actual technical solution to that. And you would combine that with media literacy skills, so people know: oh, that’s what a watermark is, I know that’s fake; oh, this is how I authenticate a video. The technology alone won’t do it; you also need a citizen education component to make it work.

 

Doug: Yeah. Well, Alex, as a Hattaway Communications alum you know we like to end on an aspirational note, and there’s a lot to worry about here, but you have given us some very practical insights and ideas for contending with an issue that’s very much on our [audiences’] minds. So thank you very much.

 

Alex: Thank you, Doug.