How “Fake News” and Disinformation Can Affect Your Cause – And How to Counter It

Anusha Alikhan

Wikimedia Foundation

Alex Cole

IREX

Brian Wesolowski

Center for Democracy and Technology

Kelly Born

Hewlett Foundation


Key Takeaways:

  1. Making your audience immune: How to inoculate your audience from disinformation by honing critical information consumption skills.

  2. Influencing the storytellers: Ways to help journalists tell the story more fairly.

  3. Filtering out the junk: Approaches to removing or marginalizing disinformation on platforms like Facebook.

Slide Deck:

Breakout Notes:

Background on Disinformation

Disinformation is often thought of in an electoral context. Stat: 50% of tweets relating to the 2016 election included false information of some sort.

The 2016 election put fake news & disinformation on the map, but it's prevalent everywhere from education to healthcare to climate change to poverty.

Example of disinformation: Foundation staffer heard rural resident say: “I’m not a racist but don’t appreciate Black Lives Matter people burning down cities like ours.”

3 Big Intervention Points

  1. Production- Influencing the storytellers

  2. Distribution-  Filtering out the junk

  3. Consumption- Making your audience immune

A Case Study in Information

Hewlett’s Madison initiative– Nonpartisan effort to improve US democracy.

Goals: Uphold guardrails of democracy; support the institution of Congress; improve campaigns & elections; combat digital disinformation.

Production– Great, important work to create stronger content, but not enough on its own.

Consumption– Some citizens better able to make sense of info with fact-checking, smarter, more diverse consumption, but also not enough especially in polarized societies.

Distribution– Where and how the content shows up. This is where Hewlett focuses.

Technology, social media in particular, now makes it easy to artificially amplify ideas.

All this makes it hard for us to have a common conversation. Why is social media a different kind of platform?

  • No gatekeepers (quality issues)

  • Fragmentation & overload makes it hard to process the volume of content and understand the issues.

  • Personalization & microtargeting- Can now target inflammatory content that exploits known vulnerabilities to very specific audiences.

  • Bots and artificial amplification

Platforms not incentivized to solve the issue: Science says people want news that reinforces their ideology, is emotionally engaging (ex: outrage induces engagement).

Hewlett’s approach is to focus its $10M research fund on eventually influencing the handful of platform companies, rather than a more diffuse strategy aimed at all newsrooms or the public as a whole.

3 Main Ways We Address Disinformation

  • Fact check- Day late, dollar short though.

  • Reframe- Effective, direct, (usually) reactive though

  • Improve critical info consumption- Equip people with skills to immunize themselves from disinformation

How do you encourage behavior change to make people less susceptible to disinformation?

Media literacy sometimes hasn’t challenged people to change their media consumption habits. That can change:

  • Encourage critical self reflection of consumption.

  • Call out the liar not necessarily the lies – expose the agenda. May be advantageous to use the terms lies, false news, disinformation rather than the fraught “fake news” which can inflame the issue.

  • Empower audience to seek new sources of information. Make them feel like they have choice. (Ex PSA: liken it to grocery shopping to stock up for a healthy diet)

Does it Work?

Media literacy classes can help change knowledge, skills, and behavior.

Distribution: Role of Online platforms & content moderation

Online Platforms and Disinformation

What are the platforms doing?

  • Filtering illegal content

  • Enhancing user-control features (Ex: mute buttons on Twitter, Facebook)

  • Empowering communities to moderate (Ex: OkCupid helping people ID abuse on the site)

  • Building reporting mechanisms

  • Suspending fake accounts

Platforms make their own rules – know the terms of service!

Report, flag disinformation. Platforms more responsive than ever.

Audience exercise: Should removal of content that violates terms of service be automated? Examples of disinformation on social media showed it’s very tricky even for humans to make these decisions given the subtleties of language use and grey areas, and harder still for algorithms.

Privacy considerations and not breaking encryption are important: think of the protester, or the journalist writing pieces their government disapproves of. A site backdoor that leaves people vulnerable to abuse by political actors, law enforcement, etc. is not the preferred solution.

There is no single answer to the issue; humans still have advantages over machines as the best problem-solvers.

Challenges for Journalists (and everyone else)

  • Info is easier to manipulate than ever

  • People are living in news echo chambers

  • Trust in media at an all-time low.

  • Partisan divide is the widest in 25 years.

Knight Foundation Media and Democracy Initiative: Another Case Study

There is a multi-layered approach to helping journalists solve the problem of misinformation:

  • Funding- grants small and big, including to new innovators

  • Convening- Connecting news influencers across backgrounds – Philanthropies have opportunity to push their convening power here.

  • Research- Helping people understand problem, and shape solutions

Funding examples:

Innovation- Ex: Cortico, a tool to help reporters surface stories that may be unheard.

Engagement- Ex: Report for America, Emerging journalists embedded in newsrooms across country.

Local News- Ex: NewsMatch initiative looks at matching gifts to local news. Venture funding for philanthropy to support local journalism.

Convening examples:

Knight Commission on Trust, Media and Democracy: Foundations can leverage ability to bring people together across divides.

Research example:

Knight-Gallup research series that explores American perceptions of the media and issues including misinformation.

Lessons for Communicators:

  • Make fighting misinfo part of your comms strategy

  • Partner with journalists to promote media literacy and trust in news

  • Use convening power to connect people

  • Educate networks through accessible research

  • Help your org respond to the pace of tech. And create campaigns that attract ideas from new innovators.

Audience Q&A:

Q: How do you balance ignoring vs. confronting the liars, not wanting to elevate disinformation spreaders but not wanting to ignore them either?

A: Ask who’s watching the source. If it’s influential, attack; if not, ignore.

Q: How do you deal with philanthropy-funded front groups propping up bad-actor industry players? How do we begin to talk about this issue in philanthropy?

A: The data gap is a problem; there is no dashboard to track disinformation. We need a field resource that gives orgs and people a unified place to turn for data to track the issue and understand the sources, relationships, and money behind disinformation campaigns.

Q: The issue feels less like fighting back to disprove disinformation than disrupting it somehow. How best to fight back against effective persuasion techniques like fake news?

A: Extreme content gets engagement, which brings ad revenue, which incentivizes platforms not to intervene, which leads to further polarization. We need to influence the platform operators and algorithm designers to take responsibility for better addressing the issue when they design interventions.

These notes were captured by The Communications Network and have been reviewed by the presenters. ComNet18 Breakout Session notes were made possible thanks to the generous support of the Kalliopeia Foundation.
