Getting it right: Strategies for truth-telling in a time of misinformation and polarization
- Journalists need a new set of skills and strategies to operate in an information ecosystem infused with misinformation, to fend off attacks on their work as biased or “fake” and to reach polarized audiences.
- There are a number of strategies for reporting on falsehoods without amplifying them. One is the “truth sandwich,” which involves stating the truth, then the falsehood, then the truth again. Journalists also must know their “tipping point” — the point at which a story about false information becomes too big to ignore.
- Journalists can respond to attacks on their credibility by being transparent in their reporting, quickly acknowledging and correcting mistakes, and avoiding a “war footing” with antagonistic public figures.
- Reaching polarized audiences calls for better listening and creating more opportunities for journalists to get out in the field. It also calls for more complex, nuanced stories that avoid moral outrage phrases or keywords that contribute to groupishness.
There was a time when being a journalist meant pursuing a story by reporting the available information from as many sources as possible, writing the piece, and getting it published in print and online.
How quaint that now seems. Today’s media environment requires reporters and editors to be detectives of misinformation, and then be prepared to debunk falsehoods without amplifying them. The proliferation of user-generated content online means journalists must adhere to their craft’s ethics and standards while competing with people who have none. Journalists must employ techniques to maintain the eroding trust of readers and viewers even as partisans seek to delegitimize the profession. They must find new ways to listen to increasingly polarized communities and show that they are hearing different perspectives – but without taking sides or appearing to do so.
The new ethics governing journalists’ work have been evolving over the past couple of decades. Media professionals and consumers alike are aware of how the digital information age has gradually eroded the traditional media’s role as gatekeepers of information, not to mention its advertising revenue. The explosion of social media over the past 10 years then added a new dimension – an internet driven by its users, who also compete for attention. More recently, the exploitation of these technologies by nefarious actors, domestic and foreign, has thrown disinformation and media manipulation into the mix.
Now, add in attacks from politicians, led by President Donald Trump, that the media is the “enemy of the people” delivering “fake news,” and the picture is complete. In today’s information ecosystem, journalists who are just trying to inform the public as truthfully as they can find themselves sidelined, distrusted, manipulated and maligned all at once.
These phenomena – misinformation, polarization and media attacks – all reinforce one another. People who are persuaded that traditional media are not delivering the truth may turn to more partisan alternatives for news and information, as they seek out and stick with sources that fit and validate their own beliefs. An information tribalism takes hold, worsening polarization. The cycle repeats.
On the national level, the 2020 election and the effort to impeach the president have further quickened and intensified the news cycle. Many journalists describe being under relentless pressure in breaking news situations, which is when they are most vulnerable to manipulation and misfires. Slowing down and carefully deliberating over word choices is often unrealistic in those moments. The tension between getting a story out on deadline and getting it right is not new, but that tension, experts and practitioners say, is only getting worse.
On the local level, newsroom cutbacks have not only eroded coverage of important issues, but also interfered with the ability of journalists to take time to develop the kind of sophisticated framing needed to avoid feeding into polarization.
With generous support from the Craig Newmark Philanthropies, the American Press Institute has been exploring ways journalists can navigate this environment.
There are a variety of new strategies being developed to help journalists convey to readers what’s really happening in their communities and how their institutions of government, business and education are performing. There is heightened awareness of how misinformation spreads. There is also new research on how people process information.
This report explores and explains some of those strategies and is designed to help people better understand and navigate the crosscurrents of polarization and misinformation buffeting the industry. The insights here are derived from a review of recent thinking on the subject, including scholarly research and interviews with experts and journalists in the field. In addition, API held a two-day summit in the summer of 2019 that brought journalists together with people who study the information environment. We included experts in the ways manipulators try to influence mainstream reporters and social psychologists who understand how people process information. We talked to journalists who are trying out new strategies to deal with divided audiences, and fact-checkers who are doing their part to police the falsehoods and help clean up the internet landscape.
There is also a growing body of literature on the subject from both academia and non-profit organizations studying how to contend with misinformation and polarization. Some of the strategies included here are well known but worth recounting. Others are newly evolving. And some will seem counterintuitive for journalists. The current era requires that they at least be given consideration.
As noted above, the issues involved are complex and intertwined, but they are treated in four sections in this report for clarity:
- In our first section we explain how journalists can respond to and operate in an information ecosystem contaminated with misinformation, fakes and hoaxes, and how they can avoid being manipulated.
- In the second, we suggest how and when to cover false information without amplifying it.
- In the third section, we consider how journalists under attack might respond to politicians like Trump who are seeking to erode trust.
- In the fourth, we offer ways that journalists contending with polarized audiences can avoid widening the divide – and maybe even narrow it.
A Sea of Falsehoods
The public today must navigate a news ecosystem contaminated with false information. Some of it is spread without malicious intent, which is generally called misinformation. Some of it is designed to mislead or disrupt, which is known as disinformation. And some is aimed at swaying public opinion or beliefs by distorting the facts, which is a classic definition of propaganda. In today’s environment, it is often hard to draw clean lines between these terms – misinformation can be passed on without nefarious intent even if it was originally created with one. Intent, in fact, rarely determines the damage: even content spread with an innocent or good intention can hurt people. And a lot of disinformation doubles as propaganda, aimed not just at changing minds but at stoking division and chaos.
(In this paper, we use the term misinformation broadly, especially in referring to falsehoods of uncertain origin or intent. For more on how to characterize different kinds of false information more specifically, we recommend Data & Society’s Lexicon of Lies and First Draft’s Definitional Toolbox).
For news organizations, understanding this environment, and the way in which adversarial actors are seeking to influence the mainstream media, is critical to any effort to capture the attention, trust and loyalty of their communities.
The problem of misinformation poses a daunting challenge, but audiences expect journalists to solve it. A Pew Research Center survey conducted in early 2019 found that half of U.S. adults surveyed identified made-up news and information as a very big problem, and even more said it is having a big impact on political leaders’ ability to get work done.
In a key finding for the news industry, Pew reported that 53 percent – the largest share – believe it is primarily the responsibility of journalists to fix the problem.
One big question is whether journalists have the tools to do so. The job requires understanding the psychology of audiences at a level beyond that which exists in most newsrooms. It also requires understanding how false information moves, which may be the province of scholars more than reporters. It is also not clear that journalists today have the trust of a broad enough spectrum of Americans to persuade them when information is false. And doing so might require more cooperation among American journalists across the political spectrum than they typically operate with now.
It is also important to know the scale of the problem. Facebook in 2017 estimated that 126 million people may have been served content from a page associated with the Russian disinformation factory known as the Internet Research Agency. That was over a two-year period, and it involved a number of different stories, memes, ads and other divisive content, but it shows the size of the challenge for providers of legitimate news. On Twitter, according to a Massachusetts Institute of Technology study published in the journal Science last year, false news stories are 70 percent more likely to be retweeted than true stories are. The researchers also found that true stories take about six times as long as false stories to reach 1,500 people.
In other words, the information environment that has most challenged traditional media in the attention economy – social media – is where a lot of misinformation thrives and spreads, and it is huge. Moreover, the biggest of these platforms are the ones benefiting from and even contributing to the erosion of traditional news.
But while the mainstream media may no longer be gatekeepers determining what is and isn’t news, they still have enough influence and credibility that agents of misinformation perceive them as important carriers of their content.
As a result, manipulators try to trick the media into reporting these falsehoods as a way to accelerate and broaden their reach. They know that if a mainstream media outlet carries a falsehood, people feel more comfortable sharing it, said Ben T. Decker, a disinformation expert who runs the consultancy firm Memetica, in a phone conversation. “It creates more consumption and creates the perception that the information is more popular than it actually is. If it’s being reported on ABC News, for example, then it must be real, because ABC is a real media company,” he said.
For some hoaxers, tricking the mainstream media is a goal in itself, experts say.
In a comprehensive report on media manipulation for Data & Society, Alice Marwick and Rebecca Lewis discuss how trolls and other disruptors use the mainstream media to amplify their messages. Their case studies include a 2015 example in which the founder of a Neo-Nazi news outlet, the Daily Stormer, suggested that followers create “White Student Union” pages on Facebook, then contact local media about the posts. Some such pages were created, leading to outrage and controversy on those campuses – and coverage in the mainstream press, including USA Today. The objective, Marwick and Lewis write, was to either facilitate the creation of such groups or simply to “trick the media into moral outrage and simultaneously spread some racial tension throughout college campuses.” It worked.
“The media, hungry for stories about racial tension on college campuses, took the bait and amplified what was essentially a non-story,” they wrote.
The exploitation of big news events
The moments when journalists are most susceptible to misinformation are during breaking news events, when legitimate or official information intermingles online with fakery and when journalists and the public alike are hunting for new information.
After the Feb. 14, 2018, mass shooting at Marjory Stoneman Douglas High School in Parkland, Fla., a number of internet rumors started circulating as to the shooter’s identity. One name that made the rounds, Sam Hyde, frequently pops up after such shootings. There were also suggestions that the shooter was affiliated with Antifa, the leftist anti-fascist protest movement. At one point, a “screenshot” of a faked BuzzFeed headline appeared, saying, “Why we need to take away white people’s guns now more than ever.” Some people retweeted faked screenshots of tweets that pretended to be from Miami Herald reporter Alex Harris asking whether the shooter was white and requesting pictures of dead bodies.
All of this happened within hours of the murders. And there were even more hoaxes in the days that followed. Some of the students who spoke out were cast as having made it all up, being, in the lexicon of extremist forums online, “crisis actors” hired to stage the event to further some kind of anti-gun agenda. The students were smeared in other ways, too. One now-famous hoax involved a fake photo of one of the students, Emma González, tearing up the Constitution. It was a doctored version of a real photo of her tearing up a gun target. Another false report that the shooter was part of a neo-Nazi group made its way into mainstream media after a coordinated effort by disinformation actors online.
Mass shootings bring out a particular breed of hoaxes, conspiracies and other kinds of disinformation. They usually try to blame a member of a specific group – religious, racial or political. Gun control is a common theme. Sometimes, as in the case of the faked Harris tweets, trolls take aim at the media. The purveyors of this misinformation, whom experts identify broadly as adversarial actors, use these moments to take advantage of the public’s instinctive hunger to know more – who was hurt, who did it and why.
These shootings exemplify the special challenges journalists face in breaking news environments. Events are fluid. Reporters are worried about competition. Speed, always the enemy of accuracy, is given added importance. In an era of misinformation, these are also the moments when journalists need to be most on guard against hoaxes – people spread false information during developing news events precisely because that is when the public is paying close attention and journalists have the least time to verify what they publish.
In 2017 the Sam Hyde hoax played out in real time on CNN after the shooting at a church in Sutherland Springs, Tex., where 26 people were killed. A congressman from Texas being interviewed about the killings said he had been told Sam Hyde was the shooter.
The Sam Hyde hoax is just that – a kind of not-funny trick that plays across social media, perhaps for the hoaxers’ own amusement. (Inside jokes are big with trolls and hate groups.) As BuzzFeed’s Craig Silverman explained in this video, the real Sam Hyde is a comedian who’s been made into a meme, but his isn’t the only name that gets circulated after shootings.
The methods of manipulators
danah boyd, a partner researcher with Microsoft Research and founder of the research institute Data & Society, said breaking news situations can also expose how certain terms or phrases exist in what she and Microsoft program manager Michael Golebiewski call “data voids.” Data voids occur when obscure search queries turn up few or no results because the terms haven’t yet been sufficiently established or defined on the internet. When a void exists, media manipulators can create content designed to hijack those terms, especially during breaking news.
Those voids don’t exist forever – eventually, once enough people are aware of the void, news stories and other legitimate content will be created so that search engines will surface them. But until then, the void exists.
“Unfortunately, the time between the first report and the creation of massive news content is when manipulators have the largest opportunity to capture attention,” boyd and Golebiewski wrote in an October 2019 report, Data Voids: Where Missing Data Can Easily Be Exploited.
In discussing data voids, they point to the hours after the Sutherland Springs shootings. Rarely had anyone before that moment searched for Sutherland Springs, a town of about 600 people in south Texas, and if they did, they’d get back mostly weather and maps. But as soon as the shooting happened, far-right groups coordinated online to use social media to associate the Sutherland Springs murders with Antifa, the anti-fascist group that the alt-right seeks to blame for violence including murder.
In moments like these, boyd and Golebiewski wrote, manipulators will take to social media and use fake accounts to pose provocative questions or point to misleading posts, a tactic journalists need to be aware of.
“While their primary goal was to influence news coverage, this tactic also helps waste journalists’ time,” they wrote.
For journalists, this means taking care with the terms they use in these situations. Repeating or adopting the language of extremists or manipulators can prompt people to search for terms that aren’t yet clearly explained online – terms like “crisis actor” after a shooting – and lead them into conspiracy theories. This might come as counterintuitive advice to journalists, noted Kelly McBride, chair of the Craig Newmark Center for Ethics and Leadership at the Poynter Institute, because they like to “discover” and define emerging terms to help their readers understand them. But doing so can be counterproductive to the public’s understanding.
The motives behind manipulations and hoaxes vary. Hackers may just see it as a technological challenge. Extremists usually have a political agenda. Trolls may be just making mischief or trying to drive wedges in society, deepen mistrust in the media and government and further polarize groups who are already divided. The common element is that during breaking news events, these adversarial actors seek to disrupt the information ecosystem when the largest possible audience is paying attention – and when people most need reliable and accurate information.
Decker also noted that sometimes these actors are paid, like those profiled in a Washington Post piece on troll farms in the Philippines, and sometimes the growth of such content is generated by user interest, as in the case of a conspiracy theory connecting Bill Clinton to the convicted sex offender Jeffrey Epstein.
The spread of a conspiracy into the mainstream media also creates what Decker describes as a “weird beacon of recruitment,” wherein people vulnerable to buying into a conspiracy might see the bait of intrigue in mainstream coverage, then follow it into the darkest corners of the internet, where they become converts to the cause of the false narrative.
Opportunities for collaboration
Breaking news situations are also moments, argue both boyd and Joan Donovan of Harvard’s Kennedy School, when news organizations, even competing ones, could band together and collaborate on separating reality from falsehoods. Journalists who strive for information exclusivity might find this anathema to their mission, but boyd believes some instances require it. “Industries are taught to compete with each other, but there are times and places when collaboration is essential to survival and the ability to thrive,” boyd said in an email exchange. “Journalism hasn’t yet realized it’s reaching that tipping point. I just really hope it does before it’s too late.”
boyd points to New Zealand, where journalists agreed on a plan for covering the trial of the man accused of the 2019 Christchurch mosque shootings – a plan that included avoiding certain images and symbols and limiting coverage of his statements. The goal was to prevent the active funneling of people toward white supremacist or terrorist ideology by limiting the phrases and content that would promote self-investigation. Some media experts have applauded the moves, as Nieman Lab wrote in June 2019. But not everyone agrees. Politico’s Jack Shafer argued that the agreement amounted to the New Zealand media agreeing to censor themselves. “Drop the blinders, New Zealand,” Shafer wrote. “You can’t stop a threat you have blinded yourself from seeing.”
These disparate reactions showed how controversial any kind of collaboration might be among U.S. media outlets, which are competitive by nature and value their independence and ability to exercise news judgment case by case. The U.S. reluctance to band together is also demonstrated in fact-checking, as Cristina Tardáguila of the International Fact-Checking Network wrote in June 2019. In other parts of the world, she noted, fact-checkers have worked together, enabling them to operate faster and check a larger number of claims. “U.S. fact-checkers haven’t even begun discussing this possibility,” she wrote.
Some U.S. news organizations have recognized the cost- and time-saving advantages of sharing resources while also broadening the audience for the stories they produce. These collaborations are usually aimed at producing deep investigations and projects, and they often involve traditional newsrooms working with nonprofit news organizations such as ProPublica. But many experts say such collaborations aren’t used often enough in breaking news events – the “Super Bowl” moments when bad actors try to get their conspiracy theories, hoaxes or false narratives picked up by the mainstream media, whether intentionally or by accident.
At the Center for Cooperative Media at Montclair State University in New Jersey, which works to facilitate and track collaborative journalism projects around the world, Director Stefanie Murray said that newsrooms are getting better at these efforts. As more of them collaborate on enterprise journalism, they will build relationships and a “muscle memory” of working together that will take them to the next step of collaborating on breaking news to debunk mis- and disinformation, she said in a phone interview.
First Draft, a nonprofit that fights misinformation around the world, has launched an initiative it calls “Together, Now,” a global network of collaborative research aimed at helping newsrooms identify disinformation. The U.S. training, which began in Denver in November 2019 and will continue through 2020, includes a “disinformation emergency training” aimed at teaching newsrooms how to grapple with the problem in real time. One hope is that newsrooms will inform one another about disinformation they see online in order to identify trends and patterns.
“I hate using war metaphors, but it just feels like a battleground, and journalists are not aware of the landmines they face,” said Claire Wardle, the U.S. director of First Draft, in a phone interview. She said she is concerned that journalists’ reluctance to take sides can be used by manipulators, especially hate groups, who seek to get journalists to position disinformation and incendiary views against legitimate points of view in a bid to get journalists to engage in false balance. Journalists, she said, often don’t realize that their desire to include all sides is being taken advantage of.
Fortunately, she said, some news organizations are learning more about how to defend against misinformation.
In addition, a number of journalists are aggressively covering this kind of manipulation to help their audiences recognize it, and news organizations continue to add to these teams. Among them: Ben Collins and Brandy Zadrozny at NBC News; Casey Newton of The Verge; Craig Silverman and Jane Lytvynenko of BuzzFeed News; CNN’s Daniel Dale and Donie O’Sullivan; PolitiFact’s Daniel Funke; The Washington Post’s technology team including Drew Harwell, Abby Ohlheiser and Isaac Stanley-Becker; and the New York Times’ Malachy Browne and Kevin Roose, among others.
As good as these reporters are, though, the need for more coverage will only grow as malicious actors ramp up their pace, volume and sophistication. The Times said in mid-2019 that it was “moving in multiple ways” to confront the disinformation problem; smaller publications with fewer resources will need to think through strategies, some of which are suggested below, for dealing with it.
Strategies to avoid being manipulated
A variety of people are working on ideas to help journalists be prepared for the kinds of mis- and dis-information campaigns that target the mainstream media, especially during major breaking news events or at other times when journalists are most vulnerable.
One common theme from experts is that forensics and verification should be a newsroom-wide endeavor. The news media will always have leaders in this field, like those named above. But training for all reporters and editors in a news operation, even just a one-time primer course, will create overall heightened awareness and help reporters and editors know what to do or where to go for help when something is fishy. Such expertise should not reside in just one corner of the newsroom, and should be available at all times. “Disinformation actors are working double time to fool the night and weekend crews,” said Aimee Rinehart, bureau editor for First Draft’s New York office. “Newsrooms can no longer rely on a 9-to-5 news cycle.”
Successful manipulations of news outlets, along with rising concern about potential hoaxes in the 2020 campaign, have prompted groups that fight misinformation to significantly ramp up their outreach efforts. They have produced a bevy of new reports and guidebooks on the subject and are helping newsrooms through in-depth training and best practices for spotting hoaxes and avoiding manipulations.
Every newsroom will face a different version of this problem, depending on the size and makeup of their audiences. Big national news organizations will be the targets of manipulation schemes that are different from those aimed at local or regional news outlets, for example. But each needs to develop skills and strategies to handle it.
What follows is a distillation of some of those strategies from interviews with and written reports by several experts in the field.
1. Learn basic skills to identify manipulations. The least-sophisticated hoaxes are often the most toxic – and the most commonly spread. Forensic experts have whole arsenals of techniques journalists can use to avoid getting duped into thinking something is real and republishing it. Here are just three basic skills journalists should have:
- Spot manipulated video: “Deepfake” videos are getting a lot of attention, and it’s true that they will be difficult to discern if you’re not an expert. But currently the problematic content is less sophisticated – like the slowed-down video of House Speaker Nancy Pelosi (D-Calif.) that was posted on Facebook in 2019 (discussed in more detail in the first section). Journalists should learn basic skills in spotting manipulated video through groups like First Draft that do hands-on training and simulations. There are also many online resources, including those from fact-checkers and other organizations who want to help fellow journalists understand their methods. Some examples:
- The Washington Post Fact-Checker in 2019 published a guide to manipulated video that it said was partly aimed at helping develop a nomenclature around different types of manipulations. It looked at videos presented with missing context, those that had been edited deceptively and those that had been maliciously transformed.
- At the Poynter Institute, Daniel Funke in 2018 put together a guide to verifying social media video, which includes a list of tips and tricks for spotting online fakes.
- Full Fact, a fact-checking non-profit in the United Kingdom, did a similar guide.
- Verify images through reverse image searches: This is one of the most basic but also most crucial verification skills a journalist should have. Reverse image searches can establish the provenance of a photo that might have been manipulated or taken out of context.
- First Draft, for example, offers a wide range of online resources for journalists. It also has established simulations and master classes around major news events like the 2020 election, including a nationwide strategy to help newsrooms learn some basic digital forensics, like reverse image searches.
- Use screenshots from others with care (if at all): On Twitter and other social media, people often take a screenshot of a controversial post because they fear it will later be deleted. The screenshot “preserves” it in time. But that is also what makes a screenshot problematic – it purports to represent something that may no longer exist, and screenshots can be manipulated. The fact that something looks like a screenshot does not make it authentic; it could, in fact, be fakery. The Miami Herald example noted above, for instance, was a simple photo edit, but it looked real.
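For readers curious how reverse-image lookups can match a doctored photo to its original, many such tools reduce each image to a compact “perceptual” fingerprint and compare fingerprints. The following is an illustrative sketch of one common idea, the “difference hash” – not the algorithm of any particular service. To stay self-contained, it operates on a hypothetical 9x8 grayscale pixel grid (nested lists); a real pipeline would first decode and shrink the actual image file.

```python
# A simplified "difference hash" (dHash), one technique behind
# perceptual image matching. Illustrative sketch only: real tools
# decode the image and resize it to 9x8 grayscale before hashing.

def dhash(pixels):
    """Compute a 64-bit row-wise difference hash from a 9x8 grid.

    Each bit records whether a pixel is brighter than its right
    neighbor, so the hash survives rescaling and recompression
    but changes where content is altered.
    """
    bits = 0
    for row in pixels:                        # 8 rows
        for left, right in zip(row, row[1:]): # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance means near-duplicates."""
    return bin(a ^ b).count("1")

# Two hypothetical 9x8 grids: the second is the first with one
# region brightened, mimicking a doctored photo.
original = [[(r * 9 + c) % 256 for c in range(9)] for r in range(8)]
doctored = [row[:] for row in original]
doctored[4][4] = 255  # the "edit"

distance = hamming(dhash(original), dhash(doctored))
# A small Hamming distance flags the two images as near-duplicates,
# suggesting one was derived from the other.
```

Because each bit encodes only a local brightness comparison, two copies of the same photo at different sizes or compression levels hash nearly identically, while an edited region flips the bits it touches – which is how a doctored image can be traced back to its source.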
2. Know the viral pathway of misinformation. How does something move from a fringe platform to the mainstream media? If journalists understand how misinformation flows, how it finds a path of least resistance to the largest possible audience, then they can have a greater understanding of how to react.
In describing what she calls a “Trumpet of Amplification,” First Draft’s Wardle wrote that the most important takeaway from First Draft’s study of how misinformation travels “is understanding that many agents of disinformation see coverage by established news outlets as the end-goal.”
Journalists must know how to trace where a piece of content started and determine the motives of its originator. Newsrooms may monitor social media for tips and insights, but social media may be only the most recent station on the content’s journey from its origin on a fringe site.
3. Understand how manipulations live on social media. The social media strategies of online manipulators have become more sophisticated and varied. In a 2019 paper for Data & Society that breaks down these tactics, Joan Donovan and Brian Friedberg at Harvard’s Kennedy School identify what they call “source hacking,” a category of online manipulations aimed at “planting false information in places that journalists are likely to encounter it, or where it will be taken up by other intermediaries.”
One of the practices on Twitter, which the authors call “keyword squatting,” describes how hoaxers or misinformers hijack hashtags or handles to inject themselves into popular conversations, and get the mainstream media to pick up their views.
The social media monitoring and news agency Storyful partners with news organizations to identify misinformation (or verify that something is real) and to train newsrooms in the latest verification techniques. It collects as much information as possible about the content in question, using methods including interviews with original sources and geolocation, to determine whether it is valid.
In addition, Storyful investigates trends that might themselves produce stories about how misinformation is affecting society. An example was an August 2019 Wall Street Journal story exposing how people are getting around Facebook’s ban on selling guns in its marketplaces by pretending to just sell the gun cases. But the gun cases are posted at inflated prices, an indication that the postings have become “code” for the sale of actual guns.
4. Be aware that some catastrophic events, like mass shootings, are categories unto themselves. Mass shootings and other crises are seen as opportunities by nefarious actors to spread misinformation, and, sadly, are now common enough that every journalist should have a checklist for covering them. Resources that can help organizations think through their plans for shootings include one from Poynter and another from the Suicide Awareness Voices of Education. Fact-checkers like those at BuzzFeed and PolitiFact often monitor social media for hoaxes after such episodes.
As noted above, some experts like danah boyd argue that in these crisis situations journalists should collaborate and share information, since having all journalists vet the flood of information individually is inefficient and even dangerous.
The Sound of Silence: Strategic Amplification
Once a news organization identifies something as false, the question becomes how to cover it, if at all.
The instinct of most news people when they encounter a falsehood is to correct it. The underlying principle here, ingrained in newsroom professionals over time, is to dispel inaccurate information before it spreads too far, whether that false information is a deliberate hoax about a prominent politician, a false assertion about vaccine safety innocently passed on, or a state-sponsored effort to sow discord and create doubt about democratic institutions and processes. In that sense, journalists have tended to operate like police officers at speed traps passing out tickets, hoping their vigilance will deter others on the road.
But just as journalists have become better at spotting false information online, they are also gaining a better understanding of the care they must take in debunking it to avoid amplifying the thing they are trying to correct. And the task is far more complicated than it once seemed. It also requires situational judgment. Every case is different.
The first consideration is whether the misinformation should be covered at all. The issue isn’t always a matter of whether something is wrong. Nor is it necessarily whether it has already gotten some notice. That might have been the case when the press could consider itself a gatekeeper that determined to a large degree what the public knew and didn’t know. Today, the public has many sources of information. The question of whether to cover a falsehood can also be a matter of whether giving it more attention can spread it further and do greater harm. Not every falsehood that has been sent out into the world needs to be corrected. In today’s environment, after all, not covering something can be as much of a statement as covering it.
Journalists must ask themselves whether a falsehood has become so significant that it needs to be knocked down.
To help address the tension between these competing impulses, two prominent scholars in the misinformation field, Joan Donovan at the Harvard Kennedy School and Microsoft researcher danah boyd, have put forth the concept of "strategic amplification": the idea that in a complex communication landscape, news organizations and the new platforms challenging them should develop and employ best practices for producing news content and designing the algorithmic systems that help spread it.
When the notion of becoming more thoughtful about not calling out lies was introduced by scholars, it was described as "strategic silence." But today's information landscape "has destabilized the notion that silence as a tool for editors may be used strategically," Donovan and boyd wrote in a fall 2019 article, "Stop the Presses? Moving from Strategic Silence to Strategic Amplification in a Networked Media Ecosystem." The media's ability to actively balance public interest and public harm in their news decisions, they wrote, "has come undone."
The decision-making for journalists in this case is more complicated than handing out “falsehood tickets.” Instead, journalists must ask themselves whether a falsehood has become so significant that it needs to be knocked down. What falsehoods would wither more by ignoring them? Does a serious news organization legitimize problematic information by giving it recognition? How can it amplify the right content?
The authors say that the news media and technology companies need to acknowledge how their roles are “isomorphically intertwined.” Both platforms and news organizations are engaged in disseminating and amplifying information, they write, and they both should understand that they have a moral obligation to act.
The tension between publishing or remaining silent about misinformation – and other kinds of problematic information such as hate speech – is well-known to journalists. But irreversible changes in the way information spreads have altered the calculations news leaders must make in exercising that editorial judgment.
In a report for Data & Society, “The Oxygen of Amplification,” Whitney Phillips, an assistant professor in communications, culture and digital technologies at Syracuse University, noted that journalists are aware that their task has changed. In her conversations with them for the study, she wrote, she found that “…as soon as the reporter finished listing the dangers of amplification, they would then explain the dangers of doing nothing.”
How to handle that tension in everyday publishing decisions? Researchers such as Phillips and First Draft co-founder Claire Wardle talk about a "tipping point," the moment at which journalists may feel that ignoring a false story is no longer an option. Finding the tipping point will be driven by a number of factors, they say. Those factors include assessing the value of the public knowing that the false information is circulating, the degree to which the information has been shared elsewhere, who created it, and whether it has already had a demonstrated effect.
The growth of misinformation has generated a growing body of work aimed at helping journalists understand and contend with these problems. First Draft published four guides in 2019 to help journalists navigate these questions. In one, “Responsible Reporting in an Age of Disorder,” it lays out some questions journalists can ask themselves when they’re deciding whether something has reached the tipping point.
The size, locale and makeup of a news organization’s audience can be a key factor. Different news organizations may identify their tipping point at different times. There is also likely to be disagreement, which can also lead to arguments after the fact in social media and elsewhere. But in real time, journalists still need to decide. And whatever that decision is, they need to be able to defend it.
The Truth Sandwich
Once a decision is made to go with a story about misinformation, journalists must then frame it in a way that ensures amplification of the truth, and non-amplification of falsehoods.
Some media scholars emphasize that simply saying something is not true will not persuade people it is false. Doing so, in fact, could have the effect of planting the idea more firmly in people’s minds. Various studies have shown that repetition can actually persuade people to believe even things they already know are not true.
In a 2019 study into this illusory truth effect, “Repetition increases perceived truth equally for plausible and implausible statements,” Vanderbilt University’s Lisa K. Fazio, along with David G. Rand at the Massachusetts Institute of Technology and Gordon Pennycook at the University of Regina, wrote that people’s belief in all statements – including the most implausible ones – is increased by repetition. “Even implausible falsehoods may slowly become more plausible with repetition,” they wrote.
Such studies heighten the need for emphasizing – and repeating – truthful information.
In mid-2018, one strategy for amplifying the truth over falsehoods gained prominence when CNN’s Brian Stelter interviewed George Lakoff, a linguist who is professor emeritus at the University of California, Berkeley, for Stelter’s “Reliable Sources” podcast.
Lakoff had written a piece in The Guardian arguing that President Donald Trump was using “words as a weapon” to manipulate the media, which he said too often just repeated the president’s falsehoods. Digging into this, Stelter asked how journalists could avoid this practice, and Lakoff suggested a framing in which journalists put the truth first, then the falsehood, then repeating the truth. “It’s a truth sandwich,” Stelter said, using a term that would later become more widely used, including by Lakoff himself.
The term “truth sandwich” became a momentary buzzword for a tactic journalists could use for dealing with the falsehoods emanating from the White House. The Washington Post’s media columnist, Margaret Sullivan, wrote about it, as did Mark Memmott, NPR’s standards and practices editor. And some people still advocate for it.
It is also not an entirely new concept. Back in the earliest days of the fact-checking movement in the early 1990s, scholar Kathleen Hall Jamieson advised network news divisions that if they were going to note that something in a TV ad was false, they should put the disputed advertisement inside some kind of graphic, such as a television, to illustrate visually that the images were a TV ad that was being deconstructed. Otherwise, she argued, viewers would think the false images were news images produced by the network.
So far it’s not clear whether the truth sandwich as a story-writing technique has gained much traction among journalists. One reason may be that it’s somewhat counterintuitive, since the falsehood is often a reason for writing the story in the first place, so “sandwiching” it can feel like burying the news. In a breaking news case, the approach is often to say that “Trump falsely claimed…” or “Trump wrongly said…” But that may be a thin piece of bread for the sandwich.
Another issue is that sometimes getting to the truth involves proving a negative. Thus a truth sandwich would make some news stories feel inside out. In June 2019, for example, Trump asserted that Barack Obama during his presidency “was begging” North Korean Leader Kim Jong Un for a meeting. Applying a truth sandwich to that – by putting the truth first – would have the story lead with something that didn’t happen, and didn’t happen years ago. Even in fact-checking this assertion, reporters had to rely on former Obama administration officials’ denials to disprove Trump’s assertion. (Trump repeated a similar claim in a lengthy cabinet meeting in October, setting fact-checkers into motion again).
But while the truth sandwich might not always work in a hard news situation, it could be applied to an analytical piece after the original news is reported. In such a piece, the story would open with the established facts, then note the false claim and who made it, then close by restating the facts.
The point of such an approach is to double down on the truth, as Lakoff suggests, rather than to simply state it. He argues for the sandwich technique because simply rebutting the assertion can have the effect of reinforcing the falsehood in readers’ minds.
“It’s like when Nixon said, ‘I am not a crook,’ and everyone thought of him as a crook,” Lakoff told Stelter. “The point is that denying a frame activates the frame.”
The cover/no cover tension: Vaccine hesitancy
Nowhere is the tension between remaining silent and publishing stories about false information in an attempt to correct it more apparent than on the issue of vaccine hesitancy – the decision by parents to go against medical recommendations and not have their children vaccinated because they fear there will be side effects.
Anti-vaxxers, as they are known, spread their views on social media, constantly challenging the medical establishment’s consensus that vaccines are safe and necessary to prevent outbreaks of measles and other communicable diseases. Often they share emotionally charged stories of children with autism or other conditions the parents are convinced were caused by vaccines. There is also evidence that Russian bots have worked to sow confusion and further fuel that debate.
Here is a case where the story must be told, as a public health matter, in order to debunk false information and reinforce the truth. But it must be told with care. It’s not appropriate to report the views and fears of the parents who hold anti-vaccination views in isolation. It’s not even enough to report those fears while also reporting the truth about vaccine safety. Even a seemingly neutral or question headline can send the wrong message.
In one 2019 example, NBC’s Today Show was widely criticized for a tweet, later deleted, that raised the question of vaccine safety as part of a story aimed at debunking myths about vaccines.
We removed the tweet, seen below, which included an irresponsibly presented headline. The article headline has also been updated. https://t.co/HPlkuhlwCY pic.twitter.com/s9cYrfjxPo
— TODAY Health & Wellness (@TODAYshowHealth) June 13, 2019
The tweet linked to a story with the headline: “Doctors discuss 7 common vaccine myths,” with the subtitle: “When it comes to vaccines, it is easy to be confused.” Some critics said even that was problematic, arguing that information about vaccines is not confusing at all if you’re listening to medical professionals and proven science.
In some cases, like on the local level, there might still be an argument for avoiding coverage altogether.
Say a community of anti-vaccination advocates is planning to hold a meeting where they hope to win converts and spread their views that the measles-mumps-rubella shot is dangerous. This is clearly a case in which misinformation is being spread, but it may be better to avoid publicizing the event. Editors decide to cover events based on a number of factors – how many people will it attract? Is anyone in danger? Is a confrontation expected? Is there new information that will come to light? In the end, if there is no compelling reason to cover it, the editor might legitimately choose to pass.
As noted in the section on strategic amplification, not covering something is a concept at odds with the instincts of journalists who see it as their job to deliver even the most difficult or complex news to their communities. The impulse to cover an event can also be triggered when it is getting attention elsewhere, like on Facebook or other social media platforms – again, triggering the “tipping point” decision.
Tackling misinformation head-on: The Pelosi video
Perhaps the best-known example of a news organization's decision to tackle a story about a piece of disinformation was The Washington Post's decision in May 2019 to report on a video that was manipulated to make U.S. House Speaker Nancy Pelosi (D.-Calif.) appear drunk or somehow otherwise impaired. Memes and smears about Pelosi have made the rounds on social media for years, some of them implying that she was drunk. But this one stood out for its audacity – and its reach.
There was nothing sophisticated about the manipulation of the video of a speech Pelosi delivered to the Center for American Progress. It was not what is known as a “deepfake,” which uses artificial intelligence to create the appearance of reality; it was merely slowed down. But it looked real and it got over a million views before The Post even did its story. (For a deeper exploration of video manipulations, see Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence, by Joan Donovan and Britt Paris, an assistant professor of library and information science at Rutgers University.)
The Post’s editors put significant thought and discussion into running with the story, the paper’s business editor, David Cho, explained at API’s summit. He emphasized that every case is different, that there is no set formula for deciding on whether and when to report on misinformation.
In this case, a number of factors went into The Post’s “tipping point” – its decision to go with the story – including, but not limited to, the fact that it had already been viewed so widely on Facebook. The Post had been writing about the emergence of deepfakes and “shallow” fakes prior to the Pelosi video manipulation, so this was a continuation of that coverage. In addition, tensions between the speaker and the president were growing at that moment. And Facebook’s refusal to take down the video – in contrast to YouTube’s decision to remove it – was an important story about how the platforms treat misinformation.
In cases like this, writing about fake or manipulated content could add to the viral nature of the video. But, as The Post's story noted, the video exemplified the kinds of misinformation people would be exposed to heading into the 2020 election. The Post's report called it an example of how "low-tech, relatively simple editing can dupe viewers and trigger widespread disinformation."
Moreover, it was aimed at one of the nation’s most important politicians. The speaker of the House is second in the line of succession for the presidency and the most powerful Democrat in America in 2019. The “impairment” may have been false, but, as with much misinformation, its aim was much greater – to damage faith and confidence in the leadership of the House and the Democratic Party.
In the end, we think The Post’s treatment of the Pelosi story followed many experts’ recommendations about how to frame the fake without amplifying it. The reporters and editors made it as clear as possible that this was a story about a fake video, not one about Pelosi. Importantly, the first word in the headline was “faked” and the first word in the story was “distorted.” This is meaningful for first-impression reasons, but also for search reasons, as some search engines will deliver only partial headlines on their results page. In that sense, The Post case is a model for how to make difficult decisions about false information. Even then, publishers can expect controversy.
The unique problem of Trump
The Trump presidency has raised to new levels the complexities involved in deciding whether to cover something or remain silent about it for fear of giving it undue attention. Is it news if the president says it? What if it’s false? What if he’s already announced it to his millions of followers on Twitter? What if it’s false but so audacious that the very fact that the president is saying it is news? What if it is a falsehood that he has repeated dozens of times and has already been thoroughly discredited? Which misleading tweets are newsworthy, and which are not?
The pressures to publish something Trump says, even if untrue, can be greater than the desire not to give it traction.
In the past, presidents have used the full array of channels to communicate with the public – interviews, daily press briefings, news conferences, public appearances, and, in recent years, social media. Trump relies most days (and at all hours of the day) on one channel by which he can communicate directly with the public: Twitter. When he became president, this posed a new challenge to White House reporters and editors, who suddenly had to be prepared to write about a midnight tweet and a 5 a.m. tweet – sometimes in the same 24-hour period. And social media – not the mainstream media – frames his comments. There is no intermediating filter.
Trump’s use of Twitter – and the direct communication that all political figures will use going forward – illustrates a growing reality: For much of what the press now does, it is no longer a gatekeeper of what the public knows. It is often instead more an annotator and analyzer of what the public has already heard, noting what is false, out of context, or a flat-out lie, after it is already out there.
After the first year or so of his presidency, Trump’s tweets weren’t always automatic news. This is partly because the novelty of a president communicating through Twitter had worn off. Trump was often repeating himself. And the phenomenon of a president tweeting in all caps, often ungrammatically, sometimes hysterically, was no longer shocking.
Another challenge is that often what he says is simply not true.
Holding the president accountable for the veracity of his statements is, of course, an important function of the press, so those falsehoods need to be put on the record. That job in recent years has fallen to fact-checkers such as those at The Washington Post, which has chronicled and tallied the president's falsehoods each day since he took office. They surpassed the 13,000 mark in September of 2019.
In a way, argues David Lauter, Washington bureau chief of the Los Angeles Times, Trump has done the media a favor. No longer is there an assumption that what the president says is true and thus there is no longer an assumption that it is always news that needs to be covered the same way as in the past. Lauter is quick to note that what presidents before Trump said wasn’t always true, either, but they commanded a greater level of credibility. Not covering his every word frees the journalists to write about issues they consider more pressing for their readers. In a sense, reporters are now employing a form of “strategic silence” when it comes to the president of the United States.
But the “go/no go” story decision is often made under the pressure of a deadline, and not quoting the president at times can also be perilous – especially in cases where he seems to be announcing some new policy (which he may or may not end up pursuing).
Trump, by using social media so widely, is seeking to simply circumvent traditional media. He also uses live television to his advantage, an older medium than Twitter but one that raises even more troubling questions because he is so unpredictable. In cabinet meetings or White House press gaggles, he can perform long monologues that often contain multitudes of untruths, leaving producers or livestream operators to either let the falsehoods flow directly to viewers or make the political decision of cutting away from the president of the United States.
Some networks have taken to live fact-checking his remarks. CNN is an example. While the network often runs Trump’s comments live, it will include on its screen a “reality check” that calls out falsehoods as he makes them. The network in the summer of 2019 hired Daniel Dale, one of the most prolific Trump fact-checkers in media, to bolster this effort.
Using attribution as cover
In their rush to keep up with the accelerated news cycle, journalists will often default to the one true thing they can publish. The person did say this thing – that is a true statement. But repeating it without context is using attribution as cover for amplifying a falsehood.
In environments where misinformation is rampant, what seems like neutrality on deadline can look after the fact like a publication has been used to amplify an agenda or a falsehood. This is particularly true in cases where politicians are being quoted on something that might be considered news, e.g., “Politician Smith said XYZ.”
In a 2019 example, after Trump held a rally in North Carolina in which his supporters started chanting “Send her back!” in a taunt at Rep. Ilhan Omar, a Minnesota Democrat who was born in Somalia, the president tried the next day to disavow those chants.
Some of the resulting headlines missed the context by simply quoting the president as saying he didn’t like the chants. That context: The chants started as Trump recounted controversial remarks made by Omar, including one in which she perpetuated an anti-Semitic trope. His claim that he tried to stop the chants is clearly contradicted by video of the event showing him waiting quietly while the audience chanted. Also, the day before, Trump had tweeted that Omar and three other congresswomen who have been critical of the president and his policies could “go back” to where they came from if they didn’t like it.
Here is an example, on Twitter, of the importance of context.
Lacks context as story was breaking:
#BREAKING: Trump says he disagrees with “send her back” chant https://t.co/BWmkMgfXXc pic.twitter.com/gI5L2wp7G1
— The Hill (@thehill) July 18, 2019
Includes the context 22 minutes later:
JUST IN: Trump claims he tried to stop rally crowd’s “send her back” chants despite letting them run for over 10 seconds https://t.co/xGK2VkGjaa pic.twitter.com/sh7Y9UJNMG
— The Hill (@thehill) July 18, 2019
The headline for the story, “Trump says he disagrees with ‘send her back’ chant” remained the same throughout. The story did note that Trump did not seek to stop the chanting, as he said he did.
A final challenge involved in covering Trump's falsehoods is the case where he repeats something that oversimplifies and leaves out important context, such as his claim that the United States is building a wall along the southern border.
Fact-checkers have repeatedly debunked this claim, which the president has made as many as 200 times, according to The Washington Post. “A barrier is being built on the southern border, but not the 30-foot-tall, concrete, 1,000-mile wall Trump promised in the 2016 campaign,” The Post’s fact-checker wrote in October 2019. “Much of this barrier is simply replacement fencing.”
As The Post’s video editor wrote, “Images and video can create a powerful effect online in convincing American voters of the wall’s progress.”
The challenge goes beyond simply debunking the claim. Trump’s repetition of the falsehood means editors have to make a decision every time about whether to debunk it again or ignore it, which would mean letting the falsehood stand unchallenged on social media.
Strategies for covering falsehoods without amplifying them
There are various strategies journalists can employ to ensure they’re amplifying the truth and avoiding amplification of falsehoods. Among the most helpful:
1. Identifying the tipping point: First Draft and other organizations have suggested questions for journalists to ask themselves before reporting on misinformation. Every situation will be different and every news organization has a different audience, so each publication should have its own guidelines.
First Draft has a checklist of “10 questions to ask” before publishing misinformation. A key question is why go with such a story – is reporting on the misinformation helping to clear up a widespread public misunderstanding? Is it important to hold a public official accountable?
A central consideration is whether the information has reached a “tipping point” — the point at which a story about a hoax, a falsehood making the rounds, or a conspiracy theory becomes too big to ignore. Each story, newsroom and each scenario will have a different tipping point. But it is important to know there is a line and that you are going to decide when to cross it.
2. Truth Sandwich: If you report a falsehood, cushion it between two hearty slices of the truth. As noted above, this may work better in an analytical treatment than a breaking news situation.
The truth sandwich is more of a concept than a precise technique, but journalists could develop their own versions based on the observations of its creator, George Lakoff, from a podcast in 2018 or from this article on Vox.
3. Label misinformation clearly: If you report on a doctored photo, meme or video, mark it clearly as misinformation. BuzzFeed News does a good job of labeling, as in a 2018 story about explosive devices sent to prominent politicians, which shows screenshots of false information with big “fake” stickers on them. There is little chance people will think it is showing something real. Some fact-checkers use stickers or other visuals to indicate falsehoods.
First Draft provides some examples and techniques for this in the October 2019 report, Responsible Reporting in an Age of Information Disorder.
4. Avoid using attribution as cover for amplifying a falsehood: Don’t repeat the falsehood in the headline. Just because someone said something doesn’t mean you should repeat it without context.
‘Enemy of the People’
In his more than two decades in journalism, Joel Christopher had never seen anything like it.
When he arrived at the Louisville Courier Journal as executive editor in late 2016, he found that his paper was frequently on the receiving end of abuse from Kentucky’s Republican governor, Matt Bevin. A newcomer to politics, Bevin came to office the year before as a wealthy businessman who cultivated an outsider, anti-establishment image on the campaign trail.
Bevin didn’t appear to have much use for traditional media, and often took aim at the Courier Journal. That may sound familiar. And, as it turned out, the governor was like Donald Trump in other ways, too. He was quick to attack journalists whenever he disagreed with something they published – a practice he continued into his fourth year as governor. (He lost his bid for re-election in November 2019.)
Enduring disparagement from politicians has long been part of journalists’ work, especially disparagement from officials who are the subject of critical news stories. In Louisville, the paper was investigating Bevin’s purchase of a mansion at below-assessment price from a political supporter.
But in the era of President Trump, the attacks have escalated to a new level. Any time the media report something that reflects negatively on the president, he and his surrogates immediately ignore the details and label the organizations or the stories as “fake news.” The president tweets his criticism of the news media almost daily, frequently repeating his accusation that they are the “enemy of the people.” After the congressional impeachment inquiry began in the fall of 2019, his rhetoric became even more aggressive. In a speech to U.S. diplomatic officials in New York he described reporters as “animals” and “scum.” He also has used the word “corrupt.” His allies have sought to raise money to investigate journalists’ backgrounds, obviously preparing the groundwork for more attacks.
The big national news organizations that are usually the target of Trump’s drumbeat have responded by not taking the bait. As Washington Post editor Marty Baron said in a quote widely praised and repeated among journalists: “We’re not at war with the administration, we’re at work.”
Baron’s comment, made in the first month of the Trump Administration, was part of a larger conversation about whether The Post would find itself in outright battle with the president of the United States, as the paper did in 1971 with President Richard Nixon over publication of the Pentagon Papers and again soon thereafter over Watergate. The media under Trump, Baron noted, had reached a strange point where “just being independent, which the press should be, is portrayed as being the opposition.”
Two and a half years later, New York Times editor Dean Baquet made a similar point when defending a much-criticized Times headline on a print story about a speech Trump gave after two mass shootings in August. The original headline, which was changed for later editions, said “Trump Urges Unity Vs. Racism” and it was seen as a tone-deaf misrepresentation of a speech in which racism was not the main theme, and from a president who is perceived as having worsened rather than quieted racial tensions.
In an interview with the Columbia Journalism Review, Baquet agreed that it was a “bad” headline because it didn’t convey enough skepticism. But in addressing critics who want the paper to take a more activist stance toward Trump, he echoed Baron’s notions of dispassion. The Times’ role, he said, is not to drive the resistance to Trump, but rather to simply tell the truth. “I don’t believe our role is to be the leaders of the opposition party,” he said. He reiterated that sentiment in a November 2019 interview with The Guardian.
Tension between political actors and an independent press is an inevitable and essential element of a healthy democratic system. Editors such as Baron and Baquet are seeking to maintain this traditional relationship, and to avoid allowing their news organizations to be, or be perceived as, participants in partisan warfare. Such participation would make it difficult to regain their status as unbiased chroniclers of these battles. A press that is too enraged, these editors are arguing, overreacts – which is exactly what Trump wants – and in effect proves his point.
Responding to attacks at the local level
Big publications such as The Post and The Times may be able to withstand the kind of battle Trump is waging. The Times, in particular, is a national publication drawing on a select audience across the country. At a local level, where publications are trying to command a large share of their community, the problem of polarization is even more complicated. And in communities where a larger percentage of residents tend to be more conservative, attacks such as Bevin's are more divisive and potentially more damaging because local papers are more vulnerable. They have less margin of error financially, and they also do not have a worldwide reservoir of potential readers to draw from. It would be devastating to them if, say, half their audience were persuaded by such attacks to stop reading.
Editors have embraced a strategy of trust-building to inoculate their news organizations against partisan attacks.
The disparagement is also more personal on the local level because the journalists usually live in the communities or regions they cover, sharing schools, municipal services, public spaces and cultures. And it is potentially more divisive because, if successful, it can turn members of the community against one another.
In such cases, dispassion may be even more critical. In theory, it can also be more persuasive if the news organization accompanies it with trust-building techniques such as being more transparent about how news is gathered. The better an audience knows the journalists being disparaged – and the more audiences understand how journalists conduct their work – the less successful the denigration will be. Editors like Christopher have embraced a strategy of trust-building to inoculate their news organizations against the attacks.
“We’re not the enemy. We’re not the political opponent,” said Christopher, who is now the executive editor of the News-Sentinel in Knoxville, Tenn.
In some cases, these editors say, the best response is to avoid a direct response, or, in cases where the news organization is accused of something it didn’t do, to simply state the truth. But in many cases, it can be helpful to address the community directly, lay out how a story was reported, be transparent about what is still unknown in a developing story, and make clear that the news organization has no hidden agenda.
Tensions between the Courier Journal and the governor escalated when the paper broke a story in the spring of 2017 about Bevin’s purchase of the mansion from a local businessman, a friend and Bevin donor, at a price lower than the property’s tax assessment.
Before it was clear who bought the property in Anchorage, Ky., Courier Journal reporter Tom Loftus went to the home to see what he could find out. He was met by a state trooper working on the governor’s security detail. His story shows he tried but could not get the governor’s office to respond to questions about it.
Later, after it became clear that Bevin was the buyer, the new assessment raised ethical questions about the deal, which the Courier Journal reported. The governor, in turn, attacked Loftus, calling him “a sick man” and accused him of “sneaking around” the home, language that echoed terms Trump has used.
A sick man…@TomLoftus_CJ of the @courierjournal was caught sneaking around my home and property..Was removed by state police..#PeepingTom
— Matt Bevin (@MattBevin) May 27, 2017
In another episode later that year, the governor falsely accused the Courier Journal of sending a drone over the home. In fact, it was a local television station.
“It was a total shock to me to see not only a political leader but someone in that prominent of a role behave in a manner that seemed wholly inappropriate,” said Christopher. “The problem isn’t the criticism – we’ll accept vigorous criticism from anybody. But this was a sharp and frankly juvenile attack with utterly no basis in truth whatsoever.”
The lawmaker and The Bee
In California, The Fresno Bee has faced a situation similar to the Courier Journal’s – being attacked in Trumpian terms as fake or illegitimate. Rep. Devin Nunes, who began representing the 22nd congressional district in the Central Valley in 2003, has been conducting a public campaign against the newspaper since the early months of the Trump administration. Before Democrats took control of the House in 2019, Nunes was chairman of the House Intelligence Committee, which was investigating Russian interference in the 2016 election. That gave him more prominence nationally, and as a result more coverage both in The Bee’s news sections as well as its editorial pages, which were critical of his handling of the case.
There was plenty to cover. Nunes, an early Trump supporter, stepped aside from running the Russia inquiry in the spring of 2017 amid criticism that he was obstructing the probe and that he had possibly mishandled sensitive information. As The Bee covered Nunes more closely, their relationship deteriorated. At one point during an interview with a Bee reporter, Nunes said the paper was a “joke” and a “left-wing rag.”
The antagonism grew after a May 2018 story by reporter Mackenzie Mays about a 2016 lawsuit against a winery in which Nunes is an investor. In 2015 several winery investors went on a charity cruise that included prostitution and cocaine, the story said. Nunes was not involved in the cruise, and the winery made an effort to distance Nunes from the episode. The lawsuit, filed by a female winery employee who was on the cruise, was settled.
Nunes then heightened his attacks. He purchased ads criticizing the paper and put out a 40-page glossy mailer aimed directly at The Bee, purporting to tell “the dirty little secrets of the Valley’s propaganda machine.”
The same year, the national press started to pay greater attention. In March of 2018, Mother Jones wrote a piece headlined, “Why Devin Nunes and his Local Paper Suddenly Can’t Stand Each Other.” GQ magazine later ran an in-depth look called “The Fresno Bee and the War on Local News.” Vice News did a segment on its HBO news show. “Nunes Declares War on the Media,” said Politico.
In the spring of 2019 Nunes filed a $150 million lawsuit against The Bee’s owner, The McClatchy Co., alleging defamation. McClatchy called the lawsuit “a baseless attack on local journalism and a free press.” (The lawsuit is one of several the lawmaker has filed, including one against Twitter, which he called “a portal of defamation,” and later against reporter Ryan Lizza and Hearst Magazines, for a piece in Esquire exposing that a Nunes family farm is in Iowa rather than California.)
Attacks on news organizations, though, are usually not intended to win an argument on the merits, legal or journalistic. They are generally meant to intimidate news organizations by making critical stories costly to defend, and to serve as public relations stunts that stoke division and create doubt about everything the outlet publishes.
The potential for such division and doubt is something The Bee’s editor, Joe Kieta, says he thinks a lot about. And when he considers the best response in such a situation, he said in a phone interview, he always comes back to the need for reinforcing audience trust.
Attacks on news organizations are usually not intended to win an argument on the merits, legal or journalistic.
The Bee has taken several steps in that direction. Among them, it has formed a partnership with Arizona State University’s Walter Cronkite School of Journalism and Mass Communication to find ways to better connect the newsroom with The Bee’s readers. In a piece explaining the effort, the editors acknowledged the need: “The processes we use to report the news are developed to engender readers’ trust. But the problem is you don’t really know much about how we go about our work, or why you should trust what we do. That’s on us. We have to do a better job of explaining how news is made.”
When it profiled Nunes in 2018, The Bee appended a section to the story that explained “how we reported this story.” It laid out who the reporter talked with, who returned calls, who didn’t, and the range of documents reviewed.
“I think the best way we can attack this is to be as open and transparent as possible about the work we do, and explain the reasons behind our coverage and our work, and be as upfront as we can when we are scrutinized or criticized for our work,” said Kieta.
The editorial page voice
In a pitched battle with a public figure, another question is whether to take a formal editorial stand about the person, in addition to news coverage that establishes the facts.
The Fresno paper’s opinion pages have pushed back relentlessly against Nunes. After Nunes’ 2018 ad buy, the paper ran an editorial with the headline “The real ‘fake news’ is Devin Nunes’ ad about The Bee.” That was one of several editorials critical of Nunes before and after the ad.
An editorial sends a signal that the institution is taking a stand. In the traditional view, it is intentionally more meaningful than a column by a single writer or an opinion piece submitted by an outsider. In the case of the Nunes lawsuit against The Bee, such an editorial was used to “fact check” what it said were several of Nunes’ false assertions in the suit.
The New York Times has had a similar strategy with Trump – using the platform of its editorial board to publish opinions that would be out of place in the news pages. One unsigned editorial, for example, was headlined: “Why the Trump Impeachment Inquiry Is the Only Option.”
Some editors today worry that editorials taking sides on issues such as this have the effect of drawing the entire news organization, in the public’s mind at least, into fights in which the news side wants to avoid being a combatant.
The argument here is that in the digital era, people find news content through a number of paths – search engines, social media, sharing by friends, links in email newsletters – so it has become harder to see whether a story is coming from an objective reporter or a decidedly opinionated columnist. In the print era, those sections were easily segregated and marked for the reader. Joy Mayer, who runs the organization Trusting News (an affiliated organization of API’s that helps newsrooms build trust), wrote about this phenomenon in an issue of her “Trust Tips” newsletter.
For Christopher, the Knoxville editor, this is a big – even existential – question that contributes to trust. He thinks newspapers have not worked quickly enough to fundamentally restructure themselves toward a more community-oriented model.
“We try to communicate as widely as we can either through direct interaction with readers on social media or public events to continually make the point that we’re here to report stories, not engage in a particular partisan battle,” he said. “The problem is that through the structure of the American newspaper over generations, we’ve hampered our ability to make that case. Readers do not understand the editorial pages vs. news coverage, and if we aren’t able to communicate that difference, we’ve failed.” (This is a topic API has been exploring separately through a series of events and reports).
In August of 2018, Trump’s continued attacks on the media inspired a group of newspapers organized by The Boston Globe to band together to fight back, coordinating editorials defending a free press. The same week, the Senate passed a resolution stating that the press was not the enemy of the people. Trump used both occasions to take another swipe at the media.
Trump has almost certainly emboldened politicians such as Nunes and Bevin to attack the media. Yet it’s useful to note that the decline in trust in the news media that these politicians are exploiting can be traced back to the 1980s and is connected to a complex array of cultural, political and technological changes in the country.
Sometimes political figures also feed off each other’s methods, a cross-pollination accelerated by consultants who work across campaigns and even nations.
While Trump and Bevin in Kentucky rose to power independently – Bevin was elected a year before Trump – Christopher sees a “reinforcing effect” that takes hold when politicians see the other’s success with their tactics. Nothing in politics happens in a vacuum, he noted.
And in an era of polarization, Democrats have taken aim at the press, too, particularly when they feel that publications haven’t gone far enough in calling out Trump for his excesses.
“A vast swath of Democratic voters are pretty angry at the media,” Dan Pfeiffer, a former senior Obama administration adviser who is now a co-host of “Pod Save America,” told Politico earlier this year. “They see a racist liar in the White House and a media too afraid to call him a racist or a liar.”
Such criticism may seem ironic when Trump is claiming the press is an arm of the Democratic Party, or part of a deep-state and out-of-touch elite dedicated to protecting the Washington “swamp.” But it is worth recalling that historically the far left and the far right have been aligned in arguing that journalistic notions of independence and disinterestedness were always impossible.
The motives behind Trump’s critique have at least two dimensions. Attacking the press as part of the deep-state elite helps him cast himself as the anti-establishment disrupter “draining the swamp.” It also allows him, as a political figure, to dismiss the press’s monitoring of him as illegitimate.
It takes a newsroom
Where does all this leave reporters on the ground? They are likely to feel the greatest impact of such hostility, as journalists are interacting daily with people who might refuse to talk to them because they believe reporters are biased, making things up or in cahoots with groups pursuing a particular agenda.
One answer is that it leaves journalists under something of a microscope. They need to be more scrupulous than ever about being independent in any public setting. That suggests that journalists should also be trained in practices such as social media use so that everyone understands they represent their publication and their industry in public, and that they are under scrutiny. Members of the media can fight the narrative that they are putting their own interests ahead of the public’s through their own public behavior.
Another area where journalists must be careful is in the field. Especially during breaking news events or other critical moments, reporters must be sensitive to their communities.
Teresa Frontado, the digital director at WLRN, the public radio station in the Miami market, recounted at API’s event that after the shooting at Marjory Stoneman Douglas High School in Parkland, Fla., two of her reporters witnessed reporters for other organizations high-fiving one another after they secured interviews with some of the surviving students. Exploiting trauma in your community, she said, is the quickest way to feed the stereotype of the media as disconnected from citizens and willing to abandon good judgment for a good story. Frontado said the episode was incorporated in a newsroom discussion about what the staff learned from Parkland and how to better cover traumatic events.
Reporters also should have diplomatic answers at the ready when they are confronted with accusations of bias or fakery.
Trusting News’ Mayer counsels that if someone accuses a reporter or editor of representing fake news, the journalist should be prepared with a response that says “here’s what fake news actually means and why it doesn’t apply to us.”
Similarly, journalists accused of political bias can be clear about how they work to be fair, and how they hold one another accountable.
When called “enemies of the people,” she said, journalists can articulate their news organization’s mission, talk about how long they’ve worked in the community, who they are as people, and how hard they work to deliver critical information that helps people live their lives.
Strategy summary for journalists being maligned
In contending with attacks on the media, journalists can avoid defensiveness and employ some best practices that we took away from our conversations with those who work in media and other experts in the field:
- Arm yourself with answers. Trusting News has a list of valuable tips for journalists who might be confronted by these attacks. Most of them involve preparation – having responses to common criticisms at the ready.
- Use transparency to build trust. Convey to readers why you did this story or that, or why you handled it the way you did. Listen to their complaints, and show them that you’re listening. In its weekly “Trust Tips” newsletter, Trusting News points to examples of good ways to handle these situations (and sometimes what not to do).
- Avoid a war footing. Adhere to the mindset that you are not at war, even when a politician appears to be at war with you. A bully’s objective is to trigger a fight-or-flight response – to force you to counterattack or recoil into a defensive crouch. Either one can be damaging. Your main mission is journalism.
- Insist on whole newsroom training. Everyone in the newsroom must know best practices for responding to criticism, whether it’s legitimate or not. In the era of social media and personality journalism, it’s no longer just the editor who is speaking for the news organization – it’s everyone.
- Sweat the details. Prepare to be scrutinized. At a time when critics are looking for fights to pick, they will be watching everything you publish.
- Correct mistakes. It shows you’re committed to the truth, even if it means admitting errors. Arizona State University’s News Co/Lab has a project to improve newsroom correction practices in the digital age. Their efforts include testing a process to send corrections to people who shared the piece on social media.
- Respond publicly when criticized publicly. Often, says Mayer, attacks on journalistic credibility come publicly, in the form of comments on stories or social media posts. Those offer an opportunity to correct the record about journalism’s motivations, ethics and processes. Journalists who respond publicly are talking not just to the person sharing the opinion but to everyone else who’s reading. And not answering amounts to giving control over the narrative about your work to the person attacking you.
- Be sensitive to your community at critical times. Avoid actions that make you a target of criticism that you are uncaring and only there for a good story.
- Maintain perspective. Politicians cycle in and out of public life. One might attack you or your news organization, but his or her replacement will probably be different. Moreover, news organizations may change their rules as times change, as Baquet told CJR, but in spite of pressures on the industry, their audiences still depend on them. Be able to articulate that history and role in the community.
Contending with Polarized Audiences
Most news organizations aspire to a goal of delivering quality information that can promote healthy dialogue among members of their communities. Some succeed better than others. But it’s more challenging to do this in a society cleaved by partisan polarization, culture wars, foreign campaigns to use technology platforms to divide us, and more. There is a shortage of moderate voices among political leaders, and no shortage of activists exploiting extreme and divisive rhetoric.
By definition, political polarization means more people are aligning themselves with one side of a divide, and positioning themselves against the other. It also means a shrinking middle – fewer people who split their ticket at the polls or are willing to listen to arguments with which they disagree.
Today that partisan divide is also fed by a toxic mix of partisan content, some of it outright disinformation, often shared on social media.
In addition, some outlets in a crowded media marketplace have embraced polarization as a business strategy. And some of the most influential extremists work for cable news channels and partisan publishing outlets that claim to be news.
All of that means the challenge of navigating division and helping people get the information they need to solve problems – one of the traditional roles of a modern post-colonial press – has become more difficult. It also strikes at a central tension of journalists’ identity: where to locate themselves on the spectrum of involvement – the detached, non-partisan recorder of events on one end vs. the activist (booster or dissident) on the other.
As local and regional news organizations consider how to win back dwindling audiences, some argue that newsrooms have positioned themselves too far on the detached end of that spectrum. Their staffs didn’t look enough like their communities; their stories were flawed by trying to “present both sides” of an issue – often the extreme ends, and not much more. Efforts to include community “voices” too often translated into dry op-eds on an editorial page, or stories that treated underserved communities as some kind of “other” culture apart from a traditional mainstream white, upper-middle-class and slightly liberal one. And as editorial-page content has become less discernible from news content online, some newsroom leaders believe that blurring could be contributing to polarization.
Journalists are considering new ways to frame narratives, instead of casting stories as one side against the other.
How can news publishers move toward greater audience engagement without appearing to take sides – an important objective in divided communities – or without aligning themselves with a particular outcome?
This has been the focus of significant work in recent years among journalists, academics and foundations seeking to help news organizations better connect with their communities. As a result, some newsrooms are creating new beats to ensure that otherwise neglected parts of their localities get covered. Instead of casting stories as one side against another, journalists are imagining how to move past side-taking and considering new ways to frame narratives. They are holding community events and creating audience-engagement positions specifically to cultivate these spaces.
In Philadelphia, for example, a community advisory group, working with students and faculty at Temple and Jefferson universities, has partnered with local media to create a resource “hub” to share information and stories from the historic Germantown neighborhood in the northwest part of the city. The resulting discussions and stories aren’t just feel-good pieces about community successes, though there are some of those. The hub has also tackled polarizing issues that affect places like Germantown; it held a conversation with other media and civic-minded local residents about how news organizations could better cover gun violence.
Part of the strategy here is to use in-depth listening and engagement practices to figure out what the members of the community really want and need from their local news organizations. Reporters in the field can provide important intelligence to their newsroom leaders about what people in the community are concerned about, and story assignments can follow. But with newsroom cutbacks, there are fewer reporters and less time for those remaining to spend quality time out in their neighborhoods or with business owners or community organizers, which has in the past been a good way for news organizations to take their communities’ temperature.
There is also the question of how to reach disadvantaged and neglected segments of the community, or specific neighborhoods. In Illinois, for example, the Peoria Journal Star formed a community advisory board as part of an effort to strengthen ties to the city’s South Side. And nonprofit support organizations like Free Press’ News Voices and the Jefferson Center help other news organizations conduct this kind of listening work.
The Tennessean is another example of a publication creating space to listen to varied segments of the Nashville-area population. The paper has invited leaders from the Muslim community to discussions with reporters, sharing concerns and real-life experiences, and it has similarly held conversations with gun owners, veterans and area residents who identify with the LGBTQ community. The Alabama Media Group has explored listening across political divides.
In all these places, the hypothesis is that a first step to building community trust in their local media is to help people be heard. API has a program, supported by the News Integrity Initiative at the Craig Newmark Graduate School of Journalism at the City University of New York, aimed at helping journalists better listen to their communities. In 2019, 10 journalists from news organizations around the country undertook fellowships to do this kind of work.
Make conflict great again: Complicating the narratives
An indispensable project in the field of connecting better with audiences was authored by journalist Amanda Ripley for the Solutions Journalism Network, whose mission is aimed at helping journalists tell stories about how people are responding to difficult social problems. Her report is called “Complicating the Narratives.”
In her article, portions of which were adapted for a piece in The Guardian, Ripley makes the point that journalists – or anyone working amid conflict that seems intractable – must understand that complexity in a story will produce a fuller and clearer picture of an issue and spark readers’ curiosity, rather than cause them to dig into their previously-held opinions about it.
“As politicians have become more polarized, we have increasingly allowed ourselves to be used by demagogues on both sides of the aisle, amplifying their insults instead of exposing their motivations. Again and again, we have escalated the conflict and snuffed the complexity out of the conversation,” Ripley wrote. “Long before the 2016 election, the mainstream news media lost the trust of the public, creating an opening for misinformation and propaganda.”
Ripley started this project after that election, which she said left her with the feeling that journalists, herself included, hadn’t understood how the nation was so divided and how those divisions produced an outcome that so few people saw coming. As a result, she spent time with people who specialize in dispute resolution, like conflict mediators and psychologists, to find out how they bridge differences. What she found, she said, was that these people have techniques that journalists could use to better listen to people and elicit interesting opinions and details that can make stories richer and more complex.
Better listening, not surprisingly, is one of those techniques and, arguably, central to the others. (For a deeper exploration of the techniques, Solutions Journalism put together a guide for journalists to understand them and how they might be implemented in real-world situations.) For a reporter presenting a story, whether on television or in writing, Ripley says, the journalism produced will be more interesting if the journalist listens better – and shows it. This is particularly important for television, which Ripley notes is still the primary source of news for six in 10 people.
Among her other strategies, the notion of amplifying contradictions may be the most counterintuitive for many journalists, who often spend time trying to get at the “heart” of an issue, a single “nut.” A story, after all, provides a frame for people to understand something and, presumably, they understand it best when it’s not complicated. Even when journalists are not framing issues as a tension between groups, they do tend to frame them as a tension between ideas, usually two.
It’s hard to blame them. Journalists in already-stretched newsrooms can get hurriedly lured into the most simplistic this-versus-that interpretations of news developments and cast their stories that way. As Tom Rosenstiel and Bill Kovach write in The Elements of Journalism, good stories explore tensions. But, the authors note, journalists often find themselves reporting on the extreme sides of those tensions. That becomes even more tempting in today’s hyper-partisan environment, especially when the most prominent representatives of each “side” are themselves promoting their views. And those views are easy to find in today’s social media environment: They’ve been summed up and blasted out in 140 characters or fewer.
A related issue is that often, in a bid for balance, journalists will seek to find the opposite view of whatever one side says, and can end up offering a false equivalence that even they know isn’t right. Or they will generalize with attribution, casting a point of view as one that “Republicans say” or “Democrats say,” further entrenching the divide. Such framing leaves no room for a Republican, for example, who believes a dramatic reduction in greenhouse gases is needed to slow climate change or a Democrat who wants to limit immigration at the southern border. In this kind of framing, readers aren’t presented with middle ground – they’ve been offered a choice of partisan loyalty or nothing at all.
The result is the kind of story Ripley identifies, which ends up amplifying insults and downplaying the complexities that might lead to a better understanding of the issue at hand. The vicious circle back to finger-pointing then commences.
The power of meta-perception
The notion of stereotyping or oversimplifying arguments or points of view plays into another concept that University of Pennsylvania neuroscientist Emile Bruneau has been researching: It’s called “meta-perceptions” and it refers to how people in one group perceive what members of another group think of them.
Bruneau, who directs the Peace and Conflict Neuroscience Lab at the Annenberg School for Communication and is lead scientist for the Beyond Conflict Innovation Lab in Boston, says that members of a group who feel that they are being dehumanized, or treated as less human or less evolved by another group, are more likely to react negatively or even aggressively toward the opposing group, which can drive division.
Two classic examples of meta-perceptions surfaced in the last two presidential elections. One was the quote, captured at a private event, in which Hillary Clinton referred to half of Trump supporters as a “basket of deplorables” who have “racist, sexist, homophobic, xenophobic, Islamophobic” views. The other was Mitt Romney’s 2012 comment that 47 percent of people would never vote for him in his challenge against President Obama because they are people “who are dependent upon government, who believe that they are victims, who believe the government has a responsibility to care for them…”
The power of meta-perceptions is that they reinforce the suspicion people have about the other side – that deep down they are looked down upon. “Groups are obsessed with what the other side thinks of them,” Bruneau said.
Clinton’s and Romney’s words were damaging because they not only generalized about the other side but also “confirmed” for one side that the other thought the worst of them. Both politicians attempted damage control, but once the words were out there, they went viral. In fact, the Trump team in 2016 played off the Clinton comment by selling merchandise with the word “deplorable” on it, reminding people of the negative meta-perception she had triggered.
But, Bruneau says, what people think the other side thinks about them is often inaccurate. “Humans have a negativity bias that makes them think the other side thinks worse of them than they really do,” he said.
He said recent data collected from both Democrats and Republicans shows that both sides think that the other side dislikes and dehumanizes them nearly twice as much as they actually do. In addition, he said, both sides believe that the partisan divide on ideological issues – from gun control to taxation and border control – is twice as great as it really is.
At the API summit, Bruneau also talked about the importance of exposing people to realities that challenge their assumptions about other groups. Correcting falsehoods of an opposing side may be less important, he said, than providing people with context about what one side really thinks of another.
In that light, journalists seeking to avoid deepening partisan divides should be aware of ways in which writing techniques like generalized attribution can reinforce and even deepen meta-perceptions by feeding the notion that one side is more dug in than it really is. That only worsens the cycle of conflict.
Journalists resort to sweeping attributions such as ‘most conservatives’ or ‘many liberals’ for a number of reasons. One is space: “Democrats say” is shorter (though less accurate) than quoting a variety of Democrats who might have similar but not identical views. Another is the ingrained habit of trying to make things clearer by oversimplifying them, which leads journalists to unknowingly feed meta-perceptions. The writer defaults to an X-versus-Y construction, which not only builds tension through conflict into the story – one thing journalists are taught to do is make things interesting – but also, supposedly, makes the tension easy to follow. Yet it may produce a false generalization that is not only inaccurate but also makes the problem it is trying to describe worse.
Partisanship and belonging
Group identity is a powerful force, though, and framing stories as one group opposed to another can just reinforce what Jay Van Bavel, an associate professor of psychology and director of the Social Perception and Evaluation Lab at New York University, calls “the partisan brain.”
Van Bavel has studied people’s perception of facts and how social identity shapes the way people perceive information. He says that when people are considering whether to trust information, they often put a higher value on group identity than on accuracy.
“Partisan identities bias a broad range of judgments, even when presented with facts that contradict them,” he and social psychologist Andrea Pereira wrote in “The partisan brain: An Identity-based model of political belief,” a review of current research that concludes that partisanship can bias information-processing in the brain.
This research, they said, “suggests that partisanship can alter memory, implicit evaluation, and even perceptual judgments.” Further, they wrote, the influence of partisan identities “threatens the democratic process, which requires and assumes that citizens have access to reliable knowledge in order to participate in the public debate and make informed choices.”
They suspect that the affiliation is so strong that even when people who strongly identify with a political party are confronted with the failure of that party or one of its leaders, some will “double down” on their support and may even try to recruit others to join their party.
This also explains why people are often susceptible to misinformation that comports with a partisan identity, and resist facts that might contradict or threaten it. It may also explain, at least to those who disagree with Trump, why support for him has remained so consistent despite behavior that in other presidents would be condemned by both parties.
For the purposes of journalism, this framework suggests that a just-the-facts approach is not going to persuade everyone that a story is delivering the truth. For journalists, who dig out and deal in facts as their life’s work, it might be hard to accept that people aren’t always persuaded by facts, but rather by their group identities. As Ripley wrote, “we like to think of ourselves as objective seekers of truth.”
But there may be ways to mitigate this tendency toward groupthink. According to Van Bavel, certain loaded language or partisan frames are more likely to bring out a reader’s partisan bias when they read a story. “Highly moral emotional language,” like characterizing a debate by saying that one politician is “slamming” or “bashing” another, is more likely to trigger divisiveness, he said at API’s summit.
So while facts alone may not be able to persuade many readers, journalists might consider avoiding language that triggers partisan reactions, even when they think that language will get audiences’ attention. The research shows it may instead cause readers to tune out.
None of this is easy at a time when politicians are using emotionally charged language on a regular basis, acutely aware of – and perhaps motivated by – the reality that the press cannot avoid reporting it.
Van Bavel does offer one optimistic note. While polarization in society is higher today than it’s ever been, he says, it’s still probably not as high as people think it is. Like Bruneau’s observation about meta-perception – that people usually think those in an opposing group think worse of them than they really do – this is one way in which reality may not be as bad as perception.
If people are not as divided as they think they are – or as polarized as journalists believe – that represents an opportunity to open minds with the kinds of framing Ripley is advocating. The one thing people on opposite sides of the partisan divide have in common is that they’re human. Stories that don’t accentuate one side dehumanizing the other and that recognize common human experiences, even among opposing groups, are more likely to have an impact.
The message from both Van Bavel and Bruneau: “Affirm a common sense of humanity.”
Strategies for reaching polarized audiences
1. Story framing: Have conversations within your newsroom about story framing and whether new frames could have the power to better deliver truthful information to polarized audiences. Discussion topics should include:
- How stories become more interesting (rather than less so) when sweeping attributions are avoided.
- How “presenting both sides” is not enough. How can we better show the texture and frustrations of people in communities or groups?
- How to avoid false equivalencies. By now most journalists understand that stories about vaccine hesitancy do not need to include anti-vaxxers, for example. In late 2018, when Chuck Todd of NBC’s Meet the Press devoted an hour-long segment to climate change, he was lauded for including no climate change deniers. Are there cases from your news organization that deserve revisiting?
- How can stories and headlines steer away from moral outrage phrases or keywords that might contribute to and reinforce groupishness?
2. Doing richer interviews: Discuss potential interview techniques that can help elicit nuance and texture. Topics should include:
- Listening and “looping.” As Ripley says, people will open up more when they feel they’ve been heard. Looping involves repeating back to the subject what you think they said; often the interviewee will clarify or enrich what they said the first time, strengthening the quote.
- Taking more time: Are journalists too rushed to conduct interviews? How can we elicit more than sound bites? How do we even convince our interview subjects that we want more than that, if that’s all we’re delivering to readers?
3. Community listening: Consider and discuss ways your news organization can implement listening strategies to better understand the community’s news needs. Topics should include:
- Whether public events can bring people together and help a news organization tap into what people are talking about.
- Ways to create opportunities for reporters to spend quality time in the field, away from deadline pressure, to report back to newsroom leaders about what they’re hearing so that story ideas can follow. If not reporters, who can put their ear to the ground in the community?
4. Overall coverage: In polarized communities, seemingly straightforward decisions can give people the impression that the news organization is “taking sides.”
- Is the beat structure of your newsroom built to cover conflict in a way that goes beyond X vs. Y? The flashpoint of a key issue might be at the city council, for example, but is coverage too focused on the process and politics of the issue?
- Is the “voice” of the opinion page giving people the impression that the news organization as a whole is “taking sides”?
Conclusion: A Challenge for Challenging Times
Each of the challenges addressed in this report – misinformation, attempts to manipulate journalists, polarized audiences and disparagement of journalists by politicians – is a discrete problem with its own unique causes and solutions.
But because they all relate to and reinforce one another, news leaders seeking to respond must consider the combined effect of these forces on consumers. In other words, they must look at the problem from the audience’s point of view.
One expert who has tied all these themes together is Andrea Wenzel, an assistant professor of journalism at Temple University. In a 2019 study, Wenzel looked at people’s news and social media habits in what she says is an environment of “pervasive ambiguity” that audiences are feeling amid the political polarization and uncertainty about the truthfulness of what they read.
In her report, “To Verify or to Disengage: Coping with ‘Fake News’ and Ambiguity,” Wenzel organized a series of 13 focus groups in four cities across the country and asked participants a number of questions about how they consume information on news platforms and social media.
What she found was that people cycle back and forth between attempting to verify information and disengaging from it for stress relief. For some, she said, distrust and weariness with the effort of having to look up or validate information they were uncertain about or troubled by led to fatigue and frustration. Some indicated that disengagement from social platforms was driven by a sense of self-preservation.
Wenzel, who has put her work into practice as a founder of the Germantown Info Hub (noted in Chapter 4), which shares information and stories from the historic Philadelphia neighborhood, writes that for media publishers “rebuilding trust means addressing fundamental relationships with the public in need of repair.”
Many of her conclusions are consistent with some of the strategies advocated by other experts cited in this report. She notes that publishers must confront and come to terms with partisan biases, that they need to examine “the toll of exclusively negative coverage,” and listen to people who say that “journalists are distant and that coverage does not reflect their lived experiences.”
The danger of not doing so is that people will turn that temporary state of disengagement into a permanent one. In its 2019 annual Digital News Report, which is based on a survey of more than 75,000 people in 38 markets around the world, the Reuters Institute said that 41 percent of people in the United States actively avoid the news often or sometimes.
There is also a commercial imperative to these efforts. If news organizations see a subscription or paid membership model as necessary to their future success, then they must seek to forge a closer relationship with their audiences. They must win their trust, helping readers get to know the journalists who write about their communities rather than believe politicians’ attacks on them. They must not amplify false information. And they must acknowledge in their stories that the world is complicated, and abandon journalism that seems to ask people to take sides in deepening partisan warfare.
There is also research showing that communities with no newspapers are more likely to be polarized, reinforcing the idea that local news outlets can serve the higher purposes of democracy.
The strategies outlined in this report may not all prove workable or effective for all news organizations. But the need for newsrooms to think about – and try – new ways to operate in today’s environment could not be more urgent, nor the stakes higher.
Journalists or others interested in exploring these topics more deeply can find further resources quoted or linked in this report.
Joan Donovan and danah boyd (American Behavioral Scientist): Stop the Presses? Moving from Strategic Silence to Strategic Amplification in a Networked Media Ecosystem
First Draft’s Essential Guides
- Understanding Information Disorder
- Newsgathering and Monitoring on the Social Web
- Verifying Information Online
- Responsible Reporting in an Age of Information Disorder
Daniel Funke (Poynter): 10 tips for verifying viral social media videos
Cole Goins (for API): How a culture of listening strengthens reporting and relationships
Michael Golebiewski and danah boyd (Data & Society): Data Voids: Where Missing Data can Easily be Exploited
Alice Marwick and Rebecca Lewis (for Data & Society): Media Manipulation and Disinformation Online
Nic Newman, Richard Fletcher, Antonis Kalogeropoulos and Rasmus Kleis Nielsen (Reuters Institute and Oxford University): 2019 Digital News Report
Britt Paris and Joan Donovan: Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence
Whitney Phillips (for Data & Society): The Oxygen of Amplification
Amanda Ripley (Solutions Journalism): Complicating the Narratives, plus interview techniques to help implement such a strategy
Tom Rosenstiel and Bill Kovach: The Elements of Journalism
Jay Van Bavel and Andrea Pereira (New York University): The partisan brain: An Identity-based model of political belief.
Claire Wardle (First Draft): 10 Questions to Ask before Covering Misinformation
Claire Wardle (First Draft): Five lessons for reporting in an age of disinformation
The Washington Post: Guide to Manipulated Video
Andrea Wenzel (Temple University): “To Verify or to Disengage: Coping with ‘Fake News’ and Ambiguity.”