- Episode 106
Trolling the News in an Attention Economy
Terms like trolls and butterfly attacks sound like something out of a childhood fable. Unfortunately, in the digital information landscape, these terms represent very real tactics that can have devastating effects on democracy. What are these bad actors trying to accomplish? And how can you protect yourself from becoming prey to their malicious schemes? To find out, we talk with Dr. Joan Donovan, one of the leading experts on media manipulation, disinformation campaigns, and online extremism.
-
Leah Dajches: Once upon a time, a young, innocent news consumer opened their magical information box, commonly known as a laptop, and began their sacred quest to find reliable information on the news of the day. Then, without warning, a smelly, ugly creature emerged from its digital cave. It opened its hideous mouth and began spewing...
Matt Jordan: Look at me, look at me.
Leah Dajches: The unsuspecting news consumer's screen filled with something so horrifying, so dangerous that they had to slam it shut and run far, far away. Alas, it was too late. The damage had been done. Their attention had already been hacked by an internet troll, and it was all they could talk about.
Matt Jordan: When we hear terms like sock puppets, butterfly attacks, and swarming, it does sound like something out of a childhood fable. Unfortunately, since we've entered the era of the attention economy, these tactics and the scenario Leah described are very real. But unlike a bedtime story, the narrative is much darker, the stakes are much higher, and there's often no moral or lesson at the end of the tale. Who are these trolls? What is it these bad actors are trying to do? What effect do these noise makers have on the media ecology? And most importantly, how can you protect yourself from becoming prey to their false narratives and malicious schemes?
Leah Dajches: To escort us through the terrifying terrain of conspiracy theories and other disinformation, we're going to talk with Dr. Joan Donovan, the Research Director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University. Dr. Donovan leads the field in examining internet and technology studies, online extremism, media manipulation, and disinformation campaigns. Joan, welcome to News Over Noise.
Joan Donovan: Thanks for having me.
Leah Dajches: So we might be familiar with kind of this idea of trolls from childhood bedtime stories, but what is a troll in today's grownup media landscape?
Joan Donovan: So when we're talking about trolling, there's a rather long history here, which we don't need to get into. But essentially what it means is to stalk and harass someone in order to get them to shut down their account. The longer history is that it began in a very playful, prankstery kind of way, where the word was actually trawling, which is a fishing term that means to sort of drag a net and catch what you can. And you'd see this happen on message boards where people would be looking for some kind of fight or they'd be targeting a particular person and then they'd trawl them. That is, they'd kind of catch them in their net and pull them along.
And that became trolling over time, and it started as pranking and then eventually became much more sinister. There's a version of trolling that involves what might be called brigading, or a network of harassers: a group of people all trying to get one person to remove their account. And so over time, academics and other digital rights activists have moved away from the term trolling, which seems to be the more playful version, and talk about what's happening as network harassment.
Matt Jordan: When I think of this term, I often mean people who kind of hack the attention of people, right? That they're disruptive. Is there a better word for that than troll?
Joan Donovan: If you are just disrupting in order to get attention, grandstanding, stunting, these are words we've used in the past for this. But a troll, for me, has some kind of mischievous or perhaps even more sinister or malign intent than someone that is grandstanding or showboating.
Matt Jordan: How does the structure of the internet, right, the media ecology that a lot of our news depends on, reward people who grandstand or troll?
Joan Donovan: It's very simple. Social media tends to reward outrage and novelty. And so if the thing you're saying online is novel and outrageous, and it tends to anger people on either side of an issue, or affirm someone on one side and anger someone on another side, that is when your content has the most potential to go viral. And so trolls tend to be really good at this, which is to say they put out novel things, and whether they're true or not, they don't care, and then causing outrage is really part of the job.
And so we'll see anonymous trolls often kick off conspiracy theories that then trade up the chain very quickly to newsworthy individuals, who then start to spread rumors, and then journalists will cover those rumors because someone newsworthy is saying them. So an example very recently has to do with Nancy Pelosi's husband, Paul Pelosi, being attacked in his own home. Very quickly after the attack, there were trolls and provocateurs on social media saying that this must have been some kind of lover's quarrel, that this wasn't a break-in, that this was some sex slave situation. And it very quickly devolved into conspiracy-laden content.
Journalists called it out saying this is outrageous, but it eventually traded up the chain all the way to Elon Musk, which then becomes a newsworthy story if Elon Musk is spreading this kind of garbage. Even if he's just trolling, he might be just trying to provoke a response. And the story really died down up until yesterday, or very recently, when someone interviewed the son of the man who assaulted Paul Pelosi, who basically said, "I'm estranged from my father, I don't talk to him very often. For all I know it could have been blah, blah, blah, blah, blah." And one of those rationales was that he was Paul Pelosi's sex slave.
And so that story was then disingenuously reported in the news and then revived all of these conspiracy-type theories. And so that's a long way around of saying that when the content is novel and it's outrageous and it provokes reaction, whether the reaction is good or bad, it doesn't matter, because the algorithms are just going to reward engagement. And Nsé Ufot, from the New Georgia Project, often says, "Algorithms don't care about your analysis." And I love that because it really does stop you and make you think, "Okay, if I engage with this content, what kind of information am I rewarding with my attention?"
Leah Dajches: So when we think about trolls, it sounds like they kind of like to eat the novel type of news, or what we're calling "news", right? I mean, are there any other topics or themes that trolls can't seem to resist and kind of gravitate towards?
Joan Donovan: Definitely. So wedge issues are a really big part of this. So different cultures, different cities, states, are going to have different wedge issues. There's an incredible paper by Deen Freelon at the University of North Carolina talking about, I think the title is Black Trolls Matter, and this had to do with how Russian trolls from the IRA went after race issues in the United States. And so if we just think about our recent history and much of the dust-up around critical race theory being taught in schools, that was something that trolls and other folks then eventually just really went in on. And even though critical race theory, the legal theory, wasn't being taught in schools, critical race theory came to stand in for education about race or about racial injustice. And so when trolls get organized into these media manipulation campaigns, they can be very, very effective.
Another example, of course, is very pertinent in the news. There was a meme a couple years ago called "OK boomer". No surprise to us, anyone over 30 is a boomer. It's fine, we'll get over it. And the meme was funny. There were lots of ways to shut down conversations with your parents and whatnot by being like, "OK boomer." But the opportunity then opened up for anti-trans activists who knew that they couldn't call someone a pedophile online because that would be an allegation of a crime that had been committed. They knew they would get a terms of service violation. So they started saying, "OK groomer," which did all of the work of implying that someone who was queer or trans was somehow intent on exploiting children. And it stuck. And now, anytime people bring up pedophilia or "grooming", very quickly, trans activists jump up and say, "You can't talk about trans people like that."
And then the recourse of the trolls, which is always this sleight of hand, it's always this ironic hedge, as my friend Matt Gertson calls it, where they'll say, "Well, why do you think we're talking about trans people? We're talking about groomers." Right? And so they've done the work of attaching groomer to trans issues, and then they take this step back in the discourse and gaslight the trans people who then call it out. And all of this wordsmithing and these language games center around the content moderation policies of these platforms. And they also don't have the same kind of faithfulness that we would have with each other if we were trying to reach some kind of middle ground; they're not interacting with you in an authentic way where they're trying to understand your position and maybe change their mind. Instead, it's about the defense of one person's position and the gaslighting of another person.
Matt Jordan: It's interesting, I mean, again, I often think that there's a kind of a projection element with all this. But of course, if we actually talk about grooming, it's about pushing the line, keep seeing how far you can push it past the norms, past the... That's what grooming behavior's doing. It's getting people used to kind of transgressive or bad behavior. And in a way, that's kind of what online trolls are doing. They're constantly pressing the boundaries of what is in good taste or what is within the kind of social contract to move that Overton window.
Joan Donovan: Yeah. Well, that's exactly it. That's what I was going to say: they used to talk about the Overton window in that sense. And this comes from, I still study white supremacists, but early on when I was studying white supremacists in 2015, before the election of Trump and everything, they would talk about the Overton window and how you needed to have certain people in politics and in culture saying these things so that the public would have a discussion about them. Back then, it was about immigration and building the wall. Trump was using that terminology, and the meme "Build the wall" comes from 2004 and the Minuteman movement, which was a white vigilante movement on the southern border where citizens took up arms to patrol the border. And they had T-shirts and they would chant, "Build the wall," or, "Build that wall."
So Trump just pulled that meme from that moment and put it center stage. And white supremacists, white nationalists in the United States that were posting on certain message boards that I was studying were very clear that Trump is doing the work, and Trump is putting the issues into the public, and that they didn't need to force anything. What they needed to do was massage the conversation, was to get people to think that immigration was the most important wedge issue of the day, and that building a wall would be one way to solve this problem. They also had a lot of other fantastical ideas about how to stop racial diversity from spreading in the United States. Very violent ideas, of course.
But you did start to see that stuff reflected in the campaign and in the campaign materials. And I think there is an element of what sociology would call normalization that happens. And I think here, the demonization of trans people is normalizing the public to the idea that these people are dangerous, that they're a threat to children, that they're in your schools, they're in your libraries, very wholesome places. That is a serious accusation.
Unfortunately, I don't think our institutions are really up for the challenge of handling this kind of onslaught through the internet. I don't think that our institutions are really capable of responding to, in many ways, made-up and exaggerated claims because they also just don't have a ton of experience dealing with these issues, whether it be racial injustice or queer issues in classrooms or other spaces.
Matt Jordan: Of course, one of the institutions that we're interested in for what we do here is the Fourth Estate. Six years after Trump took over Twitter and made it his platform and showed how easy it was to manipulate the media by being kind of troll-like, by saying outrageous things, you would think that journalism as an institution would have learned to not take the bait. Why do you think people continually allow trolls to set the agenda in that way?
Joan Donovan: There's a couple of things going on here. One is that you have a couple of different media ecosystems at play. You have a center, you have a right wing media ecosystem, you have a further right, so a far right media ecosystem. You have a left wing media ecosystem and then you have a further left media ecosystem or a far left media ecosystem.
Now, the far left media ecosystem doesn't really make a lot of noise. The stories that they talk about don't really travel into the rest of the media ecosystem. Whereas the right wing and the far right wing media ecosystems are much more closely tied together. They're much more likely to cover each other's stories.
Then the difference between the left and the center versus that conglomeration of the right is that the right wing is going to have a bunch of novel stories. They're going to have a bunch of stories that they're going to report on that no one in the left and the center are going to give much attention to.
The iconic case of this is the Hunter Biden laptop. There were numerous attempts to make Hunter Biden the story in 2020 prior to the laptop drop, which I hesitate to even call a laptop drop. It looks to me more like an iCloud hack.
But nevertheless, there were at least two concerted attempts by the right and the far right to make Hunter Biden the news, and the center and left just didn't pick it up. This has a lot to do with the fact that Hunter Biden is a known quantity in media spaces. He is not his father, who was running for office.
Then the sort of poison pill of the laptop was that everybody, everybody was waiting for a hack-and-leak operation in October of 2020. We know that when tactics work like they did in 2016, they are likely to work again.
So everybody was waiting for it, and when it happened, it caused so much chaos. There are a couple of things going on there where the left and center really just kind of stepped to the side and said, "Well, unless we have more information about where this content comes from and what the veracity of the claims are, we're not going to report on it until it's been fact-checked." On the right and the far right, especially the far right, they were just throwing spaghetti at the wall, nude pictures of Hunter Biden, Hunter Biden doing drugs. Then there was this rather intricate story of money and favors and attention, and that story about Hunter Biden selling his father's influence was rather complicated and didn't really land even on the right. People were more upset by the salacious pictures than they were by this supposed influence peddling.
So that story didn't really have the impact that they thought it would, because everybody's guard was up about a hack-and-leak attempt and also because Hunter Biden was a bit of a nothing burger at that point. Which is all to say that these media ecosystems are going to have moments where they're reporting on things, and what they're trying to do is compel the other side of the media ecosystem to respond.
We see this with the Twitter files. Some of the coverage of the Twitter files on the right is about how the center and the left isn't covering the Twitter files.
Now, to make the Twitter files newsworthy, for one, they need to be more contemporary. We're talking about 2020. We're talking about inside baseball, about employees' chat messages to one another about how to apply their policies. We're talking about what everybody already knows, which is that the government was updating all tech companies, not just social media companies, but all the way through our tech infrastructure, about threats to platforms. So cybersecurity companies were getting updated by DHS and CISA.
It's not that uncommon. So if you work in this field, the claim that Twitter was meeting with DHS and the FBI, it's just not an anomaly. That doesn't show anything nefarious. But what they want you to believe is that the Deep State had their hands in the gears of content moderation at that point, which is all to say that it's messy, but it's not entirely that complicated.
But what Musk has been trying to do, of course, with the Twitter files is make them as pathbreaking and as important as the Facebook files, which were the leaks from Frances Haugen that showed that Facebook knew that their products were dangerous to certain segments of the population. Her testimony in front of Congress and subsequent investigations by reputable news organizations, of course, is what made the Facebook files important. But anyhow, I've gone on too long.
Matt Jordan: It's interesting. In the piece you just wrote about the Twitter files, you said that silence is still the editor's best weapon. So you're talking in a way about an institution of journalism which is actually curated, which is actually fact-checking, which is actually doing all the things we hope that news will do.
But that silence that they then use as a weapon against trolls and against the kind of flying monkeys who enable the whole operation to work is going to play, in a way, right into those narratives, which are kind of old narratives that date back to the Pentagon Papers about this conspiracy to suppress things. So in a way, they're mining these old narratives about the press and about the influence of government to try and get everybody to take the bait.
Joan Donovan: Yeah. I mean, the conspiracy to suppress and the word suppress itself, it's been interesting to think about because in the context of the early internet where Musk comes from, there was this adage, I think from Stewart Brand, that said, "Information wants to be free" and this idea of transparency that Musk is interested in is one where he highly curates which messages are being seen by the public about what's happening on the platform and about what decisions were taken. So he's created his own main character set where he's going after some of the executives from Trust and Safety that have since left, as well as some of the people who testified in the January 6th hearings.
I think that this is really important because what this is showing isn't just that, at any point in your career as someone who works at Twitter or uses Twitter, Musk might try to use your messages to his advantage.
But I also think it shows this very early ethic that some of these early internet entrepreneurs had, which was that all information is equivalent and, therefore, all information should be transparent and available. There should be a permanent record.
I don't think that what is going on here is leading to that kind of transparency that Musk is hoping for, because he is selecting and being very careful about whose messages get released and under what conditions. Then he's also lacing the Twitter files with allegations that Twitter executives weren't looking at child exploitation content in the way that they should have, which is then pushing this other much older conspiracy theory around elites seeking to harm children.
So when it comes to suppression of information, if you think about some of the early heroes of the internet age like Edward Snowden or Chelsea Manning, these people, at great cost to themselves, leaked information primarily about the government. I think the shift that we're seeing here, which I think is a really important one, is that the government is less of a threat to the public in terms of freedom of information than these tech companies have become.
So I know that that isn't exactly something that I'm going to be able to prove empirically, but I do think that there's something going on here where we're seeing a shift in the balance of power, and the way in which we used to understand and imagine free information and information suppression coming from the government is getting convoluted and disrupted by the fact that these information conduits, particularly social media, are now acting as mediators and possibly suppressors.
Leah Dajches: Just a reminder, this is News Over Noise. I'm Leah Dajches.
Matt Jordan: And I'm Matt Jordan.
Leah Dajches: We're talking with Dr. Joan Donovan about how provocation and disinformation impact journalism, society and democracy.
Matt Jordan: I think about the work that's been done trying to analyze the firehose of falsehood method for information distribution, and one of the tenets of that is that the Russian government found out long ago that it was much more effective to just spew nonsense out there at scale, rather than try to suppress anything. What they were aiming to do was just keep any consensus from happening, so you just let it all fly, just throw everything out there so nobody can really get a sense of what is actually true. When nothing is true, then people tend to go back to their kind of authority figures, their identity groups and whatnot.
So, when I look at somebody like Elon Musk, the head troll of Twitter, you can, in a way, see what he's up to here, which is just to create what he calls the kind of narrative layer of the internet just to make it so that no consensus can really ever happen.
Joan Donovan: Yeah. I think it's hard for me to think about Elon Musk as a narrative strategist in that way. I do think that he understood very clearly what Twitter was.
There was an interesting thing that happened in a Twitter Spaces the other night that I was co-hosting, where several journalists that Musk had banned came into this chat, because we figured out, essentially, that even if they suspend your account, you can still use Twitter Spaces, which was kind of bizarre. So you had these banned journalists in the chat talking about how they don't know why they got banned, and how it had to do with stories they had written about ElonJet, that account that tracks his jet.
Elon Musk shows up, he enters the chat, and a hush goes over the crowd because people have no idea what he's going to say. But he, essentially, comes in and says, "Journalists are the same as everybody else. You're not going to get any preferential treatment anymore." It was very much him coming in with his dad voice saying, "These are the new rules," and that he was going to get really serious about doxxing on the platform, which is the sharing of people's personal information, usually for the purpose of intimidation.
So he comes in, he says this stuff, and then he leaves very quickly. A few of us are back channeling, and it's pretty obvious to us that he's going to figure out a way to shut down the chat because, like, he was not having it. He was like, "And none of this ban evasion stuff is going to fly."
So I don't tend to think he thinks that far ahead about narrative. It didn't seem like he cares that he's contradicting himself. He seems to be really aggrieved by the insults and people's impression of him. In that case, there were maybe 30,000 people on the Twitter Spaces, and many very reputable journalists from important news organizations were present. It just seemed like the narrative here was, "I own this platform and you're going to deal with my rules," and that's just it. And on the one hand, that's what the web was supposed to be. I have my web, I got my GeoCities or my Angelfire message board, and you come on and if you piss me off, I'm going to get rid of you. And if you don't like it, go build your own website. But we can't be so facetious about this, because ultimately, what do we destroy when Twitter becomes this plaything of a billionaire? It's that a lot of communities that never would've found each other, very local communities or groups of people that have been marginalized in different ways that have found each other through Twitter, are going to have a harder time finding other places to get attention from the media.
And so what's interesting, I think, about Twitter in general is that if you're a news organization and you want to drive eyeballs to your content, Twitter doesn't do that. What Twitter does is allow journalists to reach new sources, to do news gathering and to talk with other journalists. And so in that same Twitter Spaces, Ben Smith of Semafor made a very interesting point, which is that if journalists all left Twitter, management would be really happy, because it's not going to affect traffic to the website and their journalists would have much less power in terms of fighting back against their news organizations or unionizing and whatnot.
So I do think that Musk, obviously, as a narrative wants to be the main character in Twitter every day, but I do think that there are things to be lost by having Twitter operate under these conditions.
Leah Dajches: And when we're talking about narratives, and Musk's narrative in this sense, I think back to trolling and this narrative of spreading false information. And so thinking largely, sometimes I just wonder why? Why do we have trolls who are spreading false information? How does this benefit anybody? Is there an end game to this?
Joan Donovan: It's an interesting question about the why. There's politics: obviously, if trolls are involved in politics, you can win political arguments, political clout, attention from politicians. If you are a politician, you can win constituents over. You can, in some ways, influence media coverage. And trolls do get the advantage in many instances of politics, if that's their main goal.
There's also profit to be had, if we think about certain people that have trollish type behavior online. Someone like Milo Yiannopoulos, for instance, before he was taken off of Twitter, used trolling and the harassment of celebrities as a way to get attention for himself, and then he would monetize that. He would do campus events or he would sell T-shirts and whatnot. He was somewhat famous amongst the white supremacist set for printing up a T-shirt that said, "It's okay to be white." Which was a slogan that a lot of white supremacists were really proud of because it caused a lot of cognitive dissonance amongst young people, essentially young white men. But when Milo made a T-shirt out of it, they were just like, "This is ridiculous. He's profiting off of our meme."
So politics, profit, and then you have fun. Some of this just comes down to the LOLs. Can you get journalists or other notable people in such a frenzy that they shut down their accounts, so you can take that trophy? With trolls, it's not just a question of whether trolls are going to exist, but really about when trolling is going to happen. And the when of trolling usually is during breaking news events. And so we see trolls very much jump into action after very tragic events. They want to get news organizations to report the wrong information. They will lace message boards and other places that journalists might be looking for evidence with false evidence. It's really quite a game for them. They might be talking in a Discord channel about how they're going to manipulate media or create fake Instagram accounts or put information on message boards or Twitter, to get journalists to report the wrong information. And in that way, I think some of these trolls, if they did have a cause beyond just being totally nihilistic and black-pilled, it is to be anti-media. That is, to get mainstream media to look foolish.
And I think that that's something that we have yet to really reckon with as a society: how do we deal with the last decade of social media, where the anti-media sentiment begins on the left? And it starts with trolls, or trolling-type behaviors to trick journalists, and with imposter websites, the way, for instance, the artist-activist group the Yes Men did it: impersonating the New York Times, impersonating the WTO, impersonating George W. Bush. Which is to say that these shenanigans aren't necessarily without merit, but it really depends on the intent and the actual outcome. Whereas many of the most destructive trolls aren't here to reveal any bigger rift, or any information that would make you think differently once you've witnessed it. What the Yes Men do is primarily pranks.
But yeah, I think that there's a kind of advantage that you get as a troll online if you don't really care that much about the outcome, but just want to sow chaos.
Matt Jordan: Some of what you're describing, both on the left, the early left, and from critics of them, is a kind of anti-institutionalism, right? Anti mainstream media, designed to... I've always thought of it as niche marketing. You want to take out the center so you can create a little niche market for yourself, and you go after the competitor like Coke goes after Pepsi, a little bit. But I'm wondering, in relation to that, as these new people have come into this space and tried to capture that attention that used to be owned by the center mainstream media, what do you think about the concept of citizen journalism, which is often offered as this democratic saving enterprise?
Joan Donovan: I, years upon years ago, would've called myself a citizen journalist, which is interesting to think about in this light. There was a time, as we were moving from internet at home to mobile internet, where it was pretty remarkable to be able to use your telephone, at that time I had a Blackberry, to take pictures or to take videos that you could then instantly upload to the internet and get feedback on. And in 2011, you really saw mobile phones be this incredible instrument for holding power to account, across the globe.
What we knew about what was happening in Egypt and in Tahrir Square for instance, wasn't coming from mainstream media so much as was coming from clips of people on the ground who were trying to show what was happening at these protests. As well, over the years the phones have gotten much better, the definition has gotten much higher, and now pretty much every one of us has a broadcast station in our pocket. High definition with limitless bandwidth and capacity to reach untold amounts of people. And I think that that's really powerful. I think it's really, really powerful.
And perhaps our governments haven't really been able to reckon with the power of the technology to move us beyond the blogosphere, where people were describing what they witnessed and were trying to get people to engage with their content, to this much more what we might call hot media that has video and images and has this gotcha type of look and feel to it. And so I do think that the stretch here, though, is calling this journalism. And I say that because now that I've spent a lot of time around journalists in newsrooms, the organizational hierarchy of what a news organization does, from getting the piece of data or the video or the piece of information, or sending the reporter to an event, or getting the copy, all of these layers of editorial as well as fact checking as well as copy editing, give us a much more professional, albeit slower, version of what we might call facts, or the first draft of history. Citizen journalism doesn't tend to have any of those layers.
So for me, it doesn't really reflect journalism so much as there's something else going on here. Maybe we call it reporting, I don't really know. I'd love to ask you all, what else should it be called if it doesn't have editorial, if it doesn't have the same kind of structures that we commonly think of when we think of the word journalism? Because we think, oh, there's more to it than just a person with a camera.
Matt Jordan: I tend to think of this stuff as infotainment and these are content creators. But as you say that, I was thinking of the difference between people who just go out and play music every day in the street versus people who record in the studio. And there's something about this first impulse that comes out of your mouth that I think is what we see in microblogging. Things like Twitter or any social media are really without the institutional practices that you described, which are what I think we depend on. I'm wondering what you think... It took journalism a long time to create these norms, and there have been ebbs and flows in how much regulation we think we need. But one thing we can say about the internet in general is that it has been completely devoid of these norms, the liability, the ability to hold people accountable for what they publish. Is that something you think will require a moment, a reckoning, where we're going to have to say people have to be liable for what they say?
Joan Donovan: Yeah, we live in a very litigious society, and I would hate to suggest that the best way to counter this would be to sue everybody. But there is something to be said about the kind of liability that Fox is going through with being sued by Dominion Voting Systems. So there was a lot of conspiracy about the 2020 election going around, so many different theories. But one of them was very particular about Dominion, the brand of voting machines, being somehow infected with communist algorithms. And it was pretty outrageous and dumb, but at the same time, when it was being talked about by these different news organizations, it was being taken pretty seriously. And so you had folks on Fox News and other outlets talking about Dominion voting machines being rigged in some way. That is devastating for a business that relies on integrity. The role of the voting machine is to get the count right. And once you make it seem like they can't count the votes, the whole business model falls apart.
Now, only certain actors in this world are going to have recourse to lawyers that can enforce this. And what we're dealing with right now is a bit of an open season online, where anyone can say anything about anyone, and then you've really got to let the chips fall where they may. There's a lot of room there for people to contradict or try to offer up evidence to the contrary, but by and large, once a rumor is established about a politician, it's very sticky. It's very hard to get rid of; we've seen this happen time and time again. And if the entity that is pushing the rumor or the false accusation appears to look like a news organization, appears to look like a nonprofit, appears to look like they have credentials from a university, for instance, that's still going to matter. And so people do look for these signals, even within these falsehoods, to try to think about, well, where is this information coming from? Is this a reputable source? And so the other thing that has been turned upside down and kind of hacked by media manipulators is that credentialing. And part of this is that they're not jumping in and saying, "I'm a citizen journalist," so much as saying, "I am a news organization," or, "I represent this nonprofit," which, the more I dig into some of these things, they're not; they didn't even go so far as to register as a nonprofit.
Which is all to say that as we think about and work on the issues of media manipulation, disinformation, and how it affects our politics, our economy, our educational systems, our public health, we really need to start to shore up what I call TALK online: timely, accurate, local knowledge. So the more we can focus on creating the pipes for TALK, and I say knowledge instead of information, because knowledge is something that is vetted, the more we can build pipes for TALK, the better off we're going to be in terms of displacing all of this media manipulation and disinformation.
Leah Dajches: So we're kind of nearing the end of our time together, and I just kind of want to hear some of your advice. Our podcast is about empowering citizens, and so for our listeners, how can we combat trolling when we come across it, whether it's on our newsfeed or even at our dinner table?
Joan Donovan: Well, I mean, yeah, I never call a troll a troll, because that is only going to strengthen their resolve even more. I mean, ignore it when you can. Over the years, I've written some about silence and how, if you don't give someone the response that they are agitating for, there's no escalation of the claims because they're not getting the kind of engagement that they imagine they would get. However, that's hard. That's really hard. If they're going after your name, if they're going after your employer, if they're going after your family, it's hard not to stand up and say, "Quit it. You're being dumb." Right?
But unfortunately, in the online worlds, those kinds of responses tend to become trophies for media manipulators, disinformers, trolls and the like, serial harassers in particular. Because their goal is essentially to upset you, and the less satisfaction you can give them in terms of being upset, the less they have to go on over time. That being said, choose your battles, choose who you engage with, mute people instead of blocking people, and if it's happening within your own family or if it's happening at work, you can always say, "I appreciate your opinion and everybody's got one," and that's about it.
Matt Jordan: And you work on silence, or have been thinking about silence, and how it's a great weapon against misinformation and trolls. Is it axiomatic to say, maybe, that the news is not going to be whatever is noisiest? And for people, if everybody's talking about something, that might be something to look out for?
Joan Donovan: Well, the thing about news is, news is a legitimating function of many assertions. So attention from journalists is something that many of these trolls are seeking. This is why we see a lot of trolls showing up in the replies to journalists trying to get attention, linking journalists to specific stories, or trying to get journalists to add some particular narrative to their beat. And journalists have a really important role to play in terms of setting the public conversation, setting the agenda here. And so journalists should often look for those who are impacted and try to tell their narratives, try to tell the narratives of the people who are, in many ways, the ones being assaulted verbally or talked about, but never really talked to.
For instance, to take the example of how you would write a story about "OK groomer": you would want to look at and understand the narratives of a couple of individuals from the trans community. And they might not even be directly affected by the allegations, but rather are family members who are saying, "One member of my family wants to access medical care, but all of these hospitals are removing information about gender affirming care, and we don't know where to go now." Right? And so there are ways that you can tell these stories and talk about the impacts, and shed light on the issues that are happening, without going directly to the trolls and putting a microphone in their face and saying, "Tell me why you're doing this."
This was very common when I was studying white supremacists: journalists would go right to a white supremacist and be like, essentially, "Tell me you're a racist." And then they'd say, "Well, I'm not a racist, but I am part of the alt-right and I would like an ethnostate in the United States. I do think we need to stop immigration."
There was a really controversial article years ago in the New York Times called the Nazi Next Door, and basically they had gone to the grocery store with this neo-Nazi, and he was like, "I don't know how people think I'm a racist. I enjoy tortillas." These were the literal themes of this article, and of course there was a disclaimer later put at the top of the article; the New York Times did walk back some of the coverage. But that impulse to try to humanize trolls, or those that are creating this kind of pain in the world, I think, is potentially a destructive one, in the sense that we should really be looking at and hearing from the people who are most impacted by these narratives.
And what does it mean for them in the practice of their everyday lives? I think that that's a really important thing that we need to know as a society, which is that trolling, and when trolling turns into network harassment, has very negative consequences, not just because there are absolute effects and harms committed against the people who are directly involved, but also because it has these cultural ripple effects where people who are seeing this happen don't want that to happen to them. So they tend to turn inward. They may not seek the medical attention or the care or the rights that they deserve because they don't want to be treated in this way. And I think that that's the most important thing that journalists can do at this time, is to really be selective about sources.
Matt Jordan: All right. Joan, I would like to thank you very much for talking with us today. This is a lot to think about and a lot to be intentional about as we work our way through this media ecology.
Joan Donovan: Thank you so much. It was really great to be here.
Matt Jordan: Leah, that was a really interesting conversation with Joan, and she gets down into the weeds so much in these media ecologies that it's hard to know where to come out. And I was wondering what your takeaways are from today?
Leah Dajches: Yeah, absolutely. This was a very informative conversation we had with Joan about a really, I think, timely topic. And I think my major takeaway today is really the importance of news literacy and understanding who is setting the agenda and why and being able to kind of dissect where your information is coming from and how you can respond appropriately when it's false information. What about you, Matt? What was your large takeaway from today?
Matt Jordan: Well, I find myself in a way learning to think about my own impulses to engage, right? That there's a thing we all have, I think, when we see injustice or when we see somebody suffering or when we see things, that we want to do something. And it seems like maybe the best way to deal with trolls is to not do anything, right, is to disengage. And so to not get pulled into the Twitter dumpster fire or to not get pulled into these online debates, and really to go back to, I hate to sound all stodgy, to go back to mainstream media, and it's more curated content than kind of getting into the melee that sometimes you see online.
Leah Dajches: That's it for this episode of News Over Noise. Our guest was Dr. Joan Donovan. For more on this topic, visit newsovernoise.org. I'm Leah Dajches.
Matt Jordan: And I'm Matt Jordan.
Leah Dajches: Until next time, stay well and well-informed.
Matt Jordan: News Over Noise is produced by the Penn State Donald P. Bellisario College of Communications and WPSU. This podcast has been funded by the Office of the Executive Vice President and Provost at Penn State and is part of the Penn State News Literacy Initiative.
END OF TRANSCRIPT
About our guest
Dr. Joan Donovan is a leading public scholar and disinformation researcher, specializing in media manipulation, political movements, critical internet studies, and online extremism. She is the Research Director of the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy and the Director of the Technology and Social Change project (TaSC). Through TaSC, Dr. Donovan explores how media manipulation is a means to control public conversation, derail democracy, and disrupt society. Dr. Donovan is co-author of the book Meme Wars: The Untold Story of the Online Battles Upending Democracy in America. She is a columnist at MIT Technology Review, a regular contributor to the New York Times, The Guardian, NPR, and PBS, and is quoted often on radio and in print.