I’ve spent my career fighting for democracy and truth in Russia and Eastern Europe. I worked with civil society activists in Russia and Belarus and spent a year advising Ukraine’s Ministry of Foreign Affairs on strategic communications. These experiences inspired me to write about what the United States and the West writ large can learn from countries most people think of as “peripheral” at best.

Since the start of the Trump era, and as coronavirus has become an "infodemic," the United States and the Western world have finally begun to wake up to the threat of online warfare and attacks from malign actors. The question no one seems to be able to answer is: what can the West do about it?

My book, How to Lose the Information War: Russia, Fake News, and the Future of Conflict, is out now and seeks to answer that question. The lessons it contains are even more relevant in an election year, amid the coronavirus infodemic and accusations of "false flag" operations in the George Floyd protests.

The book reports from the front lines of the information war in Central and Eastern Europe on five governments' responses to disinformation campaigns. It journeys into the campaigns Russian and domestic operatives run, showing how we can better understand the motivations behind these attacks and how to beat them. Above all, this book shows what is at stake: the future of civil discourse and democracy, and the value of truth itself.

I look forward to answering your questions about the book, my work, and disinformation more broadly ahead of the 2020 presidential election. This is a critical topic, and not one that should inspire any partisan rancor; the ultimate victim of disinformation is democracy, and we all have an interest in protecting it.

My bio: https://www.wilsoncenter.org/person/nina-jankowicz

Follow me on Twitter: https://twitter.com/wiczipedia

Subscribe to The Wilson Center’s disinformation newsletter, Flagged: https://www.wilsoncenter.org/blog-post/flagged-will-facebooks-labels-help-counter-state-sponsored-propaganda

Comments: 529 • Responses: 75

Plusran163 karma

Wow, this is amazing! I’ve had an idea like this bumping around in my head for a while. I was calling it ‘how to destroy America’ focusing on dividing the people.

Question: what are your top 3 recommendations that regular people can do to identify and combat disinformation?

wiczipedia464 karma

Awesome question, thank you so much for asking! I think for most reddit users these will be pretty simple, but...

  1. Check the source- if you're looking at a website and it seems shady or is new to you: does it have an editorial masthead? Does it have contact info (a physical address and phone number)? Has the author written anything before, and does their earlier work show similar editorial integrity?
  2. Has the article been printed anywhere else? Drop a line of the text into Google and see if the same text appears on other websites- this is a good indication of a for-profit disinfo or misinfo network.
  3. Reverse image search! Misattributed images are huge during times of crisis. Everyone should know how to reverse image search. Here is an in-depth guide: https://www.bellingcat.com/resources/how-tos/2019/12/26/guide-to-using-reverse-image-search-for-investigations/ (a quick sketch of checks 2 and 3 follows this list).
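
Since checks 2 and 3 are mechanical, here is a minimal sketch of how someone might script them. This is my illustration, not anything from the book or the AMA: the article snippet and image URL are hypothetical, and the TinEye query parameter is an assumption that may change.

```python
# Sketch: open a quoted-phrase web search and a reverse image search for a
# suspect article, using only Python's standard library.
import urllib.parse
import webbrowser

def check_article(snippet: str, image_url: str) -> None:
    # Check 2: search for an exact sentence in quotes; many hits on low-quality
    # sites can indicate a copy-paste disinfo/misinfo network.
    phrase = urllib.parse.quote_plus(f'"{snippet}"')
    webbrowser.open(f"https://www.google.com/search?q={phrase}")

    # Check 3: reverse image search the article's lead photo to spot
    # misattributed images (the ?url= parameter here is an assumption).
    img = urllib.parse.quote_plus(image_url)
    webbrowser.open(f"https://tineye.com/search?url={img}")

if __name__ == "__main__":
    check_article(
        "Officials confirmed the incident was staged",  # hypothetical snippet
        "https://example.com/lead-photo.jpg",           # hypothetical image URL
    )
```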

coryrenton145 karma

In your opinion, which is the smallest or least likely non-state actor that is the most effective at cracking down on disinformation campaigns?

wiczipedia101 karma

Do you mean in terms of civil society organizations, platforms, journalists, etc?

coryrenton70 karma

Sure, if there were any surprising ones (say a high school journalist uncovering major corruption), but I was thinking more along the lines of say a cereal company having to employ its own anti-disinformation squad for some bizarre geopolitical struggle affecting breakfast cereal markets, or something along those lines.

wiczipedia176 karma

Steak Umm has been great! (Bless) https://twitter.com/steak_umm

On the more academic / activist side, I like the work of Data & Society a whole lot: https://datasociety.net/

coryrenton50 karma

Steak Umm is indeed very surprising!

On the other side, what is the weirdest thing you have seen being co-opted during the course of a disinfo campaign?

wiczipedia86 karma

There are some good ones in this thread (taken from the 2018 IRA ad dump): https://twitter.com/wiczipedia/status/994587498692206592

My favorite is probably the golden retriever holding a US flag who says "Like if you think it's going to be a great week!"

Eattherightwing144 karma

Nina, I suspect that disinformation campaigns work because people are overloaded with information, and disinformation campaigns simplify complex issues, thereby getting more airtime.

Now if you come along and say "I've got a 300 page document that outlines a strategy for investigating misleading information," will you not just get drowned out in the clamoring voices?

I guess my question is, how do we simplify this? How do you encourage people to "stay with you" as you carefully spell things out? The attention span out there is zero right now!

wiczipedia202 karma

Hi all, sorry for the delay- power outage here but I'm back :)

You're absolutely right! Information overload or a "firehose of falsehood" (as the RAND Corp calls it) is part of the strategy.

I think in part, the media needs to do a good job distilling information and laying it out for people. A great example of this is the series that PBS Newshour did distilling the Mueller report for those that didn't want to slog through it in print. That's the sort of thing more outlets need to be doing- and public journalism is really good at it. I'm a huge advocate for journalism as a public good, and hope we as a country start to invest in it more. We only spend $3 per person per year on the Corporation for Public Broadcasting. We can do so much better, and provide quality information to people who might otherwise live in news deserts (NPR and PBS provide some of the only local coverage in some parts of the country).

Eattherightwing26 karma

Thanks for the response! Public broadcasting is indeed a good thing. The corporate versions of mainstream media can be bought and sold, and therefore manipulated. If people don't want fake news, they need public journalism. I think it's the only way some people can trust media at this point.

What about public social media? I suppose the CBC has a great presence in my country (Canada), but forums and other social media platforms are all corporate. Maybe it's time for NPR, CBC, BBC, etc. to create the new Twitter, Facebook, or Reddit. Trust is becoming the biggest factor in this stuff...

Anyway, thanks for taking the time!

wiczipedia21 karma

Canada is great, and I am glad to hear you like the CBC's social media. I agree that nobody's really cracked the "social news" code yet, but I would love to see this happen!

DiceMaster21 karma

This is a great question, and I hope OP answers. Just my two cents:

I think the influence of a book like this, at least in the best case, extends far beyond just the individuals who read it. If the book is well-written, people who are interested in the pursuit of truth, fairness, and justice will read it and arm themselves with ways of both seeing through disinformation when it is presented to them, as well as ways of promoting good information when they speak to others.

If the book only addresses how to recognize disinformation, but falls short on how to reach others with quality information, it will not be a very useful book, in my eyes.

wiczipedia44 karma

Thanks! The idea behind the book is less about recognizing disinformation and more about telling the story of its decades-long patterns. It's written in an accessible, character-based way (and is pretty short as far as non-fiction books go). My mom called it "not boring like most non-fiction books"- which is probably the best endorsement I could have hoped for :) It might not be everyone's cup of tea, but I think for those who want to know more about how both domestic and foreign disinformation function, it should be interesting!

Revolutionary-Kale101 karma

How do you recommend dealing with someone who claims mainstream information outlets are “incredibly biased and have agendas” while putting up articles from fringe sites with a historical record of twisting the truth? It is always in a suggestive format of “Did you hear about this? It is worth considering. Don’t brush it off too quickly.” (An example being microchips in vaccines.)

wiczipedia153 karma

This is an awesome question! I always recommend talking/chatting with the person privately (as opposed to leaving a public comment or responding to a tweet). Opening with a nonconfrontational question is a great way to start- something like "This is interesting- why does it resonate with you?"- then gently pointing out the inconsistencies in the information. I find that linking to fact checking sites in particular tends to put people on edge- instead just speak from your own experience and knowledge and make it human. Good luck!

Kahzgul17 karma

Do you also do this on social media? Isn't a side effect of that approach that the incorrect statements remain public to be spread to countless others, while the correction is only a private discussion, reaching at most one other person?

wiczipedia51 karma

I've found in my own interaction online that these private interactions are usually better. Unfortunately very few people will see corrections on social media, and studies suggest that fact-checks/corrections often don't change people's minds. Further, if you engage publicly you risk amplifying the bad info. This is the approach I generally try to stick to, offline or on.

josefjohann7 karma

and studies suggest that fact-checks/corrections often don't change people's minds

I know this idea became popular with Brendan Nyhan. But I had the impression that it was dependent on circumstances and how you frame the study that examines it, and that the blowback effect had been disputed in other research, although I'm admittedly fuzzy on my source for that.

I guess I'll just lay my cards on the table and say that I don't doubt this is true, but I think there are examples in different circumstances of subjecting falsehoods to unapologetic, comprehensive confrontation having the effect of shifting public opinion, such as Popular Mechanics debunking 9/11 myths, or scientists & mainstream news pushing back more forcefully against global warming misinfo in the mid 2010s after treating it like a 'both sides' issue. Is the issue that it runs deeper than a communication strategy and depends a lot on whether the source of information is trusted?

wiczipedia17 karma

I'm familiar with the Nyhan study you're referencing, but I'm actually harkening back much earlier to psychological studies from the 70s. Basically, these studies find that when people are corrected, they're more likely to remember the false information than the correct version. There are some more encouraging studies specifically on social media labeling that have come out recently, but I still think it can only be part of the solution, as I've seen from my research deep-seated distrust of fact checkers in vulnerable communities. So I think you're right in your ultimate conclusion- the source matters. This is why government or platform campaigns that encourage healthy information consumption habits will be hard-pressed to find success- what we really need is trusted third parties, community leaders, etc., adopting these tactics and teaching their communities about them. TikTok is trying something like this with its media literacy efforts; in general I'm a bit skeptical of that effort but eager to see where it goes!

CHUBBL3S13 karma

The issue is not about the sources of information (mainstream media/fringe website) but the evaluation of the specific claim itself. The only thing responding publicly does is give the claim more credence and the fringe site more traffic. It will spread less if you don't engage; and not one Holocaust denier, flat earther, etc. will be convinced by whatever you, a brainwashed sheeple, have to say.

Responding privately also turns the discourse into a conversation, rather than a public debate. If they were going to do any self-reflection it's more likely here. But the main benefit is to stop the sick from spreading.

Kahzgul5 karma

I guess that's the part I don't understand. How does privately messaging someone who publicly posts their sickness stop the sick from spreading? The public only sees the links to fringe websites, with no one challenging their claims.

wiczipedia15 karma

The idea is that hopefully it changes their behavior in the long run. I know that is cold comfort, though :-/

chrisfrap77 karma

I'm very excited to read your book, my copy is shipping today!
As for my question: Prior to the rise of Facebook, what were some of the more popular disinformation channels utilized in Eastern Europe, and is there a way to measure how effective those efforts were?

wiczipedia105 karma

That's wonderful, thank you so much for ordering!

Before social media were so ubiquitous, state-run media provided a key influence vector for Russian disinformation. It played a huge role in Russian interference in Estonia in 2007, when Russian-language media exacerbated the grievances of the ethnic Russian population that led to riots, and in Georgia in 2008, when Russian state media and international propaganda networks sought to counter the Georgian government's narrative about the five-day war.

Effectiveness, whether we're talking about social or traditional media, is a hard thing to measure. Most people want to know if these efforts changed votes, but I think that's the wrong question. The goal isn't necessarily to change votes, but to change thinking and discourse, and there is certainly evidence of that in both of those cases and in the 2016 election in the United States.

RedWarFour36 karma

What sort of "thinking and discourse" do you think Russia is trying to promote? Are they just trying to create division in the US?

wiczipedia226 karma

Yes, an intermediate goal is to promote discord and division, but in service of what?

I see Russia's influence operations as having three goals, broadly.

  1. The Kremlin wants to keep us (the West, broadly) turned inward, distracted by our domestic problems, so that we aren't paying attention to Russia's adventurism around the world, whether in Syria, Ukraine, Venezuela, or even within Russia's own borders, where human rights abuses have been rampant.
  2. The Kremlin hopes to drive disengagement in the democratic process by flooding the zone with information. Democracy doesn't work without participation, and failing democracies pose less of a threat to Putin's authoritarian rule.
  3. Putin hopes to return Russia to great power status- and I think he's been pretty successful in this regard. Despite not having a very strong economy, Russia is back on the world stage. The West has discussed it every day for the past four years. And even though Putin hasn't been absolved of his transgressions (such as the illegal annexation of Crimea), leaders like Trump and Macron are considering inviting him back to the G7.

wiczipedia39 karma

That's all folks- thanks for a great discussion! I will check back over the next few days to see if there are any lingering questions, but I appreciate you taking the time to chat and invite you to follow me on Twitter and stay in touch.

For more info on me and my book: www.wiczipedia.com

PM_ME_YOUR_FARMS9 karma

I just read through this thread and you did a great job! Thanks so much!

If you have time at a later date, I have a question. I know some American progressives who think that reports of the Uyghur genocide are fabricated by Western propaganda and seem to be trusting Chinese reports that there's nothing suspicious going on and that prisoners are being treated well. Do you think the evidence in support of a Uyghur genocide is reliable, or should we be more cautious? Why have educated progressives who are otherwise intelligent and justice-oriented been so convinced by CCP propaganda (not just on this one issue – they seem to think any criticism of the CCP is racist and distrust all Western media on Chinese news)? What can we do about this?

wiczipedia11 karma

Oh wow, I'm sad to hear that. Thanks for this comment.

Yes, there is a genocide going on in China. You can perhaps send your acquaintances the videos of blindfolded Uyghurs being loaded onto trains and accounts of Uyghur women being forced into arranged marriages with Han Chinese men.

I am, in general, pretty dismayed by people who tend to whitewash the crimes of the CCP or the Soviet regime, as I more frequently run into. My grandfather and his family were deported by the Soviets during WWII and spent a few years in a labor camp; my great aunt is buried in an unmarked grave somewhere near the Arctic Circle, so it's really sad for me to read about this sort of trend. I'm not sure what to do about it besides hope that people read more history so they understand the long-term context for what they're discussing.

nwilz-11 karma

A half hour, and you answered less than 10 questions. Should AMAs this short be allowed?

wiczipedia14 karma

It was an hour (13:00-14:00) and I answered the questions I got in that time frame... But I'm still here answering them!

kingk01739 karma

What are your thoughts on the QAnon conspiracy theory and the possible ramifications it can have on our government, especially in November?

wiczipedia89 karma

Quite frankly, QAnon scares me. I am disturbed that we see some leaders supporting a sprawling conspiracy theory that is a threat to public safety.

oligarch-chic25 karma

What should the U.S. Congress do to reduce foreign and domestic disinformation on Facebook and other online platforms? What are the corresponding political and practical barriers which may prevent Congress from taking action?

wiczipedia78 karma

The first tenet of any counter-disinformation policy *needs* to be that disinformation is a threat to democracy, no matter whether it's foreign or domestic in its source. In the US right now, everyone agrees that foreign disinformation is bad, but some are a bit more reticent when it comes to domestic disinfo. This is a mistake! It creates far too many loopholes for bad actors to exploit, and indeed, we're seeing adversaries like Russia begin to launder their narratives through authentic local voices. So we need to recognize that first.

Then I'd like to see a lot more transparency- over algorithms, group and page ownership, microtargeting, and all advertising. People need to understand how and why information is making its way to them.

Finally, we need oversight- there needs to be a federal watchdog that is ensuring the platforms are adhering to the laws they are subject to, not impinging upon freedom of expression, and ensuring equal access and safety on their platforms.

What's the holdup? Well, right now there's an incentive to create online disinformation because we don't have any of the mechanisms I described above to keep it in check. Some political candidates have taken pledges not to engage in it, but they're now at a disadvantage, because their competitors have not. We need to level out that playing field with regulation. But less understandably, this issue has become politicized, even though it should absolutely be nonpartisan, so some politicians are afraid to speak up for democratic discourse, particularly relating to domestic disinformation. It's really unfortunate, and they're doing a disservice to their constituents. This is the main obstacle impeding progress on this issue in Washington.

crunkashell235 karma

It's also difficult to stop because the onus of truth lies on the attacked. Counter-messaging takes time to curate and release, which is often too late because the news cycle has already moved on and the disinformation has already been consumed by the user. A large part of countering disinformation is education; teaching people to look at things objectively and from trusted sources. The UK government even has a page on how to identify misleading info.

wiczipedia12 karma

Couldn't agree more!

winosthrowinfrisbees9 karma

I looked for the UK gov disinformation site and found the SHARE checklist for coronavirus.

https://sharechecklist.gov.uk/

Is that what you're on about or is there another one as well? I love that they're doing this.

crunkashell24 karma

Nope, that's the one. Should have included the link in my post.

wiczipedia10 karma

The UK gov also did a great campaign called "Don't Feed the Beast" which raised awareness about not sharing spurious info!

shihonageth-9 karma

It creates far too many loopholes for bad actors to exploit, and indeed, we're seeing adversaries like Russia begin to launder their narratives through authentic local voices.

This is the most 1984-ish sentence I've seen this month. What an amazing justification for censorship of free speech. "This person isn't expressing their thoughts - it's Russia expressing it through them! Down with Russia! And this person!"

wiczipedia7 karma

I think you're misreading my intent. I don't want censorship by government or social platforms. I talk a little bit more about the sort of thing I would like to see here https://twitter.com/ChathamHouse/status/1285963173716201472?s=20

schloooooo21 karma

In your opinion, what is the best way to explain to someone that the information they are sharing/relying on is untrue without making them feel defensive?

Additionally, what are some easy flags you could point out to someone to let them know in the future about the quality of their information?

TengoElGatoenMisPant21 karma

Hi Nina!

I see on your twitter that you're very critical of the president in terms of no administration ever doing less to deter Russia on this stuff. What would you say though given that the most audacious level of interference happened under Obama and after years of attempted detente with Putin?

Thanks!

wiczipedia41 karma

I don't let the Obama Administration off the hook either, and I particularly wish it had publicly attributed the 2016 interference when it became clear what was happening. Unfortunately, with the political environment as it was, it would have opened a whole other can of worms and accusations of tipping the scales in favor of Clinton. All that being said, I do think there is some good work happening within the USG on Russia and disinformation right now. It is just being almost entirely undercut by the President's friendly relationship with Putin.

I hope that in future administrations the US government is clear-eyed about the threat disinformation poses to democracy writ large, and informs American voters about the threats as they stand in closer-to-real-time.

rejuicekeve20 karma

How do you feel about posting an AMA about disinformation in one of the major disinformation and manipulation outlets?(reddit)

wiczipedia6 karma

Touche :)

I'm not someone who thinks we should boycott all the parts of the internet that have problems, and I do appreciate some of the actions Reddit has recently taken to curb the spread of disinformation on here. Also, I hope that perhaps folks will learn a few things, thus maybe neutralizing some of the more unfortunate content On Here.

garden_h0e14 karma

What challenges do you face in crafting policy recommendations on these issues as someone who has not worked directly in policy making or the US government? (Assuming this based on your bio, correct me if wrong.) Media literacy and disinformation are such cross cutting issues relating to education, tech innovation, foreign policy, cyber security, etc that it seems like a tall order to answer such a huge question in one book without that firsthand insight.

wiczipedia28 karma

I actually view this as an advantage- I'm not weighed down by the thinking of people who have worked only in a single sector. One of the biggest problems in this space is tech folks only seeing the problem from a platform angle, policymakers being burdened by process and securitizing the problem, academics not having practical experience with these themes "IRL." I try to bring a multidisciplinary approach -- informed by time spent in the field -- to bear. I spent a year in Ukraine within the Ukrainian Foreign Ministry as part of a Fulbright grant, and I've also worked in government-adjacent roles, including with the National Democratic Institute, so I'm familiar with how the sausage gets made.

wiczipedia15 karma

Regarding the book's remit, I let my characters do the talking! I was lucky enough to speak with the people who do this work on a daily basis- they drive the story, and I apply my lens to it.

KnightoftheNight69-3 karma

If you haven't worked for a tech company or as a policymaker, how do you speak to the bureaucratic constraints, resource considerations, and various other inputs that inform how actionable or pragmatic a policy prescription is?

At the end of the day, you're trying to get them to listen to you but how do you ensure they see you as credible if you only have a surface-level understanding of the dynamics that affect their corporate or government-level decision-making?

wiczipedia16 karma

I would suggest you read the book (or, alternatively, some of my other work: https://wiczipedia.com/portfolio/) and decide for yourself if I'm credible. The Congressional Committees before which I've testified and entities I've advised seem to think so.

Mr_Shad0w13 karma

Have any thoughts on the Cambridge Analytica / Facebook scandal?

Why do you think the general public was surprised / is in denial about how their data (and social media, generally) is being used to manipulate them?

Why do you think humans would rather "stay asleep" than stand up for themselves?

wiczipedia23 karma

The scandal is disturbing but not surprising, both because of how cavalier platforms are about our personal data, and the fact that most users don't know what they are trading away. I think people legitimately just did not know how their data was being used. Now, there seems to be some general awareness building in society in this regard, but I'd like to see the platforms building better UX to inform users of what exactly they're trading away for free access. (It shouldn't take 20 clicks to change your privacy settings!) And there's a governmental role here too- are platforms being careful stewards of our data? These scandals suggest that's not the case. What should the penalty be when there is a breach? All open questions.

In short, I think these are complicated issues that most people just don't have the time to get into, especially when, at their surface, social media and big tech make their lives easier and more fun.

Mr_Shad0w5 karma

Great answer, thanks. I first started thinking hard about this subject after reading Jaron Lanier's Who Owns the Future?, although I've always been anti-social media when I saw how many petty squabbles it fomented.

The fact that US states are occasionally passing "tough" privacy laws, only to see Big Tech companies like Google and Facebook ~~bribe~~ lobby Congress hard to pass weak, useless privacy laws which would override those at the state level, in full view of the public with virtually no pushback, is depressing.

wiczipedia7 karma

I agree. I often say that we're abdicating our role in crafting democratic, human rights based social media regulation for the entire world- but especially for US citizens. I'm hopeful that awareness is building to a high enough point where we'll pass some common sense regulation soon.

SlushoMix11 karma

What is your comment on politicians' lack of knowledge of online services, emergent technologies and digital privacy worldwide?

And how should everyday people combat those issues regarding laws that instate imperatives for creating backdoor access, and how should people oppose the expansion of privacy-invading Huawei tech in Europe? Of course, elections, but that isn't enough. Also, raising public awareness on issues unfortunately gives credibility to conspiracy theories as well.

It seems to me that politicians and representatives on a fundamental level don't understand anything tech related, e.g. how Facebook works, as seen during the hearing in the US. Yet, at the same time, they introduce bills that are detrimental to privacy. And we all know how the awareness of being watched constricts personal freedom and the consideration of choices.

If your expertise covers it, what are your predictions regarding democracy and privacy in the Balkans under the influence of Russia, and China as particular countries that are hybrid regimes are introducing extensive face recognition and tracking technology by Huawei?

What are your predictions for the US in case Trump gets reelected and how much could that erode democracy?

Edit: Could studying the methodology of Cambridge Analytica show that similar tactics could counter the effect of disinformation? Or should we employ a significantly different approach to fighting state propaganda, and how?

wiczipedia29 karma

Wow, lots of great questions here, thanks so much.

Politicians *definitely* need to read their brief on tech. I think there has been a sea change in how politicians on Capitol Hill approach social media since that fateful 2018 hearing I believe you're referencing (the infamous "Senator, we run ads" answer!). There's an effort to get more staffers with tech expertise in the room, but I also think we need a fundamental shift in our representation! It's not a coincidence that some of the freshmen in Congress are asking the most informed questions about social media and using it more effectively; they understand it in a way older elected officials don't.

How can normal people make their voices heard? You're right, voting is one way- but there's also a fairly robust mechanism for Americans to feed into the policy making process, either through civil society and advocacy groups, or by filing their own comments in notice and comment periods, or writing/phoning their representatives. The democratic process doesn't begin and end on election day!

The Balkans are a bit beyond my expertise, but I know that some great writers and reporters at the Organized Crime and Corruption Reporting Project (OCCRP) look into these issues.

kildoents11 karma

I'm doing an undergrad in cognitive psychology, but I would like to study disinformation. I'm from Canada. What path(s) could I take? How does one get involved in our own country's active defense against disinformation? also, how did you get academically interested in disinformation?

wiczipedia18 karma

My own path came from the foreign affairs/democracy support side of things and inevitably ended up looking at communications, which led to disinfo-related work. I think there's a lot of great psychological research going on in the disinfo sphere these days, so by the time you're doing graduate research I'm sure it will be blossoming! It's great that you know what your interests are so early on. As for getting involved in an active defense against disinformation, I always suggest that everyone be careful when sharing content from an unknown source online, practice "informational distancing" (https://www.newstatesman.com/science-tech/social-media/2020/04/why-we-need-informational-distancing-during-coronavirus-crisis), and do your due diligence in checking sources. Teach your friends and family how to do the same!

KnightoftheNight699 karma

It seems like disinformation inherently exploits domestic tensions within the US. How can anyone measure its effect when those divisions exist irrespective of any foreign influence?

Even if a foreign actor "amplifies" these divisions in terms of messaging, tweets, and posts, ultimately voters already felt that way and are politically inclined in certain directions and seek out information spaces that confirm their prior biases.

wiczipedia13 karma

That's the biiiiiig challenge of disinformation and what makes it so effective and difficult to combat. I explore this in an excerpt from my book, which you can read here: How an Anti-Trump Flash Mob Found Itself in the Middle of Russian Meddling

I go into this at length in the book, but to me this isn't about a direct or measurable effect on elections, it's about the integrity of the discourse. If you look, for example, at the DNC hack and leak in 2016- that changed the discourse around the campaigns, how they talked about themselves and each other, and how the media covered them. It changed what Americans were talking about. The IRA-generated posts in 2016 "were shared by users just under 31 million times, liked almost 39 million times, reacted to with emojis almost 5.4 million times, and ... generat[ed] almost 3.5 million comments.” The discourse changed. Same with the flash mob example in the link above.

I don't believe that we should stand for bad actors inauthentically manipulating the discourse in this way- instead we should be equipping people with the tools, skills, and transparency measures they need to understand why information has made its way to them.

garden_h0e7 karma

This is a bit confusing. Do you consider number of shares/likes/reactions/comments as a unit of measurement here? It seems that way based on you making a causative link between the propagation of IRA material and the change in "discourse." I feel like at a certain point you have to make a call about what exactly it is you're analyzing and how you intend to evaluate its impact. That's sort of why I asked the question earlier about defining information - without that clear definition I feel like you fall into the pit of tackling the kitchen sink of "information operations" in a broad way without clearly addressing the causes and solutions to each unique issue.

wiczipedia10 karma

It's a bit difficult to do in a rapid-fire AMA! This is why I wrote a book on the issue. I hope you'll take a gander at it.

MisterSchnitzel-1 karma

"The IRA generated posts....."

You don't know the discourse changed. You're just assuming that people who engaged those posts had an altered perception rather than it played into their preconceived notions of the Dems

wiczipedia5 karma

First, this has nothing to do with Democrat vs. Republican- this was all across the political spectrum, on both sides of the aisle. Second, we do know that in some instances not only did people's perception change, their behavior changed- the link above lays out how Russia turned out protestors to IRL flash mobs, for instance. Third, with hack and leak operations in particular, that information would not have been present had Russia and the IRA not put it there. All this is evidence of the effect on the discourse.

ifsavage7 karma

How fucked are we?

wiczipedia12 karma

There's a reason my book is called How to Lose the Information War! But I hope we can turn this situation around with more engagement and awareness, and learning from other nations that have been there before us.

Anthadvl7 karma

Seriously, Nina, how do I get my parents to stop believing every conspiracy theory that comes on their feed?

wiczipedia5 karma

I wish I had a silver bullet for you! I think there are some good strategies in this article https://www.washingtonpost.com/technology/2020/06/05/stop-spreading-misinformation/

I also linked a few other resources above. But we need to engender an understanding that just like Nigerian princes and social security scams, we shouldn't believe everything on our news feeds.

MisterSchnitzel7 karma

Nina,

The one thing I struggle with when you and others call for this to be a non-partisan issue is the near constant criticism of Trump for not responding more to Russia while you all simultaneously give a free pass to China for their gross manipulation of COVID information.

How can you claim to be non-partisan when not also highlighting how CNN and other mainstream outlets for example published talking points from the South China Morning Post? Why don't you and the experts give equal attention to China and the administration's stance on them?

Good luck, MisterSchnitzel

wiczipedia36 karma

I don't give a free pass to China at all, but I am a Russia expert. I can speak about China in broad terms, but it's Russia that is my area of expertise, so that's what I focus on. There are plenty of colleagues of mine - Rui Zhong at the Wilson Center, Laura Rosenberger at the Alliance for Securing Democracy - who have got the China beat covered.

Regarding the administration's stance, I believe we should treat all foreign interference equally. So while the Trump administration has begun to call out China, I would like to see the same sort of full-throated criticism of the Russian Federation, coupled with a counterdisinformation policy that recognizes the foreign and domestic threat it poses to our democratic discourse.

Thanks for the question!

glendarey5 karma

Hi Nina!

Saw and appreciated your zoom presentation with the Wilson Center.

What do you suggest for internal, domestic disinformation? It seems that shutting down conspiracy theories and other disinformation tactics edges on trampling first amendment rights in the US and civil discourse elsewhere, yet simultaneously troubles those two fundamentals for democracy?

Thanks

wiczipedia5 karma

Thanks for tuning into that discussion! (For those who want to watch: https://www.wilsoncenter.org/event/how-lose-information-war-russia-fake-news-and-future-conflict)

In terms of battling domestic disinfo, I'm in favor of more transparency and more context. We should have a better idea of how information is reaching us and why. Platforms should add friction to their environments to discourage sharing of harmful information. And they should add context to posts that are misleading- and increasingly they are doing so. (Both Twitter and Facebook have done this in recent weeks to posts from the President.) I don't want platforms or governments to trample First Amendment rights, but I think equipping users with better info and better tools can mitigate the rampant spread of online disinformation.

garden_h0e5 karma

Another brief question: how do you define “information” and/or “disinformation” in your book? These terms are used so broadly now that they feel almost meaningless. It would be great to know how you’ve tackled putting specific parameters around them.

wiczipedia16 karma

I'm going to plop a bunch of text from the book's prologue below!

"The West’s response was also delayed by a lack of common definition of the problem. Buzz words like “propaganda,” “information war,” “hybrid warfare,” “active measures,” “influence operations,” “disinformation,” “misinformation,” and “fake news” are used interchangeably across policy spheres and the media, with little regard to what precisely is being discussed or what problem needs solving. But we need to clearly define and categorize these phenomena if we are to successfully understand and counter them. Here’s how I look at this confusing landscape.

All of the tactics Russia employs to angle for international notoriety can be categorized as “influence operations.” To exert its influence over foreign governments and their populations, Russia might undertake old-fashioned spying and military operations, but the case studies in this book will focus on the overt, civilian-sphere influence operations. Sometimes these actions fall neatly into the category of disinformation—“when false information is knowingly shared to cause harm”—or malinformation—“when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere.” These include the now-infamous Russian ads purchased by the St. Petersburg “troll farm” in the 2016 US election, which pushed misleading and inflammatory narratives in order to widen polarization between Americans and increase dismay and distrust between citizens, the media, and government. The ads—and the even more successful organic content on the originating pages—attempted to widen divisions in every corner of the political universe. They argued for Texas secession, spread anti-immigrant vitriol, pitted Black Lives Matter and Blue Lives Matter activists against one another, and even distributed “buff Bernie Sanders” coloring books. They were “fake” not because their content was falsified—although they included plenty of false or misleading information—but because they misrepresented their provenance. The posts’ authors weren’t activists at American grassroots political organizations; they were Russian operatives in St. Petersburg who had carefully groomed their online personae for years."

It goes on- but you get the idea! A great resource for these definitions, and one I use myself, is First Draft News' glossary of terms.

jasonite5 karma

What is the single most important thing I should know in an election year?

wiczipedia11 karma

It's going to take much longer to get a result on election night than we're used to- we need to be patient and only trust reputable sources of info that night (state and local election commissions)- not politicians, pundits, etc.

silveredblue5 karma

Hi Nina. I’m a content manager/data analytics professional and really interested in getting into fighting disinformation long term. What would you suggest are ways I can help now, and ways I can help long term?

misskaminsk2 karma

Jumping in to say, same (as a researcher who does mixed method, ethnographic, micronarrative type stuff)! How can we find ways to plug in and help out?

wiczipedia9 karma

Hi folks, thanks for writing! In my view the most important things you can do are:

- patiently engage with friends and family who might be spreading misinfo unwittingly
- familiarize yourself with how to report disinfo or inauthentic behavior you see on each platform you use, and actually take the time to do it! Of course the platforms have issues, but until they improve, this is how we help the AI learn.

Longer term, there is so much that citizen activists can do in this area. Josh Russell is an Indiana dad who fights trolls and bots from his basement: https://twitter.com/josh_emerson

Learning basic open source investigative techniques can help you identify the bad stuff and malicious patterns online. Bellingcat and First Draft both offer good courses in this vein!

CHUBBL3S5 karma

Hi Nina,

It seems right now, in America at least (but surely in other parts of the world) 9 of every 10 citizens have leapt off the deep end. A precious few studious, critical-thinker types are content to say "I know that I know nothing", lament the lack of quality information, and leave it at that. But the grand majority have been yanked from all control and stability, and turn to whatever explanations they can find, no matter how unsubstantiated or ludicrous. They're starved of information, being force-fed this crap as a last resort.

My question is, 1) how do we help our loved ones be content with not knowing? How to help them weather hard times without the need to buy into bullshit? and 2) how does even the most mentally well-equipped person cope with their beloved community transforming into a vitriolic nightmare practically overnight? It's exhausting!

wiczipedia8 karma

Hi Chubbles, thanks for writing. I answered your first question in other parts of the thread, so will refer you there. But on your second question: don't lose heart- it can't happen overnight. We're talking about unlearning years, sometimes decades of unfortunate habits here, particularly with older generations, who are used to having informational gatekeepers make sense of the world. Keep at it with patience.

There is a point, though, where you have to protect your own mental health and disengage. This is why I give people who reply to me (usually on Twitter) a round or two of engagement before I suss out whether they're acting in bad faith; if they're just in it to score points, I bow out (as in other places in this thread ;)) Your personal threshold, or what's worth it for your community, might be different.

inside_out_man4 karma

I heard you on The Russia Guy, well done. Economists talk about the importance of emotions or perceptions like optimism and confidence. Politicians too- Obama's "hope." Is the overall goal overwhelming cynicism? I saw you articulate their ends specifically in terms of disengagement, an inward turn, and making Russia great again. It seems like, conversely, Putin is desperate to create optimism at home. Perhaps it's a reductionist view to focus on emotion, but it is also an intimate, accessible angle. Thanks for your work.

What are your thoughts/concerns on the Twitter hack?

wiczipedia2 karma

This is a great question! Disinformation absolutely runs on emotion and that's something we miss when we securitize the discussion (and this happens far too often). It's something we need to think about when we're considering how to respond. The most successful media literacy programs, for instance, help people recognize when they're being emotionally manipulated!

On the Twitter hack, I think it's extraordinarily scary that this happened and shows a real vulnerability for big events/moments of crisis. I'm not the first one to say this, but had this hack happened on election night or had the hacker(s) had a more serious motive, it would have been a huge problem. I discussed the hack on PBS Newshour last week: https://www.pbs.org/newshour/show/what-high-profile-hacking-attacks-say-about-cybersecurity

Sockemboffer4 karma

Any chance you’ll get this published in audiobook form?

wiczipedia5 karma

I really hope so! It's up to my publisher, but I'd love that!

yepitsalli4 karma

What's the best thing an everyday person can do to avoid misinformation?

wiczipedia8 karma

Think before you share and if you feel yourself getting emotional, ask yourself why (and definitely wait till you calm down to share).

smurfpiss4 karma

If you had infinite time and access to all media and social media, how would you quantify/track disinformation?

Memes spreading across communities, factual accuracy, outright lies or distortions of truths?

wiczipedia9 karma

This is a really hard question! Clicks and engagement are important, but I'd like to see how disinformation travels- where it begins, how it makes its way around the web, how it changes and morphs and gets amplified. This would allow us to track and debunk the origins of some of the Internet's nastiest rumors. Some really brilliant network analysts already do this sort of work, but it is hampered by the fact that some platforms restrict access to their data, if it's available at all.
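
To make the "how it travels" idea concrete, here is a minimal sketch of the kind of network analysis described above. This is my illustration, not the author's method or tooling: it assumes you already have share data as (earlier account, later account, url) edges, which is exactly the kind of data platforms rarely expose, and the account names are hypothetical.

```python
# Sketch: model shares of a single rumor URL as a directed graph and look for
# likely origin points and their downstream reach. Requires the networkx library.
import networkx as nx

# Hypothetical share edges: (earlier sharer, later sharer, url)
shares = [
    ("origin_blog", "account_a", "http://example.com/rumor"),
    ("account_a", "account_b", "http://example.com/rumor"),
    ("account_a", "account_c", "http://example.com/rumor"),
    ("account_c", "account_d", "http://example.com/rumor"),
]

g = nx.DiGraph()
for src, dst, url in shares:
    g.add_edge(src, dst, url=url)

# Accounts with no incoming share edge are candidate origins of the rumor.
origins = [n for n in g.nodes if g.in_degree(n) == 0]
for origin in origins:
    reach = nx.descendants(g, origin)  # every account downstream of the origin
    print(f"{origin} reached {len(reach)} downstream accounts: {sorted(reach)}")
```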

myearhurtsallthetime4 karma

Are we headed for the dark ages?

wiczipedia14 karma

I hope not :(((( I do sincerely believe we can turn this around if we start making generational investments in building people's ability to navigate this fast moving and confusing informational environment.

RickWino3 karma

Are there any resources you would recommend for an 8th grade government teacher? Disinformation is such a complicated, but important subject.

wiczipedia7 karma

My AP Gov teacher was so important to me- you're in the best spot to really have an impact on your students' information consumption habits! I'm so glad you commented.

Mike Caulfield does some really great work on information literacy: https://hapgood.us/ He wrote Web Literacy for Student Fact Checkers which is made for you and all your colleagues! https://webliteracy.pressbooks.com/

I also really respect the Learn to Discern program that IREX runs: https://www.irex.org/project/learn-discern-l2d-media-literacy-training

I hope these are helpful!

wiczipedia6 karma

Also, this is geared at high schoolers but might be helpful! https://ctrl-f.ca/home/

Ethan3 karma

Hi, not sure if it's too late, but: what do you think of the various proposals about how to change social media in order to combat disinformation... for example, requiring strict ID authentication so that one's online self is tightly linked to one's offline self?

wiczipedia6 karma

Thanks for this question! Let me address the specific question about ID verification- coming from my experience working with activists in closed / authoritarian countries, I am not in favor of this. The platforms sometimes work with these governments' requests which can land people in jail (see this piece from a few years ago: https://www.washingtonpost.com/news/democracy-post/wp/2018/04/13/why-dictators-love-facebook/)

I'm also just not sure having "real people" behind accounts will stop the spread of disinformation- this is technically Facebook's policy and disinfo and abuse are still rampant there! Some of my other ideas about social media regulation can be found below:

https://medium.com/@nina.jankowicz/social-media-self-regulation-has-failed-heres-what-congress-can-do-about-it-5b38b6bf9840

https://www.washingtonpost.com/news/democracy-post/wp/2018/11/15/its-time-to-start-regulating-facebook/

LetTheRecordShow1233 karma

Are you optimistic about the chances of democracies managing these problems? If so, why? I really do think modern information technologies pose a massive challenge to democratic societies, a potentially existential challenge.

wiczipedia3 karma

I'm still optimistic or I wouldn't be able to get out of bed in the morning! I think there are some examples of democracies reckoning with this issue- Estonia, Sweden, Finland come to mind- and they all address the fissures bad actors exploit and consider the human element of the problem. It can't happen overnight but with investment and persistence I think we can change direction.

MBR19903 karma

Hi Nina,

I'm late to this, but I hope you may find my comment later.

I'm currently in an MA program at Emerson where I'm studying political communication. I'm interested in pursuing a career similar to yours - am I on the right track? There's a propaganda and persuasion class that they offer, which I plan to take.

Do you have a recommendation or suggestion on how I can continue pursuing this after grad school?

Thanks for the informative AMA!

wiczipedia3 karma

Hey, thanks for posting. My own path was weird and serendipitous and came about thanks to my interest in Russia and the former communist space, but I think that sounds like a great MA program! You could look at getting an internship with one of the civil society/research organizations working on this (something like First Draft News) to build connections and experience. Like I said to a few posters above, I'd also recommend teaching yourself some OSINT techniques- there are a few courses online that might be helpful (or perhaps Emerson offers something similar, too). Good luck, and feel free to be in touch via email if you have further questions :)

KnightoftheNight692 karma

Could you explain a little more what you mean by the "front lines" of the information war? Were you conducting influence operations first-hand? Were you embedded in a government-run information warfare unit?

wiczipedia11 karma

I mean geographic front lines! :) The book looks at how these tactics were practiced and perfected in Russia's geographic neighborhood (Estonia, Georgia, Ukraine, Czech Republic, Poland), before they came here. This is a huge gap in Western understanding of the entire subject of Russian information warfare.

I wasn't conducting any influence operations myself! Some of the book draws on my Fulbright Public Policy Fellowship in Ukraine, where I worked as an adviser to the Ministry of Foreign Affairs. But the rest of the book is about officials, journalists, and activists working to counter Russian ops in their countries.

Semen-Demon__2 karma

What’s your opinion on Facebook?

wiczipedia8 karma

Here's one recent publication that will give you an idea! https://www.wired.com/story/facebook-groups-are-destroying-america/

wiktorpolak2 karma

In your own view.. Is Russia as a state being framed for certain actions, or are they mostly guilty of the things attributed to them? (Also, if you could, do a % split of how much disinformation comes from Russia and how much from China.)

wiczipedia5 karma

I think there is a certain degree of Russophobia and a tendency to blame every bad thing that happens in the US on Russia. That being said, Russian information operations are still a very real threat that deserve our attention and vigilance.

Impossible to know % of disinformation without backend access to platforms and massive studies of all of the content on the internet. Also, don't forget domestic disinformers- there are plenty of those too!

ssyllogistic1 karma

Hi nina!

Is there any way to WIN an information war? Seems like with everyone getting their own news from their own bubbles and endless sources, we will continue to be in an endless age of disinformation.

wiczipedia1 karma

Thanks for this comment, sorry it's taken me so long to get to! Yes, I think that with a combination of more traditional strategies (name and shame, imposing costs, shoring up cyber defenses to protect against hack and leaks) and with what I call "citizens-based solutions" we can stop disinformation's spread. A lot of those ideas are laid out in my congressional testimony here: https://docs.house.gov/meetings/AP/AP04/20190710/109748/HHRG-116-AP04-Wstate-JankowiczN-20190710.pdf

Aedengeo1 karma

How should companies sell ads without the invasion of privacy? Is it ever possible?

wiczipedia2 karma

Well, if the companies have good privacy practices and users are better informed, there is a question about whether microtargeted ads would be an invasion of privacy. What I think should happen is that platforms should microtarget less and make opting out of targeted advertising and cross-platform data sharing easier.

curiousjosh1 karma

Hi Nina! I'm starting to see disinformation spread like wildfire through the west coast festival scene, which I've both documented and helped organize for years.

Are there any things you would recommend we can do locally in our own spheres of influence or resources on tactics to take?

wiczipedia2 karma

Oh wow, interesting and scary! I would recommend learning about platforms' terms of service and about all the on-platform reporting features so you and your community members can report the information as it's spreading. If you find action isn't being taken, contact a reporter (Ben Collins and Brandy Zadrozny are two great ones on this beat) or researcher- unfortunately these days a lot of what the platforms take down is stuff that folks like me and my colleagues flag.

unfather1 karma

At first a joke but now a very serious question:

How can one trust a disinformation actor with a righteous agenda? How does one identify or differentiate?

wiczipedia2 karma

This is a question Evelyn Douek has addressed in her work, most recently vis a vis the Tik Tok Teens who upset Trump's Tulsa rally: https://slate.com/technology/2020/07/coordinated-inauthentic-behavior-facebook-twitter.html She raises so many good questions in this piece- I wholeheartedly endorse!

TheOtherQue1 karma

Hi Nina,

This is a fascinating area, thank you for posting.

You mention tech interacting with democracy in unfavourable ways.

Are there any positive interactions possible between tech and democracy?

Thank you!

wiczipedia3 karma

Thanks for posting! I do believe in the value of technology and social media to connect people with their elected officials and allow them to feed back on the issues that matter to them. In short, I think platforms can create more responsive policy if they keep privacy and human rights principles at their core.

Vrael_Valorum1 karma

How concerned are you with the the PR industry perpetuating misinformation through misleading advertising, astroturfing, and industry funded science? Junk food companies can market sugary cereal to children by claiming to be a "balanced part of a complete breakfast". How do we deal with that type of destructive misinformation?

wiczipedia2 karma

This is a huge problem! In general there is an entire cottage industry of for-profit disinformation and beyond that, a junk science/for-profit industry. This is why we need equitable, across the board enforcement of platforms' terms. Whether the vector is foreign or domestic, PR or individual, politician or ordinary person, the policies need to be applied consistently. Right now they're not, and bad actors exploit those loopholes. On health misinfo, I would recommend the work of Renee DiResta, who has been on this beat for years. https://twitter.com/noUpside

dot-pixis1 karma

What do you think about the concept of using consumer tech for voting (voting apps, etc)? Do you think the potential cost could outweigh the potential benefit of increased voter turnout/accessibility?

wiczipedia5 karma

While I love the Estonian way of doing e-governance including online voting, we are so far from it working in the US. I lay out some of the reasons why we are behind in e-governance here: https://www.theatlantic.com/international/archive/2020/05/estonia-america-congress-online-pandemic/612034/

I think there are a lot of other ways tech can help democracy -- connecting people with their representatives and constituent services for example -- but I worry about online voting. What I'd much rather see is Election Day become a Federal Holiday / federally mandated PTO to vote, as well as automatic voter registration.

Diovobirius1 karma

So, I'm a student in Urban Planning and interested in using the opportunites of internet to strengthen local bonds, democracy and citizen/municipality-interactions. For my Master's Thesis I'm thinking of looking at an app doing, or trying to do, things in this direction (e.g. 'Nextdoor').

This is probably a few population levels below your expertise and in general with quite a different focus, but do you have a suggestion for a paper or something to look at concerning democracy and/or disinformation issues that would apply for tech focused on neighbourhood organisation and socializing?

wiczipedia7 karma

That sounds so interesting! It's actually something I used to work on when I did democracy support work. In closed societies, I always thought an app like this would help activists reach their constituents and neighbors. I still believe in the good that tech might be able to bring about for democracy, if it keeps human rights and privacy at its core. I'm not sure about a paper off the top of my head, but my old organization did a lot of interesting stuff in this area that is worth checking out! https://www.ndi.org/what-we-do/democracy-and-technology

concerned_citzn1 karma

Are you familiar with the 2005 film “Earthlings,” and if so do you recommend it?

wiczipedia19 karma

HA! For those who aren't in on this joke, yesterday there was a scary hostage situation in Ukraine. The hostage taker demanded that President Zelenskyy recommend Earthlings... and, well, he did. Once the hostages were released, he deleted it, but my friend and colleague Chris Miller's thread has the video (and more on the hostage situation) for posterity: https://twitter.com/wiczipedia/status/1285676610755145728?s=20

wiczipedia12 karma

(But no, I haven't seen the film ;))

ElonMusksMusk22-1 karma

You seem to frequently say that the administration isn't doing enough to combat the Russians, but how do you know what they're doing or not doing?

it's not like that stuff is public information....would sort of defeat the point. right?

wiczipedia11 karma

There are, of course, some responses that would be covert. But all the things I suggest (naming and shaming, imposing costs, and most critically, investing in education and repairing the fissures in our society that leave us so vulnerable in the first place) are in the public domain.