Guest: Gavin Wilde | Hosts: Misha Simanovskyy & Taylor Helmcamp
An interview with Gavin Wilde, senior fellow in the Technology and International Affairs program at the Carnegie Endowment for International Peace. SlavX hosts Misha and Taylor spoke with Gavin about the differences between information and cyber warfare, how Russia structures its information warfare activities and institutions, and whether Russia is really as good at information warfare as portrayed. Listen to the full episode below.
TAYLOR: Thank you so much for being on the podcast today. It's really an honor to have you. I just wanted to start out by asking if you can give an overview of what cyber warfare is. Listeners who have heard of cyber warfare may be thinking of hacking, or these really grandiose operations in dark rooms with sunglasses on, banging away at the computer. So can you give us kind of the landscape of what cyber warfare is?
GAVIN: At least from a US conception, let's say, I imagine if you asked fifteen different US scholars or practitioners, you'd get fifteen different answers. But the way I would explain it is that cyber warfare is the idea that you can use cyber means to achieve some kind of effect that might rival or enable a kinetic effect, by disrupting or manipulating or denying an adversary access to data or the ability to communicate, using some kind of information or communication technology to deny the adversary the same. And I would say often it's difficult for folks to distinguish between cyber activities that are designed to perform intelligence or reconnaissance or surveillance, which is essentially look but don't touch, and the more offensive cyber operations, which I would put under cyber warfare, which are designed to have some kind of effect, to achieve some kind of objective above and beyond merely having a better sense of what your adversary is doing, saying, or seeing.
TAYLOR: So does cyber warfare include information warfare or are these two separate concepts that kind of have an overlap?
GAVIN: So that's an interesting academic question. I think there's probably some dispute on the spectrum. I think that's kind of the genesis of where I started to get really interested in it from a Russian perspective, because in Russian strategic culture they're kind of part and parcel under one umbrella of information confrontation, or information struggle, if you will, where there's the psychological aspect and the technical aspect, and they are kind of symbiotic. I would say that in at least the US or Western conception, they've been somewhat different: different disciplines, certainly different kinds of training. But I would say over the last five or six years, certainly since 2016, the US and NATO and a lot of Western militaries have started to put them under the same umbrella as well, in terms of how they think about information warfare, quote unquote.
TAYLOR: How did you end up focusing on cyber? You have a lot of experience in the private sector, and also in the public sector, dealing with the region and with this field. So what brought you to this field, and why did you choose it?
GAVIN: So I started out as a Russian linguist working for the government. When I started out, I did a short stint at the FBI and then started at the National Security Agency, and I was originally really reticent, and I would say probably resentful, of the idea that I had to learn so much tech and so much about cyber, because that's just an innate part of signals intelligence. It's just a very technical field.
TAYLOR: You're involved in SIGINT.
GAVIN: Correct. And so understanding how SIGINT works and understanding how cyber works was a road I kind of begrudgingly went down. Then 2016 happened, and I was fortunate enough to be able to work on assessing what Russia had just done in the 2016 election. I had to learn that lingo and collaborate with colleagues who were in those very technical fields. They were translating, you know, very technical stuff to me, and I was in turn translating a lot of Russian strategic culture. I found it was advantageous for me to be able to understand the lingo and the terminology and the tradecraft behind so much of the technical work that was going on, and in turn they found it advantageous for me and some of my Russianist colleagues to be able to translate a lot of the geopolitical backdrop and the Russian strategic cultural context behind so much of this behavior. In that fusion of disciplines, there was so much power to be able to explain not only what had just happened, but to try and maybe inform the decision making going forward on how to deal with it. That was kind of the entry point for me into the really opaque and thorny area of cyber conflict, and that's kind of where I've parked ever since.
MISHA: How does Russia define its cyber warfare? And does it include information warfare, and what kind of entities guide Russia's policy in that realm?
GAVIN: So under the Russian rubric of information confrontation, you have what are called information-technical effects, and that's where I think they would probably bin cyber warfare as such, although they tend not to use that term much in their doctrine or their policy papers. So they think of information far more holistically than I think a lot of Western security thinkers do. And then you'll have to remind me of the second question.
MISHA: Well, the United States has, like, Cyber Command, and everything is pretty hierarchical, versus in Russia many things are informal, or things we don't know about. But you're the expert. So how does Russia structure its kind of institutional hierarchy in that realm?
GAVIN: Right. So I think the bulk of the cyber know-how has historically rested within the GRU and the FSB, the latter with a far more domestic focus, although they do perform some expeditionary kinds of operations abroad. But I think the bulk of the technical prowess and sophistication on offensive cyber operations has rested within the GRU. I think the misconception that a lot of us on the Western side sometimes make is to assume that, yes, the GRU falls under the MOD, the military construct. But in the Russian system, the GRU has an awful lot of autonomy. They have kind of jealously guarded that cyber know-how, and that talent and sophistication has by and large been wielded the way we would typically think of an intelligence agency wielding it, kind of for subversion and disruption. It's only relatively recently, within the past few years, that there's been some thinking in Moscow about how to harness that sophistication and that know-how in service of a military set of objectives.
But historically that's created some parochialism and some turf warring between the services and the agencies, because for a long time the FSB, the MVD, the GRU kind of jealously guarded their capabilities and didn't want the military as such, the conventional military arms, to play any real role in that. So I think we're still seeing, and have seen in the context of the Russian war on Ukraine, the inefficacy that that kind of turf warring has now brought about, because it's hard to tell where the GRU ends and the conventional military command begins. And even within the hierarchy there you have, according to the Estonians, a directorate within the General Staff that's in charge of, kind of, signals intelligence or cyber-enabled intelligence, surveillance, and reconnaissance for military objectives, but then you also have what looks like another directorate that's kind of designed for information warfare. And we're seeing maybe some of the shortfalls of that organizational construct come to the fore in Ukraine.
MISHA: And when you say information warfare, do you mean disinformation?
GAVIN: Yeah. So information warfare more broadly, from a Russian perspective, I would say is that fusion of the technical side and the cognitive, psychological, disinformation side of the house as well.
TAYLOR: So in any type of warfare, especially conventional warfare, you typically will have a battle, and there's a success if you win or a loss if you lose. Even in guerrilla warfare, you typically know if the conventional army has won or lost. In calling cyber conflicts and whatnot cyber warfare, is there a consensus on how to measure successes, or wins and losses, in the battles in cyberspace?
GAVIN: Far from a consensus, I would say there's probably a deep and abiding debate, ongoing to this day, at least among a lot of academics and some practitioners, about that very issue of how you measure success. Is it what I would maybe call the John Stockton theory of cyber warfare: what did it enable, what did it assist, was it a contributing, if not decisive, aspect of a battlefield gain or a geopolitical gain? On the other side, I think there are folks who would measure it from a more austere perspective: did it have the intended effect? Did it have any effect at all that was noticeable to the adversary, noticeable to the outside? I think a lot depends on how you measure it, but I think that's probably one of the drawbacks of thinking of the information domain or cyber domain as a warfighting domain, because in every other domain you have the laws of physics that prevail and you have a lot of observability.
So you have the ability to do battle damage assessment, the ability to kind of say what happened. In fact, I was in Tallinn just earlier this year, and it's interesting when you talk to a lot of folks in military cyber commands. They kind of pointed out that if I received the order to fire for a cyber operation from my commanding officer, and I didn't, but I just said that I did, the commanding officer would have very little way of knowing whether or not I had. You know, that's different from a cannon, that's different from a gun.
TAYLOR: Right, or even taking stock of the munitions that you've expended.
GAVIN: Right.
TAYLOR: Or just the raw resources.
GAVIN: Exactly. And then we talked about that a lot too, of, you know, off-the-shelf capabilities or capabilities held in reserve. One of the drawbacks of cyber warfare, insofar as it exists as a concept, is that it's very difficult to keep stuff on the shelf for a rainy day, because you have to keep sustained access, and there's no guarantee of efficacy or durability over time, and there's also no guarantee that you can mobilize it quickly and reliably according to battlefield demands. And so that's where it gets kind of tricky for militaries in particular.
MISHA: And how do successes in cyber and informational warfare translate into the real world? So, Russia attacked Ukraine on February 24th, and there are reports that before that they launched a massive cyber attack on all of Ukraine's infrastructure. Did that cyber attack, for instance, contribute to Russian success? Would the Russian invasion have been even less successful if that cyber attack had not occurred?
GAVIN: I think there's probably even a third option, which I would argue for: was it actually counterproductive? Most of the cyber attacks that Russia has lobbed against Ukraine since the outbreak of the war, with a few major exceptions, have been aimed at civil infrastructure with the intended effect of grinding down public morale and leadership morale, creating disorientation, and all those things. And in fact, I would argue that those attacks have backfired, in that there's been such a digital rally-around-the-flag effect, both within Ukraine and among Ukraine's backers from a cyber perspective, that Ukraine now arguably has among the most resilient infrastructures on Earth from a cyber perspective, because of this expectation that Russia was going to undertake all of these operations, and it has.
It's not dissimilar, I would say, from a lot of scholarship about air bombardments in previous wars, where it's very easy and simple for military leaders and national security leaders to assume that we can bombard the civilian population into submission, and in fact that tends to have the reverse effect. They're less demoralized than they are angry, and rather than losing trust in their political leaders, they rally around them, whether that's the Brits in World War II, whether that's the North Vietnamese, et cetera. Most of the studies show that that kind of morale bombing is less effective than a lot of folks would hope. But leaders turn to it because it's relatively easy.
TAYLOR: So it seems like it would be simpler to launch cyber attacks from a centralized state. If you're authoritarian, you have full control over a single narrative in your nation that you are trying to push out, and you can orchestrate a cyber attack following that single narrative. What are the advantages of democratic societies in cyber warfare, with that freedom to have different ideas and almost organic truth-telling and organic checking of narratives from below?
GAVIN: So I think there's a degree of complexity and federation that comes not only from a technical perspective, but from a cognitive one. Comparing the kind of RuNet approach to a more open architecture means that there's a whole lot more complexity and a whole lot fewer, kind of, “can't fail” vulnerabilities across a given system. I think Martin Libicki from National Defense University wrote that a cyber weapon can only be as effective as the simplicity of its target. And whether that's a society or whether that's a network, I think autocracies just have a lot more of that simplicity.
TAYLOR: But also from a propagandizing point of view, perhaps this isn't even cyber warfare, this is just cyber propaganda and disinformation. And as you mentioned before, the lines are blurry, and even if there's a line drawn, there's no consensus on the line or whether there should even be a line. But in the propaganda space, you know, democracies innately do have some sense of internal debate that is allowed: freedom of expression, freedom of ideas, freedom to disagree and to hash out those ideas. And some actors have characterized that as a weakness, because different narratives that may not be true become the main narrative. For example, WMDs in Iraq, and Obama as a senator was the only senator who said there's no proof, and then he was written off as an Iraq sympathizer. And lo and behold, the whole history comes out, and he was the only one that was ultimately correct. So in democracies, we do have competing narratives. But are there any strengths to that in fighting against disinformation campaigns?
GAVIN: I think this is probably why I'm a bit of a skeptic of the idea of thinking about information warfare from a narrative perspective as a coherent strategy, or necessarily one that we ought to either rely on or be extremely fearful of. Because of course in autocracies there's a relative ease of setting the framing and setting the terms of the discussion, and of cementing the Overton window, if you will, around a certain set of issues. In democracies, even governments don't have that. Even influential figures may not necessarily be able to capture and hold attention or maintain a narrative for long, and so there's a whole lot of kind of organic, zeitgeist-y, indefinable, and emergent phenomena that are, I think, inherent to open democratic systems where free expression can reign. It's very hard to assume that influence operations or information operations can be conducted, A, with any skill, and B, with any reasonable expectation of some kind of impact. That's obviously going to be a lot easier within an autocracy, but I would argue even there, there's a lot of limitation. I think in Russia's case it stems very deeply, all the way back to kind of Bolshevism and this idea that the new Soviet person or the new Soviet man can be molded primarily out of their exposure to some kind of media. And, you know, Stalin even bought into this idea that, that…
TAYLOR: That Pavlovian response.
GAVIN: Yeah, that there's a behaviorist kind of input in and output out kind of relationship between humans and media that would be very, we could…
TAYLOR: Like a straight cause and effect line.
GAVIN: Exactly. And I think that, far from that, the advent of a lot of advanced technology and computation, and a lot of the discussions around cybernetics and that kind of thing in the 50s and 60s and 70s, led a lot of thinkers, Kolmogorov and Lefebvre and a lot of others in that era, to kind of say, oh, this is the pathway into perfecting this idea, into perfecting this tradecraft of molding societies and identifying the laws of human nature that they believed existed. And so my concern right now, insofar as there's a lot of focus on Russian information warfare, is that we acknowledge that there's this kind of tautological, self-reinforcing way of thinking about human perception that's long been present in Russian strategic culture, recognizing it without necessarily adopting it ourselves, because there's a certain degree of scientism behind it that drifts ever more closely to this magic mind-control conspiratorialism, with a lot of what I would call unfalsifiable conjectures that border on magic at the end of the day. And so insofar as we lump all of that into our broader conception of, you know, information warfare, I think there's some danger of both overestimating our enemies and adopting some of that hubris ourselves.
TAYLOR: Kind of a follow-up to that: is there a danger in characterizing the field as, you know, cyber warfare, or in propaganda treating people's minds as subjects? Because it seems like, inadvertently, at least in popular media, especially with the rise of QAnon in the United States, for example, around Thanksgiving there are always funny articles about what to do with your crazy uncle, things like that. That's a silly example, but truly there's kind of this distilling or diluting of a very complex cognitive process, how somebody comes to their worldview, into “that person watches this one media stream or they subscribe to this one channel on Reddit, therefore they are this backwards crazy person.” And are we in danger, as Western society and America more specifically, of kind of inadvertently adopting that kind of Pavlov-like analysis just in how we're thinking about these complex issues?
GAVIN: I think that is absolutely spot on, and I think that tendency has always been there, particularly from the point of view of governments and elites that kind of view their polities as an object to be protected. There's this idea of ontological security, that the society's broader sense of itself, its sense of routine, its sense of predictability of daily life, is a thing that has to be protected. And that's a good thing. It's natural, it's rational, but unchecked it can often lead to a bit more of a technocratic, paternalistic, dare I say authoritarian, way of thinking about protecting the public. Even back in the 20s and 30s, as offensive to our modern-day conception of democracy as it might have been, you know, Walter Lippmann and Harold Lasswell and some of those other early thinkers about propaganda viewed mass manipulation of the public by the elites in democratic governments as a feature of democracy, that it was their kind of solemn duty to interpret and lead the public to these…
TAYLOR: A paternalistic democracy.
GAVIN: Very much so, yeah. But the flip side of that is it kind of assumes that the public are a bunch of blank slates, empty vessels, rootless, disinterested sheep that can just be pulled to and fro. And I think that assumption itself is probably equally corrosive to the idea of democracy, or to the trust in institutions that we're saying is most under threat, because it reduces, again, these very complex phenomena. It distracts, probably, from a lot of the real-world policy responsibilities that leaders have to their publics.
MISHA: Coming back to Russia's kind of version of information warfare, do you think that having many people from the FSB, the former KGB, in leadership right now in Russia, including the President of course, contributes to the proliferation of information warfare and to its effectiveness against Russia's own population and also outward?
GAVIN: I would say yes, because of the inherently conspiratorial mindset and will to control, if you will, that's kind of inherent to intelligence agencies generally, but certainly to those particular intelligence agencies and the KGB pedigree that they emanate from. Not to essentialize, you know, Moscow, but I think it's fair to say that it's a bit of a fixture within strategic culture there that the conspiratorial mindset lends you both an outsized sense of hubris about what can and cannot be controlled through these covert operations, and also an outsized sense of danger around every corner that may or may not comport with reality.
I would argue, though, that that makes them less successful at information operations abroad, because as targeted societies become more inured to the tradecraft and the novelty starts to wear off, it becomes easier to detect. It becomes a little bit more cartoonish in hindsight, and everyone, whether on the psychological or the technical end, ends up kind of steeling their own defenses as a result. And as you zoom out and ask, well, what did that accomplish for Moscow? Was there a piece of ground captured or a geopolitical objective achieved through these information operations? I would argue that in a lot of arenas the reverse is actually taking place. So much of the information bombardment of Ukraine has had the absolute backfiring effect, certainly since 2014. And so at some point you have to ask, well, to what end was all of that effort?
TAYLOR: This is a slightly related question, but it's also a little bit different. In your observations of what's going on in Russia, have you seen any cyber mercenarism? Is anybody using third parties to try to conduct cyber warfare in any capacity, or is it mainly state actors that are launching cyber attacks?
GAVIN: So I think ChVK Wagner had started to spin up, or at least put a name on the box of, some kind of, you know, Cyber Front Z, I think it was called, or… My sense, however, in terms of the sophistication, is that it's a lot of low-level DDoS attacks, a lot of kind of disruptive but not necessarily decisive activity that is probably not very well coordinated with anyone in the state. It makes for good propaganda fodder. It creates this perception that there are, you know, cyber mercenaries abound in Russia, and they're going to target Ukraine and Ukraine's backers, and perhaps they did. But again, I would argue that on pretty much all fronts, cyber mercenaries by and large have more propaganda value than they do technical value.
MISHA: How does Russian leadership, or Russian leadership in the information warfare space, view technological innovations, for instance the advent of AI? Do they try to integrate it into their operations in any way?
GAVIN: I don't know that I've ever seen any use of AI at a broad scale that I would attribute to kind of a mass adoption by the state.
MISHA: For instance, would it be hard to convince President Putin that, in the long run, AI would be more effective at achieving Russia's goals, because he might have an older, more traditional conception of what information warfare is?
GAVIN: Well, President Putin has at least publicly stated that, you know, whoever gains the edge on AI will kind of lead the 21st century, or something to that effect. I don't know the degree to which that reflects his own personal views. He's reputed to be a technophobe and a little bit aloof from a lot of the most highly technical stuff. I think the biggest application of AI that I've seen in this space, the information warfare space, is the proliferation of deep fakes and cheap fakes, but I don't think it's necessarily reinventing the wheel in an impactful way. The deep fake of President Zelensky was pretty easily detectable, I think, for most folks. And again, that cuts to the question that I think is inherent to a lot of the research into information operations like that: is it truly persuading the not already persuaded? Is it truly fooling those who don't already have an inclination to be fooled? I think those are probably unanswerable questions more broadly, but it probably remains to be seen how much impact the deep fakes end up having on the broader environment.
TAYLOR: Going viral on Twitter doesn't necessarily mean that it actually changed that many minds, so it's a hard metric for sure. So why is it important to study cyber issues, specifically information warfare and propaganda? And as a follow-up, why is it important to study this specifically from the Russian perspective? What makes focusing on Russia important?
GAVIN: I would argue that Russia has probably been the most experimental and the most rigidly dogmatic about information as a tool, as an instrument, as a weapon, going back even into the Tsarist era. And that's why I think Russia in particular is a fascinating case, because there was so much effort at society building, at shaping the bounds of suasion and cognition and identity formation, that was itself very experimental for its own day, whether that's the post-revolutionary period after 1917, when broad swaths of the then Soviet republics were themselves illiterate, and so there was a real nation building, if you will, in the Russian experience that took place first with the benefit of propaganda and then later on with the benefit of these technological breakthroughs. So that, for me at least, is why Russia is such a useful lens through which to view information warfare as an area of study. As an extension, it also gives me a greater sense of humility about what is perhaps an outsized faith in what information and coercion and subversion can achieve in our kind of geopolitical sense.
MISHA: Maybe lastly, do you think 2016 caused an overreaction to Russia's cyber capabilities, that people overestimated what Russia can do not only with its own audience but also with the American audience? And did you have to convince anybody that Russia was indeed even a bigger threat, or…?
GAVIN: I would say broadly, yes, there was probably an overreaction, and we do a fair deal of mythologizing Russian cyber prowess that may lend them far more credit than they're due. But I also would say much depends on how you define the threat. The threat to critical infrastructure, or the threat from unchecked ransomware actors operating out of Russian territory, those have real costs and they affect real lives, and the degree to which that kind of activity is necessarily addressable or deterrable or resolvable through military or sanctions activity, I think, remains to be seen. But in that regard, I would say our response probably overemphasized the military component of it and underemphasized a lot of the other tools in our toolkits, whether that's law enforcement or diplomatic tools or export controls, you name it. But I think we're making progress in that regard in the US, also by shifting the focus away from the threat itself and more towards the resilience piece. In that regard, Ukraine has kind of offered a real model: assume that bad things are going to happen. Assume that there will be breaches. Assume that there will be disruptions. Put your focus on what it will take to reconstitute, how long it will take, what you need to get back to good once that does happen. And I think that changes the policy conversation as well.
TAYLOR: Thank you so much for your time today. And now is the time, if you want, to give a plug to any cool projects or developments you're excited about. Are you working on anything that you'd like to share?
GAVIN: Well, I do have a piece coming out of the Texas National Security Review, hopefully in the next couple of months, that covers some of the same ground. So with any luck, it will coincide relatively closely with the podcast. But yeah, a pleasure talking with you about it.
TAYLOR: Awesome. Thank you so much.
Gavin Wilde is a senior fellow in the Technology and International Affairs Program at the Carnegie Endowment for International Peace, where he applies his expertise on Russia and information warfare to examine the strategic challenges posed by cyber and influence operations, propaganda, and emerging technologies. Prior to joining Carnegie, Wilde served on the National Security Council as director for Russia, Baltic, and Caucasus affairs. In addition to managing country-specific portfolios, he focused on formulating and coordinating foreign malign influence, election security, and cyber policies. Wilde also served in senior analyst and leadership roles at the National Security Agency for over a decade, after several years as a linguist for the Federal Bureau of Investigation. The insights he generated for counterintelligence, policymaking, and warfighting consumers included co-authorship of the Intelligence Community assessment of Russian activities targeting the 2016 U.S. presidential election.
Wilde is a nonresident fellow at Defense Priorities and an adjunct professor at the Alperovitch Institute for Cybersecurity Studies at the Johns Hopkins University School of Advanced International Studies. He previously assessed geopolitical risk for multinational corporations as a managing consultant at Krebs Stamos Group, a cybersecurity advisory. His commentary has been featured in War on the Rocks, Lawfare, Just Security, Barron’s, New Lines Magazine, and elsewhere. He holds a BA in Russian Studies from the University of Utah and graduated with distinction from the National War College with an MS in National Security Strategy.
ABOUT THE HOSTS
Misha is a third-year graduate student pursuing a dual Master’s in Global Policy Studies and Russian, East European, and Eurasian Studies at UT Austin. He obtained a B.S. in Political Science from Texas Christian University in 2021. Being a native of Donetsk, Ukraine, Misha is passionate about researching Russian foreign policy in its Near Abroad and the Middle East, US-Russia relations, and Ukrainian politics. He is a native speaker of Russian and Ukrainian and speaks Arabic and Polish at an intermediate level. Follow him on X.
Taylor is currently pursuing a dual JD and Master of Arts in Russian, East European, and Eurasian Studies at the University of Texas at Austin. Her background includes a BA in Global Affairs from UT San Antonio with minors in Russian and Linguistics. She intends to work in energy law because she wants to make the world more efficient and sustainable while working with sophisticated clients, and her “dream job” would ideally expect her to utilize multiple languages (especially English and Russian) in the workplace. Connect with her on LinkedIn.