OODA Original

John Robb on Hyper-networked Tribes, Digital Sovereignty, Digital Identity, Digital Rights and “The Long Night” (2 of 2)

In June of 2020 and again in October 2021, Matt Devost spoke with former Air Force pilot John Robb. John has worked successfully across a variety of domains: the special operations community, industry analysis, start-up founding, and national security.

John Robb has distinguished himself through decades of critical analysis on a range of security, technology, and cultural issues, in his popular book “Brave New War” as well as his current Global Guerrillas content community and the GG Report.

We continue our effort to underscore certain patterns and themes found throughout the OODAcast library of over 80 conversations with leaders and decision-makers, on topics such as leadership, clear decision-making while operating in a low information environment, future threats, and strategic action.

In Part II of this conversation, Matt and John discuss hyper-networked tribes, digital sovereignty, digital rights, and “The Long Night.”

Matt Devost: When [we last spoke] over a year ago, we were right in the middle of the pandemic, with a lot of emerging issues and frameworks still being developed around it. Things were in a lot of turmoil then, but we’ve had a lot of turmoil since as well. So, I’d love to get your perspective: we had an election, and then we had rioting on January 6th. As I was watching and observing that, it seemed to play into a lot of your thinking around the emergence of tribes and tribalism, and a new framework for thinking about the world. Can you give us your perspective on those issues? And does it fit within that tribal framework you’ve been articulating for a while now?

John Robb: What we saw last year and what we’re seeing today plays into networked tribalism. The first thing is that the online world, how we interact online, is rewiring our brains. We are thinking differently now. It’s moving us from reading something and thinking in isolation towards scanning information. The information flows are very, very high. We can’t focus and drill down much. What we do is pattern match. We pull pieces of information out of that flow that match patterns we curate. Done alone, that’s our own small pattern that helps us make sense of the world. But it can also be done in tandem with millions of other people curating a larger pattern, and those patterns tend to be the basis for a lot of tribalism.

Devost: There’s been a great Wall Street Journal series on Facebook. Not to just pick on Facebook; we have lots of different social networks that operate with similar dynamics. But because of that series, we get a little more insight behind the scenes at Facebook: the dependence on algorithms to build the feeds we see as we browse, and the bias towards things that make people angrier or get them upset, right? The algorithms reward that type of behavior because it gets a reaction. Does that mean we’re creating hyper-networked tribes, guided by AI algorithms that only react to engagement, that seem to create some of the issues we’re dealing with?

Robb: I saw that firsthand a couple of weeks ago when I was testifying before the Senate [Subcommittee on Competition Policy, Antitrust, and Consumer Rights], the big-data group trying to regulate Facebook and Google. The day before, there was a story about some research done inside Facebook on how Instagram impacted the psychology of young females in particular, and all the senators piled on; they were all attacking Facebook. You should have seen the poor Facebook rep’s face when he wasn’t on camera. It was like the guy was being accused of child murder. It was a total pile-on. Some of those senators were really aggressive.

The big problem I have is with this kind of Facebook regulation and with Facebook’s algorithms. It’s clear that Facebook’s algorithm does promote memetic-level, id-level content distribution: the stuff that people like to share and giggle over, that they think is kind of cool. That kind of low-level content gets shared pretty easily on Facebook because it drives engagement. And sometimes that bleeds into political attacks. So there have been a lot of changes at Facebook over the last year and a half, particularly in the run-up to the 2020 elections, that tended to mute that.

A lot of what we’re talking about in terms of how these algorithms worked prior to 2019, and the way they changed those algorithms, I think played into a kind of muting of Trump going into the election. Accurately assessing that impact is not going to be possible, because there’s no access to any of that data. But consider the way the Senate is approaching it. Maybe I was brought in to talk about digital ownership and digital rights in order to scare Facebook and Google a little. Going into that hearing, though, it was clear to me that the senators and Congress, and a lot of other interest groups, are not so much interested in reforming Facebook and Google and the like; they want to take control of them.

If you step back and look at what they’ve created, and you know a little about how the NSA and other organizations work, you can see that the advances over the last 20 years in data collection and surveillance, in being able to actively make sense of what’s going on and then act on it, have been intense. We are talking about a quantum leap in capability in just the last 20 years. But all of this has been done on the commercial side. For the most part, that’s where most of the data is and where most of the AI development is happening; the more data you have, the better your AIs. They have basically created a surveillance state in a box. China recognized this pretty early on, three or four years ago, when Xi decided he wanted to become a dictator. He latched on to the surveillance state system being built by their corporations.

Devost: The FDA released a list of AI- and machine-learning-enabled medical devices, and it’s a lot longer than I thought it would be. The use of that data in medical technology is definitely accelerating. But we do seem to have lost track of how we enable that data. Some of your thinking that I’ve been a huge proponent of is trying to find models for digital self-sovereignty: getting more control over how our data is tracked and how that data is used. That seems like an incredibly difficult challenge; we can’t even convince some of these platforms to give us chronological timelines. They won’t let us just scroll sequentially based on when something was posted; they want to feed us what the algorithm wants us to see. So, do you think we can develop any sort of model for digital self-sovereignty that will help define how our data is collected and used, and give us a little more control over it, over the next 10 or 20 years?

Robb: We’re seeing more technology applied to healthcare, and I think COVID helped a lot with that. COVID helped us advance in many, many ways. Work from home improved quality of life; for most people, when you do the surveys, quality of life actually went up during COVID. It also enabled online medical visits and all that other stuff. All the rules that protected doctors from online competition kind of evaporated for a while, and we were able to get into that groove. So, we’re seeing a lot more advances. In terms of the sovereignty issue, I see it as one of those fundamental switches: if we can get it right, everything else gets better, faster, and we get wealthier and happier.

I’ve been likening it to feudalism, the feudal relationships of the 1500s that were broken up by the printing press and the Reformation. Up until that time, everyone was pretty much a farmer, but you were farming land owned by a lord; only the aristocracy was allowed to own land. I see privacy within that context as, you know, the lord could only beat you on Sundays, right? That kind of thing. It’s just a lessening of the damage. It was only when people had individual plots of land, their own farms, free from any external control or ownership, and were allowed to have private property and build wealth at an individual or family level (and I think the colonization of the U.S. accelerated this), that things just changed.

Then we saw the middle class, and broad markets for all the products and services we enjoy today: electricity, the telephone, and all the other appliances. So how do we do that in the data space? If data becomes that source of value creation over time, the way to do it is through data ownership. You know, I was sitting there during the testimony and they had a major data broker present. They’re in the business of buying data from all these different providers and selling it to marketing companies. Well, where is the consumer group? Where is the one that represents my interests?

If I have money to invest, I have banks and brokers; I can go to a financial advisor, and they have a fiduciary duty and lots of regulation in terms of how they do their job, but they’re looking after my interests. They’re trying to make me money, for the most part, and trying to better my life. I would want something similar for my data. I want to take it out in real time and put it into a repository, a kind of data bank, fed from all these different sources, that is then made available to people who are building AIs for XYZ. It unlocks the potential of the data, because the data only has a certain number of uses within Facebook and within Google. They only have a certain vision for where things are going to go, but if you open it up to all these different players, you can see things develop: new products and services coming out of this stuff that we could only imagine.

Devost: I think you’ve tied into a lot of things that have been a focus for me as well, but introduced some new concepts. I like this idea of the data plot and the analogy with feudalism, for certain. And then having a data fiduciary, that’s important as well, right? They’re required to look after your best interest from a data perspective. And identity, I think, is also central. I’ve been playing around with that in a piece I’m working on right now. If we start thinking through new models of cybersecurity and we’re trying to protect some of these critical infrastructures, say banking for example: maybe you are unable to route to a bank’s application unless your identity is registered as a customer of theirs, the same way that you provide your driver’s license, right? It kind of reworks how networks and routing and all of that are supposed to work. But I think those are the new models we’re going to need in order to really advance cybersecurity and deal with some of the problems we’ve been facing from a cyber-crime, cyber-espionage, and cyber-conflict perspective. So, fascinating.

Robb: Once you have an identity, you need to be able to trust that it’s being used in your best interest. And I don’t have any problem with it being a thousand-factor identity check, right? It’s looking at every single service and everything I’m looking at and touching, including the standard facial and thumbprint factors or whatever else you want to add. You then build AIs to tie it all together and make it hard to guess which things would trigger it; because you don’t know what the data model is behind the scenes, you could make it very tough to crack an identity. But you’ve got to get people to trust that accessing all this information, from the GPS data on your phone to your patterns of use and all the different pictures and things that would indicate who you are, is safe.
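The many-factor check Robb describes can be made concrete with a minimal sketch. Everything here is hypothetical (the signal names, weights, and threshold are invented for illustration); the point is only that many weak signals, each stored as a hash rather than a raw value, combine into one accept/reject decision:

```python
import hashlib

# Purely illustrative sketch of a "many weak factors" identity check.
# Every signal name, weight, and threshold below is hypothetical.

def fingerprint(name: str, value: str) -> str:
    """Store only a hash of each enrolled signal, never the raw value."""
    return hashlib.sha256(f"{name}:{value}".encode()).hexdigest()

def verify_identity(observed: dict, enrolled: dict,
                    weights: dict, threshold: float) -> bool:
    """Sum the weights of the observed signals that match the enrolled profile.

    No single factor is decisive: an attacker who fakes one signal
    (say, a stolen device fingerprint) still falls below the threshold
    unless most of the other behavioral signals also match.
    """
    score = sum(weights.get(name, 0.0)
                for name, value in observed.items()
                if enrolled.get(name) == fingerprint(name, value))
    return score >= threshold

# Enroll a hypothetical profile of weak signals.
enrolled = {"device": fingerprint("device", "pixel-7"),
            "gait": fingerprint("gait", "pattern-a"),
            "gps": fingerprint("gps", "home-cell")}
weights = {"device": 0.5, "gait": 0.4, "gps": 0.3}
```

Because the weights and threshold live server-side, a caller cannot tell which signals tipped the decision, which is the "hard to guess which things would trigger it" property Robb points to.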

Devost: Deriving value has been a huge theme for me. I want to derive value from my own data. There’s so much information collected on me that I could use for my own benefit individually, but we just don’t have access to the data, and we don’t have access to the tools, so that becomes difficult. Hopefully we’ll be successful with some of these initiatives, but obviously with a lot of friction, because the current economic model doesn’t support it. So, there’s a lot of disruption that needs to take place.

Robb: Yeah, I think we can get it ported out. It’s just standard-setting and getting the political backing to say “this is the standard,” then giving people who know how the systems work six months to come up with a standard for the API that gets the data out and for how it should be stored. Then you have companies set up as data brokers who will store it, with different business models: some will resell, some will just store it. But you won’t get the kind of privacy-control features for telling Facebook and Google and everybody else what you want to share and what you don’t. Those things won’t be built in an intelligible way unless you have that kind of centralization, where a data broker is doing it for you, because the folks at Facebook and [other social media platforms] will make it hard to access and hard to find, and the tools will be limited.
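Robb's standard-setting idea, a common export format plus a user-chosen data bank that enforces the owner's sharing policy, can be sketched in miniature. This is not any real platform's API; every class, field, and method name here is invented for illustration:

```python
import json
from dataclasses import dataclass, asdict

# Illustrative only: what a minimal standardized export record and a
# user-controlled "data bank" might look like. No such standard exists
# today; all names below are hypothetical.

@dataclass
class ExportRecord:
    source: str        # platform that held the data, e.g. "examplebook"
    category: str      # "posts", "location", "ad-interests", ...
    collected_at: str  # ISO-8601 timestamp
    payload: dict      # the data itself, in the platform's own schema

class DataBank:
    """Repository the user controls: it ingests records from any source
    that speaks the common format, and enforces the owner's sharing
    policy on the way out."""

    def __init__(self):
        self.records = []
        self.allowed = set()  # categories the owner has agreed to share

    def ingest(self, raw_json: str) -> None:
        """Accept one record in the (hypothetical) common export format."""
        self.records.append(ExportRecord(**json.loads(raw_json)))

    def grant(self, category: str) -> None:
        """Owner opts a category in; everything else stays private."""
        self.allowed.add(category)

    def release(self) -> list:
        """Hand out only the categories the owner explicitly granted."""
        return [asdict(r) for r in self.records if r.category in self.allowed]
```

The design point is that the sharing policy lives with the user's broker, not with each platform, which is exactly the centralization Robb argues the privacy controls depend on.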

I call it The Long Night. It is just locked into the infrastructure. There’s no dynamism…there’s no progress. Anyone who dissents is shut down…

Devost: Some of the Twitter threads you’ve been posting around “The Long Night” caught my eye. Can you step us through what you’re referring to when you refer to “The Long Night?”

Robb: So, The Long Night. There are lots of ways it could play out. It’s very similar to what we’re seeing in China right now: this idea that we can have this kind of surveillance system, which is being built in the consumer space with big tech, that gets deeper and deeper into our lives and starts to control us and limit what we can say. We’re seeing that right now in Congress; they want more of this system to enforce speech codes more aggressively and shut things down, like QAnon and the anti-vax stuff earlier.

They don’t want any dissent. Anything that could potentially be disruptive, they want to shut down. And as this develops, you’ll see more and more controls on what you’re allowed to think, how you’re allowed to act, and what you’re allowed to buy. It narrows us down to a very sterile orthodoxy in how you think and act generally: at work, at home, online, in public. Everything’s going to be interconnected with that. Once that locks in, and once it’s enforced by AIs that can look at the conversations of a billion people in real time, make sense of them, and then act on them, and as those AIs get better and better, you are trapped; you are locked in, because it is part of the infrastructure. There are no gulags in the traditional sense. If you violate any of these things egregiously, you will just be disconnected. And being disconnected in a modern society like ours is tantamount to cutting your income to nothing.

And your social life and social sphere are limited to who you can interact with face to face. It’s economic and social suicide to a certain extent. So, you don’t have to imprison people; you just have to disconnect them. I think that kind of system, once it locks in, in different societies around the world, could be the equivalent of a kind of feudal dark age. I call it The Long Night. It is just locked into the infrastructure. There’s no dynamism, there’s no moving forward, there’s no progress. Anyone who dissents is shut down because it’s disruptive. One of the bad ways it could end up being directed: using a lot of these emergency procedures to create a kind of continuous emergency, this long, long emergency – the perma-crisis, as I have been calling it – as we move into global warming issues, and using that to control society, to reduce consumption and reduce activity.

But anyone who dissents from that – I want to drive a little extra, or use my electricity a little more, or buy this or that product – will be crushed. People will be squeezed. They will be ostracized and disconnected. There will be punishment for that. And you know how that ends up. No matter how you slice it, without digital rights and data ownership and all these other things protecting us, we have no protections against corporations doing this.

So, this kind of connected world could be abused in many ways. We are actually seeing it, but no one seems to really care.

Devost: It’s interesting when you weave in the government. It’s been fascinating to watch over the past couple of months: there is a large global retailer that had a little friction with China, and as a result, you can’t get a ride-sharing app to take you to that retailer’s locations at the mall, because they no longer exist; they were just wiped out of the data stream. That starts to have an incredibly significant impact. You can’t route to the website; the brand does not exist on the internet. It doesn’t exist as a place you can route to. When you centralize that control, it can be exerted in ways that are impactful not only to individuals but to corporations as well.

Robb: It’s interesting, because the way the Chinese are exerting control over their domestic corporations applies a lot to how they’re exerting control globally. Just look at the Hollywood situation. Hollywood’s not allowed to make any movie that’s detrimental or negative about China, about the government, or about its people. No geopolitical thrillers where the Chinese are the bad guys are allowed to be made. And that applies all the way back to the writing: as soon as somebody writes something like that, they’re immediately disconnected. It doesn’t even matter if the film is going to be distributed in China; they’ll punish the entire studio if they do that. The studios, therefore, make nothing that would piss China off, because then they wouldn’t be able to do any business with China or with any Chinese companies.

So, this kind of connected world could be abused in many ways. We are actually seeing it, but no one seems to really care, right? And as it starts to come into our daily lives, we see more little bits of it. That person was disconnected? Well, yeah: “They said something wrong and they deserve it.” But what’s the impact of that? Is it commensurate with what happened? Is there any record of it? What if it happens to you and you don’t know why?

“What we are seeing now because of networking…is an increasing sliding towards that kind of network fascism.”

Devost: You are cut off from those social interactions, and we use them for everything. Just as an example, we communicate for the middle-school hockey team that I coach through a Facebook page. So, if I’m locked out of Facebook, I have lost a primary communication mechanism just for coaching a school hockey team. To get back to the entertainment side: even if you just think about the economics of control, Legendary Pictures was bought by a Chinese entertainment group five or six years ago. How many people are aware of that, or of the potential influence it could have on what a studio like that might produce?

Robb: Yeah, it’s one of the models I’ve been using for what we are seeing: a kind of redux of classic fascism. The fascist model is basically dictatorial capitalism. They figured out a way to take a capitalist system, where you have a bunch of corporations doing a lot of the work of organizing society and the economy, and get them all to align and focus in the same direction. They all go in the same direction and tend to negotiate disputes between each other, but there’s still a little bit of the friction of capitalism that operates in terms of driving towards efficiencies. But the old model required tons and tons of propaganda to achieve even the limited alignment of companies that it did.

But it seemed to work; they aligned every organization in society that way. What we are seeing now, because of networking, and because of this realization that democratic capitalism didn’t actually win at the end of the Cold War, is an increasing slide towards that kind of network fascism. We see it in China, where the government sets the direction and uses all its network controls to get all the companies, organizations, and individuals to face in the same direction, toward the same model of society that it is advocating.

And we could end up seeing the same thing here, except the corporations would probably be more on a partnership level with the government rather than subservient. I suspect we will see agreement more around a lot of the progressive agenda, because those groups tend to be the strongest, most vocal, and most influential in terms of their ability to influence corporations and the government and to set those goals.

Part I:  John Robb on the Early Internet, Frameworks to Drive Decision Making, Network Tribalism and Emerging Threats (1 of 2)

The original OODAcasts:

OODAcast: John Robb on Global Guerrillas and Frameworks to Drive Decision Making

OODAcast: Digital Self-Sovereignty and Avoiding the Long Night with John Robb

Other Resources:

John’s Full Bio

Global Guerrillas Report

John Robb on Twitter

John’s Book Brave New War

Anathem (John’s favorite book from the last year)

Other recent OODAcast thematic posts 


Chet Richards and the Origin Story of The OODA Loop (Part 1 of 2)

Chet Richards on Applying OODA Loops in Business (Part 2 of 2)

Dan Gerstein and Lance Mortlock on Technology Futures and Scenario Planning

Ellen McCarthy and Kathy and Randy Pherson on Intelligent Leadership and Critical Thinking

Richer and Becker on Domestic Terrorism, Cyber, China, Iran, Russia, and Decision-Making

Omand and Medina on Disinformation, Cognitive Bias, Cognitive Traps and Decision-making

Clapper and Ashley on Joint Ops/Intel Operations, Decision-making, the History and Future of Intelligence and Cyber Threats

OODAcast 9/11 Perspectives 

Decision-Making Inside the CIA Counterterrorism Center Before, During, and After 9/11

A CIA Officer and Delta Force Operator Share Perspectives on 9/11


Daniel Pereira

Daniel Pereira is research director at OODA. He is a foresight strategist, creative technologist, and an information communication technology (ICT) and digital media researcher with 20+ years of experience directing public/private partnerships and strategic innovation initiatives.