
The New Red Atlas is You

During the Cold War, the Soviet Union ran a clandestine urban mapping project to develop detailed cartographic material for major United States cities. These efforts are chronicled in the book “The Red Atlas: How the Soviet Union Secretly Mapped the World”.

As we’ve clearly entered into Cold War Two, Russia is engaging in a new clandestine mapping effort, but this time the new Red Atlas territory is you.

In 2015, my personal data was breached in three very significant ways. My health care insurance provider, a credit bureau, and the U.S. Office of Personnel Management were all breached, resulting in a plethora of data about Matt Devost falling into the hands of current and future adversaries. While that data represents a significant risk to my personal security, it is also a fairly static digital trail. The data being collected through social networks and my digital applications and devices paints a far more accurate picture of who I am and how I make decisions. This micro-data collection, and its aggregate value in psychographic profiling, represents the data exploitation target of the future.

Given that most users are likely being exposed to these issues for the first time, this essay captures what I think are some useful data points about the current state of affairs and where we should direct our efforts in the future.

We Built This City

As technologists have famously articulated, if you aren’t paying for a digital product, you are the product. Social networks and other “free” digital applications become boring investment opportunities if their sole path to financial success is to charge a monthly subscription fee. Instead, they trade at high multiples based upon how well they can monetize the data they collect to pair advertising demand with targeted consumer purchasers. It is, in essence, the nature of the snake, so we should not feign surprise when we are occasionally bitten as a result. In fact, I’m not only an active user of Facebook, but also an early investor. It is a Catch-22 hedged bet against the economic value of my own personal data – my way to monetize the privacy I’ve traded away as a user of the platform.

However, data collection is not limited to “free” services. For example, I pay hefty amounts to my mobile telephone provider each month, yet it collects and monetizes granular data about me as well. So do my Internet Service Provider, my music and video subscription services, and others. Ask yourself: is the company Square valued based upon the transaction fees it charges merchants to process credit cards, or upon the value of the purchase-history dashboard it can offer advertisers? Data is the new economic model of the digital era, and since we keep coming up with creative ways to collect more of it and more granular ways to extract value from it, there is a whole ecosystem of current and future economic opportunity inherently linked to it. There is, for lack of a better analogy, a Data GDP.

Eyes Wide Open

The Cambridge Analytica data privacy disclosures are just the tip of the iceberg of a concerted, widespread effort to develop detailed data archives, behavioral analytics, and psychological profiles on each and every user of online data services. The infrastructure for this data collection rests on the backbone of the advertising industry – designed to tailor online advertising and experiences that drive you towards a commercial transaction. However, as we’ve seen with the Cambridge Analytica story, this data can also be exploited to influence other behaviors, both online and offline. It can shift sentiment, foster alienation, and amplify societal preconceptions.

Watching how these issues play out in the legal and regulatory bodies in D.C. will be interesting, but we shouldn’t let it be a distraction. The problems around aggregate data collection go far beyond Facebook. We need to drive consumer awareness around how their data is collected and how it characterizes them in the data value chain.

Your Friends are a Vulnerability

In the late 2000s, a Georgetown University student of mine embarked on a study exploring the risks that social networks introduce to individual privacy. By nature, social networks expose what could be considered private information to friends, so what happens when your friends are used as collection channels for your personal data? I won’t reiterate his Master’s thesis here, but you can find an excerpt from 2009 that focused specifically on Facebook friends. What Nagle found was that friends do greatly impact our privacy, with many representing a lowest-common-denominator exposure of our social network data. The problem is exacerbated by the fact that we all have friends who are less discerning about which friend requests they accept and which applications they grant access to their social media profiles. As a result, we develop a social network herd vulnerability that amplifies our individual susceptibility to privacy compromises. The only way to counter this is additional privacy controls that restrict third-party access to our profiles even when that access is facilitated through a friend.
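The lowest-common-denominator effect described above can be sketched in a few lines. This is an illustrative model only – not any platform's actual permission system – and all names and setting values are hypothetical:

```python
# Illustrative model of "herd" privacy vulnerability: your effective
# exposure to a third-party app is set by the most permissive member of
# your social circle, not by your own settings. Hypothetical values:
# 0 = private, 1 = friends-only, 2 = shared with third-party apps.

def effective_exposure(my_setting: int, friend_settings: list[int]) -> int:
    """If any friend shares friend-data with apps, your data leaks
    at that friend's level regardless of your own choice."""
    return max([my_setting] + friend_settings)

me = 0                  # I share nothing with third-party apps
friends = [0, 1, 2, 0]  # but one friend grants app access to friend data
print(effective_exposure(me, friends))  # -> 2: exposed despite my setting
```

The point of the model is that `max()` – not your own setting – governs the outcome, which is why per-user controls alone cannot fix the herd problem.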

We Are Stepping in Front of a Moving Train

As we introduce augmented reality and quantified-health devices into our lives over the next 3-5 years, the quality and quantity of data that can be collected will expand exponentially. Where a website can track how long your mouse hovers over an object on a page or how long your attention lingers on an article, augmented reality devices will be able to collect that data for the real world. Did your gaze linger on that piece of Rimowa luggage in the airport? Does your heart rate climb when talking to brunettes at the local watering hole? The ability to collect and quantify this type of data is on the near-term horizon, so we should be planning for how to protect it now. In the cybersecurity field, I’ve been a longtime advocate of building security into the design process for new technologies. If we don’t start building privacy into the design process as well, we will continue to deal with these issues on a recurring, and much more impactful, basis in the future.

There IS a Blockchain Opportunity Here

It is quite in vogue to blockchainify all of our current approaches to complex problems, but there is a realistic blockchain solution here. Imagine a blockchain that provided complete transparency into who is advertising to you and which elements of your profile tagged you for the advertisement.

Imagine a blockchain that gave you control over your online profile and provided economic incentives for the portions of your profile that you chose to expose to advertisers (in the form of blockchain credits that can be redeemed for discounts and special incentives). My guess is that immutable ledgers of advertising would appease the regulators and also rebalance the incentive architecture of advertising towards the consumer. Such a change seems almost inevitable at this point.
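As a minimal sketch of what such a transparency ledger might record, here is an append-only hash chain in Python. Every class, field, and value below is hypothetical; a production system would sit on a distributed consensus layer rather than an in-memory list, but the tamper-evidence mechanism is the same:

```python
import hashlib
import json
import time

# Sketch of an append-only ad-transparency ledger: each entry records who
# advertised to a user and which profile attributes were used to target
# them. SHA-256 hashes chain the entries together so past records cannot
# be silently altered. All names and fields are hypothetical.

class AdLedger:
    def __init__(self):
        self.chain = []

    def record(self, advertiser, user, targeting_attrs):
        entry = {
            "advertiser": advertiser,
            "user": user,
            "targeting_attrs": sorted(targeting_attrs),
            "timestamp": time.time(),
            "prev_hash": self.chain[-1]["hash"] if self.chain else "0" * 64,
        }
        payload = json.dumps(entry, sort_keys=True)
        entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
        self.chain.append(entry)
        return entry

    def verify(self):
        """Recompute every hash and check the chain links are intact."""
        for i, entry in enumerate(self.chain):
            expected_prev = self.chain[i - 1]["hash"] if i else "0" * 64
            if entry["prev_hash"] != expected_prev:
                return False
            body = {k: v for k, v in entry.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True)
            if entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
                return False
        return True

ledger = AdLedger()
ledger.record("LuggageCo", "user123", ["travel", "income_bracket_4"])
ledger.record("BabyGoodsInc", "user123", ["expecting_parent"])
print(ledger.verify())  # True: any later edit to an entry breaks the chain
```

Because each entry's hash covers the previous entry's hash, rewriting any record invalidates every record after it – the property that would let a user or regulator audit exactly why an ad was shown.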

Old Data Never Dies, but it Should Expire

I’m a big fan of ephemeral messaging applications like Wickr and recently acknowledged during a conference presentation that I had just 1) sent my Social Security number to someone over email and 2) discussed BBQ brisket recipes over Wickr – neither of which seemed incongruous to me. The walk you take through your neighborhood and the conversations around the water cooler are ephemeral by nature, in that they persist only in the memories of the persons involved. Not so for your walk around the social graph or the online conversations you have.

We need user-defined options for how and when our online information is stored. The default on most platforms is to retain everything indefinitely, which provides incredible long-term granularity for technological profiling. Data expiration dates won’t stop state actors from infinitely storing data they get access to, but they will at least limit the exposure when the privacy of these platforms is exploited. They also go a long way towards neutralizing the long-term effects of data branding, which I’ll define loosely as associating a person over the long term with a piece of data that might no longer apply. Consider the expectant mother who suffers a tragic miscarriage but continues to be targeted with advertising for newborn and toddler products over the subsequent years because the algorithms missed an important data point.
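A minimal sketch of what user-defined expiration might look like in a storage layer: attach a time-to-live to each record at write time, and purge anything past its expiry on read. The class and field names are hypothetical, not any platform's actual API:

```python
import time

# Sketch of user-defined data expiration: each record carries a TTL
# chosen by the user at write time; reads drop and purge anything past
# its expiry. A real store must purge expired rows from disk too, so
# they cannot surface later in a breach. All names are hypothetical.

class ExpiringStore:
    def __init__(self):
        self._rows = []

    def put(self, data, ttl_seconds):
        self._rows.append({"data": data,
                           "expires_at": time.time() + ttl_seconds})

    def get_all(self, now=None):
        now = time.time() if now is None else now
        # Purge, don't just hide: expired data should not linger.
        self._rows = [r for r in self._rows if r["expires_at"] > now]
        return [r["data"] for r in self._rows]

store = ExpiringStore()
store.put("brisket recipe", ttl_seconds=7 * 24 * 3600)  # keep for a week
store.put("location ping", ttl_seconds=0.0)             # expire immediately
time.sleep(0.01)
print(store.get_all())  # ['brisket recipe']
```

The key design choice is that the TTL belongs to the user, not the platform – the default retention policy becomes a per-record decision instead of "forever".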

All Along the AI Watchtower

For many years, Facebook has been able to declare ignorance about exactly how its platform is used for influence operations, based solely on the mass quantities of data and social and economic transactions it stores and processes. However, as machine learning and artificial intelligence technologies progress, companies should implement them to discover anomalies or malicious transactional behavior in their data. These AI watchers can be designed to be as agnostic as possible and removed from the economic bias inherent in the advertising business. The role of the AI would be to surface issues for a company’s internal intelligence function to investigate – to notice trends, nuances, and provocations in how private data is being used.
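As a toy illustration of the AI-watcher idea – not anything any platform actually runs – even a simple statistical baseline can surface an app whose profile-data pulls dwarf the population norm. All numbers and names below are invented, and a real system would use far richer models than a z-score:

```python
import statistics

# Toy "AI watcher": flag apps whose profile-data pulls deviate sharply
# from the population baseline (a z-score test here; production systems
# would use richer anomaly-detection models). All data is invented.

def flag_anomalies(pulls_per_app, threshold=2.0):
    counts = list(pulls_per_app.values())
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts) or 1.0  # avoid divide-by-zero
    return [app for app, n in pulls_per_app.items()
            if (n - mean) / stdev > threshold]

observed = {
    "quiz_app": 87_000_000,  # pulls friend data at massive scale
    "game_a": 12_000, "game_b": 9_500, "news_app": 15_200,
    "weather": 7_800, "fitness": 11_100,
}
print(flag_anomalies(observed))  # ['quiz_app'] surfaced for human review
```

Note that the watcher only surfaces the outlier; the judgment about whether it is malicious stays with the internal intelligence function, which keeps the detector agnostic as argued above.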

I Am Big Data and So Are You

Individuals need mechanisms to mine their own big data for personal benefit. This is a position I outlined in my 2013 essay calling for access to the personal repositories of the data we generate so that we can build self-serving applications on top of them. Today’s personal data environment has disenfranchised users from obtaining value from their own data. Future applications should focus on empowering users to make use of their digital DNA to drive decisions and efficiencies at the individual level.

Big Data – Big Problems – Big Opportunities

The continuous generation of big data is a new global reality, and the Cambridge Analytica issue just scratches the surface of the ways this data can and will be exploited in the future. It also presents a unique opportunity to establish meaningful precedents for how this data will be collected, protected, and utilized. Some of that opportunity will come through regulatory initiatives, but the real change will be driven by innovation that disrupts the existing big data dynamics and rewrites the status quo.


Related Reading

Can Friends be Trusted?

I Am Big Data and So Are You

The Red Atlas

Weapons of Math Destruction

Data and Goliath

Age of Context

Small Data

Matt Devost


Matthew G. Devost is a technologist, entrepreneur, and international security expert specializing in counterterrorism, critical infrastructure protection, intelligence, risk management, and cybersecurity issues. Matt co-founded the cybersecurity consultancy FusionX (2010-2017) and was President & CEO of the Terrorism Research Center/Total Intel from 1996-2009. For a full bio, please see