We have integrated Center for Security and Emerging Technology (CSET) research into our OODA Loop research and analysis on topics ranging from artificial intelligence and dis- and misinformation (information disorder, which we characterize as a crucial strategic need for National Cognitive Infrastructure Protection) to technology talent retention and the CHIPS Act.
The recent CSET report “China’s Advanced AI Research: Monitoring China’s Paths to ‘General’ Artificial Intelligence” examines what paths to general AI are available in principle, as a prelude to describing work underway in China to realize that capability. The report authors also “preview a pilot program…as a starting point for a China-focused indications and warning watchboard…that will track China’s progress and provide timely alerts.”
With the technological infrastructure that physicists and astronomers have brought to bear for decades as its foundational metaphor, the Stanford Internet Observatory (SIO) was launched two years ago to create an equally powerful social sciences-based “observatory” for internet researchers.
We analyze the SIO’s latest offering, in partnership with Graphika: the August 24th release of a joint investigation into “an interconnected web of accounts on Twitter, Facebook, Instagram, and five other social media platforms that used deceptive tactics to promote pro-Western narratives in the Middle East and Central Asia. The platforms’ datasets appear to cover a series of covert campaigns over a period of almost five years rather than one homogeneous operation,” which the SIO authors believe is “the most extensive case of covert pro-Western influence operations on social media to be reviewed and analyzed by open-source researchers to date.”
US authorities have indicted a Russian national accused of running a campaign to sow discord and interfere in elections. The campaign took place in California and was orchestrated by at least three Russian officials. The campaign ran from December 2014 to March 2022, according to authorities.
DHS calls it MDM, “mis-, dis-, and mal-information,” and according to the National Terrorism Advisory System Bulletin of February 7, 2022 (which is released quarterly), it is the greatest terrorism threat to the U.S. It is time to move away from pure analysis of the problem and to start testing tools and formats for mounting an American-style psychological defense and addressing the failures in our cognitive infrastructure. Graphic novels, aka comic books, are a uniquely American art form, which may make them a great place to start with a younger generation in the classroom, or when onboarding employees at a company or a government agency. It is a compelling medium for many generations. At least CISA thinks so, having released the first in a series of graphic novels.
Promising Research and Analysis Topics and Projects Emerge from the April 2022 OODA Network Member Meeting
To help members optimize opportunities and reduce risk, OODA hosts a monthly video call to discuss items of common interest to our membership. These highly collaborative sessions are always a great way for our members to meet and interact with each other while talking about topics like global risks, emerging technologies, cybersecurity, and current or future events impacting their organizations. We also use these sessions to help better focus our research and better understand member needs. This month’s call was marked by more than the usual number of follow-up commitments on what were clearly promising ideas and projects with great potential for OODA Loop research and analysis (and are also a bit more time-sensitive than usual due to the crisis conditions in Ukraine).
EU DisinfoLab is an independent non-governmental organization (NGO) focused on “researching and tackling sophisticated disinformation campaigns targeting the European Union, its member states, core institutions, and core values.” The lab has created the Ukraine Conflict Resource Hub with essential information and links to reliable research, analysis, and fact-checks to help [organizations] navigate during this crisis.
While compiled in the context of the war in Ukraine, these open-source intelligence tools are also broadly useful for gaining a competitive advantage by strengthening your organization’s cognitive infrastructure.
Meta has been forced to take down a deepfake of Ukrainian President Volodymyr Zelensky after it went viral on Russian channels. The deepfake consisted of doctored footage of Zelensky in which he appeared to call on the military to lay down their arms.
Ben Dubow is CTO and founder of Omelas, a firm that provides data and analysis on how nations manipulate the web to achieve their geopolitical goals. He has a background in research on Russian and Chinese online information operations and is a recognized expert, having appeared in international media including Reuters, Bloomberg, and Roll Call. Ben began his career tracking jihadi, white supremacist, and Iranian activity online before joining Google, where he played a lead role in removing ISIS content from YouTube and establishing the Redirect Method to counter violent extremism. Before Omelas, Ben was Secretary of Code To Inspire, a nonprofit that teaches Afghan women to code. Ben speaks Arabic, French, Farsi, and basic Russian.
The Commission on Information Disorder Final Report opens by clearly sounding an alarm: “Information disorder is a crisis that exacerbates all other crises. When bad information becomes as prevalent, persuasive, and persistent as good information, it creates a chain reaction of harm.” We take a look at the report. What sets this research apart? In our final analysis, of the many formative efforts to research and provide solutions to the misinformation crisis, this report is the seminal document to date for how best to frame the issue.
Meta has been working to take down adversarial networks across the world that were operating on Facebook and engaging in behavior such as spreading false information, harassing users, and attempting to have legitimate information taken down. Meta stated that the groups violated rules set forth in its Coordinated Inauthentic Behavior policies.