Social Media IO Roundup
This project is focused on identifying possible State-Sponsored Information Operations (IO) across various Social Media platforms.
“You cannot defeat an enemy you do not admit exists.”
- Michael T. Flynn
——
Review:
Trolls Part Two: Trolling: The business of chaos.
Wrap up:
What is a sock puppet?
What is a troll?
How to set up a sock puppet account for trolling.
Characteristics of a troll/sock puppet account.
Influence operations.
Examples of influence operations.
What is a narrative?
How is a narrative weaponized?
Trolls Part One: Trolling: The business of chaos.
Wrap up:
Definitions: IO, PSYOPS, Influence Operations, Actor(s), Sock Puppet, Persona, Nation-States.
Troll (individuals, transnational organized crime, and nation-states).
IO campaigns (Russian Examples).
——
Watch First:
These videos speak to the disproportionate amplification of messages on social media.
Actors game platform algorithms so that their content gets pushed: hashtags are flooded with coordinated posts until the actors effectively control the conversation around that hashtag. When a very small group of people can achieve this kind of reach, the threat of a terrorist organization doing the same becomes very real. An example of this is ISIS's use of bots and amplification to own its narrative and push its brand of a digital Caliphate.
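The flooding dynamic described above can be sketched as a simple concentration check: if a handful of accounts produce most of a hashtag's posts, the amplification is more likely coordinated than organic. The data format, account names, and threshold below are invented for illustration, not any platform's API:

```python
from collections import Counter

def amplification_share(posts, hashtag, top_n=10):
    """Fraction of a hashtag's posts produced by its top_n most active accounts.

    `posts` is a list of (account, hashtags) tuples - a hypothetical,
    pre-collected dataset, not a live platform feed.
    """
    authors = [account for account, tags in posts if hashtag in tags]
    if not authors:
        return 0.0
    counts = Counter(authors)
    top = sum(n for _, n in counts.most_common(top_n))
    return top / len(authors)

# Toy data: three accounts flood #topic while ten organic users post once each.
posts = [("bot%d" % (i % 3), {"#topic"}) for i in range(90)]
posts += [("user%d" % i, {"#topic"}) for i in range(10)]
print(amplification_share(posts, "#topic", top_n=3))  # 0.9
```

A share this close to 1.0 for a tiny top-N is the "small group owns the hashtag" pattern the videos describe; organic trends spread authorship far more evenly.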
Actors and their use of memetic warfare:
MEAT & POTATOES:
Social Media IO: Actor(s), Examples, Techniques PART 1
Goal:
For this release we will cover the following:
What is an Actor?
What is a CVE and why is it important?
What is Phishing?
What is a Tribe (marketing to a people)?
What is a MEME (low cost with high return)?
What is Memetic warfare?
What is a Troll factory?
Influence operations (Expanded).
Examples of influence operations (Chinese, Russian, Iranian).
Terms and their function:
Definitions are necessary - yet we also need to understand how these things work in practice.
Actor: An entity that is partially or wholly responsible for an incident that impacts - or has the potential to impact - an organization's security. Also referred to as a malicious actor. Ways to identify an actor include tradecraft, specific tactics, techniques, behavioral characterization, tools, and the CVEs they exploit. [https://www.dhs.gov/sites/default/files/publications/ia/ia_geopolitical-impact-cyber-threats-nation-state-actors.pdf]
Common Vulnerabilities and Exposures (CVE): This is a list of publicly disclosed computer security flaws. When someone refers to a CVE, they mean a security flaw that's been assigned a CVE ID number. Security advisories issued by vendors and researchers almost always mention at least one CVE ID. CVEs help IT professionals coordinate their efforts to prioritize and address these vulnerabilities to make computer systems more secure. [https://www.redhat.com/en/topics/security/what-is-cve]
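As a small illustration of the identifier itself, a CVE ID always has the form CVE-<year>-<sequence number>, which makes it easy to validate and sort programmatically. A minimal sketch (CVE-2021-44228 is the well-known Log4Shell flaw):

```python
import re

# CVE IDs follow the pattern CVE-<year>-<sequence of 4 or more digits>
# (the sequence portion was widened beyond four digits in 2014).
CVE_RE = re.compile(r"^CVE-(\d{4})-(\d{4,})$")

def parse_cve(cve_id):
    """Return (year, sequence) for a well-formed CVE ID, else None."""
    m = CVE_RE.match(cve_id)
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))

print(parse_cve("CVE-2021-44228"))  # (2021, 44228) - Log4Shell
print(parse_cve("not-a-cve"))       # None
```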
Phishing: Phishing is a cyber attack that uses disguised email as a weapon. The goal is to trick the email recipient into believing that the message is something they want or need — a request from their bank, for instance, or a note from someone in their company — and to click a link or download an attachment. [https://www.csoonline.com/article/2117843/what-is-phishing-how-this-cyber-attack-works-and-how-to-prevent-it.html]
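One classic phishing tell - a link whose visible text names one domain while its actual target points somewhere else - can be checked mechanically. A heuristic sketch with made-up example domains (real phishing detection is far more involved):

```python
from urllib.parse import urlparse

def mismatched_link(display_text, href):
    """Flag links whose visible text names one domain but whose target
    points somewhere else - a common phishing tell. Heuristic only."""
    if "//" not in display_text:          # bare domain shown in the email body
        display_text = "https://" + display_text
    shown = urlparse(display_text).hostname
    actual = urlparse(href).hostname
    if not shown or not actual:
        return False
    # Allow exact matches and subdomains of the displayed domain.
    return not (actual == shown or actual.endswith("." + shown))

print(mismatched_link("www.example-bank.com", "https://evil.example.net/login"))      # True
print(mismatched_link("www.example-bank.com", "https://www.example-bank.com/login"))  # False
```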
Tribe: A social division in a traditional society consisting of families or communities linked by social, economic, religious, or blood ties, with a common culture and dialect, typically having a recognized leader. In the world of targeted IO a tribe is the demographic that is being targeted. [https://languages.oup.com/google-dictionary-en/]
Meme: An idea, behavior, style, or usage that spreads from person to person within a culture. [https://www.collinsdictionary.com/us/dictionary/english/meme]
Memetic warfare: Memetic warfare has been seriously studied as an important concept with respect to information warfare by NATO's Strategic Communications Centre of Excellence. Jeff Giesea, writing in NATO's Stratcom COE Defense Strategic Communications journal, defines memetic warfare as "competition over the narrative, ideas, and social control in a social-media battlefield." [https://en.wikipedia.org/wiki/Memetic_warfare]
——
Video reference of definitions:
This section is a series of supplemental video references.
What is a “Threat Actor” or “Actor”?
What are the ways Actors move, exploit, and communicate?
4:14: The Content Blitz: Quantifying the Islamic State’s Impact on the Saudi Twittersphere.
53:10: Deplatforming, Deconstructed. A typology for Technology platforms.
Where do actors work? How does a nation identify future threats?
What is a CVE?
How are CVEs used for access and placement?
An example of a phishing effort:
——
Memes (Memetic warfare):
“Memes (discrete units of knowledge, gossip, jokes and so on) are to culture what genes are to life. Just as biological evolution is driven by the survival of the fittest genes in the gene pool, cultural evolution may be driven by the most successful memes.”
- Richard Dawkins
Memes:
Memes are a “hidden in plain sight” method for spreading narratives through social media.
In order to spread or go viral, a message needs to resonate with tribes; when this occurs, members of a group are likely to share it across their social networks. Established outlets, such as media and government organizations, may not acknowledge a meme or may try to suppress it, but social media is the perfect volatile ecosystem for unverified messages to breed and morph. Posts on social media can go viral, spreading rapidly throughout the platform on which they originate and even “jumping” onto other platforms where they are widely discussed.
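The resonate-then-share dynamic behaves like a branching process: each post reaches some number of followers, each of whom reshares with some probability, and the spread either fizzles or takes off depending on whether the average number of reshares per post crosses one. A toy simulation, with all parameters invented for illustration:

```python
import random

def simulate_spread(p_share, followers, max_steps=10, rng=None):
    """Toy branching-process model of meme spread: each active post reaches
    `followers` people, each of whom reshares with probability `p_share`.
    Returns the total number of posts produced before the spread dies out
    or max_steps elapse. Not a model of any real platform."""
    rng = rng or random.Random(0)  # fixed seed for repeatability
    active, total = 1, 1
    for _ in range(max_steps):
        new = sum(1 for _ in range(active * followers) if rng.random() < p_share)
        total += new
        active = new
        if active == 0:  # the meme died out
            break
    return total

# Below the viral threshold (50 * 0.01 = 0.5 reshares per post) spread
# tends to fizzle; above it (50 * 0.05 = 2.5) it tends to explode.
print(simulate_spread(p_share=0.01, followers=50))
print(simulate_spread(p_share=0.05, followers=50))
```

The threshold behavior is the point: a message that resonates with a tribe even slightly more than average does not spread slightly further, it spreads orders of magnitude further.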
Without acknowledgment that memetic warfare is a threat, and without an appreciation for social media as a battlespace, this vector will remain problematic. Perhaps this is generational: as an outsider looking in, it appears that military and foreign policy decision-makers are behind the power curve in regard to social media, much less social media as a tool and weapon for the common defense. This is changing - within the United States, the Department of State and other organizations now run active anti-misinformation campaigns. Once you start viewing the Internet through meme-colored glasses, you see memetic warfare everywhere - in political campaigns, in contested narratives about news events, in the thoughtless memes shared by social media “friends”, and in videos on YouTube. It shows up in movements like #insertmovement, where there is an attempt to shape perceptions and strengthen public support.
Influence operations (definition and examples):
Influence operations: Are the coordinated, integrated, and synchronized application of national diplomatic, informational, military, economic, and other capabilities in peacetime, crisis, conflict, and post-conflict to foster attitudes, behaviors, or decisions by foreign target audiences that further U.S. interests and objectives. [https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG654.pdf]
Russian influence and disinformation campaigns:
Former IRA employee:
Chinese influence and disinformation campaigns
Countering CCP propaganda efforts:
Iranian influence and disinformation
Iran CNO capabilities:
——
Scheduled events:
14MAR: Social Media IO: Actor(s), Examples, Techniques Part 2
21MAR: Behavioral Characteristics analysis of potential Actor(s) Part 1
28MAR: Behavioral Characteristics analysis of potential Actor(s) Part 2
——
Read of the Week:
Military Narratives and Profiles in Russian Influence Operations on Twitter 2021
Abstract:
Since 2016, Russia has engaged in a dedicated influence operation against the United States. Although distinctly modern in its use of social media platforms, the current methods align with old Soviet doctrine using information warfare to gain a strategic edge over competitors. Using the WARP framework, a heuristic tool for understanding narrative weaponization for mobilization and radicalization, we focus on Russia’s use of narratives to exacerbate existing cleavages in American society and to undermine US national security. Using data from Twitter’s comprehensive data archive of state-backed information operations, we find that military and patriotic narratives constitute one of the most frequently deployed narrative sets (second only to Trump), comprising 12.9% of the 1,357 million English-language tweets. The Russians weaponized these narratives and profiles to support or smear a variety of political actors and to escalate the urgency of various social causes. Moreover, these narratives actively recruited audiences to embrace New War cultural identity, an anti-government ideology focused on defending their families from an increasingly hostile state and world. Using a variety of persuasive strategies, the Russians leveraged these narratives to deliver emotionally resonant signals about allies and perceived enemies, both foreign (e.g. Muslims and immigrants) and domestic (e.g. government officials and the media), and to set the groundwork for potentially violent mobilization. Impersonating military profiles increased message resonance, and posturing as credible sources may have aided normalization and legitimization of New War cultural identity and divisive messaging. We conclude the Russians used military and patriotic narratives and profiles to wrap anti-government sentiment in patriotic trappings and to set the stage for Americans to engage in armed domestic conflict. [https://osf.io/preprints/socarxiv/b9a2m/]
——
Feedback:
Social Media IO Roundup is an effort charged with educating and bringing attention to the murky world of cyber information operations: highlighting tradecraft, concerns, trends, and techniques, and raising questions about a sector many don’t see. I’m not all-knowing and want to improve the content, so I need you, the readers, to interact.
Drop a line:
Email: dominanceinformation@gmail.com Instagram: @informationdominance
——
Closing:
This is becoming an entertaining ride and I look forward to seeing where it goes. Stand by for more at a later date.
-Bob aka INFODOM