Before reading this, realize this is a hobby and my opinions are just that. Imagine we are at the bar having a drink: "So, what do you know about trolls?" Let's drink these dranks.
——
——
PART ONE
What terms do we need to know?
Before we start let’s establish some of the definitions being used.
Information Operations (IO): This is the integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own. [Ryan, Kristen, et al. Resistance, and the Cyber Domain. Fort Bragg: US Army Special Operations Command, 2019 ]
Influence operations: The coordinated, integrated, and synchronized application of national diplomatic, informational, military, economic, and other capabilities in peacetime, crisis, conflict, and post-conflict to foster attitudes, behaviors, or decisions by foreign target audiences that further U.S. interests and objectives. [https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG654.pdf]
Psychological Operations (PSYOP): Operations to convey selected information and indicators to audiences to influence their emotions, motives, and objective reasoning, and ultimately the behavior of governments, organizations, groups, and individuals. [https://www.soc.mil/SWCS/IMSO/aboutPSYOP.htm]
Military Information Support Operations (MISO): Convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals. The purpose of MISO is to induce or reinforce foreign attitudes and behaviors favorable to the joint force commander’s objectives. Dramatic changes in information technology and social networking have added a new, rapidly evolving dimension to operations, and the ability to influence relevant audiences is integral to how asymmetric forces address local, regional, and transnational challenges. [Ryan, Kristen, et al. Resistance, and the Cyber Domain. Fort Bragg: US Army Special Operations Command, 2019 ]
Sock Puppet: An online identity used for purposes of deception. Common uses are to praise, defend, or support a person or organization; to manipulate public opinion; or to circumvent restrictions, such as viewing a social media account the user is blocked from, or evading a suspension or an outright ban from a website. [https://en.wikipedia.org/wiki/Sock_puppet_account]
Actor: An entity that is partially or wholly responsible for an incident that impacts, or has the potential to impact, an organization's security. Also referred to as a malicious actor. Actors can be identified by their tradecraft: specific tactics, techniques, behavioral characteristics, tools, and the CVEs they exploit. [https://www.dhs.gov/sites/default/files/publications/ia/ia_geopolitical-impact-cyber-threats-nation-state-actors.pdf]
Persona: An electronic identity that can be unambiguously associated with a single person or non-person entity (NPE). A single person or NPE may have multiple personas, with each persona being managed by the same or different organizations. [https://csrc.nist.gov/glossary/term/persona]
Nation-states: Well-funded, state-sponsored actors working on behalf of a country. [https://www.dhs.gov/sites/default/files/publications/ia/ia_geopolitical-impact-cyber-threats-nation-state-actors.pdf]
——
What and/or who are trolls?
In our last release, we introduced some conceptual building blocks of social media disinformation; one of the need-to-know concepts is an understanding of trolls. Trolls purposefully post offensive or provocative content, deliberately baiting people to elicit an emotional response. Early trolls posted inflammatory messages on Usenet groups and Bulletin Board Systems (BBS) in an attempt to catch newbies in well-worn arguments. During the '00s, this motivation became known as the "lulz": finding humor (or LOLs) in sowing discord and causing reactions. The act of trolling is another way that misinformation spreads or causes disruption.
Trolling is typically done from accounts/personas/sock puppets/bots created for this purpose (tailored to the mission). Profiles will likely have a false gender, race, ethnicity, or nationality. "Seasoning" an account, building up an ordinary-looking post history over time, makes it seem authentic.
As the effectiveness of trolling became apparent, it was automated: bots create fake accounts and spread fake text, images, and video content through posts, pages, and paid advertising. At the same time, state-backed entities often spread overt influence through legitimate accounts and pages that use paid advertising to disseminate their messages.
Who: Troll(s) - This role can range from individuals to transnational organized crime to nation-states.
Nation-State Sponsored efforts: *This is not an all-encompassing list*
Russia:
Russian trolls ("Армия троллей России", "Russia's troll army"), also known as web brigades ("Веб-бригады"), are state-sponsored anonymous Internet political commentators linked to the Russian government. Participants report that they are organized into teams and groups of commentators who participate in Russian and international political blogs and Internet forums, using sock puppets, social bots, and large-scale orchestrated trolling and disinformation campaigns to promote pro-Putin and pro-Russian propaganda.
China:
The Chinese government has long been suspected of hiring as many as 2,000,000 people to surreptitiously insert huge numbers of pseudonymous and otherwise deceptive writings into the stream of real social media posts, as if they were the genuine opinions of ordinary people. Many academics, journalists, and activists claim that these so-called "五毛党" ("50c party") posts vociferously argue the government's side in political and policy debates.
Individuals:
Different aspects of the trolling phenomenon go by several names: trolling, harassment, cyberbullying, shit-posting, etc. While not the same, they share a theme: they all describe unwelcome online behavior meant to disrupt conversations, often by entering discussions uninvited.
What: The manipulation of public opinion over social media platforms has emerged as a critical threat to public life. Around the world, a range of government agencies and political parties are exploiting social media platforms to spread disinformation (Dezinformatsiya), exercise censorship and control, and undermine trust in the media, public institutions, and science. At a time when news consumption is increasingly digital, artificial intelligence, big data analytics, and "black box" algorithms are being leveraged to challenge truth and trust: the cornerstones of our democratic society. Disinformation is as old as human conflict; Sun Tzu wrote that all warfare is based on deception.
“The whole secret lies in confusing the enemy, so that he cannot fathom our real intent.” ― Sun Tzu, The Art of War
The Russians aren’t the only practitioners of “Dezinformatsiya,” which was anglicized as “disinformation” during the Cold War era, but they are said to have coined the term. So, for a better understanding of media manipulation and disinformation, let’s look at several recent cases:
Russian interference in the lead-up to the 2016 US presidential election heightened the public’s awareness of disinformation attacks against the United States. A 2017 report by the US Director of National Intelligence concluded that Russian President Vladimir Putin ordered an influence campaign that combined covert cyber operations (hacking, troll farms, and bots) with overt actions (dissemination of disinformation by Russian-backed media) in an effort to undermine public trust in the electoral process and influence perceptions of the candidates.
Russia honed its online disinformation efforts in its immediate sphere of influence. For example, Russia deployed a coordinated online influence campaign during its annexation of Crimea in 2014. Russian state-controlled media outlets painted a uniquely anti-Ukrainian, pro-Russian narrative surrounding then-President Viktor Yanukovych’s flight from Ukraine and the subsequent Russian invasion of Crimea. To help shore up domestic support for Russia’s actions, Russian government bots dominated the domestic political conversation in Russia during this period. Between 2014 and 2015, as much as 85 percent of the active Twitter accounts in Russia tweeting about politics were, in fact, government bots.
Okay, there is a lot to cover here, so let’s break it up. Enough for now; stand by for more at a later date.
-Bob aka INFODOM