A Primer on Troll Farms for Cyber-Elves

Rome Viharo
Dec 26, 2021


Cyber-elves are groups of individuals who engage in theatrical online tactics solely to frustrate troll farms running disinformation campaigns on the internet.

Cyber-elves are more active now than ever before, especially since the 2015 uptick in online disinformation coming from Russian troll farms.

The tactic is a type of “signal jamming”: humor mixed with Socratic questioning that exposes the inherent contradictions in online disinformation messaging, thus neutralizing it. I practiced it for a long time before retiring; the work is rewarding yet exhausting.

Beginning in 2003, when I found myself a part of the very first “online protest” against the invasion of Iraq, I, along with a few dozen others, began to confront what is now known to be disinformation around the existence of Weapons of Mass Destruction in Iraq. Get this: there were troll farms operating on AOL message boards as far back as 2002–2003.

During my years-long case study into Wikipedia editing wars, from 2013 to 2019, I uncovered a number of “troll farms” made up of Wikipedia editors and online influencers, and I never met one troll farm that did not justify its broadcast of disinformation against its ideological opponents.

We cyber-elves think we are the good guys, but every troll farm believes they are the good guys too, and therein lies the kicker. If something exists, there is a troll farm operating around it, distributing disinformation and misinformation for any number of ideological reasons.

Indeed, one major troll farm I discovered is led by a prominent influencer in the community who actually identifies himself as someone who “battles misinformation” on the internet as his professional career.

While his public-facing image is noble, his private disinformation campaigns include publishing faked data sets to generate contexts that do not exist, blogging misleading information about the subjects he targets, and staging online theatrical events to generate false impressions.

He justifies this: after all, he truly believes his opponents are the ones spreading misinformation about science on the internet, and that they therefore deserve a disinformation and harassment campaign in return.

Is it possible that highly educated individuals who are pro-science could operate troll farms, engage in toxic behavior, and spread literal misinformation about their targets just as much as anti-vax or Russian troll farms do?

Yes, of course it is.

Disinformation and “troll farms” are phenomena that emerge from political behavior, i.e. human nature, and not from the ideologies themselves. It just looks as if the only group operating disinformation campaigns is exactly the group you don’t like.

This is not something that is easy to self-reflect on.

All of us cyber-elves should look at how easily we could justify disinformation, trolling, and harassment under certain conditions.

Let’s do a thought-experiment as proof.

Imagine if I told you that there was a group of individuals on the internet who had been exposed for extreme racism and neo-Nazi propaganda, and, on top of it, it turns out they are also pedophiles.

Specifically, in our thought-experiment, let’s say there are 10,000 neo-Nazi pedophiles in the world, and that while we have no direct proof to get them arrested for their horrendous activities, we could get the authorities to look into them by staging a disinformation campaign claiming they are involved in credit card fraud.

Mind you, as deviant and evil as you could imagine a group of neo-Nazi pedophiles to be, they never actually engaged in any credit card fraud.

Credit card fraud was simply the disinformation story about this group most likely to go viral, and thus expose them. Should we do it? Would you mind if that action against pedophile neo-Nazis were to occur?

Who cares if they deserve it, right?

That’s the kicker: we instinctively believe that some people deserve it, and this is what enables toxic behaviors to spread online like wildfire, with disinformation feeding misinformation as a tactic that works (but only in the short term).

This underlying psychological vulnerability creates a global distribution network for digital wildfires and, as the data from Facebook shows, can eventually lead to genocide if not kept in check.

And this is what makes the difference between cyber-elfing and disinformation trolling. The best way to neutralize a disinformation campaign is with humor and a sincere appeal to common truth without contradiction. A cyber-elf practices, as a virtue, honesty in the face of deception.

We cannot use disinformation to fight misinformation.

We cannot use “troll farms” to fight “troll farms”.

While I am retired from the craft, the “cyber-elves” of Lithuania are one online group that gets this right.

They call themselves “cyber-elves” to distinguish their online behavior from that of Russian “troll farms”, which they battle on the internet in the fight against misinformation.

This distinction between “elves and trolls” online derives from Tolkien’s mythology, yet it makes an easy narrative overlay for two distinct strategies on the internet.

When battling Russian troll farms, the cyber-elves have learned that transparency, clear communication, rational broadcast, and, most importantly, humor and theater become the weapons of choice.

This choice is not taken from any sort of moral or ethical position either; it is simply the only methodology that has been discovered to work. It is the only methodology that neutralizes the problem by identifying its cause, without itself creating more misinformation in return.

In Lithuania, volunteers can now be trained in these tactics, as this is the only thing that works to bust up troll farms and stop the spread of misinformation around a topic.

So what specifically is a troll farm?

While some troll farms are funded and operated by more experienced digital magicians working for corporations or government agencies, troll farms overall have specific, identifiable qualities across millions of online social groups, and they emerge easily without a paycheck.

A “farm” is a place that nurtures and recycles life. A troll farm nurtures “trolling” and the spread of misinformation. A farm cultivates the behaviors and distributes the content that emerges from those behaviors.

Trolls and misinformation

The word “troll” has such historical relevancy on the internet that this single word, and the history of its adoption, is itself a great microcosm for understanding how subtly and how easily misinformation can spread on the internet.

In the early days of the internet, if someone referred to someone else as a “troll”, it could mean one of two different things.

Large internet groups used the word “troll” to refer to a toxic individual in an online community: someone “bombing” a forum with negativity, personal attacks, and toxicity.

Yet other large internet groups used the word “troll” in an almost heroic way, a reference to the Sacha Baron Cohen, Ali G type of online behavior. Someone “trolling” was playing an often “well deserved” practical joke on an unsuspecting individual or community, who took the “troll” performance as real.

So, in 2008, if someone called someone else a “troll” in a community mixed between these two usages of the word, two entirely different stories would be communicated, without either group knowing it.

And without realizing it, one of those groups would be spreading misinformation.

Consider how just one simple word, “troll”, used in good faith to relay information from one community to another community with a different usage of that word, can allow misinformation to spread without intention. Now multiply that as much as you can, adding both intentional and unintentional reactions to information online, and you get a bigger glimpse of the problem.

Since the global adoption of misinformation and troll farms as a sort of everyday political internet reality, the word “troll” meaning a toxic individual on the internet has “won” adoption over the usage of “troll” as a sort of comic hero, with John Oliver, Sacha Baron Cohen, and The Yes Men embodying that archetype.

Thus, the Lithuanian cyber-elves needed a word to distinguish their reality from the reality of “troll farms”, which can emerge in any social community online, anywhere, for any reason.

So how do troll farms cultivate misinformation and trolling?

Cultivating a troll farm is an entirely organic process that requires no paycheck. A paycheck just offers a troll farm more time to spread misinformation and troll.

Cultivating a troll farm that spreads misinformation requires no more than “groupthink” to form. This is the first dynamic aspect of a forming troll farm: a way to distinguish the “group” from the “other”, easily understood as the classic “us versus them” mentality.

The social group depends upon building consensus around what are, for all intents and purposes, just keywords that identify both the group and the “other” it is targeting.

Once a social group has formed somewhere on the internet (anywhere from Facebook to Twitter, from Reddit to Twitch to Wikipedia is possible), the social dynamics of inner-group competition will begin to form.

The Groupthink dynamic

These inner group dynamics rely on avoiding contradictions within the group while excusing the group’s worst behaviors and choices as expressions of opposition to the other.

This creates a competition for attention and status within the group.

Who is the biggest “defender” of the group? Who is the angriest? Who vocalizes the loudest message within the group? Who is the strongest voice? Who has the authority? Who controls the group’s message? Who decides what the group consensus is, and what that consensus rejects?

This competition within the social group for the power of broadcast creates a psychological irony: each troll farm actually requires the troll farm of the “other” to justify its own quest for dominance or influence within its own group.

While warring ideological sides appear in the media to be in competition with their counterpart influencers and bloggers, they actually wind up supporting and building each other’s audiences, a type of codependence. The enemies of one group empower both sides’ responses, which can then be amplified by naturally occurring troll farms.

The Dark Collaboration

To give an example of this tragic irony playing out, I will use one well-known case: the “social justice warriors” of left-wing to center politics versus their “alt-right”, “dark enlightenment”, and “MAGA” counterparts online.

A known center-to-left influencer with tens to hundreds of thousands of followers requires something to complain about in order to continue nurturing their own followers and audience.

Who has the biggest “left-wing” voice? What other left-to-center influencers are out there, competing for audience size and credibility?

After all, YouTube’s and Facebook’s algorithms are going to recommend the content of the left-to-center influencer, and only one influencer at a time can sit at the top of each algorithmic feed served to us.

While such an influencer may continue to criticize counterparts such as Joe Rogan or Jordan Peterson, it is neither Dr. Peterson nor Joe Rogan who is their real “enemy” in their struggle to build more followers; it is other left-to-center influencers, competing for the same audience size and the same set of followers and likes, as optimized by social media platforms.

And the opposite is also true.

Dr. Peterson and Joe Rogan have embedded themselves in their audiences by also participating in having, and requiring, their ideological opponents from the center-to-left criticize and troll them, using the “trolling and misinformation” tactics of the other side’s “troll farm” to amplify their own message to their audience, thereby building it. “We’re under attack from the left” becomes a viral message that builds more community for them, even as they compete with other center-to-right voices along the same algorithms, which social media optimizes to place the most competitively minded influencers at the top of discovery and recommendation.

Each side, from left-to-center to center-to-right, is literally creating the circumstances for the exact issue it voices concern about, unaware that the entire dynamic is co-created in an ecosystem of justified online harassment and trolling, optimized for these behaviors by social media algorithms seeking to boost their monthly active users (MAUs) to justify their CPMs (cost per thousand impressions charged to advertisers).

So while influencers, troll farms, and misinformation sink into groupthink competition, what is actually transpiring across a much larger view of the data is that troll farms on opposite ideological sides help keep each other in “power” within their own communities, a dynamic further amplified by social media algorithms that reward competition within peer groups by pitting their top influencers against each other.
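To make that feedback loop concrete, here is a minimal toy sketch in Python. Everything in it is an illustrative assumption (the growth rates, the rule that attacks convert linearly into new followers); it is not data from any platform, only a way to see how mutual attack can grow both audiences faster than organic growth alone.

```python
# Toy model of the codependent "troll farm" feedback loop described above.
# All parameters are invented for illustration, not measured values.

def simulate(steps=10, audience_a=1000.0, audience_b=1000.0,
             organic_growth=0.01, outrage_boost=0.05):
    """Each side grows a little organically, but grows much faster in
    proportion to attacks from the other side, because the recommendation
    algorithm (in this toy model) amplifies conflict."""
    for step in range(steps):
        # "Attacks" scale with the attacker's audience size.
        attacks_on_a = audience_b
        attacks_on_b = audience_a
        # Outrage at the other side converts into new followers
        # faster than organic growth does.
        audience_a += audience_a * organic_growth + attacks_on_a * outrage_boost
        audience_b += audience_b * organic_growth + attacks_on_b * outrage_boost
        print(f"step {step + 1}: side A = {audience_a:,.0f}, side B = {audience_b:,.0f}")

simulate()
```

Run it and both audiences grow together; remove the outrage term and each side is left with only its slow organic growth. That is the codependence in miniature.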

This has created quite a nasty feedback loop, making much of the problem intractable within current media and tech architecture, as the dynamic fuels a $150B-a-year industry in the US alone.

So while SJWs and MAGA appear to “battle” online, an unconscious collaboration is happening instead. They need each other, and both sides continually use each other to gain status within their own social groups.

The loudest voices of each social group have a payoff in the form of advertising dollars paid to them by the platforms that host them.

Because this inner-group competition emerges from individuals who wish to serve their social groups in such a confrontational capacity, the “other” becomes more and more dehumanized, and those offering “tough” solutions, channeling the “frustration” and “anger” they distribute within their group to win support, win the group consensus. That consensus, formed through suppressive, toxic, or abusive methodologies, then informs what is broadcast to peak discovery on the internet through search and social media algorithms optimizing on likes and votes along specific keywords.

All ideological groups will have members who view their ideological struggle as the imperative. Some members of the group will seek their purpose in it by engaging in toxic confrontation with their opponents.

The internet has given this particular psychological type a set of broadcast tools, and they have gamed those tools perfectly.

If the computer interface of social media currently empowers groupthink, troll farms, and misinformation, then another type of digital environment can exist: an environment where only mutually resolving behaviors are granted permission to publish to the internet itself.

In the non-dual consensus methodology of Aiki Wiki, a digital interface for large-scale online consensus building, the psychological dynamics of groupthink can be predicted to collapse online through a unique type of computer interface design: a few simple new online tools.

While I am not so naive as to assume the social media giants would adopt something like it, I do show how it is possible: a pathway could emerge through a redesign of basic, standard online tools that could, in theory, eradicate misinformation and trolling from internet publishing, so that information formed collaboratively across ideological differences becomes easier to discover than the output of troll farms on all sides seeking influence through dark collaborations, which solve only the problem of audience development, not resolution.
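As a thought sketch of what “permission to publish gated on mutual consensus” could look like in code, here is a hypothetical example in Python. This is not Aiki Wiki’s actual protocol; every name and rule below is an invented illustration of the general idea that content publishes only once the opposing sides have each endorsed the same wording.

```python
# Hypothetical sketch of a consensus-gated publishing rule.
# NOT Aiki Wiki's real design: Proposal, endorsements, and
# publish_if_consensus are all invented names for illustration.

from dataclasses import dataclass, field

@dataclass
class Proposal:
    text: str
    # Each ideological side must explicitly endorse this exact wording.
    endorsements: set = field(default_factory=set)

def publish_if_consensus(proposal: Proposal, required_sides: set) -> bool:
    """Grant permission to publish only when every required side has
    endorsed the proposal, i.e. a 'win win' outcome rather than a
    victory of one side's broadcast over the other's."""
    if required_sides <= proposal.endorsements:
        print(f"PUBLISHED: {proposal.text}")
        return True
    missing = required_sides - proposal.endorsements
    print(f"HELD: awaiting endorsement from {sorted(missing)}")
    return False

p = Proposal("Jointly worded summary of the disputed topic")
p.endorsements.add("side_a")
publish_if_consensus(p, {"side_a", "side_b"})  # HELD
p.endorsements.add("side_b")
publish_if_consensus(p, {"side_a", "side_b"})  # PUBLISHED
```

The design point of the sketch is simply that the gate rewards mutually resolving behavior: nothing reaches broadcast until the “us versus them” boundary has been crossed at least once.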

About Rome Viharo


https://bit.ly/RomeViharo Founder, Aiki Wiki. Co-Founder, Big Mother. Designer of the "win win" protocol for the world wide web.