Aiki Wiki, Wikipedia We Have a Problem update, pt 3
The Wikipedia case study.
The case study began with an inquiry into how majority and minority views build a consensus on a Wikipedia article, specifically when the phenomenon known as a "wiki war" erupts on the article.
For years I was intimidated away from ever participating in one to view the dynamics for myself, primarily because of my own lack of expertise in the topics where wiki wars tend to emerge, such as politics, Israel and Palestine, oil and gas, or the pharmaceutical industry.
Finding the right entry point into the case study was always challenging.
In 2013, I decided to initiate the case study with what I thought would be a small and benign wiki war. On Wikipedia, there were two groups of editors who would constantly "battle" across various wiki pages.
On one side of this editing community were the "skeptic" editors, who, in their minds, defend Wikipedia from pseudoscience and bizarre new age claims entering the encyclopedia. On the "other" side is an incredibly diverse community of viewpoints around things like alternative and integrative medicine, or subjects considered "fringe" research: parapsychology, cold fusion, even cryptocurrency.
At the time, I was a bit naive to how Wikipedia worked, and I assumed that “skeptic vs woo” editor culture would be a safe environment for me.
Additionally, these wiki wars were on pages that were biographies of notable individuals, not complex theories or political movements, so it was easy to distinguish factually correct from incorrect biographical information.
I assumed this would be a harmless community contrasted against the broader, more heated political or religious pages and editors.
Boy was I mistaken.
When I began my case study on Wikipedia, the "skeptic community" was one of the larger social networks on the internet, and an early adopter of web tools to bolster broadcast and consensus.
Around this time, "skeptics", as a collection of social networks and groups identifying by that label, were primarily white males, and the community divided into two groups at the time of GamerGate.
One cultural side of skeptics became more focused on progressive or left-wing political influences (the "SJWs"), while the other became what is now known as the "alt-right": suspicious of "mainstream media sources", applying their "skeptic prowess" to online research and YouTube videos. This has become much of the culture of the "dark enlightenment".
Both groups are highly toxic online, independent of their ideological swing as skeptics.
With GamerGate lighting the match, what was once a semi-unified community across social media became, within a short amount of time, as politically divided as the rest of the country.
Since 2014, the internet has become increasingly toxic, and there is no question that GamerGate was a significant cultural milestone of online division, spawning "troll farms" across the web that engage in misinformation wars about each other's motivations.
I wrote an executive summary of the MediaWiki problem here.
Below is the eight year case study that would inform it.
The MediaWiki Case studies
This overall case study focuses on contentious consensus building on social media platforms, specifically wikis, and primarily on Wikipedia, with a few exceptions.
Originally, the study focused on how majority and minority viewpoints build a consensus on a contentious Wikipedia article during a "wiki war".
I thought it would consume about three months of my life. It would become a case study which went on for over eight years.
The study now broadly focuses on consensus building with individuals, or groups of individuals, who make three distinguishable choices or assertive behaviors in a group consensus process:
- individuals who practice deception in an online consensus.
- individuals who attempt to influence an online consensus through intimidation, non-resolving communication, or bullying.
- individuals who attempt to control all the permissions (access to participation) in a consensus building process.
It was to address these specific behaviors, which can emerge within consensus building, that I developed the Aiki Wiki algorithm, completed in 2020.
As a case study on majority- and minority-view editing on Wikipedia, the study travels across three of the most controversial issues and topics on Wikipedia: Biographies of Living Persons (BLP), fringe subjects (WP:FRINGE), and paid/agenda editing.
This case study presents both first-person narratives along with third-party evidence detailing “harassment”, as defined by the Wikipedia community editing guidelines, which is referred to in this study as editor suppression.
Editor suppression is a community strategy to control editing permissions on Wikipedia.
This study details how an agenda group (troll farm) can control and influence an article on Wikipedia, and influence perception of a topic via various MediaWiki platforms.
This study focused on two well-publicized and well-known wiki wars on Wikipedia, over the biographical articles about Rupert Sheldrake and Deepak Chopra, and I worked directly with both subjects throughout the process.
I offer complete transparency within the case study about my relationships with these individuals.
I had personally met Rupert Sheldrake only once before and had no relationship with him prior to the study.
While Rupert initially asked for my help as a professional, I made him a counter-proposal: I wanted to turn his BLP problem on Wikipedia into a case study that I would fund myself.
Since the case study, Rupert and I have become friends. I don't mind defending Rupert Sheldrake as an individual. I find him to be a lovely person, his wife and kids as well, lots of fun to hang out with: a type of "bohemian intellectual" family.
With Deepak Chopra, he contacted me after he learned about the case study on Rupert's biography, as the two know each other. My brief relationship with Deepak was professional, serving for a few months as his media representative on Wikipedia. Since the case study, we have had no further contact or relationship.
When we first met, Deepak was simply interested and fascinated by both Aiki Wiki and the research, and offered a donation of $5,000.00 to the project so I could keep working on it. I accepted, and it was put to good use.
He then asked whether my professional services could be acquired, as he needed help. And he did need help: no matter anyone's personal view of Chopra, his BLP on Wikipedia was truly a mess factually.
I proposed he could pay me to be his direct media representative on his article, as this would be another dynamic to research, “paid editing”.
What happens when a famous person has a legitimate problem with their BLP?
They can hire a representative, Wikipedia allows this.
The representative cannot perform any edits, only negotiate with the community.
Does this process work? Can it produce a reasonable outcome between subject and community? (Short answer: no, it is deeply flawed.)
It was, again, perfect for this case study.
So despite what my many detractors claim, this was not and is not an “ideological” report, defending either Rupert Sheldrake or Deepak Chopra or their ideas.
The “content” of the majority view and the minority view is irrelevant from the viewpoint of this study.
For many years, and even to this day, the troll farm that I exposed on Wikipedia has broadcast all sorts of claims about this case study: that it is a "promotion of pseudoscience" or even a type of fraud itself; that throughout the process I harassed and doxxed Wikipedia editors while defending "charlatans"; even that I was in Deepak Chopra's "cult", in bed with PR companies, "illegally" editing Wikipedia.
Nothing could be further from the truth. And nothing could be further from having any evidence to back up those claims.
As my entire study transpired over MediaWikis, all of my activity is published, and no claim of doxxing or harassment on my part has ever been supported by any evidence, while the evidence I have collected is verifiable by a third party.
This troll farm of experienced Wikipedia/MediaWiki editors usually finds a home on RationalWiki, which has written itself into the MediaWiki case study by broadcasting misinformation about the study as well as targeting the author, moi.
RationalWiki formed in response to "Conservapedia", a MediaWiki alternative to Wikipedia created by conservative Wikipedia editors; RationalWiki began literally as a way to troll its conservative counterparts.
RationalWiki, looking like it could be Wikipedia and branding itself with a title that should speak to its own credibility, is a common example of how new MediaWikis are created by wiki wars that begin on Wikipedia: the ideological wings of a wiki war find another platform on which to broadcast the narrative they lost on Wikipedia, or, when necessary, to broadcast misinformation about their ideological opponents.
Because I became a target within my own study, an opportunity opened where I had a perfect view of how information could be twisted into misinformation; as I happen to be a subject matter expert on myself, it was easy to spot misinformation and misunderstanding about my identity and motivations.
Because of this targeting and misinformation about both myself and the case study, the work now had a dual purpose. Wikipedia, We Have a Problem had to serve as an account of the events that happened within my research, but also as a way to negotiate and build consensus with the many troll farms now spreading misinformation about me, quite a challenge to overcome, since it was having a direct impact on my professional life.
The study did expose the activities of publicly known Wikipedia editors, but only accounts that had already revealed their identities, primarily Tim Farley, an influencer in the skeptic "movement", specifically a thought leader on Wikipedia editing as a form of skeptic activism, as well as a software developer who creates digital tools for skeptics to use across the web. As an influencer, Tim Farley is an activist.
Notably mentioned in the case study as well is senior Wikipedia editor David Gerard, a UK citizen, who oversees the publication of RationalWiki along with Trent Toulouse, a professor of psychology and political activist from New Mexico.
Having to focus on communities whose overall content I had mixed sympathy with was challenging from the case study's viewpoint. I don't have many ideological differences with RationalWiki, especially when it comes to a pro-science position or the publication of progressive policies. What I deeply object to is the dualistic (competitive, divisive) style of promoting science or progressive politics, a specifically acerbic style of confrontation, applying GOAT to what is supposed to be "rational" commentary while attacking their perceived ideological foes on the far right.
I do have problems with how these subjects are treated by communities who are responsible for publishing content.
A troll farm, at the end of the day, is still a troll farm.
The case study (with help from others on the internet) did expose a well-known internet troll operation, twin brothers who were operating a list of over five hundred Wikipedia editing accounts used in influence and harassment campaigns, even influencing the press.
While the majority-view community the case study focused on identifies itself as "skeptics" as a way to distinguish its form of online activism, nothing in this case study seeks to criticize skepticism as either a philosophy or a form of activism.
The focus was specifically certain types of online behaviors that can emerge within a group consensus process between majority and minority viewpoints.
This meant that if I participated in the process, I would have to walk a very fine and delicate line.
I had to show "real" intent behind editing behaviors and choices.
To show the distinction between genuine editing and genuine editor suppression, this case study focused solely on editing biographical and non-controversial information about controversial subjects.
These subjects were in the "minority view" relative to the majority view of the community and Wikipedia, which leans very strongly pro-science and towards mainstream views on many subjects, and seeks to suppress any publication on the platform of what the community considers "pseudoscience", to put it politely, or "woo", as the community calls it directly.
Much of this communal tit-for-tat between two communities editing on Wikipedia, the "woo" on one side and the "skeptics" on the other, does reflect a genuine concern for Wikipedia, which has to deal with many problematic situations, especially around alternative health claims. Wikipedia is correct to police that.
Yet at the same time, Wikipedia still has a responsibility to treat controversial subject matter on the same terms as any other subject. And even a controversial subject can have valid editorial concerns or reasonable boundaries.
So how do the two views co-operate on Wikipedia? That is what I wanted to find out.
So in my case study, I focused solely on bland information about the subjects. On both articles, the problem was identical: the lede section.
Deepak Chopra is a medical doctor; this is factually correct and significant to his biographical narrative. Rupert Sheldrake is a Cambridge biologist; this is a factually correct detail of his biography, and central to everything in his personal life.
This information was being suppressed in these two articles by the majority view on Wikipedia. They did not want to list Deepak Chopra as a medical doctor, or Rupert Sheldrake as a biologist, but rather Deepak as a “New Age Guru” and Rupert as a “parapsychologist”.
The community on Wikipedia made it clear that Wikipedia should not give them any credibility, and on that basis justified both the suppression of factually correct information and the treatment of minority-view editors who, rightfully, wanted it included.
Merely supporting the inclusion of this non-controversial information made the minority view, in the eyes of the majority, "promoters of pseudo-science."
So I wanted to apply the methodology of non-dual consensus building specifically in this environment.
What was employed, tested, and piloted was a methodology of collaborative consensus building designed for contentious environments, with the goal of achieving a resolution expressed as a completed article.
I informed the community harassing me numerous times that I was authoring a case study into Wikipedia consensus building for Aiki Wiki, and that by engaging with me and targeting me, they were willingly participating in this case study.
Additionally, I offered many times opportunities for resolution.
This study continually grew in scope primarily because of one component of my methodology: I documented and responded directly to each instance of targeting or harassment I received over four years while attempting to find recourse toward a resolution.
I did not predict that I would become a target of harassment when I began this study. Nor did I realize it would extend into the darker depths of "wiki" communities the way it did, and never in my wildest dreams did I believe it would continue, over the four years since it began, to give me a unique insight into and deeper understanding of the psychologies that lurk under the surface of the internet.
So below is the relevant list of published articles that were on Wikipedia, We Have a Problem from 2013–2019.
Part One: Collaborative consensus building in a contentious editing environment.
Biography of a Living Person; Rupert Sheldrake
- A worrisome welcome to a wiki war.
- The Battle Begins.
- Enter “The Tumbleman”
- Request for a new consensus, denied
- The First Trial of the Tumbleman: Sock Puppet
- The Second Trial of the Tumbleman: Troll
Biography of a Living Person; Deepak Chopra
- Wikipedia, please delete my article. (pt 1)
- How to catch a skeptic. (pt 2)
- A gathering consensus. (pt 3, not yet published)
- How to win a wiki war, ISHAR (pt 4, not yet published)
- Deepak Chopra’s article reflects a wider problem.
Case study commentary
- Will a definition of online harassment please stand up?
- How to Ban a POV You Don’t Like, in 9 Easy Steps. (with video!)
- Come play the Wikipedia BLP-Palooza!
- Am I a Sock Puppet?
- Investigating Dicks, the Wikipedia Noir chronicles. (Fun Stuff)
Part Two: “Wiki war craft”: Editor suppression, stalking, targeting, character assassination, impersonations, email threats, and blackmail.
Meet the Wikipedia sockpuppet army. (Manul/Vzaak, Atlantid, Goblin Face, Dan Skeptic, etc)
- Oliver D. Smith, MediaWiki poster boy. (Sept 2018)
- True Tales from Weird Wikipedia: Incident at the Fringe Theory Noticeboard (April 2018)
- The Smith’s Dark Entanglement: Full Three year report (Oct 2013 — April 2017)
- Impersonations on Wikipedia, Tim Farley, and flag waving on Reddit, again.
- Flag Waving, watch out for it.
- Oh Reddit and Twitter, am disappoint.
- Wikipedia editor shenanigans (May 4th 2015)
- How Manul/Vzaak greeted me on Wikipedia (2015)
- Vzaak changes their Wikipedia account to Manul (April 2015)
- Manul’s troll pals stalk me on Reddit, defending Manul (May 2015)
- Manul tries to bully Wikipedia admin Liz (June 2015)
- Vzaak/Manul’s misinformation, debunked (2013)
- What will Goblin Face do next?
- Email threatens to increase harassment if I don’t stop publishing Wikipedia, We Have a Problem
- Skeptic sock puppet army gets busted on Wikipedia, finally.
- “We’ll never stop!” claims Wikipedia/RationalWiki editors, claim they are paid by “prominent skeptic”
Meet a MediaWiki troll farm; RationalWiki
- RationalWiki trolls have stolen my narrative, and I want it back (updated: 2017)
- The Attack of David Gerard’s 50ft troll farm, and the RationalWiki hustle. (2018)
- RationalWiki is gaslighting, lying, covering up cross platform harassment. (2018)
- Targeted on RationalWiki, can we talk about this? (December 2013)
- Sh*t RationalWiki says (May 2015)
- RationalWiki editor FuzzyCat Potato initiates “rational discussion”(April 2015)
- RationalWiki censors essay criticizing RationalWiki (April 2015)
- Revisiting RationalWiki (April 2015) and RationalWiki revisited
- RationalWiki’s tangled rationalizations for harassment (May 2015)
- Magical thinking on RationalWiki (November 2015)
- RationalWiki’s Drinking Games (June 2017)
- “This whole mess needs to be cleaned up!” (June 2017)
- So an atheist, a Muslim, a fruitarian and a futurist walk into a saloon… (Nov 2017)
Meet the leaders of the troll farms
- The Attack of David Gerard’s 50ft troll farm, and the RationalWiki hustle (2018)
- Come play the Wikipedia BLP-Palooza! (2018)
- True Tales from Weird Wikipedia: The incident at the FTN
- Tim Farley does not want skeptics linking to Wikipedia We Have a Problem (Jan 2014)
- An open letter to Tim Farley regarding him faking his data sets on Wikipedia (May 2014)
- Tim Farley’s big black hat on Wikipedia (April 2014)
- Tim Farley asks “Why do people edit Wikipedia?” does not mention activism. (June 2014)
- Tim Farley, be honest about activism on Wikipedia (June 2014)
- Tim Farley, if this dirty trick is coming next, don’t (updated 2016, published 2015)
- Skeptic activists now using Google AdWords to promote Wikipedia editing, Susan Gerbic admits to socking on Facebook (May 2016)
- Impersonations on Wikipedia, Tim Farley claims WWHP is “harassment” (2017)
- David Gerard versus the blockchain, a peek into an emerging wiki war (2017)