Newsletter February 2021 — The Impending Failure of Wikipedia and other news

In this issue:

  • “Artificial Intelligence,” Bots, and Censorship: Why Wikipedia can no longer be trusted
  • Censored on Facebook
  • Consolidating and Preserving my Legacy
  • The Farm
  • Now, for your edification and amusement

Read it on my Mailchimp site.

7 responses to “Newsletter February 2021 — The Impending Failure of Wikipedia and other news”

  1. I agree with Tom’s concerns about Wikipedia. Five years ago, my Wikipedia entry was rewritten to be an attack page, leading me to investigate what was going on. I ended up writing an article titled “Persistent bias on Wikipedia”; Mark Diesendorf has already mentioned it here. Another article, “Policing orthodoxy on Wikipedia: Skeptics in action?”, was just published.


  2. Dear Tom, just got your newsletter. If I may, I would like to add to your take on Wikipedia. I used to be an active editor way back when, contributing considerable sections to some articles, and being one of those people who went after small errors and typos as well, out of sheer good will. There was a considerable vandalism problem then, and I am not surprised that bots were invented to cope with that.

    Then came a day when an article of mine was attacked by someone who had three edits to his name, two of them with rude comments attached. (By then, I had hundreds.) I was accused of peddling alternative veterinary medicine; the article was rewritten, and both links I had attached were removed. One link led to my noncommercial website describing a home cure for a particular cat skin problem; the other was to a vet who, I found, had a particularly good description of that issue. This vet also sold holistic preparations for pets, to which I did not link.

    The rude editor accused me of secretly being that vet and using Wiki to peddle my products, so I stood accused of unethical behavior. Since I had no connection with the vet in question, and since Wiki did allow linking to personal experiences, I tried to reason with the rude person, to no avail. I then appealed to a higher authority, one of the people Wiki had appointed to handle editor disputes. He immediately took the side of the rude person and made my life so miserable that I gave up on editing altogether. I have since heard that bullying became a significant problem on Wiki.

    As the political situation degenerated, it became more and more apparent that Wikipedia peddles one-sided info on many topics, and doesn’t hesitate to lock articles when it deems the views people are attempting to add undesirable. It’s only gotten worse over time, and while I still use it for many topics, I am always aware of the built-in bias, treating it more as a jumping-off point for looking deeper.

    It’s really a shame. In former days, Wiki was one of the things that made the web worthwhile. As you describe it, the AI-run political correctness makes it nigh impossible to make legitimate edits. That is indeed the end. And it only adds to the massive emergent problem of interesting searches being buried by an avalanche of bland mainstream narratives in all the search engines I have tried.

    Things are going downhill in so many ways. It’s good, Tom, you are still out there working and teaching. Have you considered putting your stuff on a blog that others cannot mess with? For the time being, that is. My blog is on WordPress, and even there, censorship is beginning to intrude.

    Be well. I always look forward to your newsletters.




  3. Being expelled from those wiki pages also happened to Reiner Fuellmich, a very well-known lawyer who successfully argued that Volkswagen had misled buyers and many governments in the so-called Dieselgate case. Early yesterday morning I listened to his impressive talk in English on this Dutch page:
    I looked Fuellmich up on Wikipedia but didn’t find him there. A Wikipedia presence would have been very likely after he won the Dieselgate case.

    This is an inherent weakness of Wikipedia: anyone can edit a page, including deleting content, it seems.


  4. Study reveals bot-on-bot editing wars raging on Wikipedia’s pages
    “Humans usually cool down after a few days, but the bots might continue for years,” said a researcher. Some conflicts only ended when one or other bot was taken out of action.


  5. Thomas, thanks for speaking out about the bias of Wikipedia. Brian Martin has also written on this topic; see:
    Mark Diesendorf

