A Mysterious Algorithm On Medium
Based on a quick experiment, I am confused by a Medium algorithm and trying to figure out whether it is a feature or a bug. Join me in solving the puzzle.

Purpose
In this short article, I want to share my recent observations based on a couple of experiments. I am confused by the Medium system, which immediately tagged some of my articles as “Not distributed in topics” upon publication.
I also highlight a concern that has important implications for writers, curators, and readers.
My evidence is based on the two attached articles, both of which were immediately rejected for distribution in topics.
Situation: Both of these articles were tagged as “not distributed in topics” as soon as I submitted them.
Why am I concerned?
Because there was not a single view on either article, I assume the tagging was done by the system and not by the curators.
Here is the screen capture for the latest article. The same happened with the other article.

I am wondering whether there is a bug in the system or whether it is a feature that filters certain keywords.
A few ideas came to mind based on the nature of these two articles.
The articles have similarities. For example, both include scientific terms that the spell checker flags as unknown.
One of the articles uses the term “kill” in the heading, and the system may put a hard stop on this questionable term; a naive filter of that kind is sketched below.
One of the article pictures shows medication, and the AI system may have interpreted it as drug content.
Another thought: there may be a glitch in the system triggered by unknown events. I am not sure whether Medium has an autonomous event-management system.
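To make the keyword-filter idea concrete, here is a minimal, purely hypothetical sketch in Python. The flagged terms, the function name, and the decision rule are my own assumptions for illustration only; I have no insight into how Medium’s actual distribution system works.

```python
# Purely hypothetical sketch of the keyword-filter idea described above.
# The flagged terms and the decision rule are assumptions for illustration;
# nothing here reflects Medium's real system.

FLAGGED_TERMS = {"kill", "drug"}  # assumed example terms, not a real list

def is_distributable(title: str, body: str) -> bool:
    """Return False if any flagged term appears in the title or body."""
    text = f"{title} {body}".lower()
    return not any(term in text for term in FLAGGED_TERMS)

# A heading containing "kill" would be blocked by such a filter,
# even if the article itself is harmless.
print(is_distributable("How stress can kill your creativity", "..."))  # False
```

If anything like this simplistic rule were in place, a harmless heading containing “kill” would be blocked before any human curator ever saw the article, which would match what I observed.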
As a responsible writer, I made every effort to comply with the curation guidelines so that my articles could be amplified and distributed to a wider audience. My purpose is to add value for my readers based on my hard-learned lessons.
The messages of these articles are interesting, important, based on personal experience, and supported by science with relevant links. I am not selling or promoting any products or services; I am purely sharing experience and knowledge that may be of interest to many readers.
Both of them are published in “Age of Awareness”.
What is my primary concern?
My concern is that these articles were not distributed to curators. The Medium curators did not even get a chance to review them and make a decision.
I would understand and respect their decision if they chose not to select them, but if they never see the articles at all, that is not acceptable.
No one enjoys being controlled by inappropriate artificial intelligence.
There are three critical implications of these concerns.
- A faulty or wrongly programmed system prevents the dissemination of articles and destroys the chance of amplifying writers’ messages.
- Curators do not receive quality content because of system errors or misguided blocking.
- This incorrect automated procedure prevents Medium readers from finding important content.
Based on these assumptions, the concerns call for two actions.
Action 1: Fixing the bug in the system, if it is a bug.
Action 2: Correcting the algorithm (as it does not serve the community), if it is an algorithm problem.
Let’s wait and see whether these actions, or other alternative solutions, can be provided. As a proud member of Medium, I have done my part.
In the meantime, since this concern relates to all of us in the ecosystem, insights from experienced members would be welcome.
I know this article will be automatically discarded per the curation guidelines, as it is ineligible for distribution because it uses Medium itself as its subject.
However, if curators happen to see this article, feedback clarifying this concern would be useful and would delight customers.
Trust and transparency are key success factors in collaborative content platforms.