The Mad Libs Era of Science: Why 'Search-and-Replace' Fraud is the Final Stage of the Paper Mill Pandemic
Verified Researcher
Aug 2, 2019 • 3 min read

Peer Review is a Hallucination
We are no longer dealing with simple plagiarism; we are entering the era of "Mad Libs" science. The recent news that researchers essentially performed a find-and-replace on a methamphetamine study to produce an LSD study is not just a failure of authorship; it is a categorical collapse of the gatekeeping mechanism.
When a journal fails to spot the difference between a stimulant and a hallucinogen, it stops being a platform for truth and becomes a factory for career-building fiction. Many people want to blame the authors alone. But the real problem is that we have accepted a world where science is templated. If you can swap a molecule without changing the results, the data is just a prop. It probably never existed.
The Rise of the Bio-Template
This isn't an isolated accident; it is a business model. Paper mills have discovered that the path of least resistance to an Impact Factor is to create a "Bio-Template." You take a standard experimental design (in this case, retinal damage in CD1 mice) and you rotate the variables. Yesterday it was Meth, today it is LSD, tomorrow it will be Psilocybin.
Calling this paraphrasing plagiarism is far too polite. It is industrial-scale forgery designed to exploit the limitations of software. Current plagiarism-detection tools check for matching strings of words. They do not look for the biological absurdity of two totally different drugs producing identical biochemical fingerprints. It is a logic failure, not a text failure.
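To make that distinction concrete, here is a minimal, purely illustrative sketch of the kind of string matching those tools rely on: word-shingle Jaccard similarity. The abstracts below are invented stand-ins, not text from the retracted papers. Note what the number does and does not tell you: swapping a single drug name barely dents the textual similarity, so a string check can flag the clone as a duplicate, but the score carries zero information about whether the swapped biology makes sense.

```python
# Illustrative only: 3-word shingle Jaccard similarity between two
# abstracts. The texts are hypothetical, not from any real manuscript.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

original = ("methamphetamine exposure induced apoptosis in retinal "
            "ganglion cells of cd1 mice via oxidative stress")
cloned   = ("lsd exposure induced apoptosis in retinal "
            "ganglion cells of cd1 mice via oxidative stress")

# One swapped noun leaves the text nearly identical:
print(f"{jaccard(original, cloned):.2f}")  # → 0.85
```

The check is entirely surface-level: a high score means the words overlap, nothing more. Whether methamphetamine and LSD could plausibly yield the same result table is a question no shingle set can answer.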
In a recent piece by Adam Marcus for Retraction Watch, the absurdity of this specific case was laid bare: researchers in China lost a 2019 paper because it was a near-exact clone of a 2018 study on a completely different substance. This exposes the lethal blind spot in our current editorial process: we check for words, but no one is checking for logic.
The Fraudulent Profit Cycle
The incentive structure is broken. When "publish or perish" is the only rule, a paper on LSD is just a coin used to buy a promotion. Journals like Human and Experimental Toxicology get caught in this mess because their reviewers are burned out. Editors are pushed to keep the machine running to protect their metrics. Reality is sacrificed for the sake of the pipeline.
We have to stop treating these retractions as individual moral failings and start treating them as symptoms of a corrupted market. When a paper mill can sell a "find-and-replace" manuscript to a researcher, and that manuscript can pass through peer review at a legitimate journal, the system's immune response has failed.
Structural Reforms: Moving Beyond the Text
Stopping this circus requires two major shifts. First, we need the Receipt Rule. No more summaries of stress markers without the goods. If you say LSD kills cells, you have to show the raw, timestamped files before anyone even looks at your abstract. Second, publishers have to stop being islands. We need a shared database that flags when the same skeleton is being used to build different stories at different journals.
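That shared database does not exist today, but one way it could work is worth sketching. The idea below is hypothetical: normalize a manuscript's methods text by masking the domain-specific nouns (the drug, the substance under study), then fingerprint what remains. Two papers built on the same skeleton then collide even after the find-and-replace. The mask list here is a toy; a real system would need curated ontologies and far more robust normalization.

```python
# Hypothetical sketch of a cross-publisher "skeleton" fingerprint.
# Substance names are masked before hashing, so two manuscripts built
# on the same template collide even when the drug has been swapped out.
import hashlib
import re

# Toy mask list for illustration; a real registry would draw on
# curated chemical/biological ontologies (assumption, not a real tool).
SUBSTANCES = {"methamphetamine", "lsd", "psilocybin"}

def skeleton_fingerprint(methods_text: str) -> str:
    """Mask known substances, then hash the remaining word sequence."""
    words = re.findall(r"[a-z0-9]+", methods_text.lower())
    masked = ["<SUBSTANCE>" if w in SUBSTANCES else w for w in words]
    return hashlib.sha256(" ".join(masked).encode()).hexdigest()[:16]

meth_paper = "CD1 mice received methamphetamine and retinal damage was scored."
lsd_paper  = "CD1 mice received LSD and retinal damage was scored."

# The swapped noun no longer hides the shared template:
print(skeleton_fingerprint(meth_paper) == skeleton_fingerprint(lsd_paper))  # → True
```

The point is not this particular hash but the principle: duplication checks have to run across publishers and on the structure of a study, not just its surface text within one journal's submission queue.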
If we continue to let researchers swap nouns to get published, we are no longer building a body of knowledge; we are building a cemetery of data.



Discussion (8)
I wonder if the animal ethics boards even look at these. If they can't distinguish between LSD and meth protocols, the whole system is broken.
it is just mad libs with phd titles at this point
Peak fraud.
The irony is that these paper mills probably use more processing power than the actual research they are faking.
This explains so much! Back in my day, you actually had to conduct the experiment to get published. Now it's just a digital assembly line. Truly a shame for the profession.
Follow the money. Until the incentives for 'publish or perish' change, the mills will keep spinning.
not sure this is the 'final' stage... they will just start using generative ai next and we wont even find the synonyms
As a peer reviewer, the sheer volume of these 'tortured phrases' is becoming impossible to manage. We need better detection algorithms immediately.