Part of the collection of articles on MVAP theory: https://t.me/thinking_cycles
Imagine a large group of people stranded in the endless expanse of the Vasyugan swamps. They have no navigation tools and were utterly unprepared, yet they must survive and move onward, splitting into several scouting parties.
One person discovers that the dried growths on rotting birch trunks (tinder fungus) smolder readily when caught by the last spark of an otherwise useless lighter, and that with a little blowing a fire can be started. This immediately intrigues everyone who has not yet discarded their lighters. The news spreads instantly even to the most distant scouting parties, and soon everyone is using this vitally useful method.
Due to the monotony of their surroundings and daily routines, people constantly tell each other stories—some true, some invented—simply to entertain themselves and combat despair. Naturally, the most discussed topics are attempts to find a way out: to guess the correct direction to go. To verify any such suggestion, one merely needs to walk in the proposed direction; only after a very long time will it become clear whether the prediction was correct. Many have their own hypotheses they believe in, but convincing others is practically impossible. Unlike the method of igniting tinder fungus, these directional hypotheses remain local and unshared.
Each scouting party has its own leader, responsible for the group’s survival. Moreover, the entire collective has an overarching leader—an old and wise hunter who knows much, though not this particular territory. If any hypothesis about the right path were to convince a party leader, that leader might persuade the main leader. But every party leader already has their own preferred idea, which they follow. Thus, even highly plausible hypotheses—those accounting for celestial markers or educated guesses about the swamp’s boundaries—are dismissed by leaders as interesting but ultimately useless tales.
Anyone who persistently promotes an alternative risks being seen as a troublemaker, a dreamer, or a threat to stability. In survival conditions, conformity becomes a virtue, and dissent—a danger.
The discipline maintained by party leaders actually increases the group’s chances of survival compared to a scenario with no leaders at all. Without leadership, people would be left to their own fragmented solutions; those who fancied themselves knowledgeable would fail to convince the rest. Their ideas would be heard like any other campfire story—but nothing would result from them, as no one would have the authority or ability to organize collective action.
History offers rare exceptions: figures like Joan of Arc, who emerged from the lowest social strata and managed to capture society's attention through divine-seeming ideas she claimed to hear clearly. Her success rested on a rare combination of personal conviction and receptive circumstance, a combination that almost never recurs.
Today, countless individuals create enclaves of belief around certain ideas, but scientifically grounded breakthroughs are almost never among them. Why? Because a scientist capable of spending years developing an idea into a rigorously verified concept rarely possesses the charisma, persuasion skills, or early-formed talent for influencing large groups, abilities that usually grow out of childhood social conditioning. Conversely, charismatic leaders who hone those persuasive abilities seldom devote the time required to build disciplined, evidence-based concepts; they lack the scientific honesty, skepticism, and methodological rigor essential for genuine conceptual development.
A profound idea demands time, humility, and discipline—but attention demands brilliance, confidence, and narrative.
Scientists are trained to doubt, to qualify conclusions, and to seek counterarguments. Their prestige lies in modest formulations and rigorous proof.
Charismatic leaders, by contrast, are trained to persuade, simplify, and speak “heart to heart,” relying on emotional anchors. Their prestige lies in influence and visible certainty.
Those who, from childhood, learn to “win people over” rarely endure the years of solitary work required to formalize something deeply complex.
And those who spend years refining a model of consciousness won’t suddenly stand on a podium shouting, “I’ve found the way!”—even if they truly have.
Modern society is wary of cults and mass deception; as a rule, only the overly trusting and socially vulnerable, often the less educated, fall into such traps. Anyone attempting to co-opt scientific authority to build a personal sphere of influence and belief is labeled a "crank," and many scientists do become cranks in their quest for fame and power. This is deeply repugnant to those who have genuinely spent years developing robust, logically coherent concepts, because such people refuse to establish their ideas' validity through manipulation; they want them accepted through the clarity of logical understanding.
Consequently, they cannot rely on charismatic organizers to promote their ideas. They could easily craft their theory into an enticing myth—as L. Ron Hubbard did, famously claiming one could found a religion even around a lightbulb, and that religion is the surest path to wealth and power. But they choose not to.
Society—even educated society—seeks not understanding, but relief from uncertainty.
So what strategy remains for promoting a scientific idea in such a society?
One might publish materials on personal websites and optimize them for high search engine rankings. Many people will stumble upon these materials and may even skim them—though few will invest real effort. True understanding requires not just reading, but actively constructing one’s own mental model of the concept.
As a result, many will have heard of the idea and its author—but no one will act to promote it, because each individual is isolated, powerless to influence systemic recognition.
Unless the idea captures the attention of someone who actually holds influence (and such people are few), it will remain just another campfire tale. But party leaders are already committed to their own ideas, and the main leader—who relies on those party leaders—never hears of alternatives. Even if a party leader notices the external validity of a new idea, he physically cannot spare the time to retrace the intellectual journey required to truly grasp it. His own idea is already in motion—it defines his lab, grants, students, publications, and identity. No matter how honest he is, he will likely stay silent and ignore the newcomer: “Let’s see if it proves viable. Time will tell. Everyone must fight for their own idea,” and so on. These justifications are never followed by action—they are eternal deferrals.
This problem may be called “The Curse of the Competent Leader.”
It is not a conspiracy, nor mere inertia or stubbornness. It is the economics of attention under scarce resources: a leader has only so many hours, so much energy, so much reputational capital. His “idea-project” is not just a hypothesis—it is his entire professional world. Even if open-minded, he cannot afford a “system reboot”—it would be a professional risk. Thus, however brilliant an idea may be, it remains outside the cycle of scientific legitimation. It is not rejected—it is simply never admitted into the game.
This is the essence of the Semmelweis Effect: the case of Ignaz Semmelweis, who advocated hand disinfection in obstetrics, only to be universally ignored and ridiculed. Even the death of his close friend and colleague, who contracted fatal sepsis from a cut sustained during an autopsy, a case that pointed directly at the very mechanism Semmelweis described, made no impression on the medical establishment. On the contrary, many in the scientific community did not merely mock him; they actively persecuted him. Truly incorruptible scientists do not exist; all are human, and there is always a pretext for isolating an inconvenient truth-teller.
For a leader to publicly endorse a “heretical” idea is a professional gamble. He risks his reputation, funding, and position in the scientific hierarchy. Supporting Semmelweis’s idea would have meant admitting: “We doctors have been killers all along.” That is psychologically and socially unbearable.
Moreover, an idea may be correct but lack the necessary adjacent knowledge for acceptance. Semmelweis’s theory of “cadaverous particles” hung in limbo until Pasteur and Koch established microbiology. Without a conceptual bridge, others could not reach it.
This implies that convincing one or two leaders won’t launch a new paradigm. The broader knowledge ecosystem must evolve to a point where ignoring the idea becomes impossible. A breakthrough idea doesn’t merely precede its time—it collides with the very structure of contemporary knowledge and the social organization of science.
Thus, one cannot simply stage a demonstration that proves an idea's validity and utility. Authoritarian pressure might force attention, but there is no ethical way to compel someone to invest the immense time needed to become an expert in a new field, especially if they lack the requisite cognitive or methodological skills.
It might seem that the solution is to design one brilliant, reproducible experiment whose results cannot be explained by the old paradigm—not just statistics like Semmelweis’s, but a vivid, undeniable phenomenon. Such a phenomenon becomes a koan—an intellectual splinter in the community’s mind. Attempts to refute it will only strengthen its mystique.
But this is an illusion. The problem isn’t a lack of evidence—it’s systemic resistance. In reality, such demonstrations will be ignored for all the reasons already explained. No publication, no journal article, no practical application will compel leaders to collectively abandon their domains.
Max Planck observed bitterly:
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
Therefore, the only viable strategy under the Semmelweis Effect is the “seed strategy”:
Be viable. Surround yourself with a nourishing community of like-minded thinkers. And wait—until the soil, exhausted by old paradigms, is ready to let your seed sprout.
The systemic root of this unyielding rigidity lies in the flawed structure of academic science itself—particularly in how peer review functions, despite being held up as the gold standard of scientific legitimacy. The system creates an illusion of objective selection, but in practice, it is deeply subjective, opaque, and vulnerable to conflicts of interest and in-group favoritism (“my friend vs. outsider” dynamics).
Crucially, the scientific method does not require peer review.
It requires hypothesis testing through experiment, logic, and reproducibility.
Peer review is a mechanism of social legitimation, historically developed to filter out errors and charlatans—but today, it more often serves to preserve dominant paradigms.
If peer review were reformed to prioritize strict adherence to scientific methodology—rather than conformity to school doctrines or reviewers’ personal views—it would create honest pathways for valid scientific work to emerge. This would block dogmatic entrenchment and ensure that only ideas meeting genuine methodological standards gain traction, automatically filtering out both pseudoscience and personal bias.
All of this is further discussed at: fornit.ru/71526.