We’re stuck in the fake news debate. You know why?

Because we’re still pretending the problem is speech: who said what, who lied, who fact-checked whom. Meanwhile, the real engine of disinformation is hiding in plain sight. The crisis isn’t that people say untrue things. Humans have been lying since we learned to speak. The crisis is that we’ve built a machine that chooses which lies get megaphones and which truths get muffled. And that machine runs on money, not merit.

Let’s be blunt: disinformation today isn’t “going viral.” It’s being amplified. Deliberately. Commercially. Industrially.

We keep imagining an organic wildfire of bad ideas spreading from person to person, but that myth is outdated by about a decade. Today, if you see a piece of disinformation in your feed, the overwhelming likelihood is that someone paid to put it there. Not because your uncle shared it. Not because “algorithms like outrage.” Because a targeted, optimized, strategically funded amplification funnel pushed it straight into your attention.

Paid Trumps Organic

On Facebook alone, organic reach tops out at a pathetic 5% of one’s own followers, if you’re lucky. Meanwhile, paid disinformation campaigns hit 25% to 100% of their chosen audience. That’s not “virality.” That’s market penetration. And paid disinfo doesn’t just edge out its organic cousins; it dwarfs them, generating four to eight times the reach of equivalent unpaid misinformation posts. Those numbers don’t describe a chaotic rumor mill. They describe an industry.
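To make that gap concrete, here is a back-of-envelope sketch in Python. The audience sizes are hypothetical round numbers; the 5% organic ceiling and the 25% to 100% paid penetration range are the figures cited above, not measured constants.

```python
# Back-of-envelope comparison of organic vs. paid reach, using the
# ranges cited in this piece. Audience sizes are hypothetical.

followers = 100_000        # hypothetical page following
targeted_pool = 100_000    # hypothetical paid-targeting audience of equal size

ORGANIC_CEILING = 0.05            # organic reach tops out near 5% of followers
PAID_FLOOR, PAID_CEILING = 0.25, 1.00  # paid campaigns hit 25-100% of targets

organic_reach = followers * ORGANIC_CEILING
paid_low = targeted_pool * PAID_FLOOR
paid_high = targeted_pool * PAID_CEILING

print(f"Organic reach:  up to {organic_reach:,.0f} people")
print(f"Paid reach:     {paid_low:,.0f} to {paid_high:,.0f} people")
# The 4-8x figure cited above is an observed average for real campaigns;
# the penetration ceilings alone imply an even wider gap for
# same-sized audiences.
print(f"Penetration gap: {paid_low / organic_reach:.0f}x "
      f"to {paid_high / organic_reach:.0f}x")
```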

This is the part we keep refusing to acknowledge: disinformation isn’t a grassroots problem anymore. It’s a marketplace, an influence-for-hire sector with budgets, vendors, targeting infrastructure, and ROI metrics. There are creative agencies that crank out manipulative narratives the way Madison Avenue writes slogans. There are distribution networks that function like digital shipping companies, ensuring every payload of psychological manipulation lands exactly where its buyers intend. Troll farms have payroll managers. Influence campaigns have quarterly goals.

Addicted to Fact-Checking

And yet, every time we debate solutions, we scuttle back to the same tired place: truth detection. Let’s build better AI detectors. Let’s improve content moderation. Let’s scale fact-checking. Let’s label posts. Let’s nudge users.

None of these are inherently bad ideas. Fact-checking is a public service. Moderation has a role. But when they become the primary response, we’re not solving the disinformation crisis at all; we’re merely refining the speech-policing apparatus. We’re drifting toward regulating what can be said rather than how it’s weaponized.

That’s a problem. A constitutional one.

Slippery Slope

Because whether we like it or not, the U.S. protects speech: ugly speech, stupid speech, harmful speech, even deliberately deceptive speech. That protection is not a bug of democracy; it’s a feature. The First Amendment doesn’t crumble when someone posts nonsense. It crumbles when the state decides which nonsense is allowed.

But here’s the twist no one seems willing to talk about: while freedom of speech is protected, freedom of reach is not. The Constitution guarantees your right to express an opinion; it does not guarantee your right to an algorithmic bullhorn, a precision-targeting dashboard, or a million-dollar botnet army. It does not guarantee your right to inject manipulated narratives into the bloodstream of a society with surgical accuracy.

“Freedom of reach” is not a civil liberty. It’s an advertising product.

Look for Organized Disinfo

If we want a real path forward, we need to stop pretending that disinformation is primarily an epistemic crisis (a crisis of truth) and recognize it as a distribution crisis (a crisis of amplification). The right question isn’t “How do we stop lies?” It’s “How do we stop industrial-scale manipulation from being blasted into millions of eyeballs as frictionlessly as buying a hoodie on Instagram?”

If a foreign intelligence agency can purchase the same microtargeting tools as a sneaker brand, that’s not a speech issue. That’s a national security vulnerability. If a political campaign can suppress turnout by buying dark ads designed to disappear without a trace, that’s not protected expression. That’s a loophole in democratic infrastructure. If influence-for-hire vendors can reach entire demographic slices with fabricated grievances faster than a newsroom can tweet a correction, that’s not the marketplace of ideas. That’s the marketplace of manipulation.

Protect Speech, Litigate Reach

So let’s separate what’s sacred from what’s not.

Speech? Protected. Reach? Regulate it like the industrial product it is.

We already regulate broadcast frequencies, political ad disclosures, financial advertising, and consumer data practices. None of that violates free expression. Why? Because it governs distribution systems, not ideas themselves. Amplification tools (algorithmic, programmatic, automated) should be treated the same way: as infrastructure with guardrails, transparency requirements, and accountability mechanisms.

You can say whatever you want on the Internet. But you shouldn’t be able to buy your way into millions of feeds with deceptive targeting, covert funding, or industrial-grade amplification pipelines.

We’re stuck in the fake news debate because we keep trying to patch the problem at the level of sentences and beliefs. But the power of modern disinformation doesn’t come from its ideas; it comes from its amplifiers. Until we shift the fight from content to distribution, we’re bringing a fact-check to a gunfight.

And the people running the influence industry are thrilled we haven’t figured that out.

About Me:

Dominic “Doc” Ligot is one of the leading voices in AI in the Philippines. Doc has been extensively cited in local and global media outlets including The Economist, South China Morning Post, Washington Post, and Agence France-Presse. His award-winning work has been recognized and published by prestigious organizations such as NASA, Data.org, Digital Public Goods Alliance, the Group on Earth Observations (GEO), the United Nations Development Programme (UNDP), the World Health Organization (WHO), and UNICEF.

If you need guidance or training in maximizing AI for your career or business, reach out to Doc via https://docligot.com.

Follow Doc Ligot on Facebook: https://facebook.com/docligotAI