Industry Views

Neutraliars: The Platforms That Edit Like Publishers but Hide Behind Neutrality

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

In the golden age of broadcasting, the rules were clear. If you edited the message, you owned the consequences. That was the tradeoff for editorial control. But today’s digital platforms – YouTube, X, TikTok, Instagram – have rewritten that deal. Broadcasters and those who operate within the FCC regulatory framework are paying the price.

These companies claim to be neutral conduits for our content. But behind the curtain, they make choices that mirror the editorial judgment of any news director: flagging clips, muting interviews, throttling reach, and shadow banning accounts. All while insisting they bear no responsibility for the content they carry.

They want the control of publishers without the accountability. I call them neutraliars.

A “neutraliar” is a platform that claims neutrality while quietly shaping public discourse. It edits without transparency, enforces vague rules inconsistently, and hides bias behind shifting community standards.

Broadcasters understand the weight of editorial power. Reputation, liability, and trust come with every decision. But platforms operate under a different set of rules. They remove content for “context violations,” downgrade interviews for being “borderline,” and rarely offer explanations. No appeals. No accountability.

This isn’t just technical policy – it’s a legal strategy. Under Section 230 of the Communications Decency Act, platforms enjoy broad immunity from liability for content posted by their users. A provision originally intended to let platforms moderate obscene or unlawful material in good faith has become a catch-all defense against nearly any claim arising from third-party content – defamation included – stopping only at carve-outs such as federal criminal law and intellectual property.

These companies act like editors when it suits them, curating and prioritizing content. But when challenged, they retreat behind the label of “neutral platform.” Courts, regulators, and lawmakers have mostly let it slide.

But broadcasters shouldn’t.

Neutraliars are distorting the public square. Not through overt censorship, but through asymmetry. Traditional broadcasters play by clear rules – standards of fairness, disclosure, and attribution. Meanwhile, tech platforms make unseen decisions that influence whether a segment is heard, seen, or quietly buried.

So, what’s the practical takeaway?

Don’t confuse distribution with trust.

Just because a platform carries your content doesn’t mean it supports your voice. Every upload is subject to algorithms, undisclosed enforcement criteria, and decisions made by people you’ll never meet. The clip you expected to go viral? Silenced. The balanced debate you aired? Removed for tone. The satire? Flagged for potential harm.

The smarter approach is to diversify your presence. Own your archive. Use direct communication tools – e-mail lists, podcast feeds, and websites you control. Syndicate broadly but never rely solely on one platform. Monitor takedowns and unexplained drops in engagement. These signals matter.

Platforms will continue to call themselves neutral as long as it protects their business model. But we know better. If a company edits content like a publisher and silences creators like a censor, it should be treated like both.

And when you get the inevitable takedown notice wrapped in vague policy language and polished PR spin, keep one word in mind.

Neutraliars.

Matthew B. Harrison is a media and intellectual property attorney who advises radio hosts, content creators, and creative entrepreneurs. He has written extensively on fair use, AI law, and the future of digital rights. Reach him at HarrisonMediaLaw.com or read more at TALKERS.com.

Industry News

Matthew B. Harrison Holds Court Over Section 230 Moot Trial for Law Students at 1st Circuit Court of Appeals in Boston

As an attorney with extensive front-line expertise in media law, TALKERS associate publisher and Harrison Legal Group senior partner Matthew B. Harrison (pictured at right on the bench) was selected to hold court as “acting” judge in a Section 230 moot trial for law students engaged in a national competition last evening (2/22) at the 1st Circuit Court of Appeals in Boston, MA. The American Bar Association’s Law Student Division holds a number of annual national moot court competitions. One such event, the National Appellate Advocacy Competition, emphasizes the development of oral advocacy skills through a realistic appellate experience in which competitors argue a hypothetical appeal to the United States Supreme Court. This year’s legal question focused on the Communications Decency Act – “Section 230” – and the application of its exemption shielding internet service providers from liability for the acts of third parties to a realistic scenario: a journalist’s photo-turned-meme used in advertising (CBD, ED treatment, gambling) without permission or compensation, in violation of applicable state right of publicity statutes. Harrison tells TALKERS, “We are at one of those sensitive times in history where technology is changing at a quicker pace than the legal system and legislators can keep up with – particularly at the consequential juncture of big tech and mass communications. I was impressed and heartened by the articulateness and grasp of the Section 230 issue displayed by the law students arguing before me.”