Industry News

Mark Walters Programs Add New Affiliates

Two programs hosted by Mark Walters pick up new affiliates as the weekly “Armed American Radio” show and the “AAR Daily Defense Hour” join the programming at Omni Broadcasting’s WTKE in the Ft. Walton Beach-Destin, Florida market. Additionally, the “Armed American Radio” show is being distributed by CRN Talk to cable systems operated by Cox Communications, Optimum TV, and Xfinity Stream.

Industry News

Mark Walters’ Shows Add New Affiliate Stations

Two programs syndicated by Mark Walters’ CCW Broadcast Media LLC are adding new affiliate stations. The weekend program “Armed American Radio,” hosted by Walters, is added to the program lineup at four stations including WZGM-AM, Asheville, North Carolina and WXZQ-FM, Columbus, Ohio. The “AAR Daily Defense Hour” adds three new affiliates including WNWS-FM, Jackson, Tennessee.

Industry Views

Mark Walters v. OpenAI: A Landmark Case for Spoken Word Media

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

When Georgia-based nationally syndicated radio personality and Second Amendment advocate Mark Walters (longtime host of “Armed American Radio”) learned that ChatGPT had falsely claimed he was involved in a criminal embezzlement scheme, he did what few in the media world have dared to do. While others stayed silent, Walters stood up and took one of the world’s most powerful tech companies to court.

Taking the Fight to Big Tech

By filing suit against OpenAI, the creator of ChatGPT, Walters became the first person in the United States to test the boundaries of defamation law in the age of generative artificial intelligence.

His case was not simply about clearing his name. It was about drawing a line. Can artificial intelligence generate and distribute false and damaging information about a real person without any legal accountability?

While the court ultimately ruled in OpenAI’s favor on specific procedural grounds, the impact of this case is far from finished. Walters’ lawsuit broke new ground in several important ways:

— It was the first known defamation lawsuit filed against an AI developer based on content generated by an AI system.
— It brought into the open critical questions about responsibility, accuracy, and liability when AI systems are used to produce statements that sound human but carry no editorial oversight.
— It added fuel to the conversation about the effectiveness of “use at your own risk” disclaimers when real-world reputational damage hangs in the balance.

Implications for the Radio and Podcasting Community

For spoken-word creators on any platform, whether terrestrial, satellite, or the open internet, this case is a wake-up call, a canary in the coal mine. Many shows rely on AI tools for research, summaries, voice generation, or even show scripts. But what happens when those tools get it wrong? (Beyond embarrassment, and in some cases fines or termination.) And worse, what happens when those errors affect real people?

The legal system, as has been often written about, is still playing catch-up. Although the court ruled that the fabricated ChatGPT statement lacked the necessary elements of defamation under Georgia law, including provable harm and demonstrable fault, the decision highlighted how unprepared current frameworks are for this fast-moving, voice-driven digital landscape.

Where the Industry Goes from Here

Walters’ experience points to the urgent need for new protections and clearer guidelines:

— Creators deserve assurance that the tools they use are built with accountability in mind. This would extend to copyright infringement and to defamation.
— Developers must be more transparent about how their systems operate and the risks they create. This would identify bias and attempt to counteract it.
— Policymakers need to bring clarity to who bears responsibility when software, not a person, becomes the speaker.

A Case That Signals a Larger Reckoning

Mark Walters may not have won this round in court, but his decision to take on a tech giant helped illuminate how quickly generative AI can create legal, ethical, and reputational risks for anyone with a public presence. For those of us working in media, especially in formats built on trust, voice, and credibility, his case should not be ignored.

“This wasn’t about money. This was about the truth,” Walters tells TALKERS. “If we don’t draw a line now, there may not be one left to draw.”

To listen to a longform interview with Mark Walters conducted by TALKERS publisher Michael Harrison, please click here.

Media attorney Matthew B. Harrison is VP/Associate Publisher at TALKERS; Senior Partner at Harrison Media Law; and Executive Producer at Goodphone Communications. He is available for private consultation and media industry contract representation. He can be reached by phone at 724-484-3529 or email at matthew@harrisonmedialaw.com. He teaches “Legal Issues in Digital Media” and serves as a regular contributor to industry discussions on fair use, AI, and free expression.

Industry News

Michael Harrison Interviews Mark Walters


TALKERS EXCLUSIVE: TALKERS publisher Michael Harrison (left) is pictured conducting an exclusive interview yesterday (6/2) with “Armed American Radio” host Mark Walters (right) about his recent court case and groundbreaking defamation lawsuit against OpenAI and ChatGPT. To listen to the interview in its entirety, please click here.

Industry Views

When the Algorithm Misses the Mark: What the Walters v. OpenAI Case Means for Talk Hosts

By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer

In a ruling that should catch the attention of every talk host and media creator dabbling in AI, a Georgia court has dismissed “Armed American Radio” syndicated host Mark Walters’ defamation lawsuit against OpenAI. The case revolved around a disturbing but increasingly common glitch: a chatbot “hallucinating” false but believable information.

The background: A journalist asked ChatGPT to summarize a real court case. Instead, the AI invented a fictional lawsuit accusing Walters of embezzling from the Second Amendment Foundation, an organization that has never employed him. The journalist spotted the error and never published the inaccurate information. But the damage, at least emotionally and reputationally, was done. The untruth was out there, and Walters sued for defamation.

Last week, the court dismissed the case. It determined that Walters was a public figure, and as such, he had to prove “actual malice” — that OpenAI knowingly or recklessly published falsehoods. He couldn’t, and after this ruling, it may be impossible.

The judge also emphasized that the false information was never shared publicly. It stayed within a private conversation between the journalist and ChatGPT. No dissemination, no defamation.

But while OpenAI may have escaped liability, the ruling raises serious questions for the rest of the content creation space.

What This Means for Talk Hosts

Let’s be honest: AI tools like ChatGPT are already part of the media ecosystem. Hosts use them to summarize articles, brainstorm show topics, generate ad copy, and even suggest guest questions. They’re efficient — and also dangerous.

This case shows just how easily AI can generate falsehoods with confidence and detail. If a host were to read something like that hallucinated lawsuit on air, without verifying it, the legal risk would shift. It wouldn’t be the AI company on the hook — it would be the broadcaster who repeated it.

Key Lessons

  1. AI is not a source.
    It’s a starting point. Just like a tip from a caller or a line on social media, AI-generated content must be verified before use.
  2. Public figures are more exposed.
    The legal system gives less protection to people in the public eye — like talk hosts — and requires a higher burden of proof in defamation claims. That cuts both ways.
  3. Disclosure helps.
    OpenAI’s disclaimers about potential inaccuracies helped them in court. On air, disclosing when you use AI can offer similar protection — and builds trust with your audience.
  4. Editorial judgment still rules.
    No matter how fast or slick AI gets, it doesn’t replace a producer’s instincts or a host’s responsibility.

Bottom line: the lawsuit may be over, but the conversation is just beginning. The more we rely on machines to shape our words, the more we need to sharpen our filters. Because when AI gets it wrong, the real fallout hits the human behind the mic.

And for talk hosts, that means the stakes are personal. Your credibility, your syndication, your audience trust — none of it can be outsourced to an algorithm. AI might be a tool in the kit, but editorial judgment is still the sharpest weapon in your arsenal. Use it. Or risk learning the hard way what Mark Walters just did. Walters has yet to comment on what steps – if any – he and his lawyers will take next.

TALKERS publisher Michael Harrison issued the following comment regarding the Georgia ruling: “In the age of internet ‘influencers’ and media personalities with various degrees of clout operating within the same space, the definition of ‘public figure’ is far less clear than in earlier times. The media and courts must revisit this striking change. Also, in an era of self-serving political weaponization, this ruling opens the door to ‘big tech’ having enormous, unbridled power in influencing the circumstances of news events and reputations to meet its own goals and agendas.”

Matthew B. Harrison is a media attorney and executive producer specializing in broadcast law, intellectual property, and First Amendment issues. He serves as VP/Associate Publisher of TALKERS magazine and is a senior partner at Harrison Media Law. He also leads creative development at Goodphone Communications.

Industry News

Mark Walters Celebrates 16 Years of AAR

Talk host Mark Walters tells TALKERS that he celebrated his 16th year as host of “Armed American Radio” on Sunday evening’s broadcast. Walters is CEO of CCW Broadcast Media, which produces “Armed American Radio” and “AAR Daily Defense.”

Industry News

Armed American Radio Adds New Affiliates

The nationally syndicated “Armed American Radio” show hosted by Mark Walters adds new affiliate stations including KOMY-AM, Monterey-Salinas, California; WSCW-AM, Charleston, West Virginia; and more.

Industry News

TALKERS News Notes

The Salem Radio Network nationally syndicated program “Armed American Radio,” hosted by Mark Walters, adds new affiliates as Steckline Communications’ KGYN-AM, Guymon, Oklahoma; K292HJ, Liberal, Kansas; and Taylor Made Broadcasting’s KGLC-FM, Miami, Oklahoma add the show to their lineups.

PodcastOne will participate in the upcoming 37th Annual ROTH Conference taking place March 16-18. PodcastOne president Kit Gray and CFO Ryan Carhart will host one-on-one meetings with institutional investors on site during this annual invitation-only event.

Industry News

OpenAI Loses Motion to Dismiss in Talk Host Defamation Case

Artificial Intelligence firm OpenAI was denied its Motion to Dismiss the defamation suit filed against it by talk show host Mark Walters, who hosts radio programs produced by his CCW Broadcast Media company. Walters claims the use of OpenAI’s ChatGPT by journalist Fred Riehl that created content stating that Walters was accused of embezzling funds from the Second Amendment Foundation defamed him. No such accusation ever actually took place. In its Motion to Dismiss, OpenAI argued several points, including that Georgia is not the proper jurisdiction, but it summarized its argument that Walters’ claims didn’t meet the burden of defamation when it said, “Even more fundamentally, Riehl’s use of ChatGPT did not cause a ‘publication’ of the outputs. OpenAI’s Terms of Use make clear that ChatGPT is a tool that assists the user in the writing or creation of draft content and that the user owns the content they generate with ChatGPT. Riehl agreed to abide by these Terms of Use, including the requirement that users ‘verify’ and ‘take ultimate responsibility for the content being published.’ As a matter of law, this creation of draft content for the user’s internal benefit is not ‘publication.’”

Industry News

OpenAI Seeks Dismissal of Defamation Suit

Artificial Intelligence firm OpenAI has filed a Motion to Dismiss the defamation suit filed against it by talk show host Mark Walters, who hosts radio programs produced by his CCW Broadcast Media company. TALKERS reported the suit by Walters back on June 9 in which Walters claims the use of OpenAI’s ChatGPT by journalist Fred Riehl that created content stating that Walters was accused of embezzling funds from the Second Amendment Foundation defamed him. No such accusation ever actually took place. In its Motion to Dismiss, OpenAI argues several points, including that Georgia is not the proper jurisdiction, but it summarizes its argument that Walters’ claims don’t meet the burden of defamation when it says, “Even more fundamentally, Riehl’s use of ChatGPT did not cause a ‘publication’ of the outputs. OpenAI’s Terms of Use make clear that ChatGPT is a tool that assists the user in the writing or creation of draft content and that the user owns the content they generate with ChatGPT. Riehl agreed to abide by these Terms of Use, including the requirement that users ‘verify’ and ‘take ultimate responsibility for the content being published.’ As a matter of law, this creation of draft content for the user’s internal benefit is not ‘publication.’”

Industry News

Radio Host Mark Walters Suing OpenAI for Defamation

Talk host Mark Walters, who produces and hosts Second Amendment-themed radio programs via his CCW Broadcast Media company, is suing OpenAI in a Georgia Superior Court claiming that OpenAI’s ChatGPT created a false case alleging that Walters embezzled funds from the Second Amendment Foundation. The complaint states that journalist Fred Riehl was researching the case of The Second Amendment Foundation v. Robert Ferguson and asked ChatGPT to provide a summary of that complaint and received one that stated the suit’s plaintiff is Second Amendment Foundation founder Alan Gottlieb who accuses Walters, as treasurer and chief financial officer, of embezzling funds. Walters says, and Gottlieb confirms, that he didn’t serve in either position and didn’t steal anything. In the AI world, false text from services like ChatGPT is called a “hallucination.” As with any defamation case, Walters will have to prove he’s suffered damages, but this case will be interesting to watch as it appears to be the first such legal case involving the work of AI. Read the New York Post’s story here.