Summary
Poynter reported that AI company Nota shut down its network of local-news sites after Poynter and Axios found extensive plagiarism in stories that copied reporting, quotations, and photos from local journalists. The story is a clear misuse case: AI-assisted newsroom production was not merely low quality; it reproduced other reporters' work without attribution and ultimately forced the project offline.
Why It Matters
For journalists and newsroom managers, this is a strong operational warning about:
- vendor due diligence for AI newsroom tools
- attribution and plagiarism controls
- oversight of AI-assisted article generation
- risks of using automation to simulate local reporting
- the reputational damage from weak editorial supervision
The case is easy to understand and useful as a training example because it turns abstract "AI ethics" discussion into a concrete workflow failure.
What the Source Says
Poynter reported that Nota's 11 local sites were shut down after Poynter and Axios documented copied material from at least 53 journalists across 29 outlets. The article says a contractor confirmed that, as part of his work for Nota, he took local articles, ran them through Nota's AI tools, and published the generated text under his own byline. Poynter also reported that the sites contained typos, misquotes, missing context, and misleading sentences, and that some of Nota's own newsroom clients were among the outlets whose work was reused.