Summary

Columbia Journalism Review documented how Cleveland.com and The Plain Dealer built an "AI rewrite desk" that takes reporters' notes, transcripts, and source material and turns them into draft stories with the help of an internal ChatGPT-style system. The case is useful because it is neither pure hype nor pure backlash: it shows a live newsroom trying to buy back reporting time while still wrestling with disclosure, bylines, and hallucination control.

Why It Matters

This is one of the clearer recent accounts of AI inside an actual newsroom workflow because it shows:

  • a concrete division of labor between reporting, drafting, and verification
  • where editors think AI helps and where human checks remain mandatory
  • how disclosure and byline rules get contested when AI touches copy
  • the trade-off between production efficiency and concerns about craft, trust, and labor

The story is also easy to follow, which makes it a useful explainer case for non-technical readers.

What the Source Says

The article reports that editor Chris Quinn created an "AI rewrite specialist" role to turn newsroom reporting into publishable drafts, with the specialist and the reporters checking accuracy before publication. CJR says the newsroom used an in-house chatbot from Advance Local, that quotes were among the outputs most likely to go wrong, and that editors said no hallucinations had reached publication. The piece also reports that reporters were producing roughly the same number of stories per week as before, but were recovering enough time for more field reporting.