Summary

Reuters Institute documented one of the clearest small-newsroom AI investigation workflows in the archive. The Colonist Report used ChatGPT and Gemini to sift more than 3,000 pages of government material on flood support, compared answers across the two models, verified page citations against the source documents, and so accelerated a document-heavy accountability investigation that would otherwise have exceeded the newsroom's capacity.

Why It Matters

This is a valuable legacy reference because it shows a constrained, defensible way to use AI in investigations:

  • use AI to expand document-handling capacity, not replace reporting
  • force the model to point back to source documents and page locations
  • compare outputs across tools instead of trusting one system
  • treat human review as mandatory, especially for factual and legal risk checks

It is a stronger workflow story than generic "AI helps newsrooms" coverage because the article explains what the newsroom actually did at each step.

Investigator Workflow

This maps closely to private-investigator document review and closed-corpus evidence analysis.

The task is concrete: ingest a large document set, ask the model to surface the relevant passages and page numbers, verify those citations against the originals, compare two models when one produces a dubious answer, and use the results to focus human investigation time. The maturity is `advanced workflow`. The journalism workflow is source-stated; the PI application is an internal inference, but an unusually strong one because the same method fits grant records, procurement files, billing records, and case-document review.
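The verify-and-cross-check steps above can be sketched in code. This is a minimal illustration, not the newsroom's actual tooling: `Citation`, `verify_citation`, and `cross_check` are hypothetical names, and the model outputs are assumed to arrive as quote-plus-page pairs. The point it demonstrates is the rule from the article: a citation counts only if it checks out against the source pages, and disagreement between models routes the item to a human.

```python
from dataclasses import dataclass


@dataclass
class Citation:
    quote: str  # passage the model attributes to the documents
    page: int   # 1-based page number the model claims


def verify_citation(pages: list[str], cit: Citation) -> bool:
    """True only if the quoted text actually appears on the claimed page."""
    if not 1 <= cit.page <= len(pages):
        return False  # e.g. a hallucinated page number
    return cit.quote.lower() in pages[cit.page - 1].lower()


def cross_check(pages: list[str],
                citations_a: list[Citation],
                citations_b: list[Citation]):
    """Accept citations that verify against the source AND that a second
    model independently agrees with; everything else goes to human review."""
    verified, needs_review = [], []
    for cit in citations_a:
        agreed = any(c.page == cit.page and c.quote == cit.quote
                     for c in citations_b)
        if agreed and verify_citation(pages, cit):
            verified.append(cit)
        else:
            needs_review.append(cit)
    return verified, needs_review


# Toy corpus standing in for the 3,000-page government document set.
pages = ["Flood relief budget: N500m allocated.",
         "Disbursement records for 2023."]
model_a = [Citation("N500m allocated", 1), Citation("disbursement", 3)]
model_b = [Citation("N500m allocated", 1)]
verified, needs_review = cross_check(pages, model_a, model_b)
```

In this toy run the agreed, page-accurate citation is accepted, while the out-of-range page number lands in the review queue, which mirrors the article's human-in-the-loop rule: AI narrows where investigators look, but a person confirms every reference before it is trusted.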

What the Source Says

According to Reuters Institute, the newsroom used AI to identify and analyze government budget and support documents connected to flooding in Rivers State after a local community had already raised concerns. The founder described changing prompts so the tools would search verified government sites instead of general web material, asking for page-specific funding details, and checking those page references manually. In one example, ChatGPT returned the wrong page number, which Gemini helped correct, reinforcing the newsroom's human-in-the-loop rule. The newsroom also used AI for visualization support and asked models to flag possible legal issues before publication, while explicitly rejecting the idea that AI should replace reporting or final editorial judgment.