📱🤖AI, Accountability & The Human Edge
From AI writing limits to newsroom workflows, deep-sea mining debates and data-driven investigations — the stories shaping journalism right now
Hello from the Bench!
This week, we’re exploring where technology meets humanity. As AI tools grow more sophisticated, what still makes storytelling distinctly human?
We look at the limits of AI writing, how newsrooms are using generative tools responsibly, what teens are really doing with chatbots, and how data and photography continue to ground journalism in accountability and shared experience.
Let’s dive in!
Here is our featured content this week:
🤖✍🏽AI writing is improving — but it can’t replace the process that makes writing human
This piece argues that although AI writing tools — like those from OpenAI — are producing increasingly polished text, they can’t replicate the deliberate creative process that makes writing distinctly human.
Grace Phillips, the author, and other quoted experts say AI can draft, research and edit quickly, but lacks depth, intentionality and critical thinking. The story highlights concerns about students’ over-reliance on AI, declining writing skills, and the risk of diminished literacy if institutions don’t emphasize rigorous writing instruction. It concludes that human investment in writing remains essential even as AI evolves.

Photo by Kaitlyn Baker on Unsplash
📰⚙️What actually works: Clare Spencer on generative AI use in newsrooms
In this Q&A, reporter Clare Spencer discusses practical, responsible uses of generative AI in newsroom workflows. She identifies machine transcription and translation as tools that save time and expand reporting capacity, while cautioning that inaccuracies and hallucinations make careful human oversight critical.
Spencer emphasizes aligning AI tasks with appropriate human expertise, avoiding false “human in the loop” assurances, and using tools where they provide real value. She urges journalists to maintain critical thinking, transparency and a willingness to explore new platforms.

Transcription tools
Cool Stuff Corner: What are we reading?
🌊⛏️Will mining seabeds be a disaster?
As demand surges for minerals used in electric vehicles and renewable energy, companies are pushing to mine the deep seabed for metals like cobalt and nickel. Scientists warn the practice could damage fragile, little-studied ocean ecosystems, while global regulators debate whether to allow commercial extraction. Check this out

Yellow: Cobalt-rich crusts; Purple: Polymetallic nodules
Did you know? 💡
A new Pew Research Center survey shows a majority of U.S. teens say students at their school use AI chatbots to cheat at least “somewhat often,” even though most teens use the tools chiefly for research and schoolwork help.
About two-thirds of teens report using AI like ChatGPT or Copilot, and 12% say they’ve used chatbots for emotional support — a use most parents disapprove of. Read more here
From the Vault 🏛️
📷❤️“You have to give something of yourself”: The New York Times’ James Estrin on finding shared humanity through photography
This interview with The New York Times photographer James Estrin explores how photography fosters connection and trust. Estrin describes his path into photojournalism and the importance of storytelling that reveals shared human experiences.
He reflects on his work — including award-winning projects — and explains that gaining subjects’ trust requires sincerity, humility and emotional investment. Estrin argues that powerful imagery bridges differences and highlights commonalities across communities.

Sheikh Reda Shata walking to his Bay Ridge mosque. Photojournalist James Estrin was part of a team that won a 2001 Pulitzer Prize for the series “How Race Is Lived In America.” Photo credit: James Estrin/The New York Times.
📊💻How to use R to analyze racial profiling at police stops
This in-depth tutorial walks through using the R programming language to analyze racial profiling in police traffic stops, based on data from Ohio’s largest cities. It explains data collection, cleaning and statistical techniques, including the “veil of darkness” test to assess whether race influences stop rates when officers can’t easily see a driver’s race.
The guide breaks down code, methodology and caveats, offering a practical resource for journalists and data analysts investigating bias in policing.
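To give a flavor of the “veil of darkness” idea, here is a minimal sketch in Python (the tutorial itself uses R). It compares the share of stops involving minority drivers during daylight versus darkness within the same clock-time window, so that visibility is the main thing that changes. The stop records below are hypothetical, not drawn from the Eye on Ohio data, and a real analysis would use regression with controls rather than raw shares.

```python
# Hedged sketch of the "veil of darkness" test: within the same clock-time
# window, compare the minority share of stops made in daylight vs. darkness.
# If officers can see race in daylight but not darkness, a higher daylight
# share is consistent with profiling. All data below is made up.

def veil_of_darkness_shares(stops):
    """Return (daylight_share, darkness_share) of stops with minority=True.

    Each stop is a dict with keys 'dark' (bool) and 'minority' (bool).
    All stops are assumed to fall in the inter-twilight window: clock
    times that are light part of the year and dark the rest.
    """
    def share(subset):
        return sum(s["minority"] for s in subset) / len(subset) if subset else 0.0

    daylight = [s for s in stops if not s["dark"]]
    darkness = [s for s in stops if s["dark"]]
    return share(daylight), share(darkness)

# Hypothetical stops, all in the same 7-8 p.m. window across the year
stops = (
    [{"dark": False, "minority": True}] * 30
    + [{"dark": False, "minority": False}] * 70
    + [{"dark": True, "minority": True}] * 20
    + [{"dark": True, "minority": False}] * 80
)

day_share, night_share = veil_of_darkness_shares(stops)
print(f"daylight: {day_share:.2f}, darkness: {night_share:.2f}")
```

With these invented numbers, 30% of daylight stops involve minority drivers versus 20% in darkness; the gap, not either number alone, is what the test examines.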

An investigation by Eye on Ohio found data pointing to racial profiling in police stops in the state's largest cities. Photo Scott Davidson/Wikimedia Commons
That's all we've got for this week! Thanks for reading, and let us know if there's anything you'd like to see in these newsletters or in our coverage at [email protected].
And follow us on Instagram, Twitter (or X, or whatever) and LinkedIn for live updates on stories each week!