
The impact of AI has been evident in nearly every field of work, with some companies and academic institutions, including the University of Exeter Business School, introducing training courses for their staff on how to use AI effectively and responsibly.
In journalism, AI has commonly been adopted as a replacement for shorthand, the abbreviated writing system journalists use to record fast-paced speech quickly and efficiently, a task now handled by speech-to-text transcription software.
AI tools for summarising reports and cross-referencing sources also make it easier for journalists to keep up to date with relevant content, while AI-generated video recaps help sports journalists analyse recent matches and games.
Yet there is also growing debate over the extent to which journalism should continue to adopt AI in its content. Though summarising or cross-referencing might seem harmless, even the popular chatbot ChatGPT warns at the bottom of its page, “ChatGPT can make mistakes. Check important info.”
In reality, just as we are told in our university modules, the use of AI, however harmless and even useful at times, risks plagiarism or omitted references.
To me, the art of journalism lies partly in the writing itself: the ability to report accurately or to convey a particular view to the reader. It’s what made me want to be a journalist in the first place; I want to write and have my work read by others. With generative AI, that originality and ability to inform or convey risk being limited or dismissed completely.
AI is a good tool for indirect uses such as those mentioned above. We cannot, however, rely on it to produce informative and, where needed, emotionally resonant articles; otherwise, the case for writing jobs, in journalism and research alike, falls away.
The world must continue to adapt to new technology, but an important distinction must be drawn between which uses of generative AI are acceptable and which cross a line.
According to recent reports, as circulations of traditional newspapers fall sharply, online media outlets are increasingly likely to turn to AI to win the “competition for audience attention”. And from personal experience, whoever you ask will have a different view on the matter: should we allow AI in all aspects of the news? Some of them, or none at all?
Recent initiatives aimed at defining the appropriate use of AI within journalism include ‘Journalism AI’ from LSE’s think tank Polis, CUNY’s ‘AI Journalism Lab’ and WAN-IFRA’s ‘AI Unlocked’, among many others, with more surely to follow. Until clear boundaries are set out and adopted by major news outlets, however, I remain of the belief that journalism, like any other writing-oriented job, relies on human input, and that simply cannot be matched or replicated by AI alone.