People are becoming more relaxed about AI news. They have no idea.
This analysis by Mumbrella's editorial director Hal Crawford was first published in the University of Canberra's Digital News Report 2025 as the commentary for the chapter 'AI and News'.
Ask a large language model (LLM) like ChatGPT to choose a random number between 1 and 10, and it will choose 7.
So will between 25% and 45% of English speakers. That’s a big bias, but it’s nowhere near that of your typical LLM. I was at a conference recently where the presenter (Springboards.ai’s Pip Bingemann, at Humain) asked everyone to get their phones out and pose the random number question to the AI in their pockets. He said: “stand up if you got a seven”. Almost everyone in the room, more than 100 people, stood up.
The LLM in this case is just doing its job: predicting “what comes next” in a well-formed sentence. It is not a thinking machine. It is a consensus machine. That makes it wonderful at grammar and composing flowing sentences that feel right. In the context of news journalism, being a consensus machine is dangerous.
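The skew is easy to verify for yourself. Below is a minimal sketch, assuming the OpenAI Python SDK and an API key in your environment; the model name and sample count are arbitrary choices, and any chat model will show the same pattern.

```python
# Tally which "random" number an LLM picks between 1 and 10.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from collections import Counter

from openai import OpenAI

client = OpenAI()
counts = Counter()

for _ in range(50):  # 50 samples is plenty to see the skew
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # arbitrary choice of model
        messages=[{
            "role": "user",
            "content": "Randomly choose a number between 1 and 10. "
                       "Reply with the number only.",
        }],
    )
    counts[response.choices[0].message.content.strip()] += 1

print(counts.most_common())  # expect "7" to dominate the tally
```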
***
The University of Canberra’s Digital News Report 2025 examines AI and news in two distinct areas: the use of LLMs to surface and consume news, and the use of AI to produce news. Consumption and production may eventually merge, but we’re not there yet. For now, the people and organisations who make news remain distinct from the people who consume it.
The use of AI as a news interface – to distribute news for consumption – is limited. The report shows that the overwhelming majority of Australians have not used an AI chatbot for news in the past week: only 6% have, and just 1.4% say it is their main source of news. Use of AI for news is much higher among under-35s (13%), and even this probably understates AI’s influence on news consumption, as several AI-driven aggregation and summary apps (Particle, for example) hide mainstream LLMs behind a custom interface.
Despite that higher use among younger demographics, using AI to source news is not yet a mainstream practice, because the chatbots have not been designed or promoted for the purpose. There’s a good reason for that. While companies like OpenAI are still establishing themselves, they know better than to take on news media by competing directly. The path to global success pioneered by the big digital platform companies – Google and Meta – involves avoiding direct confrontation with traditional news media until firmly entrenched. So OpenAI does not offer a dedicated news interface, and the small apps that use LLMs to summarise and present news remain niche.
***
In news production, however, it’s a different story. Big players like News Corp and the ABC have moved to develop in-house AI tools for journalists. The number of tasks that LLMs and other generative-AI tools can handle is exploding, and as an editor I now regard their use as mandatory: document research and summaries, general fact queries, transcript creation, and the generation of illustrations and graphics. If you are not using these tools, you are wasting hours.
The problem for journalists is that it is so easy to forget you are dealing not with an intelligent agent but with a consensus machine. While most good models now use “grounding” to check the outputs of LLMs against real-world databases, sometimes those databases are compromised or the process is flawed. The LLM is still just giving you a string of well-predicted symbols. It doesn’t think, or know, or have a model of the world. Consensus is no measure of truth.
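To see how a flawed checking process lets errors through, consider this toy grounding check, invented purely for illustration. It passes a claim whenever most of its key terms appear in a trusted source – which means a claim that contradicts the source can still sail through. Production systems are far more sophisticated, but the failure mode has the same shape.

```python
# Toy "grounding" check, invented for illustration: a claim passes if it
# shares enough key terms with a trusted source. Note the failure mode:
# a claim that contradicts the source can still pass.

TRUSTED_SOURCES = [
    "The company reported revenue of $4.2m for the 2024 financial year.",
    "The CEO resigned in March 2024.",
]

def is_grounded(claim: str, sources: list[str]) -> bool:
    """Naive check: does any source share most of the claim's key terms?"""
    terms = {w.lower().strip(".,") for w in claim.split() if len(w) > 3}
    for source in sources:
        source_terms = {w.lower().strip(".,") for w in source.split()}
        if terms and len(terms & source_terms) / len(terms) > 0.6:
            return True
    return False

claims = [
    "The CEO resigned in March 2024.",                             # true: passes
    "Revenue grew to $6.8m in 2024.",                              # unsupported: flagged
    "Revenue of $4.2m was reported for the 2023 financial year.",  # wrong year: still passes
]

for claim in claims:
    verdict = "grounded" if is_grounded(claim, TRUSTED_SOURCES) else "UNSUPPORTED"
    print(f"{verdict:>11}  {claim}")
```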

For journalists on deadline, LLMs are incredibly alluring: quicker than a phone call and always plausible. But you must cross-check every fact given to you by an LLM, and under pressure it’s easy to forget the provenance of different parts of a story.
Another guideline I recommend: never use an LLM to write. These AIs specialise in the expected, the conventional, the boring.
***
The report found that 21% of all respondents were comfortable with news produced mainly by AI.
I don’t think this means what we think it means, because true AI news doesn’t really exist yet. Producing news involves reaching out to primary sources – people, courts, companies and so on – analysing and structuring the information, and then distributing it. That exists in rudimentary form now in stock exchange and sports data services. All the other AI news services – the ones a fifth of Australians seem comfortable with – are really just scraping and summarisation services. These paraphrasing factories leech off primary news providers rather than producing news with AI.
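Reduced to its essentials, a paraphrasing factory is only a few lines of code. The sketch below is illustrative, assuming the requests and openai Python packages and a placeholder URL; real services add polish, but the shape is the same: fetch someone else’s reporting, have an LLM restate it.

```python
# The essentials of a "paraphrasing factory": fetch someone else's
# reporting, have an LLM restate it. Assumes the requests and openai
# packages; the URL is a placeholder.
import requests
from openai import OpenAI

client = OpenAI()

# Raw HTML is passed straight through for brevity; real services strip
# boilerplate and process many articles in bulk.
html = requests.get("https://example.com/original-news-story", timeout=10).text

response = client.chat.completions.create(
    model="gpt-4o-mini",  # arbitrary choice of model
    messages=[{
        "role": "user",
        "content": "Summarise this news article in three sentences:\n\n" + html,
    }],
)
print(response.choices[0].message.content)
# No reporter made a call, no fact was checked; the original publisher
# carried all the cost.
```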
Reading through the results of this year’s Digital News Report, I get the strong sense that the Australian public is approaching AI in news according to existing attitudes to new technology, rather than any real understanding of its impact. Those predisposed to embrace the latest tech thing – the young, the male, the highly educated – are the most likely to accept AI in news. I don’t believe they, or any of us for that matter, really get how AI is going to transform news.
What we have in this year’s report, then, is an outline of a society only just beginning to come to terms with a technology that will have a very deep impact. Some people – a minority – are comfortable with where we are heading. The majority either haven’t thought about it or are uncomfortable. Either way, the use of AI in news production will increase, and eventually AI companies will move into direct competition with existing news providers. All of us, particularly those in the industry, are in for a hell of a ride.