7 bitter pills (and one sweet) to swallow if you want AI in your newsroom
11 Apr 2023
AI, like all technological breakthroughs, also has a less pretty face (photo generated by Midjourney)
Written by Joanna Kocik, content specialist at Autentika
Charlie Beckett, the Director of the Journalism AI Project at the London School of Economics and Political Science, shares with Autentika what every media professional should know about AI implementation.
Few specialists are as experienced in AI in journalism as Charlie Beckett, a professor in the Department of Media and Communications at LSE, and the founding director of Polis, the London School of Economics International Journalism Institute. He has spent the last five years researching how artificial intelligence transforms journalism worldwide, enabling news media to gather, create, and distribute content more efficiently.
We recently discussed generative AI and concluded that, as much as it's the new wave of technological disruption in media (after online and social), this hype – like all technological breakthroughs – also has a less pretty face. Beyond the potential to help media professionals address their challenges, it brings risks and raises questions about things we're still missing.
Here are seven "hard to swallow" pills (sugar-coated in some cases) from our conversation with Charlie, followed by one fact that brightens up the scene.
1) We all need to learn
Most news organisations do not have all the knowledge, skills, and resources to implement AI. But with some basic AI literacy, it's already possible to be inspired by and learn from others. It would be foolish to expect everyone to become an expert overnight. This goes for both people responsible for innovations in newsrooms and journalists who benefit from (or struggle with) those innovations.
As Charlie told us, we all need to learn AI – just as we learned how to use social media a few years ago. "Journalists haven't thought about AI until now. They thought they'd ask politicians questions, write them down, and publish news. Now they can work with AI, which is almost like having an assistant that you ask, 'Transcribe the audio' or 'Find me a picture that goes with my story.' And those are the skills that journalists need to learn: how to prompt AI to give us the results we want," he said.
He notes that we still need more guidelines and playbooks for using AI in journalism, although several organisations have already shared their insights and resources. He predicts that news organisations will need to work closely with think tanks, technology companies and startups to accelerate AI development and support innovation.
2) Implementation is difficult
Many newsrooms have reservations about implementing AI. Most often, we hear that media "do not see the need to invest in AI right now," that they got burned by previous solutions that were shelved after a while, or that the timing simply isn't right to spend time and resources on AI innovation because they're struggling with other challenges.
It's true that custom solutions, in particular, cost a lot of time and money to integrate, test and iterate. Innovation managers need to sift through dozens of tools and choose the right ones, and journalists need to learn how to use them.
According to Charlie Beckett, there are also cultural barriers in the workplace to adopting AI. Journalists and media professionals are afraid of "their jobs being taken over by robots" and fear they'll have more work to do and higher goals to achieve. Adopting AI is a digital transformation – and as such, it's a challenging process. Change is complex, and there is always resistance to it. It takes time and effort to realise that technology is there to solve problems – not to get in the way of anyone's work.
In many organisations, there are both technical and cultural barriers to adopting AI on a wider scale.
3) You need a strategy
JournalismAI's 2019 survey of 71 news organisations in 32 countries found that only about a third had an AI strategy. Today, that percentage is likely higher, but many media organisations still need a more coherent innovation vision. As one survey participant said, "It's not surprising that we don't have an AI strategy – we don't really have a strategy for anything."
Of course, it's not that simple – you probably already have 50 people knocking on your door every day trying to sell you a new "wonderful tool". In such a situation, you can talk to someone you trust, usually in another media company that is a bit more mature and has more experience with digital innovation, or brainstorm with a trusted consulting team.
Also, think tanks like Polis offer AI starter packs that you can spend a day with to get a feel for what AI can and can't do for you. "If you haven't done it yet, you're missing out," Beckett says. "And at the beginning, you don't have to be particularly original and start from scratch. You can look at what others are doing and copy them," he adds.
What matters is what you want to use AI for.
"However, if you're using AI to build new relationships with your audience, launch new products, or get your journalists to do higher quality work, then I think you have a chance of adding value in the end," Charlie concludes.
4) There will be no immediate return on investment
We have said this more than once, but we will repeat it: AI is not a magic wand that will solve your business problems. And that's precisely what we heard from Charlie: "The idea that you can plug something in and save money within a week is intangible," he said.
Thinking you can replace humans with AI and cut staff is also a trap. As some local newsrooms report, using robots doesn't mean they no longer need journalists. The opposite is true – they're hiring more people to cover the stories that AI can't.
Of course, AI will likely allow humans to work faster, automate repetitive tasks, and help media manage subscriptions, newsletters, and push notifications. But the decision to implement specific tools needs to fit into your system. It means that when using AI, you need employees who understand it, who know how to use it, and who will benefit from it in their work. Also, remember that AI isn't there to solve your business model problem (no technology can). "However, it can have a big impact on efficiency. And then you have to decide what you want to do with that efficiency," Beckett says.
Your employees need to understand why you've decided to use AI – and how it works.
5) Inequalities will persist
AI risks creating a new digital divide in the media world – and the disparity may be greater than anyone expected. Certain markets and organisations have always had greater resources; AI will magnify that advantage.
"Bigger organisations will get even bigger," says Charlie Beckett. He predicts that hegemons will buy up small and mid-sized companies, and those mid-sized companies that survive will likely shrink. "On the other hand, I see a lot of specialised, hyperlocal news segmented by genre or audience interest. They, too, have some potential to grow through AI," he says. Another factor is that now English-language media have an advantage over others because AI language models work best in English.
Beckett also notes that some newsrooms had fallen out of the technology race before it began.
6) We don't know what's coming up next
When we asked Charlie about possible future scenarios for AI development, he gave us a little thought exercise. Now we're going to play it with you.
Go back five years. Facebook was still important to journalists as a source of topics and a distribution channel. TikTok had barely arrived. No one had heard of the pandemic. Hardly anyone was working remotely.
In the last five years, we've seen a lot of things we couldn't even dream of. Five years is an aeon in technology – and it'll be the same with AI.
At the same time, Beckett points out that the core of journalism is providing information in a way that people can access and understand – and that's not going to change. "I bet I could walk into any newsroom today and be a productive journalist again in a matter of days, even though the last time I worked in that profession was 17 years ago," Charlie says. "The tools and systems are different, so I'd have to learn them, but it wouldn't take long. I could probably learn TikTok in an afternoon," Beckett adds.
7) We'll have to address bias, filter bubbles and ethical dangers of AI
The use of AI in journalism is not without its potential ethical problems. One issue is the possibility of biased algorithms, which can occur due to biased human decisions present in the training data, even when variables such as gender, race, or sexual orientation are removed. Faulty data sampling, where groups are over- or under-represented, can also contribute to bias.
Another concern is the development of "filter bubbles," where personalised stories fed to users can lead to a lack of content diversity, a narrow view of the world and confirmation bias. While this is not yet a widespread problem, it could become one if media outlets focus on delivering only content that users like. Finally, AI's legal and ethical aspects, including privacy and data security, remain a grey area that newsrooms may need to address in the future. Some may even seek legal counsel with expertise in AI.
8) We'll still need humans
So here's the sweet pill – or a sugar cube that sweetens the whole cup. AI will not replace human journalists. We will not witness a storm of automated stories spit out by tireless algorithms that never take sick leave and have no children to care for.
"AI will augment the work of journalists. Certain things will be done at scale (my favourite example is horoscopes), but we will still need human judgement for more complex tasks. People will still make decisions: Is this story correct? Is this headline accurate? We will need a human in this machine loop," says Charlie Beckett.
He also points out that people love to be entertained, learn about the world, and hear what's happening. "And they subscribe to a newspaper because they somewhat identify with it. They feel they can rely on it. And they think it's entertaining and interesting and shares their values to some degree. That's a human thing. That's not an algorithm. At the end of the day, that's a human relationship."
And with that thought, we'd like to leave you. If you have any comments, we're open to discussing practical applications of AI in newsrooms, proofs of concept, or other ideas. We're here for you.
This text was written by a human. We tried to engage ChatGPT to help us, but it wasn't useful at all.