The news industry has undergone massive upheaval with the evolution of the digital world. Jenny Darmody spoke with Poynter’s Alex Mahadevan about what this upheaval means for the future of journalism.
The way we consume news has changed irrevocably over the years. Where newspapers were once our main source of information, we now often look to computer screens, and in the last few years in particular, what is on those screens has changed too.
The media industry has always had to compete with its peers, just like many other industries. But in the digital world, it also has to compete with every other site on the internet that produces some form of content.
Even when it’s news content that consumers are looking for, they’re no longer necessarily going to trusted media outlets online, but to social media platforms. As just one example, a report published in 2023 showed that in Ireland, nearly 40pc of people aged between 18 and 24 chose social media as their main news source.
And in the last year, generative AI tools have become freely available, raising concerns about how content might be created in the future and how much it can be trusted. With that whistle-stop tour, it’s clear to see that the evolution of the digital world and its accompanying technology are the main drivers behind the changing media industry. One person who has closely watched these changing cycles is Alex Mahadevan, director of MediaWise at the Poynter Institute, a nonprofit media institute and newsroom. MediaWise is the institute’s digital media literacy initiative, which teaches people how to factcheck and spot misinformation in a digital world.
Mahadevan started his career as a jack-of-all-trades at newspapers in Florida. He worked as a local reporter from 2011, and in the years that immediately followed, a seismic shift was occurring both in terms of Web 2.0 and in terms of news consumption in general.
“News outlets were kind of struggling to figure out what the hell they were doing. And so from like 2012 to 2014, I was a local reporter, but I was also the youngest person on staff because it was kind of an older weekly newspaper,” he said. “They looked to me to say, ‘Hey, what should we do on our Facebook? Instagram? Oh, what is Twitter?’ So, I kind of became the de facto audience engagement person along with my job.”
The rise of fake news
By 2015 and 2016, Mahadevan was overseeing websites for more than five local news outlets as well as all of the corresponding social media. This is where he saw the rise of fake news, particularly working in the US.
“I was seeing how consumers were starting to seek out fake news and then share misinformation, whether it was through Facebook comments or comments on our website. And so I watched this new shift in the information ecosystem to this fake news world from the local level. So, I was watching these national narratives kind of trickle down to the local level and it was just really bizarre.”
It has been nearly a decade since then and, unsurprisingly, Mahadevan said the problem of fake news and misinformation has definitely gotten worse. “It’s gotten easier to make and distribute misinformation,” he said. “With generative AI or any other tools, I can edit a video clip on my phone in five minutes and share it on Twitter. So, it’s gotten a lot easier to create fake news.”
He also said people are becoming more polarised, both in the US and globally. “That kind of exacerbates confirmation bias. So, people are kind of existing in these separate spaces online. And if you exist in these separate spaces online, you’re more likely to seek out sources that confirm your beliefs.”
And with arguably the biggest election year in history on the cards, there’s a big incentive to create misinformation in a bid to sway votes, and advancements in generative AI and deep fakes are only set to make that easier.
Media’s relationship with technology
The nature of news production online means it is beholden to the changing trends and whims of its audiences. That means meeting the readers, listeners and viewers where they are – be it YouTube, Instagram or TikTok – and also rolling with the changing tacks of Big Tech along with its corresponding algorithms, priorities and sometimes the whims of its chaotic leaders.
For example, after taking over Twitter – now X – in 2022, Elon Musk suspended the accounts of several journalists, claiming they “violated the Twitter rules”. He also changed how certain news outlets were labelled in order to indicate those that were believed to be ‘government funded’. This was later scrapped following backlash.
In a broader battle, Google and Meta went head-to-head with Canada last year over the country’s new Online News Act, which seeks to bring “fair revenue sharing” between digital platforms and news outlets. In response, both tech players said they would not pay for news, and the dispute even resulted in Meta blocking news content on Facebook and Instagram for Canadians. However, in November 2023, a deal was reached with Google.
These constantly changing tides can make it hard for news outlets – especially smaller ones with far lower resources than the major conglomerates – to suddenly pivot to the next new trend. But Mahadevan warned that news production has always had a problem with keeping up with technological shifts.
“Time and time again, what we’ve seen is news outlets will see a new technological change – so it was Facebook in 2011/2012 – [they’ll] be really slow to adapt, and adapt only based on feedback from the platform, the platform saying you should put video on here. And then eventually, by the time you pour all your resources in to do it, it’s too late and people are on a completely different platform.”
He added that, having seen this issue happen over and over again, he’s concerned about what that means for AI and worries that news organisations are going to fall behind in that respect.
AI and journalism ethics
There’s a lot of understandable trepidation and fear around AI within the journalism industry. The New York Times is currently embroiled in a legal battle with OpenAI, claiming that AI models such as ChatGPT have copied and used millions of copyrighted news articles, in-depth investigations and other journalistic work – a claim OpenAI is fighting.
And content farms – websites that scrape other sites’ content to reproduce it as their own – have now been supercharged with the tools to not only scrape the content but use AI to rewrite it slightly.
But these issues are not new for the journalism industry, and Mahadevan said the key thing to focus on right now, something that was perhaps not done during the Facebook era, is to start thinking about the ethics and use cases of AI from the get-go.
“Right now is the time where US outlets need to be setting up these ethical frameworks for how they’re going to use it and how they’re going to use it ethically. The key to that is to start experimenting and put together committees that can start experimenting, because the key is first figuring out, ‘how can we use this to make it make sense for our newsroom?’”
Plenty of organisations are doing this already. The Associated Press published its guidelines on AI in August 2023. At Silicon Republic, we also created a set of editorial guidelines that does not shy away from experimentation but explicitly states the importance of not using AI to write articles for the site and, above all else, of being extremely transparent so that our readers can continue to trust what they are reading.
“I think the big question now is how much AI work will audiences see? And I don’t know what the right answer is for that. Sports Illustrated is the latest in a long string of organisations that have created gen AI articles. In Sports Illustrated’s case, completely generative AI, fake journalists, and they’ve really been burned for it. The reason is, they did so unethically. They didn’t do it transparently,” said Mahadevan.
“News organisations need to figure out how much AI their audiences are comfortable with and then move on to the ethics… the key actually to all of this is just there needs to be someone in the newsroom who’s experimenting with these tools right now.”
Media literacy and fact-checking in the digital age
Mahadevan started with MediaWise in 2019 as a media literacy trainer, a skill that is becoming more important every day as clickbait headlines and unverified information make their way into the online content that readers consume.
With generative AI producing ‘hallucinations’ and presenting them as facts, and deep fakes becoming more common, how can both media professionals and news consumers separate fact from fiction? From the journalism side, Mahadevan said it’s important that news outlets spend more time on factchecking and debunking false information within their coverage.
“It’s not necessarily the sexiest thing to do, but there are also lots of studies that show that audiences like factchecks. One thing that I’ve encouraged a lot of news outlets to do is when they’re covering a certain topic that involves a lot of misinformation, spending a lot of time explaining why this misinformation is spreading,” he said. “It’s the idea of ‘pre-bunking’, explaining to people why they’re being manipulated by this.”
He also said news outlets should listen to reporters who spend a lot of time in online spaces where misinformation thrives, citing an example of a reporter who was hearing a lot of online chatter at the end of 2020 about people who were planning to go to Washington. She flagged it with her editor who said it was more than likely people on the fringes and not to worry about it. “We all know what ultimately happened,” said Mahadevan, referring to the Capitol riots on 6 January.
“All of the mainstream misinformation that you see starts on the fringes, and the distance between the fringe and the mainstream is getting smaller and smaller. So now it’s a lot quicker for an idea to start spreading on Telegram and Discord and it gets to mainstream a lot quicker than it used to.”
At MediaWise, Mahadevan and his team aim to create the quickest media literacy interventions to reach people where they are, helping them learn how to seek out good sources, how to follow information to the original source, how to do research on Wikipedia and more.
“We create teen factchecking network videos on TikTok for young people, so we hope that we can reach as many young people as possible there. For older adults, we used to do webinars on Facebook, and we’re doing a programme with libraries. But the problem is it’s very expensive to do,” he said. “So my challenge is always how can we reach the most people in an impactful way with the resources that we have?”
What does the future hold for journalism?
Between generative AI, increasingly sophisticated disinformation and major changes in news consumption trends, what will journalism and news production look like a decade or more from now? One change Mahadevan sees is hypercharged recommendation algorithms for all news, which will also be hyperlocalised. “So anything I read will be completely different from what you read, down to articles from the exact same news outlet being structured differently based on who is reading.”
For anyone who has noticed how sensitive some of our social media platforms are to our preferences, this could mean that people are further sucked into their own echo chambers and confirmation bias, missing out on news that they may not ‘like’ but that is truly important.
“We need to figure out how to get people the news that is important and necessary and avoid clickbait and I think it comes down to re-evaluating the business of news,” he said. “Nonprofit news outlets are in a very good position… they do not have to create clickbait. They cover stories that are important to people, that are impactful, that bolster the public good. They are able to do that because they are funded from donations.”
News outlets that are funded by the public are not beholden to traditional advertising models and the strings that come with them. However, they are still beholden to money and therefore are often at the mercy of economic headwinds. And when you consider the idea of nationalised news outlets, political issues can come into play. Figuring this out will take time, but it’s undoubtedly something that will change how news is created in the future.
“I think another big shift in the news industry too, is I think you’re still going to see news outlets figure out how to leverage their journalists as influencers or partner with influencers to make good journalism with them,” Mahadevan added. “AI is going to hypercharge that because what I think is in 15 years, half of [influencers’] content will be completely synthetic. They’ll just create half of their videos with deep fakes… and so I think you’re just going to see so much influencer content and then that’s where the majority of people are going to be getting their news.”