“Media companies continue to bet on artificial intelligence as a way of delivering more personalized experiences and greater production efficiency. More than eight-in-ten of our sample say these technologies will be important for better content recommendations (85%) and newsroom automation (81%). More than two-thirds (69%) see AI as critical on the business side in helping to attract and retain customers.”
—Reuters Institute’s Journalism, Media, and Technology Trends and Predictions 2022
Data scientists at The Guardian joined forces with Agence France-Presse to find a better way to identify and attribute quotes, using machine learning. “This could help create new beats focused on accountability reporting like tracking how public personalities’ opinion changes over time by searching and comparing their quotes from archives,” reported Journalism.co.uk.
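The Guardian/AFP project relies on trained machine-learning models, but the underlying task can be illustrated with a toy rule-based pass. The sketch below is purely illustrative (the regex, function name, and "quote, said Name" pattern are assumptions, not the project's actual method):

```python
import re

# Toy pattern: matches constructions like  "quote," said Firstname Lastname
# Real quote-attribution systems use trained ML models, not regexes.
QUOTE_RE = re.compile(
    r'"([^"]+)"\s*(?:said|says)\s+([A-Z][\w-]*(?:\s+[A-Z][\w-]*)*)'
)

def extract_quotes(text):
    """Return naive (speaker, quote) pairs found in a passage of text."""
    return [(m.group(2), m.group(1).rstrip(",")) for m in QUOTE_RE.finditer(text)]
```

A rule-based pass like this breaks down quickly on indirect speech, pronouns ("she said"), and quotes split across sentences, which is exactly why the accountability-reporting use case calls for machine learning.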
On Aug. 25, we delved further into this in the first episode of a new webinar series called Signature Live with publisher Carla Kalogeridis. The webinar focused on the realities of artificial intelligence for publishers and media companies, both from a content and editorial perspective and from an ethics perspective.
One of Carla’s guests was Paul Lekas, senior vice president, global public policy, for SIIA. “Some publishers recognize the efficiencies and consistencies that AI can bring, but they aren’t completely sold on the ethics of it in certain types of work,” Lekas said. “The biases of the programmer become the biases of the AI.
“I see a role for it on the back end, but some publishers may be concerned about the ethics of content generation. AI can help publishers generate content, but the ethical questions come in when AI is doing something that humans should be doing. You might use AI to generate a first draft, but there’s the potential for unintended bias if you don’t bring in a human to finish it up.”
The other guest on Thursday was David Brake, COO of PageMajik. “If you imagine the roles of all the players in a publishing workflow as both a source of data and someone who acts upon data, you’ve taken a step toward recognizing narrow AI and smart technology as a solution,” Brake said. “By automating the exchange of data and using a technology that acts upon that data on a role player’s behalf, you’ve put both feet in the water.”
Here are other ways that AI is already positively impacting what we do.
Go through vast amounts of data. Emilia Díaz Struck, research editor and Latin American coordinator for the International Consortium of Investigative Journalists (ICIJ), said they use machines to comb through large datasets and flag anything unusual. Díaz Struck and her team used this technique during their Implant Files investigation, which “uncovered hundreds of patient deaths that were misclassified as medical device malfunction.” In a Q&A with JournalismAI, she said: “What we have figured out is that AI can be really powerful but it’s not magic. I see there is a lot of potential to use machine learning in the type of work we do because we deal with vast amounts of data.”
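The simplest version of “flag anything unusual” in a numeric dataset is outlier detection. Here is a minimal z-score sketch, assuming records arrive as dictionaries with a numeric field; the field names and threshold are hypothetical, and this is not ICIJ’s actual pipeline:

```python
import statistics

def flag_unusual(records, field, threshold=3.0):
    """Flag records whose value in `field` sits far from the mean,
    measured in standard deviations (a simple z-score test)."""
    values = [r[field] for r in records]
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # all values identical: nothing is unusual
        return []
    return [r for r in records if abs(r[field] - mean) / stdev > threshold]
```

Investigations like the Implant Files work with messier, largely textual data, so real pipelines combine statistical screens like this with trained classifiers and, crucially, human review of everything flagged.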
Track the diversity of sources quoted in a publication, and of its bylines. This has become hugely important amid the reckoning the industry has faced over the last year. Evolving gender pronouns and the need for richer ethnicity metadata mean these systems will require continual refinement. “We want to advance the measurement science for being able to not only understand what [fair AI] requirements and practices are but be able to measure them in quantitative or qualitative work,” said Elham Tabassi, chief of staff, IT Lab, NIST.
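At its core, this kind of tracking is a tally over metadata. The sketch below assumes a hypothetical schema in which each article carries author records with a self-reported `gender` field; the same shape works for source or ethnicity metadata:

```python
from collections import Counter

def byline_breakdown(articles):
    """Tally self-reported gender metadata across all bylines.

    Records without the field are counted as 'unknown' rather than
    guessed at -- inferring identity attributes is exactly the kind of
    bias risk these audits are meant to surface.
    """
    counts = Counter()
    for article in articles:
        for author in article.get("authors", []):
            counts[author.get("gender", "unknown")] += 1
    return counts
```

The hard part in practice is not the counting but the metadata: collecting self-reported attributes ethically and keeping categories current as language evolves.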
Search through archived material. “[Sweden’s] Sveriges Radio’s global news podcast recently talked about new developments in the murder of the Swedish prime minister in the 1980s,” reports Journalism.co.uk. “However, the producers soon realized some of their listeners were not even born yet and knew little about the case. To help them out, the team used soundbites from archived shows that featured important information… Taken further, the AI system could recommend more soundbites to listen to according to users’ preferences, instead of suggesting entire shows, making audio content more engaging and relevant.”
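Recommending individual soundbites rather than whole shows can be sketched as a ranking problem over tagged archive clips. This is a minimal keyword-overlap scorer, assuming hypothetical `tags` metadata on each clip; Sveriges Radio’s actual system is not described in that detail:

```python
def recommend_soundbites(user_keywords, soundbites, top_n=3):
    """Rank archived soundbites by tag overlap with a listener's interests,
    returning the top matches (clips with no overlap are dropped)."""
    def score(clip):
        return len(set(clip["tags"]) & set(user_keywords))
    ranked = sorted(soundbites, key=score, reverse=True)
    return [clip for clip in ranked if score(clip) > 0][:top_n]
```

A production recommender would replace tag overlap with learned embeddings of transcripts and listening history, but the shape of the problem (score, rank, surface short clips) is the same.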
Improve the tools that we use. Julie Babayan, senior manager for government relations and public policy for Adobe, gave an example from Photoshop of how having diverse representation led to spotting a bias in the development of their new Neural Filters—allowing them to update the AI data set before the feature was released. “We spent a year-long process to make sure that we were developing AI technologies in an ethical and responsible and inclusive way for our customers and for our communities,” she said.
Scrape big datasets. Last year, Gabriel Kahn, a journalism professor at USC Annenberg School for Journalism and publisher of Crosstown, a non-profit community project, said they use AI-powered tools to scrape public datasets and store content in the cloud, reported Journalism.co.uk. “Humans then turn that data into narratives that address people’s concerns around topics as varied as crime, traffic, air pollution or coronavirus. By having a location tag on each piece of data, every story can be turned into neighborhood news.”
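The location-tag step that Kahn describes can be illustrated with a simple point-in-region lookup. The sketch below uses bounding boxes as a stand-in; real projects use full polygon geometries (e.g. GeoJSON neighborhood boundaries), and the neighborhood names and coordinates here are assumptions for illustration:

```python
def tag_neighborhood(lat, lon, boundaries):
    """Assign a record's coordinates to a named neighborhood.

    `boundaries` maps a name to a (min_lat, min_lon, max_lat, max_lon)
    bounding box -- a simplification of real polygon geometries.
    Returns None when the point falls outside every region.
    """
    for name, (min_lat, min_lon, max_lat, max_lon) in boundaries.items():
        if min_lat <= lat <= max_lat and min_lon <= lon <= max_lon:
            return name
    return None
```

Once every scraped record carries a neighborhood tag like this, filtering the same citywide dataset per neighborhood is what turns one story into many pieces of local news.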
One other great resource is JournalismAI, a global initiative that aims to inform media organizations about the potential offered by AI-powered technologies. It is a project of Polis—the London School of Economics journalism think-tank—supported by the Google News Initiative.