How Journalists Can Demystify AI's Role in Politics and Elections
Artificial intelligence (AI) is already influencing many aspects of our lives. Its growing role in politics has raised concerns about the spread of misinformation, especially during pivotal elections. With Argentina's recent elections drawing global attention, the intersection of AI and politics is increasingly pertinent.
At a recent investigative masterclass held as part of ICFJ's Disarming Disinformation program, AP global investigative journalist Garance Burke discussed how to report effectively on AI's influence in politics.
Understanding AI begins with fundamental terms. AI refers to computer systems that emulate aspects of human cognition but fall short of replicating human intelligence. Algorithms are sequences of instructions that guide problem-solving, powering everything from speech recognition to predictive tools. Large language models (LLMs) like ChatGPT learn statistical patterns from vast amounts of text and use those patterns to generate plausible responses.
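To make "using data patterns to generate responses" concrete, consider a deliberately simplified sketch: a toy program that counts which words tend to follow which in a small sample of text, then uses those counts to continue a prompt. Real LLMs rely on neural networks trained on vastly more data, but the underlying intuition of predicting likely next words from learned patterns is similar.

```python
# Toy illustration only: a tiny "language model" that learns word-pair
# frequencies from a few sentences and uses them to continue a prompt.
# This is not how ChatGPT works internally; it only illustrates the idea
# of generating text from patterns observed in data.
from collections import Counter, defaultdict
import random

training_text = (
    "the election results were announced today "
    "the election campaign focused on the economy "
    "the campaign focused on voters and the economy"
)

# Count which word tends to follow each word (a "bigram" model).
next_word_counts = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    next_word_counts[current][nxt] += 1

def generate(start_word, length=6):
    """Continue a prompt by repeatedly picking a likely next word."""
    output = [start_word]
    for _ in range(length):
        candidates = next_word_counts.get(output[-1])
        if not candidates:
            break  # no learned pattern for this word
        # Sample proportionally to how often each continuation was seen.
        choices, weights = zip(*candidates.items())
        output.append(random.choices(choices, weights=weights)[0])
    return " ".join(output)

print(generate("the"))  # e.g. "the election campaign focused on the economy"
```

The sketch also shows why such systems can sound fluent while having no understanding: they reproduce patterns from whatever text they were trained on, which is exactly why the quality and bias of that training data matter for journalists.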
Journalists exploring AI's impact should ask essential questions: How do these systems work? Where are they being used? How well do they actually perform?
“It’s important to demystify the ways in which these tools work and help bring your audience in so that they can understand as well,” said Burke.
External resources can help journalists navigate AI's complexities. The AP Stylebook and the Aspen Institute offer valuable primers. Engaging technologists, AI experts, academics, and relevant NGOs provides diverse insights, even in regions where little AI development is happening.
"[AI is] one of those realms where engineers feel like journalists never really understand," said Burke, emphasizing the importance of seeking expertise to enhance comprehension.
AI's role in elections also varies by region. In some places it drives how information spreads; in others, traditional methods like community engagement still hold sway. Even where AI's direct impact seems minor, understanding local AI usage and data availability remains crucial.
In countries such as China, Israel, Australia, and India, AI combined with mass surveillance data has been used to suppress dissent, raising grave concerns.
It is misleading to portray AI as an autonomous force. Humans build these tools and shape what they can do. AI might be used to target susceptible individuals with misinformation, but it does not spontaneously fabricate narratives on its own. Understanding the tools' capabilities and limitations keeps journalists from overstating their autonomy.
Human-centered storytelling trumps technical jargon. By focusing on the individuals affected by AI systems, journalists connect readers with real-life impacts. In an AP investigation of a child welfare screening tool, highlighting the experiences of people subject to the tool built empathy and helped inform policymakers.
At its core, the aim is not to judge AI but to explain how it works. Equipping the public with that understanding allows them to make informed decisions, which is the ultimate goal for any journalist.