Journalists, Be Warned: ChatGPT Can't Be Relied on to Assist with Climate Reporting
ChatGPT is changing the way journalists approach reporting. It is a useful tool: it can summarize large amounts of information in very little time and help journalists outline their stories. But ChatGPT has its limits.
For example, when ChatGPT doesn’t have the answer to a question, it will simply make one up, or invent sources and cite them as fact. For sensitive topics like climate change, this poses a real problem.
To get a better idea of how ChatGPT can actually assist, we asked ChatGPT itself, but also consulted outside sources to interpret its answers, since the tool lacks self-awareness.
Here is what it said:
“As an AI language model, I can provide assistance and support for journalists and media professionals who are reporting on climate change. Some of the things that I can do include:
“Providing context and background information on climate change: I have been trained on a large dataset of language, which includes information on climate change and related topics. I can provide context and background information on climate change to help journalists understand the issue and report on it accurately.
“Generating insights and trends: I can analyze large amounts of data and identify trends and insights related to climate change. This can be helpful for journalists who are looking to identify patterns or newsworthy angles in their reporting.
“Providing language and style suggestions: I can suggest language and style improvements that can help make climate change reporting more effective and engaging for readers. For example, I can suggest using active voice, including specific data and statistics, and avoiding jargon or technical terms that may be unfamiliar to some readers.
“However, there are also some limitations to what I can do. For example, as an AI language model, I do not have personal experience or expertise in climate change, nor do I have the ability to conduct original research or interviews. Additionally, I am not able to make decisions on behalf of journalists or media outlets regarding how they should report on climate change.
“In summary, while there are some limitations to what I can do, I can provide valuable support and assistance for journalists and media professionals who are reporting on climate change.”
Viewed from the outside, ChatGPT presents further limitations.
CHAT LOVES GENERALIZED ANSWERS; CLIMATE CHANGE REQUIRES SPECIFIC ONES
Some of Chat’s top tips for combating climate change include “reduce greenhouse gas emissions,” “promote sustainability” and “support climate adaptation.” These blanket solutions are not enough to solve the climate crisis, and they raise an obvious question: what does “promote sustainability” actually mean?
“As climate scientists and scholars like myself often say, individual actions are certainly important, but by far not the most important thing,” explained Jill Hopke, an associate professor of journalism and climate media scholar at DePaul University. Journalists need to “[interrogate] those approaches to inform the public and policymakers about what works and what doesn’t.”
CHAT CONTAINS IMPLICIT BIAS THANKS TO ITS CREATORS
Chat was built with billions of dollars in funding, and as such carries an implicit bias toward its wealthy creators. For example, when asked about corporations contributing to the climate crisis, Chat’s initial instinct was to defend them: “However, it is important to note that there are also companies within these industries that are investing in renewable energy and working to reduce their carbon footprint.” Chat is not necessarily incorrect, but this framing glosses over the roughly 50 years that corporations have spent investing in efforts to mislead the public about the climate crisis.
CHAT CANNOT FACT-CHECK
In fact, Chat will invent data outright to support its points: fake numbers, facts that do not connect, and even entire events that never happened. “ChatGPT might be a tool to supplement, but not replace, original reporting,” Hopke advised. In the middle of a climate crisis, false information can be deadly, and Chat cannot be relied upon to provide accurate information or put it through the proper channels of fact-checking.
While undoubtedly a useful tool, OpenAI’s software is still a new technology and has a long way to go before it can produce reliable, accurate information. It’s great for organizing and keeping information in one place, noticing patterns in that information, and providing leads—but it cannot do the reporting for reporters just yet.