The Ethics of AI Usage
The use of artificial intelligence continues to expand across many industries, including journalism. While some are enthusiastic about its use, others still question the ethics of AI and its place in the workforce. In newsrooms, AI has been used to modify audio, photos, video, and text. On the whole it has been valuable, but it has also brought its own set of issues, some of them ethical.
Since AI is clearly not going away, journalists will need to draw clear lines and determine the best ways to address the ethical considerations and challenges of incorporating AI into their daily work.
The Prevalence and Risks of AI in the Newsroom
For years now, journalists have used AI tools for content creation and to enhance news websites. These tools can help journalists analyze story performance, tracking metrics such as readership and time spent on a page. These insights are useful, allowing reporters to test headlines, predict which stories are likely to engage audiences most effectively, and guide content creation based on audience interaction. There is clear value in integrating AI into the newsroom, but it carries a number of risks that need to be considered and navigated.
While generative AI tools can boost productivity and growth, they can also raise the risk of misinformation, copyright violations, ethical challenges, and a decline in public trust. Proper precautions need to be taken to ensure the information appearing in automated responses is accurate and current. Any organization using chatbots will have to be transparent about its use of these AI tools to meet ethical guidelines. Furthermore, a close eye needs to be kept on whether AI is filtering out certain content and unintentionally creating user biases.
Another ethical concern is the use of deepfakes, which can sway people who cannot tell the difference between a real clip and one that is fabricated or altered. Deepfakes have created considerable skepticism among media consumers about the authenticity of video and audio evidence.
Ethical Principles and Practices for AI Integration
Journalists can find various guides online that provide comprehensive overviews of how to integrate AI into journalism while minimizing the associated risks. Before AI is integrated into the newsroom, its roles and tasks need to be clearly defined.
Newsrooms should determine specific areas for AI use while setting clear boundaries to prevent misuse. Key steps include identifying the primary goals for AI implementation, such as improving audience engagement or streamlining workflows. Transparency is also essential for ethical journalism and should extend to AI use. Newsrooms need to maintain their audience's trust by informing them about AI's role in content creation and distribution. AI-generated content should be clearly labeled to distinguish it from human-created material, along with an explanation of its purpose, limitations, and potential biases. Internal transparency should be encouraged as well, so newsroom staff stay informed about AI implementation and its impact on their work.
Collaboration between newsroom staff and AI developers will also be a necessity. The two parties must share an understanding of the ethical considerations and the potential impact of AI on journalism. Open communication should be established among journalists, editors, and AI developers so they can address concerns and share expertise. Regular meetings and discussions can help ensure that AI systems align with journalistic values and ethical standards. AI tools can be used in ways that do not undermine the core objective of informing the public and reporting the truth, and ethical standards can be maintained if proper limitations are established.
Aaron Dadisman is a contributing writer for the Association of Foreign Press Correspondents in the United States (AFPC-USA) who specializes in music and arts coverage. He has written extensively on issues affecting the journalism community as well as the impact of misinformation and disinformation on the media environment and domestic and international politics. Aaron has also worked as a science writer on climate change, space, and biology pieces.