Sally Lehrman's "Trust Project" Is Bringing Journalism Back to Its Public Service Roots
Perceived trust in news organizations is declining worldwide across various media platforms, especially in the context of online news credibility. Factors contributing to this decline include increased social media usage, generalized skepticism towards digital information, and a lack of transparency in news reporting. To counteract this trend, news organizations are focusing on strategies that prioritize transparency and public accountability, with the understanding that these can enhance credibility perceptions.
The Trust Project, for example, has introduced multiple transparency initiatives known as "Trust Indicators." Three of these indicators, best practices, journalist expertise, and labeling, aim to enhance the perceived credibility of news content. However, for these elements to be effective, users must notice them.
According to Sally Lehrman, a Peabody Award-winning journalist who is The Trust Project’s Founder and CEO, the organization was born out of a desire to continue serving the public interest. “We were trying to stem this tide of trust that has declined in the news as we moved into the digital space and then as misinformation and disinformation flowed in, it got even worse,” she said in an interview with the Association of Foreign Press Correspondents in the United States (AFPC-USA). “That then took over people's hearts and minds in some ways. So how do we win them back? That's what The Trust Project is about.”
A recent eye-tracking experiment examining how the Trust Project's Trust Indicators in online news articles affect perceptions of credibility has yielded valuable findings. The study explored how users form assessments of online news credibility, combining measurements of user behavior, visual attention, and page viewing patterns. Participants read two investigative news articles from a selection of six, each varying in length and subject, while their gaze was recorded with an eye tracker. After reading each article, participants answered questions about both the article and the website.
The results illuminated the multifaceted nature of online news credibility assessments, revealing that users weigh numerous factors during the process. One notable discovery was that transparency indicators positively influenced credibility perceptions. Mainstream news skeptics, the cohort most vulnerable to mis- and disinformation, were the most affected by journalistic transparency elements, and the design of these elements and labels shapes how effective they are. Notably, best practices disclosures, such as journalistic standards, funding information, and explanations of reporting practices, along with viewing of site pages describing policies and practices around funding, staffing, and reporting, led to the most positive effect on credibility.
The Trust Project’s work is about “engendering an informed public and about helping people have the tools to make informed decisions,” said Lehrman. Her insights into the symbiotic relationship between journalism and the people it serves offer a valuable roadmap toward a future that is, in an age of disinformation now punctuated by the dawn of artificial intelligence, perhaps brighter than we think.
The following interview has been condensed and edited for clarity.
What is the Trust Project, and what is its primary objective in conducting this eye-tracking research?
The Trust Project is now 300 news organizations strong. We have news organizations that range from small digital natives to large broadcast-only and legacy print organizations throughout Europe and Latin America. We also have every province in Canada and most of the United States as part of The Trust Project, as well as Hong Kong. In my primary journalistic career I was covering mainly science and social issues related to science as well as business components where they intersected with science. I was very much interested in ethics related to science and how the public really deserves a voice in science. It's publicly funded after all, and also has these deep effects on our futures, each and every single one of us, in various ways. In my work as a journalist, I've always deeply cared about our work and how we really adhere to this standard of trying to serve the public interest.
As we moved into the digital environment, [I thought about] how we can represent our work and help people see that it's different from everything else you see in the digital space, especially when early on things looked very much the same. And I would say actually, even now things look very much the same, whether it's intended to sell you a pair of shoes, pull you over to a particular political point of view, or now even sell you a conspiracy theory or incite you to violence.
Back when everything was moving into digital, I think there was a lot happening accidentally, and news organizations unfortunately took the step of trying to look more like everything else because they wanted to get the clicks. And now we see disinformation deliberately competing with news and looking alike. So again, the effort here really was: how do we separate journalism, news created through this journalistic process, which actually does have a lot of rigor to it, from everything else? And how do we help people understand why that difference matters? So that was how The Trust Project was born.
How do Trust Indicators contribute to building credibility and trust in online news articles, and what specific transparency elements do they encompass?
The idea was [to figure out] how we differentiate the trustworthy news from everything else. And what we did was go out and talk to the public and ask them: “What is it that you value in the news and how do you decide whether to trust it?” We used this user-centered design process that I learned as an alum of the Knight Fellowship program at Stanford, and talked to people in the U.S. and Europe. We brought back these user insights to news executives. These weren't focus groups or anything like that—I called it “ethnographic-lite”: really talking to people and observing their behavior. We did workshops in Europe and in the U.S., bringing in senior news executives and saying, “Here's how users are making their decisions, so how do we marry that with journalistic values? How do we communicate what it is we do?”
This rigor behind what we do is the context for what people are looking for. And out of that came the eight trust indicators, and that's what we see on Trust Project news partner sites. So all of our partners put these [indicators] on their pages and they adhere to the practices behind them, they adhere to commitments of integrity, and when they say something like, “We have a commitment to inclusive reporting,” then we make sure they provide the data to show they really do that in terms of their staffing. They really have a commitment there. And then we also ask them to do the training, and they make that effort. So it's not just a statement; there's actual commitment behind it, action behind it.
Could you elaborate on the key findings of the research and their implications for news organizations and the public?
This latest study was an effort to see how we can refine these trust indicators on news pages. What kind of advice can we give to news organizations to help them use them more effectively? What can we learn about the way [news consumers] approach news articles and what catches their eye? What doesn't? We really wanted to look at the design side of the trust indicators. I mentioned this user-centered design research; we had asked the Center for Media Engagement at UT Austin to do an independent study to see whether [trust indicators] really worked, because at a logical level they did, but did they really? They found a statistically significant difference in perceptions of both the news site and the journalist when trust indicators were present. And it's important that it was the group of trust indicators, not one or another—it was a whole set. And so we were excited about that.
This study deepens our understanding of how you make sure that people really have something; we’re putting [forward] our best design to get people's attention and make sure that we could get that result. I think the two most important things that came out of this study are first of all, that people really are looking at, seeing, or paying attention to these trust indicators. It's not just an impression. So from the earlier study, it could have been just an impression of all these things in place. But no, they're actually paying attention. The journalists’ expertise was very important to users. The way it works from a trust indicator standpoint is we have four pieces of information about a journalist on the article page that links out to a journalist’s page with more information. I think that's very important because people need to know who's behind the news.
When we talk about journalists’ expertise, we have a whole set of requirements around what should be stated about the journalist, and they include the usual things like “What's your background?” but also some additional pieces like local expertise and demographic expertise. Those are things that people told us they were looking for. So where are you from? Or where do you live? Do you have an interest? Or are you part of a particular community that you cover? So for instance, some of our news sites were a little cautious about saying, “Well, we're an expert in a demographic group.” But instead you might say, “Well, I have a special interest in covering migrant communities or covering LGBTQ+ communities. Or maybe I am part of this community, and that's why I care.” [It’s about] showcasing your expertise and making a connection. Also important: languages. People talked about wanting to know what languages [journalists] speak, and that's partly to help them know, “Well, can I speak to this person with confidence in my own language so at least they understand my people a little bit better?” These are some of the things that we asked to be presented.
What are some examples of Best Practices in transparency disclosures, and how do they influence users' perceptions of news site credibility and value?
Best practices explain what your policies and practices are behind the work you do: how do you maintain that commitment to the public interest, to writing in the public interest, to avoiding being persuaded by the interests of the business that owns you? If we don’t tell [the public] what our agenda is, how we guard it, how we protect it and put some guidelines around it, they’ll assume we have a political or social agenda. So we have to be clear about the standards that keep us focused on your interest, as members of the public, and that’s how we guard against conflicts of interest: Here’s how we deal with unnamed sources. Here’s where our funding comes from. And if our funding comes from government sources or from stock sales, or if it comes from advertising or somebody that's giving us grants, here's how we protect against them interfering in our coverage.
What we found there in the study was that the skeptics, especially, were the ones that were reading best practices. I found that very encouraging, because that means that we can win them over by explaining these things. Those were two of the bigger findings. The third highlights that design matters: putting these different elements of transparency on the page in places where people will spot them, see them, and pay attention to them is extremely important.
In terms of design, can you provide insights into how the prominence and presentation of transparency elements and labels affect user interactions with online news articles?
There are many different sides to what we ended up with. There were six different sites [that paid] varying attention to these elements, and some of that had to do with the length of the article that was commonly shown. A news source like the Washington Post tends to have longer articles and people would naturally spend more time on [those] articles. Other factors had to do with where people were spending more time. We recommend putting your transparency elements [where] people can see them. What do I mean by that? You have to imagine that people were mainly coming to your site from social media, and [in] this experiment, they were coming from Facebook. Some people will of course have brand loyalty and they will start with whatever your news organization is, but we know the majority of the news people are consuming [comes from] these other technology platforms. They already know what the story is about before they’ve clicked on it, so that means they often skip the headline, skip the photo. They [tend to] start at the byline, where the top of the story is. The core moment to capture someone’s attention is right then.
I think the jury’s still out a little bit on where the best place to put that is. However, the first thing that they'll see if it's near the top is [that] this is a trusted news organization, at least according to all these commitments they've made through The Trust Project. We know from a different study, also done by the same researcher, that that makes a difference. Maybe when they get to the end, it's a good time to tell them, “Here, you can look at our policies and learn more.” I would just say the main thing to think about is: “When do you have the person's attention?” And don't bury it. Don't bury these things. [People] weren't seeing them. They weren't affecting their perspective on the news organization, because they often just missed them. But we do need to communicate that we are trustworthy before they even start. Otherwise, we've lost them for another reason.
Given the current landscape of declining trust in news media, how do these research-backed solutions offer hope and actionable strategies for news organizations?
To me they offer a lot of hope. One reason, just in this study, is that we saw that skeptics were looking at best practices and we saw that everybody else, maybe including them also, was looking at journalistic expertise. Transparency does work. You'll see other studies that call that into question, but this is a really rigorous eye-tracking study and it's consistent with other data that have found that it works. The second reason is the Trustmark logo itself. It builds confidence in the news site when it's clear the news organization has made a commitment to participate in The Trust Project. Just the commitment to transparency made a difference.
We did a study in 2020 where we asked the same questions: “How do you decide what you value? How do you decide whether you trust the news? What do you value in the news? Why do you even care?” We saw a lot of similarity across the different users in the U.S. and in Europe. And then this next study was even more geographically broad. That's how we were able to come up with the trust indicators. We were able to define four user types [that] saw the value of being informed… In our update to that work, we found everybody had become more engaged with the news. Unfortunately, the angry disengaged were engaged with conspiracy theories and falsehoods, but everybody else had become more engaged… Everybody also expressed a lot of anxiety: They knew that there was a lot of false information out there. This group, the “anxious middle,” is looking for ways to assess news; they're looking for ways to find the news that they do value and know that it's trustworthy.
Trustworthy news will separate news from opinion, or separate opinion from everything else. Trustworthy news will tell you who owns the organization, explaining their funding. Trustworthy news will bring in diverse perspectives and then link out to pages on our website [that] explain all the trust indicators. As we think about this anxious middle, what excited me most about the results of that study is that 60 percent of people said they felt more confident in assessing the news after they had read that page.
We explain to [news organizations] that no matter if you're part of The Trust Project or not, you can do those things and you can also do the second piece, which is just talk to the people… to earn their trust. You can point directly to trust indicators. And it's so important right now, with what we know about false information spread deliberately through social media, through websites that look like legitimate news organizations but are not. And now through generative AI, which just gives so much more power to falsehoods, sometimes unintentionally, sometimes very much intentionally.
Could you discuss any plans or initiatives the Trust Project intends to undertake based on the findings of this research to address the crisis of trust and disinformation in the news?
We’re doing another study right now [where we are] trying to dive deeper into the trust indicators and how we can present them in a way that strengthens trust. This study and really most of the studies we've done also point to a really important piece about survival of news organizations because we do see that trust connects to loyalty [and loyalty] connects to willingness to pay.
If people are more willing to trust you, then they're going to stick with you, especially in these very troubling and troubled times. The next study is going a little deeper into some of these questions, looking at two additional trust indicators. One is methods, which is connected to the other, references… We wanted to see where the best place to put that is, like if you're going to explain, “Here's how we went about reporting the story” and “Here's some sources that we used.” The reason we have references at all is because we found people don't really respond to [hyperlinks] within the story as much, [because] they weren't sure where they were going to go. Is it going to go to an ad? Is it going to go to the homepage of whatever research it was? I’m not saying that links in the story hurt trust. It's just that they don't do the job of building trust that many of us probably assume they do. What is the best format? Is the bottom the right place? Is it the middle, or what? And methods? Same thing, so what are some of the best ways to do it?
I want to emphasize that we're not asking for people's blind trust. We're asking for people to have the tools to assess news and for us to present information to enable them to assess it. It really is about engendering an informed public and about helping people have the tools to make informed decisions.
How do you envision the role of transparency and Trust Indicators evolving in the future of digital journalism to combat misinformation and enhance news credibility?
One of the things I'd like to do more is go into these spaces where people are trading misinformation and disinformation, often unknowingly, and continue to build the number of news organizations that are actually showing [trust indicators] on their pages because that builds a consistency across all of our sites. The second is making the public aware, addressing the anxious middle. The third is bringing in more stakeholders. So who are the stakeholders who should be participating in promoting trustworthy news and really helping the public recognize it? Of course news organizations themselves. The advertising community, they're worried about brand safety and brand suitability. They need to step up a little bit and really support the news organizations that are taking the steps to be more trustworthy, and then brands and corporations that also have a stake, they want to be socially responsible.
I want to build a global interlocking system of awareness around what trustworthiness looks like because we are facing increasingly difficult challenges that leave the public, the people we hope to serve, really questioning whether they can trust anything, and when they do that they disengage with the democratic enterprise. And we need them. We need them. There's no way to overstate how much we need them.
We might learn new things that we have to disclose, like with AI. For instance, we're looking at how we [can] become more transparent about when AI is used and when it's generative AI versus [other] AI, so that people don’t assume that it’s used when it’s not… There will be new media that we have to think about communicating more effectively. We've been working on getting the trust indicators more directly integrated into video and audio. And explaining, when you're listening to a podcast, when are you actually hearing journalism, reported and fact-checked information? When are you hearing an expert point of view, or when are you hearing just a story, like a narrative piece? Becoming more transparent in ways that we may not have thought we needed to be, that's where we'll start to see it more and more.
Our goal as journalists is to serve the public, and that means not treating them just as customers or as objects that we're dumping information on, but dealing with people, continuing to learn about their needs and wants, being responsive to them and being accountable to them. And that's where we differentiate ourselves. We are accountable, and that's where we differentiate ourselves from the machines as well as the fakers.
Alan Herrera is the Editorial Supervisor for the Association of Foreign Press Correspondents (AFPC-USA), where he oversees the organization’s media platform, foreignpress.org. He previously served as AFPC-USA’s General Secretary from 2019 to 2021 and as its Treasurer until early 2022.
Alan is an editor and reporter who has worked on interviews with such individuals as former White House Communications Director Anthony Scaramucci; Maria Fernanda Espinosa, the former President of the United Nations General Assembly; and Mariangela Zappia, the former Permanent Representative to Italy for the U.N. and current Italian Ambassador to the United States.
Alan has spent his career managing teams as well as commissioning, writing, and editing pieces on subjects like sustainable trade, financial markets, climate change, artificial intelligence, threats to the global information environment, and domestic and international politics. Alan began his career writing film criticism for fun and later worked as the Editor on the content team for Star Trek actor and activist George Takei, where he oversaw the writing team and championed progressive policy initiatives, with a particular focus on LGBTQ+ rights advocacy.