Fighting misinformation with social media influencers – Scholar Q&A with Staci Smith

Jonathan F. McVerry


The rise of the social media influencer has turned the public communication world upside down. From gaming to health to politics, these individuals shape attitudes by sharing information with followers who trust them and rely on their advice. Of course, this opens the door for misinformation to spread unchecked. To counter this, two-time Page Center scholars Staci Smith and Brian Smith of the University of Mississippi, along with Karen Freberg of the University of Louisville, are leading a project that will help organizations prevent mis/disinformation. They are partnering with Mario Nicolini of the North Atlantic Treaty Organization (NATO) Strategic Communications Centre of Excellence and Angela Dwyer of Fullintel. The team is developing a guidebook for government communicators to work with well-connected influencers to prevent the spread of misinformation about political and social issues. The project is part of the Page Center’s 2024 research call featuring scholar/practitioner collaborations. In this Q&A, Staci Smith discusses the collaboration with NATO COE, what sparked the idea for this project, and how a suggestion from the Page Center advisory board added AI to the mix.

How did this project come together?

I was a Fulbright Scholar in Slovenia last year. While we were there, NATO COE invited us out to do a training with government communicators on best practices in influencer relations. Following the training, we met with the NATO COE leadership, who expressed concerns about the spread of faulty information through influencers who were often misinformed about government issues. NATO communicators told us that there was very little guidance on how to work with influencers to ensure reliable information is distributed to their followers. We designed a research project together that would help them address this gap in expertise and would lead to developing a guidebook for government communicators to work with influencers.

The only problem was that the NATO COE was having trouble getting funding to have us conduct this research. We were trying to find alternative options for funding when the Page Center call came out about strategic partnerships. It was a perfect fit for our project.

What does it mean for an organization like NATO COE to do influencer relations?

Influencer relations is about moving away from a traditional information distribution model and towards a relationship-focused approach to working with influencers. We recognize that influencers are gatekeepers to their followers, but that doesn’t mean we treat them like information subsidies. Influencers work in the currency of relationships. For the NATO COE, this means helping government communicators work with influencers on the influencers’ terms. It’s government communicators understanding how influencers determine information accuracy and working with them to counter misinformation (rather than just giving influencers NATO-sponsored content and telling them to post it). It’s a trust thing and a balance-of-power thing as well.

So, we’re looking at countries where the management of misinformation is going well, and where it’s not going so well, with the help of the NATO COE. We’ll be interviewing both high-ranking government communicators and the influencers they work with to understand the imperatives for effective relationship building in government influencer relations. This is where the Page Center has been crucial. The funding will allow us to incentivize officials and influencers who are the power players in this space to give us their time for an interview.

Later in the process, you added Angela Dwyer from Fullintel to the collaboration. How did that come together?

One of the suggestions posed at our Research Roundtable* was to examine how AI plays a role in all of this. Having had previous conversations with Angela on this topic, we decided to reach out to her to strategize ways she and Fullintel might help answer the AI question.

From those discussions, we have added a media analysis examining influencer content and AI in political discourse on social media. Specifically, we’re looking at the spread of misinformation in political social media content, and to what extent influencers and AI have been involved in its dissemination. We think this data will complement the findings from our interviews to provide a holistic picture of influencer- and AI-driven misinformation in government and political social media content.

What happens if you find that most of the information is actually pretty accurate?

Wouldn’t that be awesome? We’d love to find that influencers are actively avoiding inaccurate information. But unfortunately, that’s not the reality. In fact, if you do a quick search of political trending topics today—say the presidential campaigns, for example—you’ll find significant misinformation is flowing on social media from both sides of the political aisle. Still, even if we find a dearth of misinformation, our study isn’t about proving that it exists or not. It’s about providing effective guidelines for influencer relations to prevent the spread of misinformation. So, how are government communicators working with influencers to share accurate information with their followers? What are the hallmarks of a good strategic communicator-influencer relationship? What are the critical influences on that relationship? And on the influencer’s evaluation of information accuracy? These are all questions that, when answered, will provide direction for effective influencer relations.

How does resilience fit into all of this?

The phrase that came out of our meeting with NATO COE was “misinformation resilience.” How are government communicators helping influencers build resilience to misinformation and, consequently, how are influencers doing the same with their followers? Many of these influencers are opinion leaders for young adults, teenagers, and even kids. These are groups NATO COE is specifically interested in because they may take influencer content as truth, especially given the trust they often put in influencers.

We also realized that this level of influencer relations can be dangerous. They're working with highly sensitive information, right? NATO COE leaders told us that some influencers they work with just disappear. It was one of the things where Brian and I both were like, “Whoa, this is real.” So not only does NATO COE want to know the best ways to work with influencers to spread accurate information, but they also want to keep influencers safe.

*The Research Roundtable is an annual meeting where scholars present their research ideas and completed research to the Page Center Advisory Board.
