October 20, 2022
'A video game to increase literacy of health misinformation' by senior research fellow Frank Dardis
The phrase “misinformation on social media,” unfortunately, has become quite common and outright mundane in modern times.
As we all know, with unlimited posts and threads running simultaneously, there are limitless directions, angles, and paths that the countless lines of communication can take. And once that happens, it can be very difficult for people to understand – or even know – what the original information or message about a topic was in the first place.
Some of this confusion stems from disinformation attempts to disseminate knowingly false information, which of course can be very manipulative and effective. But some of the gray area also can be attributed to simple “dialogic” elements that have been around forever and are not necessarily sinister or purposely manipulative.
At the same time, these elements can be disseminated at an exaggerated rate via social media: people sharing their own opinions or “takes” on the original information, adding their own inputs and inquiries, questioning the original information, and so on. All such activity and conversation usually is good for any democratic society.
However, social media allows essentially unidirectional paths of information to develop, as opposed to a traditional back-and-forth conversation. Once these paths of information start flowing, it is very easy to drift further and further from the original, actual information. Exacerbating this issue is the fact that people by nature are quite selective in what information they attend to and how they interpret it.
So, the paradox of social media is that, overall, yes, there is plenty of information and interpretation “out there” about a topic for anyone to examine, read, and research – likely more than ever before. But depending on whichever information “path” or trajectory an individual happens to be on at the time, they will see only the information that relates to that portion or interpretation of the overall topic. They may also start from one foundation rather than another and therefore remain only in that “realm” of information – whether or not they are aware of why they are seeing what they are seeing, and whether or not they care.
Study on how a video game can increase literacy of health misinformation
The above characteristics of social media are quite established, and they are not necessarily “bad” in and of themselves when it comes to things like sharing information with other people. But these same characteristics can become quite harmful – sometimes even injurious or deadly – when inaccurate medical or health information is being disseminated.
Therefore, using a grant from the Page Center, I am working with several Penn State colleagues in the Donald P. Bellisario College of Communications Video Game Research Group and the Media Effects Research Lab to conduct an experiment in which participants play a “misinformation game.” Players score points by correctly identifying which elements of social media posts offering medical advice are accurate, questionable, or outright false.
Games have been shown to allow for enhanced learning and engagement compared to traditional teaching or tutorial models.
The current game was envisioned by the research team and was created by a professional game developer with Page Center funding. The funding is also supporting the experiment this academic year. The results of the study – and future use of the video game itself – could be used afterward to help people learn how to better identify medical misinformation on social media.
Game players will receive training on which elements of social media messages typically raise red or yellow flags. Each player will then evaluate several posts while being allowed to further research any elements they deem questionable. To score points, the player ultimately must correctly identify up to seven elements on each post as accurate, neutral, or inaccurate. The game deals only with concrete, factual information about several well-established medical topics (not COVID), so it controls for any broader subjectivity or ambiguity a participant might bring.
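To make the scoring mechanic concrete, here is a minimal, hypothetical sketch in Python of how per-post evaluation could be tallied. The element names, point values, and data structures are illustrative assumptions only; they are not the actual game's implementation.

```python
# Hypothetical sketch of the per-post scoring described above.
# Element labels, point values, and post structure are illustrative
# assumptions, not the actual game's design.

from dataclasses import dataclass

LABELS = {"accurate", "neutral", "inaccurate"}

@dataclass
class Element:
    name: str          # e.g., "source credential", "cited statistic"
    true_label: str    # ground-truth label assigned by the researchers

def score_post(elements: list[Element], player_labels: dict[str, str],
               points_per_correct: int = 1) -> int:
    """Return points earned on one post (up to seven elements per post)."""
    score = 0
    for element in elements:
        guess = player_labels.get(element.name)
        if guess in LABELS and guess == element.true_label:
            score += points_per_correct
    return score

# Example: one post with three labeled elements
post = [
    Element("source credential", "accurate"),
    Element("cited statistic", "inaccurate"),
    Element("personal anecdote", "neutral"),
]
player = {
    "source credential": "accurate",
    "cited statistic": "inaccurate",
    "personal anecdote": "inaccurate",  # mislabeled by the player
}
print(score_post(post, player))  # -> 2
```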
The overall objective of the research is to enhance literacy of misinformation on social media – in this case by focusing on specific message elements within a health/medical setting. The elements themselves are not unique to health information specifically, so the greater hope is that participants’ general ability to identify any misinformation on social media will be enhanced, too.
For further information on this study, please email Dardis at fed3@psu.edu. This project is supported by Dardis’ position as a senior research fellow for the Page Center. Results from the study will be available in 2023.