Penn State professor S. Shyam Sundar says he loves sitting down with coffee and a newspaper in the morning, getting his “news fix” for the day.
“I usually end up spending 20, 30, 40 minutes reading The New York Times, Centre Daily Times, and other newspapers,” he says. “Even if it’s online, I have a dedicated time to do it.”
But he knows that’s no longer the way many people consume news. More often now, people are scrolling on their mobile phones, sharing items from their personally customized feeds on social media along the way.
While this process is efficient for people on the go, it also makes us vulnerable to misinformation.
“The processing is not very analytical,” says Sundar, a leading researcher on fake news and misinformation. “It’s much more based on what’s attention-grabbing, what’s clickbaity, those kinds of things. That’s the information environment within which [many] people are consuming news these days. Our research shows that people are less vigilant on mobile phones.”
Sundar is the James P. Jimirro Professor of Media Effects in the Donald P. Bellisario College of Communications, founding director of the Media Effects Research Laboratory at Penn State, and director of the Penn State Center for Socially Responsible Artificial Intelligence. Last year he was named an Evan Pugh University Professor, the highest honor Penn State bestows on a faculty member.
In the past, Sundar says, “We didn’t feel this burning desire to share the news as we consumed it. Maybe at some level we loved to be able to share that at a cocktail party. But it was not like every time a story catches your attention, you need to tell somebody about it. But now, that seems to be the norm. As a result, things get immediately viral. While I think it’s fine for a good story to go viral, there’s a heightened chance of misinformation going very viral as well, and very quickly.”
A highly polarized political environment and an erosion of trust in traditional news media add fuel to the fire.
Elon Musk often tells his X (formerly Twitter) users, “You are the media now.”
It’s enough to make many a social media user wonder who and what to believe.
One research nugget helps illustrate how quickly misinformation can spread online: In an analysis of more than 35 million public Facebook posts containing widely shared links, Sundar’s team recently found that about 75% of the shares were made without the posters first clicking the link to read the item, according to the study published in November in Nature Human Behaviour.
“Extreme content and content that is aligned with your particular political view tends to be shared more, and more often than not, without being read first,” Sundar says.

Here are other highlights from our conversation, by topic:
Consider the Source
Sundar: Everything begins with a source. In traditional media, we know who or what the source is, but online, the source is quite murky. One of the big problems is “source layering,” which is the idea that you get your information through a sequence of sources. So, it’s not just a news organization that is the source of your news. Somebody might tweet something that is then picked up by somebody else, and perhaps even by a news outlet or an online blog. And then you see it come to you on Facebook, or even as a passing notification on your smartphone. When this happens, and it happens a lot these days, you don’t know who or what the actual source is. …
One [piece of] advice that I have is to try to locate the original source of the story whenever possible. Try to find the source signal in whatever you see online and go there, rather than taking whatever you see on Facebook at face value.
Video Fakes
Sundar: Fake news through video is much more serious and much more problematic than fake news in audio or text. And this is because when people see something with their own eyes, they tend to more readily believe it without stopping to think that videos can also be manipulated.
Watching a video evokes strong reactions, much more so than reading the same thing in text. Take, for example, the WhatsApp lynchings in India, about which there’s even a Wikipedia page. The case involved a grainy video of so-called child kidnappers being circulated through WhatsApp, which is an encrypted service. That means videos circulated on the platform do not get seen by the general public, because they are only shared among friends and family. Somebody made mischief by altering a public service announcement. The original PSA showed two guys on a motorbike riding past a playground, scooping up a little child, and riding off; at the end, a message flashed on the screen saying “keep an eye on your children” so that they don’t get kidnapped. The mischief-maker cut out that message and circulated only the grainy video, with a scare-mongering note saying that child kidnappers were on the prowl.
This version of the message was forwarded numerous times on WhatsApp in India and went viral very quickly. In many rural areas of India, when people saw anybody who looked even remotely like those two guys on the motorcycle, they would pull them down, beat them up, and in many cases kill them. This has become a bit of an epidemic, and not just in India; such incidents have been reported in other countries, with other doctored videos circulated via online platforms.
The reaction to video is so visceral, it’s so emotional, that people tend to act in a very direct and dramatic way. We see media effects in full bloom in video compared to text.
Diminishing Trust in News Media
Sundar: There has been an erosion of trust in traditional news outlets over the years, over the decades. … There’s been a greater faith in general, culturally if you will, in crowdsourcing. People value what other people have to say — we’ve seen this in several of our studies — much more than what experts or trained journalists say. And this is true across the board in a lot of domains. We used to rely a lot on Consumer Reports or the Good Housekeeping Seal of Approval for products. But these days we rely much more on what other people say on amazon.com and other e-commerce sites; we look at how many stars a product has and what other people’s experiences with it have been in their reviews. Or if we have to book a hotel room, we go to Tripadvisor and see how people have rated the hotel. We’ve become much more dependent on other users as sources than on experts and experienced professionals. The same is true for news — we have come to rely less on professional journalists.
That respect, or the hallowed place that professional journalists held in our society, has diminished. And so, by extension, the place that major news organizations held in people’s minds has also diminished. That’s why I think you find an increasing willingness on the part of news consumers to buy into what seems to be said by a lot of other people, even though they are laypersons who are not trained in journalism and often don’t know what they’re talking about, let alone what they’re sharing.
Vetting News Sites
Sundar: There are tools, like allsides.com, which array all the different sources on a left-to-right scale according to how they are politically aligned, and also identify certain neutral sources. If you have that kind of understanding of the sources and where they come from — even these major news organizations — I think you’ll have a better handle on how to treat the information coming from them.
Artificial Intelligence in the News
Sundar: The ways in which generative AI will be incorporated into news are going to be, I think, quite exciting. You’re already seeing attempts by several organizations to deploy large language models or generative AI tools to tell news stories to people. Traditional mainstream news-telling strategies that we’ve been so used to may not appeal to everyone, or may not be appropriate for everyone. But if you can have customized GPTs [Generative Pre-trained Transformers] for individual users, conveying the news in a way that makes sense to each one of us, it is likely that news will become much more popular than it is.
The consumption of what we would consider traditional news has decreased. It could be because it’s not told in a way that appeals to the newer generations. Large language models, or generative AI technology in general, can be deployed to do better news storytelling, because that’s what the technology does best — summarizing and retelling information. I can foresee all kinds of innovations in how news is disseminated to individual users, not just to the public at large. It would still be mass communication, but in the last mile, in the home, it would be customized in a way that makes the most sense, just like we customize the voice of Alexa or Siri. The news might be told to me in a way that’s different from the way it’s told to you, based on each of our preferences, characteristics, preferred vocal intonations, and so forth. That would make the news more accessible and appealing.
Misinformation Detectors
Sundar: There are several other possibilities with generative AI, including in the domain of misinformation. We could have better misinformation detectors, which should be able to spot fakes because in some ways machines are better able to examine images at a granular level, down to individual pixels, and catch manipulations that may not be obvious to the human eye. Some things that are not humanly possible can be done with machines.
A colleague at our Information Sciences and Technology school and I had an NSF [National Science Foundation] project a few years ago to come up with algorithms for fake news detection. The basic premise is that AI algorithms would be better equipped to detect fake news than humans, simply because they can do it at scale and much faster. We cannot be vigilant all the time; that’s not human nature.
If we know one thing about human psychology, it’s that we are cognitive misers. We only expend as much mental energy as we think we need to, and so we always try to conserve it. And so that’s why we resort to shortcuts.
It’s important for us to understand that, and generative AI tools can, if deployed responsibly, be our more cognitively effortful confederates, if you will, to prevent us from falling for some of the misinformation and other forms of deception in our online information environment. T&G
Mark Brackenbury is a former editor of Town&Gown.