As technology continues to evolve at an unprecedented rate, so too, for better and for worse, does the style of communication and public discourse on social media. To examine this rapidly changing landscape, Colorado State University students and faculty gathered March 7 at the Lory Student Center at CSU.
Part of the 2025 Liberal Arts Democracy Summit, a panel discussion titled “Code vs. Consequences: Technology and Policy Discussions on Misinformation and Social Media” featured the return of former CSU political science professor Dominik Stecuła, who currently holds a position as an assistant professor at Ohio State University.
“His research agenda is primarily located in the fields of political science and communication,” said Sam Hoveling, Straayer Center for Public Service Leadership program manager. “In his research, he analyzes the supply and demand sides of information and how they affect important social issues, such as public opinion formation, climate change and vaccination, and how misinformation affects those processes.”
The discussion, moderated by student democracy fellow and political science student Ethan McGuinness, began with the impact of artificial intelligence bots and deep learning algorithms on the misinformation landscape of social media. Although research on their effects is still growing, their immediate effects are readily apparent.
“This really doesn’t just spread individual falsehoods; it also promotes this epistemic uncertainty,” Stecuła said. “It erodes public trust in everything, not just misinformation and specific sources. It just undermines trust in all sorts of institutions, in media, and that’s the biggest problem, whether it’s election agencies, government agencies, etc.”
Misinformation is defined as incorrect or inaccurate information, while disinformation is false information designed to purposefully mislead through misrepresented facts. However, actual consumption of both categories of information may differ from what is expected, as illustrated by the percentage of total U.S. internet browsing that includes news sites.
“Three percent,” Stecuła said. “Of that 3%, 14% (are) websites that focus specifically on political news. That’s a very small part of people’s information consumption, right? Fake news, misinformation is an even smaller percentage. … Who consumes a lot of (misinformation)? It’s basically 1% of the population (that) consumes an overwhelming majority of all misinformation, right? So it’s a very skewed distribution.”
As Stecuła explained, this shows that exposure to online misinformation does not always dramatically change someone’s political behavior compared to their ideology before consuming the information.
“People seeking partisan political information tend to seek ideologically consistent information, so those who are already quite biased are the ones most likely to end up in echo chambers,” Stecuła said. “Polarization has been an ongoing process since before social media truly became a major source of information.”
Furthermore, online consumers of news media, regardless of political partisanship, are not likely to actively fact-check the information presented to them.
“The problem with fact-checking is that people exposed to misinformation usually don’t start their day by going to FactCheck.org,” Stecuła said. “Therefore, there are two different venues where the misinformation occurs and where the fact-checks occur.”
When deciding how to combat misinformation online, several factors must be considered, such as which parties hold the ultimate authority to define what is fiction.
“So we have to think about who the actors are that we want to be making these decisions,” Stecuła said. “There’s no simple answer here. Do you want the government, or do you want the industry to regulate itself?”
“There’s always going to be misinformation, but there are methods and practices that can be implemented to slow the spread of misinformation.” – Kaitlyn Spencer, CSU student
To illustrate the nuance of this growing discussion, Stecuła gave participants the opportunity to discuss Germany’s Network Enforcement Act. Passed in 2017, the law was designed to address online hate speech and extremism. Instead, it had unintended consequences.
“When Germany passed that law, it definitely restricted speech,” Stecuła said. “It wasn’t particularly transparent in terms of whether something was misinformation (or) fake news and how they were making those decisions, so people like Vladimir Putin and other authoritarian rulers said, ‘Well, Germany passed this law; we’re passing similar laws. We’re also cracking down on misinformation.’ But in their context, misinformation was essentially going after journalists, shutting down their speech and muting political opposition.”
Another way to address the spread of misinformation is to implement more fact-checking features across social media sites. When present, these notes are known to make media consumers pause and consider the evidence before them.
“That’s friction, right?” Stecuła said. “It’s slowing you down. … And research shows that when you don’t slow down, when you’re on cruise control, that’s when all of your bad partisan instincts begin. So that’s one thing, a platform design (feature), that you can introduce.”
Social media users can also protect themselves by diversifying their news sources beyond social media alone, curating their feeds and seeking out traditional publications.
Only through constructive and intentional dialogue can we fight misinformation and overcome partisan divisions.
“(To) quote Marshall McLuhan: (The) medium is the message. And social media as a medium is not a very reliable forum for very thoughtful dialogue and conveying information,” Stecuła said. “It’s convenient, short videos are fun, but I think it’s (important) not to just be informed on social media.”
This sentiment was echoed by Kaitlyn Spencer, a political science and international studies student.
“There’s always going to be misinformation, but there are ways and practices that can be implemented to slow the spread of misinformation,” Spencer said. “There’s some hope there, but it’s a very slippery slope to navigate. The biggest thing I learned today is (to) be patient with those who are undoubtedly spreading this misinformation.”
Only through intentional, collaborative action can real change be made possible in both the digital and physical realms.
“I think each of us needs to play our own role in getting out of this doom loop of political polarization (and) misinformation,” Stecuła said. “And, you know, (we need) to really do what we can to find that quality news to help sustain a truly healthy democracy and remain stewards of it.”
Reach Katie Fisher at news@collegian.com or on social media @csucollegian.