Global market intelligence firm International Data Corporation estimated in 2024 that more data had been produced in the previous three years than in all of prior history. That trend is only expected to accelerate as generative artificial intelligence tools automate and schedule unlimited numbers of persuasive falsehoods and deepfakes at unprecedented speed, so K-12 educators must prepare young minds to navigate a sea of information unlike anything that has existed before. Making that daunting task manageable, presenters at this year’s conference in San Antonio said Wednesday, comes down to qualities AI doesn’t have, such as critical thinking and cognitive independence, the very qualities that make us human.
Understand the problem
Drawing on her experience as a learning manager at ED3 DAO, a nonprofit organization focused on ed-tech teaching, Armine Movsisyan first summed up the issue as one of “too much information.”
“It’s not just the fact that you can be fooled by what’s online, by these deepfakes and these confusing sources,” she said. “It’s that their very existence can embolden people to actually dismiss the truth.”
Movsisyan added that the consequences of this excess, and of the channels it flows through, include a “flattening” of culture and the creation of echo chambers. Filter bubbles let people select only the information they want to see, and cognitive dissonance makes them uncomfortable with anything that challenges their beliefs, even the truth.
In effect, people seek comfort, which creates echo chambers that screen out inconvenient realities. The more persuasive generative AI becomes, the more capable it is of reinforcing those chambers.
Victoria Andrews, a partner at an education design and advocacy firm, countered with a welcome fact: teens are very interested in learning how to deal with all of this.
“The state of the union is that when it comes to AI literacy, the youth want it!” she said. “Think about it: we’re talking about Generation Alpha and Generation Z. They were born with TikTok literally coming out of their ears. They’ve had a phone in their faces from the moment they came out of their mothers.”
Calling the growing tsunami of AI-generated nonsense “slop,” Andrews pointed out how widely the problem is now recognized: the term was a finalist for Oxford’s 2024 Word of the Year.
Overcoming the slop
Movsisyan and Andrews agreed that the first step educators can take in bringing media literacy to their students is to model it themselves.
“How we consume information affects the way we talk to young people about how they consume information,” Andrews said.
To that end, Movsisyan pointed to several resources, including ED3 DAO’s own media literacy course and those of other organizations: the National Association for Media Literacy Education, Ad Fontes Media and its Media Bias Chart, NewseumED, IC4ML, the News Literacy Project, and Media Literacy.
She also laid out four building blocks, an educational foundation that gives students the confidence to deal with AI:
Critical thinking skills that question assumptions, evaluate sources, recognize bias, and consider alternative perspectives.
Basic knowledge of AI, including its strengths, its limitations, and what it should not be used for.
Awareness of their own values, beliefs, motivations, perspectives, emotional states, and perceptions of others.
Cognitive independence, using AI as a thinking partner rather than a replacement for thinking.
Andrews and Movsisyan emphasized that media literacy should not be a standalone course; rather, all four building blocks should be woven into lessons on other subjects.
To achieve this, Andrews advised the audience to teach students and themselves to:
Examine. Before believing content, check it for bias, sourcing, and possible manipulation.
Be aware of limitations. Use multiple sources and consider the limits of the data to broaden your perspective. Don’t get boxed into an algorithm-driven echo chamber; break away from the algorithms that track you.
Observe. Ask why the media was created. Consider whether it is intended to inform, persuade, entertain, or manipulate, and identify potential hidden agendas, political groups, or AI-driven content farms.
Partner. AI is not an authority. You can partner with AI at every level of Bloom’s taxonomy, but remember that AI can’t truly “understand” the way humans do.
While everyone is still finding their way in a new information ecosystem, Andrews noted, what educators can do is contribute more to the solution than to the problem.
“We don’t want to contribute to the problem, do we?” she said. “We don’t want to contribute to the slop that’s out there, so please put love and care into everything we do, pass this message along, and drop the slop.”