Recently, a short video clip made the rounds online showing people rising from the dead and reacting to Jesus: absurd, hilarious, widely shared. At about the same time, images of "persons of interest" spread across WhatsApp long before any official statement or confirmation from the authorities. That, too, is the media. It makes people listen, laugh, feel and react. But it also tells us what to care about, what to ignore, and what to believe.
The media has never been neutral. But the speed and scale of the game have changed. We live in a world where media does more than reflect reality. Platforms decide what matters before you even have time to think. Deepfakes distort the truth. Newsrooms have been shrunk and scaled by algorithms.
Media houses, which once held this power, should be asking harder questions. They are no longer the gatekeepers. Influencers, citizen journalists and content creators now form part of the information ecosystem. Some fill important gaps. Others deepen the confusion. And without shared standards and trust, we are left guessing at the truth.
A few weeks ago, John Oliver aired a segment on "AI slop." The New York Times asked a question: can you still believe your own eyes? The answer? Not without context, clarity and healthy doubt.
Here in Barbados, we are not exempt. Generative AI is already in the newsroom. Sometimes its use is unavoidable: in a small newsroom, the relentless news cycle makes speed a matter of survival. But survival must not be confused with strategy. What do we lose when we automate the tasks of thinking? What values are embedded in that speed? And who pays the price?
Was even this editorial written by a person or a machine? Does it matter? That pause, that doubt, is the point.
Because despite what tech boosters say, generative AI is not "hallucinating" or "thinking." It recombines what it was trained on, at massive scale. Without regulation, the burden falls unfairly on "users," who are expected to navigate systems they did not build. But structural problems cannot be solved by personal vigilance. The real power lies with those who design the systems, profit from the content and shape the rules.
Today's media ecosystem is built not for public service but for optimizing engagement, emotion and scale. And the problem is not just misinformation. It is the breakdown of context itself: the erosion of meaning, nuance and memory.
This is why media literacy matters. Not just so we can spot fake news, but so we can understand how media works on us, often without our realizing it. We learn the costs too late: in polarization, misunderstanding and cultural loss.
So who has the rights? And who has the responsibility?
The Government must act as though it does. That means clear data governance rather than vague digital slogans. It means media literacy taught as an essential part of schooling. And it means enforcing accountability when tech giants dodge regulation and exploit our culture.
Media houses, for their part, cannot treat virality or AI as neutral tools. They must set standards for editorial use, cultural care and labor dignity. Because the media is teaching us even when we are scrolling, zoning out or laughing. Lauryn Hill's Miseducation wasn't just an album title. It was a warning about invisible instruction. The media is always teaching. The only question is who decides the lesson.
We already have the tools. The United Nations has published global principles on digital rights. Barbados' Cybercrime Bill is under discussion. There is research on policy templates, climate data on digital waste, and regulatory frameworks. We have to stop pretending these things don't exist while claiming to search for solutions. The tools are there. What we lack is the will to use them.