Stephen McConnell, Adjunct Lecturer
Instead of searching on Google, we now turn to AI.
Artificial intelligence is all the rage right now, especially with the announcement of generative technologies like ChatGPT and Perplexity.
Thanks to the magic of these technologies, once excruciating tasks like writing a company memo or learning how to make bread (at least for this writer) are now as easy as a walk in the park. We marvel at what they can do.
But AI has been quietly intertwined with our lives, work, memories, news and information for quite some time, and it is far more subtle, intimate and pervasive than ChatGPT.
For all of us in the UNC Hussman community, when we use social media, we are under the surveillance of AI.
As we flip through our Facebook feeds for the latest news and information about our friends and the world, AI is the silent guide that decides which posts we see.
When we hop on TikTok and become absorbed in the endless funny and informative banter the platform offers, AI is the invisible hand channeling that stream of content to us.
And when we log on to Instagram or any of the many other social media platforms to communicate with friends and family, AI is there, a digital Wizard of Oz analyzing our every move.
While ChatGPT, Perplexity, and Claude have captured our attention, AI and other machine learning technologies have taken substantial control of our social media and digital information environments in recent years.
And thanks to recent advances, this kind of AI is becoming even more powerful and potentially more troublesome.
Unlike ChatGPT, this AI is the everyday gatekeeper of our reality, forming a calculated portrait of us based on what it perceives to be our values, affinities, ideologies and aversions, and from that portrait sculpting the river of content that appears seamlessly in our feeds.
Facebook has an estimated 3 billion active users and Instagram an estimated 2 billion, which means much of the world is currently under the influence of some type of AI. It is hidden inside these platforms, and many people may not even be aware of it.
The impact, as my research and that of others has shown, is significant, ranging from placing you in a bubble of news and information shaped to your detected interests, to serving you content the algorithm predicts you will like in order to keep you on the platform longer.
Now a UNC Hussman faculty member, I first glimpsed the dangers and promise of social media as an investigative reporter for the Scranton Times-Tribune, a daily newspaper in Scranton, Pennsylvania.
One day, around 2012, I noticed my editor, John, coming out of his office with a then-novel smartphone in his hand. He was looking at Twitter, clearly fascinated. That day, I sensed that journalism was about to change dramatically, and I felt inspired and frightened at the same time.
A few years later, after completing my Ph.D. at Colorado State University, I began researching what had fascinated and perhaps alarmed John: the power of social media as a communication channel. I also discovered that beneath all those flashy posts lies a stack of machine learning technologies.
All the likes, shares, posts, photos and videos you pass along are fodder for the algorithm to understand you. These “signals” feed into vast farms of servers, where the algorithm predicts what each user will want to see next in their feed, along with other recommendations.
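For readers curious about the mechanics, here is a deliberately simplified, hypothetical sketch of that idea in Python. The function names, topics and scoring rule are my own illustrative assumptions, not any platform’s real system; actual recommendation engines rely on machine-learned models over far richer signals.

```python
# Hypothetical illustration (not any platform's actual code): turning
# engagement "signals" such as likes and shares into a ranked feed.
from collections import Counter

def build_interest_profile(signals):
    """Count how often a user engaged with each topic."""
    return Counter(topic for _, topic in signals)

def rank_feed(candidate_posts, profile):
    """Order candidate posts so the topics the user engages with most come first."""
    return sorted(candidate_posts,
                  key=lambda post: profile.get(post["topic"], 0),
                  reverse=True)

# Example: a user who mostly likes and shares baking content
signals = [("like", "baking"), ("share", "baking"), ("like", "politics")]
posts = [{"id": 1, "topic": "politics"},
         {"id": 2, "topic": "baking"},
         {"id": 3, "topic": "sports"}]

profile = build_interest_profile(signals)
print(rank_feed(posts, profile))  # the baking post ranks first
```

Even this toy version shows the feedback loop at work: the more you engage with one topic, the more of it the ranking pushes back at you.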
While our focus is understandably on the generative AI technologies that are making headlines today (and billions of dollars), these less obvious AI technologies deserve attention as well.
AI-driven personalization of news and information may pose risks to democracy, from deepening political polarization to limiting which news appears in the social media feeds where, according to Pew Research, more than half of Americans get news today. When an opaque algorithm makes those calls for us, we lose both information and autonomy.
My research has also revealed how AI-driven recommendation engines can be so effective at personalization that people end up lost in content “rabbit holes.” They are remarkably good at it, and that is by design.
Personalization lets us see the world as we want to see it, and that is an undeniable benefit. But it can also construct a synthetic reality, one assembled by opaque algorithms that show us only a small piece of life’s complex puzzle.