Meta has announced several new additions to its Ray-Ban Meta glasses, which are already gaining sales momentum heading into the holiday season.
Meta’s stylish smart glasses are quickly becoming a must-have for tech-savvy people, and Meta is building on this by adding AI capabilities that extend the device’s responsiveness and interactivity.
First, Meta is adding “Live AI,” which provides an always-on AI companion for up to 30 minutes at a time.
As you can see in this example posted by Meta CTO Andrew Bosworth, Live AI lets you interact with Meta AI hands-free, and ask questions in a more natural, conversational manner.
According to Meta:
“During a Live AI session, Meta AI continuously sees what you’re looking at, and you can converse with it more naturally than ever before. Get hands-free support and inspiration. Ask questions without saying ‘Hey Meta,’ reference things discussed earlier in the session, and interrupt at any time to ask follow-up questions or change the topic. Eventually, when the time is right, Live AI will be able to make helpful suggestions before you even ask.”
This is an interesting use of generative AI, but I wonder if they have yet considered the mental health implications of creating an AI companion.
For example, what happens if people come to rely on AI tools as friends, and their providers then cut them off? Could simulating connection, as if you were talking to a real person, actually pose a risk to genuine connection and engagement?
I guess we’ll see, but as with social media before it, my concern is that the rush to innovate will see the associated impact assessments ignored, and these harms only recognized in hindsight.
As Bosworth also points out, the Ray-Ban Meta glasses now have Shazam integration for users in the US and Canada, so you can ask Meta AI at any time what song is playing.
Yes, that’s extreme Zuck, fresh from a jiu-jitsu competition and out pounding the dirt in his buggy. I’m not sure why Meta is trying to give Zuck more personality these days, though his recent donation to President-elect Trump’s inaugural fund isn’t doing him any favors in this department.
Finally, Live Translation is also coming to the Ray-Ban Meta glasses, which could be a particularly useful update.
“With this new feature, your glasses will be able to translate speech in real time between English and Spanish, French, or Italian. When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers, or see it as a transcript on your phone, and vice versa.”
So if you’re somewhere one of these languages is spoken, you’ll know whether, and what, people are saying about you. Granted, they’re probably not talking about you, so you’ll likely be disappointed, and you may have to wear your sunglasses indoors like a weirdo to translate it. But it will still be useful in many situations, and as Meta adds more languages, this could become a killer application for the device.
As noted, sales of Meta’s Ray-Ban glasses have steadily increased over time, and many people will be putting them under their Christmas trees next week. And as Meta continues to evolve the device, it could become an essential connector in many ways, increasing the value of Meta’s AI tools and helping to guide the direction of its product development.
And arguably, Meta AI in this context is more valuable than Facebook or IG chatbots.
That’s where Meta’s AI development gets genuinely interesting, because it opens up more places, and more ways, to use AI to bridge the gap between how we connect today and how we’ll connect in the future.
So while it may seem unnecessary, even intrusive, for Meta to push its AI chatbot into every app, there are other avenues where its AI tools could provide real value.