Meta Smart Glasses can now track all the food you put in your mouth

A new, more prescription-focused style of Ray-Ban-branded smart glasses grabbed the spotlight this week, but Meta also quietly announced a few more features for its smart glasses lineup, including… a way to track everything you eat.

According to Meta, owners of the Meta Ray-Ban AI Glasses or the Meta Ray-Ban Display will soon be able to take a picture of what they’re eating using a voice prompt and then log that food item in the Meta AI app. Meta says it will “extract key details about the meal” from those photos using AI. The idea is that Meta wants to use your food photos in concert with AI to give users “personalized insights” and help people make “healthier, more informed choices.”

This process might involve asking Meta AI things like “What should I eat to increase my energy?” or other prompts in that vein. However, one thing that stood out to me in the explanation of this feature is that Meta has big plans to expand this functionality in the future.

Obviously, logging everything manually is a bit of a pain, and smart glasses that do it in an “ambient” way would be more convenient. That’s why Meta says that “in the future” its smart glasses will “understand what you eat and automatically log your food.” Sounds great if you’re into that sort of thing, but there are some pretty big problems with this idea.

For one thing, I’m pretty sure Meta’s smart glasses would need to be recording constantly for this to work, and given the way things are going with privacy, I don’t think people will be very receptive to smart glasses recording everything all the time. On top of that, leaving the camera on all the time is a one-way ticket to terribly short battery life. So, I don’t know… it sounds like a good idea in theory, but I’ll file it under “probably not” for now. That’s not even counting the fact that people might be a little more hesitant to hand over their data to Meta right now, even if it’s just the sad sandwich they grabbed in a panic for lunch.

Meta says nutrition tracking will be available on both its non-display AI glasses and the Meta Ray-Ban Display this summer.

Meta also announced hands-free WhatsApp summaries that will be available in the early access program “soon,” as well as display recording so you can capture what the screen looks like on the Meta Ray-Ban Display, which is also coming “soon.” As for features you can use right now: Meta announced the ability to scroll through Instagram Reels on the Meta Ray-Ban Display, “visible widgets” that show reminders, weather, stocks, and a calendar on the Meta Ray-Ban Display’s home screen, and a new Spotify shortcut. Neural handwriting, which uses the Meta Ray-Ban Display’s Neural Band to let you write by hand with just your fingers, is also set to launch “in the coming weeks.”

Ultimately, there’s nothing groundbreaking here, but such is the state of smart glasses right now: a mix of things you’d expect the devices to already have and other things that feel like a privacy nightmare waiting to happen.
