
Meta Smart Glasses Can Now Track All the Food You Put Into Your Mouth

Gizmodo · by James Pero · April 3, 2026 · 3 min read

Whether people want to give Meta all their food data is another question entirely.

A new, more prescription-focused style of Ray-Ban-branded smart glasses stole the attention this week, but Meta also quietly announced a few more features for its smart glasses lineup, including… a way to track everything you eat.

According to Meta, owners of Meta’s Ray-Ban AI glasses or the Meta Ray-Ban Display will soon be able to snap a photo of what they’re eating using a voice prompt and then log that food item in the Meta AI app. Meta says it will use AI to “extract key nutrition details” from those photos. The idea is to pair your food pics with AI to give users “personalized insights” and help people make “healthier, more informed choices.”

That process might involve asking Meta AI stuff like “What should I eat to increase my energy?” or other prompts in that vein. One thing that jumped out to me in Meta’s explanation of that feature, though, is that the company has lofty plans to expand that functionality in the future.

Obviously, having to manually log everything is a bit of a pain, and having smart glasses that do the same thing, but in an “ambient” kind of way, would be more convenient. That’s why Meta says that “in the future,” its smart glasses will “understand what you’re eating and automatically log your food.” Sounds great, if you’re into that sort of thing, but there are some pretty major problems with that idea.

For one, I’m pretty sure Meta’s smart glasses would have to be always recording for that to work, and given the way things are going on the privacy front, I don’t think people will be very receptive to smart glasses that record everything all the time. On top of that, keeping the camera engaged all the time is a one-way ticket to a woefully short battery life. So, I don’t know… it sounds like a good idea in theory, but I’m going to file it under “probably not” for now. And that’s not even counting the fact that people might be a little more hesitant to hand their data over to Meta right now, even if it’s just the sad sandwich they panic-ate for lunch.

Meta says nutrition tracking will be available on its non-display AI glasses soon, and on the Meta Ray-Ban Display this summer.

Meta also announced hands-free WhatsApp summaries, which will be available in the early-access program “soon,” as well as display recording, which lets you capture what the screen inside the Meta Ray-Ban Display looks like and is also coming “soon.”

As for features you can use right now: Meta announced the ability to scroll Instagram Reels on the Meta Ray-Ban Display, “glanceable widgets” that show reminders, weather, stocks, and the calendar on the Meta Ray-Ban Display home screen, and a new Spotify shortcut. Neural handwriting, which uses the Meta Ray-Ban Display’s Neural Band to let you write with just your fingers, is also set to launch “in the coming weeks.”

Ultimately, there’s nothing groundbreaking here, but as is the way of smart glasses right now, it’s a mix of stuff you’d think the devices would already have and other stuff that feels like it’s a privacy nightmare waiting to happen.
