Make the most of your Ray-Ban smart glasses with these tips from Meta

The Ray-Ban smart glasses are a fun gadget that gives us a glimpse into the future of computing. These smart glasses recently received Meta AI support, so Meta has put together a list of tips on how to get the most out of your pair.

In case you’re unfamiliar with the Ray-Ban smart glasses, they are a simple pair of glasses with no display. There’s a built-in camera and speakers, but their utility was fairly limited until Meta AI was added. Meta’s Project Nazare, on the other hand, is purported to be a pair of high-field-of-view AR smart glasses with a futuristic display and more goodies.

So, what uses has Meta put forward for the Ray-Ban pair? These tips revolve around Meta AI, so, unfortunately, this list isn’t for you if you don’t live in the United States or Canada, as those are the only two countries where Meta AI is currently available on Ray-Ban’s glasses.


Meta AI on the Ray-Ban smart glasses can be invoked by saying, “Hey Meta, look and…” followed by your prompt. A very ingenious use of this, and something I hadn’t even considered, is asking Meta AI to help you pick an outfit.

For example, a user could stand in front of a mirror and ask their smart glasses what goes well with the shirt they’re wearing. And when you’re out and about in your fancy new outfit, Meta says you can even use the smart glasses to identify unfamiliar fruit. Simply pick up something you’re unsure about and ask Meta AI to identify it.


The perfect vacation with Meta AI

The Ray-Ban smart glasses, now that they’re powered by Meta AI, make for an amazing vacation companion. Touring a foreign country and can’t read a menu at a restaurant? Simply look at it and ask Meta to translate it for you. See a cool statue somewhere? Meta can identify it and even give you a brief history lesson.

Lastly, Meta points out that its AI can even help you come up with a caption. If you see something worth capturing, ask Meta AI for a caption and it’ll deliver one in seconds. It also helps that you can use the smart glasses to take a picture whenever and wherever you want.


Meta AI is still early tech and can make mistakes

As Meta pointed out when Meta AI rolled out to the Ray-Ban smart glasses, this is still very early tech. It works great when it works, but it also makes mistakes, so don’t be surprised if your glasses mistake a cat for a small dog.

Meta provided a couple more tips for using Meta AI most efficiently. These tips apply to basically any modern Large Language Model (LLM), and Meta AI is a multimodal model, meaning it can communicate via language while also understanding visual input.

When communicating with Meta AI, the company advises users to “perfect the prompt.” In simpler terms, phrase your question as clearly as you can. For better results, keep your head still when asking a question until you hear the shutter sound. You can even point at something to help the AI better understand what you’re talking about.

Also, don’t forget to use adjectives. This is pretty standard practice for LLMs: the output you get depends on the type of output you asked for. For example, if you’re asking for a caption, you can heavily alter the response by asking for a funny caption or an informative one.

Ray-Ban’s Meta-powered smart glasses are a pretty cool gadget, no doubt. But I personally cannot wait for Project Nazare to hit the market. If it’s even half as good as Meta is hyping it up to be, it will immediately become one of the best pairs of AR glasses we’ve ever seen.