
    Meta Ray-Ban smart glasses just got a huge upgrade with live translation, multi-modal video and more

    By Scott Younker

    1 day ago


    Meta Connect 2024 is all about AI on your face. Meta CEO Mark Zuckerberg stepped onto the stage to introduce the newest version of the company's AI-powered smart glasses.

    The big announcement for the newest version of Meta's smart glasses is much deeper AI integration, along with new AI-powered features coming to the glasses in the future.

    Zuckerberg announced that with the updated AI, you'll be able to use more natural, conversational prompts with the smart glasses. An example he gave was asking, "Hey Meta, what kind of smoothie can I make with these?" while showing the glasses the ingredients. From there, you don't need to repeat "Hey Meta" and can simply continue the conversation.

    The smart glasses are also getting the ability to act as a memory bank for your day-to-day life: you can ask them to remember where you parked, read QR codes on flyers or call phone numbers.

    Zuckerberg claimed that Meta AI on the glasses will be capable of multi-modal video, meaning it can give "real-time advice." For example, in one demo someone getting ready for a Roaring Twenties-themed party had the AI help them pick out appropriate pieces for their outfit.

    Probably the most interesting feature Zuckerberg showed off was live translation, with the smart glasses translating into English right in your ears. He said a companion mobile app can translate for someone who isn't wearing smart glasses. Zuckerberg tested the feature with an English-Spanish conversation with Mexican MMA fighter Brandon Moreno.

    For people with low vision or blindness, Zuckerberg announced a forthcoming partnership with Be My Eyes, a service that connects users with sighted volunteers over video. It will soon work through the smart glasses, letting users show a volunteer what they're looking at and hear the volunteer's responses through the glasses.

    Additionally, smaller new features include voice control for Spotify and Amazon Music, as well as new iHeartRadio and Audible integrations.

    In a throwback to my see-through-N64-loving heart, Zuckerberg announced a clear, limited-edition version of the Ray-Ban Meta glasses that lets you see the technology inside the frames. It appears there won't be many of them available, if you're interested.

    Meta also announced that it is teaming up with EssilorLuxottica to offer a variety of lens options, ranging from prescription and optical to transition lenses.
