Meta is set to roll out an AI update for its Ray-Ban smart glasses, introducing multimodal AI features such as translation and the identification of objects, animals, and monuments. The smart assistant is voice-activated by saying "Hey Meta," enabling hands-free access to information and other tasks. The glasses currently support translation in multiple languages, and accuracy and language coverage are expected to improve over time.
Importance:
This update marks a significant step in Meta's integration of AI into consumer products, offering practical applications of the technology in everyday life.