6 ways Meta can improve Ray-Ban Stories smart glasses

Colin Hughes on six ways to improve Ray-Ban Stories first-generation smart glasses


As someone who has been using Ray-Ban Stories smart glasses all day, every day for the past five months (with prescription lenses added), I have been impressed by their ability to capture photos and videos, and to chat and make calls hands-free without having to touch my smartphone.

However, there are some areas where I think the smart glasses could be improved to make them even more convenient and accessible to use.

Ray-Ban Stories are first-generation smart glasses that have quickly become a popular choice among tech enthusiasts and fashion-forward individuals. Created through a collaboration between Meta (formerly Facebook) and EssilorLuxottica, the company behind the Ray-Ban brand, these glasses allow users to capture photos and videos, listen to music, make calls, and send messages while looking stylish.

Since the glasses launched in 2021, Meta has no doubt been working to improve their functionality in time for a second generation of the Stories, rumoured to be released this autumn.

In February the company announced in a blog post that it will soon update the glasses' software with a new feature that allows users to make hands-free phone calls using their phone number. Additionally, the glasses will be able to read incoming text messages aloud, providing an even more convenient way for users to stay connected while on the go.

As a new wearable product category, Ray-Ban Stories still have room for improvement. In this blog post, I will suggest six ways Meta can enhance the user experience and functionality of these smart glasses.

Better sharing

One of the main features of Ray-Ban Stories is the ability to capture photos and videos with the built-in cameras on the frames. However, sharing these moments with your friends and family is not as easy as it could be. You need to use the View app on your smartphone to download the photos and videos from the glasses, and then share them through other apps like Facebook, Instagram, WhatsApp, or Messenger. In today's fast-sharing world, this process feels clunky.

Meta could make this process more seamless by allowing you to share the last photo or video you have taken with the glasses hands-free, using a voice command. For example, you could say “Hey Facebook, share my last photo with Joe” and the glasses would automatically send it to Joe via WhatsApp or Messenger, whichever you set as the default. This way, you would not need to take your smartphone out of your pocket and interrupt your activity.

Improved WhatsApp integration

Another feature of Ray-Ban Stories is the ability to listen to messages from your smartphone through the open-ear speakers on the frames. This includes messages from WhatsApp, one of the most popular messaging apps in the world. However, the integration with WhatsApp is not very robust. If you receive a message that is longer than a line, the Facebook Assistant will just tell you “You have a long message from Joe” and you will have to check your phone to read it.

Meta could improve this by allowing you to listen to full-length WhatsApp messages through the glasses. That way, you can stay connected and informed without having to stop what you are doing and take your phone out of your pocket. It would also be useful to be able to reply to WhatsApp messages with pre-recorded audio clips, and to have other smartphone notifications read out to you.

More voice commands

Ray-Ban Stories come with a built-in Facebook Assistant that lets you control some functions of the glasses with voice commands. You can use it to take photos and videos, play and pause music, adjust the volume, check the battery status, and send messages. However, these commands are quite limited and do not cover all the potential uses of the glasses.

Meta could expand the range of voice commands available for Ray-Ban Stories to make them more versatile and convenient. For example, you could ask for the time, temperature, weather, directions, news updates, reminders, calendar events, and more.

Smart home control

Ray-Ban Stories currently have no integration with smart assistants such as Siri or Google Assistant, which could let you control your smart home devices with voice commands.

Meta could consider adding support for third-party smart assistants to Ray-Ban Stories so that users can access their smart home devices with their glasses. For example, you could say “Hey Siri, turn on the lights” or “Hey Google, play Netflix on Chromecast” and your glasses would execute the command. This way, you can enjoy a more seamless and hands-free smart home experience.

Change the smart assistant name

One of the drawbacks of using Ray-Ban Stories is that you have to say “Hey Facebook” to activate the Facebook Assistant on the glasses. This can be confusing and awkward for several reasons. First, it does not match the branding of Meta or Ray-Ban as the makers of the glasses. Second, it can raise privacy concerns among users who do not trust Facebook with their personal data or voice recordings. More than once, when I have been demonstrating the photo and video capabilities of the glasses to friends, someone has cringed and said: “You’re not going to share this on Facebook, are you?”

Meta could improve things by changing the name of the smart assistant on Ray-Ban Stories to something more unique and appropriate for the glasses. For example, they could use “Hey Meta”, “Hey Ray-Ban”, or even “Hey Glasses” as alternative names. This way, they can avoid confusion and create a more positive association with their brand and product.

Accessibility

Accessibility is an important aspect that Meta should consider when improving Ray-Ban Stories smart glasses. At the moment the company has not included any accessibility features or controls. There is a music feature, a tie-in with Spotify called Spotify Tap, that requires the user to lift their hand and tap the frame to trigger music playback. Why isn't there an alternative option to do the same thing with a voice command, which would be accessible to those with upper limb disabilities? It is clear that Meta has not considered the accessibility potential of smart glasses for disabled people. Hopefully, we will see more thought and consideration in the second generation of the Stories.

Conclusion

Ray-Ban Stories are a great way to capture and share your life's moments while wearing fashionable glasses. However, they are not perfect, and there are some areas where Meta can improve them to make them more user-friendly and functional, as well as accessible to disabled people. I hope that Meta will consider these suggestions and implement them in the next generation of smart glasses, rumoured to be released in the autumn. Until then, I will continue to enjoy using my Ray-Ban Stories and exploring their possibilities.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people.