2024 will be the year of the Meta AR glasses… But they are “internal only”
Honestly, I’ve got to say it: Google made Glass in 2013, people. It had a microphone, a camera and a heads-up display (HUD) — also known as a screen — and, for all intents and purposes, it was a pair of AR glasses. Not perfect, but still: AR glasses, ten years ago.
Now it’s 2024. Glass is dead and we’re talking about smart glasses that can listen to us via AI and take pictures. And that’s great! And we have a ton of great VR headsets too, but those aren’t even in the same category in my book. They aren’t AR glasses.
This is where we’re currently at when it comes to smart glasses. Did you catch it? No screen.
And the part that gets really confusing to me started when Meta’s CTO talked about the concept, describing it as “the most advanced thing that we’ve ever produced as a species,” only to not showcase it. And now, Bosworth has done so again with an even harder-to-swallow tease.
This info stems from yet another AMA (ask me anything) that Andrew Bosworth — CTO of Meta — hosted over on Instagram. And if you’re new to this, let me tell you: Andrew does a lot of these and he loves mingling with the community… And teasing Meta’s fans in the process. To quote:
Heads-up-displays are a very specific concept of what a screen could look like in glasses. And insofar as it's really interpreting the world in real-time and processing things around it, that's probably further out on the processing side. On the display side, we are gonna have, I stand by it, the world's most advanced pair of glasses this year, but it's internal only. We'll have quite a few of them but they're not economically viable for large-scale production and distribution yet. It's a bit of a time machine for us to look into the future so we can build and design for the future. We're certainly working on glasses that have displays at multiple levels, smart glasses all the way up to full AR glasses, but I don't have the time line to share now.
— Andrew Bosworth, CTO of Meta, February 2024
So, basically, Bosworth toned things down and he’s no longer claiming that these theoretical AR glasses are the bee’s knees, but then again, “world's most advanced pair of glasses” isn’t that far off either. The kicker?
Oh, they’ll be ready in 2024. Just not for you and me: they’ll be for internal use only.
While that does mean that we may get a glimpse of the concept, since Meta isn’t all too shy when it comes to showcasing prototypes, I’ve got a better question:
How is the issue “large-scale production and distribution”? How are these not economically viable in 2024, given that Google made Glass in 2013? I know that inflation has been crazy lately and I’m sure that there are also tons of other factors involved that we can’t even begin to imagine.
But getting a bit more context and detail would be much appreciated, right?
Given that we haven’t seen anything close to Glass in an entire decade, doesn’t it make more sense to release some sort of showcase product? Imperfect, yet functional, in order to test the waters. Much like Apple did with the Vision Pro and Meta did with the Ray-Ban smart glasses collab.
In all honesty, I can’t think of a single reason to prefer the release of the ultimate pair of AR glasses, which will apparently help the human race make sense of its very existence. But hey, Bosworth also loves to talk about how eye-tracking for UI navigation on the Quest 3 wasn’t feasible, even though it’s a thing on the PSVR2.
Then again, all of this is food for thought. At least until Meta actually gives us a glimpse of these (allegedly) spectacular AR specs.