Metaverse, schmetaverse
Jul. 28th, 2022 12:13 am

The "Metaverse" was summoned into existence with IRC, back in the '80s. Zuck's vision brings nothing new to the table, except a retrograde version of the adorable cyberspace meeting rooms that William Gibson was already writing about. (Again, back in the '80s.)
It completely and perhaps deliberately misses the point. The next step is not about anything so grand. The "killer app" is niche and is right in front of most people: AR/VR headsets are on the verge of becoming the new essential engineering and artistic tool, not a replacement for the current emperor of communication tools (the smartphone).
Here's an example that should be close to everyone here, i.e. software developers: You sit down in a coffee shop with a foldable keyboard in front of you, and put on the VR headset. You bring no laptop with you, and no notebooks or manuals either. As soon as the headset is in place, external cameras on it immediately recreate the view around you as-is, but slightly dimmer. Then in front of you (in the virtual space), half a dozen five-foot-high curved screens of code appear, in precisely the configuration they were in when you left the house 20 minutes ago.
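To make the "precisely the configuration you left them in" part concrete, here's a minimal sketch of how a headset might persist a spatial workspace and restore it somewhere else. Everything in it (the class names, fields, and file format) is a hypothetical illustration, not any real headset API.

```python
# Hypothetical sketch: each virtual screen stores its pose relative to a fixed
# room anchor, so the same layout can be restored the moment the headset comes
# back on. Names like VirtualScreen and Workspace are illustrative only.

import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class VirtualScreen:
    content_id: str        # e.g. an editor buffer or terminal session
    position_m: tuple      # (x, y, z) offset from the room anchor, in meters
    size_m: tuple          # (width, height) of the curved panel
    curvature_deg: float   # how far the panel wraps around the user

@dataclass
class Workspace:
    anchor_id: str         # identifier of the room-scale anchor
    screens: List[VirtualScreen]

    def save(self, path: str) -> None:
        with open(path, "w") as f:
            json.dump(asdict(self), f, indent=2)

    @staticmethod
    def load(path: str) -> "Workspace":
        with open(path) as f:
            raw = json.load(f)
        screens = [VirtualScreen(**s) for s in raw["screens"]]
        return Workspace(anchor_id=raw["anchor_id"], screens=screens)

# Leaving the house: serialize the six screens exactly where they float.
home = Workspace(
    anchor_id="desk-anchor",
    screens=[VirtualScreen(f"editor-{i}", (i - 2.5, 1.5, 2.0), (1.2, 1.5), 40.0)
             for i in range(6)],
)
home.save("workspace.json")

# Twenty minutes later, in the coffee shop: restore the same configuration
# relative to a new anchor in front of the keyboard.
restored = Workspace.load("workspace.json")
print(f"Restored {len(restored.screens)} screens from '{restored.anchor_id}'")
```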
"So what?" you say. "This is basically like having a laptop." Well sort of. But the point is, it's actually EASIER to deploy than the laptop. THAT'S when it becomes the killer device.
The headset has one wire that goes over your shoulder down to a battery, which you can keep in your pocket. Other than that, there are no wires to deploy. There are no controllers to pick up. You can still see and feel the keyboard in front of you. You do not need a mouse pad or a trackpad, but you do have a pointer: You move it by raising one index finger off the keyboard and pointing. Note that this is actually EASIER than lifting one hand up and placing it on a trackpad. You click by tapping your thumb, ever so slightly, wherever it is. The LIDAR sensors on the bottom of the headset track all of this, as long as you don't turn your head more than 90 degrees away from your hands.
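To pin the pointer behavior down a bit, here's a minimal sketch of the translation from hand-tracking frames to pointer events, assuming the tracker reports whether the index finger is raised, where it points, and whether the thumb just tapped. All the types and field names are made up for illustration; real hand-tracking APIs differ.

```python
# Minimal sketch of the finger-pointer idea: raise the index finger to move
# the pointer, tap the thumb to click, and lose tracking if the hands drift
# more than ~90 degrees away from the gaze. Hypothetical types throughout.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HandFrame:
    index_raised: bool                           # index finger lifted off the keys
    pointing_at: Optional[Tuple[float, float]]   # (x, y) on the virtual screen, if raised
    thumb_tapped: bool                           # slight thumb tap detected this frame
    head_yaw_to_hands_deg: float                 # angle between gaze and hands

def pointer_event(frame: HandFrame) -> Optional[str]:
    """Translate one tracking frame into a pointer event, or None."""
    # Tracking only holds while the hands stay within ~90 degrees of the gaze.
    if abs(frame.head_yaw_to_hands_deg) > 90:
        return None
    if frame.thumb_tapped:
        return "click"
    if frame.index_raised and frame.pointing_at is not None:
        x, y = frame.pointing_at
        return f"move to ({x:.2f}, {y:.2f})"
    return None  # hands are just typing; no pointer activity

# Example: finger raised and pointing, then a thumb tap to click, then typing.
frames = [
    HandFrame(True, (0.42, 0.77), False, 15.0),
    HandFrame(True, (0.42, 0.77), True, 15.0),
    HandFrame(False, None, False, 15.0),
]
for f in frames:
    print(pointer_event(f))
```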
The displays are fixed to the environment, and can of course be easily rearranged. By the time you see an implementation of this, the whole concept of windows in a display will have been rethought to include elements that respond specifically to the movement of your eyes, elements that warp space, faux "work environments" with notes scattered on a desk, tools to examine data structures and memory contents in cube form, integrated side-by-side interaction with the same environment by two people, and so on. It would take the whole notion of screen sharing in a meeting to the next level, obviously.
If the device is light and responsive and high-resolution enough, you will suddenly hate developing on tiny screens anchored to keyboards. You will get used to having your custom 360-degree "work environment" deployable around you at a moment's notice no matter where you are, and you will begin using the more positional aspects of the environment to keep track of things and context-switch in ways you haven't even thought of.
Now, if it can be that useful to you as a software developer - a person who really just needs big legible grids of code and a good keyboard to do their job - think about how useful it's going to be to an architect, a painter, a photographer, an interior designer ... a piano teacher guiding her student's hands, a mechanic learning about an unfamiliar engine, a geologist making sense of land survey data ... anyone who would rather not have their visual information confined to a square.
Of course, a gadget that's this good at what it does would be pricey. But you could spend $1500 on a light but powerful laptop, or you could spend $2500 on this and never need an external display again. The $1000 difference is roughly what you'd have spent on external monitors anyway, so at that point you break even.
Now, to get back to my point about Zuck and his "vision":
NONE OF THESE APPLICATIONS HAVE JACK SQUAT TO DO WITH A METAVERSE. They are serious things done by people at work, and they do not need any "social" component beyond what is already implemented. That whole vision of cool people "hanging out" in some virtual universe and having a blast with their spare time ... bugger that. Anyone with spare time is going to pull these devices off and go outside. But make no mistake, the technology is just about here to make a device of this type and quality, and people are going to find it very useful.