A picture-perfect metaverse is years away. Meta’s prototypes prove it.


Meta’s Starburst virtual reality prototype doesn’t look anything like a traditional headset.

From some angles, it looks like someone ripped the guts out of a tiny desktop computer — including the fans — and attached a pair of heavy-duty handles to it. And those are crucial because Starburst is too heavy to wear, a result of the bulky, self-contained lamp bolted to its back.

By Meta chief executive Mark Zuckerberg’s admission, Starburst is “wildly impractical” in its current form. But for a company that wants to give its users virtual experiences that are nearly indistinguishable from the real thing, these enormous VR binoculars are still an important development.

To truly blur the line between the physical and the virtual — to pass the “visual Turing test,” as some researchers put it — Meta has to clear some serious hurdles. Future headsets need to be sleeker than the ones we have now, and yet more capable. And the screens inside them need to be sharper, smarter and brighter than anything out there right now.

That’s why Starburst was built around a big lamp — it’s one prototype, meant to tackle one big issue. And it’s not alone.

“The goal of all this work is to help us identify which technical paths are going to allow us to make meaningful enough improvements that we can start approaching visual realism,” Zuckerberg told journalists during a presentation.

That verisimilitude is a crucial part of his vision of the metaverse: an immersive “embodied internet” where users will feel like they’re inhabiting a space instead of just looking at it. But despite the wave of metaverse hype Zuckerberg launched after laying out that vision last year, Meta’s prototypes offer a palpable sense of just how far the company is from delivering on that promise.

For one, the company has to figure out how to make everything we see through a headset more detailed.

Think of your TV or your computer monitor: the higher the resolution, the crisper and more realistic the picture. But the tiny screens inside current VR headsets can’t get close to that crispness — they have too few pixels, stretched across too wide a field of view.

Another prototype, Butterscotch, sort of fixes the problem. It’s bigger than anyone would want to wear for very long, and “nowhere near shippable” according to Michael Abrash, chief scientist at Meta’s Reality Labs division. Even so, the visuals it produces are detailed enough that a wearer could read the bottom, 20/20 line of a virtual vision chart — not bad, compared to the blurry splotches seen through a Meta Quest 2.

The catch? Researchers had to narrow the field of view to about half of what you’d see through the Quest 2. That is, looking through Butterscotch shows you less of the virtual world in front of you — but what you can see looks very clear. Not a great trade-off, but Abrash concedes it’ll be a few years at least before the right kinds of screens exist.

“There are currently no display panels that support anything close to retinal resolution for the full field of view of VR headsets today,” he said.
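The math behind that quote is simple division: sharpness in a headset is usually measured in pixels per degree of field of view, so the same panel looks sharper when it covers a narrower slice of the world. The sketch below illustrates the trade-off with rough, publicly cited ballpark figures (roughly 1,832 horizontal pixels per eye and a roughly 90-degree field of view for the Quest 2, and about 60 pixels per degree as a common shorthand for “retinal” sharpness); these are illustrative assumptions, not official Meta specifications.

```python
# Back-of-the-envelope angular resolution: pixels per degree (ppd).
# All numbers are rough, illustrative approximations -- not Meta specs.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Average number of pixels packed into each degree of the field of view."""
    return horizontal_pixels / fov_degrees

quest2_ppd = pixels_per_degree(1832, 90)  # ~1,832 px/eye across ~90 degrees
retinal_ppd = 60                          # common shorthand for "retinal" sharpness

print(f"Quest 2: ~{quest2_ppd:.0f} ppd vs. ~{retinal_ppd} ppd retinal")

# Halving the field of view with the same panel doubles the ppd --
# the kind of trade-off Butterscotch makes to hit 20/20-level detail.
print(f"Half the field of view: ~{pixels_per_degree(1832, 45):.0f} ppd")
```

Run with these assumed numbers, the Quest 2 lands around 20 pixels per degree — roughly a third of the way to “retinal” — which is why narrowing the view was the shortcut available to the Butterscotch team.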


Another prototype, called Half Dome, was first dreamed up in 2017 and is now on its third revision. Inside this headset and others like it, Reality Labs researchers have been fine-tuning what they call “varifocal” lenses — those that physically and automatically move to help wearers’ eyes focus on virtual “objects” in front of them.

If you were wearing a traditional VR headset, you’d find that the focal distance is set a few feet in front of you. Try to bring an object — say, a virtual handwritten letter — closer to your face, and you might find that you can’t read it.

In a situation like that, your real eyes are focusing just fine — the issue is, your view of the world is naturally a little farsighted. Varifocal lenses, then, are like a pair of glasses with a life of their own, moving around to keep virtual objects in focus no matter where they are.
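The optics behind that trick can be sketched with the classic thin-lens equation: a headset lens forms a virtual image of the display, and nudging the display (or lens) by a few millimeters moves that image from arm's length to across the room. The numbers below — a hypothetical 40 mm lens and example object distances — are illustrative assumptions, not the actual optics of Half Dome.

```python
# Thin-lens sketch of the varifocal idea: physically move the display so
# a virtual object at the desired depth lands in focus. The 40 mm focal
# length and distances are hypothetical, for illustration only.

def display_distance_mm(focal_mm: float, virtual_image_mm: float) -> float:
    """Display-to-lens distance that places the virtual image at the
    requested depth. Thin-lens equation with the virtual image on the
    viewer's side of the lens:
        1/f = 1/d_display - 1/d_image
    so  d_display = 1 / (1/f + 1/d_image)
    """
    return 1.0 / (1.0 / focal_mm + 1.0 / virtual_image_mm)

f = 40.0  # hypothetical lens focal length, in millimeters
far = display_distance_mm(f, 2000.0)  # object about 2 meters away
near = display_distance_mm(f, 300.0)  # a letter held about 30 cm away
print(f"far: {far:.2f} mm, near: {near:.2f} mm, travel: {far - near:.2f} mm")
```

With these assumed values, refocusing from two meters to 30 centimeters takes only about four millimeters of travel — small enough that motors inside a headset can plausibly do it, which is the bet behind the Half Dome line of prototypes.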

Meta has been experimenting with these lenses for the better part of five years, the company says, and despite once claiming that they were nearly “ready for prime time,” they have not appeared in any headset you can buy at the moment. And for now, that seems unlikely to change.

“Even when you sometimes have a prototype that looks like it’s working, actually getting that into a product can take a while,” Zuckerberg said. “We’re working on it.”

One final prototype Meta showed reporters — dubbed Holocake 2 — drove Zuckerberg’s point home.

Unlike other experimental headsets Meta showed off, Holocake 2 is fully wearable and functional — it can connect to a computer and run existing VR software without a hitch. And because of the specific way researchers designed its optics, Holocake is the thinnest, lightest VR headset the company says it has ever made.

But even that doesn’t mean Holocake is ready to debut on store shelves anytime soon. Unlike more conventional VR headsets, Holocake 2 uses lasers as light sources instead of light-emitting diodes, or LEDs. (You know, the things in some of your lightbulbs.)

“As of today, the jury is still out on finding a suitable laser source, but if that proves tractable, there will be a clear path to sunglasses-like VR displays,” said Abrash.

The fact that these prototypes exist is proof that these problems can be tackled individually — if not always elegantly. The real rub, though, is building a single headset that addresses all of these areas and manages to be comfortable and power-efficient at the same time. And researchers suspect the end result could resemble a concept design called Mirror Lake.

While it doesn’t exist as a working prototype (and probably won’t for a while), Mirror Lake packages many of those visual advances — plus a display that shows bystanders the wearer’s eyes and face — into a headset that looks like a pair of ski goggles.


Douglas Lanman, director of display systems research at Meta’s Reality Labs division, also called Mirror Lake the company’s first “mixed reality” concept, referring to a kind of wearable display meant to blend digital objects and environments into your view of the physical world.

It would be a “game changer for the VR visual experience,” says Abrash. Now Meta just needs to make it — or something like it.

In the meantime, the company faces other headwinds.

Meta’s revenue growth has begun to slow, and Reuters reported last month that the Reality Labs division could not afford to pursue certain projects. Hiring has also slowed at the company, though spokesperson Elana Widmann said Meta has “no plans for layoffs at this time.” And while the company was expected to release a pair of augmented reality glasses, code-named Project Nazare, in 2024, those plans were reportedly scrapped in favor of turning the glasses into a demo device.

“We’re evaluating key priorities across the company and putting energy behind them especially as they relate to our core business and Reality Labs,” Widmann said in an email.


Source: WP