The future of 3D Lightfield technology
In this short Display Week series, Leia Inc CEO David Fattal explained the origins of our newest 3D display (The Monitor 15.6") and went into detail about how the company’s Lightfield technology has evolved over a very short period. For this final installment, David shares his thoughts about some of the markets that will be able to benefit from it, sample use cases, and hints at what comes next.
Where do you see OEMs (and people) getting the most benefit from using 3D Lightfield?
First and foremost, I see a huge benefit in opening the doors to the Metaverse to so many more people. With this 3D upgrade to familiar devices, whole worlds of 2D content become new and deeper. So while millions of people may touch the Metaverse through VR in the future, what about the billions of others who could experience it through their next phone, tablet, or monitor? Of course, we are also looking to some OEM partners to help incorporate our technology into their visions.
One of the big opportunities I see is 3D chat: video that comes off the screen conveys a much deeper sense of presence.
For that purpose, we could use our 2D-to-3D conversion so that a Leia user can see even video coming from a 2D source converted into 3D. Of course, the best result would be people on both sides of the video chat using stereo cameras and Leia hardware. This is part of our plan for our own products going forward, and we are encouraging our OEM partners to enable that feature as well by incorporating a stereo camera into their devices. Having those front-facing stereo cameras removes the need for computer vision to convert live-streamed video chats. That means much higher quality on both ends and eliminates artifacts.
When we talk about bringing people together within the Metaverse this is the technology we envision helping us along that path. It is immersive 3D with the fewest possible barriers.
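As an aside for developers curious about the conversion David mentions: converting a 2D video source to 3D generally means estimating per-pixel depth from the single camera and then synthesizing a second eye view from it. Below is a minimal, purely illustrative sketch of that second step (naive depth-image-based rendering). The function name, disparity model, and depth convention are assumptions for illustration, not Leia's actual pipeline.

```python
import numpy as np

def synthesize_right_view(frame, depth, max_disparity=8):
    """Synthesize a right-eye view from a frame and its estimated depth map.

    Assumed convention: depth is normalized to [0, 1], where 0 is nearest.
    Near pixels get the largest horizontal shift (disparity), which is what
    makes foreground objects "pop out" when the pair is viewed in stereo.
    """
    h, w = depth.shape
    right = np.zeros_like(frame)  # holes left black in this naive sketch
    # Disparity in pixels, proportional to nearness.
    disparity = (max_disparity * (1.0 - depth)).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]  # shift pixel left for the right eye
            if 0 <= nx < w:
                right[y, nx] = frame[y, x]
    return right
```

A real pipeline would additionally fill the disocclusion holes this leaves behind, which is one reason capturing with true stereo cameras, as suggested above, yields cleaner results than conversion.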
Another place where we think people will benefit from better 3D is gaming. For casual gaming, it’s awesome. That is easy enough to see with the games we already have on the Lume Pad. We are currently doing a study on what it could mean, adding those layers of depth to first-person shooters and competitive gaming — everything we can do to give you better situational awareness without losing frame rate. We do think that it can enhance the gameplay and give players a better depth perspective.
That is very interesting for competitive gaming. Conveying a better sense of depth and distance within the environment, making those differences more pronounced, could give players an advantage.
Yes, typically you have a flat scene and you have your opponents moving in the distance. You’re able to easily notice objects and people popping out in the foreground. When something is moving on a 2D screen, it can be a little tougher to tell if it’s moving closer or farther away from you, but here it’s very naturally pronounced.
Like I’ve said though, this is still something we are studying.
A few moments ago, you mentioned how this technology helps democratize the Metaverse. Is that why you think this is a good time for devs to design for 3D Lightfield?
With the proliferation of VR content, the investment in Metaverse-focused content, and the maturity of 3D engines, there has never been a better time to develop 3D content. There are better tools available, and you also have software such as Adobe Substance that is aimed at democratizing access to 3D creation for the Metaverse.
You have all these tools that weren’t available even a couple of years ago. That’s a big part of why we’ve created SDKs that simply and easily plug into existing tools such as Unity and Unreal — to make it a quick, seamless process. And, from here, it’s only going to get better.
I also truly believe that our tools, both software and hardware, have an opportunity to bridge the gap between the Metaverse and how people can access it. By that I mean getting there without having to invest a lot of money or effort. You just look at a Lightfield screen and you’re experiencing a different perspective of the Metaverse.
Right now, on most desktops, we see people with huge monitors and even multi-monitor setups. Where do you think this monitor fits into the equation right now? And how well does it function as a second screen within a multi-screen setup?
Further down the line, we envision large desktop screens capable of what we're showing off at Display Week. A 15.6" screen is rather small, and this is just the start. For now, these screens are well suited to laptops, infotainment systems, and personal-use second screens.
Larger monitors from us are coming and they will be focused on creators and gamers. Following that, more general consumer models. As I’ve said, we see 3D chat as being that killer app and we want to ultimately make it as widely available as possible so that everyone can experience this game-changing way to connect with people.
Can we talk about any partners using this panel at this time?
What I can say for now is that we are working with big US companies on this theme of telepresence and you’ll see further news from us in the near future.
If you are coming to this series late, go back to the start where Leia CEO David Fattal gives a lot more context to the new 3D monitor platform we showed off during Display Week.
If you’re at Display Week 2022, there’s still a chance to experience this new 3D Lightfield monitor for yourself at BOOTH #1521. Not at this show? We will also be at the upcoming AWE 2022 in Santa Clara in June!