How Leia’s new Lightfield monitor works
Yesterday, Leia CEO David Fattal shared background on the inception of the platform, and now he’s digging into how Leia’s 15.6" 3D monitor platform breaks new ground through a combination of computer vision, software, and hardware design.
How has 3D Lightfield improved over the last generation of this technology?
Perhaps the most meaningful improvement is the addition of head tracking, which we didn’t use before. Instead of rendering all of the views, we render just two views on the PC and shift them as your head moves. We can do this to great effect with real-time 3D content, stereo images and videos, or straight from VR. As a result, we get much greater graphical fidelity and more views than previous generations of 3D Lightfield technology.
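To picture how head-tracked, two-view rendering can work, here is a rough sketch of deriving two eye cameras from a single tracked head position and keeping the virtual window locked to the physical screen with off-axis frusta. The constants, function names, and tracker interface are illustrative assumptions, not Leia’s actual SDK or calibration.

```python
import numpy as np

# Illustrative constants; not the real panel geometry or Leia's API.
IPD_MM = 63.0        # assumed average interpupillary distance
SCREEN_W_MM = 344.0  # rough width of a 15.6" 16:9 panel
SCREEN_H_MM = 194.0  # rough height of a 15.6" 16:9 panel

def eye_positions(head_mm):
    """Split one tracked head position (x, y, z in mm, screen-centered)
    into left- and right-eye positions."""
    offset = np.array([IPD_MM / 2.0, 0.0, 0.0])
    return head_mm - offset, head_mm + offset

def off_axis_frustum(eye_mm, near=10.0):
    """Build an asymmetric frustum so the rendered scene stays registered
    to the physical screen no matter where the eye sits."""
    ex, ey, ez = eye_mm
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-SCREEN_W_MM / 2 - ex) * scale
    right  = ( SCREEN_W_MM / 2 - ex) * scale
    bottom = (-SCREEN_H_MM / 2 - ey) * scale
    top    = ( SCREEN_H_MM / 2 - ey) * scale
    return left, right, bottom, top

# Per frame: one head sample from the tracker, exactly two rendered views.
head = np.array([20.0, -5.0, 500.0])  # example: head roughly 50 cm from the screen
for eye in eye_positions(head):
    frustum = off_axis_frustum(eye)
    # render_view(camera_at=eye, frustum=frustum)  # hypothetical engine hook
```

The point of the off-axis projection is that only two renders are needed per frame, yet the image still appears anchored to the screen as the viewer moves.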
Our software determines, down to the individual pixel, which pixels should go to the right eye versus the left eye. This is all done dynamically in real time, moving with your head. It gives you a pretty awesome sense of immersion, three times better than the original Lume Pad.
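That per-pixel routing can be pictured as a weaving step: given the two rendered views and the current head position, each display pixel is assigned to whichever view the optics will steer toward that eye. A minimal sketch, assuming a simple slanted-lens model with made-up geometry constants (the real per-device calibration is not public):

```python
import numpy as np

def weave(left_img, right_img, phase, pitch_px=2.0, slant=1/3):
    """Pick, per pixel, whether the left or right view is visible at the
    current head position. `phase` is updated by the head tracker so the
    pattern shifts as the viewer moves; all geometry here is illustrative."""
    h, w, _ = left_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Fractional lens coordinate for each pixel under a slanted lens array.
    lens_pos = (xs + slant * ys + phase) % pitch_px
    use_right = (lens_pos >= pitch_px / 2.0)[..., None]
    return np.where(use_right, right_img, left_img)
```

In practice this kind of weaving runs on the GPU every frame, which is why the assignment can follow the head with no visible lag.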
That active Lightfield mode really does make a huge difference. It’s a great experience for a single user, cleaner and deeper, as opposed to a good experience for multiple users crowding around the screen.
There’s a camera at the top of this new display. Is it optional? Or is that required for this monitor to function properly? And how many people can view this at the same time?
There’s a shared-viewing 3D mode that is possible with this new platform. It can operate similarly to our previous Lightfield devices and can be viewed by more than one person at a time. That doesn’t rely upon the camera.
Most of the time, though, you’re going to have a single user in front of the monitor. We have it as optional for now, but it will likely just be a part of the monitor going forward. Really, the results speak for themselves. It’s just an overall better, more naturally immersive experience when paired with the camera. That decision would be up to our OEM customers, though.
We’re going to be showing some tracked-stereo content at Display Week, for example. It’s easier on the rendering: if you’re rendering in Unity or Unreal, you only need to create two views instead of eight or more. It also ensures compatibility with a lot of the stereo content out there, like what we license from major Hollywood studios.
Let’s talk about the differences between the mobile experience we’ve shown in the past and what’s coming next.
Obviously, there’s a big difference between mobile and desktop use in terms of rendering. In tracked-stereo mode you only have to render two views versus four on the Lume Pad, so it tends to be a lot easier. That matters because if we want to go into competitive gaming, for instance, we cannot compromise on frame rate or quality in any way.
That said, computer vision is being used for 2D-to-3D conversion. It was already really good on the Lume Pad, but on future products, including this monitor, we can render these conversions in real time at full frame and high resolution, with depth maps and all the details. That means our app LeiaTube (which converts 2D YouTube video into 3D content in real time) will get better. It also means we can take a plain 2D video chat and convert it into 3D on the fly at really good quality.
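The core idea behind this kind of 2D-to-3D conversion is depth-image-based rendering: a monocular depth estimate drives a horizontal shift of each pixel to synthesize the second eye’s view. A simplified sketch, assuming a depth map already normalized to [0, 1] with larger values meaning closer (the depth estimator itself and the hole filling a production pipeline needs are omitted):

```python
import numpy as np

def synthesize_stereo(frame, depth, max_disparity_px=16):
    """Warp one 2D frame into a left/right pair by shifting pixels
    horizontally in proportion to their estimated nearness."""
    h, w, _ = frame.shape
    disparity = (depth * max_disparity_px).astype(np.int32)
    left = np.zeros_like(frame)
    right = np.zeros_like(frame)
    xs = np.arange(w)
    for y in range(h):
        lx = np.clip(xs + disparity[y], 0, w - 1)  # near pixels shift further
        rx = np.clip(xs - disparity[y], 0, w - 1)
        left[y, lx] = frame[y]
        right[y, rx] = frame[y]
    # A real pipeline would also fill the disocclusion holes this warp leaves behind.
    return left, right
```

Running this per frame on a GPU, rather than in a Python loop, is what makes real-time conversion of streaming video or a live video call plausible.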
Every improvement we make to our computer vision is going to make things work better, whether it runs locally and offline on a tablet or is fed through our cloud-based LeiaPix Converter. In both cases, we’re constantly working to improve functionality.
Tomorrow, in the final installment of this Display Week series, we’re going to look at some of the markets David Fattal believes have a lot to gain from 3D Lightfield display tech, and look to the future of Leia.
Don’t forget that you still have a chance to see this new 3D monitor platform at Display Week 2022. Stop by the Leia Inc booth (BOOTH #1521) where you can experience this new 3D Lightfield monitor platform for yourself!