The history - and future - of Leia's 3D Lightfield and AI
2023 will be a breakthrough year for 3D Lightfield technology. In fact, let’s just call it “2023D.”
It’s starting at CES 2023 with four Innovation Award wins. We’re taking this moment to celebrate, reflect on the lessons learned along the way, and start laying the foundation for what’s to come this year in a chat with David Fattal, Co-Founder and CTO of Leia Inc.
Several converging technologies are transforming science fiction into reality, and David has seen it all firsthand – ever since he co-created 3D Lightfield technology in HP’s labs.
Before we talk about anything else, we have something to celebrate: four CES Innovation Awards!
I couldn’t be prouder of this moment. If you win one award at CES, that’s one thing. When you win FOUR you know that you’re onto something. We were recognized for Computer Hardware and Components, Computer Peripherals & Accessories, Embedded Technologies, and Gaming.
This is a tribute to the hard work and dedication of the team here at Leia. We’ve been experimenting with new 3D platforms and AI-driven 3D, iterating on what works, and streamlining it to be better than ever. From a variety of display form factors, to bringing head-tracking technology to our 3D Lightfield devices, to our first-ever 3D chat – and we’re bringing more to users in 2023. We’ve come a long way!
Taking a step back for a second, how has the display world changed since you first forged the initial ideas of 3D Lightfield?
It’s quite interesting, because when we formed the company in 2013-14, it was during the craze – and flop – of 3D TV. We lived through that transition, and every seven years there’s a new cycle. The memory of that old 3D implementation’s failure has faded, and now we’re seeing a renewal of interest.
As you said, it’s definitely been a journey these past eight years. What are some of your fondest memories of the early days and key things learned along the way?
When we left HP and spun out our company, it obviously felt freeing. What we were initially doing was on the side, while working in HP’s labs was our day job. Suddenly, we were unleashed and able to focus on the tech.
When we formed Leia, it took quite a lot of setup with our manufacturing facility and lots of prototyping to lay the groundwork for what we have now. It took us about three years to get properly up and running. From the beginning, we charged ourselves with creating a scalable manufacturing solution for the high-quality results people would expect in a consumer product. It felt quite liberating at first, but the reality of the business soon caught up with us.
FIRST TEST DEVICE FROM LABS (2013)
The screen-to-device ratio was pretty poor on that first device [laughs]; now we’re talking about borderless 3D displays. For this one, we were using a high-density display like the ones you see in VR these days, but this was from 2013. We were trying to provide a lot of viewpoints. It was working, but the image quality was poor and blurry – all that said, it laid the foundation for what came next.
FIRST DEMO DEVICE FROM LEIA (2014)
This was our first commercialized dev kit. It was a better, cleaner overall package, and it was our first way to show people the direction we were looking to take things. Using WebGL, people could hook it up to their computer and see content in 3D.
SCREEN FOR THE HYDROGEN ONE PHONE (2017)
I think the phone was very interesting. It came about almost out of necessity after talking with the partner. They challenged us when we first met: we had to be ready in six months to deliver the backbone for the 3D screen technology that was ultimately used in the phone, and it was a challenge on so many levels – a good challenge to have! It was a challenge on the design, it was a challenge on the supply chain, and this is where we started our software and content efforts. We realized that we needed a place to put our 3D apps (an app store) and streaming movies. Oh, and that all had to be done in a couple of months. So we had to create the hardware and define the backend experience at the same time. It was the most prolific, exciting, and crazy time for the company.
THE ORIGINAL LUME PAD (2020)
This was really scaling up the Hydrogen phone into a tablet form factor. We leveraged the content and app store that were made for the H1 – they also worked on the Lume Pad. The positive feedback from the Lume Pad was that bigger was better. Movies, pictures, and everything was so much better. However, we also learned that if you release technology too soon – even if it’s a limited release platform – you must meet certain quality standards.
We took that lesson and went back to the drawing board, simplifying the tech in some ways. The biggest example: we’re now using head tracking to provide a superior, more focused experience.
We’re always trying to push the technology to where the cost is acceptable and the performance is there. That gets us to where we are today, which is absolutely, broadly consumer-ready.
What do you think has been holding things back up to this point?
It took another change that we implemented in 2021 to really push us forward. It was the realization that we needed eye-tracking to deliver an amazing experience to one person, as opposed to what we were exploring in the past: projecting multiple views at a lower resolution so that more people could experience it at once.
Without head-tracking, you need to dedicate too many pixels and resources to cover wherever viewers might be. When you focus on one viewer being picked up by cameras, it becomes a much more pleasing experience. Quality drastically increases, the resolution improves, and the viewable depth becomes much better with our newer approaches.
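The pixel-budget tradeoff David describes can be sketched with some back-of-the-envelope arithmetic. This is purely an illustration of the general principle, not Leia's actual figures: the `per_view_resolution` function, the panel resolution, and the view counts below are all assumptions.

```python
def per_view_resolution(native_width: int, native_height: int, num_views: int) -> int:
    """Approximate pixels available per viewpoint when a lightfield
    display divides its native resolution evenly among num_views views.
    (A simplification: real multi-view optics split pixels in more
    complex spatial patterns.)"""
    total_pixels = native_width * native_height
    return total_pixels // num_views

# Illustrative 2560x1600 panel (an assumed resolution, for the example only):
# an untracked 8-view display vs. a head-tracked stereo pair (2 views).
untracked = per_view_resolution(2560, 1600, 8)
tracked = per_view_resolution(2560, 1600, 2)
print(f"8-view: {untracked:,} px/view, tracked stereo: {tracked:,} px/view")
```

Under these assumptions, tracking a single viewer frees up four times as many pixels per eye, which is the resolution gain the answer above is pointing at.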
There’s a lot of complementary tech that is also helping stretch the limits of 3D. For example, people have thought about creating 3D content for VR headsets and, to a degree, for exploring the concepts of the metaverse. Going into 2023, we’re in a very different position where display tech has evolved. We can support higher resolutions and brighter, more vivid pixels, but at the same time there’s an awareness of the need for content that can be perceived the way it is in the real world.
We have announcements coming later in 2023, but can you help explain why you think this is 3D’s breakthrough year?
We now have the tools and creative software (and AI, of course) that allow not only professionals but also consumers to easily create and experience content in 3D. Once you have that, it makes sense to have a 3D device for everyone.
With the advances we’ve made over the past two years alone, what we hear from people at trade shows when we show off the newer development platforms isn’t “Do I want this?” but “When can I buy it?” The answer from us is going to be, “Sooner than you might think.”
It’s a very different challenge for us. We can keep improving the tech, but we’re at the point where we’ve passed the quality bar. Now we have to ask ourselves how we accelerate adoption, whether it’s on a high-end device like a tablet or maybe even on a mid-tier device.
…and what we have to show off and announce in 2023 is all part of that big next step. I’m excited for the future because we’re putting creation in the spotlight. We’re making it easier than ever to produce and experience 3D like no one has to date.
For those attending CES: You’ll be able to see some development platforms for yourself.
I’m really excited for this year – and can’t wait to show everyone what we’ve been developing in our labs!
If you’re attending CES 2023 in Las Vegas, come join us!
Booth #50961 in The Venetian
Stop by or DM us on socials at @LeiaInc! We’ll have a few people on hand who are happy to show you how it works – and give you a glimpse of what’s coming next!