GB: Tanks of what?

Blackley: A monomer is a resin that can be polymerized in some way so it becomes a solid. In an epoxy, for instance, you mix two liquids and they become a solid. In 3D printing that uses lasers, the laser solidifies this fluid wherever it touches. Our method uses a similar fluid. We have chemists here now working on this all the time. Our fluid is a light-sensitive monomer that polymerizes when a certain threshold of photons hits a specific molecule. We can project this structure made out of light into the tank of monomer, and it will solidify wherever the hologram’s photons are. It won’t solidify anywhere else. If I make a hologram in the shape of a paper clip, you can pull out a paper clip right away. We can do a bunch of other objects as well.
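
To make the threshold idea concrete, here is a minimal conceptual sketch in Python. It assumes a voxelized tank and an arbitrary dose threshold; the grid size, threshold value, and helper names (`accumulate_dose`, `solidified_voxels`) are purely illustrative, not Daqri’s chemistry or code. The point is simply that voxels cure only where the accumulated photon dose crosses the polymerization threshold.

```python
import numpy as np

# Conceptual sketch (not Daqri's actual process): a voxel grid records the
# photon dose delivered by the projected hologram over an exposure window.
# Voxels solidify only where the accumulated dose crosses the polymerization
# threshold; everywhere else the monomer stays liquid.

GRID = (64, 64, 64)               # voxelized tank of monomer (illustrative)
POLYMERIZATION_THRESHOLD = 1.0    # arbitrary dose units (illustrative)

def accumulate_dose(frames):
    """Sum per-frame 3D intensity fields, e.g. successive hologram exposures."""
    dose = np.zeros(GRID)
    for intensity in frames:
        dose += intensity
    return dose

def solidified_voxels(dose):
    """Boolean mask of voxels that received enough photons to cure."""
    return dose >= POLYMERIZATION_THRESHOLD

# Example: a synthetic target region gets concentrated light from the hologram,
# while the rest of the tank only sees stray light that stays below threshold.
target = np.zeros(GRID)
target[20:44, 30:34, 30:34] = 0.6        # bright where the hologram focuses
stray = np.full(GRID, 0.1)               # low-level background light
dose = accumulate_dose([target + stray, target + stray])
print(solidified_voxels(dose).sum(), "voxels cured")
```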

It works, and it’s really neat. You’ve made an actual object out of light, using the principle of holography as computed by this approximation strategy we have, using consumer graphics hardware. We started doing that, and it became immediately clear, to me at least, that the potential for this was enormous. There’s no reason we can’t push this all the way to large-scale displays that are fully interactive and holographic. Groups of people can look at a phone display or a television display, and they’ll all see the same 3D scene from different perspectives, as if they were looking at real objects. All the math we’ve done, all the simulation, all the engineering work shows that this is the case. We can do it at a small scale already.

Above: Daqri is also working on augmented reality heads-up displays for cars.

Image Credit: Daqri

We’re continuously building toward larger devices and more applications of the technology, which we’re calling software-defined light. “Hologram” has become a completely maligned word. It’s lost any meaning. The idea of a hologram now essentially means Pepper’s ghost, a video image reflected off a glass in front of you so it appears to float in the world. These are different. They’re actual things you can see. When you move around, the perspective changes, they occlude themselves, and they do all the things that holograms do. Because they’re real objects made out of light, you can use them to print; the object is really there. It also means you can make beautiful displays, and you can do other interesting applications that take advantage of the fact that you can now directly connect the mathematics in your software to physical light in the world.

That’s where we get to this idea of software-defined light, or SDL, which is really the generalization of this technology. We can compute a light field that we want to generate, a pattern of photons, and if we compute it correctly, it will exist. It’s a direct connection between mathematical computation, information theory work in a machine, and a light field in the real world. That applies to everything from lidar to headlights that don’t blind oncoming cars to anywhere you have a system of lenses or complicated optical devices. Just as software-defined radio allows Qualcomm to put custom radio hardware in phones and other devices, letting a computer send and receive radio, we are now able to use a computer to directly send and receive light.
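
As one illustration of what it means to compute a light field and have it exist, here is a textbook phase-retrieval sketch in Python (a Gerchberg-Saxton style Fourier hologram). This is a standard, generic technique offered only as an analogy; it is not the approximation strategy or hardware Blackley describes, and the array sizes, iteration count, and target pattern are arbitrary.

```python
import numpy as np

# Generic illustration: for a phase-only modulator, the far-field intensity is
# approximately the Fourier transform of the field at the modulator plane.
# Gerchberg-Saxton iterates between the two planes to find a phase pattern
# whose far field matches a desired intensity, i.e. "math in" -> "light out".

def phase_hologram(target_intensity, iterations=20):
    """Return a phase pattern whose far field approximates target_intensity."""
    target_amp = np.sqrt(target_intensity)
    field = np.exp(1j * 2 * np.pi * np.random.rand(*target_amp.shape))
    for _ in range(iterations):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))  # impose desired amplitude
        field = np.fft.ifft2(far)
        field = np.exp(1j * np.angle(field))           # phase-only constraint
    return np.angle(field)  # what you would send to a phase modulator

# Desired pattern: a bright square in an otherwise dark far field.
target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0
phase = phase_hologram(target)
print(phase.shape, float(phase.min()), float(phase.max()))
```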

When all of this became clear, it was obvious that this was an important thing to pursue. I found myself thinking that, as a person with a background in field theory, real-time computing, making hardware, and running a physics lab, I was the guy to take this technology and exploit it as quickly as possible. That’s what we’ve done. My lab has become the graphics prototyping and rapid research lab for Daqri around software-defined light. I’ll be working on all the stuff I just mentioned, and probably some more.

GB: How many people were in your lab? Are they all coming over to join this effort?

Blackley: Everybody came over. We had five people, and then another four or five contractors. Most of them have come over. Now we’re rapidly accelerating our hiring, getting more people with backgrounds in holography, more electrical and optical engineers. We’re having a really good time. It’s a pretty easy sell because we have this stuff working. We bring people in who’ve been working on optics and we say, “Hey, we can do this.” They say, “OK, I’ll start now.”

GB: As far as the application here, is it a brand new product line for Daqri, or is there some way this comes back to help with their smart helmet?

Blackley: This is something of a serendipitous outcome from the AR work on the HUD. The guys on Jamie Christmas’s team who invented this knew that there were other applications. It’s not like we discovered some kind of secret. But we did discover that the techniques they were employing made it possible sooner than anyone had thought. It doesn’t affect the strategy at Daqri. In fact, there are places where using these devices can make their AR displays much better and more powerful. In that way it complements them. But it also introduces ways to take our thinking about AR and extend it into the real world.

Instantaneous printing, if you think about it, is the ultimate example of AR. You take a computer object and you make it real. It augments reality. That sounds like a joke, but it really is the sort of thing people have been thinking about around AR since the ‘40s and ‘50s in science fiction. One of the weird things that happens, in fact, when we show this to people who aren’t familiar with 3D printing technology, is that they’re not that impressed. If you’ve never used a 3D printer, you’re not aware of the state of the art. You probably expect, from Star Trek, that 3D printing involves just pressing a button and having the object in your hand 10 seconds later. So people say, “Well, of course that’s how 3D printing works.” Which is super depressing when you’re trying to impress someone with a demo, but it makes a lot of sense.

GB: Is the first application in 3D printers? Or will you be doing something else?

Blackley: We have a lot of partners at Daqri in various business categories. We obviously have an automotive partner for the HUD. Hundreds of thousands of vehicles are already on the road using the first generation of the HUD technology, and the holographic technology is in that version. Phase two of our product road map covers the near future. We also have other automotive partners that we haven’t announced yet.

When we showed the capabilities of SDL to those automotive partners, they became incredibly interested in its applications for lidar (laser-based range sensing). If I can mathematically steer light and read back the signal, I have a big advantage over mechanically scanning lidar systems. Also, if I’m mathematically defining the outgoing light, I can encode in it all sorts of information that makes the lidar problem easier, faster, more robust, and so on. We’ll be talking about this more in the future. It’s an area of research right now because it’s so important to self-driving car efforts and safety features. It may be that we end up licensing that technology to partners. It may be that we try to build some products. We haven’t made that decision yet.
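
As a sketch of why encoding information in the outgoing light helps, here is a small Python example of coded ranging: modulate the emitted beam with a pseudorandom code, then cross-correlate the noisy, delayed return against that code to recover the round-trip delay. The signal model, code length, and noise levels are invented for illustration and have nothing to do with any Daqri product.

```python
import numpy as np

# Illustrative coded-lidar idea: a pseudorandom +/-1 code is imposed on the
# outgoing light; correlating the return against the code makes the echo's
# delay (and hence the range) stand out even in heavy noise.

rng = np.random.default_rng(0)
code = rng.integers(0, 2, 1024) * 2 - 1       # +/-1 pseudorandom emission code
true_delay = 137                              # round-trip delay in samples

received = np.zeros(2048)
received[true_delay:true_delay + code.size] += 0.2 * code  # weak delayed echo
received += rng.normal(0, 0.5, received.size)              # detector noise

# The correlation peak marks the delay; delay converts to range via the
# speed of light and the sample rate.
correlation = np.correlate(received, code, mode="valid")
estimated_delay = int(np.argmax(correlation))
print("estimated delay:", estimated_delay, "true delay:", true_delay)
```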

Above: Enterprises could use augmented reality glasses to improve technical maintenance.

Image Credit: Daqri

The same thing goes for printing. We have partners in our AR business, customers of our helmets and goggles and other sensing devices, who are very interested not just in straight 3D printing, but in the mechanism of that printing, which would enable you to not just print things in a tank, but also print things on objects, print things inside objects, mark objects, and so on. That’s a licensable technology. It’s also something that could be a stand-alone product in conjunction with a partner.

The overall message is that I really am running an R&D lab. The decisions we make, based on what succeeds and fails in that R&D, have a whole spectrum of implications. But the immediate application will be in the automotive side, just to put a fine point on it.

GB: Are you guys a standalone lab, or do you join another R&D unit that’s already inside Daqri?

Blackley: We’re a standalone lab. The R&D that went on in holography took place in Milton Keynes, in England. Fortunately, they’re incredibly nice, incredibly smart people, and so we’ve been able to partner with them. They’re doing fundamental research for the automotive products, the fundamentals of the holography. We’re looking at applications and improvements and other ways to push that technology into different places.

GB: As far as technologies you’re excited about, do you think some of these big problems are going to get solutions soon? Things like what Magic Leap is dealing with, and making outstanding AR in general. Is that going to be solved soon? Does this technology play a role there?

Blackley: This technology is completely distinct from the technologies that are used by Magic Leap or HoloLens or any of the other AR companies. This is a much harder way to go about solving the problem. I was nervous about whether or not we would be able to use it properly to generate holographic images in something like real time.

To some extent, the technologies at HoloLens and Magic Leap and so on are ideas that people had to create images that look a lot like holograms, without having to try to solve the general holography problem, because that’s been viewed as very difficult. It is very difficult. We operate in a totally different way. Our progress with software-defined light is in a separate direction from what those guys are doing.

We met, actually, with a very senior engineer who works on virtual reality, not augmented reality. Like all the engineers we show this stuff to, he was very skeptical, and then when we showed it working, he was super excited and very impressed. He was laughing, because he suggested we call it Pepper’s ghost instead of calling it holography. He said that for so long, since the ‘60s or the ‘70s, people have been projecting images onto screens and sheets of Mylar and stuff and saying, “Oh, look, it’s a hologram.” It would be a funny joke if, now that we’re making a real hologram, we went and called it Pepper’s ghost. It’s different enough that tech people crack jokes about how different it is.

I have no idea what the fate of those companies is going to be. I don’t even really know anymore what struggles they’re facing, because I’ve been so focused on my own stuff. But I can tell you that it’s a completely different technology tree. It comes from an unexpected place. We never thought that automotive HUD holographic projection units could be pushed to do live 3D printing or polygonal holographic displays. But it turns out we can.