Seamus Blackley was one of the renegades at Microsoft who created the original Xbox game console, launched in 2001. (Somebody I know wrote a book about that.) He went on to a long career in games and as an agent at Creative Artists Agency.
More recently, he cofounded a mobile game company called Innovative Leisure to release games made by former Atari classic arcade game designers. That company didn’t quite make it, though Blackley still has some games that could eventually be launched.
Then Blackley returned to his original career: doing research related to high-energy physics. He met with Brian Mullins, CEO of augmented reality firm Daqri. Now Daqri is acquiring Blackley’s research lab and assigning him to work on commercialization of an interesting new 3D printing technology dubbed “software-defined light.” It could enable instant 3D printing, as you can see in the video below.
I talked with Blackley about his new job, and how his quest for augmented reality and instant 3D printing could one day lead him back to gaming.
Here’s an edited transcript of our interview.
GamesBeat: How did this new job come about?
Seamus Blackley: I had a contact with a Hollywood guy who wanted some help doing physics work. I ended up starting a physics lab, and it’s sort of like the mob in the Godfather movies. Once you’re a physicist, it always drags you back. I set up a rapid prototyping lab, primarily focused on physics, in Pasadena, in the space where I was restoring all my arcade games. We ended up having staff and doing rapid prototyping projects.
Through this intern we brought in from Caltech, I met someone who turned out to be one of the first investors in Daqri. He introduced me to the CEO of Daqri, Brian Mullins, who I really took a shine to. I consider Brian one of my best friends today. I thought Brian was looking for a job at the time, but it turned out that he was interested in what we were capable of doing at my lab. We were doing very difficult, very abstract, very technical problems and converting them into working prototypes and devices that exploited new ideas in physics to accomplish real things. It was at an intersection between mathematics, high-speed programming, and reality.
The company we were doing our work for at the time started facing some problems with financing. They suggested we look for other work to keep doing what we wanted to do. Brian showed me some of the stuff they were doing at Daqri. He invited me to his place in downtown L.A., and I was not prepared for what I saw down there. I wasn’t prepared to see this big company with cool Star Trek-looking offices and tons of augmented reality stuff.
Brian’s real interest is that he has this large collection of analog holograms, the kind you make with real objects. He’s always been fascinated by that, and I have too. When we were talking about Xbox, back in 2001 or 2002, you might remember that at some point you asked me what the next step in games would be, and I think you were expecting me to talk about HD resolution or something. I said the next big thing was holographic rendering. I’m always going to talk that up. And here I met this guy with funding who said that what he really wanted to do was holographic rendering. That’s where he believes AR is going. The company is set up around that.
They had acquired a company in the U.K. called Two Trees that had been making a holographic device for use in automotive heads-up displays. If you use a holographic device as opposed to a video device for a HUD, you can render things at different distances. You can render stuff on the dashboard near to the driver, and then render navigation commands over where the driver is focusing toward the road. It causes much less eyestrain, and because it operates using lasers, you can compensate for the optical power needed to overcome the sun. You can render in broad daylight.
Two Trees had been working on this a long time before they were acquired. They were acquired because heads-up displays (HUDs) are ground zero for AR. It’s AR performing a meaningful service for the driver. It’s a very directed thing. It’s a place you can be introduced to AR, become dependent on it, and understand its value. One of Brian’s key missions with Daqri is that AR will help humanity as it solves problems for people. That’s why Daqri concentrates entirely on professional, commercial customers for its AR devices. They solve a problem. You’re working at a petroleum plant and there are 650,000 valves. You can put the smart helmet on and tell which one is which. Everyone is safer and operations run more smoothly.
The HUD fit perfectly with that idea, so Daqri acquired Two Trees, and along with it acquired this holographic technology that was being used for automotive HUDs. I learned about that from Brian, and his dream of holography, and it became apparent to us — as well as Brian and the principal at Two Trees, whose name is Jamie Christmas, a brilliant guy — that there was much more potential for holography in this device than was being exploited with just the HUD. They had thought of a few things, so we discussed them. Amongst them was using holography to do instantaneous 3D printing. You probably by now have seen the prototype we built here in Pasadena doing that, in the MIT Tech Review.
I looked at the system and got a feel for the hardware. I attacked the problem from the standpoint of being a high-energy physicist and having done a lot of field theory. If you recall, I wrote a flight simulator called Flight Unlimited. Flight Unlimited was able to do aerobatic flight simulation because I wrote a different approximation to the Navier-Stokes equations from the ones that typical flight simulators use. My approximation was written with computability in mind for the state-of-the-art hardware of the time, so it could run in real time and simulate all sorts of nonlinear stuff for aerobatics and tumbling that nobody else had been doing.
People said, “Oh, you pulled off the impossible,” and that’s not really true. It’s just that because I was a physicist, I could look at the differential equations and write an approximation that was computable, as opposed to using the normal approximations that everyone is taught. When people use a normal set of approximations and there’s a status quo, you start to think those are the equations, and they’re not. They’re approximations of the equations. People have been using those for flight simulation since the ‘50s.
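For readers who want the reference point: the equations Blackley is talking about approximating are the Navier-Stokes equations, which govern fluid flow around an aircraft. In their incompressible form they read:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^2 \mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0
```

Here u is the velocity field, p the pressure, ρ the density, ν the viscosity, and f external forces. The nonlinear term (u·∇)u is what makes the equations hard to solve; conventional flight simulators typically sidestep it with linearized, coefficient-based aerodynamic models, which is the “normal set of approximations” he describes below.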
Fast forward, I’m looking at this holographic device and thinking of Maxwell’s equations, wave solutions to Maxwell’s equations for light. Light is electromagnetic radiation, so it follows the equations of James Maxwell that he assembled in the 1800s. Maxwell has this great moment when he was first assembling these equations, when he’s sitting there looking at them and trying to make sure they’re correct by solving all these electricity and magnetism problems that people had done using separate sets of equations in the past, making sure he gets the right answer.
He noticed there was a wave solution to this set of equations, and in that wave solution there’s a term for the speed of the wave as it propagates. The speed depends on two constants that had been measured by rubbing a glass rod with cat hair and connecting batteries to cloth-covered wires near pieces of iron on scales, and one over the square root of their product turns out to be exactly the speed of light. He was sitting there looking at the proof that light is electromagnetic radiation. He knew that when no other people could know that. Others had guessed it, but he had shown it.
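In modern notation, the moment he’s describing falls out of the vacuum Maxwell equations: combining them yields a wave equation whose propagation speed is fixed by the two measured constants, the vacuum permittivity ε₀ and permeability μ₀:

```latex
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \,\frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad
v = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3\times 10^{8}~\mathrm{m/s} = c
```

Both constants come from tabletop electricity-and-magnetism measurements, yet their combination matches the measured speed of light, which is the proof Maxwell was staring at.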
I looked at the problem of holography fundamentally as a problem of solving Maxwell’s equations. I started to write down, here in my lab, solutions and approximations for those equations, for the set of Maxwell’s equations that holograms represent. We came up with alternate approaches that were really computable using high-end graphics hardware, for computing holograms in real time. The purpose of this was so we could generate objects made out of light and project those into tanks of monomer, in order to solidify it and do instant 3D printing.
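To make “computing holograms” concrete: this is not Daqri’s algorithm, which the interview doesn’t detail, but a toy sketch of the textbook point-source method for a computer-generated hologram. Each point of the target object contributes a spherical wave at the hologram plane; summing the complex fields and keeping the phase gives a pattern that could drive a phase modulator. The grid size, pixel pitch, and wavelength below are illustrative assumptions.

```python
import numpy as np

def point_source_hologram(points, grid_size=256, pitch=10e-6, wavelength=633e-9):
    """Toy computer-generated hologram via the point-source method.

    points: list of (x, y, z) positions in meters, z > 0 in front of the plane.
    Returns the phase (in radians) of the summed field at each hologram pixel.
    Illustrative only -- not the Daqri/Two Trees algorithm.
    """
    k = 2 * np.pi / wavelength  # wavenumber of the illuminating laser
    ys, xs = np.mgrid[0:grid_size, 0:grid_size].astype(float)
    xs = (xs - grid_size / 2) * pitch  # pixel coordinates, centered
    ys = (ys - grid_size / 2) * pitch
    field = np.zeros((grid_size, grid_size), dtype=complex)
    for (px, py, pz) in points:
        # Spherical wave e^{ikr}/r from each object point to each pixel
        r = np.sqrt((xs - px) ** 2 + (ys - py) ** 2 + pz ** 2)
        field += np.exp(1j * k * r) / r
    return np.angle(field)  # phase-only pattern for a spatial light modulator

# Two object points 10-12 cm in front of the hologram plane
phase = point_source_hologram([(0.0, 0.0, 0.10), (1e-4, 0.0, 0.12)])
```

The expensive part is exactly what Blackley describes: evaluating a wave field over millions of pixels per frame, which is why a computable reformulation that maps onto graphics hardware matters.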