When Hewlett-Packard released its Sprout by HP creativity computer last October, it wasn’t fully fleshed out. But now that the company has integrated both 3D printing and 3D capture into Sprout, the desktop computer is starting to fulfill its promise of democratizing the art of creativity.
The Sprout computer moves us a step closer toward enabling a “creator economy,” a world dominated by people who create things rather than merely consume them. It does so by making it easier to break down the barrier between the digital and physical worlds. I got a hands-on demo of the newly updated Sprout at HP’s headquarters in Palo Alto, Calif., and I had a chance to fully absorb how much progress HP has made in the months since it released Sprout.
Sprout has a regular all-in-one desktop screen as well as a screen that is projected on the table in front of the computer via an overhead camera/projector. HP sees the $1,900 Sprout fitting into the concept of “blended reality,” a world where you can work with both 2D and 3D, said Eric Monsef, vice president of the Immersive Computing Experience Group at HP, in an interview with VentureBeat.
“We believe that technology can be humanized,” Monsef said.
HP’s Sprout division spans both hardware and software, as well as consumer and enterprise divisions. It is housed within the consumer PC division of the new HP, which is dividing itself into enterprise and consumer companies. Sprout is in a category by itself.
“In 2D, it’s so easy to capture and do something with a photo,” Monsef said. “When is that happening with 3D? This is the promise of the PC. Blended reality is the answer. It will enable creativity for everybody in both 2D and 3D. With Sprout, we put it all in one box.”
An upgraded Sprout experience
The initial Sprout debuted with 3D Snapshot, an app that allowed you to scan in about half of an object. Last month, HP created an update dubbed 3D Capture software, which can capture a full object in 360 degrees.
“It’s an end-to-end solution, from scan to print,” Monsef said. “We told our customers that, over time, this product will evolve … You don’t see [it] as a static thing.”
For 3D scanning, the computer uses an Intel RealSense 3D camera, which can sense your hand gestures, as well as the machine’s overhead projector.
This summer, HP released a proprietary accessory, the $300 HP 3D Capture Stage, a rotating turntable where you place objects that you want to capture in 3D and scan into the computer. The stage tilts 15 degrees as it rotates, and the overhead camera captures views that the software stitches into a 3D model. You can then take the digital scan and manipulate it inside the computer using automated editing tools. The 3D Capture Stage allows for faster, easier, and more accurate scans.
The 3D Capture software is a free upgrade to the earlier 3D Snapshot software. Now, the capture software can grab an entire object, and it automatically deletes the capture stage from the image. You can scan in an object like a toy dinosaur in six stages, each lasting a few minutes. The software automatically stitches the seams of the scanned object together, and you can smooth out any imperfections manually. Then you can send it off to a connected 3D printer, such as a $1,000 Dremel 3D Idea Builder printer. And if you want a higher-quality build that takes days to print, you can use Sprout to contact a service that takes your digital submission and mails back the printed 3D object.
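The scan-to-print workflow described above can be sketched as a simple pipeline. This is an illustrative outline only, not HP’s actual software: every function and field name below is hypothetical, and the per-pass data is stubbed out.

```python
# Hypothetical sketch of the Sprout scan-to-print workflow: six capture
# passes around the object, automatic stitching (with the capture stage
# removed), manual smoothing, then handoff to a printer. All names and
# data structures are illustrative, not HP's API.

def capture_pass(angle_deg):
    # One scanning stage: the turntable presents the object at a new angle.
    return {"angle": angle_deg, "points": f"scan@{angle_deg}"}

def stitch(passes):
    # The software merges the per-pass scans into a single model and
    # automatically deletes the capture stage from the result.
    return {"mesh": [p["points"] for p in passes], "stage_removed": True}

def smooth(model):
    # The user manually smooths seams and imperfections on the touchscreen.
    model["smoothed"] = True
    return model

def scan_to_print():
    # Six stages covering 360 degrees, i.e. one pass every 60 degrees.
    passes = [capture_pass(i * 60) for i in range(6)]
    return smooth(stitch(passes))

model = scan_to_print()
print(len(model["mesh"]), model["stage_removed"], model["smoothed"])
```

From here the finished model would either go to a locally connected 3D printer or be uploaded to a print service, as the article describes.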
“A lot of other experiences today are not hands-on enough,” Monsef said. “You just roll up your sleeves and get on Sprout. You don’t need a thousand hours of training. You just grab, mash, make. It has a flat learning curve where you just get in.”
Last year’s 2D scanning capability allowed you to take an object, scan it in as a 2D digital asset, and start using it. Now, with 3D, HP is working on getting the process to work in as close to real-time as possible. It’s not quite there yet, as it can take a while to scan and even more time to print an object.
“I think HP’s Sprout is clearly on the right track for the next generation of usage models,” said Patrick Moorhead, analyst at Moor Insights & Strategy. “While many of the new usage models coming out are around mobility, I think HP’s idea of ‘blended reality’ is the best case I’ve seen so far for a next generation PC usage model. Sprout isn’t for everyone, but for explorers of new things and new technologies.”
How Sprout was born
HP spent more than 4.5 years working on the technology for Sprout, said Brad Short, a distinguished technologist and an architect of Sprout, in an interview. They gathered sensors, cameras, touch devices, and dual-screen display technologies. They had to wait for some technologies to mature and become affordable.
Short said, “The ambitious goal here is redefining how you interact with a computer. People think in 3D. The world is in 3D. Computers really need to work in 3D. That’s the future of where we think the interface is going.”
They wanted to make the creative process of building 3D objects easier, as if you were capturing, manipulating, and printing a photo. They wanted it to be seamless and simultaneous. The team had to figure out how to pull 3D content into the computer. Then they had to get people used to manipulating and seeing things from multiple perspectives. A projected light was required for the 2D capture process, and a 14.6-megapixel high-resolution camera was also necessary. By combining them together using a “structured light scanning process,” they were able to create a way to capture objects in 3D.
They came up with three ways to scan objects into Sprout. The first was the Intel RealSense camera, which captures imagery as part of its ability to sense gestures. They also implemented photogrammetry, which takes a series of 2D photos and stitches them into a 3D image. For high-resolution capture, HP settled on a visible structured-light scanning process using a digital light processing (DLP) projector. They dialed the camera into the right resolution and wound up with a 200-micron scanning resolution, or 0.2 millimeters. That means Sprout records a data point every 200 microns, so the camera can see intricate patterns in the texture of a toy dinosaur or a doll’s shirt.
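To give a feel for what a 200-micron sampling pitch means in practice, here is a back-of-the-envelope calculation. The scanning-area size is an assumption chosen for illustration, not an HP specification; only the 200-micron pitch comes from the article.

```python
# One data point every 200 microns (0.2 mm), per the article.
PITCH_UM = 200

# Assume a 10 cm x 10 cm scanning face (100,000 microns per side).
# This is an illustrative figure, not a Sprout spec.
FACE_UM = 100_000

# Integer math keeps the result exact.
samples_per_side = FACE_UM // PITCH_UM   # points along one edge
total_samples = samples_per_side ** 2    # points across the whole face

print(samples_per_side, total_samples)   # 500 points/side, 250,000 total
```

Even at this modest assumed size, a quarter of a million samples per face is dense enough to resolve fine surface texture, which is why details like a doll’s shirt come through in the scan.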
The hard part was creating the software that makes it easy to manipulate the captured images. A human has to look at a scanned object and remove the errors and artifacts that don’t belong there, and that process has to be easy enough to do with a couple of taps on the touchscreen.