Organic Motion is launching a new generation of its motion-capture system, which creates an accurate 3D digital representation of a human body in motion, the company announced today. The new technology will create exact digital clones of multiple people in real time as they perform on a stage.
Used for computer animations and games, the original Organic Motion machines introduced the idea of “markerless” motion capture. In the past, actors had to wear black suits with small white balls, or markers, attached to them so that cameras could capture their motion on a sound stage. With markerless motion capture, the actors didn’t have to wear the cumbersome gear.
“It’s a very different experience now,” said Jonathan Rand, president, in an interview.
With its new OpenStage motion-capture system, New York-based Organic Motion will allow for more demanding animated scenes. It can capture multiple actors performing in the same space, offers increased accuracy, and no longer requires a backdrop to discern figures in motion. Rand said that OpenStage is much more configurable than the company’s previous product, which had 14 cameras and required a backdrop. The new system ranges from eight cameras to 24 cameras and can cover areas ranging from four square feet to 30 square feet. The cost is $40,000 to $80,000.
The company is targeting the technology at commercial animators, educators, and public exhibitors. Organic Motion is showing off the technology at the Siggraph computer graphics conference in Vancouver, Canada. Professionals can use it for preproduction, production, live entertainment, and training.
OpenStage uses unique computer vision software and high-speed color cameras to capture multiple angles across 360 degrees. It can now track basic props, has a larger scanning space, and is portable as well. Andrew Tschesnok, chief executive of Organic Motion, said that the new features are based on feedback from thousands of people who have worked with the company’s earlier system.
Among the companies and institutions that will use the new system are Game On Audio in Los Angeles to capture voice and body performances; the School of Visual Arts in New York to create computer art and animation; University of California at Santa Cruz for film, digital media, and computer science; and ZeroC7 in Japan for animation production and live use at public events.
The company has raised $10 million to date from the Foundry Group and Megu Capital. Tschesnok began research on the technology in 2002 and founded a company around it in 2006. Rivals that compete in some fashion include Mova and Image Metrics.