This article is made possible by Intel’s GameDev BOOST program — dedicated to helping indie game developers everywhere achieve their dreams.
Throughout Kevin He’s career in the tech and gaming industries, something has always bothered him. While rendering and other technologies evolved tremendously over time, animation woefully lagged behind. As an engineer, He wanted to find an efficient way to create more lifelike animations, and he was convinced that advanced physics simulation and AI could solve that problem.
So in 2014, He struck out on his own to start DeepMotion. The goal of the San Mateo, California-based company is to provide developers with powerful software development kits (SDKs) that’ll allow them to create realistic animations for games and other applications. DeepMotion is trying to offer a better alternative to what He called the “tedious and labor-intensive” method of traditional keyframe animation.
Initially, the company applied physics simulation to make animation automatic. It then progressed into machine learning and deep learning, explained He, who is DeepMotion’s CEO.
“I think animation systems need a lot of love to get to a level where everyone feels engaged and immersed,” said He.
Some of DeepMotion’s products include Motion Brain, Body Tracking, and Virtual Reality Tracking. Motion Brain uses machine learning algorithms to animate characters that can interact with users in believable ways. Body Tracking captures movements in the real world via a camera and reconstructs them in the digital world (as seen with the emoji avatars in Samsung’s Galaxy S10 smartphones). And Virtual Reality Tracking helps create full body animations for VR.
Instead of individually animating characters or avatars, the company leverages AI to do some of the heavy lifting. Kevin He said the team’s long-term vision is to treat the real world “as a resource that we can use to train AI. Like in the TV series Westworld, the AI can observe how humans do everyday things, and the AI will learn how to do the movements itself.”
It’s still early days for DeepMotion’s technology — while it has a number of partners, only a few have been announced. One is New Zealand developer Dry Cactus, which is known for the popular bridge-building simulator Poly Bridge. The studio is working with DeepMotion’s Avatar Physics Engine for its next game.
He said DeepMotion is also working on a cloud version of its Body Tracking solution, which it calls its Cloud Animation Service. This allows you to easily convert reference clips in various formats (like MP4 and AVI) into FBX animations. It’s available now in an invite-only alpha testing phase, but developers can request access by contacting the company.
Building the future of AI-driven animation
DeepMotion is currently hard at work improving its suite of SDKs. It recently earned a MegaGrant funding award from publisher Epic Games to continue supporting Epic’s Unreal Engine in its VR Body Tracking. The company has also been working closely with Intel to further advance the capabilities of its Motion Brain technology, the next step of which is called Generative Motion Brain.
Moreover, Intel was excited about DeepMotion’s vision. It helped the company with hardware and also provided support on performance optimization.
DeepMotion is using Intel’s 192-core SDP server to train its Generative Motion Brain models so that they can create infinite AI-crafted motions that look like those of real humans. The additional computing power from Intel’s hardware means that the company can decrease the costs and development time it’d otherwise spend on training, and thus offer new products sooner.
“In short, the Intel multi-core servers allow us to compress multiple weeks’ worth of work into multiple days. That makes our iteration much faster,” said He.
Another benefit of the Intel partnership is that it frees up DeepMotion’s resources to focus on other endeavors. For example, the speed and efficiency of using the company’s products can be useful for creating immersive virtual worlds filled with naturally behaving characters. And that goes beyond just video games.
“If you look at what all these big tech companies are doing — Google, Facebook, Apple — everyone believes that the future of the digital world will be three-dimensional. It will be potentially AR and VR enhanced,” said He. “So it will go beyond a flat medium like a web page or email.
“If that assumption, that dream, comes true, you can imagine that you will have a huge, virtual 3D world that would need to be populated with content. … We believe that’s the future, and that we need to create powerful tools for content creators so they can use our products and services to populate a futuristic 3D world with a massive amount of realistic and interactive content.”
Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact firstname.lastname@example.org