Nvidia loves the graphics processing unit (GPU) and all of the new kinds of computing it has enabled, from self-driving cars to medical imaging devices. And venture capitalists are showing their love by investing in GPU computing startups.
Jeff Herbst, vice president of business development at Nvidia, said in an interview that he was encouraged by how much the ecosystem for GPU computing investments has grown. Herbst hosted a couple hundred VCs and entrepreneurs at a luncheon at the GTC 2018 event in San Jose, California. The luncheon was several times the size of last year’s gathering.
“They get it now,” Herbst said. “It’s great to see so many VCs here. It’s no longer a risk to see your companies build on top of the GPU platform. I think it’s a necessity. It’s real. It’s past its inflection point. The train has left the station.”
Nvidia introduced programmability to its graphics processing units (GPUs) in 2001. Then, in 2006, it created CUDA, a parallel computing platform that enables programmers to run non-graphics software on the GPU and take advantage of its many parallel processors. That led to a huge wave of GPU computing growth: there are now more than 820,000 CUDA programmers, CUDA has been downloaded 8 million times, and more than 350 applications are built on it.
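The appeal Herbst describes comes from the GPU's data-parallel model: the same small function (a "kernel") is applied independently to every element of a large array, so thousands of hardware threads can work at once. As a rough illustration only (real CUDA kernels are written in C/C++ and launched across GPU threads; this is a plain-Python sketch of the pattern), here is SAXPY, a classic kernel used in GPU benchmarks:

```python
# Illustrative sketch of the data-parallel pattern CUDA exposes:
# one function ("kernel") applied independently to every element.
# On a GPU, each element would be handled by its own hardware thread.

def saxpy(a, x, y):
    """Single-precision a*x + y, element by element, a classic GPU kernel."""
    return [a * xi + yi for xi, yi in zip(x, y)]

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
result = saxpy(2.0, x, y)
print(result)  # [12.0, 24.0, 36.0, 48.0]
```

Because each output element depends only on its own inputs, the work splits cleanly across parallel processors, which is why workloads of this shape map so well to GPUs.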
Much of this happened because of advances in deep learning neural networks, which in the past five years have made huge strides in recognizing unstructured data, such as images. Now deep learning software running on a GPU can recognize a flower in just about any photo.
Many VCs are investing in Nvidia’s hardware rivals. Some are creating chips that are custom-designed for deep learning. But Herbst said he was OK with that because it means that the overall market for AI hardware is strong.
“The big companies are there, the VC ecosystem is there,” Herbst said.
At the same time, the growth that continued for decades under Moore’s Law, which predicts that the number of transistors on a chip will double every couple of years, has slowed, so central processing units (CPUs) are no longer getting much faster. GPU computing, by contrast, has delivered a 1,000-fold improvement in processing speeds.
“We can see the opportunity for thousands of times speed-up over the next decade, and we can see that because the ecosystem is appearing before our eyes,” he said. “Every cloud computing software maker is building on top of CUDA.”
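The arithmetic behind that contrast is straightforward: doubling every two years compounds to only about a 32x gain per decade, far short of the 1,000-fold figures cited for GPU computing. A quick back-of-the-envelope check (illustrative only, assuming an exact two-year doubling cadence):

```python
# Moore's Law back-of-the-envelope: transistor counts doubling
# every ~2 years compound to roughly 32x over a decade.
years = 10
doubling_period = 2            # years per doubling, the classic cadence
doublings = years / doubling_period
moore_gain = 2 ** doublings
print(moore_gain)  # 32.0, versus the ~1,000x cited for GPU computing
```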
Roughly half of the top 50 supercomputers now run on GPUs. So nothing could go wrong, right? Well, other processor designers hope they can create chips that accelerate AI computing more efficiently than the GPU. Nvidia’s flywheel has a lot of momentum, and it will be hard to stop, though others will keep trying.
Nvidia itself has invested in eight companies since May: BlazingDB, Deep Instinct, Deepgram, Element AI, TuSimple, Graphistry, H2O.ai, and JingChi.
“We want to feed you our best companies and invest alongside as a strategic investor along the way,” Herbst said to the VCs in the audience. “Our door is open for business, and we want to help you.”
George Hoyem, managing partner at In-Q-Tel, said his firm invests in 50 to 60 companies per year.
“We’re delighted that GPUs are taking center stage,” Hoyem said. “This is a whole new platform shift. Who would have thought a database would run on a GPU platform?”
He said the government is interested in GPU computing because it can process enormous amounts of data.
MapD CEO Todd Mostak said his company received an investment from In-Q-Tel to support its work analyzing billions of pieces of visual data. His customers include federal agencies, because they’re solving hard problems with massive amounts of data. MapD won $100,000 in an Nvidia GPU Ventures contest and was able to raise a round of funding after that, Mostak said.
Nvidia invested in startup TuSimple, which has since gone on to raise more than $80 million for its self-driving truck business in China. That support helped a lot, said Xiaodi Hou, chief technology officer of TuSimple.
“Taking deep learning and applying it to emotion, like sentiment about something, could be very valuable,” said Dharmesh Thakker, general partner at Battery Ventures. “You should assume more and more data is being created. Being able to extract the features and recognizing them is important.”