
Not all IT administrators realize it, but when an enterprise is developing deep-learning applications for industrial, pharmaceutical, academic or medical research, it is far more efficient to develop them on Linux workstations. Why? Because the apps will eventually run on Linux production servers, and they will have been speaking the same language long before they connect.

Building apps on Linux laptops running distributions such as Red Hat, Ubuntu or Debian, then deploying them to production servers running the same operating systems, avoids many potential snafus when the apps go live, Stephen Balaban, CEO and cofounder of Lambda and an expert in this field, told VentureBeat.

About 42% of all production web servers run some form of Linux, while Windows servers make up about 20% of the market, according to W3Techs. In the total global server market (most of it in data centers), Linux and Unix servers account for 19%, while Windows holds about 72%, according to Statista.

It’s all about developing deep-learning apps 

Balaban told VentureBeat that his company today released its new Razer x Lambda Tensorbook, a device he described as “the world’s most powerful laptop designed for deep learning.” The laptop, which features an Nvidia GPU, 64GB of RAM, Ubuntu Linux and Lambda’s deep-learning software, and pairs with the Lambda GPU Cloud, gives developers high-end computing performance for creating, training and testing deep-learning models locally, Balaban said.



“Most ML engineers don’t have a dedicated GPU laptop, which forces them to use shared resources on a remote machine, slowing down their development cycle,” Balaban said. “When you’re stuck SSHing into a remote server, you don’t have any of your local data or code and even have a hard time demoing your model to colleagues. The Tensorbook solves this. It’s pre-installed with PyTorch and TensorFlow and lets you quickly train and demo your models: all from a local GUI interface. No more SSH!”
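Balaban’s point about local iteration can be illustrated with a few lines of framework code. The snippet below is a hypothetical sketch, not Lambda’s software; it assumes a standard PyTorch install like the one the Tensorbook ships with. It picks the local GPU when one is visible, falls back to CPU otherwise, and trains a toy model entirely on the machine, with no remote server involved.

```python
import torch
import torch.nn as nn

# Use the laptop's GPU when present; otherwise train on the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A toy regression model and synthetic data, purely for illustration.
model = nn.Linear(10, 1).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(256, 10, device=device)
y = x.sum(dim=1, keepdim=True)  # a target the model can learn

losses = []
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The same script runs unchanged on a machine without a CUDA device, which is the portability between a developer’s laptop and a production box that Balaban is describing.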

The new Tensorbook comes preconfigured with a complete software environment from Lambda, including Ubuntu Linux with the Lambda Stack, for training large workloads anytime, anywhere, Balaban said. The laptop features high-performance hardware from Razer, powered by an Nvidia RTX 3080, a highly regarded mobile GPU for dedicated, uninterrupted computing. The system is fully compatible with TensorFlow, PyTorch, cuDNN, CUDA and other ML frameworks and tools, Balaban told VentureBeat.

“Razer’s experience in developing high-performance products for both gamers and creators has been a crucial building block for the Lambda Tensorbook, a deep-learning system for engineers,” said Travis Furst, head of Razer’s laptop division.

Specs for Lambda hardware

  • 15.6-in. 2560×1440 165Hz display
  • Nvidia RTX 3080 Max-Q GPU with 16GB VRAM
  • Intel Core i7-11800H processor (8 cores, 2.3GHz to 4.6GHz)
  • 64GB DDR4 memory
  • 2TB SSD storage
  • Thunderbolt 4, USB 3.2, HDMI 2.1 ports
  • Slim 4.4-lb. aluminum unibody chassis
  • 1080p webcam

Specs for Lambda software

  • Ubuntu Linux 20.04 LTS (Microsoft Windows dual-boot optional)
  • Lambda Stack with PyTorch, TensorFlow, CUDA, cuDNN and Nvidia drivers
  • One year of Lambda engineering support
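Whether a preconfigured environment like the one listed above is actually in place on a given machine is easy to check. The helper below is a minimal, hypothetical sketch (it is not part of Lambda Stack); the package names are the ones named in the specs, and it simply reports which frameworks are importable.

```python
import importlib.util

def check_stack(packages=("torch", "tensorflow")):
    """Return which deep-learning packages are importable on this machine."""
    return {name: importlib.util.find_spec(name) is not None
            for name in packages}

if __name__ == "__main__":
    for name, installed in check_stack().items():
        print(f"{name}: {'installed' if installed else 'missing'}")
```

Running this on a freshly imaged workstation is a quick sanity check that the dev environment matches what the production servers expect.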

Since its launch in 2012, San Francisco, California-based Lambda has become the de facto deep-learning infrastructure provider for many of the world’s research and engineering teams. Thousands of businesses and organizations use Lambda, Balaban said, including the top five tech companies (Google, Facebook, Apple, Microsoft, Amazon), 97% of the top research universities in the U.S. (including MIT and Caltech) and the Department of Defense.

These teams use Lambda’s GPU clusters, servers, workstations and cloud instances to train neural networks for cancer detection, autonomous aircraft, drug discovery, self-driving cars and other applications.
