It’s been almost a year since Waymo, the autonomous vehicle division of Google parent Alphabet, became the first company to operate autonomous cars on public roads without drivers behind the wheel. Now Drive.ai intends to follow suit.

This month, the Silicon Valley startup will set loose a fleet of self-driving Nissan NV200 vans in Frisco, Texas. They won’t be completely autonomous — a small army of safety drivers and remote operators will ensure rides go off without a hitch. And the vehicles will be contained in a geofenced area.

But Drive.ai’s six-month test will be one of the largest of its kind so far. When all is said and done, the company hopes to transport over 10,000 people in its driverless cars.

Here’s how it’ll do it.


City of tomorrow

Frisco, located about 25 minutes north of downtown Dallas, is a relatively small city with a population of about 170,000. But it’s one of the fastest-growing metropolitan areas in the U.S. and is home to a budding tech industry. It’s the future launch site of Uber Elevate, Uber’s air taxi service. And startups within Frisco incubators like the North Texas Enterprise Center and LaunchPad City now contribute more than $117 million to the city’s economy.

Frisco’s tech-friendliness isn’t the only thing that attracted Drive.ai to its glittering skyline. A Texas law that took effect in September 2017 lets companies operate self-driving services without restrictions from municipal governments. The receptiveness of regulators, the city’s traffic engineering team, and local authorities also played a role, said Don Lepard, Drive.ai’s Frisco site lead and the former head of fleet operations at Uber.

“We’ve met with fire authorities and EMS to talk strategy and how to respond to situations like, what if somebody calls 911 if a car starts driving erratically?” he told VentureBeat in a phone interview. “Those are some of the scenarios for which we’re trying to account.”

Above: A fleet of Drive.ai’s self-driving vans ready to be deployed.

Preparation for the rollout began in early 2018, shortly before the Frisco City Council approved a partnership with the Denton County Transportation Authority and several real estate developers — the Hall Group, Frisco Station Partners, and The Star — to form the Frisco Transportation Management Association (FTMA). The working group is focused on reducing congestion through improved walkways and bike lanes, as well as technological solutions such as ridesharing services and connected vehicles.

“Frisco desires to become [an innovator] in [the] space of what we call ‘micro-transit,'” said Sameep Tandon, Drive.ai cofounder and CEO. “It’s the last mile problem — getting to a nearby place. We want to solve that … need [and] do it in a way that embraces local partnerships. That’s the model that works.”

One early result of the partnership is real-time road and traffic data from the city, which Tandon said will be used to fill knowledge gaps in Drive.ai’s autonomous systems.

“Imagine there’s a construction site, and in the left lane is a bulldozer,” he said. “Knowing there’s a construction site coming up is, from an AI perspective, hugely helpful.”

Riding in a driverless car

When Drive.ai’s pilot kicks off, its cars will ferry employees, residents, and patrons of Hall Group properties around Frisco’s North Platinum Corridor — principally Hall Park, a large office complex, and The Star, a 91-acre sports, residential, and entertainment district that hosts the headquarters of the Dallas Cowboys.

A months-long analysis of traffic data determined the routes they’ll take, Lepard said.

“These [complexes] are very congested,” he explained. “We want riders to be able to have a cocktail, get lunch, do whatever, and hail a ride quickly.”

Above: Drive.ai’s Frisco, Texas route map.

Intrepid riders will use an Uber-like smartphone app to call cars on-demand at one of several fixed pickup and drop-off locations. (Rides will be free of charge.) Initially, a contractor will oversee each ride from the driver’s seat, but after a few weeks, they’ll move to the passenger seat and assume more of a chaperone role, answering riders’ questions but not controlling the cars’ speed or movements. Several weeks after that, they’ll be cut from the loop entirely — with the exception of operators at a remote location, who will monitor each vehicle and provide assistance if one becomes stuck. Otherwise, the cars will navigate Frisco’s roads entirely on their own.

During rides, passengers will be treated to a real-time visualization via an onboard touchscreen, generated from a combination of lidar (a laser-based sensor that measures the distance between itself and objects), radar, GPS, RGB cameras, and inertial measurement data. The current iteration looks a bit like a primitive video game, with a three-dimensional representation of the car and a heads-up display showing footage from three dashboard cameras, the current speed in miles per hour, and selectable points of view (e.g., overhead, bird’s eye, windshield). A red line extending out from the car — Drive.ai calls it a “red carpet” — shows the anticipated course.
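
The “red carpet” is essentially a rendering of the planner’s predicted trajectory. Drive.ai hasn’t published how it computes that course; as a rough illustration only, here is a minimal Python sketch that projects a short future path with a kinematic bicycle model. The function name, wheelbase, and horizon are illustrative assumptions, not the company’s implementation.

```python
import math

def predict_path(speed_mps, steering_rad, wheelbase_m=2.7,
                 horizon_s=3.0, dt=0.1):
    """Project the car's future positions with a kinematic bicycle
    model. Returns (x, y) waypoints in meters, car frame at origin."""
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(int(horizon_s / dt)):
        x += speed_mps * math.cos(heading) * dt
        y += speed_mps * math.sin(heading) * dt
        heading += (speed_mps / wheelbase_m) * math.tan(steering_rad) * dt
        path.append((round(x, 2), round(y, 2)))
    return path

# Driving straight at 10 m/s: the "carpet" extends 30 m over 3 s.
straight = predict_path(10.0, 0.0)
```

An in-car display would then draw these waypoints as the red overlay, refreshing them each planning cycle.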

Above: An early iteration of the visualization used for Drive.ai’s in-car passenger screens.

The intention is to make riders feel more comfortable, Lepard said.

“We’re really taking a methodical approach,” he explained, “and being thoughtful about how we scale. We want to make advocates out of early adopters.”

The cars won’t be tough to make out in traffic, and that’s by design — they’re bright orange, with the words “self-driving vehicle” and Drive.ai’s logo printed prominently on the driver and passenger sides. They’re also outfitted with a conspicuous array of four lidar sensors, 10 1080p RGB cameras, a radar system, and a computer in the trunk that synthesizes sensor data. And they’ve got roof-mounted screens that display written cues, symbols, and emoji to communicate the cars’ next course of action — like a lane change or a right turn on red — to pedestrians and drivers around them.

“We wanted to set people’s expectations that these are self-driving cars, so that they treat [them] differently from other vehicles,” Tandon said. “They’re transparent in their intentions.”

Above: An exterior shot of Drive.ai’s autonomous vans.

Teaching a car to drive itself

Machine learning is incorporated into every part of Drive.ai’s stack, Tandon said, including mapping, sensor calibration, perception, space estimation, controls, fleet management, and more. It’s a core part of the cars’ decision-making engine.

“We take a deep learning-first approach,” he said. “AI is at the forefront of our … vehicle platform [and] sensor platform.” Drive.ai’s engineers use visualization tools to synchronize sensor data streams with three-dimensional street maps and road networks, which they play back to test, train, and validate machine learning models. But it all starts with data collection. As the cars drive, they record driving data logs, localization reports, object detections, motion plans, and basic measurements — like the amount of time it takes to drop off and pick up passengers. It’s a lot to juggle, and it’s useless without labels that allow Drive.ai’s system to understand what it’s seeing.

Normally, it would take a human roughly 800 hours to annotate each data point, Tandon said, but Drive.ai developed a faster system that leans on automation. A human team performs the first iteration — identifying objects like trees, cars, pedestrians, and bicyclists — and uses tools like Director, an open source robotics visualization and interface framework developed by MIT, to quickly “scrub” backwards and forwards through frames. (Previous versions of Drive.ai’s middleware were built on top of the free Robot Operating System, but the team has since moved on to a pub-sub solution of its own design it calls DPS.)
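
DPS’s internals aren’t public, but the pattern it names — publish-subscribe, the same model underlying ROS topics — can be sketched in a few lines of Python. Everything below (class, topic, and message names) is hypothetical, purely to illustrate how such middleware decouples the components of an autonomy stack.

```python
from collections import defaultdict

class PubSubBus:
    """Minimal in-process publish-subscribe bus. Real robotics
    middleware (ROS topics, or a custom system like DPS) layers
    serialization, transport, and QoS on top of this pattern."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

# Hypothetical usage: a perception node publishes detections;
# a logging node records them for later human annotation.
bus = PubSubBus()
detections_log = []
bus.subscribe("/perception/objects", detections_log.append)
bus.publish("/perception/objects", {"type": "pedestrian", "range_m": 12.4})
```

Because publishers never reference subscribers directly, new consumers — a visualizer, a data recorder, a simulator — can attach to the same topics without touching the rest of the stack.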

“Our stack allows us to very quickly look into a vehicle and figure out what the problem is,” Tandon said. “The simulation data we collect allows us to quickly adapt.”

One tangible benefit of the approach is dynamic traffic light detection. Instead of manually programming rules for lights of different shapes and sizes, Drive.ai’s engineers exposed the vehicles’ computer vision algorithms to thousands of intersections, letting them learn to independently identify different signals.

Above: A top-down view of Drive.ai’s simulator.

The company has also created original simulations that introduce unusual situations for its vehicles, like piloting around cars that are double parked or steering tight turns. And it’s created scenarios that include unlikely human behaviors, like people darting in front of traffic and objects rolling into the road.

“Within these scenarios, we can [get] specific: adjusting various parameters to new sizes and shapes … to see how our cars will react if things were slightly different. For example, we might change the angle of a double-parked car, or the size of a left-hand turning lane. By deliberately exposing our technology to ‘stress tests’ … we are able to ensure a level of competency in complicated circumstances we may not encounter organically,” the company wrote in a recent Medium post.
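
The parameter sweeps the post describes can be sketched as a simple scenario generator: take one base scenario and enumerate the cross product of the parameters being varied. The scenario schema and the parameter values below are hypothetical; they only illustrate the “adjust various parameters” idea, not Drive.ai’s simulator format.

```python
from itertools import product

# Hypothetical base scenario, varied along the two parameters the
# Medium post mentions: the angle of a double-parked car and the
# width of a left-hand turn lane.
base_scenario = {"name": "double_parked_car", "map": "hall_park"}

angles_deg = [0, 5, 10, 15]           # orientation of the parked car
lane_widths_m = [2.7, 3.0, 3.3, 3.6]  # turn-lane width

variants = [
    {**base_scenario, "car_angle_deg": angle, "lane_width_m": width}
    for angle, width in product(angles_deg, lane_widths_m)
]
# 4 angles x 4 widths -> 16 stress-test variants of one scenario,
# each of which would be run through the simulator.
```

Sweeping a handful of parameters this way turns one hand-authored edge case into dozens of regression tests the driving stack must pass.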

The company won’t disclose the exact number of miles its cars have driven, but it says they’ve seen “millions” of edge cases and millions of simulated miles on the streets of Frisco. It has also run tests during challenging conditions, like nighttime and rain. (Out of an abundance of caution, Drive.ai’s Frisco test vehicles will only operate during daylight hours.)

Convincing a skeptical public

Drive.ai’s launch comes at a time when skepticism about self-driving technologies is at an all-time high.

In March, an Uber self-driving Volvo killed a 49-year-old pedestrian — Elaine Herzberg — in Tempe, Arizona, a suburb of Phoenix. A subsequent investigation by the National Transportation Safety Board found that the car’s automated emergency braking features had been disabled and that the human backup driver took his eyes off the road in the seconds leading up to the crash. (The vehicle would’ve needed to brake 1.3 seconds before impact at its recorded speed of 43 miles per hour, according to the report.)

Following the accident, Arizona governor Doug Ducey’s office suspended Uber’s self-driving privileges, and the company voluntarily halted tests in Pittsburgh, San Francisco, and Toronto. This month, Uber announced that it would redeploy self-driving prototypes on the road with their autonomous systems disabled and with remote operators and two employees in each vehicle — one responsible for driving and the other for documenting “notable events.” In addition, the company said it would outfit its cars with an off-the-shelf, aftermarket monitoring system designed to prevent distracted driving.

Unsurprisingly, studies show that a majority of people in the U.S. aren’t convinced of driverless cars’ safety. More than 60 percent of respondents to a recent Brookings Institution poll said that they were “not inclined” to ride in self-driving cars. In a separate survey conducted by the Advocates for Highway and Auto Safety (AHAS), 70 percent expressed “concerns” about sharing the road with them. And industry think tank HNTB found that 59 percent of people expect self-driving cars to be “no safer” than cars operated by human drivers.

Above: A van sits stationary in a parking lot.

Tandon acknowledges that no system’s perfect. But he’s confident that the precautions Drive.ai has taken will prevent tragedies like the one in Tempe.

“In contrast with [our competitors], … we’re taking a much more controlled, systematic approach,” he said. “It’s intended to be [a] long-term one that’ll continue to grow one step at a time. We’ll work with the city and the state and grow this over the next couple of years, and we hope it’ll become the blueprint and the playbook going forward.”

Community engagement is another crucial part of the public perception puzzle, Lepard said. To get the word out ahead of the driverless pilot launch, Drive.ai hosted two town halls where residents could ask questions and voice concerns. It’s also been engaging with members of the community on social media and plans to post periodic public reports on the program’s progress.

“We’re breaking the mold of radio silence,” Lepard said. “The question we’re trying to answer is, how are we going to get people onto this platform in the most functional way possible? How do you get something so innovative that’s so different from the norm into the hands of people who want to use it?”

Beyond Frisco

If all goes according to plan, Frisco won’t be the pinnacle of Drive.ai’s self-driving efforts. It’ll be the start of something much more ambitious.

The company is developing a kit that will allow future customers to retrofit cars with autonomous features. In fact, it’s already working with several automakers, though Tandon declined to name names. And in September 2017, Drive.ai teamed up with ridesharing startup Lyft to launch a self-driving shuttle program in the San Francisco Bay Area.

“We have four or five different vehicle platforms,” Tandon said. “Each platform is for a different project. In five to 10 years, we hope to be in a bunch of cities.”

Drive.ai certainly has the expertise.

Above: Drive.ai’s team stands in front of self-driving vans.

Six of Drive.ai’s eight cofounding members were PhD or graduate students at Stanford’s artificial intelligence lab, where they worked on self-driving and machine learning tech for three years prior to founding the startup in 2015. Tandon was previously the project lead for Stanford’s Deep Learning for Autonomous Driving program, and president Carol Reiley has worked in robotics for more than 15 years.

Equally impressive is its board, which includes Steve Girsky, who previously served as a senior executive at GM and is widely credited with helping the automaker recover from bankruptcy in 2009; Andrew Ng, a renowned Stanford computer science professor, founder of Google Brain (Google’s AI research division), and former chief scientist at Baidu; and Carmen Chang, a long-time Silicon Valley lawyer and investor who advised Travis Kalanick, then-CEO of Uber, on the sale of the company’s Chinese ridesharing business to rival Didi Chuxing.

And Drive.ai has the funding. Venture capital firms, including New Enterprise Associates, have invested more than $77 million in the company since April 2015.

But the path to expansion is littered with hurdles.

The technology’s expensive. Austin Russell, CEO of Silicon Valley lidar startup Luminar, estimates that the first generation of truly autonomous cars will cost hundreds of thousands of dollars. Among the most expensive components are the lidar sensors — Velodyne charges $4,000 for a single 360-degree unit with 100 meters of range.

Autonomous driving technology is also very much in its infancy. Researchers at the RAND Corporation estimate that self-driving cars may have to drive up to 11 billion miles before reliable statistics on their safety emerge. Waymo’s fleet hit 8 million miles just this month.

But cost isn’t something Drive.ai is worried about, Tandon said, and the company has no illusions about the challenges ahead.

“What we’re really focused on over the next couple of months is getting people to use the system. Affordability is a theoretical argument, at this point. It’s the early days of self-driving cars, and it’s going to take time to get comfortable with them and to start using them.”

“But deployment is going to happen,” he said.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.