It’s a beautiful, cloudless day in San Francisco, and I’m sitting in the passenger seat of a Mercedes-Benz CLA sedan. The driver, Lucas, has his hands on the steering wheel, but it’s really just for show: the car is essentially driving itself.

The vehicle is using Mercedes’ new Drive Assist Pro, a point-to-point Level 2 (L2) driver-assist system that is powered by Nvidia and set to roll out to more automakers in 2026. This is the chipmaker’s big bet on driving automation, one it thinks can help grow its tiny automotive business into something more substantial and more profitable. Think of it as Nvidia’s answer to Tesla’s Full Self-Driving.

For roughly 40 minutes, we navigate a typically chaotic day in San Francisco, passing delivery trucks, cyclists, pedestrians, and even the occasional Waymo robotaxi. The Mercedes, guided by Nvidia’s AI-powered system as well as its own built-in cameras and radar, confidently handles traffic signals, four-way stops, double-parked cars, and even the occasional unprotected left turn. At one point, it makes a wide right turn to avoid a truck blocking an intersection, but not before letting a few slow-moving pedestrians cross in front.

Tesla fans would likely scoff at Nvidia’s demonstration, arguing that Full Self-Driving is orders of magnitude more capable. Nvidia hasn’t been working on this problem as long as Elon Musk’s company has, but what it showed me would absolutely go toe-to-toe with FSD under the most complex circumstances. And thanks to the redundancy provided by Mercedes’ radar, some could argue it’s safer and more robust than Tesla’s camera-only approach.

But perhaps a race between two companies is the wrong frame. After all, Tesla is one of Nvidia’s biggest customers, using tens of thousands of the company’s GPUs to train its AI models, representing billions of dollars in AI infrastructure. So even if Tesla wins, Nvidia, in a sense, wins too.

A surprise invitation

The invitation to test out Nvidia’s new system came as a bit of a surprise. After all, the company isn’t exactly known as a self-driving leader. And while Nvidia has long supplied major automakers with chips and software for driver-assist systems, its automotive business is still relatively tiny compared to the billions it rakes in on AI. Its third-quarter revenue was $51.2 billion, but its automotive division brought in only $592 million, or 1.2 percent of the total haul.

That could change soon, as Nvidia seeks to challenge Tesla and Waymo in the race to Level 4 (L4) autonomy — cars that can fully drive themselves under specific conditions. Nvidia has invested billions of dollars over more than a decade to build a full-stack solution, says Xinzhou Wu, the head of the company’s automotive division. This includes system-on-chip (SoC) hardware along with operating systems and software. And Wu says that Nvidia is keeping safety at the forefront, claiming to be one of the few companies that meets high automotive safety requirements at both the silicon and software levels.

That includes the company’s Drive AGX SoC, similar to Tesla’s Full Self-Driving chip or Intel’s Mobileye EyeQ. The chip, built on the Blackwell GPU architecture, runs the safety-certified DriveOS operating system and is capable of delivering 1,000 trillion operations per second (TOPS) of high-performance compute, the company says.

“Jensen always says, the mission for me and for my team is really to make everything that moves autonomous,” Wu says.

The road ahead

Wu outlines a roadmap in which Nvidia will release Level 2 highway and urban driving capabilities, including automated lane changes and stop sign and traffic signal recognition, in the first half of 2026. This includes an L2++ system, in which the vehicle will be able to navigate point-to-point autonomously under driver supervision. In the second half of the year, urban capabilities will expand to include autonomous parking. And by the end of the year, Nvidia’s L2++ system will cover the entirety of the United States, Wu says.

For L2 and L3 vehicles, Nvidia plans to use its Drive AGX Orin-based SoC. For fully autonomous L4 vehicles, the company will transition to the new Thor generation. Redundancy becomes critical at this level, so the architecture will use two electronic control units (ECUs): a main ECU and a separate redundant one.

A “small-scale” Level 4 trial, similar to Waymo’s robotaxis, is also planned for 2026, followed by partner-based robotaxi deployments in 2027, Wu says. And by 2028, Nvidia predicts its self-driving tech will be in personally owned autonomous vehicles. Nvidia has partnered with Stellantis, Foxconn, and Uber to launch robotaxis in the near term.

Also in 2028, Nvidia plans on supplying systems that can enable Level 3 highway driving, in which drivers can take their hands off the wheel and eyes off the road under certain conditions. (Safety experts are highly skeptical about L3 systems.)

Ambitious stuff, to say the least. And some of it will obviously be dictated by Nvidia’s automotive partners, including Mercedes, Jaguar Land Rover, and Lucid Motors, and by whether they have the necessary confidence (and legal certainty) to include the tech in cars they sell to their customers. A bad crash, or even an ambiguous one where the tech could have been at fault, could jeopardize Nvidia’s ambitions to become a Tier 1 supplier to the global auto industry.

Rapid progress

Fortunately, there were no crashes, nor really any hiccups, during my experience with Nvidia’s point-to-point system. To be sure, I wasn’t in the driver’s seat, so I didn’t get to test it on my own terms. That will be up to the automakers, which get to decide when to release Nvidia’s tech and in what models.

Responsibility for when hands-free driving is allowed ultimately lies with the OEM, Ali Kani, VP and general manager of the automotive team at Nvidia, tells me. The company designed its autonomous features to be highly customizable, allowing automakers to define parameters such as acceleration, deceleration, lane-change timing, and aggressiveness. This flexibility allows each OEM to express its own “driving personality,” Kani says, making the system feel like a Mercedes, for example, rather than a generic autonomous driver.

To that end, Mercedes is adopting something it calls “cooperative steering,” which allows the driver to make steering adjustments without disengaging the L2 driver-assist system. This can be useful for avoiding potholes, which the system doesn’t treat as obstacles. The driver can also tap the accelerator to start moving or slightly increase speed, all without disengaging the system.

Kani emphasizes that Nvidia isn’t trying to solve driving for everyone. Instead, the goal is availability: those who want this partially automated system can have it, and those who don’t can simply opt out.

The system is based on reinforcement learning, meaning it can continue to improve over time as it gains more experience, Kani says. When asked how it compares to Tesla’s Full Self-Driving, he says it’s very close. In head-to-head city driving tests over long routes, the number of driver takeovers for Nvidia’s system is comparable, sometimes favoring one system, sometimes the other.

What makes this particularly notable is how quickly progress has been made. Tesla took roughly eight years to enable urban driving with FSD, whereas Nvidia is expected to do the same within about a year. No other passenger-car system besides Tesla’s has achieved this, Kani boasts.

“We’re coming fast,” he says, as the Mercedes slows itself down at another intersection. “I’d say [we’re] very close [to FSD].”