Uber is working on a future where its drivers may do much more than pick up passengers and deliver food. The company has revealed plans to eventually turn drivers' cars into rolling data collectors that gather real-world road information for self-driving vehicle companies and other AI firms training models for physical-world situations. The idea was shared by Uber Chief Technology Officer Praveen Neppalli Naga during an interview at TechCrunch’s StrictlyVC event in San Francisco. He said Uber wants to move in that direction, but the company first needs to better understand sensor systems and sort out rules around privacy, data sharing, and state regulations.
At present, Uber’s AV Labs programme uses a small fleet of company-operated cars fitted with sensors. These vehicles are separate from Uber’s regular driver network. But the long-term vision is much bigger. Uber has millions of drivers worldwide, and even if only a small fraction of them drive sensor-equipped cars, the company could assemble one of the largest real-world driving data networks anywhere.
Why Uber believes data is the real goldmine
Naga said the biggest challenge for autonomous vehicle companies is no longer building the technology itself. According to him, the harder part is gathering enough useful data from roads, traffic, weather, and unusual situations to train self-driving systems properly.
He explained that companies may need very specific data, such as footage from a school crossing at a certain time of day or traffic behaviour at a busy intersection. Collecting this information requires fleets of cars, money, and time. Many companies do not have the resources to send vehicles everywhere just to gather training material.
That is where Uber sees an opening. Instead of building its own robotaxi fleet from scratch, the company could become the data supplier powering the wider self-driving industry.
Uber already has partnerships with around 25 autonomous vehicle companies, including London-based Wayve. It is also building what Naga described as an “AV cloud,” a large library of labelled sensor data that partner companies can search and use to train their systems.
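To make the “AV cloud” idea concrete, the sketch below shows, in miniature, what a searchable library of labelled sensor data might look like. Uber has not published any API for this, so every name here (`Clip`, `search_clips`, the label vocabulary) is hypothetical and purely illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Clip:
    """A labelled sensor recording in a hypothetical data library."""
    clip_id: str
    location: str
    labels: frozenset  # e.g. {"school_crossing", "rain"}
    hour: int          # local hour of day, 0-23

def search_clips(library, required_labels, hour_range=None):
    """Return clips carrying all required labels, optionally within an
    hour-of-day window -- the kind of query Naga describes, such as a
    school crossing at a certain time of day."""
    lo, hi = hour_range if hour_range else (0, 23)
    return [
        clip for clip in library
        if required_labels <= clip.labels and lo <= clip.hour <= hi
    ]

library = [
    Clip("c1", "SF", frozenset({"school_crossing", "morning_rush"}), 8),
    Clip("c2", "SF", frozenset({"busy_intersection"}), 17),
    Clip("c3", "LA", frozenset({"school_crossing", "rain"}), 15),
]

# Find morning school-crossing footage: matches only clip "c1".
hits = search_clips(library, {"school_crossing"}, hour_range=(7, 9))
```

The point of the sketch is the query model: partners ask for narrowly specified scenarios rather than raw footage, which is what would make such a library more valuable than its sheer size.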
The company is also allowing partners to test their AI in “shadow mode.” This means a trained self-driving model can run virtually during a real Uber trip to see how it would behave, without an autonomous car actually being on the road.
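Shadow mode can be illustrated with a minimal sketch: the partner's model receives the same inputs as the real trip and proposes actions, but nothing it proposes is ever sent to a vehicle; only its agreement with the human driver is logged. The schema and function names below are invented for illustration, not Uber's implementation.

```python
from dataclasses import dataclass

@dataclass
class TripEvent:
    """One moment of a real, human-driven trip (hypothetical schema)."""
    speed_mph: float
    driver_action: str  # what the human driver actually did

def shadow_model(event: TripEvent) -> str:
    """Stand-in for a partner's trained AV model: proposes an action
    from the same inputs the driver saw, but never controls the car."""
    return "brake" if event.speed_mph > 30 else "maintain"

def run_shadow_mode(trip):
    """Stream real trip events through the model and log agreements and
    disagreements; no command is ever issued to the vehicle."""
    log = []
    for event in trip:
        proposed = shadow_model(event)
        log.append({
            "driver": event.driver_action,
            "model": proposed,
            "agree": proposed == event.driver_action,
        })
    return log

trip = [
    TripEvent(speed_mph=25, driver_action="maintain"),
    TripEvent(speed_mph=40, driver_action="brake"),
    TripEvent(speed_mph=35, driver_action="maintain"),  # model disagrees here
]
results = run_shadow_mode(trip)
agreement = sum(r["agree"] for r in results) / len(results)
```

Disagreements like the third event are exactly what makes shadow mode useful for evaluation: the model can be scored against real driving without an autonomous car ever being on the road.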
Naga said, “Our goal is not to make money out of this data. We want to democratise it.”
AI is already changing Uber from the inside
While Uber is planning to support self-driving AI outside the company, AI is also rapidly transforming Uber’s own engineering teams.
Speaking recently to The Information, Naga said Uber’s original AI budget estimates have already been surpassed because of the fast adoption of advanced coding tools such as Anthropic’s Claude Code. He said, “I’m back to the drawing board, because the budget I thought I would need is blown away already,” showing how quickly AI-related spending has risen.
Uber says software development inside the company is changing at a deep level. AI tools are no longer limited to suggestions or auto-complete support. Instead, they are increasingly writing software with very little human help.
Naga described this move as “agentic software engineering,” where AI systems independently generate code and complete tasks with minimal involvement from engineers.
The numbers suggest this transition is already well underway. Uber says around 1,800 code changes every week are now written entirely by its internal AI coding agent. Nearly 95 per cent of Uber engineers use AI tools every month, while close to 70 per cent of committed code now comes from AI-assisted systems.
In only a few months, Uber’s internal AI agent reportedly grew from contributing less than 1 per cent of code changes to around 8 per cent.