One of the most underrated companies in the self-driving technology sector is Mobileye, an Israeli company that Intel purchased for $15 billion in 2017. Mobileye is the largest supplier of advanced driver-assistance systems (ADAS) that ship with today’s cars. In a Monday interview at the virtual CES conference, Mobileye explained its strategy to stay on top as the industry shifts to fully self-driving vehicles.
Mobileye’s self-driving strategy has a number of things in common with that of Tesla, the world’s most valuable automaker. Like Tesla, Mobileye is aiming to gradually evolve its current driver-assistance technology into a fully self-driving system. So far, neither company has shipped products with the expensive lidar sensors used in many self-driving prototypes.
And like Tesla, Mobileye has access to a wealth of real-world driving data from its customers’ cars. Tesla harvests data directly from Tesla customers. Mobileye has data-sharing agreements with six car companies—including Volkswagen, BMW, and Nissan—that ship Mobileye’s cameras, chips, and software.
During Monday’s presentation, Mobileye CEO Amnon Shashua pointedly criticized Tesla without mentioning the company by name. One strategy for developing self-driving technology, he said, was to “simply record everything from your cameras, and then when you are connected to Wi-Fi, send it to the cloud.” He said that a company might “deploy a crappy system, call it beta” and then try to “improve and improve and improve.”
Shashua argued that this strategy “sounds reasonable, but actually it’s a brute-force way of going about things”—one that’s likely to “get into a glass ceiling.”
Mobileye’s self-driving strategy differs from Tesla’s in some crucial ways. Tesla head honcho Elon Musk has vowed not to use lidar sensors or high-definition maps because he considers them “crutches” that make self-driving systems too brittle. By contrast, Mobileye is investing heavily in both technologies and expects to use them in future iterations of its technology. And that may give Mobileye—and Tesla competitors that buy Mobileye technology—an edge in the coming years.
Mobileye’s massive map
Tesla’s data gathering works basically the way Shashua described. Tesla cars record footage as they are driving around, store it locally, and then select a subset of this massive dataset to upload to Tesla while the car is parked and has access to Wi-Fi. Tesla engineers can query cars in the field for images fitting particular criteria, allowing them to harvest the images that are most useful for training Tesla’s algorithms.
According to Shashua, this strategy focuses on the wrong part of the self-driving task. He argued that it doesn’t take that much data to train a neural network to recognize objects like pedestrians, trucks, or traffic cones. Mobileye’s software has already achieved better-than-human performance on this basic object-recognition task, he said.
The more difficult problem, he claimed, is understanding the “semantics of the road”—the often subtle rules that govern where, when, and how a vehicle is supposed to drive. Software on board a Mobileye-equipped car gathers data about the geometry of the road and the behavior of nearby vehicles. It then processes this data on the vehicle to generate a compact summary. The summary can be as little as 10 kilobytes per kilometer of driving, making it easy to transmit over cellular networks.
These summaries are then uploaded to Mobileye servers, where they are used to build detailed three-dimensional maps. These maps not only show the locations of curbs and stop signs, they also provide detailed behavioral data—showing which lanes generally yield, how fast cars typically move on each stretch of roadway, and how often cars’ actual driving patterns diverge from the official lane markings.
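To make the scale of these summaries concrete, here is a toy sketch of what a compact per-kilometer record might look like. The field names and encoding are hypothetical, not Mobileye's actual format; the point is that coarse lane geometry plus a few behavioral statistics easily fit within the roughly 10-kilobyte-per-kilometer budget the company describes.

```python
# Hypothetical sketch: condensing observations for 1 km of road into a
# small, transmittable record. Field names and structure are invented
# for illustration -- Mobileye's real format is not public.
import json

def summarize_segment(lane_points, speeds_kph, yield_rate):
    """Condense raw observations for one km of road into a compact record."""
    return {
        # Lane centerline sampled coarsely (one point per ~10 m),
        # rounded to decimeter precision.
        "lane_geometry": [(round(x, 1), round(y, 1)) for x, y in lane_points],
        "typical_speed_kph": round(sum(speeds_kph) / len(speeds_kph), 1),
        "yield_rate": round(yield_rate, 2),  # share of observed cars that yielded
    }

summary = summarize_segment(
    lane_points=[(i * 10.0, 0.5) for i in range(100)],  # 1 km of samples
    speeds_kph=[48, 52, 50, 51],
    yield_rate=0.87,
)
payload = json.dumps(summary).encode()
# Even with 100 geometry points, the serialized record stays well under 10 kB.
assert len(payload) < 10_000
```

The real system would pack this far more tightly than JSON, but even this naive encoding shows why such summaries can travel over an ordinary cellular connection rather than requiring Wi-Fi uploads of raw video.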
Millions of miles
Mobileye is now gathering more than 8 million kilometers of data every day from cities around the world. And the company says that after five years of work, its map-making process is almost completely automated. This means that Mobileye will soon have detailed maps not just in cities where it's actively testing self-driving cars, but wherever its customers' cars drive.
In the next few years, this data could enable Mobileye to improve the performance of its driver-assistance systems. Mobileye has talked about creating “Level 2+” systems that are a step more advanced than today’s “Level 2” driver assistance technologies. The key thing that differentiates a “2+” system is that it operates with help from high-definition maps. These maps help vehicles decide when driver-assistance technology is safe to use, and they decrease the likelihood that the system will get confused and steer a vehicle out of its lane.
Longer term, this mapping capability may allow Mobileye to leapfrog competitors like Waymo that don’t have access to such a rich dataset. Waymo’s self-driving taxi service is widely viewed as the most sophisticated in the nation, if not the world. But it has expanded slowly, if at all, in the four years since Waymo started testing its driverless taxis in the suburbs of Phoenix.
Although Waymo has tested its technology in a number of other locations—most notably the San Francisco Bay Area—it has not said when or where it will launch its next service area. Why Waymo is moving slowly is unclear, but difficulty mapping new areas may be one factor.
Meanwhile, Mobileye is expanding rapidly. After testing its technology in Israel, Detroit, and Germany in 2020, Mobileye says that it's aiming to expand testing to Paris, Tokyo, Shanghai, and possibly New York City in 2021. Mobileye says its vast data-collection abilities and its flexible software enable it to enter new markets with a minimum of extra work.
Intel is going to make lidar for Mobileye
At the same time that Mobileye works to improve its ADAS products, it is also working to develop fully driverless technology. Like Waymo, Mobileye is planning for this technology to first be offered as part of a driverless taxi service. And while Mobileye’s currently shipping products don’t use lidar, Mobileye does plan to use lidar in its forthcoming driverless taxis.
While Musk has dismissed lidar as a crutch, Shashua argues that redundant sensors are essential to achieving better-than-human driving performance. Mobileye has already developed prototype self-driving vehicles that rely only on cameras. The company is now working on a separate self-driving system based on lidar and radar. Only after both systems are working well on their own does Mobileye plan to combine them into a single self-driving system. The idea is that each system will help counteract the other's flaws, creating a hybrid system that's much safer than either system on its own.
Mobileye has used some dubious math to argue that it can prove the system’s safety without a ton of testing. That seems unlikely. Still, building two redundant systems likely does confer some safety benefits.
Beats, shifts, and flights
Monday wasn’t the first time Mobileye said that it would use lidar. But Mobileye revealed a lot more about its lidar plans during Monday’s presentation. Mobileye is building a type of lidar called frequency modulated continuous wave (FMCW) lidar. Rather than bouncing a laser beam off a distant object and directly measuring how soon it comes back (the “time of flight” approach used by most lidar vendors), Mobileye’s lidar uses a continuous laser beam with a steadily increasing (or decreasing) frequency.
The beam is split in two, with one half bouncing off a far-away target while the other stays in the device as a reference. When the light bounces back, the two halves are recombined. Because the returning half traveled farther, and therefore was emitted earlier, its frequency differs from the reference beam's. Mixing them produces a beat frequency that is proportional to the distance to the faraway object.
FMCW lidars tend to be robust to interference. And thanks to Doppler shifts, FMCW lidar can estimate an object's speed as well as its distance.
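The range calculation follows directly from the chirp geometry. Here is a minimal sketch of the math, using hypothetical chirp parameters rather than anything Mobileye has disclosed: if the laser frequency ramps linearly, the beat frequency equals the chirp slope times the round-trip light delay, so distance falls out with simple algebra.

```python
# Illustrative FMCW range calculation. The chirp parameters below are
# hypothetical examples, not Mobileye's actual design values.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, sweep_hz: float, sweep_s: float) -> float:
    """Distance implied by a measured beat frequency.

    The laser frequency ramps by sweep_hz over sweep_s seconds, giving a
    chirp slope of sweep_hz / sweep_s. A target at distance d delays the
    returning light by 2d/c, so the outgoing and returning beams differ
    by slope * (2d/c) -- the beat frequency. Solving for d:
        d = c * beat / (2 * slope)
    """
    slope = sweep_hz / sweep_s
    return C * beat_hz / (2 * slope)

# Example: with a 1 GHz frequency sweep over 10 microseconds, a 50 MHz
# beat corresponds to a target roughly 75 m away.
distance_m = fmcw_range(beat_hz=50e6, sweep_hz=1e9, sweep_s=10e-6)
```

A real receiver would also use up- and down-chirps to separate the range component of the beat from the Doppler component, which is how FMCW recovers speed as well as distance.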
Being part of Intel gives Mobileye an edge here. Mobileye says that Intel has the infrastructure to design photonic integrated circuits—computer chips that include lasers and other optical components as well as computing hardware. The use of PIC technology should make Mobileye’s lidar cheaper and more reliable when it’s introduced sometime around 2025.
Mobileye is also working on software-defined radar technology that it hopes will improve the angular resolution of conventional radar technology.
Mobileye is hedging its bets on the future of autonomy
The most important philosophical divide in the self-driving technology world is between those who see fully autonomous vehicles as an evolution of ADAS products and those who see them as two totally different products. Waymo is the leader of the second faction. Google actually built a freeway driver-assistance product in the early 2010s but decided releasing it would be too dangerous because human drivers were unlikely to supervise it adequately.
Since then, Waymo has focused on building fully driverless taxis with no one behind the wheel. Other decisions flow from this one. Because immediately providing a taxi service nationwide is not realistic, Waymo has initially focused on getting its technology working in a single metropolitan area. And because taxis are rented, not owned, Waymo can use expensive sensors at the outset, confident that they’ll come down in price over time.
Elon Musk disagrees with Waymo’s philosophy. His strategy is to gradually improve Autopilot until it’s reliable enough that a human driver is no longer needed. He relies on customers, not professional safety drivers, to intervene if Autopilot malfunctions. This has allowed Tesla to gather data and test its software at a far greater scale than Waymo can—even with Alphabet’s billions.
Mobileye is fundamentally on Musk's side of the argument. In a Monday presentation, Shashua argued that the difference between a driver-assistance system and a fully driverless system is just its mean time between failures. In other words, if Mobileye can make its ADAS reliable enough, it should be able to put the same software into a driverless taxi.
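The gap that framing implies is enormous. The figures below are hypothetical placeholders, not numbers from Shashua's presentation, but they illustrate the scale of improvement the "same software, higher reliability" argument requires.

```python
# Back-of-the-envelope illustration of the mean-time-between-failures
# framing. Both numbers are hypothetical placeholders for illustration,
# not figures from Mobileye.
def failures_per_million_hours(mtbf_hours: float) -> float:
    """Expected failures over a million hours of operation."""
    return 1e6 / mtbf_hours

adas_mtbf = 1_000            # a supervised system that errs every ~1,000 hours
driverless_mtbf = 10_000_000  # a target far beyond typical human performance

# A supervised system can tolerate frequent errors because a human
# catches them; a driverless one cannot, so the required improvement
# is several orders of magnitude.
improvement = driverless_mtbf / adas_mtbf  # 10,000x
```

On this view, the evolutionary path is less a series of new products than one product whose failure rate must fall by orders of magnitude before the human can be removed.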
One reason Mobileye and Tesla have wound up on the same side of this battle is that they have the same business constraints. They’re both in the business of selling ADAS systems, and it would be extremely convenient if both companies could gradually improve their systems until they’re fully self-driving. Because Mobileye and Tesla are selling hardware to end users (Tesla directly, Mobileye via OEM partners), they can’t afford to use expensive lidar sensors in the short run. So they have to build the best system they can without lidar.
Dogma vs. pragmatism
But while Musk has become dogmatic on this question, Shashua is more of a pragmatist. Mobileye’s primary strategy is to evolve its ADAS system into a full self-driving stack. But the company is also testing prototype driverless taxis with safety drivers—just like Waymo. While Mobileye isn’t using lidar today, its CEO hasn’t declared that “anyone relying on lidar is doomed,” as Musk put it in 2019. He recognizes that lidar is valuable and wants to start using it as soon as costs come down enough.
And this makes Mobileye well-positioned for the future regardless of whether the Tesla strategy or the Waymo strategy ultimately wins. If Tesla is right that ADAS systems can evolve into fully self-driving systems, Mobileye can keep selling better and better systems to its existing OEM customers. On the other hand, if Waymo is right that driverless technology needs to be built from the ground up, Mobileye’s work on lidar and HD maps will give it a head start. So will its decision to test driverless cars in a half-dozen cities around the world.
By contrast, Tesla is betting heavily on the evolutionary approach. If this strategy turns out to be a dead end, Tesla doesn’t have a backup plan. If lidar turns out to be indispensable for bringing driverless technology to market, Tesla will be caught flat-footed.