This was written shortly before Intel bought Mobileye.
Mobileye on the Road 11-Feb-2017 07:51 EST
Amnon Shashua, co-founder and Chief Technology Officer of Israel-based Mobileye, tells a story about when, in the year 2000, he first began approaching global carmakers with his vision—that cameras, processor chips and software-smarts could lead to affordable advanced driver-assistance systems (ADAS) and eventually to self-driving cars.
“I would go around to meet OEM customers to try to push the idea that a monocular camera and chip could deliver what would be needed in a front-facing sensor—time-to-contact, warning against collision and so forth,” the soft-spoken computer scientist from Hebrew University of Jerusalem told Automotive Engineering during a recent interview. “But the industry thought that this was not possible.”
The professor was politely heard, but initially disregarded: “They would say, 'Our radar can measure range out to a target 100 meters away with an accuracy of 20 centimeters. Can your camera do that?'
“And I would say: ‘No, I cannot do that. But when you drive with your two eyes, can you tell that the target is 100 meters away or 99.8 meters away? No, you can’t. That is because such accuracy is not necessary.’”
In fact, Shashua and his engineers contended that a relatively simple and cheap monocular camera and an image-processing “system-on-a-chip” would be enough to reliably accomplish the true sensing task at hand, thank you very much. And it would do so more easily and inexpensively than the favored camera alternative to radar ranging: stereo cameras that find depth using visual parallax.
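The distinction can be made concrete. A stereo rig recovers absolute depth from parallax (Z = f·B/d), while a single camera can estimate time-to-contact directly from how fast an object's image grows, with no range measurement at all. The sketch below is a hypothetical illustration of that principle, not Mobileye's algorithm; all names and numbers are invented for the example.

```python
# Hypothetical illustration (not Mobileye's code): why a monocular camera
# can warn of an impending collision without measuring absolute range.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Stereo rig: depth from visual parallax, Z = f * B / d.
    Requires two calibrated cameras and a disparity match."""
    return focal_px * baseline_m / disparity_px

def time_to_contact(size_px, size_rate_px_per_s):
    """Monocular: for an approaching object whose apparent size s grows
    at rate ds/dt, time-to-contact is TTC = s / (ds/dt).
    Neither the object's true size nor its range is needed."""
    return size_px / size_rate_px_per_s

# Stereo example: 1000 px focal length, 30 cm baseline, 10 px disparity:
print(stereo_depth(1000.0, 0.3, 10.0))   # 30.0 m to the target

# Monocular example: a car's image widens from 100 px to 105 px in 0.1 s:
growth_rate = (105.0 - 100.0) / 0.1      # 50 px/s
print(time_to_contact(105.0, growth_rate))  # 2.1 s until contact
```

The monocular function never learns the range, which is exactly Shashua's point: a collision warning needs time-to-contact, not centimeter-accurate distance.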
zFAS and furious
Seventeen years later, some 15 million ADAS-equipped vehicles worldwide carry Mobileye EyeQ vision units that use a monocular camera. The company is now as much a part of the ADAS lexicon as are Tier-1 heavyweights Delphi and Continental. At CES 2017, Shashua found himself standing on multiple stages, in one case celebrated by Audi's automated-driving chief Alejandro Vukotich as “the benchmark for applied image processing in the community.” [....]
Mobileye is teaching ever-more-powerful ADAS processors to better negotiate the road, step by step, using AI neural networks to optimize performance and machine-learning algorithms that “learn by observing data instead of by programming.” “Such technology actually teaches the car to behave in a human way,” according to Shashua, by repeatedly viewing realistic driving simulations that his company's engineers film and then feed into the vehicle's computer.
“For the most part, the ingredients for autonomy exist,” he asserted. “At this point it is mostly a matter of engineering.”
By the end of 2017, around 40 modified BMW 7 Series sedans will be roaming U.S. and European roads as part of a global trial conducted by development partners Mobileye, BMW and Intel. This is the start of a five-year plan culminating in 2021, when "we are going to launch thousands of vehicles that are autonomously driven—tens of thousands of vehicles that will be autonomously driven on highways and a few thousands of vehicles that will be autonomously driven inside cities,” Shashua said.
more at articles.sae.org