7,000 meters under the sea, Nissan develops the all-seeing car, to emerge commercially in 3 years

Making Nissan drivers all-seeing since 2007

Crawling along the bottom of the ocean, an underwater robot relies on one of the core technologies behind Nissan's drive toward self-driving cars. The deeply submerged robot's vision comes from an upgraded version of Nissan's Around View Monitor (AVM). First commercialized back in 2007, AVM gives the driver a 360-degree view of the car's surroundings, as if hovering above the car in a helicopter. In 2011, Nissan added moving-object detection to AVM, saving the lives of countless pets in parking lots. In the submerged robot, AVM has gained three-dimensional image processing, an important step toward sensing distances in the blink of an eye, without the need for slower radar or much slower ultrasound sensors, both of which rely on an echo. If it weren't so hackneyed, one could say the 3D version opens a new dimension in autonomous driving.

The underwater robot is employed by the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) and its contractor Topy Industries. AVM uses four fixed wide-angle cameras on all sides of the vehicle. The images are then combined into one real-time representation of the car’s surroundings.
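The combining step can be sketched in a few lines of code. Everything here is illustrative: the grid geometry, the placement of each camera's view, and the simple maximum blend are assumptions for the sketch, not Nissan's implementation. A real AVM would first undistort each wide-angle (fisheye) frame and warp it onto the ground plane with a calibrated homography before compositing.

```python
def birds_eye_view(front, rear, left, right, size=4):
    """Composite four (already warped) camera views into one top-down image.

    Illustrative sketch only: each view is a 2D list of brightness values,
    pasted into its region of a size-by-size canvas. Overlapping corners are
    blended by taking the maximum, so whichever camera saw something wins.
    """
    half = size // 2
    canvas = [[0] * size for _ in range(size)]

    # (view, row offset, column offset) -- front on top, rear on the bottom,
    # left and right covering the side halves of the canvas.
    placements = [
        (front, 0, 0),
        (rear, half, 0),
        (left, 0, 0),
        (right, 0, half),
    ]
    for view, r0, c0 in placements:
        for r, row in enumerate(view):
            for c, val in enumerate(row):
                canvas[r0 + r][c0 + c] = max(canvas[r0 + r][c0 + c], val)
    return canvas
```

With four toy frames of constant brightness, the side cameras' values dominate the overlapping corners, giving a single seamless top-down composite.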

A Nissan spokesman declined to say when the 3D version will find its way into Nissan's cars, but a good guess is by 2018, because that's when it will be needed. At a recent media round-table at the New York Auto Show, Nissan's CEO Carlos Ghosn laid out Nissan's autonomous drive plans. Defining autonomous drive as "driver is in the car, and you empower him to do whatever he wants, hands on or off the steering wheel, eyes on or off the road," Ghosn said that Nissan will have:

“By 2016, autonomous drive on the highway, one lane.

By 2018, autonomous drive on the highway, two lanes. You can pass automatically without human intervention.

By 2020, autonomous city driving.”

For automatic lane changes of the 2018 vintage, the car needs a 360-degree 3D picture of what is going on around it; a forward-looking sensor package and a few ultrasound sensors just won't hack it.

Driverless cars, the ones you summon with your smartphone, still appear to be decades off, not least because regulations demand a driver in the car, currently with hands on the wheel and eyes on the road. Much of the work in the years ahead will be convincing authorities that with an all-seeing computer at the wheel, a distracted driver is suddenly okay.