A subzero morning, icy patches on the road. My car skids, careens into another lane, then vigorously shakes as I fight to avoid getting hit by a jeep coming from behind. My life flashes before me in nanoseconds.
This near miss happened as I was driving from my home to Ann Arbor, Michigan, for a client meeting. Traffic was moderate, and the previous day’s snow had been cleared. I had to take a conference call — an unwelcome and even dangerous task, because it divided my attention while I was driving in less-than-ideal conditions. Ironically, my call was about advanced driver-assistance systems (ADAS) and autonomous vehicles (AVs).
A few miles later, as my exit approached, I started to change lanes only to see brake lights in the fog up ahead, a few hundred yards away. Shifting back to my original lane, I quickly realized that the jeep in my rearview mirror was about to hit me. I don’t recall much about the instant just before and after my car started spinning out of control, but I do recall time moving in slow motion even as my heart raced. Moments later, I found myself on the shoulder of the highway, calling the client to reschedule.
These close calls are all too common, and unfortunately result in many accidents. But the car of the future could change all that — a perfect convergence of safety, security, infotainment and convenience with a focus on customer experience.
From ice patch to chill ride
Let’s reenact my close call, but in this scenario, I’ll be riding in a fully autonomous car.
First off, the car will inform me before I even leave my house that the road conditions are not ideal. Travel time to my destination will be longer, so I shouldn’t stop by Starbucks for a cup of coffee if I want to make my client meeting in Ann Arbor on time.
Once on my way, the car dials into the Skype video session for me, so I can comfortably do last-minute prep with my internal team before meeting the client. The car automatically and continuously communicates with the other cars around and ahead of me. It regularly checks road conditions, safely navigates across several icy patches and changes lanes as needed, all while avoiding the other cars in potential blind spots. It also chooses a more advantageous exit, taking me down country roads that can get me to my destination more quickly. Calculating the time saved en route, my vehicle asks me if I am still interested in the coffee. Why yes, I am.
My self-driving car is the most advanced human-made gadget yet, packed with the technological innovations of the last several decades. It runs on a fuel cell, connects to the internet with 20G (not 5G) connectivity, carries the most advanced quantum computing chip, is equipped with multiple lidars, radars and thousands of sensors, and relies on advanced edge computing, artificial intelligence (AI), machine learning, the cloud and more. You get the point.
A self-driving data center
A Level 5, fully autonomous car knows who you are and what you like, and can know who you are meeting with, be it family, friends or clients. It knows your calendar and can even know your restaurant preferences.
When it’s on the road, it is proactively watching road signs, traffic signals, road conditions and traffic, and constantly transmitting that data to the cloud, which provides insights to multiple stakeholders: auto original equipment manufacturers (OEMs), parts suppliers, other drivers, municipalities, service providers and more. The edge computing on the car’s various mechanical components is discovering and delivering key signals to the car’s brain, such as the ice patch 200 feet away that’s caused a car to slide into another lane.
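To make the edge-computing idea concrete, here is a minimal sketch of how an on-vehicle node might turn raw wheel-speed readings into a low-traction event destined for the cloud. All names (`WheelSample`, `detect_ice_event`) and the 20% slip threshold are hypothetical illustrations, not any OEM's actual telemetry schema.

```python
from dataclasses import dataclass
import json

@dataclass
class WheelSample:
    wheel_speed_mps: float    # wheel rotation converted to linear m/s
    vehicle_speed_mps: float  # ground speed from GPS/IMU fusion

def slip_ratio(s: WheelSample) -> float:
    """Fraction by which the wheel over-spins relative to the vehicle."""
    if s.vehicle_speed_mps == 0:
        return 0.0
    return (s.wheel_speed_mps - s.vehicle_speed_mps) / s.vehicle_speed_mps

def detect_ice_event(samples, threshold=0.2):
    """Flag a low-traction event if any wheel slips past the threshold.

    The threshold value is an illustrative assumption.
    """
    worst = max(slip_ratio(s) for s in samples)
    if worst > threshold:
        return {"event": "LOW_TRACTION", "max_slip": round(worst, 2)}
    return None

# One wheel spinning noticeably faster than the car is moving: likely ice.
samples = [WheelSample(15.5, 12.0), WheelSample(12.1, 12.0)]
event = detect_ice_event(samples)
print(json.dumps(event))  # a record like this would be streamed to the cloud
```

In a real vehicle this decision would run on an edge processor next to the wheel-speed sensors, so the hazard can be acted on locally in milliseconds while the summarized event, rather than the raw sensor stream, goes upstream.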
The car’s lidar remote sensing system captures and communicates that information to the steering and brakes, and the car’s vehicle-to-vehicle (V2V) communication system sends a signal to all the cars behind it. The blind spot detection system checks the surroundings. The car’s antilock brakes and vehicle skid controls kick in, and the steering moves to change the lane after the car’s turn signals have been activated. The car sends all the data (video, road conditions, accident conditions) in real time to various parties.
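The reaction chain described above — warn the cars behind, check the blind spot, then either change lanes or brake — can be sketched as a simple controller. This is a toy illustration of the sequencing logic only; the class and action names are assumptions, and a production system would involve real actuator and V2V interfaces.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Hazard:
    kind: str          # e.g. "ICE_PATCH"
    distance_m: float  # distance ahead reported by lidar

@dataclass
class VehicleController:
    log: List[str] = field(default_factory=list)

    def react(self, hazard: Hazard, lane_clear: bool) -> List[str]:
        actions = []
        # 1. Warn vehicles behind over V2V before maneuvering.
        actions.append(f"V2V_BROADCAST:{hazard.kind}@{hazard.distance_m:.0f}m")
        # 2. Change lanes only if blind-spot detection reports the lane clear,
        #    signaling first.
        if lane_clear:
            actions.append("SIGNAL_ON")
            actions.append("LANE_CHANGE")
        else:
            # 3. Otherwise slow down with ABS / skid control engaged.
            actions.append("ABS_BRAKE")
        self.log.extend(actions)  # everything is logged for upstream reporting
        return actions

ctrl = VehicleController()
print(ctrl.react(Hazard("ICE_PATCH", 60.0), lane_clear=True))
# → ['V2V_BROADCAST:ICE_PATCH@60m', 'SIGNAL_ON', 'LANE_CHANGE']
```

The key design point the article makes is ordering: the V2V warning goes out before the maneuver begins, so trailing vehicles (like the jeep in my story) get the signal before the car in front of them moves.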
The research and development (R&D) and engineering communities are testing ADAS and autonomous-driving models using the massive amounts of test-drive data they collect. While some automakers may have a few Level 5 vehicles on the road in the next two or three years, we are still at the beginning of the AV revolution. There are hundreds of thousands of use cases to test.
Think about the difference between the flip-top cell phone of the early 1990s and today’s smartphone, then multiply the complexity by a million. That’s the transformation we will see in autonomous driving, from the first AVs to hit the market in the next few years to the not-too-distant future when fully autonomous, Level 5 vehicles become ubiquitous. Ice or otherwise, just imagine the possibilities.