Over the past decade, our cars have become progressively smarter. In-car comfort has improved, with climate control now almost standard. Cruise control, automatic braking, lane-keeping assistance and automated parking have significantly changed the driving experience. The common link among all these features is that they rely on sensors that help drivers make better decisions and avoid errors.

Up to now, those sensors have been carrying out relatively simple operations. For example, automatic braking uses a sensor to detect the distance between your car and the one in front. If that distance is too short or closes too quickly, the brakes are applied without the driver’s intervention. It’s a simple algorithm of “If this happens, then do that”. And your car only really knows about its own capabilities and actions; it has no way of knowing what nearby vehicles are doing before something happens.
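As a rough illustration of that “if this happens, then do that” logic, here is a minimal sketch in Python. The thresholds, sensor values and function names are illustrative assumptions, not taken from any production braking system.

```python
# Minimal sketch of a rule-based automatic-braking decision.
# All thresholds and inputs are illustrative assumptions.

SAFE_GAP_M = 12.0         # assumed minimum safe following distance, metres
MAX_CLOSING_SPEED = 4.0   # assumed maximum acceptable closing speed, m/s


def should_brake(gap_m: float, closing_speed_mps: float) -> bool:
    """Brake if the gap is too short or is closing too quickly."""
    return gap_m < SAFE_GAP_M or closing_speed_mps > MAX_CLOSING_SPEED


# Example with made-up sensor readings:
if should_brake(gap_m=9.5, closing_speed_mps=1.2):
    print("apply brakes")   # stand-in for the real brake actuation command
```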

Take lane changes: the only way a car’s sensors can anticipate another vehicle’s lane change is when that vehicle starts moving towards the dividing line between lanes. A camera needs to see this, the data needs to be processed, and a decision needs to be made about whether to take action and, if so, what that action should be. That kind of decision-making requires significant computational power, because the data model behind the algorithm can be quite complex.
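Even a toy version of that decision involves more quantities than the braking rule above, as the sketch below suggests. The camera-derived measurements and thresholds are assumptions for illustration; a real system would rely on far richer models.

```python
# Illustrative sketch only: a toy rule for flagging that a neighbouring
# vehicle may be about to change lanes, using hypothetical camera-derived
# measurements. Real systems use far more complex models.

NEAR_LINE_M = 0.5        # assumed distance to the lane marking, metres
DRIFT_THRESHOLD = 0.3    # assumed lateral drift rate towards the line, m/s


def lane_change_likely(distance_to_line_m: float, drift_rate_mps: float) -> bool:
    """Flag a likely lane change when the vehicle is close to the dividing
    line and still drifting towards it."""
    return distance_to_line_m < NEAR_LINE_M and drift_rate_mps > DRIFT_THRESHOLD
```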

Fully autonomous vehicles that can do all the driving for you are far more complex, and they both require and generate enormous volumes of data. Some research suggests that an autonomous network of just 2,000 cars could generate as much data as all of humanity did in 2015.

For an autonomous vehicle network to successfully meet society’s needs for safety, environmental friendliness and convenience, an entire ecosystem is evolving: traffic control systems, extremely precise mapping, communications standards so cars can let each other know what’s going on, and charging stations for what will presumably be electric vehicles.

The changes that an autonomous vehicle ecosystem brings are wide-ranging. For example, autonomous vehicles could select routes that are not only faster but also allow the car to operate more efficiently, so it uses less energy. And if an autonomous car can easily be made available when you need it, car ownership could give way to car access. This means we won’t need to devote precious real estate at home to garages and carports, and shopping centre car parks won’t need to be as large. With individual vehicles expected to be in use around 50 per cent of the time rather than today’s 5 per cent as we move from ownership to access, the way we allocate land will change.

What will such an ecosystem need? Cars will need on-board artificial intelligence (AI) so they can process the vast amounts of sensor data they collect. That data will, in some cases, need to be relayed to nearby vehicles. And while 5G technologies will be part of this, the network’s inherent latency, though far lower than that of the existing 4G network, still won’t be low enough for the split-second decisions needed when human lives are at stake.

But 5G will be valuable for telling the entire network when road conditions have changed, for collecting data that can be used to refine AI models, and for pushing the updated software out to vehicles quickly.

On-board computing power in cars will need to take a huge leap forward from where it is today. With Qualcomm, Apple, Huawei and others developing chips optimised for AI and machine learning, we’ll see powerful embedded systems in vehicles. These systems will address privacy and security concerns by keeping car users’ personal information on board and sending only redacted data up to the autonomous car cloud, where it will be useful for updating algorithms and providing information to other vehicles.
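A minimal sketch of that “keep personal data on board, upload only redacted data” idea follows. The field names and values are hypothetical, not any manufacturer’s actual telemetry schema.

```python
# Sketch of on-board redaction before data leaves the vehicle.
# Field names and values are hypothetical, for illustration only.

PERSONAL_FIELDS = {"driver_id", "home_address", "trip_origin", "trip_destination"}


def redact(telemetry: dict) -> dict:
    """Return a copy of the telemetry with personally identifying fields removed."""
    return {key: value for key, value in telemetry.items() if key not in PERSONAL_FIELDS}


on_board_record = {
    "driver_id": "u-1042",            # stays on the vehicle
    "trip_origin": "home",            # stays on the vehicle
    "road_friction_estimate": 0.62,   # useful for refining shared models
    "obstacle_detected": True,
}

cloud_payload = redact(on_board_record)
# cloud_payload -> {"road_friction_estimate": 0.62, "obstacle_detected": True}
```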

For this to happen, an evolution in key technologies will be needed.

Continued investment will be required in technologies such as communications, machine learning and AI, data management and security. Standards that enable cars from different manufacturers to communicate will need to be established, much like OBD2 for engine diagnostics. 

We’ll need to accept that while autonomous vehicles should improve passenger safety, accidents may still occur. That means algorithms will need to tell a car what to do when a collision is unavoidable and it must choose between outcomes.

Common platforms and infrastructure will also need to be created to support interoperation among cars, traffic management systems, safety systems, regulators and other parties.

At the recent CES Asia [Consumer Electronics Show] trade show, a number of car makers repeated the same message: They aren’t trying to make self-driving cars. They are creating robotic drivers that make good driving decisions more reliably than human drivers.