“Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.”
This is what Tesla CEO Elon Musk wrote in a tweet last month, backtracking on his earlier claim that by 2020 there would be “over a million cars with full self-driving, software, everything.”
Yet, to Musk’s apparent surprise, achieving fully autonomous driving is far more complicated than one might think. Apart from the countless technological obstacles, there are also issues such as cybersecurity, regulatory restrictions and wider-scale infrastructure requirements to consider.
It’s time to put the brakes on the autonomous vehicle (AV) hype, as there’s still a long road ahead before we will all be able to cruise in a fully self-driving car.
Different levels of autonomous vehicles
Before delving into the technological speed bumps on the road to autonomy, it would be useful to first establish the classification of AVs.
The levels for self-driving cars range from 0 to 5. Level 0 is straightforward: it describes traditional cars with no autonomy at all. From there, the taxonomy can get a little technical, not least because companies tend to bend the descriptions to suit their own needs when promoting new technology.
When you cut through the technobabble, the standard documentation for AVs is promulgated by the Society of Automotive Engineers (SAE) and is reasonably straightforward.
Level 1 indicates that the driver is in control of the car from the start to the end of the journey. The partly automated driving support system is limited to brake or acceleration support, but not both. Examples include lane centring or adaptive cruise control.
At level 2, the driver is also in control of the car from start to end of the journey. However, the partly automated driving support system is limited to both brake and acceleration support. Examples include lane centring simultaneously with adaptive cruise control. A prominent example of level 2 technology is Tesla’s AutoPilot function.
Level 3 indicates that the driver is no longer responsible for driving the car when the specified automated driving features are engaged, but the driver must reassume responsibility for handling the vehicle when the features request it.
Honda’s partially self-driving Legend sedan is the world’s first – and currently only – certified level 3 autonomous vehicle. The Legend’s “Traffic Jam Pilot” system can control acceleration, braking and steering under certain conditions.
AVs with level 1 through 3 autonomy require a human driver, who shares the driving task with the car’s automation. These cars are properly described as semi-autonomous vehicles rather than autonomous vehicles, and typically contain a variety of automated add-ons known as Advanced Driver-Assistance Systems.
Starting at level 4, we can speak of fully self-driving cars. At this level, the driver is not responsible for driving the car or expected to take over when the automated driving features are in use. However, for level 4, these automated features can only be used in specific geofenced areas.
A fully self-driving car classified at level 5 does not currently exist. In this category, the car would be able to drive itself in any location without the need for any human interference at all.
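The taxonomy above can be condensed into a short sketch. To be clear, this is an illustrative summary only, not an official SAE artefact (the authoritative definitions are in SAE standard J3016):

```python
# A rough, unofficial summary of the SAE driving-automation levels
# described above (illustrative only; see SAE J3016 for the standard).
SAE_LEVELS = {
    0: ("human drives", "no automation"),
    1: ("human drives", "brake OR acceleration support, e.g. adaptive cruise control"),
    2: ("human drives", "brake AND acceleration support, e.g. Tesla AutoPilot"),
    3: ("car drives, human on standby", "e.g. Honda's Traffic Jam Pilot"),
    4: ("car drives", "full autonomy, but only in geofenced areas"),
    5: ("car drives", "full autonomy anywhere (does not yet exist)"),
}

def is_fully_self_driving(level: int) -> bool:
    """Levels 4 and 5 are the only ones where the driver is never
    expected to take over."""
    return level >= 4

print(is_fully_self_driving(2))  # False: AutoPilot still needs the driver
print(is_fully_self_driving(4))  # True: geofenced full autonomy
```

The key dividing line the article draws is captured by the `is_fully_self_driving` check: everything below level 4 keeps a human in the loop.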
Meanwhile, level 4 efforts are gradually gaining traction through narrow, selective public roadway trials, but the rollout is slow.
“The leap from level 4 to level 5 is huge. It’s a quantum leap,” explains Calum Macrae, head of automotive R&A at GlobalData. “It’s not just so I could take in the microcosm of being autonomous. It’s about being autonomous anywhere.”
GlobalData’s thematic analysis points out that part of the challenge is the relative ease with which SAE levels 1 and 2 were developed. Using current technology, those systems were fairly straightforward to develop, which led to the assumption that the evolution of fully self-driving cars would increase in a linear manner.
In reality, the picture is much more complicated. SAE levels 4 and 5 are exponentially harder to develop.
“The whole business of autonomous vehicles, I think it’s just about past peak hype,” argues Michael Orme, senior analyst at GlobalData.
Experts point out that the technology required for fully autonomous vehicles that do not need any human supervision is still not mature enough. These vehicles require processing power orders of magnitude greater than what is usually found in modern vehicles, banks of expensive sensors with multiple points of redundancy, and an infallible software system that handles the driving task while ensuring the safety of occupants and pedestrians.
After some original equipment manufacturers (OEMs) such as General Motors and Honda made ambitious multi-billion-dollar bets on autonomous technology, we are now seeing a more cautious tone emerge. Orme predicts that fully self-driving cars are still a decade away.
Charles Sevior, CTO for unstructured data solutions at Dell, also tells Verdict that we are 10 years away from level 5 autonomy.
Meanwhile, Thomas Dannemann, director of product marketing at Qualcomm, takes a more optimistic stance and predicts that fully autonomous vehicles will hit the road in “the second half of this decade”.
Autonomous cars are likely going to be tomorrow’s transport, but there will be some time yet before you can enjoy a coffee and croissant while reading the daily news from the back of a self-driving vehicle.
What are the main challenges?
Self-driving software is a key aspect of AVs and also constitutes one of the main technological hurdles. The software is based on artificial intelligence (AI) and deep learning neural networks that include millions of virtual neurons that mimic the human brain.
This is the area in which current science is still lacking. The neural nets do not include any explicit “if X happens, then do Y” programming. Rather, they are trained to recognise and classify objects using examples of millions of videos and images from real-world driving conditions.
This process involves the collection of a massive amount of raw data. The more diverse and representative the data, the better the networks become at recognising and responding to different situations. Training neural nets is something like holding a child’s hand when crossing the road and teaching them to learn through constant experience, replication and patience. The focus will thus lie on developing more brain-like AI.
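To make the train-by-example idea concrete, here is a deliberately tiny sketch: a single artificial neuron that learns to separate two classes of points purely from labelled examples, with no hand-coded rules. The data and parameters are hypothetical, and real AV perception stacks use deep networks with millions of neurons trained on millions of images; this only illustrates the principle of adjusting weights through repeated exposure.

```python
# Toy illustration of learning from labelled examples rather than
# explicit "if X happens, then do Y" programming: a single neuron
# (perceptron) nudges its weights every time it misclassifies.

# Hypothetical labelled data: (feature1, feature2) -> class 0 or 1.
examples = [((0.1, 0.2), 0), ((0.2, 0.1), 0),
            ((0.9, 0.8), 1), ((0.8, 0.9), 1)]

w1, w2, b = 0.0, 0.0, 0.0  # weights and bias, initially know nothing
lr = 0.5                   # learning rate: how big each correction is

for _ in range(20):  # repeated exposure, the "constant experience"
    for (x1, x2), label in examples:
        pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
        err = label - pred       # zero when correct, so no update
        w1 += lr * err * x1
        w2 += lr * err * x2
        b += lr * err

# After training, the neuron classifies all four examples correctly.
print([1 if w1 * x1 + w2 * x2 + b > 0 else 0
       for (x1, x2), _ in examples])  # → [0, 0, 1, 1]
```

The update rule never encodes what class 0 or class 1 “means”; the behaviour emerges entirely from the examples, which is exactly why diverse, representative data matters so much.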
“Human drivers are still better than robot drivers at the moment, and human drivers have this particular thing called a brain,” says Orme.
People are able to rapidly scan the world around them to make the right predictions about the prevailing circumstances and what to do in those circumstances. That’s not the case for AI.
While emergent technology can easily and accurately detect and classify objects, it still can’t mimic the intricate complexities of driving. Autonomous vehicles not only need to detect and recognise humans and other objects but must also interact with, understand and react to how these things behave.
They also need to know what to do in unfamiliar circumstances. Without a large set of examples for all possible driving scenarios, the task of managing the unexpected will be relatively resistant to deep learning and training.
At the same time, the development of algorithm-specific AI chips that, as nearly as possible, merge processing and memory will be one of the key ingredients in auto brains. The ability of vehicles to learn and improve themselves every mile they travel is front and centre of the evolving self-driving phenomenon.
Dannemann predicts that the ability to make sophisticated AI chips will set companies apart in this industry.
“I believe that those chip makers who are providing very powerful processors with very low power consumption will be the ones leading the industry,” he tells Verdict.
Auto brains – especially those tasked with full self-driving – need access to huge datasets to educate themselves. Some of these datasets include real-time streaming of data from sensors, cameras and radar systems to enable safe navigation. Cloud and in-vehicle data analytics and machine learning based on big data are a prerequisite for self-driving.
This generates another issue: the storage and processing of all that data, as Sevior points out.
“What’s happening is that these test vehicles are generating 20 to 40 terabytes per day,” Sevior says. “Now, that’s a lot of data and it actually takes a long time to copy that data. Even just to copy it from the car into the server where that analysis is going to start. So, the first thing you need is really high performance, high reliability storage.”
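A back-of-the-envelope calculation shows why Sevior's numbers are daunting. Assuming a hypothetical 10 gigabit-per-second link between the car and the ingest server (real rigs vary widely), just moving one day's worth of data takes the better part of a working day:

```python
# Rough estimate: time to copy one day's test-vehicle data
# (the upper end of Sevior's 20-40 TB figure) off the car.
terabytes = 40    # one day's capture, high end
link_gbps = 10    # assumed link speed, hypothetical

data_bits = terabytes * 1e12 * 8         # TB -> bits (decimal units)
seconds = data_bits / (link_gbps * 1e9)  # bits / (bits per second)
print(f"{seconds / 3600:.1f} hours")     # → 8.9 hours
```

And that is before any analysis begins, which is why high-performance, high-reliability storage sits at the front of the pipeline.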
Companies must digest massive amounts of data to develop the AI models behind self-driving cars. This must then be followed by data analysis, high performance computing, machine learning and deep learning. These are the research and development advances being made at the moment.
To achieve smarter vehicles, we first need a robust infrastructure and immense computing power to process all that data. Then, there is the question of when and what type of data will be transferred.
In driving environments where situations can change rapidly without warning, there is often no time for on-board systems to send great amounts of data to a cloud and then, once processed, receive details of the appropriate action to take.
At level 4 autonomy and beyond, the vehicle will have to carry its own data centre on-board that can sense, interpret and act in real-time, given the critical importance of near-zero latency in real-time road and traffic conditions.
This on-board computer will also play a critical safety role in cordoning the vehicle off from external cyberattacks. Edge computing, to give this trend its proper name, will take over more and more of the work currently done in centralised cloud-based data centres.
Amid the euphoria about the scope and prospects of self-driving cars, there looms the growing threat of cyberattacks. These can come in many forms: hacks that lock people out of their cars and demand ransom, malware that attacks brakes and steering to cause fatal accidents and sabotage of driving environments and GPS mapping.
Securing vehicles against cyberattacks will be critical to public acceptance of autonomous cars. One answer is to store more data used for autonomous driving in self-contained on-board data centres. However, this could increase a vehicle’s bill of materials and place extra burdens on its memory and processing systems.
An alternative solution may lie in the tokenisation of vehicles as blockchain technology proliferates. Many experts argue, however, that the solution lies largely at the chip and firmware level.
“The risk awareness about cybersecurity is mandatory,” says Dannemann. “When you have access to the internet, there is always the need to make sure that your car cannot be hacked. At Qualcomm, we offer several technologies that help building up a secure system that already starts with a system that is supporting safe boot features. This means that one cannot run any software on our processors. It will only allow you to run software that is certified for our processors.”
Hammering home this point, in August it was revealed that BlackBerry, the smartphone maker turned automotive software developer, had waited months to disclose a significant vulnerability that could enable hackers to take control of connected cars.
More inbound regulation
In the wake of an Uber test AV’s fatal collision with Elaine Herzberg in Arizona and Walter Huang’s fatal crash in his Tesla Model X, both in March 2018, regulators have become more focused on establishing rules for self-driving vehicles. The stricter rules will be a key factor to boost public acceptance of self-driving cars.
The USA’s National Transportation Safety Board was highly critical of both Tesla and the National Highway Traffic Safety Administration. It said Tesla’s system did not adequately check if drivers were paying attention and that the carmaker was the only one of a number of OEMs to ignore self-driving safety recommendations it had sent out in 2017.
Separately, Tesla has frequently been criticised for the way it markets Autopilot. For example, the automaker has faced scrutiny for claiming its cars have “full self-driving hardware”, which could lead to misconceptions about how advanced the system is.
Some fear that too much regulation will hamper development in the AV industry. However, comprehensive laws are also essential for the future of self-driving cars.
“What we see now is more and more regulations available, so higher autonomy vehicles with more features are about to hit the road because those regulations are now becoming available,” Dannemann says.
“The question that remains at the end is, who becomes liable if something goes wrong? In the past, it was easy, as the car owner/driver was the liable person because he has control of the car, but as soon as you give the control to a computer, the question becomes, who is liable if something goes wrong? Is it the owner of the computer, the one that is running the computer or the one who manufactured the computer? So, the question here is really, how can we make sure that such a system is controlled so that, if something goes wrong, people can deal with it properly.”
Looking at the bigger picture
We cannot look at autonomous vehicles in isolation. Instead, they are part of a larger trend involving the Internet of Things (IoT), smart cities, robotics and the overall digitalisation of life. Smart cars will only be an aspect of this ongoing development. Arguably, the worldwide 5G rollout is central to these efforts.
“It’s really the 5G that opens up a massive information superhighway that uses wireless technology,” Sevior says. “We’re seeing a lot of advances in high tech manufacturing that involves a lot of smart machines, a lot of IoT sensors, a lot of robotics, and bringing all that data together over a private 5G network, bringing that information into so-called large data lakes where subsequent analytics, machine learning operations, automation and all of these capabilities take place.”
Connectivity will be another important area for smart vehicles. Consumers consistently express a desire for more advanced connectivity features and such systems are easier and cheaper to realise than fully autonomous cars.
The vision is for cars to become more like smartphones. Current vehicles can access mobile data networks through their users’ smart devices or via built-in SIM cards. Through this connection, the vehicle can access up-to-date traffic data and even pre-arrange bookings at destinations or pre-paid parking spaces.
“For me, the car is actually the biggest robot and the biggest thing of the internet,” Dannemann says. “So, you will have a car that is driving by itself and can control itself, and it will be connected as part of your smartphone to your personal devices.”
One notable project worth keeping an eye on is the Woven City project spearheaded by Toyota. This ambitious plan brings together all the developments that underpin the expansion of smart cars, smart cities, connected devices, etc.
Established at the base of Mount Fuji, Toyota aims to build a fully connected ecosystem powered by hydrogen fuel cells. According to the company, the project “will serve as a home to full-time residents and researchers who will be able to test and develop technologies such as autonomy, robotics, personal mobility, smart homes and artificial intelligence in a real-world environment.”
Arguably, developing self-driving cars is about more than just mobility.
“It’s about collecting a lot of data, building AI outcomes, developing deep learning models that can be leveraged in other areas of human advancement,” Sevior says.
AV companies to keep our eyes on
Although there are still many obstacles to overcome before we can achieve level 5 autonomous cars, many companies are putting the pedal to the metal in hopes of becoming the leader in the industry.
In fact, investors have pointed out that the market for AVs is heavily over-capitalised, and experts have flagged that there are far too many players.
“There will be a bloodbath of a shakeout,” says Orme.
Notably, there is AutoX, a Sino-US company backed by Chinese tech giant Alibaba, which recently debuted its “Gen5” driverless robotaxis. The “driverless” label is arguably somewhat misleading, as the system offers level 4 autonomous driving.
Having said that, the company announced in January that it would roll out its robotaxi service, without a safety driver on board, to the general public in Shenzhen. The company has also been operating test robotaxis in Shanghai, but with safety drivers.
Another prominent contender is Google-backed Waymo, which recently secured $2.5bn in external funding, showing how costly it is to develop AVs.
It opened its Waymo One driverless taxi service to the general public in Phoenix, Arizona, last October. In a small area outside the city, the taxis operate fully autonomously, but they are normally monitored remotely by human safety personnel. The ride-hailing service, which uses an app to request a taxi in a similar fashion to Uber or Lyft, is not yet available in other locations.
Baidu, China’s top search engine, also has its own autonomous vehicle programme. The company partnered up with state-owned automaker BAIC Group and plans to roll out a fleet of 1,000 fully autonomous cars over the next three years.
The company had already been testing its Apollo Moon project in a number of major cities across China, including Shanghai and Shenzhen. In Beijing, the company has begun charging passengers for rides in its driverless taxi around Shougang Park, one of the sites for the Winter Olympics in 2022.
In addition, there is US-based Aurora, which made headlines recently, having raised $2bn in its special purpose acquisition company listing. The company, now valued at $11bn, also focuses on developing self-driving trucks in addition to passenger cars.
Notably, Uber in December agreed to sell its autonomous car division to Aurora in a deal that gave the ride-hailing giant a stake in the startup, highlighting the role that ride-sharing will play in the future of AVs.
Finally, as an honourable mention, we have Yandex Self-Driving Group. The Russian AV company recently announced a partnership with Grubhub, an online and mobile food-ordering and delivery platform, to serve US college campuses.
According to the deal, Yandex autonomous delivery robots will join Grubhub’s robust food delivery platform to deliver meals at select campuses in the US.