The global automobile industry, worth $3.5tn in annual revenues, faces four concurrent disruptive threats: the connected car, the electric vehicle, autonomous driving technology and the concept of transport-as-a-service. So what do autonomous vehicles mean for business, and what are the main themes?

There remain serious technology challenges ahead of autonomous vehicles. But vehicles continue to become increasingly aware of their operating environments, thanks to 3D cameras, radars, lasers, and sensor fusion. They are also becoming increasingly informed and intelligent, thanks to ultra-fast computing enabling operational agility, advanced predictive analytics and machine learning.
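Sensor fusion, in its simplest form, is a weighted merge of noisy readings. The toy sketch below (the sensor labels and noise figures are illustrative assumptions, not real specifications) combines two range estimates by inverse-variance weighting, so the more reliable sensor dominates the fused result:

```python
# Toy inverse-variance sensor fusion: merge noisy range readings so
# the lower-variance (more reliable) sensor carries more weight.

def fuse(estimates):
    """estimates: list of (value, variance) pairs -> fused value."""
    weights = [1.0 / var for _, var in estimates]
    return sum(v * w for (v, _), w in zip(estimates, weights)) / sum(weights)

# Radar reads 20.0 m with low noise; camera reads 22.0 m, noisier.
fused = fuse([(20.0, 0.25), (22.0, 1.0)])
print(round(fused, 2))  # lands nearer the radar's 20.0 m
```

Real systems fuse many more channels (radar, LiDAR, cameras, ultrasound) with filters such as Kalman variants, but the principle is the same: weight each input by how much it can be trusted.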

Why do autonomous vehicles matter to business?

The advances in automobile technology will be remorseless and will almost totally reset the industry’s supply lines and value chains within five years.

On a 10-year view, the macro socio-economic effects of urban millennial and Generation Z attitudes to car ownership and mobility will have a dramatic effect and yield a motor industry largely based on selling rides, increasingly deploying autonomous mobility, and monetising miles: an emergent industry that will probably be as large as today’s automotive industry.

What are the big trends around autonomous vehicles?

Building automobile brains

More than 30 companies worldwide are developing the brains for self-drive vehicles. The cluster of technologies being developed to turn vehicles into intelligent, sentient robots includes systems on chips, GPS, analytic sensors, wireless communications, and deep learning. Thirteen of the top 14 car makers and 12 of the largest tech companies are working to bring Level 4 (fully autonomous) vehicles to market between 2019 and 2025.

The global R&D drive in autonomous driving technology is accelerating, and its diffusion is widening in the process. Meanwhile, there is a difference of opinion about the best way to evolve towards autonomy: some believe that a ‘big bang’ AI supercomputer approach is the best way to achieve full autonomy; others favour progress via networks of microcontrollers and fused sensors.


Sensors

The emerging world of connected, increasingly autonomous vehicles will demand huge volumes of sensors: vision sensors, pressure sensors, temperature sensors and more, to feed live data sets to both onboard computers, accessed locally within the vehicle, and cloud-based data centres that are accessed remotely. Sensors are the sine qua non of self-drive. The sharpest short-term growth will be seen in image-sensing cameras and laser-based 3D sensors, where vertical-cavity surface-emitting laser (VCSEL) systems offer the most accurate auto-focus functionality in a camera.
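To get a feel for the data volumes involved, a back-of-envelope sketch helps; the resolution, colour depth and frame rate below are illustrative assumptions, not figures from any particular vehicle:

```python
# Back-of-envelope data rate for one uncompressed camera feed.
# Resolution, colour depth and frame rate are illustrative assumptions.

def camera_rate_mb_s(width, height, bytes_per_pixel, fps):
    """Raw bytes per second produced by one camera, in megabytes."""
    return width * height * bytes_per_pixel * fps / 1e6

# One 1080p colour camera at 30 frames per second:
rate = camera_rate_mb_s(1920, 1080, 3, 30)
print(round(rate, 1), "MB/s from a single camera")
```

Multiply by several cameras, plus radar and LiDAR streams, and it becomes clear why raw sensor data cannot all be shipped to the cloud in real time.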

Edge computing

In driving environments, there is often no time to send data to the cloud and receive back details of the appropriate action to take.

At Level 4 autonomy and beyond, the vehicle will have to carry its own data centre on board that can sense, infer and act in real time, given the critical importance of near-zero latency in live road and traffic conditions. This onboard computer will also play a critical safety role in cordoning the vehicle off from external cyber-attack. Edge computing, to give this trend its proper name, will take over more and more of the work currently done in centralised cloud-based data centres, and not just in cars but also in smartphones, augmented reality headsets, robots and medical equipment.
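The latency argument can be made concrete with simple arithmetic. The sketch below (the round-trip times are illustrative assumptions, not measurements) compares how far a vehicle travels while waiting for a cloud decision versus an on-board one:

```python
# Distance covered while a vehicle waits for a driving decision.
# The latency figures are illustrative assumptions, not measurements.

def metres_travelled(speed_kmh, latency_s):
    """Metres covered at a given speed during a given delay."""
    return speed_kmh / 3.6 * latency_s

cloud = metres_travelled(100, 0.20)  # ~200 ms cloud round trip
edge = metres_travelled(100, 0.01)   # ~10 ms on-board inference
print(round(cloud, 1), "m vs", round(edge, 2), "m")
```

At motorway speed, a cloud round trip can cost several car lengths of travel before any action is taken, which is exactly the gap edge computing is meant to close.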


Light detection and ranging (LiDAR) is a laser-based sensing technology for computer vision used in autonomous vehicles. LiDAR systems are currently bulky and expensive.

Computer vision alternatives to LiDAR

The business of giving vehicles perceptual acuity in all conditions has further to advance. Without high-resolution, hyper-perceptive and affordable 360-degree vision systems, the best auto ‘brains’ are useless. More and more attention within the auto sector is being paid to the other factors in the technology mix that will be essential to enabling vehicles to drive themselves. Most of the focus is on LiDAR systems, which build images of the surrounding environment and detect and recognise objects of interest by bouncing laser beams off them.
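The ranging principle itself is straightforward physics: the distance to an object is the speed of light multiplied by the laser pulse's round-trip time, divided by two. A minimal sketch, with an illustrative pulse time:

```python
# Time-of-flight ranging: range = c * t / 2, because the laser pulse
# travels out and back. The pulse time below is an illustrative value.

C = 299_792_458.0  # speed of light in m/s

def lidar_range_m(round_trip_s):
    """Range to a target from a pulse's round-trip time."""
    return C * round_trip_s / 2.0

r = lidar_range_m(333e-9)  # a pulse returning after ~333 ns
print(round(r, 1), "m")    # a target roughly 50 m away
```

The engineering difficulty lies not in this formula but in steering, timing and processing millions of such pulses per second, cheaply and reliably.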

But LiDAR systems currently cost as much as all the other parts of a self-drive vehicle combined. Until the cost falls, or economical alternatives are developed in what remains a relatively small-batch world, self-drive will be the province of fleets run by ride-hailing and car-share companies, whose vehicles have uptimes (the percentage of time a vehicle is operational) of 50% or more to defray the capital cost per vehicle. This compares to typical uptimes of 5% or less for privately owned vehicles.

Silicon photonics technology, where data is transferred among computer chips by lasers rather than electrical conductors, can reduce the cost of LiDAR systems. Last year, MIT researchers were able to fit a LiDAR sensor on a chip the size of a dime. It has no moving parts and can be mass-produced. This could change the economics of self-driving cars, drones and robots, by dramatically reducing the cost and size of the computer vision systems they need to avoid collisions.

Some players insist that LiDAR will never become commercially viable and are instead banking on 3D cameras, ultrasound and AI.

AI chips

Each new car carries an average of $500 worth of chips, compared with just $60 for the average smartphone. These chips come in all shapes and sizes: RF and baseband chips, sensors, microcontrollers and powerful CPUs to process vision interpretation data. Over the next couple of years, demand for chips from the autos sector will rise significantly. By 2030, 80% of the value of a vehicle will reside in its software and content, with AI eating the software and precision targeting and customisation shaping the content. Neural net-based machine learning will become a standard auto component. At the same time, the development of algorithm-specific AI chips that merge processing and memory as closely as possible will be a key ingredient in auto brains. The ability of vehicles to learn and improve with every mile they travel is front and centre of the evolving self-drive phenomenon.


5G networks

Widespread adoption of Level 4 (fully autonomous, able to handle entire journeys without human intervention) and Level 5 (fully autonomous in all situations, with no steering wheel or pedals) technology will call for 5G broadband wireless networks. These 5G mobile networks will require access to sufficient radio spectrum and be supported by ubiquitous base stations, in-device antennas, and denser multiple-input, multiple-output (MIMO) systems in order to serve millions of connected ‘things’ per square mile in major cities. Autonomous vehicles will have to share these 5G networks with smartphones, TVs, household appliances and wearable devices, many of which will be running pixel-intensive apps and exchanging growing volumes of real-time data. Vehicles will need to be in constant communication with other vehicles and with smart driving environments, as well as with intelligent cloud-based support. As things stand, there may not be enough fast broadband spectrum for the demands placed on the mobile internet by 2025.


Cybersecurity

Amid euphoria about the scope and prospects for self-drive, there looms the ever-larger threat of cyber-attacks. These attacks can come in many forms: ransomware that locks people out of their cars until they pay, malware that interferes with brakes and steering to cause fatal accidents, and sabotage of driving environments and GPS mapping. Securing vehicles against a myriad of cyber threats will be critical to public acceptance of fully autonomous vehicles, which offer a greater ‘attack surface’ exposed to sabotage than current Level 3 autonomy vehicles that offer motorway control. One solution is to store more of the data used for autonomous driving in self-contained onboard data centres. Another solution may lie in the ‘tokenisation’ of vehicles as blockchain technology proliferates.
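One building block of such defences is message authentication, so that a tampered command is rejected before it can reach brakes or steering. The sketch below uses a simple HMAC with a shared key, a deliberate simplification for illustration; production vehicle security stacks rely on certificate-based schemes rather than a single shared secret:

```python
# Illustrative message authentication for a vehicle command.
# A shared-key HMAC is a simplifying assumption, not a real
# automotive security architecture.
import hmac
import hashlib

KEY = b"demo-shared-key"  # illustrative only

def sign(payload: bytes) -> bytes:
    """Compute an authentication tag for a payload."""
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    """Accept the payload only if its tag matches."""
    return hmac.compare_digest(sign(payload), tag)

msg = b"brake:soft"
tag = sign(msg)
print(verify(msg, tag))            # authentic message accepted
print(verify(b"brake:none", tag))  # tampered message rejected
```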

V2X infrastructure

Autonomous vehicles also need to communicate directly with road infrastructure to provide updates on conditions, traffic, accidents and weather. Vehicle-to-everything (V2X) communication is the passing of information from a vehicle to any entity that may affect the vehicle, and vice versa.
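A V2X payload might look something like the sketch below. The field names and JSON framing are illustrative assumptions for this sketch, not the standardised SAE/ETSI message sets used in real deployments:

```python
# Illustrative V2X hazard message. Field names and JSON framing are
# assumptions for this sketch, not a standardised message set.
from dataclasses import dataclass, asdict
import json

@dataclass
class V2XMessage:
    sender_id: str
    latitude: float
    longitude: float
    speed_kmh: float
    event: str  # e.g. "ice", "accident", "congestion"

msg = V2XMessage("veh-042", 51.5072, -0.1276, 38.0, "ice")
wire = json.dumps(asdict(msg))  # serialised for broadcast
print(wire)
```

Whatever the exact format, the idea is the same: small, frequent, structured broadcasts that nearby vehicles and roadside units can parse in real time.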

There will need to be sensor-rich driving environments in ‘smarter’ cities and on highways, with broadband wireless connections offering vehicles operating-environment data and intelligence. Singapore, Hangzhou and Tokyo are leading the way, the latter with the upcoming 2020 Olympics front and centre. Cities will have to become a lot smarter, and highways will need a lot more intelligence embedded in and around them, for self-drive to truly take off.

Big data

Auto brains need access to huge datasets to train them. Some of these datasets involve live streaming of sensor data from LiDAR, cameras and radar systems to enable safe navigation. Cloud and in-vehicle data analytics and machine learning based on big data are part and parcel of self-drive. The traditional car makers are struggling to stay relevant; their key asset is the vast trove of sensor data collected from their on-road vehicles.

What is the history of autonomous vehicles?

The first semi-autonomous cars were introduced in the early 1990s with ‘Adaptive Cruise Control’. These Level 1 autonomous vehicles had computers that would speed the car up and slow it down to maintain a pre-set distance from the car in front. 2013 saw the introduction of Level 2 autonomous vehicles with ‘Active Lane Keeping Assist’, which helped cars stay in lane and took action if the car moved out of its lane, while automatic parking became a common feature of mid to high-end cars.

2016 saw the first Level 3 autonomous vehicles with ‘motorway control’. In 2018, Level 3 autonomous vehicles will hit the roads in greater numbers, but not at scale. The Advanced Driver Assist Systems (ADAS) in these cars will be able to take control of the car on motorways but hand it back to the driver at other times.

By 2020, several different makes of Level 4 vehicles will be fully available – able to handle prescribed, geo-confined journeys without any human intervention.

Meanwhile, on highways and in cities across the world, self-drive cars have been in operation for testing purposes, albeit with human drivers on board in case of emergencies, since 2016.

A wide range of semi-autonomous vehicles will be produced, with each car maker taking a slightly different approach in terms of how much control to take away from the driver and when.

Between 2020 and 2025, fully autonomous Level 4 vehicles will be running in limited zones within certain cities. But they will be permitted on the roads in only a handful of countries, including the US, Germany, Israel, the UK, Japan, Korea and China.

Beyond 2025, fully autonomous vehicles will be available in more zones and robot car sharing networks will replace taxis in many of these zones.

It is likely to be 2040, if then, before vehicles arrive in the consumer market that are fully autonomous in all situations, with no steering wheel or pedals.

The story of autonomous driving thus far…

  • 1939: At the World’s Fair, GM’s Futurama Exhibit envisaged cars that drive themselves.
  • 1953: GM and RCA develop a model automated highway to experiment with electronic control of steering and the distance between vehicles.
  • 1958: GM tests a rudimentary self-steering Chevrolet.
  • 1960s: Stanford University roboticists develop the autonomous Stanford Cart based on the Lunar Rover platform.
  • 1977: Dr Sadayuki Tsugawa unveils the first fully autonomous vehicle equipped with cameras and analogue signal processing computers at the Mechanical Engineering Lab at Japan’s Ministry of International Trade and Industry (MITI).
  • 1979: Stanford’s Hans Moravec’s Stanford Cart crosses a chair-filled room without incident or human intervention.
  • 1987: Aerospace engineer Ernst Dickmanns’ VaMoRs vehicle outfitted with 2 cameras, 18 16-bit microprocessors, and a host of sensors self-drove for 20 kilometres at an average speed of 56 mph.
  • 1994: Dickmanns piloted a retro-fitted Mercedes S-class from Munich to Odense with about “95 per cent of the distance travelled fully automatically”.
  • 1995: A team of CMU roboticists drive a 1990 Pontiac from Pittsburgh to LA with a 70-mile stretch completed without human help.
  • 2005: DARPA stages a Grand Challenge for robot vehicles to complete a 132-mile course full of bends and mountain passes. It is won by Sebastian Thrun’s Stanford University Stanley VW which completed the course without incident in 6 hours and 54 minutes.
  • 2009: Google starts testing robot cars on roads.
  • 2015: Tesla unveils Autopilot.
  • 2015: Baidu drives a retrofitted BMW3 autonomously in Beijing.
  • 2015: Delphi autonomous vehicle drives coast-to-coast.
  • 2016: Uber tests autonomous taxis with human co-pilot in Pittsburgh
  • 2017: NuTonomy self-drive taxis tested in Singapore
  • 2017: Gallup poll in the US shows a majority public concern about autonomous vehicles.
  • 2018: March: Uber vehicle with a human ‘safety driver’ on board kills a pedestrian in Arizona.
  • 2018: Autonomous lorries start to appear on highways in the US and UK.
  • 2018: Nissan-DeNA launch self-drive experiment in Yokohama.
  • 2018: Waymo runs a commercial robot taxi service in Chandler, Arizona, by year-end.
  • 2019: Tesla expects full autonomy in its vehicles.
  • 2020: GM launches robot taxi service with Lyft
  • 2020: Volvo and Uber launch robot taxi service with Geely and NVIDIA.
  • 2020: Tokyo Olympics showcases Level 4 taxis and Level 5 shuttles.
  • 2021: Ford enters robot taxi service.
  • 2021: BMW goes into series production with Level 4 vehicles.
  • 2025: Daimler working with Bosch, Uber and Nvidia goes into commercial production with Level 4 vehicles.
  • 2025: Robot ride hail/vehicle sharing takes off across the world’s urban regions led by Didi Chuxing, Uber, Lyft, and a clutch of legacy OEMs, as well as possibly by Apple, Amazon and Alibaba.
  • 2025: Level 4 vehicles able to operate fully autonomously in prescribed areas and in good, bright weather conditions become commonplace. Level 5, too, but less widely.
  • 2040: 50% chance that the Omega point of electric, shared, autonomous vehicles will be reached.
  • 2040: More than 50% chance that Level 5 vehicles will be mass market phenomena.

This article was produced in association with GlobalData Thematic Research, which provides access to in-depth reports and detailed thematic scorecard rankings.