BMW Group first started research into its vision of a car that could drive itself back in 2000, and just six years later a BMW was following the racing line around the Hockenheim circuit without human assistance for the first time.
Since 2011, highly automated test vehicles from the BMW Group have been driving on the A9 motorway between Munich and Nuremberg. And as part of the CES in 2014, the BMW Group gave a demonstration of highly automated driving at the limits of performance on the Las Vegas Speedway.
These are just a few of the milestones notched up by the BMW Group as it progresses toward highly and then fully automated driving. BMW is convinced that autonomous driving will have a decisive impact on personal and sustainable mobility in the future.
Today’s driver assistance systems, such as Driving Assistant Professional in the new 3 Series, form an important building block on the road to highly automated driving. The next objective has already been clearly set out: In 2021 the BMW Vision iNEXT will become the first model from BMW to offer a Level 3 system as an option.
This system will enable drivers to delegate the task of driving to the car for longer periods when traveling on the motorway at speeds of up to 130 km/h (81 mph). At the same time, a fleet of test vehicles will begin work in late 2021 with the aim of testing Level 4 functionality in large-scale trials conducted in defined urban environments.
As they learn to cope with road conditions, young children face the combined challenges of inexperience, a restricted field of vision, and small stature. They can therefore be rapidly overwhelmed by what are sometimes complex road and traffic situations. Just like children, self-driving cars also need to learn how to behave in real-life road conditions. To enable this, test vehicles are equipped with sophisticated sensor technology. Artificial intelligence and machine learning then teach the car how to recognize and react to objects on the road.
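The learning step described above can be illustrated with a toy example: a program that is shown labeled sensor readings and learns to recognize object classes from them. The feature encoding, class labels, and nearest-centroid approach here are invented simplifications for illustration, not BMW's actual system.

```python
# Toy nearest-centroid classifier: "learning" object classes from labeled
# examples, a highly simplified stand-in for the machine-learning step
# described above. Features and labels are hypothetical.

def train(examples):
    """examples: list of (feature_vector, label). Returns per-class centroids."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid lies closest to the feature vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], features))

# Hypothetical training data: (height_m, speed_m_s) per detected object.
examples = [
    ([1.7, 1.4], "pedestrian"),
    ([1.6, 1.1], "pedestrian"),
    ([1.5, 12.0], "car"),
    ([1.4, 15.0], "car"),
]
model = train(examples)
print(classify(model, [1.65, 1.2]))  # a slow, person-sized object
```

Real systems use deep neural networks trained on vast labeled datasets rather than centroids, but the principle is the same: behavior is learned from examples, not hand-coded rule by rule.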
Self-driving cars continually – and without distraction – “sense” their surroundings. In doing so they collect a great deal of data that also depicts the wider environment, such as buildings, green spaces, and people. Cameras are among the basic equipment of such a “seeing” car: they detect signs, traffic lights, and other elements of the road scene. Ultrasound sensors measure the distance from the car to other objects, while radar sensors also detect the speed of those objects. Laser scanners create a 3D image of the environment.
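The division of labor among the sensors above can be sketched as a simple data model in which each sensor contributes a different attribute of the same tracked object. The type names, fields, and `fuse` helper below are hypothetical illustrations, not a real automotive API.

```python
# Sketch of the sensor roles described above: the camera classifies an
# object, ultrasound supplies its distance, radar its relative speed, and
# the laser scanner its 3D position. All names and values are illustrative.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrackedObject:
    label: Optional[str] = None          # from camera (sign, traffic light, ...)
    distance_m: Optional[float] = None   # from ultrasound sensor
    speed_m_s: Optional[float] = None    # from radar sensor
    position_3d: Optional[Tuple[float, float, float]] = None  # from laser scanner

def fuse(camera_label, ultrasound_distance, radar_speed, lidar_position):
    """Merge one reading from each sensor into a single object picture."""
    return TrackedObject(label=camera_label, distance_m=ultrasound_distance,
                         speed_m_s=radar_speed, position_3d=lidar_position)

obj = fuse("traffic_light", 42.5, 0.0, (42.3, -1.2, 3.0))
print(obj.label, obj.distance_m)
```

The point of the sketch is redundancy: no single sensor sees everything, so the overall picture only emerges once their complementary readings are combined.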
In this context, HD maps act as a kind of safety net, allowing the vehicle to see further ahead. The car’s location is pinpointed on the map by synchronizing real-time data from the sensors with the mapping data. The on-board computer processes all the information from the various technological components into an overall picture and calculates the route the vehicle will take. This also enables predictive driving, provided that sufficient data is available and is correctly interpreted.
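The map-matching idea above can be sketched as aligning landmarks the sensors currently see (in the car's own frame of reference) with the same landmarks stored in the HD map (in world coordinates). The landmark data and the simple averaging approach below are invented for illustration; production localization is far more sophisticated.

```python
# Minimal sketch of localization against an HD map: each landmark seen by
# the sensors, matched with its known map position, yields one estimate of
# where the car must be; averaging the estimates gives a position fix.
# Landmark IDs and coordinates are hypothetical.

def localize(map_landmarks, observed):
    """
    map_landmarks: {landmark_id: (x_world, y_world)} from the HD map
    observed:      {landmark_id: (x_rel, y_rel)} relative to the car
    Returns the estimated (x, y) of the car in world coordinates.
    """
    estimates = [(mx - rx, my - ry)
                 for lid, (rx, ry) in observed.items()
                 if lid in map_landmarks
                 for (mx, my) in [map_landmarks[lid]]]
    n = len(estimates)
    return (sum(x for x, _ in estimates) / n,
            sum(y for _, y in estimates) / n)

hd_map = {"sign_17": (100.0, 50.0), "light_3": (120.0, 55.0)}
seen   = {"sign_17": (10.0, 5.0),  "light_3": (30.0, 10.0)}
print(localize(hd_map, seen))  # → (90.0, 45.0)
```

Because the map is known in advance, this lets the vehicle "see" beyond its sensor range: once its own position is fixed, everything else on the map is located too.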
To enable self-driving cars to cope safely with any road situation, millions of test miles are needed, along with millions of pieces of information. It should be remembered, however, that a high number of test miles doesn’t necessarily equate to better driving. With test driving as in many other fields, quality is more important than quantity, as anybody can drive in absolutely perfect conditions. One of the major challenges involved in developing autonomous driving technology is the need for calculations to take account of extreme situations such as evening light, heavy rain or snowfall, and the unpredictable behavior of other road users.
Dealing with such complex situations is simply impossible without a highly developed artificial intelligence system. Simulation therefore plays a key role in the development process. Because test vehicles can’t gather all of the data needed on the actual road, around 95% of all test miles are driven on a virtual, simulated basis. To ensure that a particular feature will work reliably in all conditions, situations are identified and modified on the basis of real-life data. Here, machines face the exact same challenge that confronts a child walking to school for the first time: the need to learn how to behave in actual road conditions. Only by gaining this knowledge will the child – and the self-learning car – build their own skills. Incidentally, autonomous driving also opens up a whole new future of mobility for people with disabilities.
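The "identify and modify situations" step above can be sketched as taking one scenario recorded from real driving and systematically generating variants of it – different weather, light, and road-user behavior – so a feature is exercised far beyond the conditions that were actually recorded. The scenario fields and parameter ranges below are invented for illustration.

```python
# Sketch of scenario variation for simulation: one recorded base scenario
# is expanded into a test suite covering many combinations of conditions.
# All scenario fields and value ranges are hypothetical.
import itertools

base_scenario = {"location": "urban_intersection", "rain_mm_h": 0,
                 "light": "day", "pedestrian_crossing": False}

def variants(base, rain_levels, light_levels, pedestrian_options):
    """Yield one scenario per combination of the varied parameters."""
    for rain, light, ped in itertools.product(
            rain_levels, light_levels, pedestrian_options):
        scenario = dict(base)
        scenario.update(rain_mm_h=rain, light=light,
                        pedestrian_crossing=ped)
        yield scenario

suite = list(variants(base_scenario, [0, 5, 25], ["day", "dusk", "night"],
                      [False, True]))
print(len(suite))  # 3 rain x 3 light x 2 pedestrian = 18 scenarios
```

This combinatorial growth is why virtual miles dominate: a single recorded intersection crossing can seed dozens of edge-case tests, including the heavy rain and low-light conditions named above.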
The focal point for the development of self-driving cars: the Autonomous Driving Campus in Unterschleißheim.
People represent another key factor in the future success of self-driving cars. Alongside the customer, this applies particularly to the developer, whose workplace is changing from a traditional corporate structure to that of an agile technology business with a start-up mentality. At the BMW Group, this digital transformation is exemplified by the Autonomous Driving Campus, opened in April 2018 at Unterschleißheim near Munich. Here, experts in every field have been brought together to make the future of mobility a reality.
With an area of 23,000 square meters, the campus is the perfect setting in which to design the future of mobility. In small, agile teams, a total of 1,800 experts in their fields, drawn from a variety of disciplines and recruited from all over the world, drive the development of self-driving cars. Traditional team leader and project manager roles are a thing of the past. Instead, a “product owner” defines every aspect of the features and components of a product, which is implemented in parallel by a number of self-managed teams composed of mathematicians, developers, and engineers. The advantages? Easier communication, greater transparency, and shorter decision paths. Each and every member contributes different skills and expertise. Working in two-week sprints, the teams tackle current, practical use cases. The hierarchies are flat enough and the team structures agile enough to ensure that any issues arising can be resolved directly.
The campus is a hub for testing, programming, and simulation. The self-driving cars will cover about 240 million virtual kilometers on the journey to being mass production-ready. Petabytes of data are being collected – every day. On the campus, specialists evaluate the data and are then able to code the results directly. Or vice versa – software developers sit in the vehicle with their laptops and test the code they’ve just written.
The process is similar to the move from the horse to the car, in that the mode of transport we’ve grown up with and come to know is undergoing a seismic change. So, what’s the ultimate objective? Will we use self-driving cars to make our lives faster or to slow them down? As a mobile business lounge, a traveling entertainment system, or a moving hotel room? What would you opt for?