The industry will be watching this Tesla Autopilot upgrade – and especially how it deals with the so-called 'edge cases' – very closely indeed
Tesla has informed owners that it intends to roll out its so-called 'Full Self Driving' features to a limited number of users over the next few weeks. Despite its billing as 'Full Self Driving', Tesla's software is more an extensive upgrade of its existing SAE Level 2 Autopilot driver-assistance features. However, safety groups are raising concerns over what is, in effect, a full-scale beta test of new autonomous software by untrained consumers on public roads.
The update is being offered to a small number of users in Tesla's beta tester programme but, according to CEO Elon Musk, the plan is to roll it out to all customers who have purchased the Full Self Driving option by the end of 2020. The new functionality brought by the update is mainly intended to allow the car to drive on city streets, representing a significant increase in the sophistication of Tesla's self-driving software, which was previously limited to highway use. City streets are much more challenging to navigate than highways due to the sheer number of variables that have to be safely accounted for – pedestrians can step out from behind parked cars, traffic signals can catch drivers out, and all road users behave in generally less predictable ways.
Tesla claims it can safely begin to roll out 'Full Self Driving' features thanks to its extensive self-driving R&D efforts. It mainly points to its network of nearly one million vehicles, each gathering driving data that can be fed back into its neural networks to train its self-driving software. The company highlights that, by capturing this real-world data rather than running millions of simulated miles, it is better able to train for 'edge cases' – unusual situations that require specific driving responses, which are hard to predict and simulate using software.
It is important to note that, despite its name, the feature still falls under Level 2 autonomy as defined by the Society of Automotive Engineers – a vehicle that can control acceleration, braking and steering, but must be supervised by a human at all times. Safety advocates are concerned that Tesla's insistence on calling the feature 'Full Self Driving' may mislead customers into believing the vehicle can drive itself, encouraging them to take their eyes off the road and raising the risk of a crash.
For its part, Tesla has informed the US National Highway Traffic Safety Administration (NHTSA) of its plan to roll out the feature. The agency has said that it will monitor the rollout closely and will act if it feels safety is compromised. However, commentators including the Partners for Automated Vehicle Education (PAVE) worry that too little scrutiny is being given to the plans. They highlight the dangers of releasing self-driving software to untrained users who may accidentally or intentionally misuse it, along with the misleading nature of calling a function 'Full Self Driving' when it in fact requires full human oversight.
The results of Tesla's rollout will be watched closely by all players in the autonomous vehicle industry. If successful, it would demonstrate to the wider public that self-driving systems are viable and increase consumer acceptance – much as Tesla has already done with electric vehicles. The risks are great, however. If a Tesla in 'Full Self Driving' mode hits a pedestrian or collides with another vehicle, significant scrutiny will fall on both Tesla and the NHTSA for permitting this beta test. In the worst case, such an accident could permanently blunt consumer confidence in AVs, making it harder for any autonomous vehicle developer to convince the public of its product's safety and viability.