[CONNECTED CAR] Tesla’s vulnerable Autopilot… to stickers

Researchers at Tencent’s Keen Security Lab recently demonstrated the fallibility of Tesla’s Autopilot. They showed that simply placing stickers on the road was enough to mislead a Tesla in Autopilot mode and steer it into the opposite lane. The attack targets the Autopilot mode shipped prior to 2017, and the researchers pointed out that none of the car’s systems need to be hacked.

All it takes is placing small stickers on the road that the autonomous driving system interprets as an indication of a lane change. To reassure drivers, the researchers noted that Tesla cars require the driver’s confirmation before making any lane change. Tesla’s automatic control system relies on a variety of sensors that collect information about the vehicle’s road environment. That information is then fed to the on-board computers, which use machine learning to make the most appropriate driving decisions in real time.
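As a rough mental model of that pipeline, the sketch below fits a straight lane to detected marking points and turns the fit into a steering value. This is a minimal illustration in Python, not Tesla’s actual code: the coordinate convention, the `fit_lane_heading` and `steering_command` functions, and the gain are all invented for the example.

```python
import numpy as np

def fit_lane_heading(points):
    """Fit a straight lane model x = a*y + b to detected marking
    points (x = lateral offset, y = distance ahead) and return the
    slope `a` and the offset `b` at the car's position."""
    a, b = np.polyfit(points[:, 1], points[:, 0], deg=1)
    return a, b

def steering_command(points, gain=0.5):
    """Turn the fitted lane geometry into a steering value:
    positive steers right, negative steers left (hypothetical
    convention chosen for this sketch)."""
    a, b = fit_lane_heading(points)
    # Follow the lane's heading while cancelling the lateral offset.
    return gain * (a + b)

# Clean lane: markings run straight ahead, slightly right of centre.
clean = np.array([[0.1, 5.0], [0.1, 10.0], [0.1, 15.0], [0.1, 20.0]])
print(steering_command(clean))  # ~0.05: drive essentially straight
```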

Using three stickers, the researchers formed a line angling to the left that pushed the Autopilot of a Tesla Model S 75 into the oncoming lane. The machine learning algorithms treated the stickers as evidence that the lane itself was bending in that direction. Although the flaw is not especially dangerous, since the driver’s approval is still required, it can nonetheless be exploited. According to the researchers, it could be corrected by “developing lane recognition at the rear of the vehicle”.
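To see why a handful of small stickers is enough, consider a toy version of the same least-squares lane fit: adding three off-lane points a few metres ahead of the car tips the fitted slope to the left. This is an assumed simplification of how a vision-based lane detector could be skewed, not the actual Autopilot algorithm, and all values are made up for illustration.

```python
import numpy as np

def lane_slope(points):
    """Least-squares slope of the perceived lane, x = a*y + b
    (x = lateral offset, y = distance ahead); a negative `a`
    means the lane appears to veer left."""
    a, _ = np.polyfit(points[:, 1], points[:, 0], deg=1)
    return a

# Genuine markings: a straight lane directly ahead of the car.
markings = np.array([[0.0, 5.0], [0.0, 10.0], [0.0, 15.0]])

# Three small "stickers" ahead, offset to the left, which the
# detector mistakes for additional lane markings.
stickers = np.array([[-0.4, 18.0], [-0.8, 19.0], [-1.2, 20.0]])

print(lane_slope(markings))                         # 0.0: straight
print(lane_slope(np.vstack([markings, stickers])))  # ~-0.07: veers left
```

Because the fit weights every detected point, a few well-placed outliers are enough to bend the perceived lane even though the real markings are unchanged.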

In response, Tesla said it had developed a bug bounty program precisely to collect this kind of feedback. It added that the results were obtained in scenarios where “the physical environment of the vehicle was artificially modified” and that, consequently, “this was not a realistic concern given that a driver can easily override Autopilot at any time”.