Driver assistance systems found in modern vehicles, such as lane keeping assist or adaptive cruise control, are undoubtedly useful developments: if the driver's attention lapses for a moment, they can prevent a collision or even a serious accident. However, it is no coincidence that we call them assistance systems, since using them still requires the driver to pay constant attention to what is happening around the car.
The situation is complicated by the fact that a scale has been created for automated vehicles that shows how much of the driver's tasks a given vehicle can take over. "Traditional" cars sit at autonomy level 0, while modern cars equipped with the driver assistance systems mentioned at the beginning of the article are capable of level 2 automated driving. Level 3 already allows the driver to take their hands off the steering wheel, but even then they must be ready to intervene whenever the car's systems decide that safe driving requires human skills.
Currently only one car in Europe can actually be used this way: the Mercedes EQS received the necessary official approval at the end of last year, but it is valid only for the territory of Germany, only on highways, and even there the electric limousine may be driven hands-free only up to a speed of 60 km/h. So is it up to the driver to decide when to let the car drive?
Yes, but it seems that owners of more modern cars do not care much about the rules and tend to overestimate the capabilities of their vehicles. This is evidenced by a recent study by the American Insurance Institute for Highway Safety (IIHS), according to which a disturbingly large number of people misuse the aforementioned driver assistance systems, writes InsideEVs; the phenomenon is presumably not unknown in Europe either. The study surveyed users of solutions such as Tesla's Autopilot driver assistance system (which, despite its name, cannot drive a car on its own), Cadillac's Super Cruise, and ProPilot Assist, developed by Nissan (and its American subsidiary Infiniti).
The result is frightening: almost half of the owners of cars equipped with such systems trust the software so much that they simply take their eyes off the road while driving, and some even let go of the steering wheel to do other things. About 50 percent of Super Cruise users and 42 percent of Tesla Autopilot users said their system had locked them out at least once, i.e. slowed down and/or stopped after repeatedly warning the driver to put their hands on the steering wheel and concentrate on driving. The InsideEVs editor notes that when he relied on a Tesla Model 3's Autopilot as an experiment, it took less than a minute for the software to warn him to grip the steering wheel properly.
However, many try to circumvent such safety features with tricks, much as some taxi drivers plug a cut-off buckle into the seat belt latch to silence the annoying warning chime.
The most important finding of the study is that early adopters of such technologies do not fully understand their limitations from the very first moment, warned David Harkey, president of the IIHS. He added that such misunderstandings can certainly stem from the design of the system itself, but marketing also bears responsibility, for example calling a system "Autopilot" when it is not capable of driving on its own.
Alexandra Müller, an IIHS researcher, drew attention to the need for robust safeguards in the design of such automated systems to prevent misuse.