How a life disappeared in a Tesla: the three-hour hearing revealed many details
The NTSB, which lacks substantive regulatory power, cannot implement its recommendations, and that is unlikely to change in the short term.

The National Transportation Safety Board (NTSB) held a hearing on Tuesday (February 25) summarizing its two-year investigation into a fatal 2018 Tesla crash.

During the three-hour hearing, the NTSB listed several causes of the fatal accident: Tesla's Autopilot was one probable cause of the 2018 crash into a concrete barrier; the driver was distracted, playing a mobile game while using Autopilot; and the crash attenuator in front of the barrier had been damaged in an earlier collision, which Caltrans had failed to repair in time. Had it been replaced promptly, the owner, Walter Huang, might have survived.

At the hearing, board members discussed the problems the crash exposed: overreliance on Tesla's Autopilot; lax oversight of partially automated driving technology by the National Highway Traffic Safety Administration (NHTSA); and the failure of Mr. Huang's employer, Apple, to adopt a safety policy against distracted driving. (At the time of the crash, Mr. Huang was playing a mobile game on an iPhone developed by his own company.)

The NTSB said that in this crash it saw excessive reliance on technology, driver distraction, the lack of a safety policy prohibiting phone use while driving, and road infrastructure problems; it was the combination of these factors that led to the tragedy. "We urge Tesla to continue to improve its driver-assistance technology, and we ask NHTSA to fulfill its regulatory responsibility to ensure oversight and correction where necessary. It is time to stop drivers of semi-automated cars from pretending they own fully self-driving cars."

New findings from the investigation

On March 23, 2018, Mr. Huang, a 38-year-old Apple software engineer, was driving to work in Mountain View, California. Before the crash, he had been using Autopilot for about 18 minutes, and his speed before hitting the barrier was about 71 mph. According to the preliminary investigation report, the on-board system gave him two visual alerts and one voice warning, and in the last 6 seconds before the crash the driver's hands were not detected on the steering wheel.

At the hearing, NTSB investigators presented the latest evidence, setting out 23 findings and 9 new safety recommendations.

One of the team's main findings is that the crash was caused in part by the limitations of Autopilot's vision-based processing system. Tesla CEO Elon Musk has long held that self-driving cars do not need lidar (a laser sensor that builds real-time 3D models), so the Autopilot system relies on cameras, ultrasonic sensors, and a forward-facing radar.

Investigators said the reliability of the cameras is limited, and the way Mr. Huang's vehicle drifted out of its lane before the crash is one example. In fact, investigators found that in the days and even weeks before the crash, Mr. Huang's car had exhibited similar dangerous behavior several times.

In addition, the investigators noted that Tesla "did not provide an effective means of monitoring the driver's engagement with the driving task." Tesla currently uses a torque sensor to detect whether the driver's hands are applying force to the steering wheel.

The NTSB investigation also found that the Model X's forward collision warning system neither alerted Mr. Huang to the impending collision nor slowed the vehicle; in fact, the Model X accelerated before impact, because the system determined it could return to its set cruise speed of 75 mph.

The NTSB said Tesla had not designed an emergency system to handle situations like this, and it also faulted NHTSA for not requiring companies such as Tesla to deploy such systems.

Investigators said that if Tesla does not add new safeguards that limit Autopilot to the conditions it was designed for, "the risk of future crashes remains."

Investigators also believe Mr. Huang's distracted driving contributed substantially to the crash. The mobile game he was playing before the collision was "likely" the reason he did not avoid the obstacle. The team said countermeasures such as limiting distracting functions, or locking out smartphones entirely while driving, would help reduce distraction-related crashes.

Apple does offer a similar driving mode that can disable many apps and functions while driving, but NTSB board member Thomas Chapman said, "Frankly, I didn't know my phone had such a feature."

"That, of course, is the key point. Making it the default setting would make more sense and better ensure that users know the feature exists," Chapman said.

To make matters worse, Mr. Huang was evidently overconfident in Autopilot's capabilities, as his playing games while driving demonstrates. Tesla has said in the past that overconfidence is a risk for drivers using Autopilot, which is why its owner's manual reminds drivers to stay attentive when using the system.

Nine new recommendations

The NTSB's nine new recommendations were directed at many parties, but none specifically at Tesla, perhaps because Tesla has never formally responded to the board's recommendations from another fatal Autopilot-related crash investigation in 2017.

Instead, the NTSB appears more willing to influence the decisions of Tesla and other automakers by asking other government agencies to step in and regulate.

The first four recommendations are aimed at NHTSA: the NTSB asked it to begin testing automakers' forward collision warning systems, especially how they handle "common obstacles." The current version of NHTSA's evaluation procedure, the New Car Assessment Program, does not cover this, which left some NTSB board members very dissatisfied.

The NTSB also asked NHTSA to begin evaluating Autopilot's performance limits and to determine the likelihood of the system being misused. If a safety defect is found, NHTSA should use its regulatory power to ensure that Tesla "takes corrective action."

The NTSB also asked NHTSA to work with SAE International (the Society of Automotive Engineers) to develop standards for driver monitoring systems that "minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation," and to require the technology in all vehicles with similar partially automated driving features.

On distracted driving, the NTSB recommended that Apple adopt a policy banning non-emergency use of smartphones and tablets in company vehicles or while working. It also asked the Occupational Safety and Health Administration (OSHA) to help employers raise awareness of the dangers of distracted driving and to intervene when employers fail to comply.

The NTSB also recommended that smartphone makers develop a better in-car "Do Not Disturb" mode that, when enabled, "automatically disables any function that would distract the driver while the vehicle is in motion" (while still allowing the device to be used in an emergency), and that it be enabled by default.

Is anyone listening to these recommendations?

A common theme ran through the NTSB's findings and recommendations at the hearing: companies and other government agencies are essentially exploiting the board's lack of substantive regulatory power over automated driving.

The NTSB is an independent government agency with only the power to investigate and recommend. Follow-through depends on the actions of other parties, and in practice those parties have not acted.

Tesla has not formally responded to the recommendations the agency issued 881 days earlier. NHTSA likewise failed to respond adequately to the NTSB's earlier recommendation to improve the New Car Assessment Program. And 1,044 days on, the California Department of Transportation has not responded to the NTSB's earlier recommendation to improve crash attenuators.

"What makes me uneasy is that we made recommendations in this report, and we got no response," NTSB chairman Robert Sumwalt said at the hearing that day. "This is yet another time we are using recommendations to try to effect change. Frankly, it is frustrating."

The NTSB also reiterated five earlier recommendations that regulators and companies have yet to act on. For Tesla, the board again asked the company to add safeguards that limit the use of Autopilot to the scenarios it was designed for, and it again recommended that Tesla develop a better driver monitoring system.

Sumwalt and other members expressed their dissatisfaction with NHTSA and Tesla at the hearing, accusing NHTSA of essentially abdicating its role as regulator and saying Tesla's monitoring of its drivers was deeply flawed.

Sumwalt even objected to the Autopilot name itself: "Personally, from a safety standpoint, Autopilot may not be the best brand name for the system."

However, the NTSB cannot enforce any of the changes it seeks, and that is unlikely to change in the short term. Until then, it may have to keep investigating crashes like this one.

This article comes from a contributor on Autohome's creator platform and does not represent Autohome's position.