March 29, 2018
Human Driver Could Have Avoided Fatal Uber Crash, Experts Say
- Human driver may have avoided impact: forensic crash analysts
- Self-driving sensors should have detected victim, experts say
The pedestrian killed Sunday by a self-driving Uber Technologies Inc. SUV had crossed at least one open lane of road before being hit, according to a video of the crash that raises new questions about autonomous-vehicle technology.
Forensic crash analysts who reviewed the video said a human driver could have responded more quickly to the situation, potentially saving the life of the victim, 49-year-old Elaine Herzberg. Other experts said Uber’s self-driving sensors should have detected the pedestrian as she walked a bicycle across the open road at 10 p.m., despite the dark conditions.
Herzberg’s death is the first major test of a nascent autonomous vehicle industry that has presented the technology as safer than humans who often get distracted while driving. For human driving in the U.S., there’s roughly one death every 86 million miles, while autonomous vehicles have driven no more than 15 to 20 million miles in the country so far, according to Morgan Stanley analysts.
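A rough sketch of the arithmetic behind that comparison, using only the figures cited above (the mileage totals are the Morgan Stanley estimates, not precise counts): at the human-driving rate of one death per 86 million miles, the entire autonomous fleet's mileage to date would statistically account for well under one fatality.

```python
# Back-of-the-envelope check of the fatality-rate figures cited above.
HUMAN_MILES_PER_DEATH = 86e6  # ~1 death per 86 million miles, U.S. human driving

# Estimated total U.S. autonomous miles driven so far (low and high bounds).
for av_miles in (15e6, 20e6):
    # Deaths expected over that mileage if AVs matched the human rate.
    expected = av_miles / HUMAN_MILES_PER_DEATH
    print(f"{av_miles / 1e6:.0f}M AV miles -> {expected:.2f} expected deaths at the human rate")
```

With so few total miles, a single fatality pushes the observed autonomous rate above the human baseline, which is why analysts treat the sample as far too small to draw safety conclusions either way.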
"As an ever greater number of autonomous vehicles drive an ever greater number of miles, investors must contemplate a legal and ethical landscape that may be difficult to predict," the analysts wrote in a research note following the Sunday collision. "The stock market is likely too aggressive on the pace of adoption."
Zachary Moore, a senior forensic engineer at Wexco International Corp. who has reconstructed vehicle accidents and other incidents for more than a decade, analyzed the video footage and concluded that a typical driver on a dry asphalt road would have perceived, reacted, and activated their brakes in time to stop about eight feet short of Herzberg.
Other experts questioned the technology. The Uber SUV’s "lidar and radar absolutely should have detected her and classified her as something other than a stationary object," Bryant Walker Smith, a University of South Carolina law professor who studies self-driving cars, wrote in an email.
Smith said the video doesn’t fully explain the incident but "strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver (as well as by the victim)."
The video shows the vehicle driving for about four seconds and ends just as Herzberg is about to be struck by the SUV’s front right bumper. While visible, she takes several steps at what appears to be a normal walking pace, crossing the road outside a crosswalk without looking up at the SUV. Police have said the car didn’t slow or swerve to avoid the impact. She later died at a hospital.
Earlier: Picking Out the Pedestrian May Have Foiled Uber’s Robot Ride
"Uber has to explain what happened," said Mike Ramsey, an analyst at researcher Gartner Inc. who focuses on autonomous driving technologies. "There’s only two possibilities: the sensors failed to detect her, or the decision-making software decided that this was not something to stop for."
Uber’s self-driving system includes radar, cameras and lidar, which uses lasers to detect objects. The system is designed to provide a 360-degree virtual view of the environment surrounding the car. Ramsey said it is "mystifying" why the vehicle didn’t react given that lidar systems like the one used on Uber’s SUV have a detection range of at least 100 meters and work better at night than during the daytime.
“The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones,” Uber said in an emailed statement after the video’s release. “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.” The company is cooperating with the investigations. It declined to comment further on Thursday.
Herzberg becomes visible in the car’s headlights as she pushes a bicycle across the road at least two seconds before the impact.
Related: Uber Autonomous Accident Video Shows Car Just Before Collision
"This is similar to the average reaction time for a driver. That means that, if the video correctly reflects visible conditions, an alert driver may have at least attempted to swerve or brake," Smith said.
The comments contrast with those of the Tempe police chief, who told multiple media outlets after reviewing footage of the collision that the pedestrian moved suddenly in front of the car and that the crash didn’t seem preventable.
"It’s very clear it would have been difficult to avoid this collision in any kind of mode,” Sylvia Moir, the police chief in Tempe, Arizona, told the San Francisco Chronicle.
Moore, the forensic engineer at Wexco, said dashcam videos tend to understate what human drivers can see. While the pedestrian appears from the shadows in the video, a human driver may have had a better view if they’d been watching the road, he said.
Sean Alexander, of Crash Analysis & Reconstruction LLC, concurred. "Video makes everything in the light pattern brighter and everything out of the beam darker. A human eye sees it much clearer," he said.
Alexander also agreed with Moore’s analysis that a human driver could have avoided hitting Herzberg. "During the time the vehicle should have been braking, the pedestrian would have had additional time and would have cleared without the vehicle actually having to stop," Alexander said.
Tempe police released a statement Tuesday saying that "fault has not been determined in this case" and that a decision on criminal charges would be left to county prosecutors once the investigation is complete.
The video also included footage of the Uber backup driver who monitors the vehicle’s operation from behind the wheel while the computers drive. Out of approximately 13 seconds of that recording, the driver was looking down and away from the road for about 10 seconds. The driver looked up about a second before the recording ends and gasped upon seeing the impending collision.
"Even if the safety driver had been totally paying attention, there’s an awkwardness with the machine if you’re anticipating the machine is going to be able to handle a situation," Ramsey said. "You don’t know when you should jump in."
While it’s too early to say why the car hit the woman, the video doesn’t show the crash was unavoidable, said Ryan Calo, a law professor at the University of Washington who specializes in robotics and artificial intelligence.
“The idea that the video absolves Uber is essentially incorrect,” Calo said.
The important question for investigators won’t be whether the woman was visible in the low-definition video, he said. It will be what the car’s sophisticated sensors picked up and how the software interpreted that data.
“Even if the cameras did not perceive her in time, why didn’t the lidar see her and why didn’t the software predict that she would continue on the path she was on?” Calo said.
The video has been obtained by the National Transportation Safety Board and will be examined as part of its probe. The safety board has increasingly used video in its investigations and has a lab in Washington where it examines various recording devices to tease out useful forensic data.
It’s important not to read too much into the video without additional context on how the self-driving system was functioning and the conditions on the road, said Deborah Hersman, president of the National Safety Council.
“Seeing a few seconds of video raises more questions than answers,” said Hersman, who also served as chairman of the NTSB. “We have very little information about the performance of the technology and that needs to be developed in a transparent fashion and shared broadly so everyone can learn from this tragic event.”
Uber said on Monday that it was pausing tests of all its self-driving vehicles on public roads in Pittsburgh, San Francisco, Toronto and the greater Phoenix area. In Boston, self-driving startup NuTonomy Inc. halted its tests after city officials requested a pause following the Arizona crash.
April 5, 2018
Tesla Driver Died Using Autopilot, With Hands Off Steering Wheel
Tesla Inc. confirmed the Model X driver who died in a gruesome crash a week ago was using Autopilot and defended the safety record of its driver-assistance system that’s back under scrutiny following a fatality.
Computer logs recovered from the Tesla driven by Wei Huang, 38, show he didn’t have his hands on the steering wheel for six seconds before the sport utility vehicle collided with a highway barrier in California and caught fire on March 23, according to a blog post the company published late Friday.
“The driver had received several visual and one audible hands-on warning earlier in the drive,” Tesla said in the post. The driver had “about five seconds and 150 meters of unobstructed view” of the concrete highway divider and an already-crushed crash cushion that his Model X collided with, according to the company. “But the vehicle logs show that no action was taken.”
The collision occurred days after an Uber Technologies Inc. self-driving test vehicle killed a pedestrian in Arizona, the most significant incident involving autonomous-driving technology since a Tesla driver’s death in May 2016 touched off months of finger-pointing and set back the company’s Autopilot program. A U.S. transportation safety regulator said Tuesday it would investigate the Model X crash, contributing to Tesla’s loss of more than $5 billion in market value this week.
‘Mushy Middle’
“This is another potential illustration of the mushy middle of automation,” Bryant Walker Smith, a University of South Carolina law professor who studies self-driving cars, said in an email. Partial automation systems such as Tesla’s Autopilot “work unless and until they don’t,” and there will be speculation and research about their safety, he said.
Tesla defended Autopilot in the blog post, saying a vehicle equipped with the system is 3.7 times less likely to be involved in a fatal accident. U.S. statistics show one automotive fatality for every 86 million miles driven by all vehicles, compared with one for every 320 million miles driven in Tesla vehicles equipped with Autopilot hardware, according to the company.
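Tesla's "3.7 times" figure follows directly from the two mileage numbers in its post; a quick check of the division:

```python
# Reproducing Tesla's "3.7 times less likely" claim from the two cited figures.
miles_per_fatality_all_vehicles = 86e6   # one fatality per 86M miles, all U.S. vehicles
miles_per_fatality_autopilot = 320e6     # one fatality per 320M miles, Autopilot-equipped Teslas

ratio = miles_per_fatality_autopilot / miles_per_fatality_all_vehicles
print(round(ratio, 1))  # -> 3.7
```

Note the two rates are not directly comparable as a safety measure: the 86-million-mile figure covers all vehicles, road types, and driver populations, while Autopilot miles skew toward newer cars on divided highways.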
Devastating Event
“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends,” Tesla wrote, pushing back against criticism that it has lacked empathy by bringing up safety statistics to counter past scrutiny of Autopilot. “We must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.”
Tesla has introduced driver-assistance features through Autopilot that the company continuously improves via over-the-air software updates. While the company said as of October 2016 that it was building all of its cars with hardware needed for full self-driving capability, it hasn’t said when its vehicles will clear testing and regulatory hurdles necessary to drive without human involvement.
The U.S. National Transportation Safety Board sent investigators to look into the crash. The agency and the National Highway Traffic Safety Administration also are examining a Jan. 22 collision in Los Angeles involving a Tesla Model S using Autopilot and a fire truck parked on the freeway.
NTSB Findings
The NTSB concluded in September that Autopilot’s design was a contributing factor in the 2016 fatal crash in Florida involving a Model S driver who’d been using the system and collided with a semi-trailer truck. The agency criticized Autopilot for giving “far too much leeway to the driver to divert his attention to something other than driving.”
In the wake of that crash, Tesla updated Autopilot to stop allowing drivers to ignore repeated warnings to keep their hands on the wheel.
While the NTSB also criticized partially automated driving systems that monitor only steering-wheel movement and don’t measure whether drivers are watching the road, Tesla hasn’t adopted or enabled eye-tracking technology that can verify a driver’s eyes are on the road ahead.
Read more: http://www.bloomberg.com/news/articles/2018-03-31/tesla-says-driver-s-hands-weren-t-on-wheel-at-time-of-accident