Pedestrian dies in self-driving car accident - Experts respond
20 March 2018


A woman has died in Arizona after being hit by an Uber self-driving car - it is reported to be the first time a pedestrian has died as a result of a self-driving car accident.

In 2016, a Tesla driver was killed in a crash that occurred while the car's Autopilot system was engaged.

The Science Media Centre has asked local experts to comment on the accident. Please feel free to use these comments in your reporting.

Dr Paul Ralph, senior lecturer in computer science, University of Auckland, comments:


"It's critical to keep these things in perspective. People are using this incident to dismiss driverless cars as unsafe. Human drivers have killed hundreds of thousands of people. A driverless car has killed one. Moving to autonomous vehicles as quickly as possible is still the best way to reduce automotive collisions and their enormous cost in money, time and human life.

"Now that said, we're talking about Uber, a company with a terrible reputation for unethical behaviour and technological corner-cutting. If Uber knew that their autonomous vehicles were running red lights and did not take reasonable steps to correct the mistake, the company should be held criminally responsible for this woman's death. The individuals who ignored the problem should be held personally, criminally responsible - e.g. they might be charged with vehicular manslaughter or negligent homicide.

"The problem is that this research should be funded by governments and carried out by expert researchers in public-private partnerships. But national governments, including New Zealand's, remain unwilling to invest in innovation at the scale demanded by the 21st Century."
________________________________________
The UK SMC also gathered expert comments.

Prof Neville Stanton, Chair in Human Factors Engineering, University of Southampton, said:

“I don't know the details about the Uber accident or the particular technologies in the car.

“In my research, we have focused on the role of the driver within the vehicle. Typically the driver is expected to monitor the automated technology and the road environment simultaneously, and decide if they need to intervene or not. This is far more work than driving manually, akin to watching over a learner driver. Humans are not good at extended vigilance tasks of this nature, and typically their minds can wander off. We need to work on better ways of keeping the driver engaged in the driving tasks, or wait until we can produce SAE level 4 or 5 vehicle automation (which frees the driver up from this extended vigilance task). We don’t yet know what the factors were in this case, but we know from previous work that the middle ground of drivers supervising automation is not working out.”

Prof Noel Sharkey, Emeritus Professor of Artificial Intelligence and Robotics, University of Sheffield, said:

“Autonomous vehicles present us with a great future opportunity to make our roads safer. But the technology is just not ready yet and needs to mature before it goes on the road. Google is a good example, having tested continually for over a decade. Uber, like Tesla, is rushing headlong into this too quickly. Too many mistakes and the public may turn its back on the technology. A better approach is to use many of the features of self-drive cars to make our current vehicles less accident-prone. An incremental approach is not so exciting, but it will be much safer in the long run.”

Prof Martyn Thomas, Professor of IT, Gresham College, London, and Director and Principal Consultant, Martyn Thomas Associates Limited, said:

“The technology of self-driving cars is not yet fit for purpose. There is too much hubris and not enough professional safety engineering and humility. I hope that this tragedy causes the industry and policymakers to pause and then set detailed criteria before resuming testing.”


© Scoop Media