Comments:Uber suspends self-driving car program after pedestrian death in Arizona, United States
| Thread title | Replies | Last modified |
| The Talk of the Tape | 2 | 02:27, 22 March 2018 |
| Speeding, human driver also responsible | 9 | 17:14, 21 March 2018 |
The video of Sunday's collision has been released: WARNING, it is a bit graphic.
The video clearly shows the vehicle was in the rightmost lane, which means she crossed two other lanes to the point of impact. The bike was tossed with her to the sidewalk. She would have been taken to Tempe's St. Lukes Hospital, the nearest to the accident. Between the night vision, motion sensors, heat signature, and laser guidance, a system with that much information should have performed better than a human driver. Also, she was approximately 25 ft from a light post, with a car passing her moments before - she was visible. She did not jump out from behind a bush. The vehicle never had its emergency stop activated until the manual override happened. The driver, who served years in prison for attempted armed robbery, was looking down at what could have been a computer screen or a cell phone. The NTSB is going to have a field day with this. Intel, another company testing these vehicles, grounded their fleet in southeast Phoenix. My guess is Uber is thankful they hit a homeless woman. No family for a multi-million dollar civil lawsuit. Sad but true.
I predict nobody is going to suggest that driverless vehicles are a bad idea.
No, but the IEEE (I'm a member) is looking at their ethics policy, and the group responsible for standards is mobilizing. These cars are far safer than a drunk driver. There are other issues around privacy and possible government overreach. This subject is far more complicated than just this wreck.
I would say the car was speeding. It needed to move slower. Speed limits are maximums, but drivers need to make conscious decisions and drive to conditions. If it was dark, the car needed to go way slower (in SI units, it was doing 61 km/h or so).
- "If road conditions are less than ideal, for example rain, heavy traffic, night time etc, you may be speeding even if you’re driving at or below the posted speed limit" http://www.rms.nsw.gov.au/roads/licence/documents-forms.html#RoadUsers%27Handbook
- "Research shows that speeding is more common at night, particularly by inexperienced drivers. This may be due to having fewer indicators of how fast you are going when it is dark. It is harder to see how quickly objects like trees and poles flash by." http://www.rms.nsw.gov.au/roads/licence/documents-forms.html#HazardPerceptionHandbook
- "Unfortunately, not all cyclists know or obey the road rules. You may even find cyclists riding against the traffic, riding through red traffic lights and riding without lights at night. This means that your scanning needs to be constant and careful when driving in daylight or darkness." http://www.rms.nsw.gov.au/roads/licence/documents-forms.html#HazardPerceptionHandbook
The above materials are mandatory reading for all Sydney drivers before obtaining a non-learner driving permit.
I would personally suggest holding the driver responsible for this, not only the self-driving car technology.
Yes, I wondered about that, but while specifically not excluding the option of filing charges against the operator in the future, the chief of police went on to warn pedestrians against crossing where there is no crosswalk. I'm aware that cities in the US vary in their tolerance of that; in some places it's quite common for pedestrians crossing against a traffic light or in mid-block to be ticketed, whereas at least one state, California, requires vehicles to stop when a pedestrian is crossing the street regardless of where or when. Also, tolerance of slight speeding varies from one jurisdiction to another; in some places drivers can be ticketed for driving at less than the prevailing speed, while that speed may well be a traditional 5 mph above the posted limit. However, a crucial element here is that the car was in autonomous mode. I presume that means the computer chose the speed. That it was traveling above the speed limit, after dark, raises a red flag in my mind concerning its programming, quite apart from the failure to program in an appropriate response to a pedestrian darting into the road. Next time it could be a little kid chasing a ball in broad daylight.
AFAIK in Sydney cars are required to stop regardless of where and when a pedestrian is crossing, but pedestrians are discouraged, by law, from crossing within less than 20 metres of a crosswalk (even if it does not have a zebra crossing).
Personally I had been jaywalking excessively for years, until I read some blog post in which the term was given a ridiculing tone (which is where I learnt what the word meant), and then started learning to drive. That allowed me to fully realize that the movement of cars is governed by humans, and is error-prone. That was also when I was taught the importance of eye contact with other drivers. An interesting experience.
I'd perhaps suggest that self-driving cars choose speeds below the speed limit (obviously) and reduce them by at least 10 km/h at night (in high pedestrian activity areas, perhaps by far more than that). I'd also encourage holding drivers responsible, because if the computer erred in choosing the desired speed, adjusting it needed to be within the human driver's control.
Hopefully the programmers of the self-driving car logged enough information to at least explain why it was traveling above the posted speed limit.
I am pretty sure most jurisdictions have laws requiring drivers to stop if a pedestrian is in the roadway. The driver cannot simply laugh, "Ha ha, you are not in the crosswalk. Now I've got you!" and hit them at full speed. The difference in law from place to place is how much of the blame is placed on the driver.
The police chief said it'd be hard for the vehicle to avoid in any mode, i.e., whether it was being operated by the car or by a human driver. I figure that means any case in which such an accident was avoided would be a statistical outlier — and statistical outliers like that are going to happen with a human operator. Technological "AI" isn't going to produce outliers like that (at least, not in a positive direction).
I think the officer's statement could perhaps be interpreted as 'at this speed, the death would be unavoidable, no matter who or what was driving'. I doubt he meant 'if a human driver decided to travel at a slow speed then the death would also be inevitable'. Statistical outlier means that statistically the chances of a pedestrian appearing there are so low that a majority of drivers would not decide to slow down, perhaps? I could argue that is a signal to improve driver education.
(To have a conclusive discussion, it could be interesting to obtain the footage from the camera that was present on the car.)
You're not likely to get the footage; AZ usually does not release that kind of information in fatality wrecks. If they were to release anything, all you would see is the vehicle crossing the bridge up to the point where she is visible in the median.
I'm not convinced by the "would have hit her anyway" defense. The vehicle has damage to the front bumper, but not enough to have jettisoned the bike onto the sidewalk. Remember, this is a three-lane road with two turn lanes - that is a long distance. It is also very well lit. It will be interesting to see.