Obviously we do not have a zero-tolerance standard for human-driven cars: thousands of deaths from human error occur every year, and we have not outlawed driving. My opinion is that the objection lies in not having a person to hold directly responsible for a mistake. I also think self-driven cars have more potential for safety than the average mindless human driver.
The safety driver is an issue; hopefully there's plenty of video to sort out the tragedy. There are boulevards near me where jaywalkers routinely get struck. At night, judging a vehicle's approach speed is difficult, and waiting gets frustrating.
That is not exactly the truth of the matter. Tesla did not market Autopilot as driverless technology; it listed it as a work in progress and plainly stated that drivers should remain engaged with the controls and aware of traffic at all times.
The man who was killed disregarded all of the instructions given by Tesla and was clowning with a cell phone when he died. He was the author of his own demise, as he operated the car in a manner other than what Tesla described.
Autopilot at that time should have been described as a driver assist feature in beta. It was not marketed as driverless technology ready for autonomous operation.
As someone who used to own Tesla shares, I have followed this company, and I would suggest a retraction of the quoted passage is in order. Mr. Brown died while ignoring the clear instructions provided by the company that made and marketed the car.
Part of it is the mistake of naming that mode Autopilot. Marketing got in the way of reality: "Driver Assist" just isn't as sexy, but "Autopilot" is a lot more misleading. The expectation with AUTOpilot is, simply, autonomy. Disclaimers to the contrary have to fight that first impression.
Though in any case, yes, that driver had to disregard a bucketload of Tesla instructions and caveats. Humans.
tmay said: Maybe, autonomous vehicles are a shitty solution to most traffic problems.
Other than providing an option for people who can't physically drive a car, I don't really see much practical use for autonomous cars in urban environments. And, guaranteed, the companies that develop them will be lobbying for special legal exemptions and changes to traffic rules or infrastructure to "help" the autonomous cars operate.
Then again, 40,200 people were killed in 2016 by human drivers.
That's beside the point. All these companies are testing deadly technology on public roadways without the permission of the other people using those roadways. They are putting people in danger without those people's permission or even their knowledge. This is the Silicon Valley beta-testing mentality applied to a product that should not be subject to it. Here are the things that should be happening, but aren't, with respect to self-driving technology:
1. Self-driving tech should be tested more thoroughly on a closed course, with realistically simulated vehicle and pedestrian traffic, before it is unleashed on public roads.
2. Even before closed-course testing is authorized, the feds should research and establish performance and safety standards for that testing, standards that must be met before vehicles are allowed to use public roads.
3. Vehicles that are then tested on public roads should be clearly marked and required to follow a strict protocol as to routes, hours, and environmental conditions, with progression to more complex routes and environments, and to more hours, contingent on achieving properly calibrated safety and performance benchmarks.
Someone said: "Obviously we do not have a zero tolerance level for human driven cars ... self driven cars have more potential for safety than the average mindless human driver."
This.
And it points to who will really hold sway over the growth of autonomous vehicles - lawyers, insurers & courtrooms.
I am inclined to think that one reason they are 'behind' in self-driving tech is that they have given more thought to unglamorous issues like safety and reliability and are addressing them more seriously, thus 'slowing down' development. In truth, development isn't slowed down, because safety should be part of developing this product; what is slowed down is progress toward product introduction.
So any loss of consumer faith in self-driving technology would actually be an advantage for Apple, as people probably have greater faith that Apple would take safety more seriously than those proven reckless beta-testers: Google, Uber, Tesla, etc.
Google is certainly NOT a "reckless beta-tester". They've been working on self-driving technology since 2005, a dozen years, and actively testing since 2009. It's only in the past several months that they've moved into true autonomy, with actual driverless vehicles (no safety driver behind the wheel). Uber is a relative teenager in comparison.
So on the contrary there are some who think they've been TOO cautious and in the process allowed other companies to get up to speed with their own autonomy programs.
Reckless? Hardly.
Duh! I think we all knew this would happen sooner or later. I'm sure there will be more "sacrificial lambs" before autonomous cars really take off.
You clowns might want to get some more facts about this event before commenting. I'm not for or against autonomous cars; if anything, I don't really have an opinion at this point, because I think it's a long way off in the future. At any rate, the accident occurred at night and not at an intersection. It may easily be that the person left the sidewalk to cross the street at the worst possible moment, just as the car was passing. Having a driver in the car might have been no help in such a case.
This is unfortunate, but it will not be the last human death caused by autonomous technology. There are so many complexities to this that it will take time to get where it needs to be. I do believe that once perfected, the world will be a safer and more efficient place. The question is whether we have the stomach to get there.
Someone said: "Then again, 40,200 people were killed in 2016 by human drivers."
How many cars and trucks, and over how many miles, compared to the paltry number of Ubers and others? Not exactly a comparable data set.
This.
And the people who commented on the Tesla accident are spot-on. Tesla's system was not intended to be a fully autonomous system, nor was it advertised as such. That accident was driver error more than computer error.
I certainly hope so... It's an idea that sounds good but fails in the execution. As an example, I worked for IBM when they went into the voting machine business using Port-A-Punch punched cards; I could tell you horror stories.
The Votomatic vote recorder is a punched card voting machine originally developed in the mid-1960s. Punched card systems employ a card (or cards) and a small clipboard-sized device for recording votes. Voters punch holes in the cards with a ballot marking device. A typical ballot marking device carries a ballot label that identifies the candidates or issues associated with each punching position on the card, although in some cases the names and issues are printed directly on the card. After voting, the voter may place the ballot in a ballot box, or the ballot may be fed into a computer vote tabulating device at the precinct.

The idea of voting by punching holes in paper or cards originated in the 1890s, and inventors continued to explore it in the years that followed. By the late 1890s, John McTammany's voting machine was used widely in several states. In this machine, votes were recorded by punching holes in a roll of paper comparable to those used in player pianos, and then tabulated after the polls closed using a pneumatic mechanism. Punched-card voting was proposed occasionally in the mid-20th century, but the first major success for punched-card voting came in 1965, with Joseph P. Harris' development of the Votomatic punched-card system, based on IBM's Port-A-Punch technology. Harris licensed the Votomatic to IBM; William Rouverol built the prototype system.

The Votomatic system was very successful: by the 1996 presidential election, some variation of the punched card system was used by 37.3% of registered voters in the United States. Votomatic-style systems and punched cards received considerable notoriety in 2000, when their uneven use in Florida was alleged to have affected the outcome of the U.S. presidential election.
https://en.wikipedia.org/wiki/Voting_machine

Like the unmanned, self-driving car for public use, it was a disaster waiting to happen... Thankfully IBM got out of the voting machine business before the hanging-chad election!
Someone said: "I am not fully comfortable with the tech yet and don't think it is fully baked for city travel."
It's not, and it won't ever be if they stop testing. As someone else already mentioned, vehicles are dangerous: according to the CDC, more than 42,000 people were killed in the US in 2016 in traffic- and vehicle-related incidents. Cars are big and heavy; people are not. I would much rather walk or bike around a bunch of current-tech autonomous vehicles than around a bunch of asshats on their mobile phones.
I can see where all this is going, and I’m not even clairvoyant. Every misstep of this new technology will be on the front page, with interviews of its “victims.” As with nuclear power, the public will be whipped into a frenzy by media coverage and sensationalism and will reject AI self-driving technology outright. I’m surprised we don’t already have incidents of unstable people trying to jump out in front of these vehicles, or road-rage types trying to run them off the road. Science fiction writers have been dealing with humanity’s fear of robots and technology getting out of control for a hundred years. People are already afraid that robots are going to take their jobs, and many have indeed lost theirs to an industrial robot.
I am not fully comfortable with the tech yet and don't think it is fully baked for city travel.
Heck, we have fully baked people driving around here all the time.
40,000+ dead in 2016? No problem.
I am just commenting on what I have seen them do here in AZ. Lots of "WTF?!" type actions. Mostly they are really good and use their blinkers, but once in a while it is like they go blind for short periods of time.
As for your stat: do you think human drivers have logged more than 40,000 times as many hours (given the 150,000,000 drivers) this year as autonomous drivers have?
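That exposure question is easy to sketch numerically. The following back-of-envelope comparison uses the 40,200-deaths figure quoted in this thread, but the national mileage total and the autonomous-fleet mileage are assumed order-of-magnitude placeholders, not sourced statistics:

```python
# Rough per-mile fatality comparison. All inputs are illustrative
# assumptions, not authoritative data.
human_deaths = 40_200    # 2016 US figure quoted in the thread
human_miles = 3.2e12     # assumed: approx. US vehicle-miles traveled in 2016

av_deaths = 1            # the single fatality discussed here
av_miles = 10e6          # assumed: combined autonomous test miles, order of magnitude

# Deaths per mile driven for each group.
human_rate = human_deaths / human_miles
av_rate = av_deaths / av_miles

print(f"human:      {human_rate:.2e} deaths/mile")
print(f"autonomous: {av_rate:.2e} deaths/mile")
print(f"ratio (AV / human): {av_rate / human_rate:.1f}x")
```

The point of the sketch is only that raw death counts mean nothing without a mileage denominator; change the assumed autonomous mileage and the ratio swings by orders of magnitude.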
Someone said: "Every misstep of this new technology will be on the front page with interviews of its “victims.” ... People are already afraid that robots are going to take their jobs and many have indeed lost their job to an industrial robot."
There is no question that people will go out of their way to mess with them.
Uber has maybe a few dozen self-driving cars around, but has already killed someone.