General Motors is taking on Tesla with its next generation of all-electric vehicles.
The company plans to invest more than $20 billion in its next generation of all-electric and autonomous vehicles through 2025.
GM announced the new details during an “EV Day,” its hardest push yet to convince skeptics, particularly on Wall Street, that the more than 110-year-old automaker can compete.
I think my main fear is that the traditional car-makers are also too focused on all the AI baloney, when they should just be bringing their car-making know-how to EVs. I don't want something like a Tesla. I want something like my BMW E90 with an electric powertrain (or someone else probably wants their Ford F-150 with an electric powertrain). Until some car-maker figures that out, they'll all have trouble getting beyond being a novelty.
The Taycan was the first indication that you should not have any worries, as it actually has a two speed transmission. Ford has already sold out its first year/50,000 unit Mustang Mach E, and there is a project with Rivian for an F-150 EV, plus of course the Rivian R1T Pickup and R1S SUV coming later this year.
Elon pushed Autopilot because he needed to increase his margins, and wow the tech crowd, but there are likely huge legal liabilities from the fact that Autopilot was in fact not close to being autonomous.
Musk runs on the technical edge -- and obviously Wall Street disagrees with your assessment, since Tesla recently passed GM and Ford in valuation. And no major automaker is able to make anything even in the same technical league as a Tesla. In fact, Musk openly welcomes their coming competition. To parrot Steve's famous Wall Street Journal ad: "Welcome GM! Seriously!"
And wasn't that the same military that denied AWS by illegally granting a contract to Microsoft (since blocked by a judge)? But even if that weren't the case, assuming what they said was accurate, as I said, Musk is running on the edge of technology, and maybe they wanted something a little less state of the art and more tried and proven. I don't see them buying any electric cars or trucks anytime soon either. That's ok.
For building the CyberTruck: I doubt he will have any trouble finding a place to build it. If the U.S. f's him over he'll just go elsewhere. He has plants in China and Europe.
But that's all simply business nonsense and of only passing importance. Steve had little patience with it either. Like Musk, he was focused on creating products that made the world and people's lives better.
But I find it strange that you, a professional China hater, would be trash talking Musk. Musk is probably our best competition against their technological advances -- after all, he not only designed and built a better car than they can produce, he even went over there and showed them how to build it themselves -- something unheard of for the past 50 years or so.
You do know that the Chinese Government owns Tesla's plant in Shanghai, don't you?
"The Tesla (Shanghai) company was formally established 8 May 2018, with a share capital of 100 million yuan, wholly owned by Tesla Motors Hong Kong. In July 2018 Tesla CEO Elon Musk signed an agreement with the Shanghai regional government to build its third Gigafactory, and the first in China."
If Tesla Model 3s turn out to be priced too high for the Chinese market, and that is likely, then where will they be sold, if not the U.S.? Lack of incentives, i.e., EV tax credits, will mean that Tesla has to compete on price with a huge number of competitors.
Tesla's valuation is based on the myth of a "moat" that protects their market, but it doesn't exist. Still, buy lots of shares if you think it is worth it.
BTW, GM is in a very good position to take on Tesla, and at many different price points:
Why would anybody outside of an accountant care who owns the plant? Many companies lease everything. That's just finance.
And no, GM is NOT well positioned to take on Tesla. Wake me up when they have a viable electric car that can take you further than the local grocery store.
"With its new Ultium battery technology, GM is on track to cut the battery cell costs - the single greatest expense in electric vehicles - to less than $100 per kilowatt-hour, Barra said. That would be down from about $140 now.
That, in turn, could make battery packs - for electric vehicles like the Chevrolet Bolt EV - up to 45% cheaper, experts say.
In a similar light, Musk, Tesla’s chief executive, told investors last month that the Silicon Valley automaker was working on battery cost and range breakthroughs that it would describe at a “Battery Day” in April.
Instead of 555 internal combustion powertrain combinations, GM will be able to move to just 19 different electric vehicle propulsion systems, slashing costs, GM officials said.
Barra reiterated a goal of selling 1 million electric vehicles annually in the United States and China by 2025."
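A quick back-of-the-envelope check on the quoted numbers. The $140 and $100 per-kWh figures are from the article; the 66 kWh Bolt pack size is my assumption for scale, not something the article states:

```python
# Rough illustration of the cell-cost math quoted above.
PACK_KWH = 66          # approximate Chevy Bolt EV pack capacity (assumption)
COST_NOW = 140         # $/kWh, current cell cost per the article
COST_ULTIUM = 100      # $/kWh, GM's stated Ultium target

cells_now = PACK_KWH * COST_NOW        # cell cost at today's price
cells_ultium = PACK_KWH * COST_ULTIUM  # cell cost at the Ultium target
savings = 1 - cells_ultium / cells_now

print(f"Cell cost today:  ${cells_now:,}")    # $9,240
print(f"Cell cost Ultium: ${cells_ultium:,}") # $6,600
print(f"Cell-level savings: {savings:.0%}")   # ~29%
```

Note the cell-level savings alone is only ~29%; the "up to 45% cheaper" pack figure the experts cite would have to include pack-level engineering savings on top of the cheaper cells.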
So, how many years has Musk been making viable, high end, fully electric cars -- while GM watched?
And remind me, which GM electric car can compete with even the lowest-end Tesla? Is it the Prius? Oh wait! No, that's a Toyota.
No, once again, Detroit got passed over while it was stuck in the mud.
But eventually they will be putting fully electric cars on the road, and Musk welcomes that -- partly because it will also put more charging stations on the road. But also because Musk, like Jobs, is not driven by money. To him, money is a means, not an end. He is driven by an urge to make great products that change the world. He is not afraid of competition. Competition just makes the world better.
Apple's software quality has been in steady decline for some time now. The number of bugs, unfinished or incorrect documentation, and fragmentation are all very present in today's Apple products. In addition, Apple is starting to execute and scale poorly. That said, I still enjoy my older Apple products, but let's face reality: Apple is a good toy/hobby company. Apple fails to make software or hardware anywhere near complex enough to handle use in most mission-critical industries, i.e. medical, flight, automobiles. Tesla, by contrast, does this in spades.
..."Tesla does this in spades"
Kill drivers that actually believe that Autopilot is fully autonomous? Could it be that Elon gave them that idea?
Because that's something that the NTSB found, that Autopilot isn't close to being an autonomous driving system.
FlaSheridn said: Or even how the software they’re replacing is supposed to work. My (extremely biased) opinion is less that “Apple is uniquely bad for releasing buggy software,” than that its relative decline has been greatest.
Yeah, I think that is a really good point. It is almost like they put a new team on each project, who just produce their new, unique take on whatever particular app each year. And every year, those teams have less understanding of good UI design.
GeorgeBMac said: ... Both are extremely open in their criticism of what they see as second rate performance -- but it's not just criticism. It arises from seeing stuff that can be done better and needs to be done better but is falling short of the mark. In Steve's famous words: "This is shit. FIX IT!"
...
From an employee: "There’s no question that Elon is very aggressive on his timelines, but frankly, that drives us to do things better and faster ... when Elon says something, you have to pause and not immediately blurt out, ‘Well, that’s impossible,’ or, ‘There’s no way we’re going to do that. I don’t know how.’ So you zip it, and you think about it, and you find ways to get that done..."
The problem is that he doesn't seem to apply that to his own stuff the way Steve seemed to. I'd like to think Steve would never have implemented Autopilot, or would have pulled the plug given the level of failure and human cost.
Also, Musk seems more talk than action. He's been over-hyping self-driving vehicles for years, with seemingly little understanding of the actual state of AI and the industry. It's one thing to scare your employees not to say anything is impossible... it's kind of another to be so out of touch with reality that you actually think nothing is impossible.
You may be over focusing on self driving cars -- which are being held back by government regulations. If he were allowed to Musk would, I am sure, have already moved ahead. Plus, he is developing MUCH more than just self driving cars. As for fatalities, my understanding is that even in these early stages, self driving cars have a better safety record than human driven cars.
You continue to provide AI readers misinformation, and your understanding about Tesla in particular having a better safety record than human driven cars is false.
For a fact, Teslas have a notable problem of crashing into the rear end of maintenance and emergency vehicles.
"On Saturday August 25, a Tesla Model S crashed into a stopped firetruck in San Jose, California. The two occupants inside the Tesla sustained minor injuries, and the 37-year-old driver was arrested on suspicion of driving under the influence of alcohol. According to a police report, he told authorities, "I think I had Autopilot on." Tesla has not confirmed the semiautonomous system was in use, but it's at least the third time this year a Tesla has hit a stopped firetruck at highway speeds. We've updated this story, which originally ran on January 25, 2018, about why Autopilot and similar systems have trouble detecting stopped vehicles."
It is still happening, and Teslas are still making unplanned exits off roads and cliffs, and crashing into stationary objects, like restaurants:
"Autopilot is a Level 2 semi-autonomous system, as described by the Society of Automotive Engineers, that combines adaptive cruise control, lane keep assist, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but the deaths of Brown and Banner raise questions about those claims and suggest that Tesla has neglected to address a major weakness in its flagship technology."
"Car safety experts note that adaptive cruise control systems like Autopilot rely mostly on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects but not stationary objects. It also has difficulty detecting objects like a vehicle crossing the road not moving in the car’s direction of travel."
"Radar outputs of detected objects are sometimes ignored by the vehicle’s software to deal with the generation of “false positives,” said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University. Without these, the radar would “see” an overpass and report that as an obstacle, causing the vehicle to slam on the brakes.
On the computer vision side of the equation, the algorithms using the camera output need to be trained to detect trucks that are perpendicular to the direction of the vehicle, he added. In most road situations, there are vehicles to the front, back, and to the side, but a perpendicular vehicle is much less common.
“Essentially, the same incident repeats after three years,” Rajkumar said. “This seems to indicate that these two problems have still not been addressed.” Machine learning and artificial intelligence have inherent limitations. If sensors “see” what they have never or seldom seen before, they do not know how to handle those situations. “Tesla is not handling the well-known limitations of AI,” he added."
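The filtering behavior Rajkumar describes can be sketched in a few lines. This is a toy illustration of the failure mode, not Tesla's actual logic, and the names and thresholds are all mine: a stopped vehicle produces a radar return whose closing speed equals the car's own speed, which is exactly what an overpass or sign gantry looks like, so a naive stationary-object filter throws it away.

```python
# Toy sketch of naive radar false-positive filtering. A return that is
# stationary in the world frame (overpass, sign, stopped firetruck) gets
# discarded, which is the failure mode described above. Illustrative only.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detected object, meters
    closing_speed: float  # m/s, rate at which the gap is shrinking

def is_tracked(ret: RadarReturn, ego_speed: float, tol: float = 0.5) -> bool:
    """Keep a return only if the object appears to be moving in the
    world frame; stationary returns are treated as roadside clutter."""
    object_speed = ego_speed - ret.closing_speed  # world-frame speed
    return abs(object_speed) > tol

ego = 30.0  # own speed, ~108 km/h
moving_car = RadarReturn(range_m=50, closing_speed=5.0)     # car ahead at 25 m/s
stopped_truck = RadarReturn(range_m=80, closing_speed=30.0) # stationary object

print(is_tracked(moving_car, ego))     # True  -- kept
print(is_tracked(stopped_truck, ego))  # False -- filtered out
```

The stopped firetruck and the overpass are indistinguishable to this kind of filter, which is why the fix requires better sensor fusion or trained vision, not just a threshold tweak.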
Elon should clean up his own mess, before he disses others, but of course, Elon can't contain his ego.
Yeh, you're right. No human ever rear-ended another car. /s
That's not a convincing argument for "Full Self Driving Autopilot mode," and if it doesn't work as advertised, and it doesn't, it adds yet more liability to Tesla. Of course, plowing through pedestrians in a crosswalk while under "Full Self Driving Autopilot mode" is going to be lawyer bait, and that certainly has happened as well.
Looks like Tesla is in yet another legal quagmire, and not just in China: installing a much less capable, earlier version of hardware without telling the customer. That's just plain illegal.
But of course, Tesla can't afford to wait for the correct hardware, because then they wouldn't be able to sell as many cars, and that would cause their overvalued stock to drop.
Still, laughably funny that Elon's Tesla "cult" fully supports this bait and switch.
tmay said: The Taycan was the first indication that you should not have any worries, as it actually has a two speed transmission. Ford has already sold out its first year/50,000 unit Mustang Mach E, and there is a project with Rivian for an F-150 EV, plus of course the Rivian R1T Pickup and R1S SUV coming later this year.
Elon pushed Autopilot because he needed to increase his margins, and wow the tech crowd, but there are likely huge legal liabilities from the fact that Autopilot was in fact not close to being autonomous.
Yeah, I haven't looked at those vehicles enough as I'm not the target market, but that sounds good.
I hope the liabilities catch up with Elon, but so far, it seems the government and such agencies are quite all-in on the autonomous thing. Unless they better understand it, he could probably just give them his sci-fi pitch, and they'd believe it.
... Apple fails to make software or hardware anywhere near complex enough to handle use in most mission critical industries; i.e. medical, flight, automobiles. Tesla by contrast does this in spades.
You missed the part about running people into semi trucks and highway dividers, apparently?
GeorgeBMac said: You may be over focusing on self driving cars -- which are being held back by government regulations. If he were allowed to Musk would, I am sure, have already moved ahead. Plus, he is developing MUCH more than just self driving cars. As for fatalities, my understanding is that even in these early stages, self driving cars have a better safety record than human driven cars.
Yeah, I'm sure Musk would go even more wild than he has if he weren't being held back a bit (which isn't very much), given the claims he has made. He couldn't care less how many people die to achieve his dreams, I guess. (Should we mention he's been completely wrong so far on his projections about where the tech capabilities are at?)
I wish I could find the article, but no, they aren't already better than human drivers in terms of safety. You're being tricked by the media hype. Basically, human drivers have about one fatality every 100 million miles. We don't know the rate for AI cars, as they haven't driven that much yet, and certainly not under the conditions the average human deals with.
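That "haven't driven enough miles" point is easy to make concrete. Using the post's ballpark rate (not an official statistic), here's roughly how much driving a fleet would need before a fatality-rate comparison means anything:

```python
# Rough version of the "not enough miles yet" argument. The human rate
# is the post's ballpark figure, not an official statistic.
HUMAN_FATALITY_RATE = 1 / 100_000_000  # ~1 fatality per 100 million miles

# To observe even ~10 expected fatalities at the human rate (a bare
# minimum for any statistical comparison), a fleet would need:
miles_needed = 10 / HUMAN_FATALITY_RATE
print(f"{miles_needed:,.0f} miles")  # 1,000,000,000 miles

# At 40 miles/day per car, that's this many car-years of driving:
car_years = miles_needed / (40 * 365)
print(f"~{car_years:,.0f} car-years")
```

A billion miles is far beyond what any autonomous fleet had logged under representative conditions, which is why claims of "already safer than humans" don't hold up statistically yet.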
Tesla can't decide whether it prioritizes self driving or self parking. Look closely at the rear wheel; the car has a broken suspension, often called a "whompy wheel" and the cause of more than a few fatalities. That suspension breaking may have caused the accident.
As for the EVs out there, those announced, and the future: I'm not the target market, though I am keen on some sort of hauling/travel van, EV or hybrid, that would fit with my lifestyle at retirement, which is still off in the future. Ford is going to ship Transit EVs here in the U.S., and that might work for me. Since I live in the Eastern Sierra, distances are currently too great for the limited network of chargers without a massive battery. Hybrids make more sense.
FlaSheridn said: Or even how the software they’re replacing is supposed to work. My (extremely biased) opinion is less that “Apple is uniquely bad for releasing buggy software,” than that its relative decline has been greatest.
Yeah, I think that is a really good point. It is almost like they put a new team on each project, who just produce their new, unique take on whatever particular app each year. And every year, those teams have less understanding of good UI design.
GeorgeBMac said: ... Both are extremely open in their criticism of what they see as second rate performance -- but it's not just criticism. It arises from seeing stuff that can be done better and needs to be done better but is falling short of the mark. In Steve's famous words: "This is shit. FIX IT!"
...
From an employee: "There’s no question that Elon is very aggressive on his timelines, but frankly, that drives us to do things better and faster ... when Elon says something, you have to pause and not immediately blurt out, ‘Well, that’s impossible,’ or, ‘There’s no way we’re going to do that. I don’t know how.’ So you zip it, and you think about it, and you find ways to get that done..."
The problem is that he doesn't seem to apply that to his own stuff the way Steve seemed to. I'd like to think Steve would never have shipped Autopilot, or would have pulled the plug given the level of failure and human cost.
Also, Musk seems more talk than action. He's been over-hyping self-driving vehicles for years, with seemingly little understanding of the actual state of AI and the industry. It's one thing to scare your employees out of saying anything is impossible... it's kind of another to be so out of touch with reality that you actually think nothing is impossible.
You may be over focusing on self driving cars -- which are being held back by government regulations. If he were allowed to Musk would, I am sure, have already moved ahead. Plus, he is developing MUCH more than just self driving cars. As for fatalities, my understanding is that even in these early stages, self driving cars have a better safety record than human driven cars.
You continue to provide AI readers with misinformation, and your understanding that Tesla in particular has a better safety record than human-driven cars is false.
For a fact, Teslas have a notable problem of crashing into the rear ends of maintenance and emergency vehicles.
https://www.wired.com/story/tesla-autopilot-why-crash-radar/
"On Saturday August 25, a Tesla Model S crashed into a stopped firetruck in San Jose, California. The two occupants inside the Tesla sustained minor injuries, and the 37-year-old driver was arrested on suspicion of driving under the influence of alcohol. According to a police report, he told authorities, "I think I had Autopilot on." Tesla has not confirmed the semiautonomous system was in use, but it's at least the third time this year a Tesla has hit a stopped firetruck at highway speeds. We've updated this story, which originally ran on January 25, 2018, about why Autopilot and similar systems have trouble detecting stopped vehicles."
It is still happening, and Teslas are still making unplanned exits off roads and cliffs, and crashing into stationary objects, like restaurants;
https://www.theverge.com/2019/5/17/18629214/tesla-autopilot-crash-death-josh-brown-jeremy-banner
"Autopilot is Level 2 semi-autonomous system, as described by the Society of Automotive Engineers, that combines adaptive cruise control, lane keep assist, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but the deaths of Brown and Banner raise questions about those claims and suggest that the Tesla has neglected to address a major weakness in its flagship technology."
"Car safety experts note that adaptive cruise control systems like Autopilot rely mostly on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects but not stationary objects. It also has difficulty detecting objects like a vehicle crossing the road not moving in the car’s direction of travel."
"Radar outputs of detected objects are sometimes ignored by the vehicle’s software to deal with the generation of “false positives,” said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University. Without these, the radar would “see” an overpass and report that as an obstacle, causing the vehicle to slam on the brakes.
On the computer vision side of the equation, the algorithms using the camera output need to be trained to detect trucks that are perpendicular to the direction of the vehicle, he added. In most road situations, there are vehicles to the front, back, and to the side, but a perpendicular vehicle is much less common.
“Essentially, the same incident repeats after three years,” Rajkumar said. “This seems to indicate that these two problems have still not been addressed.” Machine learning and artificial intelligence have inherent limitations. If sensors “see” what they have never or seldom seen before, they do not know how to handle those situations. “Tesla is not handling the well-known limitations of AI,” he added."
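The filtering behavior Rajkumar describes can be made concrete with a toy sketch. This is not Tesla's or any vendor's actual code, just a minimal illustration of why a radar-based cruise-control pipeline that discards near-stationary returns (to avoid braking for overpasses and signs) will also discard a stopped firetruck:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detected object, meters
    rel_speed_mps: float  # closing speed relative to our car, m/s

def threats(returns, ego_speed_mps, stationary_margin=2.0):
    """Toy ACC filter: keep only returns that appear to be moving.

    A stopped object closes at exactly our own speed, so anything whose
    relative speed is within `stationary_margin` of ego speed looks like
    roadside clutter (overpasses, signs) and is ignored -- the known
    weakness with stopped vehicles described above."""
    return [r for r in returns
            if abs(r.rel_speed_mps - ego_speed_mps) > stationary_margin]

ego = 30.0  # ~67 mph
scene = [
    RadarReturn(80.0, 30.0),  # stopped firetruck: closes at exactly ego speed
    RadarReturn(40.0, 5.0),   # slower car ahead: closes at 5 m/s
]
print(threats(scene, ego))  # only the moving car survives the filter
```

The firetruck return is indistinguishable (by relative speed alone) from a bridge abutment, which is exactly the trade-off the quoted experts describe.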
Elon should clean up his own mess, before he disses others, but of course, Elon can't contain his ego.
Yeh, you're right. No human ever rear-ended another car. /s
That's not a convincing argument for "Full Self Driving Autopilot mode", and if it doesn't work as advertised, and it doesn't, it adds yet more liability to Tesla. Of course, plowing through pedestrians in a crosswalk while under "Full Self Driving Autopilot mode" is going to be lawyer bait, and that has certainly happened as well.
https://electrek.co/2020/03/12/tesla-lawsuite-lack-of-self-driving-computer-retrofit/
Looks like Tesla is in yet another legal quagmire, and not just in China: installing a much less capable, earlier version of the hardware without telling the customer. That's just plain illegal.
But of course, Tesla can't afford to wait for the correct hardware because they wouldn't be able to sell as many cars, and that would cause their overvalued stock to drop.
Still, laughably funny that Elon's Tesla "cult" fully supports this bait and switch.
Despite multiple well funded major firms working on developing a fully capable and independent self-driving system (including Apple) you come down on Tesla for not perfecting theirs yet. Got it!
tmay said: The Taycan was the first indication that you should not have any worries, as it actually has a two speed transmission. Ford has already sold out its first year/50,000 unit Mustang Mach E, and there is a project with Rivian for an F-150 EV, plus of course the Rivian R1T Pickup and R1S SUV coming later this year.
Elon pushed Autopilot because he needed to increase his margins, and wow the tech crowd, but there are likely huge legal liabilities from the fact that Autopilot was in fact not close to being autonomous.
Yeah, I haven't looked at those vehicles enough as I'm not the target market, but that sounds good.
I hope the liabilities catch up with Elon, but so far, it seems the government and such agencies are quite all-in on the autonomous thing. Unless they better understand it, he could probably just give them his sci-fi pitch, and they'd believe it.
... Apple fails to make software or hardware anywhere near complex enough to handle use in most mission critical industries; i.e. medical, flight, automobiles. Tesla by contrast does this in spades.
You missed the part about running people into semi trucks and highway dividers, apparently?
GeorgeBMac said: You may be over focusing on self driving cars -- which are being held back by government regulations. If he were allowed to Musk would, I am sure, have already moved ahead. Plus, he is developing MUCH more than just self driving cars. As for fatalities, my understanding is that even in these early stages, self driving cars have a better safety record than human driven cars.
Yeah, I'm sure Musk would go even more wild than he has if he weren't being held back a bit (which isn't very much), given the claims he has made. He couldn't care less how many people die to achieve his dreams, I guess. (Should we mention he's been completely wrong so far on his projections about where the tech capabilities are at?)
I wish I could find the article, but no, they aren't already better than human drivers in terms of safety. You're being tricked by the media hype. Basically, human drivers have about one fatality every 100 million miles. We don't know about AI cars, as they haven't driven that much yet, and certainly not under the conditions the average human deals with. Here is one study that talks about it a bit: https://www.rand.org/pubs/research_reports/RR1478.html
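The scale of the problem can be sketched with the classic rule-of-three bound: observing zero events in n trials caps the true rate below roughly 3/n at 95% confidence. Taking the ~1-per-100-million-miles human figure above as an illustrative baseline (not from any specific dataset), a back-of-envelope in Python:

```python
# Assumed human baseline: ~1 fatality per 100 million miles (illustrative).
human_fatality_rate = 1 / 100_000_000  # fatalities per mile

# Rule of three: zero fatalities in n miles bounds the true rate below
# ~3/n at 95% confidence, so demonstrating parity with the human baseline
# takes on the order of 3 / rate fatality-free miles.
miles_needed = 3 / human_fatality_rate
print(f"Fatality-free miles needed for 95% confidence: {miles_needed:,.0f}")

# At an illustrative 100,000 fleet miles driven per day, that is:
years = miles_needed / (100_000 * 365)
print(f"...about {years:.0f} years of driving at 100k miles/day")
```

So even a fleet logging a hundred thousand miles a day would need years of fatality-free driving before the statistics alone could support a "safer than humans" claim.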
Sorry, but self-driving is coming. It's not if but when. And, like the transition from horses to automobiles, some will be kicking, screaming, dragging their feet, and protesting. But your caution is a good thing -- adventuresome evangelists like Uber and Musk would likely unleash it before it was fully ready if not for the government anchor holding them back by demanding it be proven to meet (yet to be fully developed) safety criteria. The lessons of the 737 MAX definitely apply here.
But, regardless, it IS coming and Tesla is one of the ones leading the way. (not to sell more cars, but because that's what Musk does. Innovate to make the world a better place for people to live in. If he only cared about money he would be living comfortably on an island somewhere.)
That's why Musk welcomes others into the EV market. The more EVs on the road, the more EV charging stations there will be. (But with a 250-400 mile range, depending on the model, Teslas are already viable for most people.)
The "dilemma" is similar to that of 5G: you have to have enough 5G phones in order to make 5G transmitters economically viable -- and vice-versa.
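That chicken-and-egg dynamic can be sketched with a toy coupled-growth model (the coupling constant and starting fractions are purely illustrative): each side grows in proportion to the other side's installed base, so neither takes off alone but both compound together.

```python
# Toy two-sided network-effect model: EV adoption and charger build-out
# each grow in proportion to the other side's installed base.
# All numbers are illustrative, not fitted to any real market data.
evs, chargers = 0.01, 0.01  # initial fractions of saturation
for year in range(10):
    ev_growth = 0.5 * chargers * (1 - evs)       # EVs grow where chargers exist
    charger_growth = 0.5 * evs * (1 - chargers)  # chargers follow the EVs
    evs += ev_growth
    chargers += charger_growth
    print(f"year {year + 1}: EVs {evs:.2%}, chargers {chargers:.2%}")
```

Growth is glacial at first, then accelerates once either side reaches critical mass, which is the 5G-style dilemma described above.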
Exactly. One thing the media leaves out is how many of those human-driver fatalities come from driving in bad weather. I’d wager a very high percentage. How many Autopilot miles have Teslas logged driving in blizzards and downpours? Considering where most of them are sold, I’m going to say not many.
GeorgeBMac said: Despite multiple well funded major firms working on developing a fully capable and independent self-driving system (including Apple) you come down on Tesla for not perfecting theirs yet. Got it!
Because it was Musk that opened his mouth and started all of this.
But, regardless, it IS coming and Tesla is one of the ones leading the way. (not to sell more cars, but because that's what Musk does. Innovate to make the world a better place for people to live in. If he only cared about money he would be living comfortably on an island somewhere.)
Your propaganda fails;
"Forget Mars, Elon Musk is Colonizing Bel-Air. You can add “land baron” to SpaceX and Tesla Motors CEO Elon Musk's list of titles. The tech billionaire owns five houses in Los Angeles—all of them inside the gated community of Bel-Air."
He's just another uber rich guy, grifting off of selling the dream, and lots of people are still buying into that.
tmay said: Tesla can't decide whether it prioritizes self driving or self parking. Look closely at the rear wheel; the car has a broken suspension, often called "wompy wheel" and cause of more than a few fatalities. That suspension breaking may have caused the accident.
... Since I live on the Eastern Sierra, distances are currently too great for the limited network of chargers without having to have a massive battery. Hybrids make more sense.
Heh, yeah. Self-parking being demonstrated there. I do worry about any new vehicle manufacturer coming on board, in that the 'traditional' makers have many decades of real-world experience about what can go wrong and how to prevent it. I'm sure most of that info is obtainable somehow, but whether the new automakers obtain it is what I fear.
I still remember my dad telling me about some of the stuff he discovered coming out of engineering when he worked at a major fire-truck maker. Once, he bet the engineers that one of their designs was going to be a problem, just from looking at their prints. But they built it anyway (without changes) and proved him right. The truck couldn't be driven and tipped onto its nose.
I worked for a number of years doing CAD and rendering stuff for an industrial design firm. There were similar times where I'd look at something and go, 'that won't work' but typically wasn't taken seriously until they built the prototypes and figured that out. But, even then, there were many things that none of us saw that weren't discovered until real-world testing. Tech can't always save you on that stuff!
Also, yeah, when we lived in Northern BC, we couldn't have even driven a Tesla home from the dealer (had we bought one) without making it into a several day journey and getting permission to charge from hotels and/or home-owners along the way. It was just too remote. I saw one Tesla the whole time we lived there. Now, I see many of them each day, and could easily get by for 99% of my travel with one. But, aside from the expense, no one makes one I'm interested in so far.
razorpit said: Exactly. One thing the media leaves out is how many of those human-driver fatalities come from driving in bad weather. I’d wager a very high percentage. How many Autopilot miles have Teslas logged driving in blizzards and downpours? Considering where most of them are sold, I’m going to say not many.
Yes, so far, the testing has been done in fairly ideal conditions, with test drivers ready to take control (i.e. the easiest kind of miles). The areas are the best 'mapped' in the world. And, my understanding is that the vehicles still drive like a first-day driver's-ed student on crack (which the general public wouldn't put up with). That's why Uber disabled the safety features on that one vehicle that killed the pedestrian. Even so, the tech most likely would still have failed in that situation (especially compared to a human paying attention).
BUT, unfortunately, my hunch is that bad weather isn't the major contributor to human-driving fatalities. It is most likely impairment and distraction. And, the autonomous advocates have a good point there, that AI cars are much better in that regard. The problem is that if safety were the primary concern, we could mostly solve that on the human-driven side of the equation. Safety simply isn't the concern (or we'd be doing it).
GeorgeBMac said: Despite multiple well funded major firms working on developing a fully capable and independent self-driving system (including Apple) you come down on Tesla for not perfecting theirs yet. Got it!
It is rather irrelevant how many major firms are working on developing one, until one of them actually does. The fact that Musk is willing to throw caution to the wind to attempt to be the first out there speaks volumes. Maybe those well-funded major firms have a bit more of a sense of responsibility?
Elon pushed Autopilot because he needed to increase his margins, and wow the tech crowd, but there are likely huge legal liabilities from the fact that Autopilot was in fact not close to being autonomous.
Kill drivers who actually believe that Autopilot is fully autonomous? Could it be that Elon gave them that idea?
Because that's what the NTSB found: that Autopilot isn't close to being an autonomous driving system.
You may be over focusing on self driving cars -- which are being held back by government regulations. If he were allowed to Musk would, I am sure, have already moved ahead. Plus, he is developing MUCH more than just self driving cars. As for fatalities, my understanding is that even in these early stages, self driving cars have a better safety record than human driven cars.
For a fact, Tesla's have a notable problem of crashing into the rear end of maintenance and emergency vehicles.
https://www.wired.com/story/tesla-autopilot-why-crash-radar/
"On Saturday August 25, a Tesla Model S crashed into a stopped firetruck in San Jose, California. The two occupants inside the Tesla sustained minor injuries, and the 37-year-old driver was arrested on suspicion of driving under the influence of alcohol. According to a police report, he told authorities, "I think I had Autopilot on." Tesla has not confirmed the semiautonomous system was in use, but it's at least the third time this year a Tesla has hit a stopped firetruck at highway speeds. We've updated this story, which originally ran on January 25, 2018, about why Autopilot and similar systems have trouble detecting stopped vehicles."
It is still happening, and Tesla's are still making unplanned exits off roads, and cliffs, and crashing into stationary objects, like restaurants;
https://www.theverge.com/2019/5/17/18629214/tesla-autopilot-crash-death-josh-brown-jeremy-banner
"Autopilot is Level 2 semi-autonomous system, as described by the Society of Automotive Engineers, that combines adaptive cruise control, lane keep assist, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but the deaths of Brown and Banner raise questions about those claims and suggest that the Tesla has neglected to address a major weakness in its flagship technology."
"Car safety experts note that adaptive cruise control systems like Autopilot rely mostly on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects but not stationary objects. It also has difficulty detecting objects like a vehicle crossing the road not moving in the car’s direction of travel."
"Radar outputs of detected objects are sometimes ignored by the vehicle’s software to deal with the generation of “false positives,” said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University. Without these, the radar would “see” an overpass and report that as an obstacle, causing the vehicle to slam on the brakes.
On the computer vision side of the equation, the algorithms using the camera output need to be trained to detect trucks that are perpendicular to the direction of the vehicle, he added. In most road situations, there are vehicles to the front, back, and to the side, but a perpendicular vehicle is much less common.
“Essentially, the same incident repeats after three years,” Rajkumar said. “This seems to indicate that these two problems have still not been addressed.” Machine learning and artificial intelligence have inherent limitations. If sensors “see” what they have never or seldom seen before, they do not know how to handle those situations. “Tesla is not handling the well-known limitations of AI,” he added."
Elon should clean up his own mess, before he disses others, but of course, Elon can't contain his ego.
Yeh, you're right. No human ever rear-ended another car. /s
https://electrek.co/2020/03/12/tesla-lawsuite-lack-of-self-driving-computer-retrofit/
Looks like Tesla is in yet another legal quagmire, and not just in China, installing a much less capability earlier version of hardware, without telling the customer. That's just plain illegal.
But of course, Tesla can't afford to wait for the correct hardware because they wouldn't be able to see as many cars, and that would cause their overvalued stock to drop.
Still, laughably funny that Elon's Tesla "cult" fully supports this bait and switch.
I hope the liabilities catch up with Elon, but so far it seems the government and its agencies are quite all-in on the autonomous thing. Unless they understand it better, he could probably just give them his sci-fi pitch and they'd believe it.
You missed the part about running people into semi trucks and highway dividers, apparently?
Yeah, I'm sure Musk would go even more wild than he has if he weren't being held back a bit (which isn't very much), given the claims he has made. He couldn't care less how many people die to achieve his dreams, I guess. (Should we mention he's been completely wrong so far on his projections about where the tech capabilities are at?)
I wish I could find the article, but no, they aren't already better than human drivers in terms of safety. You're being tricked by the media hype. Basically, human drivers have a fatality rate of about one per 100 million miles. We don't know the rate for AI cars, as they haven't driven that many miles yet, and certainly not under the conditions the average human deals with.
Here is one study that talks about it a bit: https://www.rand.org/pubs/research_reports/RR1478.html
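A back-of-the-envelope version of the RAND argument can show why "they haven't driven enough miles" matters. This sketch assumes fatalities follow a Poisson process and uses the standard "rule of three" style bound; the rate and confidence level are the commonly cited figures, not numbers from the study itself.

```python
# Rough sketch: how many fatality-free miles must a fleet drive before we
# could claim, at 95% confidence, that it beats the human fatality rate?
# Assumes fatalities are a Poisson process; numbers are illustrative.

import math

HUMAN_FATALITY_RATE = 1 / 100_000_000  # ~1 fatality per 100 million miles

def miles_for_confidence(rate, confidence=0.95):
    """Miles that must be driven with zero fatalities to conclude, at the
    given confidence, that the true rate is below `rate`.
    (For zero events: miles = -ln(1 - confidence) / rate.)"""
    return -math.log(1 - confidence) / rate

miles = miles_for_confidence(HUMAN_FATALITY_RATE)
print(f"{miles / 1e6:.0f} million fatality-free miles")  # ~300 million
```

And that's just to match humans with zero observed fatalities; demonstrating a meaningful improvement with real-world crash data pushes the requirement into the billions of miles, which is the RAND report's point.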
Tesla can't decide whether it prioritizes self-driving or self-parking. Look closely at the rear wheel; the car has a broken suspension, often called the "whompy wheel" and the cause of more than a few fatalities. That suspension breaking may have caused the accident.
As for the EVs out there, and those announced, and the future, I'm not the target market, though I am keen on some sort of hauling/travel van, EV or hybrid, that would fit with my lifestyle at retirement, which is still off in the future. Ford is going to ship Transit EVs here in the U.S., and that might work for me. Since I live in the Eastern Sierra, distances are currently too great for the limited network of chargers without having a massive battery. Hybrids make more sense.
Despite multiple well funded major firms working on developing a fully capable and independent self-driving system (including Apple) you come down on Tesla for not perfecting theirs yet. Got it!
That's why Musk welcomes others into the EV market. The more EV cars on the road, the more EV charging stations there will be. (But with a 250-400 mile range, depending on the model, Teslas are already viable for most people.)
The "dilemma" is similar to that of 5G: you have to have enough 5G phones in order to make 5G transmitters economically viable -- and vice-versa.
Because it was Musk that opened his mouth and started all of this.
"Forget Mars, Elon Musk is Colonizing Bel-Air. You can add “land baron” to SpaceX and Tesla Motors CEO Elon Musk's list of titles. The tech billionaire owns five houses in Los Angeles—all of them inside the gated community of Bel-Air."
He's just another uber rich guy, grifting off of selling the dream, and lots of people are still buying into that.
I do worry about any new vehicle manufacturer coming on board, in that the 'traditional' makers have many decades of real-world experience about what can go wrong and how to prevent it, etc. I'm sure most of that info is obtainable somehow, but whether new automakers actually obtain it is what I fear.
I still remember my dad telling me about some of the stuff he discovered coming out of engineering when he worked at a major fire-truck maker. Once, he bet the engineers that one of their designs was going to be a problem, just from looking at their prints. But they built it anyway (without changes) and proved him right. The truck couldn't actually be driven and tipped on its nose.
I worked for a number of years doing CAD and rendering stuff for an industrial design firm. There were similar times where I'd look at something and go, 'that won't work' but typically wasn't taken seriously until they built the prototypes and figured that out. But, even then, there were many things that none of us saw that weren't discovered until real-world testing. Tech can't always save you on that stuff!
Also, yeah, when we lived in Northern BC, we couldn't have even driven a Tesla home from the dealer (had we bought one) without making it into a several day journey and getting permission to charge from hotels and/or home-owners along the way. It was just too remote. I saw one Tesla the whole time we lived there. Now, I see many of them each day, and could easily get by for 99% of my travel with one. But, aside from the expense, no one makes one I'm interested in so far.
Yes, so far, the testing has been done in fairly ideal conditions, with test drivers ready to take control (i.e., the easiest kind of miles). The areas are the best 'mapped' in the world. And my understanding is that the vehicles still drive like a first-day driver's-ed student on crack (which the general public wouldn't put up with). That's why Uber disabled the safety features on that one vehicle that killed the pedestrian. Even so, the tech most likely would still have failed in that situation (especially compared to a human paying attention).
BUT, unfortunately, my hunch is that bad weather isn't the major contributor to human-driving fatalities. It is most likely impairment and distraction. And, the autonomous advocates have a good point there, that AI cars are much better in that regard. The problem is that if safety were the primary concern, we could mostly solve that on the human-driven side of the equation. Safety simply isn't the concern (or we'd be doing it).
It is rather irrelevant how many major firms are working on development until one of them actually succeeds. The fact that Musk is willing to throw caution to the wind to attempt to be first out there speaks volumes. Maybe those well-funded major firms have a bit more of a sense of responsibility?