TX: Self-driving Tesla with no one in driver's seat crashes, incinerating both passengers

TEXAS: Two die in driverless Tesla incident. Where are the regulators?



(This is a WILD Story! Authorities claim NO ONE WAS DRIVING)

LA TIMES | APRIL 19, 2021 | Russ Mitchell

It’s a 21st century riddle: A car crashes, killing both occupants — but not the driver.

That’s what happened over the weekend in Houston, where a Tesla Model S slammed into a tree and killed the two men inside. According to police, one had been sitting in the front passenger seat, the other in the back of the car.



Although investigators have not said whether they believe Tesla’s Autopilot technology was steering, the men’s wives told local reporters the pair went out for a late-night drive Saturday after talking about the system.

Tesla Chief Executive Elon Musk pushed back on speculation but also asserted no conclusion, tweeting Monday that “Data logs recovered so far show Autopilot was not enabled.” The company has resisted sharing data logs for independent review without a legal order.



After Musk’s tweet, a county police official told Reuters that the department would serve a warrant for the data.

Autopilot technically requires the human driver to pay full attention, but it’s easy to cheat the system, and the internet is rife with videos of pranksters sitting in the back while a Tesla cruises down the highway with the driver’s seat empty.

It’s a state of affairs that leaves many auto safety experts and driverless technology advocates wondering just what it will take before regulators step in and put an end to the word games and rule-skirting that have allowed it to continue. Could the crash in Houston provide that impetus?

“I suspect there will be big fallout from this,” said Alain Kornhauser, head of the driverless car program at Princeton University.

Tesla’s Autopilot system has been involved in several fatal crashes since 2016, when a Florida man was decapitated as a Tesla on Autopilot drove him under the trailer of a semi truck. Less lethally, Teslas have slammed into the back of firetrucks, police cars and other vehicles stopped on highway lanes.

Yet little action has been taken by federal safety officials and none at all by the California Department of Motor Vehicles, which has allowed Tesla to test its autonomous technology on public roads without requiring that it conform to the rules that dozens of other autonomous tech companies are following.

The National Highway Traffic Safety Administration said Monday that it had dispatched a “Special Crash Investigation team” to Texas. The agency, an arm of the U.S. Department of Transportation, said it “will take appropriate steps when we have more information.”

The agency declined to speak with The Times about what those steps might be.

Since 2016, NHTSA has launched investigations into at least 23 crashes involving Autopilot, but if they resulted in any conclusions or actions, NHTSA hasn’t told the public about them.

Jason Levine, executive director of the Center for Auto Safety, thinks it’s about time that changes.

“There doesn’t seem to be much activity coming out of our federal safety administration with respect to what is pretty evidently becoming a public danger,” he said. “You’ve got the market getting ahead of regulators, which isn’t uncommon, but this all didn’t start yesterday.”

Tesla sells an enhanced version of Autopilot called Full Self-Driving Capability for $10,000, although there is no car sold anywhere in the world today that is capable of full self-driving.





Although Tesla technology might well be safe when used as directed, Tesla’s marketing can lead people to believe the car is capable of autonomous driving. NHTSA, Levine points out, has rules against “predictable abuse” in automotive technology.

“It is predictable when you call something Autopilot it means autopilot, and when you call something Full Self-Driving it means full self-driving,” he said.

Incidents such as the fatal Texas crash “are foreseeable incidents,” Levine said, “no matter how many disclaimers Tesla lawyers decide to insert in fine print.”

Musk disbanded the company’s media relations department in 2019. Emails to the company were not returned.

The California DMV is in a position to clarify matters but thus far has not. In previously undisclosed emails to the DMV in recent months, made public by the legal document transparency organization Plainsite, Tesla told the DMV that its system is not autonomous but a so-called Level 2 driver assist system.

The DMV’s own regulations bar companies from advertising the sale or lease of a vehicle as autonomous if it “will likely induce a prudent person to believe a vehicle is autonomous.”

In public presentations and slideshows, DMV Deputy Director Bernard Soriano described Level 4 automation, which requires no human driver, this way: “Full self-driving.”

In a lengthy emailed statement, the DMV suggested that it views what Tesla is selling as a non-autonomous system. It did not address questions about whether the company, in using the term Full Self-Driving, is violating the regulation against misrepresenting driving systems as autonomous.

Adding to the confusion, Musk himself has appeared on “60 Minutes” and Bloomberg TV behind the wheel of a Tesla with his hands in the air. He’s been talking about Tesla’s fully autonomous technology as if it’s imminent since 2016. That year, Tesla posted a video showing one of its cars running in autonomous mode through Palo Alto. “The person in the driver’s seat is only there for legal reasons,” the video said.

The same year, he announced a coast-to-coast test drive of an autonomous Tesla by the end of 2017, which as of April 2021 has not happened. He told a Shanghai conference in 2020 that the “basic functionality” for fully autonomous driving would be complete that year. It wasn’t. He said the company would have 1 million driverless robotaxis on the road by the end of 2020, which would cause Tesla cars to appreciate in value. So far there are none.

The misleading promises and the confusing nomenclature are beginning to rile other players in the driverless car industry. Several industry executives have told The Times that they fear that Musk’s behavior could disturb the public and cause policymakers to enact restrictive laws and regulations that could unnecessarily delay the introduction of driverless cars.

Now, some are beginning to speak out publicly.

“We’ve had multiple years of claims that ‘by the end of the year it’s going to be magically self-driving by itself without a human in the car,’” Ford’s autonomous vehicles head, John Rich, said at a recent Princeton University conference. “It is not helpful, OK? It is confusing the public. Frankly even the investor community is very, very confused as to what paths are plausible and what the capabilities of the different systems are.”

Musk has long cultivated a maverick approach to robot-car technologies. Other car and tech companies combine radar, lidar and visual sensors in their systems to identify and analyze a robot-car’s surroundings. Musk believes lidar is an unnecessary expense and recently announced Tesla would soon stop using radar, too, relying solely on visual sensors for the main driving task.

And although other companies with advanced driver-assist systems similar to Autopilot use infrared cameras to make sure a human is in the driver’s seat and paying attention to the road ahead, Musk specifically rejected that technology in favor of a steering wheel sensor that can be easily defeated by hanging a weight off the wheel or jamming an object into it.

General Motors’ SuperCruise system, for example, allows hands-free driving and automated lane changing on interstates and other limited-access highways, but monitors the driver to ensure they’re paying attention to the driving task. If not, warning lights and sounds are deployed. If the driver remains inattentive, the car will exit traffic lanes and stop itself.

Ford recently announced a similar product, BlueCruise, expected to become available later this year on Mustang Mach-E electric cars and Ford F-150 pickups. Neither company refers to the technology as full self-driving.

MORE: https://www.latimes.com/business/st...-autopilot-kills-two-where-are-the-regulators
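Side note (mine, not the article's): the SuperCruise escalation described above is basically a small state machine. Here's a toy sketch in Python -- the stage names, timings, and thresholds are all invented for illustration, not GM's actual logic:

from enum import Enum

class Stage(Enum):
    HANDS_FREE = 1       # driver attentive, system steering
    VISUAL_WARNING = 2   # light bar on the steering wheel flashes
    AUDIO_WARNING = 3    # chimes join the flashing light
    SAFE_STOP = 4        # car exits traffic lanes and stops itself

# Invented timings: seconds of continued inattention before each escalation.
ESCALATE_AFTER = {Stage.HANDS_FREE: 4, Stage.VISUAL_WARNING: 4, Stage.AUDIO_WARNING: 6}

def next_stage(stage, attentive, seconds_inattentive):
    """One tick of the driver monitor: clear warnings when attention returns, else escalate."""
    if attentive:
        return Stage.HANDS_FREE
    if stage is Stage.SAFE_STOP:
        return stage  # terminal: the car is already pulling itself out of traffic
    if seconds_inattentive >= ESCALATE_AFTER[stage]:
        return Stage(stage.value + 1)
    return stage

# e.g. next_stage(Stage.AUDIO_WARNING, False, 6.0) -> Stage.SAFE_STOP

The point of that design, per the article, is that inattention never goes unanswered: every path either returns to attentive hands-free driving or ends in a controlled stop, which is exactly what a defeatable steering-wheel sensor can't guarantee.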

NHTSA and NTSB are too busy surfing porn sites to do any "Regulating" and who needs "Regulation" anyway, right RPF'rs? Maybe this Musk liar has pictures of the head of NHTSA in panties or something.

Cooked alive - might want to avoid Lithium-Ion battery fires whenever possible. Probably extremely uncomfortable, at least until you lose consciousness. The good thing is, you'll be cremated quickly. Your family will save on the cost of a coffin.





Sometimes these batteries catch fire just driving down the street



RELATED:

Tesla crash that killed two shines spotlight on company’s shaky history with U.S. safety investigators

Officials said Tesla is not party to a probe into a weekend Model S crash that killed two — a departure from typical protocol
https://www.washingtonpost.com/technology/2021/04/20/tesla-crash-investigation/


Related:

Elon Musk said he was a Secret Service 'special agent' when he donated to the Republican Party, an FEC filing shows

An FEC filing showed that Musk listed his employer as the US Secret Service.
https://www.businessinsider.in/tech...-an-fec-filing-shows/articleshow/82165452.cms
 
Riding in a solar oven on top of a napalm bomb. Probably while wearing polyester.

I miss the 'seventies.

The children wonder how anyone survived the decade. But that was safer than the back seat of a self-driving Tesla. Even with Auntie driving.

Precisely.

I miss the seventies as well.

But hey, look at it this way, we're going to get to re-live a lot of it here shortly.

Granted, most of it's the bad part: stagflation, inner-city crime, foreign oil dependency... but what the hell?

I expect Biden to start wearing WIN buttons any day now.
 
File a patent on this if you will: a fireproof blanket of sorts to throw or otherwise mount over a burning electric vehicle, which could then be pumped full of CO2 automatically. It would be a good thing for every fire department in the world to have. Thank me later.

Really?
[image: chart of fire extinguisher classes]


And which ones clearly state DO NOT USE on electrical equipment?

You think I just make stuff up?
 
You told me to file a patent on what is an already long-existing product..

The patent would be on the containment of the gas to more effectively smother the fire. That patent most likely also exists already in some form. However, it would be a very neat solution for electric cars that are burning out.
 
> The patent would be on the containment of the gas to more effectively smother the fire. That patent most likely also exists already in some form. However, it would be a very neat solution for electric cars that are burning out.

It might..

but it is rarer than gas car fires, which are quite common..
Though not nearly as common as Hollywood makes them seem.

I'm guessing that it is mostly poor training.

But I have watched gas spread wider when water was sprayed on it.. (why airports use FOAM)
and I know what happens when you hit magnesium with a cutting torch.
 
> Really?
> [image: chart of fire extinguisher classes]
> And which ones clearly state DO NOT USE on electrical equipment?
> You think I just make stuff up?

A small water fire extinguisher would be useless, but water from a fire engine is sprayed to cool down the batteries.
 
> A small water fire extinguisher would be useless, but water from a fire engine is sprayed to cool down the batteries.

That is Tesla's training advice as well..

Not sure it is good advice in every case.. It might cool overheated batteries.

It would be useless (worse than useless) on an arc fire, and that would have enough heat to ignite things not normally considered flammable.

You would have to dump it in a lake to quench it.. or wait for the batteries to discharge..

Ever watch a chunk of magnesium bouncing around a shop floor? Exciting.
 
To me this is just another example of why AI is not ready.

That was the cause of the crash in the first place. Stupid people + AI = disaster.
 
> I'll accept that a self-driving car may drive more safely than I do, but I'll never bring myself to trust one.

I do not accept that for a second.

I can make you scream like a little girl in the passenger seat, and do it safely.
 
> I'll accept that a self-driving car may drive more safely than I do, but I'll never bring myself to trust one.

It will be funny to see cars driving around with just dogs inside of them, though.

The problem with the current generation of self-driving AI is that it doesn't know when it doesn't know. In humans, this is what we call "situational awareness" -- it's not just being aware of the facts of your surroundings, it's also being aware of the problems that may exist in your knowledge of the current state of affairs.

The self-driving AI is certainly processing way more sensory data than a human driver -- it sees all angles around the car, it monitors audio, it has access to detailed accelerometer readings, and so on. And it will try its best to "recognize the road" even if it's parked in the middle of a grass field. Park a Tesla in a football stadium and try to get it to dead-reckon its way out without using the GPS. It either can't do it, or it will wander aimlessly for a very long time before it finally finds the exit. Even a middle-schooler who had never driven before could probably pull this off without a hitch.

That's the problem: it lacks a "substantial" model of the artificial (and natural) environment, which is why we would describe human drivers as having "common sense" which the AI lacks, even if it can detect extraordinary conditions that the average human wouldn't be able to detect or react to. When we're in a rainstorm, we know that our model of the state of affairs is hampered versus clear conditions. My brain knows that there could be cars which I am not recognizing because they are hidden behind the cloud of water vapor being kicked up by trucks on the road. The Tesla self-driving AI may have "separate training" for each of the various road conditions, but it still lacks this kind of substantial, common-sense model that is based on the brain's understanding of the way things "should be" versus the way they are.

The net result: I will slow down because I know there could be something hiding behind that cloud of gray, whereas the Tesla will just barrel on ahead because this specific scenario was always safe in the training data. It has to have an "encounter" (in training) with every possible combination of driving conditions before it can "understand".

When they start training these cars on M.C. Escher-like training environments that are designed to obfuscate / confuse the self-driving algorithm, then I will start to have some more confidence in them. The algorithms need to be able to know when they don't know. "This is confusing... I should be able to locate the road based on the pavement edge / lines but I can't, so I will pull over to what I know for sure is the side of the road, because I checked for an out before merging onto this highway."
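To make "knowing when you don't know" concrete, here's the crudest possible sketch in Python: score how spread out the perception model's class probabilities are, and refuse to act on a muddled reading. Everything here (the threshold, the actions, the toy numbers) is invented for illustration; real uncertainty estimation is a much harder problem than an entropy check:

import math

def entropy(probs):
    """Shannon entropy of the perception output; high entropy = the model is unsure."""
    return -sum(p * math.log(p) for p in probs if p > 0)

ENTROPY_LIMIT = 0.5  # invented threshold; a real system would have to calibrate this

def act(probs):
    """Act only on a confidently peaked output; otherwise fail safe."""
    if entropy(probs) > ENTROPY_LIMIT:
        return "pull over"  # the model can't tell what it's looking at -- don't guess
    return "proceed"

print(act([0.97, 0.01, 0.01, 0.01]))  # confident detection -> proceed
print(act([0.40, 0.30, 0.20, 0.10]))  # rain-obscured muddle -> pull over

The hard part isn't the check, it's getting the probabilities to be honest: a network can be confidently wrong on inputs far from its training data, which is exactly the "bazillion+1th scenario" problem.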

Tesla engineers: "Don't worry, we trained it on a bazillion real-world scenarios. You can trust it."
Me: "In the real-world, there is always the bazillion+1th scenario..."
 