73 Percent of Americans Are ‘Scared’ to Use Driverless Cars

A Tesla driver in Utah whose vehicle slammed itself into a stopped fire truck at a red light earlier this year is now suing the company, claiming that when she bought the car she was told that it would stop on its own if the Autopilot was on and something was in the vehicle's path.
The driver, Heather Lommatzsch, reportedly wrote in her lawsuit that she was told in 2016, when she purchased her Tesla Model S, that she only had to "touch the steering wheel occasionally while using Autopilot mode". She also claims that she tried to engage the brakes when she saw the vehicle stopped ahead of her, but that the car's brakes simply "did not work".

More at: https://www.zerohedge.com/news/2018...driver-suing-tesla-claims-brakes-did-not-work
 
Collectivists will love autonomous cars... They see the breaking of a few eggs as necessary for the omelette.
The sacrifice of a few people today, in order to improve the code base, is worth the promise of a safer tomorrow.

"I would rather be a poor master than a rich servant" - Michael Caine
 
In the town of Hinckley, near the city of Leicester, there is a Japanese company that wants to build
a test track for driverless cars on the southwestern section of Bosworth Field. Richard III Society
members have been notorious for getting into online quarrels with members of THE HENRY TUDOR
SOCIETY, which came into being in 2013. The RIII Society has been around since the ROARING
TWENTIES. Before Brexit tore the U.K. apart, these spats among medievalists were infamous.
 
The company acquiring the legal deed to an 83-acre family farm on or near swampy marshland now has
RIII Society people and Henry VII's diligent supporters cautiously joining together in a common cause.
Both groups, of roughly 20,000 members each, are signing petitions. Politics makes strange bedfellows.
 
If we are not going to be buying the driverless cars the Leicester track will be testing out, and we
might not do so for three decades or more, that makes them impractical for anything save programming and CPU research.
 
The realms of Forza online gaming are well past the "auld" gamer franchise
that is Need for Speed. This opens up virtual reality and the matrix, as
well as Anonymous and hacker types generally. Cars are to be controlled via a CPU?
 
Again...

 
In the latest incident to take place while a Tesla car was on Autopilot, at least the authorities were on the scene quickly. That, of course, is because the Tesla Model S in question, located on Freeway 3 in Hsinchu County, Taiwan, reportedly plowed directly into the back of a police car.
Two officers who were directing traffic at the time narrowly escaped death or serious injury, reportedly jumping out of the way "just in time" before the collision.
The accident took place despite reports that there were "100 meters of traffic cones and flashing warning lights" placed behind the police car. After being hit, the first patrol car wound up striking a second patrol car. The damage was estimated at around NT$3 million (~USD $97,300).

The driver, who engaged Autopilot on the highway after feeling "drowsy", was subsequently tested for alcohol; the test came back negative. He told police he had finished a long shift at work that started at 6 AM that morning, roughly 16 hours before the accident occurred.

Tesla's website, as a reminder, states: “Autopilot is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”
So, naturally, when Elon Musk demonstrated Autopilot for his recent controversial 60 Minutes interview, he did just the opposite, taking his hands off the wheel before his vehicle appeared to cut off another driver in traffic during a lane change.



More at: https://www.zerohedge.com/news/2018...espite-100-meters-traffic-cones-and-warning-0
 
While I solidly consider myself on the side of "automate everything we can", I think the concerns here are fairly well founded.

We're actively imagining neural networks to be more powerful than they actually are, and we're ascribing more intentionality and intelligence to them than they actually possess. While neural networks (and machine learning more broadly) are powerful tools, they're nothing remotely like AGI, nor are they really making "decisions" the way a human does. At their core, they're an advanced form of statistical analysis with actions hooked up to its outputs.

While this leads to some genuinely impressive feats (OpenAI beating the world's top DotA 2 players was a genuine sight to behold), the real world is very messy and difficult to simplify. We're also starting to find out that how a machine identifies an image can be deeply flawed: in one study, a model seemed to identify "wolves" in various pictures with high accuracy, but once researchers found out why it was labeling a wolf as a wolf, they were more than a bit mortified. It was labeling the wolves as "wolves" because of the background, not the wolf itself. It's not enough for a machine to be right; it has to be right for the right reasons. Otherwise, if the scenario changes, it can start making really poor decisions (e.g., wolves against non-typical backgrounds).
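That failure mode is easy to reproduce in miniature. The sketch below is a toy illustration, not code from any actual study (in the commonly cited version of the wolf experiment, the spurious feature was a snowy background): a simple perceptron is trained on synthetic data where "snowy background" happens to co-occur with every wolf photo, and it ends up misfiring on a husky standing in snow.

```python
# Toy sketch of a classifier latching onto a spurious background feature.
# Feature vector: [snowy_background, pointy_snout]; label 1 = wolf.
# In the training data, every wolf photo happens to have a snowy background.

def train(data, epochs=20, lr=0.1):
    """Train a single perceptron unit on (features, label) pairs."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else 0
            err = y - pred  # perceptron update: nudge weights toward the label
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return "wolf" if (w[0] * x[0] + w[1] * x[1] + b) > 0 else "not wolf"

train_set = [
    ([1, 1], 1), ([1, 1], 1), ([1, 0], 1),  # wolves, all photographed on snow
    ([0, 1], 0), ([0, 0], 0), ([0, 0], 0),  # dogs, never photographed on snow
]
w, b = train(train_set)

# The model learned "snow => wolf", not anything about the animal:
print(predict(w, b, [1, 0]))  # husky on snow -> "wolf" (wrong)
print(predict(w, b, [0, 1]))  # wolf on grass -> "not wolf" (wrong)
```

Because snow perfectly separates the two classes in the training set, the perceptron never needs the animal's actual features, so it is "right" on training data for entirely the wrong reason and breaks as soon as the background changes.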

These problems are being actively worked on and results are improving, but it's all still mostly statistical correlation. The machine has no "understanding"; it doesn't "think", it can't generate new "ideas", and it works within a very narrow range.

I'm sure it'll get significantly better over the next 10-20 years (though I do suspect an AI winter is just around the corner, which is likely to set us back another 20 years or so), but I don't think driverless cars are within our grasp any time soon.

In short, we don't want driverless cars running over people because, at its core, a machine still has no idea what a human "is" or ...well, anything at all.
 
The article mentions millennials as the largest change... did they ask those who are afraid if they are also afraid to drive a regular car? Many are... watch them try to merge onto a highway... smh
~climbs in the never ever boat~
 
After what happened to Michael Hastings I can understand why people would be scared of driverless cars.
 
It has been years since Tesla debuted its Autopilot, and despite numerous software updates and fixes in response to a litany of accidents involving Autopilot, it doesn't look as though drivers are getting the message that the software may not be as innovative, safe, or autonomous as Elon Musk has led them to believe.
Yet another example came to light on Monday when a driver in North Brunswick, New Jersey wrecked his Tesla on a highway while the vehicle was in Autopilot mode. According to a report published by News 12 New Jersey, the driver said that the vehicle "got confused due to the lane markings" at a point where the driver could have stayed on the highway or taken an exit. The driver claims that Autopilot split the difference and went down "the middle", between the exit and staying on the highway.
The car then drove off the road and collided with several objects before coming to a stop. The driver claims that he tried to regain control of the vehicle but that "it would not let him".
Some of the driver's claims and Tesla's full response were later edited out of the News 12 New Jersey report without explanation.

This is the latest of numerous similar Autopilot accidents that have taken place as a result of Autopilot being "confused" and acting indecisively when passing exits on the highway. As the details of these accidents have emerged, it has become evident that Autopilot may have serious issues with determining where to go when given the opportunity to exit a highway.


More at: https://www.zerohedge.com/news/2019...ghway-driver-reportedly-unable-regain-control
 

You know what's so frustrating about all this?

If this were any other company selling cars with such a bad track record (crashes, fires, delivery delays, and so on), the government media organs would be screeching 24/7 for congressional hearings, there would be legislation being written to ban them, Musk would be getting grilled and perp-walked in front of Congress, and breathless bubble-headed bleached-blonde reporterettes would be doing "human interest" stories on the thousands injured and killed, tugging at the heart and fogging the minds of millions.

IF...this was any other company with any other car.
 
If THEY didn't want to force all of us into these nightmares.
 