73 Percent of Americans Are ‘Scared’ to Use Driverless Cars

An Uber vehicle that struck and killed a pedestrian in March 2018 had what are being called "serious software flaws" that led to the tragic incident.
The vehicle reportedly didn't have the ability to recognize jaywalkers, according to a new report from Engadget, which cited a report prepared by the NTSB. The safety agency blamed Uber's software for failing to recognize the victim of the accident as a pedestrian crossing the street. The vehicle didn't calculate that it could potentially collide with the woman until just 1.2 seconds before impact, at which point it was too late to brake.
The NTSB said that Uber's system "did not include a consideration for jaywalking pedestrians."
[Image: uber_0.png]

In fact, the report says that the system detected her about 6 seconds before impact, but didn't classify her as a pedestrian:
Although the [system] detected the pedestrian nearly six seconds before impact ... it never classified her as a pedestrian, because she was crossing at a location without a crosswalk [and] the system design did not include a consideration for jaywalking pedestrians.
After finally recognizing the pedestrian (too late), the vehicle then wasted another second trying to calculate an alternative path or allow the driver to take control. Uber has since eliminated this function in a software update.
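The report's timeline makes the physics plain: detection at roughly six seconds out, classification at 1.2 seconds, then another second of suppressed action. Here is a back-of-the-envelope sketch in Python; the 6-second and 1.2-second figures come from the report quoted above, while the vehicle speed and braking deceleration are round-number assumptions for illustration only:

```python
# Why a 1.2-second warning was too late: compare the distance needed to
# stop against the distance remaining when the system finally reacted.
# Timings are from the NTSB report; speed and deceleration are assumptions.

SPEED_MPH = 40.0                  # assumed travel speed
SPEED_MPS = SPEED_MPH * 0.44704   # mph -> m/s (~17.9 m/s)
DECEL_MPS2 = 7.0                  # assumed hard-braking deceleration (~0.7 g)

def stopping_distance(speed_mps: float, decel_mps2: float) -> float:
    """Distance to brake to a stop from speed v: v^2 / (2a)."""
    return speed_mps ** 2 / (2 * decel_mps2)

def gap_remaining(speed_mps: float, seconds_left: float) -> float:
    """Distance between car and pedestrian with the given time to impact."""
    return speed_mps * seconds_left

needed = stopping_distance(SPEED_MPS, DECEL_MPS2)   # ~22.8 m
at_1_2s = gap_remaining(SPEED_MPS, 1.2)             # ~21.5 m
at_6s = gap_remaining(SPEED_MPS, 6.0)               # ~107 m

print(f"needed {needed:.1f} m; left at 1.2 s: {at_1_2s:.1f} m; at 6 s: {at_6s:.1f} m")
print("too late to brake" if at_1_2s < needed else "could have stopped")
```

Under these assumptions the car needs about 23 m to stop but has only about 21 m left at 1.2 seconds, before even counting the extra second of suppressed braking; at first detection it had roughly 107 m, ample room to stop.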
Uber vehicles have failed to identify roadway hazards in at least two other cases, the report notes. In one, a vehicle struck a bicycle-lane post that had bent into the roadway. In another, a driver was forced to take control to avoid an oncoming vehicle, and still wound up striking a parked car.
In the 7 months leading up to the pedestrian accident, Uber vehicles had been involved in 37 accidents, 33 of which involved other vehicles striking Uber test cars.
[Image: uber 2.png]

More at: https://www.zerohedge.com/technolog...rian-2018-couldnt-detect-jaywalkers-ntsb-says
 
That indicates just how dedicated the system is to taking away your ability to drive yourself where and when you want, and instead shuttling you around in meatpods totally under their control and authority.

That is the direction they have been going for years.

I like the efficiency and performance of the electrics, but hate the AI. And they are all tying it in...

Except bikes... and homebrew EVs...

I am looking forward to salvage batteries coming down to an affordable price range soon.
 
A Tesla on Autopilot rear-ended a Connecticut trooper's vehicle early Saturday while the driver was checking on his dog in the back seat, state police said.
Police said they had responded to a disabled vehicle stopped in the middle of Interstate 95. While they were waiting for a tow, the Tesla, operating on Autopilot, came down the road.
After striking the trooper's vehicle, the Tesla then rear-ended the disabled vehicle before stopping.

“The operator of the Tesla continued to slowly travel northbound before being stopped several hundred feet ahead by the second trooper on scene,” police said.

“The operator of the Tesla stated that he had his vehicle on ‘Auto-pilot’ and explained that he was checking on his dog which was in the back seat prior to hitting the collision,” police said.

More at: https://www.foxnews.com/auto/tesla-autopilot-connecticut-cop-car
 
In what has now become "everyday news" that people - especially those at the NHTSA - are becoming immune to, a Tesla slammed through the front of a bakery on Tuesday in Burlington, Ontario, injuring two people.
There has been no word yet on whether or not Autopilot played a role in the accident.
[Image: slam 1.png]

The incident happened at the British Pride Bakery at Roseland Plaza at about 10:30 a.m. on Tuesday, according to Inside Halton. Regional police said that the woman driving the vehicle and an employee inside the shop were both taken to the hospital with injuries.

[Image: slam 4.png]

The incident remains "under investigation," according to local authorities, and has forced the bakery to close at what would be its busiest time of the year.
[Image: slam 2.png]

More at: https://www.zerohedge.com/technolog...-slams-through-front-bakery-injuring-2-people
 
The emergence of smart cars has opened the door to limitless possibilities for technology and innovation – but also to threats beyond the car itself. New research from Michigan State University is the first to apply criminal justice theory to smart vehicles, revealing cracks in the current system leading to potential cyber risks.
“Automotive cybersecurity is an area we don’t understand well in the social sciences. While there are groups of computer scientists and engineers digging into some of the issues, the social aspects are extremely relevant and under-examined,” said Thomas Holt, professor of criminal justice at MSU. “As the technology gets greater market share, it’s critical to get ahead of the curve before there are issues we can’t rein in.”
As vehicles become smarter and more connected to WiFi networks, hackers will have more opportunities to breach vehicle systems. Connecting your smartphone through a USB port can give a hacker backdoor access to data from both your phone and your car. Additionally, Google Android users who download apps from unverified sites are at even greater risk.
The research, published in the Journal of Crime and Justice, applied Routine Activities Theory, a popular criminal justice framework, to current forms of vehicle security and provided recommendations for manufacturers and owners to improve safety.
“The risk with vehicles isn’t just personal data – though that is still a real concern,” Holt said. “Say the car is compromised and a hacker alters certain alert systems that tell a driver when tire pressure is low or so the emergency brake sensory systems don’t kick in. That could lead to loss of life.”
The theory Holt applied holds that three things need to come together for a crime to occur: a motivated offender, a suitable target, and the lack of a guardian. In the context of vehicle security, he said that motivated offenders and suitable targets are plentiful, but the presence of a guardian is where vehicles fall short.
“Where we found holes was surprising: there’s no one technically responsible for these vehicles’ central computer systems,” Holt said.
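For what it's worth, the theory reduces to a simple conjunction, which makes the "missing guardian" finding easy to state precisely. A minimal sketch; the class and the example values below are illustrative inventions, not anything from the study:

```python
# Routine Activities Theory as a predicate: a crime is feasible only when
# a motivated offender and a suitable target converge without a capable
# guardian. Field names and values are illustrative, not from Holt's paper.

from dataclasses import dataclass

@dataclass
class CrimeOpportunity:
    motivated_offender: bool  # e.g., a hacker after data or disruption
    suitable_target: bool     # e.g., a WiFi-connected vehicle system
    capable_guardian: bool    # e.g., someone responsible for securing it

    def is_feasible(self) -> bool:
        # All three conditions must line up for the offense to occur.
        return (self.motivated_offender
                and self.suitable_target
                and not self.capable_guardian)

# Holt's point: offenders and targets are plentiful, but nobody currently
# owns the guardian role for a vehicle's central computer systems.
smart_car = CrimeOpportunity(motivated_offender=True,
                             suitable_target=True,
                             capable_guardian=False)
print(smart_car.is_feasible())  # True: the gap the study highlights
```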

More at: https://www.infowars.com/study-raises-alarm-on-smart-cars-vulnerability-to-cyberattacks/
 
Even if I had been born with some sort of condition that left me with short arms, unable to reach the steering wheel, I still wouldn't get a self-driving car.
 
Two vehicle occupants have been seriously injured after a Tesla slammed into the back of a Cloverdale Township Volunteer Fire Department truck early Sunday morning. According to the Greencastle Banner-Graphic, the accident took place in Cloverdale, IN.
The fire truck was in the eastbound lanes of the interstate, responding to an earlier wreck, when the Tesla ran into the rear of the truck, causing "heavy damage" to both.
[Image: clover1.jpg]

Reports from the scene indicated that both the driver and the passenger were unconscious and trapped. There is no word yet on whether or not Autopilot played a role in the accident.

Both occupants were extricated from the vehicle and the Indiana State Police said that the accident involved “serious personal injury.”
Recall, this is not the first time a Tesla has slammed into the back of an inanimate fire truck. In 2018, an accident occurred when a driver smashed into the back of a fire truck in Southern California. That driver was found to have been "looking down" at "what appeared to be a mobile phone" while the car's Autopilot was engaged, according to Bloomberg.


More at: https://www.zerohedge.com/technolog...ured-after-tesla-slams-parked-firetruck-again
 
Another day, yet another Tesla wreck.
It was just hours ago that we highlighted a wreck where a Tesla slammed into the back of (yet another) inanimate fire truck.
And yet again, here we are with another "peculiar"-sounding Tesla accident, this one involving a Tesla that ran a red light near Los Angeles, California, on Sunday and slammed into another vehicle, killing two people.
According to KTLA 5, the incident took place at Vermont Ave and Artesia Blvd. on Sunday. The driver of the 2016 Tesla exited the westbound 91 freeway "at a high speed" and then failed to stop at a red light at the next intersection. As a result, he slammed into a 2006 Honda Civic at the light.



More at: https://www.zerohedge.com/technolog...er-tesla-runs-red-light-slams-vehicle-near-la
 
Tesla Autopilot seems to give some drivers a sense of invincibility while traveling the roads as they place their lives in the hands of artificial intelligence.
This was the case with YouTuber Dougal Vlogs, who uploaded a video on Dec. 30 showing a Model 3 presumably engaged in "Autopilot" (per the video's headline) traveling at a high rate of speed (over 70 mph) during a rainstorm.
The video is short, about 15 seconds: the vlogger is seen speeding down a two-lane highway at 70-75 mph while using Autopilot.
The vlogger is holding a camera, about to talk about the Model 3, when an alarm starts blaring. Next thing you know, the Model 3 hydroplanes and crashes into the shoulder of the road, all caught on camera!
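For context on those speeds: a widely cited rule of thumb from NASA tire research puts the onset of dynamic hydroplaning at roughly 10.35 times the square root of tire pressure (in psi), giving a speed in mph. A rough sketch; the tire pressure below is an assumed placard value, not anything from the video:

```python
# Rule-of-thumb onset speed for dynamic hydroplaning on standing water:
# v_mph ~= 10.35 * sqrt(tire pressure in psi). The pressure used here is
# an assumed typical value, not taken from the video.

import math

def hydroplane_onset_mph(tire_pressure_psi: float) -> float:
    """Approximate speed at which a tire begins to hydroplane."""
    return 10.35 * math.sqrt(tire_pressure_psi)

pressure_psi = 42.0  # assumed placard pressure for a Model 3
print(f"onset ~{hydroplane_onset_mph(pressure_psi):.0f} mph")  # ~67 mph
```

At an onset around 67 mph on properly inflated tires, 70-75 mph in a rainstorm leaves no margin at all, and worn tread or deeper standing water lowers the threshold further.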

The vlogger was heard several times during the incident yelling "OH MOY GOD" -- and at the end of the video tells his audience he just crashed.



More at: https://www.zerohedge.com/markets/oh-moy-god-tesla-driver-autopilot-films-own-crash
 
Three people were critically injured after a Tesla ran a red light in Salt Lake City on Sunday morning, smashing into another car.
According to the Deseret News, the Tesla hit another car while “traveling at a high rate of speed” through a red light, Salt Lake Police Lt. Brett Olsen said.
[Image: tesla1.png]

Two men in the Tesla and one woman in the car that was struck were all taken to the hospital in critical condition.

Meanwhile - stop us if you've heard this one before - the Tesla's battery then began "exploding on scene", prompting a hazardous materials team to show up. Photographs show the front end of the Tesla completely destroyed and ravaged by flames.
[Image: tesla2.png]

More at: https://www.zerohedge.com/technolog...la-runs-red-light-smashes-car-high-rate-speed
 
No sooner do we report that the NHTSA is considering a petition to investigate 500,000 Teslas for unintended acceleration than another Tesla driver winds up dead.
Police are in the middle of investigating what is being called a "single vehicle crash" involving a Tesla that "crashed and burst into flames" in Pleasanton, California.
The driver of the vehicle was killed, according to ABC 7. His identity has not yet been released.
The crash was reported on Saturday at about 6 p.m. local time at the intersection of West Las Positas Boulevard and Hacienda Drive.

[Image: wr.png]

The Tesla was going southbound and lost control near the intersection. It then crashed into a sign outside of an apartment complex before catching on fire.
The intersection was closed for several hours on Saturday night as a result of the crash.
"The car was going so fast, it actually took out a street signal," the on-the-scene reporter says in her coverage.
[Image: wr2.png]

More at: https://www.zerohedge.com/technolog...flames-california-intersection-saturday-night
 
A group of hackers has managed to trick Tesla’s first-generation Autopilot into accelerating from 35 to 85 mph with a modified speed limit sign that humans would be able to read correctly.
Hackers at McAfee Advanced Threat Research conducted the experiment.
They explain what they set out to do in a blog post:
McAfee Advanced Threat Research (ATR) has a specific goal: identify and illuminate a broad spectrum of threats in today’s complex landscape. With model hacking, the study of how adversaries could target and evade artificial intelligence, we have an incredible opportunity to influence the awareness, understanding and development of more secure technologies before they are implemented in a way that has real value to the adversary.
With that in mind, they decided to target MobilEye's camera system, since it's deployed in over 40 million vehicles, including Tesla's first-generation Autopilot vehicles, which were used for this specific experiment.
They decided to try modifying speed limit signs in ways that a human would still be able to read the limit correctly, but the automated system could get confused.
Ultimately, they were able to make a Tesla vehicle on Autopilot accelerate by 50 mph over the limit:
The ultimate finding here is that we were able to achieve the original goal. By making a tiny sticker-based modification to our speed limit sign, we were able to cause a targeted misclassification of the MobilEye camera on a Tesla and use it to cause the vehicle to autonomously speed up to 85 mph when reading a 35-mph sign. For safety reasons, the video demonstration shows the speed start to spike and TACC accelerate on its way to 85, but given our test conditions, we apply the brakes well before it reaches target speed. It is worth noting that this is seemingly only possible on the first implementation of TACC when the driver double taps the lever, engaging TACC. If the misclassification is successful, the autopilot engages 100% of the time. This quick demo video shows all these concepts coming together.
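The stickers are doing physically what adversarial-example research does digitally: a tiny, targeted input change that flips a classifier's label while remaining perfectly legible to humans. The sketch below is emphatically not McAfee's method or MobilEye's model; it uses a made-up linear classifier with random weights purely to show the mechanism:

```python
# Toy targeted misclassification: nudge an input along the direction that
# favors the attacker's class until the label flips. The "model" here is
# a random linear classifier standing in for a real sign recognizer.

import numpy as np

rng = np.random.default_rng(0)

# 64 "pixels", two classes: 0 -> "35 mph", 1 -> "85 mph".
W = rng.normal(size=(2, 64))

def classify(x: np.ndarray) -> int:
    return int(np.argmax(W @ x))

# Build a clean sign that reads as "35 mph".
sign_35 = rng.normal(size=64) + 0.5 * (W[0] - W[1])
assert classify(sign_35) == 0

# Targeted attack: small steps (think: a few small stickers) toward class 1.
step = 0.05
adversarial = sign_35.copy()
while classify(adversarial) != 1:
    adversarial += step * np.sign(W[1] - W[0])

max_change = np.abs(adversarial - sign_35).max()
print(f"label flipped to '85 mph'; max per-pixel change {max_change:.2f}")
```

The point of the demo is the asymmetry: the change needed to fool the classifier is far smaller than anything that would fool a human reading the sign.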
They released a quick video of one of the experiments.

More at: https://electrek.co/2020/02/19/tesla-autopilot-tricked-accelerate-speed-limit-sign/
 