A full-page ad in Sunday’s New York Times took aim at Tesla’s “Full Self-Driving” software, calling it “the worst software ever sold by a Fortune 500 company” and offering $10,000, the same price as the software itself, to the first person who could name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”
The ad was taken out by The Dawn Project, a recently founded organization that aims to ban unsafe software from safety-critical systems that could be targeted by military-style hackers, as part of a campaign to remove Tesla Full Self-Driving (FSD) from public roads until it has “1,000 times fewer critical malfunctions.”
Dan O’Dowd, the founder of the advocacy group, is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety and security systems. At CES, the company said BMW’s iX vehicle is using its real-time OS and other safety software, and it also announced new over-the-air software products and data services for automotive electronic systems.
Despite the potential competitive bias of The Dawn Project’s founder, Tesla’s FSD beta software, an advanced driver assistance system that Tesla owners can access to handle some driving functions on city streets, has come under scrutiny in recent months after a series of YouTube videos showing flaws in the system went viral.
The NYT ad comes just days after the California Department of Motor Vehicles told Tesla it would be “revisiting” its opinion that the company’s test program, which uses consumers rather than professional safety operators, does not fall under the department’s autonomous vehicle regulations. The California DMV regulates autonomous driving testing in the state and requires companies like Waymo and Cruise, which are developing, testing and deploying robotaxis, to report crashes and system failures called “disengagements.” Tesla has never issued those reports.
Tesla CEO Elon Musk has since vaguely responded on Twitter, claiming Tesla’s FSD has not resulted in any accidents or injuries since its launch. The National Highway Traffic Safety Administration (NHTSA) is investigating a report from the owner of a Tesla Model Y, who said his vehicle went into the wrong lane while making a left turn in FSD mode, resulting in it being struck by another driver.
Even if that was the first FSD crash, Tesla’s Autopilot, the automaker’s ADAS that comes standard on its vehicles, has been involved in around a dozen crashes.
Alongside the NYT ad, The Dawn Project published a fact check of its claims, referring to its own FSD safety analysis that studied data from 21 YouTube videos totaling seven hours of drive time.
The videos analyzed included beta versions 8 (released December 2020) and 10 (released September 2021), and the study avoided videos with significantly positive or negative titles to reduce bias. Each video was graded according to the driver performance evaluation that human drivers must pass to obtain a license from the California Department of Motor Vehicles. To pass a driver’s test, drivers in California must have 15 or fewer scoring maneuver errors, like failing to use turn signals when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, like crashing or running a red light.
The study found that FSD v10 committed 16 scoring maneuver errors on average in under an hour and a critical driving error about every 8 minutes. The errors improved over the nine months between v8 and v10, the analysis found, but at the current rate of improvement it “will take another 7.8 years (per AAA data) to 8.8 years (per Bureau of Transportation data) to achieve the accident rate of a human driver.”
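The arithmetic behind those rates is easy to reproduce. The sketch below uses hypothetical per-video counts (The Dawn Project has not published its raw per-video data in the ad) to show how errors-per-hour and minutes-per-critical-error figures of this kind would be computed, along with the California DMV pass criterion described above.

```python
# Hypothetical per-video data: (drive_minutes, scoring_maneuver_errors,
# critical_driving_errors). These counts are illustrative, not The Dawn
# Project's actual dataset.
videos = [
    (22, 7, 3),
    (35, 9, 4),
    (18, 4, 2),
]

total_minutes = sum(m for m, _, _ in videos)
total_scoring = sum(s for _, s, _ in videos)
total_critical = sum(c for _, _, c in videos)

# Rate metrics as reported in the study: scoring errors per hour of
# driving, and average minutes between critical driving errors.
scoring_errors_per_hour = total_scoring / (total_minutes / 60)
minutes_per_critical_error = total_minutes / total_critical


def passes_dmv_test(scoring_errors: int, critical_errors: int) -> bool:
    """California DMV criterion described above: at most 15 scoring
    maneuver errors and zero critical driving errors."""
    return scoring_errors <= 15 and critical_errors == 0


print(f"{scoring_errors_per_hour:.1f} scoring errors per hour")
print(f"one critical error every {minutes_per_critical_error:.1f} minutes")
```

With these illustrative numbers the computation lands near the study’s reported figures (about 16 scoring errors per hour and a critical error roughly every 8 minutes), and any single drive with even one critical error fails the DMV criterion outright.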
The Dawn Project’s ad makes some bold claims that should be taken with a grain of salt, particularly because the sample size is far too small to be taken seriously from a statistical standpoint. If, however, the seven hours of footage is indeed representative of an average FSD drive, the findings could point to a larger problem with Tesla’s FSD software and raise the broader question of whether Tesla should be allowed to test this software on public roads without restrictions.
“We did not sign up for our families to be crash test dummies for thousands of Tesla cars being driven on public roads…” the ad reads.
Federal regulators have begun to take some action against Tesla and its Autopilot and FSD beta software systems.
In October, NHTSA sent two letters to the automaker, targeting its use of non-disclosure agreements for owners who gain early access to the FSD beta, and the company’s decision to use over-the-air software updates to fix an issue in the standard Autopilot system that should have been a recall. In addition, Consumer Reports issued a statement over the summer saying the FSD version 9 software upgrade didn’t appear to be safe enough for public roads, and that it would test the software independently.
Since then, Tesla has released a series of versions of its v10 software; version 10.9 should arrive any day now, and version 11, with a “single city/highway software stack” and “many other architectural upgrades,” is expected in February, according to CEO Elon Musk.
Reviews of the latest version, 10.8, are mixed: some online commenters say it’s much smoother, while many others say they don’t feel confident using the technology at all. A thread reviewing the newest FSD version on the Tesla Motors subreddit shows owners sharing complaints about the software, with one writing, “Definitely not ready for the general public yet…”
Another commenter said the car took too long to turn right onto “an entirely empty, straight road… then it had to turn left and kept hesitating for no reason, blocking the oncoming lane, to then suddenly accelerate, followed by changing its mind about speed and suddenly slowing down because it thought the 45 mph road was 25 mph.”
The driver said the system completely ignored an upcoming left turn, and that eventually he had to disengage it entirely.
The Dawn Project’s campaign highlights Tesla’s own warning that its FSD “may do the wrong thing at the worst time.”
“How can anyone tolerate a safety-critical product on the market which may do the wrong thing at the worst time,” the advocacy group said. “Isn’t that the definition of defective? Full Self-Driving must be removed from our roads immediately.”
Neither Tesla nor The Dawn Project responded to requests for comment.
Source: “New York Times ad warns against Tesla’s ‘Full Self-Driving’” – TechCrunch. The post appeared first on California News Times.