EV Journalist Says a New Bug in Tesla’s Latest FSD Almost Killed Him

Multiple Tesla owners have noticed a new bug in the latest FSD Beta. (Photo by Niall Carson/PA Images via Getty Images)

Earlier this week, Tesla (TSLA) released a new Beta version of its Full Self-Driving (FSD) software. The updated system is supposed to offer more advanced autonomous driving capabilities than previous versions, but some Tesla owners have found in test drives that the new FSD is also more dangerous.

Fred Lambert, the editor-in-chief of the popular electric vehicle news site Electrek, said a software bug in FSD nearly killed him during a test drive yesterday (Aug. 31) on a highway in Canada. Lambert is a founding member of Electrek and the founder of Zalkon, a newsletter about clean energy investing.

“On two occasions, when passing on the left lane, FSD Beta tried to veer to the left in the median strip into one of those U-turn areas for emergency vehicles,” Lambert wrote in a post on X, formerly Twitter, today.

Lambert was testing the Beta v11.4.7 version of FSD, which he received on his Model 3 earlier this week. The v11 series is supposed to be the last set of beta releases before the official v12 FSD, expected to be released next year.

Lambert said the issue seemed new and extremely dangerous.

“I was driving at 118 km/h (73 mph) and it didn’t even slow down. Basically trying to take a sharp left turn at highway speed,” he wrote in the post. “I was able to bring the car back into the lane in time, but I almost overcorrected which is dangerous as you are passing.”

SEE ALSO: Tech Billionaire Dan O’Dowd Owns 5 Teslas. He’s Also Waging War Against the Company

Tesla encourages its customers to send feedback every time they disengage FSD or Autopilot, a less advanced driver-assist program. After the near miss, Lambert sent a message to Tesla saying, “[FSD] just tried to kill me, so please fix it.”

Lambert was able to take over the vehicle in time because he was alert and kept his hands on the steering wheel while FSD was engaged, as its user manual instructs. Tesla FSD is classified as a Level 2 system on the Society of Automotive Engineers’ (SAE) zero-to-five scale of driving automation, the standard adopted by the U.S. Department of Transportation. But CEO Elon Musk has suggested on multiple occasions that FSD is safer than a human driver and is rapidly improving, which may have encouraged many Tesla drivers to lower their guard when using the software.

Lambert isn’t the only Tesla owner to notice the veering problem. In another post today, he said he has been getting messages from people seeing the same behavior on FSD Beta v11.4.7.

“I’ve experienced the same behavior at lower speed (90kmh),” Bastien Theron, an engineer and EV entrepreneur, responded to Lambert’s tweet. “11.4.x is better than 11.3.6 on many front, but 11.3.6 was safer as mistake were annoying but not dangerous imo. 11.3.6 was an old grandpa driving, 11.4.x is a reckless teenager some times!”

There have been reports of earlier versions of Tesla Autopilot making mistakes on highways, such as taking exit ramps when they weren’t supposed to. Tesla fixed that problem in newer versions.

“This is actually a lot more dangerous than a surprise exit ramp because there’s no exit ramp to slow down in with those median strip u-turn areas,” Lambert wrote in an article in Electrek today about his frightening experience.

Tesla could not be reached for comment.
