Tesla Cybertruck FSD 12.5.5 So Bad the Owners Can’t Stop Complaining

It’s bad. So bad, you might call it the Palantir of automotive tech.

Some even speculate that the Tesla AI system is a fraud, learning to drive worse with every new release.

Along with an explosion of written complaints, video examples are being posted at an alarming rate:

NZ Tesla Owner Convicted for “Autopilot” Dangerous Driving

As someone who watches the constant drumbeat of Tesla owners losing court battles, almost as fast as their passengers lose their lives, I have noticed certain names frequently top the list.

My suspicion is that some cultures are more susceptible to a particular aspect of Tesla fraud. They believe excessive waste to buy a cheap car brand (substandard parts that cost Tesla $4 to make are sold by them for $900) elevates their sense of privilege. They then intentionally use a bogus “token” vehicle to flout or ignore traffic laws, as well as the laws of physics.

Notably, the accused in this NZ news story attempted to argue the absolutely weakest possible defense:

Singh contended he wasn’t sleeping, but still appealed the dangerous driving conviction on the basis that he can’t have been driving…

He argued he was not, not driving (double negative = driving), but also that he was not driving. He didn’t claim to be innocent so much as argue that he should be allowed to be a criminal.

Driving while not driving?

As if anyone should be allowed to exist in an “untouchable” criminal state of … X.

And not surprisingly, the courts declared his bunk a fool’s gambit.

High Court judge Justice Matthew Downs disagreed in a decision released last month.

Singh’s cause for conviction wasn’t whether he fell asleep or not, but whether he had seen what was happening around him in the car, which he did not, Downs said.

Singh also appealed on the grounds that he couldn’t have been guilty of failing to stop for police because he didn’t know he was meant to stop, he wasn’t avoiding stopping, and when he did notice the lights and sirens, he pulled over.

Once again, Downs disagreed.

“Mr Singh’s failure to stop was clearly due to his own fault.”

To be fair, Tesla didn’t see what was happening around the car. Tesla is an abject failure of engineering. A total fraud, claiming to have driverless cars based on “vision”, yet blind to high-visibility vehicles with flashing lights SINCE 2016.

It’s unfortunate the courts still aren’t ready to hold Tesla accountable for telling customers they can sleep while operating a car.

Related: California just passed a law that says autonomous cars (road robots) must immediately obey police orders.

Tesla Cybertruck Drops a FIFTH Recall Notice: Cameras Take 8 Seconds to Display

Some say there is an unofficial sixth recall related to engine failure.

While we wait for that to percolate into official regulatory action, along with the many other unofficial failures of the Cybertruck, here’s the official fifth recall notice this year alone:

Vehicles with rearview cameras in the U.S. must display an image within two seconds of turning on, the NHTSA said, and some of the Cybertrucks failed to display an image for up to eight seconds. Tesla received 45 warranty claims and four field reports that may be related to the defect…

An eight-second delay. Forty-five claims. So many Cybertrucks have now crashed, it’s a wonder there are nearly fifty of these clown cars still operating.

Now think about how Tesla pumps its marketing with “fastest launch time in a straight line” claims by counting seconds. They constantly talk about every second like it’s the biggest measure of success imaginable.

Zero to 60 in how many seconds? Never mind, because the Cybertruck can’t even get its rearview camera display up and running within two seconds.

In other words, it will display a tree six seconds after it has crashed into it.

An eight-second delay!

Unbelievable. How bad is Tesla engineering that it can’t even meet basic safety regulations on a brand-new car?
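
For a rough sense of scale, here is a minimal back-of-the-envelope sketch of how far a vehicle travels while its rearview display is still blank. The 5 mph reversing speed is my own assumption for illustration; only the two-second limit and eight-second worst case come from the recall coverage above.

```python
# Back-of-the-envelope sketch: distance covered while the rearview display is blank.
# Assumed reversing speed of 5 mph; 2 s regulatory limit vs. 8 s reported worst case.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def blind_distance_m(speed_mph: float, delay_s: float) -> float:
    """Meters traveled before the camera image appears."""
    return speed_mph * MPH_TO_MS * delay_s

for delay_s in (2.0, 8.0):
    d = blind_distance_m(5.0, delay_s)
    print(f"{delay_s:.0f} s delay at 5 mph ≈ {d:.1f} m driven blind")
```

At a lazy 5 mph, the difference between two seconds and eight seconds is roughly 4.5 meters versus 18 meters of reversing with no image at all.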

CA Passes AB 1777 Requiring Autonomous Cars to Immediately Follow Police Orders

There have been numerous instances of road robot algorithms being written so poorly that they end up blocking everyone else (causing denial of service). In the earliest case, nearly a decade ago, Google engineers never bothered to input basic Mountain View traffic laws and their robots were pulled over for nuisance driving (too slow).

In the latest case, Waymo stopped perpendicular to San Francisco traffic, on Nob Hill just outside the Fairmont, dangerously blocking a Vice Presidential motorcade.

California has thus finally passed a new emergency requirement (don’t call it a backdoor): a traffic authority can issue a direct command to an entire robot fleet to vacate a space.

The bill would, commencing July 1, 2026, authorize an emergency response official to issue an emergency geofencing message, as defined, to a manufacturer and would require a manufacturer to direct its fleet to leave or avoid the area identified within 2 minutes of receiving an emergency geofencing message, as specified.

Now the obvious question is how strong the integrity checks are on that message bus (no pun intended), because I know a lot of people who thought dropping orange cones to “geofence” robots was already a great idea.
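
To make that concern concrete, here is a minimal sketch of the kind of integrity check a fleet operator might want before acting on such a message. This is purely a hypothetical design of my own; AB 1777 does not specify a protocol, and the key, field names, and coordinates below are invented for illustration.

```python
# Hypothetical sketch: verify an "emergency geofencing message" before acting on it.
# Shared-secret HMAC plus a freshness window to reject tampered or replayed orders.
import hmac, hashlib, json, time

SHARED_KEY = b"example-key-provisioned-out-of-band"  # placeholder, not a real credential
MAX_AGE_S = 120  # the bill expects the fleet to respond within 2 minutes

def sign(message: dict, key: bytes = SHARED_KEY) -> str:
    payload = json.dumps(message, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify(message: dict, signature: str, key: bytes = SHARED_KEY) -> bool:
    if not hmac.compare_digest(sign(message, key), signature):
        return False  # signature mismatch: message was altered or forged
    return (time.time() - message.get("issued_at", 0)) <= MAX_AGE_S  # reject stale replays

# Example: an authority orders the fleet to vacate an area (illustrative values).
order = {"issued_at": time.time(), "action": "vacate",
         "area": {"lat": 37.792, "lon": -122.410, "radius_m": 500}}
sig = sign(order)
print(verify(order, sig))          # True
print(verify(order, "deadbeef"))   # False
```

The freshness window mirrors the bill’s two-minute response requirement; without something like it, a captured but otherwise valid message could be replayed later to clear streets on demand.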