My experience with Tesla Full Self Driving (Supervised)
In the fall of 2021, I purchased a Tesla Model 3. Here’s me looking way too proud about this:
In my defense, at the time Elon hadn’t shown his true nuttiness yet. But I digress.
Recently, Tesla made a big deal about Full Self Driving (Supervised) being ready for prime time. Of course, the fact that the afterthought (Supervised) was added at the same time should have been a dead giveaway, but I figured what the hell; at least I’m not paying for the privilege of beta testing and training my car’s neural net.
So, let’s begin with the generalities. First of all, I will say that, as far as autonomous driving goes, I was actually fairly impressed. The car handled most situations very well. It was hesitant at times, and downright lead-footed at others. There were long stretches where I did not have to intervene and disengage the system. It was not great at thinking ahead on routes, though; left to its own devices, it frequently waited until far later than I am comfortable with to get into the correct lane for an upcoming maneuver, especially in heavy traffic. So, on average, not bad, but certainly room for improvement.
The problem is, “average” is not good enough when we are talking about something as safety-critical as driving.
When driving a vehicle, you’re making hundreds of decisions per minute. Some of them are very small, and by the time you have significant experience behind the wheel, a lot of them are automatic. They all add up into a safe drive that gets you where you need to go.
But, it only takes one wrong decision to land you (or someone else) in the hospital, or even gods forbid the morgue. And this is what Tesla and the shills don’t want you to focus on.
I made it through several drives without having to intervene. Then on the next drive, the car tried to turn left into the oncoming lane instead of the correct lane.
A few more drives without having to intervene. Then it started what looked like a safe left turn, inexplicably stopped in front of oncoming traffic, and forced me to floor it to clear the lane and avoid getting t-boned.
Overall, during the month, I had multiple failures like this, where the car simply did not read the road correctly. My suspicion is that, given California has more Teslas by far than any other state, and given that most of Tesla’s software engineering staff is still based there, the training data heavily favors California road markings and signage, which are admittedly better (especially the road markings) than they are in the middle of the country, where we have to be able to plow the roads.
Whatever the reason, it’s clear that, despite the crowing of Elon and his company from inside the reality-distortion field, Full Self Driving is anything but, assuming you’re trying not to get yourself or someone else killed. There is still a lot that needs to happen here, yet time and time again Tesla (or maybe it’s just Elon) seems to think they have solved for all cases and declares the software ready.
For my part, I will continue not paying for the privilege of training their software until it meets that safety threshold. I just hope their overconfidence doesn’t kill anyone in the meantime.