Divining Tesla’s Safety Score

Or how I became the Queen’s limo driver because I had no choice

Richard Gingras
The Mobilist
8 min read · Nov 9, 2021


Photo by David von Diemar on Unsplash

In late September Tesla updated the software of my Model S (Performance, 2018) and advised that I now had the option to participate in Tesla’s Full Self-Driving (FSD) Beta. In saying yes, I was informed that my driving performance would be evaluated along several criteria and calculated as a 0–100 Safety Score.

The game was on.

I have unabashed love for my Tesla. It rethought the experience of driving — and made it better. It’s refreshing to own a vehicle that gets smarter over time, that identifies issues and fixes them, that introduces new features and benefits all along the way. What a concept.

I respect and enjoy the Autopilot capabilities. Yes, the driver needs to monitor the car and keep a hand on the tiller. It works extremely well. It feels safer. It’s a great reliever of stress. There is little question that we’ll come to rely on fully autonomous vehicles. Tesla Autopilot and FSD are not that. Neither is fully autonomous. Not yet. However, Tesla’s systems are strikingly good iterations toward what that might be. One critical benefit of Tesla’s effort is that it has the largest ongoing testbed of human driving behavior and autopilot performance.

Which brings me to the Safety Score. We were told that our participation in the FSD beta would be determined by our score. Okay. That seems logical. Give it to the better drivers. Seems wise given the criticism that Tesla shouldn’t be testing on the general driving public. In addition, better drivers are valuable in teaching the software about automotive scenarios, conditions, and responses. Model your system on the best drivers.

I AM one of the best drivers — in the world! I assure you. Yes, certain friends cite experiences with me that gave them pause, or had them wishing I would. But they do live to tell the story, and tell stories they do. I admit to my preference for urgency. Wasting time is a bad thing. I do feel I have a total sense of control, even while certain passengers are losing theirs. Which hints at the heart of the matter. Am I a model of excellent self-assessed driving skills? Or am I a model of excellent driving behavior that should become the norm we want others to follow? Or neither?

How does the Safety Score work? (details from Tesla)

It assesses five Safety Factors.

  • Forward Collision Warnings per 1,000 Miles. These are audible and visual alerts provided to the driver when the system thinks there’s a possible collision necessitating the driver’s intervention. It’s great. Particularly for situations that arise outside the driver’s field of vision. In some cases it’s detecting your own dubious driving behavior — following closely resulting in an urgent stop. In others it’s alerting you to the dubious behavior of others — entering your lane without knowing you exist. This Safety Factor seems quite reasonable. I’m sure it will know when a warning was due to my unwise behavior versus wisely advising me of the bad behavior of others.
  • Hard Braking. This seems straightforward. Hitting the brakes hard is a situation to avoid. I almost never hit the brakes on the Tesla, soft or hard. In addition to standard disc brakes, the Tesla uses regenerative braking to slow the vehicle as you ease off the accelerator: the motor runs as a generator, converting the car’s kinetic energy back into battery charge while slowing the car. I love it. It provides a much better sense of control. I do not exaggerate in saying I seldom apply the disc brakes. No hard braking for me. Score me perfect.
  • Aggressive Turning. This too is straightforward. No sharp turns. Shouldn’t be a problem. I’ll moderate my behavior a tad.
  • Unsafe Following. Yes, following too closely is not a good thing. How following too closely intersects with the flow of heavy traffic might be the question.
  • Forced Autopilot Disengagement. This happens when you are using AutoPilot and the system determines you are not paying attention and not responding to alerts. Makes perfect sense.

What does it NOT cover?

It does not evaluate speed. You can go as fast as you want as long as you don’t violate the five Safety Factors. It does not keep score while Autopilot is engaged, though the miles covered are included in the formula. It does not evaluate traffic law violations: not speed limits, not stop signs, not red lights.
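Tesla publishes its own formula for turning the five Safety Factors into a score, but as a rough sketch of the mechanics, the aggregation looks something like the toy model below. The weights and the simple start-at-100-and-subtract form are my assumptions for illustration, not Tesla’s actual math.

```python
# Toy illustration of combining the five Safety Factors into a 0-100 score.
# The weights and the linear penalty model are hypothetical assumptions;
# Tesla's published formula differs.

FACTOR_WEIGHTS = {
    "forward_collision_warnings_per_1000mi": 1.0,
    "hard_braking_pct": 0.8,
    "aggressive_turning_pct": 0.6,
    "unsafe_following_pct": 0.9,
    "forced_autopilot_disengagements": 5.0,
}

def toy_safety_score(factors: dict) -> float:
    """Start from a perfect 100 and subtract weighted demerits, clamped to 0-100."""
    penalty = sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())
    return max(0.0, min(100.0, 100.0 - penalty))

# A clean drive scores 100; demerits in any factor pull the score down.
perfect_drive = {name: 0.0 for name in FACTOR_WEIGHTS}
print(toy_safety_score(perfect_drive))  # 100.0
```

The one real structural point this captures is that speed never appears as an input: only the five factor rates move the number.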

Let’s win this!

With strong confidence in my Formula 1 driving skills I hit the road knowing I would soon be rewarded with positive reinforcement. Perfect score or nothing.

Well, it didn’t happen that way. On my first few drives my scores were riddled with Aggressive Turning demerits, Unsafe Following penalties, and a slew of disturbing-beyond-all-logic Hard Braking slaps on my wrist. I was getting daily scores and trip scores in the 60s and 70s. On day three I got a Forward Collision Warning which, coupled with a few Hard Braking events, earned me an all-time low daily score of 31! What was that all about?

I was annoyed. I am not being unsafe! I was puzzled. Hard Braking and Unsafe Following had me particularly flummoxed. I did “my own research” and found a few articles and YouTube videos. Learned a few things.

Hard Braking is technically defined as “backward acceleration”. That means any quick stop is an issue even if you don’t touch the brakes. Thus, all my clever use of regenerative braking was getting me into trouble. I learned that the score demands graceful, controlled deceleration, even when using regenerative braking. Don’t jostle the baby! My stops were measured and controlled, but sometimes quick. The system seems to demand the behavior expected of the Queen’s limo driver. I grumble. One problematic case is the traffic light that turns yellow. If you are even modestly close to the intersection, there is almost no way to decelerate without earning a Hard Braking demerit, which inclines the driver to accelerate through the yellow rather than stop. Not a good thing, and something I think Tesla should address: the system knows the state of the light and its distance.
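Because Hard Braking is keyed to backward acceleration rather than brake-pedal use, a detector needs nothing but a speed trace, which is why regenerative slowing trips it just as easily as the discs. A minimal sketch of that idea; the 0.3 g cutoff here is my assumption for illustration, not a confirmed Tesla threshold:

```python
# Flag "hard braking" events from a time/speed trace. Deceleration is
# computed from speed alone, so it cannot tell disc brakes from
# regenerative braking. The 0.3 g threshold is an assumption.

G = 9.81  # standard gravity, m/s^2

def hard_braking_events(samples, threshold_g=0.3):
    """samples: list of (t_seconds, speed_m_per_s) pairs in time order.
    Returns the timestamps where deceleration exceeded the threshold."""
    events = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        decel = (v0 - v1) / (t1 - t0)  # positive when slowing
        if decel > threshold_g * G:
            events.append(t1)
    return events

# Coasting down gently vs. a quick stop for a yellow light:
gentle = [(0, 20.0), (1, 18.0), (2, 16.0)]   # ~0.2 g: no flag
abrupt = [(0, 20.0), (1, 15.0), (2, 10.0)]   # ~0.5 g: flagged twice
print(hard_braking_events(gentle))  # []
print(hard_braking_events(abrupt))  # [1, 2]
```

The "gentle" trace sheds speed at the same total rate over two seconds but never crosses the threshold, which is exactly the graceful, stretched-out deceleration the score rewards.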

Unsafe Following also demanded scrutiny. Curiously, I found that Autopilot would follow at close distances that would earn me demerits if I drove them myself. Biasing toward the skill of the machine doesn’t seem fair. Or is it? I found that accelerating on highway offramps behind other cars could get me dinged. While it doesn’t measure Unsafe Following below 50 mph, at speeds above that it seems to insist on 7–8 car lengths. While that’s fine if you’re the Queen’s limo driver with escorts, it is not fine when driving in heavy urban freeway traffic at 70+ mph. In those circumstances, traveling at a distance that satisfies the Unsafe Following scrutiny puts me in the middle lane being passed on both sides. Maybe I should use Autopilot.
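For a sense of scale, it’s worth converting that 7–8 car lengths at 70 mph into a time gap, since following distance is usually reasoned about in seconds of headway. A quick back-of-envelope sketch; the 4.5 m average car length is my assumption:

```python
# Back-of-envelope: time headway implied by N car lengths at highway speed.
CAR_LENGTH_M = 4.5       # assumed average car length, meters
MPH_TO_MPS = 0.44704     # exact miles-per-hour to meters-per-second factor

def headway_seconds(car_lengths: float, speed_mph: float) -> float:
    """Seconds of following gap implied by a distance of N car lengths."""
    gap_m = car_lengths * CAR_LENGTH_M
    return gap_m / (speed_mph * MPH_TO_MPS)

for lengths in (7, 8):
    print(f"{lengths} lengths at 70 mph -> {headway_seconds(lengths, 70):.2f} s")
# 7 lengths at 70 mph -> 1.01 s
# 8 lengths at 70 mph -> 1.15 s
```

So the gap the score seems to insist on works out to roughly a one-second headway at 70 mph, which is the scale on which the heavy-traffic trade-off plays out.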

Forward Collision Warnings will ding your score even when it’s not the fault of the driver. No, the system cannot discriminate based on the fault of the driver, or lack thereof. Though that should and could be part of a future system, it is not the case today. So when you get a warning because someone stops short in front of you and you didn’t respond quickly enough, a penalty will be issued. That’s fair. However, those other instances where another driver thoughtlessly drifts into your lane or darts from a side street? You will get whacked with a full one-point penalty on the 100-point score, and the penalty will stay until 30 days have passed. The lesson: be the Queen’s limo driver. Stay in the slowest lane. Pause like a nervous cat at any incautious movement far ahead. Hmmm.

Aggressive Turning. Here as well, I was racking up demerits for what seemed to be innocuous turns, maybe quick but not sharp or reckless. The answer: again, the Queen sits in your back seat. No matter how safe your turn (including a reasonably quick 90 degree turn to back into my charging spot), don’t dare shake or stir the Queen’s martini.

Okay, now I know the rules. Maybe I can play this game.

Armed with fresh info and clever tips, I chased the holy grail of a great Safety Score. I became the Queen’s limo driver. I continued to drive faster than most but with an intense focus on controlled driving behavior. That’s exactly what Tesla’s Safety Score wants to see. Now, six weeks in, my score is up to 97. Not perfect, but perfection is within reach.

I can’t say that I’m entirely happy with this. On the one hand I don’t have an issue with a more controlled approach to city driving. It’s somewhat relaxing. There’s a Zen aspect to it. But I’m missing the ability to enjoy driving the great two-lane blacktop, which the quickness and handling of the Model S make so scintillatingly fun. Can I presume some of that behavior might return after I’m entered into the Full Self-Driving Beta? Or will I get kicked back out if I do? Will the Safety Score evolve, or will I?

What is Tesla looking for?

Why is Tesla taking this approach? Two possible scenarios come to mind. First, and most likely: Tesla wants more thoughtful drivers in its Beta. Especially now. Already criticized for testing en masse on public thoroughfares, this approach will help ensure fewer accidents, less public acrimony, and less risk of further government restraint.

The second might be the desire to bias their machine-learning models toward sets of drivers who fit the pattern the system should emulate: controlled, non-aggressive driving. This too makes a lot of sense from a behavioral-modeling perspective. I find the criticism of testing such systems on public thoroughfares to be simplistic and counterproductive. There’s no question that automated driving systems, even modestly flawed ones, will make the public safer than the current controlled chaos of today’s traffic environment. The fastest and best path to the safest system requires the billions of miles of testing that Tesla and others seek to acquire. And, yes, it will require some of us to step up our game, even if it means the game is a bit less fun.

In his professional life, Richard Gingras is Vice President of News at Google. This has nothing to do with his professional life, except for an ongoing fascination with how technology and human behavior evolve.