
Tesla now forces drivers to give feedback when intervening on ‘Full Self-Driving’

Tesla has quietly made it mandatory for drivers to provide feedback every time they intervene on “Full Self-Driving.” The prompt, which used to disappear on its own after a few seconds, now stays on screen indefinitely until the driver selects a reason or sends a voice note.
The change arrived with FSD v14.3.2 as part of software update 2026.2.9.9, which rolled out in late April. Tesla didn’t announce the new behavior — the company retroactively updated the release notes to mention it.
What changed
Previously, when a driver took over from FSD, a feedback prompt would pop up on the touchscreen asking why they intervened. You could press the voice button on the steering wheel to send Tesla a quick message about what happened, or simply ignore the prompt and it would disappear on its own after a few seconds. It was a great, low-friction way for Tesla to collect feedback on FSD.
That’s no longer the case.
With the latest update, the prompt sticks around until you either tap one of the on-screen options — Preference, Discomfort, Navigation, or Critical — or record a voice note using the microphone button on the steering wheel. There’s no dismiss button and no timeout.
Tesla has already iterated on the design three times in rapid succession. The original version offered Preference, Comfort, Critical, and Other as choices. A second revision swapped “Other” for “Navigation” after users pointed out they had no good way to flag route-related issues. The third iteration, rolling out now with software version 2026.2.9.10, shrinks the dialog so it no longer blocks access to other screen controls like navigation, climate, and drive selection (yes, earlier versions blocked those).
The only real “hack” to get rid of it quickly: double-tap the microphone button on the steering wheel. This records an empty voice note and clears the prompt instantly, with no need to look at the screen. It’s the fastest way to dismiss the dialog, but the fact that it exists as a workaround rather than an actual dismiss option tells you something about how Tesla designed this.
Why Tesla wants this data
The logic is straightforward. Tesla’s FSD system improves by analyzing real-world intervention data, and vague or missing feedback makes it harder to diagnose failure modes. By forcing drivers to categorize every takeover, Tesla gets cleaner training signal for its AI pipeline.
Tesla now has nearly half a million active FSD subscribers generating $546 million in annual recurring revenue. That’s a massive crowdsourced data collection network — and with 10 billion FSD miles driven, each tagged intervention becomes a valuable data point for improving the system.
Tesla first introduced the ability to send voice feedback after FSD interventions back in 2023, but that was entirely optional. Making it mandatory is a significant shift in how Tesla treats the driver-feedback loop.
The problem
Several Tesla owners have flagged legitimate concerns about this approach. The prompt appears immediately after disengagement — often exactly when the driver’s full attention should be on the road. A persistent on-screen dialog demanding input during what might be a safety-critical moment is, by definition, a distraction.
Beyond safety, there’s a practical annoyance: the available categories don’t always match the actual reason for the intervention. As several drivers noted in response to my own experience with this change, they end up just tapping a random option to clear the screen — which defeats the entire purpose of collecting the data in the first place.
And the fact that it even stays on screen when the vehicle is in Park — which I confirmed myself — suggests this wasn’t fully thought through before deployment. FSD v14.3 brought meaningful improvements to the driving experience, but this feedback system feels rushed.
Electrek’s Take
I’ve been giving Tesla FSD feedback through its in-car system for years now. I think the data is genuinely valuable — both for Tesla and for the broader goal of improving the technology. I’ve reported specific intersection issues, navigation errors, and comfort-related interventions, and I’ve seen some of those same situations improve in subsequent updates. So I’m not against feedback collection in principle.
But here’s the thing: Tesla’s FSD subscribers are paying customers — up to $99 per month — and you can’t turn them into unpaid quality assurance testers without even giving them a clean way to opt out or dismiss the prompt. The only fast way to clear it is to double-tap the mic button and send an empty voice note, which is a hack, not a feature. That’s just poor design.
This shouldn’t be surprising, though. Offloading QA to paying users has been Tesla’s approach with FSD from the beginning. The entire “beta” program was built on the premise that customers who paid thousands of dollars for a feature would also serve as the test fleet to make it work. This mandatory feedback prompt is just the latest — and most in-your-face — version of that same strategy.
Tesla should add a simple dismiss button with a reasonable timeout. Drivers who want to help will still provide feedback. Forcing everyone to interact with a prompt while they’re actively driving — or finding workarounds to avoid it — undermines both the user experience and the data quality Tesla claims to want.
FTC: We use income-earning auto affiliate links.