Tesla Model 3 owners have reported 1,051 problems related to adaptive cruise control (under the forward collision avoidance category). The most recently reported issues are listed below. Also, please check out the statistics and reliability analysis of the Tesla Model 3 based on all problems reported for the Model 3.
When using the cruise control or self-driving function, the car will, unprovoked, brake suddenly and severely. Pressing the accelerator pedal will disengage cruise control or self-driving and stop the deceleration. The error happens about 1-2 times per 20 miles. I do not use these features often due to several near-miss crashes caused by this deceleration error. Sometimes the car will report other phantom alerts (like curvature assist on a straight road).
See all problems of the 2021 Tesla Model 3.
I am submitting this report to the National Highway Traffic Safety Administration regarding a recurring safety issue with my 2018 Tesla Model 3 involving unintended "phantom braking." On multiple occasions, my vehicle has abruptly decelerated without any apparent obstacle or hazard present. These incidents have primarily occurred while using driver-assistance features such as Autopilot and Traffic-Aware Cruise Control on highways and well-marked roads under normal driving conditions (clear weather, dry pavement, consistent traffic flow). The braking events are sudden and significant, creating a serious safety risk, particularly when other vehicles are following closely. In several instances, the deceleration was forceful enough that I believed a rear-end collision was likely. There were no visible triggers such as overpasses, shadows, merging vehicles, or roadside objects that would reasonably justify the system's response. Below are details to assist your investigation:
• Driver-assistance features active: just cruise control, not Autopilot, on these occasions
• Weather/road conditions: clear and dry
• Frequency of occurrence: this is an ongoing problem that has occurred dozens of times in the past few years; the March 30th incidents were only the most recent
• Warning messages or alerts: none
This behavior appears unpredictable and difficult to anticipate or mitigate, increasing the risk of a crash. Given the potential safety implications, I respectfully request that NHTSA review this issue to determine whether it reflects a broader defect affecting other vehicles. Thank you for your attention to this matter.
See all problems of the 2018 Tesla Model 3.
While driving at highway speeds using cruise control, the vehicle occasionally brakes severely for no apparent reason, with no vehicles or obstacles in the way. The braking could potentially cause me to be rear-ended. The dealer inspected the vehicle and could not reproduce the problem. There is no warning before the braking occurs.
See all problems of the 2025 Tesla Model 3.
The Full Self-Driving (Supervised) system (FSD(S)) failed by running a red light. Video is available for inspection upon request. My safety was not at risk, this time, because the system waited until there was no cross traffic before resuming motion, even though the light was still red. I have not tried to reproduce the problem for a dealer since it happens infrequently. (This is the third time in perhaps 6 months of subscription to Tesla's FSD(S).) Tesla has not informed me of any inspections they have performed, even though they have been notified through the channel they provide. There was no indication from the car that it perceived any failure. It stopped only when I applied the brake. Prior symptoms include two previous incidents of stopping for a red traffic light, waiting until there was no cross traffic, then resuming motion while the light was still red. This is the first time I thought to save the dashcam video.
See all problems of the 2023 Tesla Model 3.
I was driving with Full Self-Driving active on a Sunday morning, and my car came to a complete stop at a red light. Once the traffic cleared, the car drove right through the red light, and I was given a traffic citation by a police officer. The car did wait until there were no other cars present, so I don't believe I was in any real danger, but the fact remains that the car went through a red light while on Full Self-Driving, which violates the rules of the road and is illegal. I have video from my dashcam, but this form doesn't appear to allow video uploads.
Tesla Full Self-Driving (Supervised) was engaged at the time of the incident. The vehicle was traveling at low speed on a straight roadway when it unexpectedly initiated a sudden steering maneuver out of its lane toward a roadside structure (an outdoor restaurant seating area), resulting in a collision. The driver did not provide steering input prior to or during the maneuver. There were no warning messages or alerts indicating a hazard or need for evasive action before the event. Vehicle telemetry data reviewed after the incident indicates that the steering system executed a rapid, large-angle deflection without corresponding driver input, consistent with a system-initiated maneuver. The system continued to apply throttle during the event. The incident created a direct safety risk to the vehicle occupants and nearby pedestrians due to the unintended deviation of the vehicle's path toward a fixed structure. Additional context: when this car was in the shop for repairs, we were told that during the replacement of the dashboard/computer system, the technicians observed a persistent system code that would not clear. Tesla technical support was contacted, and the system was ultimately reinstalled to resolve the issue. While the direct connection between this code and the FSD malfunction is unknown, the persistent code may indicate preexisting software instability. The vehicle has been inspected by repair professionals, and supporting data (telemetry and video) is preserved and available for review upon request. The cause of the malfunction is currently unknown, but the evidence indicates a failure in the driver-assistance system's steering control.
I used the Tesla to back up from my driveway and it hit my gate. There was little damage, but the car did it on its own: it did not calculate the clearance, and the crash sensor didn't work. The car crashed into the gate while on FSD (automatic pilot).
When adaptive cruise control is enabled (Autopilot; not equipped with Full Self-Driving), the car will sometimes brake hard as if there is an obstacle ahead, with the words "curve assist active" flashing on the screen. It seems to happen most often on state highways rather than interstates. In one case last night, the car slowed from 65 mph to 30 mph on a straight road with no object ahead. Thankfully no cars were immediately behind me, or I would have been rear-ended. This happened 4 times in 500 miles of highway travel.
See all problems of the 2020 Tesla Model 3.
There is no checkbox for Full Self-Driving (Supervised): 3 times in the last 4 weeks, my Tesla has become impatient at long red lights and lurched out into the intersection. I am very concerned because one of those times (the first time) it did not creep forward but simply took off through the red light. I had to accelerate through to get ahead of the cars that still had the green light. Fortunately, nothing untoward happened, except scaring me! The other two times, I recognized what was happening, applied the brakes, and took the car out of FSD mode. I have complained to Tesla and am asking them to verify that my FSD is up to date and that this issue will be fixed. Needless to say, I have stopped using FSD (Supervised) until this is all addressed.
Incident date: Feb 1, 2026, 2:45 pm PST. Location: Camarillo, CA (Pleasant Valley Rd & Village Commons Blvd). Description: the vehicle (Tesla Model 3, FSD Supervised) attempted an unprotected left turn. The system identified a gap but paused excessively before initiating the maneuver. After the pause, when the gap was no longer safe, the system proceeded to turn anyway, directly into the path of a speeding oncoming vehicle. The system failed to abort the maneuver after its own hesitation. It also failed to accelerate with the urgency required to clear the path. I was forced to manually override with emergency acceleration and steering into an oncoming lane to avoid a high-speed T-bone collision. This appears to be a "stale data" failure, where the car executed an old plan that was no longer valid.
Tesla Full Self-Driving (FSD) inadvertently drove the vehicle onto a curb a few days after a software update. The incident caused a scratch to the wheel and a cut in the tire. Tesla has refused to cover the repair, stating that FSD requires driver attention. The issue is that the vehicle had never driven into a curb on a 90-degree turn before. I had my hands on the wheel and did not anticipate the software making an incorrect decision and striking the curb. Full Self-Driving was engaged at the time, and the software made an incorrect calculation that resulted in the impact. Tesla should replace the damaged tire and repair the wheel. I have a 1-minute recording of the incident but could not upload the video due to its size being more than 10 MB.
See all problems of the 2022 Tesla Model 3.
Vehicle information: 2020 Tesla Model 3 AWD (leased), Full Self-Driving (FSD) enabled.
Summary of safety issue: while Full Self-Driving (FSD) was engaged, the vehicle failed to detect roadside infrastructure and collided with multiple fixed objects, including a route sign board, a walkway sign board, and an underground electrical cable. The system did not provide adequate warnings or corrective action before impact. The system jerked the steering and confirmed it was engaged, but it did not brake; it drove into a non-drivable area without giving any disengagement warning, and the car was totaled.
Incident details:
• Date of incident: [xxx]
• Location: on [xxx] [xxx], Piscataway, NJ 08854
• Road type: [xxx]
• Weather/visibility: clear
• Speed at time of incident: <40 mph (under the speed limit)
Description: as soon as I triggered FSD, it immediately tried to navigate left or right, failed, and unexpectedly veered into roadside infrastructure. The system did not slow down, steer away, or alert the driver in time to prevent the collision. The impact caused significant damage to public property, and the vehicle is totaled. No injuries occurred.
Evidence available:
• Photos of the scene and damage
• Police report (number: [xxx])
• Tow documentation
• Tesla collision center estimate
Reason for reporting: this incident suggests a potential safety defect in Tesla's FSD system related to object detection, path planning, and collision avoidance. The failure occurred without driver input and raises concerns about the system's reliability in detecting fixed roadside objects. I request that NHTSA review this incident as part of ongoing evaluations of Tesla's driver-assistance systems.
Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
What component or system failed or malfunctioned, and is it available for inspection upon request? The steering system failed due to corrosion and water ingress affecting both the steering rack and the associated steering wiring harness. These components were replaced by Tesla and are no longer available for inspection.
How was your safety or the safety of others put at risk? The steering system is a safety-critical component. Corrosion and water intrusion created a risk of reduced or sudden loss of steering assist while driving, which could lead to loss of vehicle control and increase the risk of a crash.
Has the problem been reproduced or confirmed by a dealer or independent service center? Yes. Tesla service confirmed water ingress and corrosion affecting the steering rack and wiring harness and performed a full replacement of both components.
Has the vehicle or component been inspected by the manufacturer, police, insurance representatives or others? Yes. The vehicle and steering components were inspected by Tesla at a Tesla service center, which identified water ingress and corrosion as the cause of failure.
Were there any warning lamps, messages or other symptoms of the problem prior to the failure, and when did they first appear? Yes. The vehicle displayed steering-related warning messages prior to repair, including reduced steering assist alerts, which prompted inspection by Tesla service. The warnings appeared shortly before the repair was performed.
See all problems of the 2019 Tesla Model 3.
Component/system that failed: Tesla Full Self-Driving (Supervised) software version 14.x (2025.38.x and later). The issue is software-only and remains installed on any vehicle updated to v14. Description of safety defect: FSD v14 deliberately removes the driver's ability to set or adjust the maximum speed offset using the right scroll wheel (a feature present in all prior versions, including my current v13.2.9). As a result, the system routinely exceeds posted speed limits by 5-20 mph with no way for the supervising driver to correct it short of fully disengaging FSD. The vehicle ignores speed-limit signs, relies on inaccurate map data, and follows surrounding traffic speed instead of the legal limit, even in school zones, construction zones, residential areas, and adverse weather. This significantly increases the risk of collision, injury, and traffic citations. Reproduced/confirmed: widely reproduced by thousands of owners on Tesla forums, Reddit, and X immediately after the v14 rollout in late 2025. I have chosen to remain on v13.2.9 to avoid this exact defect. Inspected by manufacturer or others: no in-person inspection required; Tesla has full remote log access. Multiple owners have submitted in-car bug reports. Warning lamps/messages: none. The speed-offset control is simply grayed out and non-functional by design in v14. Vehicle available for inspection upon request.
See all problems of the 2024 Tesla Model 3.
The forward-facing camera on my vehicle repeatedly fogs up inside the sealed camera housing, creating a light to severe haze across the camera lens. This internal fogging obstructs the camera's view and affects the vehicle's safety systems that rely on it, including forward collision warning, automatic emergency braking, lane keeping, and adaptive cruise control. The components are available for inspection upon request. This obstruction puts my safety at risk because these systems either disable themselves or operate unpredictably when the camera view is blocked. The issue occurred again this year under normal outdoor temperatures, and the fogging cannot be cleared by the driver or by using defrost or cabin heat. The problem was previously confirmed by a Tesla service center, where technicians inspected the camera area, told me my vehicle had one of the worst cases they had seen, and said this is a common issue. They attempted a repair, but the issue has returned, making it a recurring defect. No police or insurance representatives have inspected it. During the first occurrence, the vehicle displayed messages indicating that features like Autopilot, cruise control, and lane assist were disabled due to camera obstruction. The current recurrence produced visible internal haze over the camera lens, and a warning message appeared (which I have saved) in which Tesla acknowledges this is a known issue and says that, rather than fixing it, you can bring your car in for expedited cleaning. I have read others stating that Tesla changed their windshield design and it no longer heats this area properly. This issue has happened in multiple winter seasons and directly affects the proper operation of safety-critical driver assistance systems.
Tesla Full Self-Driving software malfunctioned and turned into the oncoming traffic lanes instead of crossing over to the correct side of the roadway. This occurred after getting off an on-ramp in Ukiah, California, around 2 am on Nov 15, 2025. This could have put others in serious danger if there had been cars approaching in the oncoming lanes. The problem has not been reproduced by a dealer or other entity, yet this has happened to other Tesla drivers using FSD as well, as you are probably already aware. No inspection has been performed; it was sudden and unexpected.
While driving with FSD engaged and set to the Standard profile, FSD accelerated to 76 mph in a 55 mph zone. This is on software version 14.1.4. Unfortunately, this new update no longer allows you to set your speed while using FSD; you can only select Sloth, Chill, Standard, Hurry, or Mad Max.
Description of the defect: while driving with Full Self-Driving (FSD) engaged, the vehicle's primary computer shuts down and reboots, causing a sudden loss of all safety features, including blind spot monitoring, lane departure warning, forward collision warning, and the driving visualization. The cameras go black while the vehicle is in motion. The trigger: the failure is triggered by high electrical load. If the cabin heater (HVAC) and seat heaters are active while FSD is fully computing, the 12V power supply to the computer drops below critical voltage, causing a "Turbo A" processor lockup and system crash. Evidence of defect: service center diagnostics confirmed that the specific error code aps_w169_turboa_scs_lkup (computer crash) occurred at the exact same timestamp as vcleft_a302_blowergeneralfault (HVAC blower failure). This proves a common-mode voltage drop affecting multiple critical systems simultaneously. Manufacturer response: the manufacturer (Tesla) acknowledged the logs but refused to inspect the wiring harness or ground connections, instead recommending a replacement of the computer unit. However, the failure can be mitigated by unplugging external cameras to reduce load, proving the root cause is an insufficient power delivery design (wiring/grounding) rather than a component failure. The computer runs at full load with the newest FSD software and can no longer keep up with additional loads. Safety risk: this defect causes a sudden, uncommanded loss of driver assistance and situational awareness tools (cameras/visualization) while the vehicle is maneuvering in traffic. The vehicle's power distribution system appears insufficient to support the electrical load of the software (FSD) and the hardware (HVAC) simultaneously.
I was driving on a straight interstate road at 60-70 mph using Full Self-Driving around 4:15 pm, and there was no traffic in front of, behind, or next to me. Suddenly, the car tried to make a sharp right turn that felt like more than an obstacle-avoidance or lane-keeping move. I had to grab the wheel even harder than I already was (my hands were fortunately on the wheel) and force it back into the lane. If I had not made this intervention, I would've gone off the road and crashed.
All safety and driver-assistance systems on my 2022 Tesla Model 3 failed due to an internal short in the vehicle's computer, as confirmed in writing by Tesla's technician. All cameras (rear, side, front) are non-functional, navigation does not work, and all ADAS features (Autopilot, lane-keeping, collision warning, blind-spot monitoring, emergency braking, etc.) are disabled. The vehicle has no visibility when reversing and no active safety protections while driving. Tesla documented the cause as an internal computer failure, not related to damage or misuse. This creates a dangerous condition on public roads, since the car loses all safety systems. Tesla quoted ~$2,900 for a new computer even though the failure involves critical safety components. This may indicate a broader safety defect.
Central computer (motherboard), cameras (front/rear/side), Autopilot/Autosteer system. Error code: aps_w132. Description of problem: on or around [xxx], Tesla pushed a mandatory over-the-air software update to my vehicle. Immediately after, multiple safety-critical systems failed: all cameras (including backup) became intermittent or non-functional, Autosteer/Autopilot stopped working, and the car emitted constant distracting beeping alerts due to a critical motherboard failure. This made driving hazardous: impaired visibility for parking/reversing, loss of collision avoidance features, and constant audio distractions that could lead to accidents. I had to drive in this unsafe condition for weeks while waiting for service. Tesla attempted a remote fix, but it failed. I scheduled service on November 19 for December 3 (a 19-day delay). Diagnostics confirmed the update induced the hardware failure, a known issue reported by other 2021 Model 3 owners on forums like Reddit (e.g., software bricking motherboards post-update). Despite this, Tesla refused warranty/goodwill coverage, quoting $2,576.83 for repairs. Service advisors Josh and Andy were unprofessional: Josh was rude and condescending; Andy was aggressive, sarcastic, and retaliatory. After I mentioned filing complaints with the AZ Attorney General and News Channel 3, Andy threatened $100/day storage fees with an impossible deadline (called at 4 pm Friday, close at 5 pm, demanded pickup by morning, then noon). He closed my ticket without authorization, delaying fixes and forcing me to contact another location, prolonging my exposure to the unsafe vehicle. This defect poses serious safety risks: sudden loss of cameras/Autopilot could cause crashes. Tesla's awareness (from prior reports) and poor handling (intimidation delaying repairs) exacerbate the danger. No crash or injury yet, but the potential is high. Please investigate this update-induced failure as a widespread defect.
Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
After upgrading to FSD (Full Self-Driving Supervised) v14, the speed limit function has been removed. There is no way to set a max speed for cruise control. The car speeds unsafely at all times while in FSD; there is no way to control how fast the car can go. It just "thinks" what speed is best. It constantly speeds 7-10 mph over the limit in every setting above "Sloth", and the Sloth setting still does not have a speed limit setting. This is very dangerous. There is no excuse for not having a max speed setting.
In the new Tesla update to Full Self-Driving, they removed the ability to adjust the speed control while using FSD, yet state: "Note: you are responsible for the speed and control of your vehicle at all times, whether FSD (Supervised) is enabled or not."
The car, for no known or evident reason, stopped the lane awareness, cruise control, etc. All of those just stopped working, and per Tesla that's expected, which is surprising since they are all part of the safety systems. The car's visual system stopped identifying lanes, and as a byproduct, none of the dependent features work.
On [xxx] at approximately [xxx], I was driving northbound on [xxx] in Phoenix, Arizona, in my 2025 Tesla Model 3 using Tesla's Full Self-Driving (FSD) (Supervised) mode when the incident occurred. I was paying close attention while driving in the left-most, non-HOV lane, coming up to a bend just before the [xxx] overpass, when I noticed traffic ahead slowing down and coming to a stop. I monitored FSD as it went through the act of braking; however, it seemed to wait a little longer than I would have expected before it actually started to brake, and when it did, the FSD system had to brake more aggressively than I've previously experienced. As it neared the end of the braking event (while the car was still moving), the FSD system disengaged itself without any input from me (neither by manually applying the brake pedal, applying torque to the steering wheel, nor pressing the FSD button). When the FSD system disengaged itself, my vehicle continued forward. Had I not been paying attention, my vehicle would have collided with the rear end of the vehicle in front of me. Since I was paying close attention, I immediately slammed on the brakes after noticing that FSD had disengaged itself. I have dashcam footage of this event from the TeslaCam system; however, it does not include audio or telemetry data. Even without that, you can see in the video that my car noses down while FSD applies the brakes, then noses up when FSD disengages itself, and then noses down again when I apply the brakes manually. (The dashcam footage exceeds the 20 MB limit; however, I can provide it via other means upon request.) It should be noted that I am an experienced Tesla FSD user, as I used it for over a year in my prior vehicle (a 2018 Tesla Model 3, which had been upgraded with the HW3 FSD computer), and this is the first time I have ever experienced a near-collision event while using FSD, or an event in which FSD disengaged on its own.
Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
Tesla Model 3 2025, FSD v13.2.9, red light violation. Video enclosed: [xxx]. It just happened, and I have no more information I can provide. Apparently FSD didn't warn me that it would run a red light. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
Full Self-Driving mode (with Hardware 3) consistently enters the carpool lane when I am driving solo. In the vehicle navigation settings, I have "use carpool lanes" unselected; it doesn't seem to use this input in its decision to enter the carpool lane. Sometimes it is so quick to signal and then enter the carpool lane illegally that I can't respond quickly enough to correct it. However, it only seems to need one correction to ignore the carpool lane for the remainder of the trip. This can then repeat on each new trip (it doesn't always occur; sometimes it won't attempt to get into the carpool lane). I have sent Tesla dozens of recorded messages saying this needs to be fixed. It will enter at an illegal point, crossing the solid line, which can be a dangerous maneuver as well as illegal for a solo occupant. Also, I don't want to be ticketed. It's quite a stressful situation each time this occurs.
Full Self-Driving 14.1 removes the max speed setting that allows drivers to control the speed of the vehicle while under computer control, leaving no way to prevent the car from speeding when the speed profiles choose to drive 28+ mph over the speed limit. The inability to control the system's speed is fundamentally unsafe by design.
My family and I were in my 2022 Tesla Model 3 heading from Charlotte to Orlando on [xxx]. The car was in Full Self-Driving (FSD) mode while commuting through Daytona Beach. The rain was heavy in the area. While driving in Daytona Beach, FSD caused the car to change lanes into a lane that had standing rain puddles on [xxx]. After FSD caused the lane change, the car immediately hydroplaned out of control. The car's short ground clearance, which traps water and exacerbates hydroplaning, along with its regenerative braking feature, made matters worse and uncontrollable. The car spun into the median three times and settled in the middle of the interstate, in a dangerous position. The car is totaled. My family and I require medical attention and counseling due to the trauma. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
Repeatedly and randomly, with cruise control (not FSD) engaged, the car would brake, sometimes aggressively. The last time, it braked so hard that the tires or brakes squealed and the car swerved slightly before I was able to regain control. This was on the freeway (in Colorado, if I remember correctly) while I was cruising at 70 or 75 mph. I decided that was the last time I would use cruise control; it was just not safe. I had an appointment on October 22 to take the car in for service, but then a friend told me this was called "phantom braking" and that there was no fix. A friend suggested that I contact NHTSA.
A recent update has caused the vehicle to no longer acknowledge an important speed sign. This was confirmed with another Tesla; it is due to an update. This is unsafe and is capable of allowing the vehicle to travel through town at 75 mph where pedestrians cross the highway. If the camera gets blinded at the wrong time, it will not see people or vehicles as it plows into them. I made Tesla service aware and tried to submit a bug report, but the Tesla app simply says they are aware that the camera may not always read the signs. However, this was never an issue until a recent update. When are you going to make them liable?
I love my vehicle and its technology. However, after installation of the latest (FSD v14) update, I observed issues I believe pose a risk to the safe operation of the vehicle. I have made dozens of reports to Tesla over 30+ days about these issues.
1. Driving profiles and speed: the update features multiple driving profiles (e.g., "Sloth", "Chill", "Standard", and "Mad Max"). Only the "Sloth" profile drives at the "speed limit" (what their outdated map thinks the speed limit is). All other profiles drive over the posted speed limit (sometimes significantly). The ability to manually adjust the speed limit setting has been removed or disabled in these profiles. Once the vehicle is in a profile, the system stays at its own selected speed. This behavior prevents the driver from easily controlling the maximum speed when using the system, undermining driver intent.
2. Incorrect or outdated speed limit data: the vehicle frequently displays incorrect speed limits. For example, school zones, reduced speed zones, temporary speed changes, work zones, and even a busy highway I drive every day, whose limit changed 3 months ago, are often not recognized or updated in the system. On several occasions over the past months, the system continued to use an outdated or incorrect posted speed, despite the road clearly having a lower limit. Because the system uses the erroneous speed data as the basis for its automated driving decisions (and without manual override of speed in FSD), this situation increases risk, especially in areas where lower speed limits are enforced for safety (pedestrian zones, near schools).
3. Safety risk description: the lack of manual speed adjustment in autonomous mode means the driver has less control over vehicle speed when using FSD, reducing the driver's ability to mitigate risk in a dynamic environment. I'm asking Tesla to either ensure they have constant real-time speed limit data, or again allow manual speed adjustments in FSD.
Recently I had three back-to-back near-miss head-on collisions on the highway in the early morning, while it was still dark. In these three, along with a previous near miss, the vehicle had zero reaction to the imminent collision, with roughly half a second to spare. It happened so fast that I must have let go of the record button, as there was no saved video. I am working with a lemon law firm and a class action firm. Get your crap together; these recent accidents reported in AZ are also your fault for not reacting appropriately to previous complaints.
Vehicle: 2019 Tesla Model 3 Dual Motor. VIN: [xxx]. Mileage: ~84,041. Software: v12 (2025.32.6 e575ed98d527), FSD Supervised v12.6.4. Date/time: [xxx], ~[xxx]. Location: [xxx].
Incident description: while operating Autopilot, I was stopped at a stop sign preparing to turn right. I lightly pressed the accelerator to prompt the turn. At that moment, the vehicle would not respond to steering or braking input. Despite applying full force on the steering wheel and brake pedal, the car continued straight ahead and struck a street sign. This represents a loss of manual override: I was completely locked out of steering and braking.
Prior behavior at same location: on prior occasions using Autopilot at this same intersection, the vehicle would attempt to make the turn but then immediately steer itself into the breakdown lane. I had learned to expect this and was prepared to take over. However, in this most recent incident, the vehicle did not allow me to take over at all, which created a far more serious hazard.
Result:
• Vehicle damage (front bumper, sensors)
• Significant safety risk if pedestrians or cross traffic had been present
• Demonstrates that Autopilot can both mis-execute turns and, critically, fail to relinquish control when the driver intervenes
Action taken:
• Scheduled Tesla service (Cherry Hill, NJ, Oct 7, 2025)
• Requested that Tesla preserve all telemetry, camera footage, and Autopilot logs
• Reported to my insurance carrier as a suspected manufacturer defect
Request: I am reporting this as a serious safety defect. A system that prevents manual override of steering and braking is unsafe and could cause severe injury or death. Please investigate Tesla Autopilot's behavior at this location and in similar right-turn scenarios.
Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
While driving my Tesla equipped with Autopilot/Traffic-Aware Cruise Control, I experienced an unexpected and abrupt braking event. A vehicle had entered the center turn lane and had already cleared my lane, posing no obstruction to my path. Despite this, my Tesla suddenly applied the brakes without warning, causing a rapid deceleration that could have led to a rear-end collision had another vehicle been following closely. There were no visible hazards, warning messages, or alerts prior to the braking. The incident occurred in clear daylight conditions on a dry road, and the vehicle in the center turn lane was stationary or turning away from my lane. This appears to be a case of phantom braking, possibly due to misclassification of the adjacent vehicle or overcautious object detection by the vehicle's advanced driver assistance systems (ADAS). This issue has occurred more than once in similar scenarios, and I believe it poses a serious safety risk. I have not yet had the issue inspected by Tesla or a service center, but I am submitting this report to raise awareness and request investigation into the reliability of Tesla's braking logic in these situations.
| Problem Category | Number of Problems |
|---|---|
| Adaptive Cruise Control problems | 1,051 |
| Automatic Emergency Braking problems | |
| Warnings problems | |
| Forward Collision Avoidance problems | |
| Dynamic Brake Support/Brake Assist problems | |
| Adaptive Cruise Control Software problems | |
| Adaptive Cruise Control Software Signage/Signal Recognition problems | |