Tesla Model Y owners have reported 1,097 problems related to adaptive cruise control (under the forward collision avoidance category). The most recently reported issues are listed below. Please also see the statistics and reliability analysis of the Tesla Model Y, based on all problems reported for the Model Y.
The standard cruise control is extremely dangerous. The problem is what I hear is called ghosting. When driving using cruise control, with no other vehicles around and no other objects such as animals or humans present, the car will suddenly decelerate almost to a complete stop if not overridden by manual takeover. I am under the impression that this is done intentionally by Tesla so that you will subscribe to their Full Self-Driving service. This can and will cause accidents in traffic.
I believed Tesla Full Self-Driving (FSD) was in effect. My Tesla proceeded at 35 miles an hour toward a stop sign, and I had to manually stop the vehicle to prevent blowing through the sign. It turns out that Tesla had sent an email 2 days earlier (April 7, 2026) notifying me that FSD had been cancelled because my credit card was declined (I had to replace a card on April 7 due to a lost card). I did not see this email. There was no notification in the vehicle that FSD had been deactivated. When I thought I was engaging FSD, I was in fact engaging an Autosteer mode with much lesser capabilities, essentially acting only as adaptive cruise control and lane guidance. Engaging this lesser mode is done in exactly the same fashion as FSD, and any driver using it in my situation would assume FSD was active. This problem, if not corrected, will absolutely cause accidents and potentially fatalities. Tesla needs to inform the driver via the head unit, in an obvious, unmistakable way, that FSD is not active.
See all problems of the 2023 Tesla Model Y.
Running Full Self-Driving (FSD) v14.2.2.5 on HW4. At Skillman St & I-635 in Dallas, TX, FSD attempted a wrong-way turn into oncoming traffic instead of taking the correct leftmost lane. In the attached image, instead of following the blue path, the Tesla took the wrong lane, highlighted in red.
While driving using the Autosteer feature at 65 mph on an expressway, I attempted to change lanes, which caused the cruise control to suddenly turn off, and the car began braking hard with traffic behind it. Tesla's new Model Y has combined lane keeping and cruise control, so if lane keeping turns off, the car suddenly brakes hard from the regenerative braking. Lane keeping and cruise control should be two separate controls to prevent this issue.
Following a 25-day repair at a certified collision center due to an earlier accident, the vehicle's Full Self-Driving (FSD) and active safety systems began failing. The vehicle repeatedly failed to maintain speed and failed to stay centered in the lane. On March 25, 2026, while traveling at low speed on a public roadway with FSD actively engaged, the system failed to maintain a stable path and incorrectly steered the vehicle into a curb, causing property damage to the wheel. This failure put my safety and the safety of surrounding traffic at risk by executing an unpredictable steering maneuver that the active safety systems failed to prevent. The vehicle was taken to the manufacturer's service center twice with this specific complaint prior to the curb strike. The service center investigated and officially concluded that this was not a software issue, stating that they could do nothing else. They deferred the resolution back to the collision shop, citing likely camera/sensor physical misalignment from the 25-day body repair. The collision shop is now refusing to review the vehicle or inspect the hardware, claiming lack of visible exterior damage, despite the internal service team stating this is a latent hardware alignment defect. The vehicle is currently in the same failing condition and available for inspection upon request. No warning lamps or error messages appeared on the screen prior to the failure or the curb strike. The symptom of unpredictable lane tracking first appeared immediately upon receiving the vehicle back from the collision repair earlier this year. Note: I have a formal data privacy request pending with Tesla to pull the exact chat logs showing their refusal to service this hardware defect. I will provide these to the investigator upon follow-up.
See all problems of the 2022 Tesla Model Y.
This is my 10th report; it relates to 2 issues I have reported before. 1. The "phantom braking" on cruise control that I have reported a few times previously has apparently been fixed in a software update, but we were not notified. I went in to the Santa Fe Tesla location where I took delivery of my vehicle and got into a discussion about my disappointments with this car, specifically the tires (see #2). I mentioned my past problems, including the phantom braking. The man was dismissive and a bit defensive, and asked if I'd made an appointment. I said I'd been told several times they hadn't figured out a software fix (and even Tesla employees were simply refraining from using cruise control, which is what I did). He said they "fixed it a while ago" and he has not had complaints since then. I tried cruise control on my 45-minute freeway drive and, indeed, the problem did not occur. Interesting that they knew it was an ongoing problem, didn't deal with it, and didn't announce when they finally did fix it. I do not yet know if it is an ongoing safety problem for me, but I'm planning to gradually increase my use of cruise control, since most of my driving is high-speed freeway driving. 2. I asked about getting a warranty rebate for my original tires, which have failed at 38k miles. He said I would have to pursue that through the distributor (American Tire?) and that he'd only had 2-3 customers ever do that. He said most customers just pay for better tires. He offered to sell me another Continental tire in the $400 range and implied that I could find tires through a tire store. I went to Discount Tire, who can get me a rebate on only 1 of the 4 tires because it's down to 4/32" tread. The other 3 can't be warrantied until they reach 4/32"... and I've already hydroplaned/slid on the freeway in a sleet storm with wet roads. The company knowingly provides inadequate tires and does not stand behind them. This is an unacceptable safety risk and a moral outrage.
I am filing a safety complaint regarding Tesla's removal of driver-controlled maximum speed in FSD (Supervised), a change that has alarmed a significant number of Tesla owners and safety-conscious drivers. Previous software allowed drivers to set a precise maximum speed via the steering wheel scroll wheel while FSD (Supervised) was active. Tesla replaced this with fixed "speed profiles" (Sloth, Chill, Standard, Hurry, Mad Max), none of which allow a driver to set a specific speed. Each profile operates at a programmed range relative to the posted speed limit. Even the slowest profile (Sloth) has been observed exceeding the speed limit. The system also frequently misreads posted limits entirely. This is a serious safety regression. In winter conditions, a cautious driver may need to travel well below the speed limit due to snow or black ice. Previously, they could reduce FSD's max speed precisely. That is no longer possible: the system determines speed, and the driver cannot override it downward while FSD remains engaged. This concern is not isolated to me. Online forums and Tesla owner communities have seen widespread discussion and frustration over this change, with many drivers saying it has made them feel unsafe and less in control of their own vehicle. Removing precise speed control from a supervised autonomous system directly contradicts Tesla's stated position that the driver remains responsible for safe operation. This is an ongoing issue and not limited to a single date, for the "tell us the approximate date this incident occurred" field. I request that NHTSA: 1. Investigate Tesla's removal of driver-adjustable maximum speed in FSD (Supervised); 2. Evaluate compliance with FMVSS and driver control guidelines for automated driving features; 3. Require Tesla to restore precise, driver-controlled maximum speed settings.
See all problems of the 2025 Tesla Model Y.
NHTSA complaint narrative (safercar.gov). Vehicle: 2024 Tesla Model Y. Component/system: Full Self-Driving (Supervised) software, version 14.2.2.5. On March 14, 2026, at approximately 5:22 pm, a 2024 Tesla Model Y was traveling on U.S. Route 3 in Franconia, New Hampshire, at approximately 45 mph, within the posted speed limit. The vehicle was operating with Full Self-Driving (Supervised) software version 14.2.2.5 actively engaged. Three occupants were present in the vehicle, including two minor passengers. All occupants were properly restrained with seat belts. Component/system that failed: the Full Self-Driving (Supervised) system, version 14.2.2.5, failed to safely detect and respond to a snow-covered road surface. The system lost control of the vehicle upon encountering a snow patch on the roadway, causing the vehicle to strike a tree. The FSD system is the primary failed component. The vehicle has been towed and is available for inspection upon request. How safety was put at risk: the FSD system was in active control of the vehicle's steering, braking, and acceleration at the time of the failure. The system provided no auditory alert, no visual warning, and no driver takeover request prior to the loss of control. The failure occurred without any warning whatsoever, leaving insufficient time for the driver to intervene and prevent the collision, despite the driver actively supervising the system in full compliance with Tesla's own supervision requirements. Three occupants were placed at immediate risk of serious injury or death. The vehicle struck a tree and was totaled. Emergency services responded and evaluated all occupants on scene. An official police report was filed. Prior warning lamps, messages, or symptoms: none. The FSD system issued zero warnings (no auditory alerts, no visual prompts, no haptic feedback, and no takeover requests) at any point prior to or during the loss of control. The system was operating normally by all displayed indicators.
See all problems of the 2024 Tesla Model Y.
Subject: Tesla self-driving / Autopilot incorrect maneuver at intersection. Vehicle: Tesla (model: **). Software: Full Self-Driving / Autopilot (specify which was active). Date: [xxx]. Time: [xxx]. Location: ** Carlos, [xxx] (city, intersection or street). Description of incident: while the vehicle was operating with Tesla's driver-assistance system engaged, the navigation indicated the car would turn right at an intersection. As the vehicle approached the intersection and began the maneuver, it unexpectedly continued straight instead of completing the right turn. This caused the vehicle to enter the intersection in front of other vehicles that were stopped at another traffic light. I had to intervene to ensure safety. There was no clear reason for the incorrect maneuver, and the system behavior was unexpected and potentially dangerous. Additional information: weather conditions: good weather, clear. Traffic conditions: a lot of traffic, people getting off work. Driver intervention: yes; the car was heading straight toward other cars, and if I had not made a quick maneuver and turned the steering wheel I would have crashed into at least one or two cars. Dashcam footage available: no. I am submitting this report so the event can be reviewed for possible software or safety issues with the driver-assistance system. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The vehicle has a phantom braking problem. It does this 3 times daily on the same route, in the same place. This is a big safety red flag. It has the same problem on both Autopilot and normal cruise control.
I was driving with the car in autopilot in the center lane of the highway. I could see a white vehicle move close on my right, and I confirmed that it had crossed over into my lane with both its front and back left wheels on my Tesla screen with a quick glance. I then tried to maneuver to the left lane (which my screen showed was clear) but the steering wheel had significant tension. All of a sudden, I heard the chime signaling that autopilot disengaged, the car jerked hard, and my car was headed toward the median. I swerved quickly to the right and hydroplaned. As I straightened out, my vehicle hit a vehicle in the right hand lane. The strong tension followed by jerk and quick release of tension prevented me from having full control of my vehicle. I believe that if autopilot had not been engaged, I could have safely moved into the left lane with control. The police arrived but I was very shaken up and simply said I swerved to avoid a car in my lane but in retrospect wish that I had gone into detail. I’d also like to note that I have a clean driving record, was not in a hurry, did not have loud music on, had both hands on the wheel at 10 and 2 o’clock, and was focused on the road.
Tesla's Full Self-Driving software continues to malfunction. On January 30th at 9:50 a.m. Mountain Standard Time, I was parked at work and discovered my vehicle had been left unlocked with a window partially rolled down. This was my first indicator of many faulty problems signaling the vehicle was not operating properly. The vehicle's proximity sensor, which locks and unlocks the vehicle via Bluetooth on the owner's approach, was not working properly. On January 31st I experienced a catastrophic failure of the vehicle's software, as I was locked out of my vehicle after using the Summon feature, which stopped the car in the middle of an intersection, obstructing traffic. Police arrived on scene within 30 minutes, and we were unable to put the car in any gear to either drive it or place it in neutral to roll it out of the way. I have detailed video of both incidents (the 30th and 31st of January), which gives a better accounting of what happened. This is roughly the third or fourth occasion of Tesla's system producing a catastrophic failure resulting in damage to the vehicle or property damage.
On January 28, 2026, the vehicle (a Tesla Model Y) was operating with the Full Self-Driving (FSD) system actively engaged in normal highway traffic conditions. Without any prior warning messages, chimes, or system alerts, the FSD system abruptly disengaged approximately 1 second before the vehicle steered into a concrete road divider. Component failed: the Tesla Full Self-Driving (FSD) software and underlying perception/control systems failed to maintain the lane and abruptly handed over control with insufficient time to react. The vehicle and its data are available for inspection. Safety risk: the sudden, sub-second disengagement gave the driver zero time to react, take manual control, or perform an evasive maneuver. This system failure directly resulted in a high-speed collision with a road divider, causing significant vehicle damage and physical injuries to the driver that required medical treatment. Inspection/confirmation: the vehicle is currently subject to an open insurance claim and has been evaluated for repair. Warnings: there were no warning lamps, messages, or requests for the driver to take over prior to the immediate 1-second window before the crash. Evidence: I have downloaded and preserved the complete event data recorder (EDR) record and the multi-angle dashcam video files. The EDR data mathematically documents the sub-second disengagement, and the video files confirm normal traffic flow with no external hazards forcing the collision. I can provide the EDR PDF and video files to NHTSA investigators immediately upon request.
When driving with cruise control engaged, the car randomly brakes and shows a "curvature assist" indication. The car quickly slows from 55 to 35 with no warning. This primarily happens on straight roads, not curves. I am concerned about the traffic behind me.
See all problems of the 2021 Tesla Model Y.
With Full Self-Driving engaged, there is no way to change the follow distance, and it very often follows way too closely. This generates incidents that the car reports to Tesla Insurance, and we are penalized for something we can't always control. We also can't change our speed, but I'm more concerned about the follow distance on highways. Last incident at 7:46 pm on 1/25/2026.
On January 20, around 9:35 am Phoenix time, I had the car in self-driving mode for a left-hand turn at the intersection of Camino Real and River Road in Tucson, Arizona. A real-time view shows that it's a tricky and dangerous left-hand turn. For the past 3 weeks, the car navigated it well, waiting until it was perfectly safe to do so. Today, however, the car moved quickly and unexpectedly into the center of River Road, narrowly escaping a head-on collision with a westbound car, and then paused, squeezed in between the westbound and eastbound lanes, when I tapped the brake and took the wheel. Everything happened so quickly. I made the left turn into the eastbound lane, but, looking back, I don't know how an accident didn't occur, as traffic was still moving rapidly in the eastbound lanes. There must have been enough distance between two cars at just the right time that nothing hit me.
Tesla FSD (Supervised) update 14 removed the ability for the vehicle operator to set speed targets. Additionally, the system is unable to accurately determine speed limits for many roadways. The discrepancy between FSD's perceived speed limit and the actual speed limit can be as much as ±25 mph. The discrepancy can occur suddenly and at any time during a drive, even on stretches of road with a consistent speed limit. This results in moments of dangerous acceleration and/or deceleration that is neither requested by the vehicle operator nor reasonably predictable. This causes erratic driving behavior for both the Tesla and other motorists in the vicinity. Erratic, unpredictable driving is a major cause of traffic accidents that can lead to serious damage, injury, or death. Prior versions allowed the operator to set a speed target, similar to standard cruise control, that the vehicle attempted to maintain and would not exceed. FSD v14 does not allow the vehicle operator any control over the speed of the vehicle to a degree that meaningfully contributes to safe, legal driving.
I was driving in Full Self-Driving mode in my Tesla Model Y. The car stopped at the left-turn stoplight as it normally does. Then, while the light was still red and the cross traffic had a green (traffic flowing), the Tesla released the brake, accelerating into the intersection. I stomped on the brake and kept the car from fully entering the intersection, then continued home without using Full Self-Driving mode any further. If I hadn't intervened, the Tesla would have caused an accident, injury, or even death.
On Saturday morning, January 3, 2026, I was driving with my wife, daughter, and dog. There was light rain at the time, and I had my adaptive cruise set at approximately 72 mph. I was driving our 2024 Tesla Model Y. At approximately 10:21 am, the rear of the car began to slide right as we entered a gradual left curve, possibly because we hit water on the road. I tried to correct slightly to the right, and when I did, the steering wheel aggressively corrected back to the left, which put us into a slide. Within a second we hit a tree on the passenger side. The impact was severe, on the passenger side door where my wife was sitting. I theorize that at that moment my dog was ejected out of the back window and thrown across the freeway. We then rolled in the other direction (I don't know how many times) down a hill and settled upside down. I immediately smelled smoke and began trying to get out of my seatbelt, but I couldn't. I carry a knife, and I was able to cut myself out. I then tried to open the door and could not, so I began punching the glass and eventually got it moving, pushed on it, and got it open. I slid myself out, turned around, and saw the car was on fire. I pulled my daughter out and then began extracting my wife. She was unconscious and had obvious orthopedic injuries. I am 100% confident that had I been driving my other vehicle (non-Tesla), this accident would not have occurred. The Tesla and its steering system caused this accident. On top of that, the batteries immediately caught fire, my seatbelt release button did not work, and the electric door button did not work. Had I not been carrying a knife, and strong enough to punch the door open, this story would have ended differently. The car completely burned. As far as I know, this was the first time this exact issue presented itself. However, looking online, it seems this issue is fairly common with Teslas in wet weather.
A vehicle equipped with Tesla Full Self-Driving (Supervised, v14.2) software exhibits unsafe automated following behavior. The system maintains following distances that are too short at steady speeds, including behind stable lead vehicles with no braking or traffic disturbances. Tesla's own Safety Score system flags this behavior as "following too closely," indicating elevated collision risk. However, the driver has no available control to adjust minimum following distance or impose a safer headway while using FSD. This represents unsafe autonomous tailgating behavior with loss of driver control authority over safe following distance, creating increased risk of rear-end collision and hazardous automated driving conditions. Tesla offers several modes (Sloth, Chill, Standard, Hurry, and Mad Max), none of which solve the issue if the driver in front is driving the speed limit. My car under FSD will crowd (or tailgate) the car in front regardless of the mode. In one instance, I was following a car under FSD in Sloth mode, and it made the car in front of me pull over just to get me off their tail. This is certainly not how I wish to drive, and I don't want my car driving this way. Failure mode: automated headway policy dominance, lack of driver override, unsafe autonomous following distance.
On December 30th, 2025, my wife experienced a serious and unexpected safety event while using Autosteer in one of our Tesla Model Y cars (we own two of them). The vehicle software in use was v12 (2025.26.8), and the Full Self-Driving (Supervised) software was v12.6.4. The vehicle came to a complete stop at a red traffic light on northbound San Gabriel Blvd in Pasadena, at the intersection that controls entry to the I-210 West. After waiting for approximately 55 seconds, Autosteer initiated forward movement into the intersection while the traffic light was still red. The vehicle's movement was not caused by driver input; my wife's feet were away from the pedals at the time. There was no visible lead vehicle movement or other obvious external trigger. Fortunately, my wife was attentive and immediately disengaged Autosteer, brought the vehicle to a stop, and carefully reversed back into a safe position. Moments later, multiple fast-moving vehicles crossed the intersection in front of her path. Had she not intervened promptly, there was a realistic risk of a serious side-impact collision. This serious incident was reported to Tesla on 12/30/2025, including a video of the incident. In my correspondence, I requested that all relevant vehicle telemetry, Autosteer/FSD logs, and any uploaded video associated with this time and location be preserved and reviewed. As of today (7 weeks later), they have not followed up with me. The 18-second video was recorded by an independently installed Viofo dashcam, which clearly shows: the vehicle stopped at a red light, the light remaining red, and the vehicle beginning to move forward unexpectedly.
Autopilot was engaged and active at the time of the incident. The vehicle failed to detect a stationary piece of road debris (appearing to be a dropped vehicle part) located in the driving lane. The system did not provide any warning, slow the vehicle, or attempt an avoidance maneuver. The vehicle drove directly over the debris, resulting in damage to the lower exterior/body panel. Road and traffic conditions were normal. This raises concerns about autopilot’s object detection and hazard response capabilities.
A vehicle equipped with Tesla Full Self-Driving (Supervised, v14.2) software exhibits unsafe automated behavior due to the removal of driver speed control. The system infers speed limits and driving speed without allowing the driver to set a safe maximum speed. In residential neighborhoods with children, pedestrians, and shared social spaces, the vehicle drives at model-inferred speeds that are socially and physically unsafe. The driver is unable to impose a lower safe speed limit without disengaging. Tesla removed the previous option for the driver to adjust speed. In a state park campground, the system failed to detect a posted 15 mph speed limit and inferred a 55 mph limit. The vehicle accelerated to unsafe speeds on narrow, pedestrian-heavy roads, with no driver ability to cap speed while under FSD. This represents a loss of human override authority and unsafe autonomous system behavior, creating pedestrian hazard and safety risk. Failure mode: automated speed inference dominance, map prior misclassification, lack of driver override, unsafe autonomous acceleration in pedestrian environments.
There is a serious safety-related fundamental design flaw in FSD (Full Self-Driving) stop sign behavior. On previous versions, and on the latest version of FSD (currently v14.2.1.25), when FSD approaches a stop sign and there is no white painted stop line, FSD makes its full initial stop (also called the zero-speed stop) directly at or behind the stop sign, instead of making the initial full stop beyond the stop sign at a location where the driver can see cross traffic. Sometimes the FSD initial full stop is 20, 30, 40, even 50+ feet back from the edge of the road. At these distances from the edge of the road, most of the time there is no view of cross traffic to the left and right. The FSD stop then turns into the FSD "creep," where FSD, after stopping 30 feet back, will commit to the turn from 30 feet back, giving the driver little to no time to see cross traffic. If I am the supervisor of FSD, liable for my safety and my vehicle's safety, I need to be able to see cross traffic before my car (with FSD engaged) decides to commit to the turn, but FSD does not care whether the driver can see. The "creep" is perhaps the least human-like maneuver that FSD performs. From the stopped location directly at the stop sign, the creep may inch up and stop again, it may inch up a couple of times and stop again, it may pull up to the edge of the road and stop again, or it may just pull out into oncoming traffic in one swift motion. Because of this behavior, FSD has almost gotten rear-ended countless times at stop signs. Also, cross traffic sees the creep and thinks I'm about to pull out in front of them. Human drivers go beyond the stop sign to a location where they can see, and make their one and only full stop there. To avoid this issue, FSD needs to do this too (i.e., make the initial full stop at the edge of the road); this is legal in nearly every state (I live in PA). Tesla has not provided a single response to these reports, and nothing seems to be getting done about it.
The vehicle was in Supervised Full Self-Driving on the turnpike when, all of a sudden, it darted out of the lane into the grass, hitting a guard rail.
When in self-driving mode, which activates the adaptive cruise control, it is not possible to set the following distance. The following distance automatically selected by Tesla self-driving is much too close to the vehicle in front of me. Tesla has removed the ability to set the following distance. It follows at approximately 2 seconds behind the car in front of me, regardless of my vehicle speed. At 80 mph, 2 seconds is not enough time for a driver to react. Following distance should be controllable by the driver. Taking away this ability deprives a driver of driving within their own limitations.
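As a rough back-of-envelope illustration of the headway figures in the complaint above (this arithmetic is mine, not the complainant's; the function name and speed values are illustrative only), a fixed 2-second gap can be converted to a distance in feet:

```python
# Convert a time-based headway (seconds) into a gap distance (feet)
# at a given speed. 1 mph = 5280 ft / 3600 s ≈ 1.4667 ft/s.
MPH_TO_FTPS = 5280 / 3600

def headway_feet(speed_mph: float, headway_s: float = 2.0) -> float:
    """Distance in feet the car covers in `headway_s` seconds at `speed_mph`."""
    return speed_mph * MPH_TO_FTPS * headway_s

for mph in (55, 65, 80):
    print(f"{mph} mph: {headway_feet(mph):.0f} ft gap at a 2 s headway")
# At 80 mph the 2-second gap is about 235 ft, which shrinks quickly
# once typical perception-reaction time (~1.5 s) is accounted for.
```

This is only a unit conversion; it does not model braking distance, but it shows why a headway that feels adequate at 55 mph leaves far less margin at 80 mph.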
On several occasions, my Tesla Model Y has braked for no reason while using Traffic-Aware Cruise Control as well as while using Autopilot. I can re-create this situation on the same parts of the highway. The sudden, uncommanded braking creates a hazardous situation for the cars behind me, as I may get rear-ended. At this point, I do not feel safe using Autopilot or TACC. I submitted a ticket to Tesla, but they declined to work on it.
The Full Self-Driving (FSD) stopped working, although it was fully paid for at the time of vehicle purchase. Along with that, the car stopped receiving/installing over-the-air software updates, the lane markings in the display disappeared, and the automatically cancelling turn signals stopped working (requiring me to manually cancel them). I took it to Tesla service and was told that the car needs a new computer at a cost of approximately $2,300. To answer the bullet points above: 1. The main vehicle computer malfunctioned. 2. I paid for FSD for the safety it provides; without FSD, the car is simply less safe in traffic. 3. The Tesla service center confirmed that the vehicle computer has malfunctioned. 4. There was no indication of any kind until suddenly FSD stopped working, and I subsequently noticed that downloading/installing software updates failed, lane markings disappeared, and the automatically cancelling turn signals stopped working, requiring manual operation.
See all problems of the 2020 Tesla Model Y.
Incident report: Tesla Model Y, unintentional acceleration. Saturday, November 29, 2025, approximately 2:40 pm. On the above-referenced date and time, we were returning home with the intention of pulling the car into our garage, an activity performed over a hundred times in the one year we have owned the car. Because of the need to back in to allow the car to be charged, the following procedure is adhered to: 1. Disengage self-driving (if it was being used) in the street prior to entering the courtyard of the residence. 2. Pull forward three-quarters of the way into the garage (since the residence is the last on the courtyard and the car cannot be pulled beyond the garage opening to back in directly). 3. Place the car in reverse and slowly perform a K-turn to back into the garage. On the date in question, the following anomalies occurred: 1. After placing the car in reverse (following action 2), instead of starting to back out, the car lurched forward. The brake was immediately applied, and the car stopped about a foot before hitting the garage wall. 2. The brake was applied again to assure self-driving was disengaged before proceeding with action 3. 3. The car was placed in reverse and the accelerator was lightly pressed to slowly back out of the garage. Rather than backing out slowly (as was done hundreds of times), the car immediately accelerated and rapidly gained speed. 4. The brake was applied to stop the car but could not overcome the acceleration (pictures of skid marks were taken). 5. In order not to directly hit the wall located across from the garage, the steering wheel was turned to minimize any direct contact (skid marks will show the turn). 6. The car came to a stop when it hit the wall, causing significant damage to the car and damage to the wall (pictures can be forwarded).
Incident date: 11/27/25. Incident location: CA State Highway 101 between San Luis Obispo and Los Angeles. Driving conditions: daylight, dry roadway. Description of safety defect / complaint: during a single approximately 400-mile highway road trip, with Tesla's Traffic-Aware Cruise Control (TACC) feature enabled, the vehicle abruptly and forcefully applied the brakes on at least six (6) separate occasions without any apparent cause. On each occurrence: • no vehicle ahead was braking or decelerating, • no vehicle was merging or cutting in front of my vehicle, • no stationary or moving obstacles (including overpasses, road signs, or debris) were present in or near the travel lane, • the forward roadway was clear and unobstructed for a considerable distance. These sudden, uncommanded braking events were severe enough to cause significant deceleration, requiring me to immediately intervene by pressing the accelerator pedal to override the system. Due to the frequency and unpredictability of these phantom braking events, I no longer feel safe using Traffic-Aware Cruise Control or any Tesla advanced driver-assistance features that rely on the same sensor suite and software. I am filing this report because repeated uncommanded braking in highway traffic constitutes a serious safety hazard that could lead to rear-end collisions, particularly when closely followed by other vehicles or commercial trucks. Additional information (if applicable): • software version at time of incident: v12 (2025.38.9 fe 714a33a545) • Full Self-Driving capability package: no. Enhanced Autopilot: no. • dashcam or Sentry Mode footage available: no. I request that NHTSA investigate this recurring phantom braking issue in Tesla vehicles equipped with Traffic-Aware Cruise Control and Autopilot systems.
The car has a feature called "curvature assist" that at times takes control of braking and steering while cruise control is in use. These incidents can be startling: braking is suddenly applied even when no curve is present, surprising both the driver and any following traffic. The car would be safer if this feature could be disabled, but that is not possible.
My car installed update v12 (2025.38.8.7) last night. This morning I was driving to work using Autopilot when alarms sounded, the hazard lights turned on, and the screen flashed a warning that I had to take over immediately. The message included a note that Autopilot had failed due to a "systems error." The navigation and visualization screen froze, went blank, and took 10 minutes to come back on. I asked Tesla to roll back the update, and they told me they cannot do that. I came very close to crashing into a concrete guard rail, as the car was going around a turn when the system failed.
Very scary! Heading west on [xxx], my 2026 Tesla Model Y ran 2 red lights! It stopped at the first red light, which sits back about 100 feet from [xxx], and then just sped ahead, went through that light and the one directly on [xxx], and made a right turn. Crazy! Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The latest version of the Tesla FSD software does not let me fully control my speed. You can switch profiles or stop using Full Self-Driving, but you cannot set the speed to what you want. In the previous version, you could use the right scroll wheel to increase or decrease the speed. Now that same wheel shifts profiles, and even that does not give you control over the exact speed you are driving.
Currently on the newest Tesla Full Self-Driving software, v14.1.4, the car is phantom braking as well as phantom swerving out of nowhere. It swerves and applies the brakes so hard that it actually affects the steering wheel and disengages itself (without the big, loud safety alert to immediately take over control); it just plays the soft disengage audio notification, as if you had canceled FSD yourself. The serious problem is that the car does not stop like it should; instead it just continues, as if neutrally rolling on its own, and twice in the last week it swerved directly into oncoming traffic. It is not supposed to be doing this at all, especially now that the "hands off" approval is enacted. I drive about 450 miles a week, and this has happened about twice a week for the last two weeks. Thank god I was paying attention all four times, but I am so worried this is going to kill someone. I have tried to call Tesla, but they purposely don't staff their phones and no one answers.
| Problem Category | Number of Problems |
|---|---|
| Adaptive Cruise Control problems | |
| Automatic Emergency Braking problems | |
| Warnings problems | |
| Forward Collision Avoidance problems | |
| Adaptive Cruise Control Software problems | |
| Automatic Emergency Steering problems |