22 problems related to adaptive cruise control have been reported for the 2026 Tesla Model Y. The most recently reported issues are listed below. Please also check out the statistics and reliability analysis of the 2026 Tesla Model Y based on all problems reported for the 2026 Model Y.
There is a serious safety-related FSD (Full Self-Driving) fundamental design flaw with stop sign behavior. On previous versions and on the latest version of FSD (currently v14.2.1.25), when FSD approaches a stop sign and there is no white painted stop line, FSD makes its full initial stop (also called the zero-speed stop) directly at or behind the stop sign instead of making the initial full stop beyond the stop sign at a location where the driver can see cross traffic. Sometimes the FSD initial full stop is 20, 30, 40, even 50+ feet back from the edge of the road. At these distances from the edge of the road, most of the time there is no view of cross traffic to the left and right. The FSD stop then turns into the FSD "creep," where FSD, after stopping 30 feet back, will commit to the turn from 30 feet back, giving the driver little to no time to see cross traffic. As the supervisor of FSD who is liable for my safety and my vehicle's safety, I need to be able to see cross traffic before my car (with FSD engaged) decides to commit to the turn, but FSD does not account for whether the driver can see. The "creep" is perhaps the least human-like maneuver that FSD performs. From the stopped location directly at the stop sign, the creep may inch up and stop again, it may inch up a couple of times and stop again, it may pull up to the edge of the road and stop again, or it may just pull out into oncoming traffic in one swift motion. Because of this behavior, FSD has almost been rear-ended countless times at stop signs. Also, cross traffic sees the creep and thinks I'm about to pull out in front of them. Human drivers go beyond the stop sign to a location where they can see before making their one and only full stop. To avoid this issue, FSD needs to do this too (i.e., make the initial full stop at the edge of the road); this is legal in nearly every state (I live in PA). Tesla has not provided a single response to these reports, and nothing seems to be getting done about it.
When in self-driving mode, which activates the adaptive cruise control, it is not possible to set the following distance. The following distance automatically selected by Tesla self-driving is much too close to the vehicle in front of me. Tesla has removed the ability to set the following distance. It follows approximately 2 seconds behind the car in front of me regardless of my vehicle speed, and at 80 mph, 2 seconds is not enough time for a driver to react. Following distance should be controllable by the driver. Taking away this ability deprives a driver of driving within their own limitations.
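For rough context on the 2-second gap described in the report above, a back-of-the-envelope sketch (editorial, not part of the complaint; it assumes a constant 2.0-second gap at steady speed) shows the distance that gap represents at highway speeds:

```python
# Rough check of the reported 2-second following gap
# (assumes exactly 2.0 s and constant speed; actual system behavior may differ).
MPH_TO_FT_PER_S = 5280 / 3600  # 1 mph = ~1.467 ft/s

def gap_distance_ft(speed_mph: float, gap_s: float = 2.0) -> float:
    """Distance traveled during the following gap at a constant speed."""
    return speed_mph * MPH_TO_FT_PER_S * gap_s

for mph in (55, 65, 80):
    print(f"{mph} mph -> {gap_distance_ft(mph):.0f} ft gap at 2.0 s")
# 80 mph -> ~235 ft; a typical 1.5 s perception-reaction time alone
# uses up roughly 176 ft of that gap before any braking begins.
```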
On several occasions, my Tesla Model Y has braked for no reason while using Traffic-Aware Cruise Control as well as when using Autopilot. I can re-create this situation on the same parts of the highway. The sudden, uncommanded stopping creates a hazardous situation for the cars behind me, as I may get rear-ended. At this point, I do not feel safe using Autopilot or TACC. I submitted a ticket to Tesla, but they declined to work on it.
Incident date: 11/27/25. Incident location: CA State Highway 101 between San Luis Obispo and Los Angeles. Driving conditions: daylight, dry roadway. Description of safety defect/complaint: During a single approximately 400-mile highway road trip, with Tesla's Traffic-Aware Cruise Control (TACC) feature enabled, the vehicle abruptly and forcefully applied the brakes on at least six (6) separate occasions without any apparent cause. On each occurrence:
• no vehicle ahead was braking or decelerating,
• no vehicle was merging or cutting in front of my vehicle,
• no stationary or moving obstacles (including overpasses, road signs, or debris) were present in or near the travel lane,
• the forward roadway was clear and unobstructed for a considerable distance.
These sudden, uncommanded braking events were severe enough to cause significant deceleration, requiring me to immediately intervene by pressing the accelerator pedal to override the system. Due to the frequency and unpredictability of these phantom braking events, I no longer feel safe using Traffic-Aware Cruise Control or any Tesla advanced driver-assistance features that rely on the same sensor suite and software. I am filing this report because repeated uncommanded braking in highway traffic constitutes a serious safety hazard that could lead to rear-end collisions, particularly when closely followed by other vehicles or commercial trucks. Additional information (if applicable):
• Software version at time of incident: v12 (2025.38.9 fe714a33a545)
• Full Self-Driving Capability package: no. Enhanced Autopilot: no.
• Any dashcam or Sentry Mode footage available: no.
I request that NHTSA investigate this recurring phantom braking issue in Tesla vehicles equipped with Traffic-Aware Cruise Control and Autopilot systems.
My car installed update v12 (2025.38.8.7) last night. This morning I was driving to work using Autopilot when alarms sounded, the hazard lights turned on, and the screen flashed a warning that I had to take over immediately. The message included a note that Autopilot had failed due to a "systems error." The navigation and visualization screen froze, went blank, and took 10 minutes to come back on. I asked Tesla to roll back the update and they have told me that they cannot do that. I came very close to crashing into a concrete guard rail, as the car was going around a turn when the system failed.
Very scary! Heading west on [xxx], my 2026 Tesla Model Y ran two red lights! It stopped at the first red light, which sits back about 100 feet from [xxx], and then just sped ahead, went through that light and the one directly on [xxx], and made a right turn. Crazy! Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The latest version of the Tesla FSD software does not let me fully control my speed. You can switch profiles or stop using Full Self-Driving, but you cannot set the speed to what you want. In the previous version, you could use the right scroll wheel to increase or decrease the speed. Now that switches profiles, and even that does not give you control over the exact speed you are driving.
While driving in rain at night, the vehicle’s automated driving system attempted to make a turn at an intersection near active train tracks. Instead of remaining on the roadway, the system steered the vehicle directly onto the train tracks, where the vehicle became stuck between the tracks and the paved road surface. All four tires were damaged, and the vehicle could not return to the travel lane under its own control. If a train had been approaching, this situation could have resulted in a severe or fatal collision. I had to manually reverse the vehicle a significant distance to return to the roadway. The malfunction appeared to result from the automated driving system misidentifying the roadway under rain and low-visibility conditions. This suggests a recurring risk for any vehicle using the system at this location in similar conditions. The safety issue has not yet been inspected or confirmed by the manufacturer. No warning lights or alerts were displayed prior to the incident. The affected components and system are available for inspection upon request. I also have dash-cam video of the incident documenting the event.
FSD was unable to avoid an object in the road, possibly due to how the adaptive headlights work, as I was unable to see the object either. FSD/adaptive headlights may make it hard or impossible to see objects on the road at night. The vehicle has not been inspected; an insurance claim is in process.
Tesla FSD v14.1.3 feels unsafe for the typical "average" driver because it no longer allows users to set a maximum speed limit offset when using the standard driving profile. In FSD v13, drivers could specify a max speed offset, such as 10% or 15% above the posted limit, when FSD was enabled. However, this option is no longer available in FSD v14. As a result, FSD v14 in the standard profile often speeds on highways and makes aggressive lane changes to pass slower vehicles. This behavior feels unsafe and stressful for most everyday drivers. Tesla should restore the ability for each driver to set their own comfortable max speed limit and reduce the aggressiveness of lane changes.
Tesla FSD issues:
1. Phantom braking: sudden braking to a full stop for no known reason in the middle of the road, throwing my dogs onto the floor while the car stopped completely on the highway. If an infant had been in the car in a car seat, their neck could have been broken.
2. Sudden jerkiness when making turns.
3. In one case the car suddenly changed lanes and had a close encounter.
4. The FSD system is now completely non-functioning.
5. These issues started about 1 month ago and worsened in the past week after the latest software update.
Tesla premium connectivity navigation issues:
1. Instead of taking me on main roads to my destination, it has been taking me down rural, isolated dirt roads. In one case I was directed to a dirt logging road through isolated and dangerous terrain. I had to turn around in a secluded area, and the navigation system kept directing me to dirt roads even though main roads were close by, until I opened my iPhone for directions.
2. In Lynchburg, VA, the navigation directed me toward an exceptionally steep downgrade to the river, which would have been a catastrophe.
3. These issues started this weekend, 10/24-10/26/25.
Door locks:
1. The door locks respond only occasionally, and there have been times I left the car assuming it would lock but it had not. There is no rhyme or reason to this.
2. This issue started 10/25/25.
I am following up with Tesla next but thought you should be made aware. I did the recent Tesla software update and things just got worse.
Upon purchase of the vehicle, unlike previous models and versions of the software, the vehicle's speed limit control could not be set directly; instead, using FSD, you had to select from percentage-based offsets, which were frequently ignored and would still go much higher or much lower than the percentage value set. The owner of the vehicle was unable to limit the speed to, say, only 10 mph over the speed limit; instead it was percent-based and not reliable. In addition, more than a third of the time, the speed limits shown in the infotainment system did not match the posted speed limits. Sometimes, in the middle of a highway, the displayed limit would randomly drop from 55 to 25, posing a significant hazard when using the FSD or cruise control system (something Tesla charges extra for access to), because the vehicle would immediately decrease speed while other vehicles behind are getting up to speed and not expecting a vehicle to slow down excessively at random. Additionally, when you manually override the speed limit control, or the system does, the user interface hides this limit, so the driver is unaware of what speed limit is set. These issues were already bad enough, but just yesterday my vehicle was pushed a software update that dramatically reduced the performance of FSD, and now I am unable to manually control the speed limit at all in FSD. This is a regression from purchase, when a scroll wheel at least allowed me to manually set the speed limit or correct it when it was wrong. Now that feature has been removed, and instead there is a duplication of FSD profiles (left/right on the right scroll wheel does the same as scrolling up/down). Continuing this trend, the driver cannot see what the speed limit is set to because it is hidden from the user interface. In addition, the FSD driving behavior has significantly regressed from before the update. Cruise control's most basic functionality should be to set a speed; now this feature has been removed.
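To illustrate the difference this report describes between a fixed offset and a percentage-based offset, here is a hypothetical sketch (the function names and numbers are illustrative only, not Tesla's actual logic or settings):

```python
# Hypothetical comparison of a fixed "+10 mph" offset vs. a "+10%" offset
# (illustrative only; not Tesla's implementation).
def target_speed_fixed(posted_mph: float, offset_mph: float = 10.0) -> float:
    return posted_mph + offset_mph

def target_speed_percent(posted_mph: float, offset_pct: float = 10.0) -> float:
    return posted_mph * (1 + offset_pct / 100)

for posted in (25, 45, 55, 70):
    print(f"posted {posted} mph: fixed +10 -> {target_speed_fixed(posted):.1f} mph, "
          f"+10% -> {target_speed_percent(posted):.1f} mph")
# A percentage offset scales with the posted limit (+2.5 mph at 25 mph but
# +7 mph at 70 mph), so a driver cannot cap the car at a flat "10 over".
```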
On the day of the incident, I drove my vehicle into the charging station and activated the Tesla FSD autonomous driving and automatic parking functions. During the automatic parking attempt, the system exhibited the following severe loss-of-control behaviors: 1. The vehicle collided with the vehicle in front without any human operation. 2. It paused for about 1 second after the collision. 3. The vehicle then suddenly accelerated in reverse at high speed, again without any driver operation. 4. It directly hit another Tesla that was charging behind my vehicle. The entire process was executed automatically by the vehicle system; I could not intervene or stop the system through the steering wheel or brakes. II. Severe safety risks: When the accident happened, my friend was sitting in the passenger seat. If the vehicle had moved a few more inches, it could have caused serious injury or even a life-threatening situation. This system failure is a combination of sudden unintended acceleration (SUA) and loss of control in autonomous driving decision-making, posing a significant systemic risk to public traffic safety. I immediately contacted Tesla and sent a complaint letter to the North Carolina consumer center. They accepted my complaint and sent an email to Tesla, but it has been over twenty days and I have not received a response from Tesla.
Reporting an FSD incident on 10/08/25 at approximately 8:15 PM. I was traveling on I-90 toward western Massachusetts / UMass Amherst using FSD. Road work was ongoing on I-90 and there were 2 lanes. All the vehicles started moving into the right lane because the left lane had cones, and the left lane's cones were tapering further ahead. My Tesla FSD chose the left lane, which I would not have chosen, as I could see in the distance that the cones were tapering to form a single lane. Tesla FSD drove in the left lane, and as it noticed the cones tapering into a single lane, FSD abruptly cut into the right lane in front of a truck with a very narrow margin. Once in the right lane, FSD asked me to take control of the car. From there until UMass Amherst I could not use FSD. This was a scary experience. I wanted to report it using a voice command but I got busy. You should have a recording of the event. Nobody was hurt and there was no vehicle damage; I just wanted to make you aware of the FSD behavior.
I would like to formally report several incidents I have experienced with my newly purchased Tesla Model Y. I acquired the vehicle on [xxx], collected it from Washington, D.C., and drove it back to Houston between September 20 and 21. Prior to my return journey, I conducted a test drive. During this drive, at dusk, I engaged Full Self-Driving (FSD) mode. The vehicle stopped appropriately at a traffic light; however, once the light turned green, it moved forward but veered into the yellow safety buffer zone located between the lanes of opposing traffic. I intervened by manually steering the car back into the correct lane. A similar occurrence happened in Washington, D.C., while attempting a left turn under a highway overpass; the vehicle again crossed the yellow lines. These incidents suggest that the FSD system did not reliably detect the yellow lane markers. On the weekend of [xxx], during my drive to Houston, I primarily used FSD. While navigating a road construction zone, the system failed to recognize large safety cones (yellow columns) and nearly collided with them. Fortunately, I was attentive and promptly took control, braked, and changed lanes to avoid an accident. This demonstrated the FSD's inability to identify these safety markers. On Tuesday, [xxx], I drove the vehicle to work and returned home using FSD. As I exited the highway onto a ramp merging with frontage lanes, I observed that FSD did not reduce speed appropriately and nearly made contact with vehicles on the frontage road. This indicates that the system was unable to interpret the ramp as a short section intended for deceleration and safe merging. On [xxx], while returning home in the evening after work, I used FSD due to light traffic. The vehicle navigated turns and stops satisfactorily and paused roadside before my house. When I resumed manual control to park, the vehicle suddenly became unresponsive, accelerated onto my front yard, struck the flower bed stones, information redacted pursuant to the Freedom of Information Act (FOIA), 5.
I bought a brand new Model Y, and in less than a week I received the two error warnings below and the airbag light came on: "Cabin occupancy radar obstructed" and "Front passenger safety restraint system issue." Also, to engage FSD, I have to double-tap 2-3 times in order for it to engage.
This happened after I installed the most recent software version. My wife was driving with FSD enabled when the car suddenly braked hard. A little while later, as it was supposed to make a left turn, it suddenly accelerated on its own, ignored a red light, and drove straight through the intersection. A warning light appeared on the screen notifying her that FSD was inoperable, but the car was already accelerating, so it was too late to intervene. As a result, the car crashed into another car. My wife, mom, and baby were in the car, and my wife was injured on her face. I have a recorded video, and Tesla should have the log for that moment.
The curvature assist function activates and applies the brakes automatically on straight and level sections of freeways with no observable obstacles or curves, requiring the driver to press the accelerator to override the braking.
Update to ODI 11678614. This problem applies to non-FSD (Full Self-Driving) Tesla Model Y cars. Tesla's manual describes Autopilot as Traffic-Aware Cruise Control; it's the first step on non-FSD cars. The second step on non-FSD cars, Autosteer, is active lane keeping. In older Model Ys, the first stalk pull engages Autopilot and the second pull engages Autosteer. The 2026 Juniper doesn't have a stalk; a single scroll wheel press engages both. I read the 300-page Juniper 2026 Model Y manual before driving it for the first time. On page 108, it lists 7 things that will cause Autopilot to disengage. I presumed that comprehensive list was complete. Unable to attach. Having Autopilot disengage when turning the wheel in Autosteer is not listed. During my 59k miles on my 2021 Model Y, I used Autosteer for at least 20k miles. Turning the wheel in Autosteer did not disengage Autopilot. In my first emergency situation in the 2026 Juniper Model Y, turning the wheel did disengage Autopilot, and the regen braking properly kicked in for the disengagement. This action came very close to causing a serious accident. My objections are: 1) the action of Autopilot (disengage / remain engaged) when the wheel is turned in Autosteer is not documented, and 2) it's different between the old and new Model Ys running the same current software. I understand there were problems with Autopilot remaining engaged, and perhaps the change was a good idea; that's an entirely different discussion. Make the action consistent and document it, and I'm happy. Summary: in heavy traffic, while on Autopilot / Autosteer (cruise + active lane keeping), the car beside me suddenly tried to pull into my lane. I made an emergency lane change by turning the wheel, but the cruise also disengaged (contrary to operation in the prior model) and went to maximum regen braking, causing the car behind me in the new lane to nearly rear-end me. Punching the accelerator during the surprise braking avoided the collision.
I've driven my old 2021 Tesla Model Y for 4 years / 59k miles. Four days ago, I got a new 2026 Tesla Model Y. The new Y has made a change to the interaction of the cruise control and automatic lane keeping. In the old one, first cruise was enabled, then lane keeping. If you disengaged lane keeping by overriding the steering wheel, cruise control was maintained. In the new Y, a single selection enables both cruise and lane keeping. The safety issue is that overriding lane keeping by turning the steering wheel disengages both features at once; the old method kept cruise control enabled. I was driving in heavy traffic today on a 6-lane interstate with cruise and lane keeping enabled. The car next to me abruptly swerved into my lane, causing me to make an emergency lane change to avoid a collision. When I overrode the lane keeping by turning the steering wheel to avoid the other car, my cruise control also disengaged (unlike the old system), which caused maximum regenerative braking to kick in, abruptly slowing me in the new lane. I had to punch the accelerator to avoid being rear-ended in the new lane. This change in lane keeping / cruise control applies to all new 2025/2026 Model Ys, the "Juniper" model update. This change, according to Reddit and Facebook forums, seems to be universally hated, and many other drivers have raised the same safety concerns. I agree with them, but didn't make a report until now because I had not yet experienced an issue where this almost contributed to a high-speed crash. Thank you.
While driving along a few roads using Traffic-Aware Cruise Control (TACC) or Autopilot (AP), the car will suddenly brake for no reason at all. Speed dropped from 75 to 50 mph in about 2 seconds. Had there been a car close behind me, we might have collided. I am able to reproduce this on several other roads as well. I have not yet reported this to the manufacturer, but a review of internet forums shows this to be a relatively common occurrence among owners. They refer to it as phantom braking.
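For scale, an editorial calculation (not part of the report; it assumes a uniform slowdown from 75 to 50 mph over 2 seconds) gives the deceleration implied by that description:

```python
# Rough deceleration implied by "75 to 50 mph in about 2 seconds"
# (assumes a uniform slowdown; the actual braking profile is unknown).
MPH_TO_M_PER_S = 0.44704
G = 9.81  # standard gravity, m/s^2

delta_v = (75 - 50) * MPH_TO_M_PER_S   # ~11.2 m/s
decel = delta_v / 2.0                  # ~5.6 m/s^2
print(f"deceleration ~{decel:.1f} m/s^2 (~{decel / G:.2f} g)")
# ~0.57 g is in the range of hard braking, applied with no hazard ahead.
```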
The incident occurred on [xxx] around [xxx] (from the dashcam footage) on a highway (I don't know the place, but I have a police report). I was driving my 2026 Tesla Model Y with Full Self-Driving (FSD v13.2.8) beta engaged when the vehicle suddenly veered off the road and hit the fence on the shoulder. I retrieved dashcam footage from the incident, including footage leading up to and during the accident. The FSD system malfunctioned; the car is in a body shop and is available for inspection upon request if done quickly. This poses a huge risk to myself and others because there is no way to know when FSD will fail. This link describes the same issue: [xxx]. The vehicle has not been inspected by others. There were no warnings; FSD had been working properly for the first 300 miles of the trip, but all of a sudden the vehicle swerved to the roadside at full speed. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).