Tesla Model 3 owners have reported 1,051 problems related to adaptive cruise control (under the forward collision avoidance category). The most recently reported issues are listed below. Also see the statistics and reliability analysis of the Tesla Model 3, based on all problems reported for the Model 3.
Tesla's Full Self-Driving did not react to an emergency vehicle approaching from behind. The vehicle was a police cruiser; its emergency lights and sirens were active on the freeway.
See all problems of the 2022 Tesla Model 3.
The Tesla ran a red light on two separate occasions while in Full Self-Driving mode. The first time, it went through without me being able to intervene. Given it had done it before, the second time I was able to stop it. Both times occurred at the intersection of [xxx] and [xxx]. It has not done it at other intersections. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
See all problems of the 2023 Tesla Model 3.
Main incident:
Time: 2025-08-28, ~12:41 pm PST.
What happened: After searching for a parking space, the driver was preparing to reverse into the spot and pressed up on the stalk to shift into reverse. A sound was heard and something showed up on the screen, and the vehicle once again started accelerating forward suddenly and sharply (3 mph to 28 mph in 3 seconds). The driver tried to steer the vehicle away from cars and buildings to minimize danger; the car rushed onto a concrete curb, hit one blue Tesla, then hit the side of a concrete utility pole, ran over a plant, and finally crashed into a chain-link fence before it stopped.
Injuries: spine pain for one passenger.
Damages: all tires blown out, with one tire falling off; severe damage throughout the vehicle body; one Tesla parked in the lot was severely damaged; the utility pole's base box was scraped and deformed; one plant was knocked down.
Similar case a few minutes earlier:
Time: 2025-08-28, ~12:38 pm PST.
What happened: The driver entered a one-way parking lot, found no available spot, and was trying to back up when the vehicle suddenly accelerated on its own. The driver performed an emergency brake and stopped the vehicle. When it stopped, it was only about 20 inches away from the two cars in front of it.
Injuries and damages: none.
What component or system failed or malfunctioned, and is it available for inspection upon request? Malfunctioning of the electronic control or braking system, leading to abnormal acceleration. Conflicting gear-shifting and Autopilot engagement design: the right stalk is used both to shift gears and to engage Autopilot features, a design that makes it very easy for the system to confuse the two actions. Autopilot feature safety issue: Autopilot features (whether FSD or cruise control) should not be allowed to initiate abrupt acceleration to a dangerous speed in a low-speed area such as a parking lot. Insurance claim filed, number: 25-647243342.
See all problems of the 2018 Tesla Model 3.
I want to add an update to my previous report #11690629. I think it would be helpful to provide what the max speed should be based on my input, so I will resubmit the previous report with some additional information. On December 31, 2024, I purchased a new Tesla Model 3 Long Range with Full Self-Driving (FSD). The problem is that the car goes faster than it should when Full Self-Driving is activated. I have attached an example photo of the Tesla screen taken while the car is being controlled by FSD. It shows the actual speed limit of the road is 35 mph, and the car is traveling at a steady speed of 51 mph. It also shows a "max speed" of 64 mph, which means the car may travel that fast on this road if it chooses to. Since I use an offset of 10%, the max speed in this 35 mph zone should be 39 mph. However, the car increased it to 64 mph, and I certainly did not use the right scroll wheel to increase it. This is clearly a safety issue which can lead to accidents. Tesla has made 5 attempts to fix this without success. On the last attempt (September 24, 2025), they refused to answer my questions, such as why the max speed suddenly changes to a value much higher than my specification; they simply stated that the system was operating as designed. Really? It's designed to drive 51 mph in a 35 mph zone? Furthermore, they went on to say I should disengage the FSD system or intervene manually if I believe the car is operating in an unsafe manner. One reason I purchased this car was for FSD. I would expect them to fix it rather than my having to abandon a feature that I paid for. I hope you will encourage Tesla to fix this issue before there are any more FSD-related accidents.
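The complainant's arithmetic can be sketched as follows. This is only an illustration of the behavior the driver expected from a percentage-based speed offset; the function name and the offset interpretation are assumptions for the example, not Tesla's actual implementation.

```python
def expected_max_speed(posted_limit_mph: float, offset_pct: float) -> float:
    """Max set speed implied by a percentage offset over the posted limit
    (assumed behavior for illustration)."""
    return posted_limit_mph * (1 + offset_pct / 100)

# Values from the complaint: 35 mph posted limit, 10% driver-configured offset.
expected = expected_max_speed(35, 10)
print(f"Expected max speed: {expected:.1f} mph")  # 38.5 mph, ~39 as reported
print(f"Reported max speed: 64 mph, excess: {64 - expected:.1f} mph")
```

Under this reading, the displayed 64 mph max exceeds the driver-specified ceiling by roughly 25 mph, which is the discrepancy the report describes.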
See all problems of the 2025 Tesla Model 3.
On December 31, 2024, I purchased a new Tesla Model 3 Long Range with Full Self-Driving (FSD). The problem is that the car goes faster than it should when Full Self-Driving is activated. I have attached an example photo of the Tesla screen taken while the car is being controlled by FSD. It shows that the speed limit of the road is 35 mph, and the car is traveling at a steady speed of 51 mph. It also shows a "max speed" of 64 mph, which means the car may travel that fast on this road if it chooses to. This is clearly a safety issue which can lead to accidents. Tesla has made 5 attempts to fix this without success. On the last attempt (September 24, 2025), they refused to answer my questions, such as why the max speed suddenly changes to a value much higher than my specification; they simply stated that the system was operating as designed. Really? It's designed to drive 51 mph in a 35 mph zone? Furthermore, they went on to say I should disengage the FSD system or intervene manually if I believe the car is operating in an unsafe manner. One reason I purchased this car was for FSD. I would expect them to fix it rather than my having to abandon a feature that I paid for. I hope you will encourage Tesla to fix this issue before there are any more FSD-related accidents.
Component/system involved: advanced driver assistance systems, FSD (Supervised)/Autosteer (lane keeping assistance) and adaptive cruise control. Possible failure to warn/stop (FCW/AEB). Vehicle and data are available for inspection upon request. I preserved dashcam files and requested that Tesla preserve engineering logs and EDR data.
What happened & safety risk: On Aug 13, 2025 at ~9:30 am PDT on US-101 SB near East Palo Alto, CA, with FSD (Supervised) engaged, the system appeared to misinterpret an exit ramp/gore area as a continuing lane at a highway fork and maintained ~60 mph (posted ~70). As soon as I saw the trajectory was unsafe, I braked and began manual steering takeover, but the vehicle contacted a roadside sign near the gore/shoulder before I could complete the maneuver. I then stopped safely. No other vehicles were struck. Airbags did not deploy. This posed a serious collision risk to me, my passenger, and nearby traffic.
Reproduction/confirmation: Not reproduced. I have not attempted to reproduce the event. Unknown whether the issue has been reproduced by Tesla or a service center yet.
Inspection to date: Police responded (report pending). My insurer has opened a claim. I opened a Tesla service request asking to preserve Autopilot/FSD engineering logs for the incident window and to coordinate EDR extraction; engineering review pending. Vehicle remains drivable.
Warnings or symptoms before failure: No audible/visual forward-collision warning was perceived by me, and I did not observe automatic emergency braking. No prior warning lamps/messages were noticed before the departure toward the gore. Unknown whether any internal/partial interventions were recorded in logs.
See all problems of the 2021 Tesla Model 3.
Too fast and too close when approaching stop signs, red lights, or other cars while in Full Self-Driving (Supervised). I had to disable it to avoid the risk of an accident, even when using Chill mode.
Had ACC set at 80 mph on the highway yesterday. A vehicle was passing me and the Tesla suddenly applied heavy brake pressure to the point of tires screeching. Speed dropped very quickly to about 50 mph. I had not touched the brake pedal. The brakes released and speed resumed to 80 mph. Very scary, with a high risk of rear-end collision. Lucky nobody was behind me. I don't want to experience that again.
Autopilot (adaptive cruise control) will freak out and slam on the brakes when a motorcycle passes by splitting lanes. This is a serious safety issue, as cars and motorcycles behind me may not be able to react in time, causing a rear-end collision or sending them into the wall or the vehicles next to them. It is impossible to override the braking with the accelerator, as it will be too late by the time the car accelerates. Flooring the accelerator is also not a safe option.
My Tesla Model 3 disengaged cruise control while going 68 mph and braked hard for no apparent reason. I was driving northbound on I-405 at 3:45 a.m., going to LAX airport, in lane 3 or 4 of six or seven lanes. Cruise control had been activated for at least 10 minutes before the incident at a speed of 68 mph. There was no traffic within half a mile ahead of or behind me due to the time of day. Neither my left nor my right foot was near the brake or accelerator pedal, nor were they in motion. Both hands were on the steering wheel and no control was being activated. The only noticeable environmental condition is that the road surface had just changed from dark gray asphalt to bright white new concrete. No emergency braking alert was seen or heard, so I do not think it was a false collision detection, and there was no car within half a mile. However, I do believe the severity of the braking could have caused an accident if there had been someone behind me. The Tesla Model 3 software version was v12 (2025.20.6 046c4575d120).
While driving on Autopilot on I-10 W on Wednesday, July 16, 2025, the car suddenly and very harshly braked for no reason at all. The vehicle in front of me was visible on my vehicle's display, meaning the system was keeping its distance and actively adapting to its speed, and at no time did I press the brake or move the steering wheel in a way that would trigger emergency braking. It was so harsh I felt the blood throughout my head and body lunge forward, and Autopilot immediately turned off. If I hadn't been aware of this sudden braking, I would have assumed Autopilot was still active; but knowing of Tesla's "phantom braking," I immediately took control once I noticed Autopilot had turned off on its own. I have dashcam video available (over 10 MB) if needed.
Description of problem: While driving under normal highway conditions on July 2, 2025, my Tesla suddenly and violently phantom braked, rapidly and without cause applying the brakes forcefully. There were no obstacles, vehicles, or hazards present. The sudden deceleration caused a severe flare-up at the site of a recent cervical disc replacement surgery I had undergone. I experienced extreme neck and spinal pain and had to seek emergency medical attention. I am now under medical supervision, on pain medication, and unable to work. I contacted Tesla requesting the logs from the incident and have received no response. I'm deeply concerned about the safety of this vehicle and the lack of accountability for this known and previously investigated issue. I am in the process of retaining legal counsel and intend to pursue this further as a product liability and injury case.
I am filing an urgent complaint regarding multiple incidents involving my 2024 Tesla Model 3 (VIN: [xxx]) that caused major safety risks, personal property loss, and serious disruption to my life.
[xxx]: While in a left-turn lane with Autopilot engaged, the car attempted to go straight. When I took over, the system malfunctioned severely. (Photo evidence available.)
[xxx]: Remote support failed; car towed to the Tesla Upland service center.
June 27, 2025: Issued loaner vehicle.
July 1, 2025: Tesla claimed I must pay for repairs. I requested a written evaluation report, but to date it has not been provided.
July 20, 2025: Without consent, Tesla remotely locked my vehicle with my personal belongings inside, violating my property rights.
Aug 1, 2025: Tesla promised review and a report but refused to unlock my car.
Aug 12, 2025: After 10+ days, still no report. Upland "options" included threats that my locked car in my garage could be deemed stolen, causing severe mental distress.
Demands: 1. Full investigation into the Autopilot failure and handling delays. 2. Immediate delivery of the written evaluation report. 3. Accountability for delays, threats, and property rights violations. 4. Compensation for all losses and disruptions caused.
Contact: [xxx]. I request a written response within 24 hours. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
See all problems of the 2024 Tesla Model 3.
I was traveling northbound on [xxx] in Sugar Grove, IL with Traffic-Aware Cruise Control on. As I passed under [xxx], the car braked heavily for no reason. No cars in the area, thankfully. I could easily have been rear-ended. It has done this on more than one occasion at this location. I filed a "bug report" via the right-hand button on the steering wheel. I was also traveling on [xxx] near LaSalle-Peru with cruise control on at about 80 mph when the vehicle suddenly braked heavily. No cars around at all. The moment was extremely frightening. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The curvature assist feature aggressively brakes while driving at highway speeds. It is supposed to slow down for curves, but frequently intervenes on straight, clearly marked lanes. The feature cannot be turned off. Drivers behind me have gotten agitated several times by the sudden, harsh braking, and I do not feel comfortable driving this car.
My vehicle is equipped with Supervised FSD and has had a couple of issues at the intersection of [xxx] and [xxx] in Portsmouth, Virginia. It seems every time I stop at the light, the system recognizes the red light and stops, but shortly after it wants to keep advancing forward even though the light is still red and has not changed. The vehicle has also had a problem staying in lane at the light at the intersection of [xxx] and [xxx] in Suffolk as it gets ready to turn left prior to entering the entrance road to [xxx]. The image on the screen is correct and the lines are in place, but the vehicle has trouble staying within the lines and needed to be corrected. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
I was traveling at highway speeds on a highway with no vehicles or other obstacles in front of me. I had adaptive cruise control on. Then the emergency brakes deployed for no reason (so-called "phantom braking"), decelerating the car quickly. I turned off adaptive cruise control as fast as I could and nothing bad happened. But the situation was dangerous: if a car had been close behind me, it could have rear-ended my vehicle.
For the last few weeks while driving, my car will automatically put itself into FSD without my authorization. Meaning, I will be driving and the car will just put itself into FSD. This has happened about 15-20 times over the last 2-3 weeks. Tesla wants to clean the front camera, which makes no sense, and charge me for it. If the front camera were obstructed, as in the case of fog, the car would alert me that it could not operate due to the obstruction, which isn't the case.
See all problems of the 2019 Tesla Model 3.
Phantom braking (as it's referred to by experts) happened to me twice on the highway using Traffic-Aware Cruise Control. Very scary. I was on cruise control going 60 mph when my car braked suddenly and inappropriately, losing at least 20 mph in a split second before I was able to slam on the accelerator and prevent an accident (rear-end collision). This happened twice in the course of 5 minutes on the same highway! There was no obstruction in the road, no cars moving into my lane, nothing but normal road conditions when it occurred. From my research, this is a huge problem with Tesla vehicles that thousands of individuals are loudly complaining about online, and now in court cases across the country. I see there are thousands of reports to your organization about this, but it says you have complaints only from 2021 and later model years. However, I have a 2019 and it's happening to me. And it's happening to every single model year out there, just from researching and talking to Tesla owners. This is a major safety hazard, plain and simple. I was inches from being in a rear-end collision, and drivers behind me are extremely upset and vengeful on the road after an incident like this (understandably!). Because of this, I fear for my safety and well-being on the road using Tesla's cruise control features. Very disappointing. Above, for the ADAS option, I wasn't able to select 2 options, but "automatic emergency braking" is also applicable here, as that is what happened. Somehow Teslas are falsely activating emergency braking during cruise control!
I have a safety concern regarding Tesla's Full Self-Driving (FSD) Supervised system, v13.2.8 being the latest customer version. Since the update to FSD v13 in December, I have used it consistently on a daily basis. While it performs great on city streets, it consistently causes a serious safety risk on the freeway: it habitually tailgates cars at high speeds of 70-80 mph. It does this even when there is no traffic and the other lanes are wide open. I have experienced this on both a Model 3 (2025) and a Model Y (2024). I am writing this complaint here hoping you will pay attention to it, as despite my best efforts, Tesla has ignored my feedback. I even opened a service ticket about it at one point, and the service tech couldn't have cared less. Not maintaining proper distance to the cars in front on the freeway puts the passengers at high risk of a rear-end collision. If Tesla is to launch an unsupervised version of their FSD system, this problem must be addressed first!
Vehicle: 2024 Tesla Model 3. Date of incident: [insert date]. Location: intersection of [xxx]. Operating mode: Tesla Autopilot (engaged).
On the above date, while driving with Tesla Autopilot engaged, the vehicle initiated an unexpected and improper right turn at the intersection of [xxx]. This maneuver was not indicated in the planned navigation route, nor was it consistent with the legal road markings or signage in the area. The vehicle turned into a clearly marked "do not enter" and "wrong way" zone, without any driver input or confirmation. I attempted to intervene, but the turn happened abruptly and could not be safely overridden in time. This resulted in a collision, vehicle damage, and physical trauma to both occupants, including nausea and vomiting in my wife immediately after impact. Dashcam footage from the right repeater camera confirms the presence of multiple "do not enter" signs visible at the point of the turn. We have requested the Tesla event data recorder (EDR) logs to confirm that Autopilot was active and that steering input came from the system, not the driver. I am reporting this as a serious safety concern related to Tesla's Autopilot system and its decision-making at intersections. If this had occurred in a busier traffic environment, the result could have been far worse. This type of navigational or perception failure should be investigated immediately. Both occupants are seeking medical treatment and continuing to deal with the emotional and physical aftereffects of this incident. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
While driving from Costco in Fountain Valley to Best Buy in Costa Mesa, I was using FSD for the entire route. Everything was running smoothly until we approached South Coast Drive near the I-405 interchange (Costa Mesa, CA). At that point, the FSD system incorrectly directed the vehicle to enter a freeway off-ramp, effectively driving into an I-405 freeway exit ramp in the wrong direction. This created a highly dangerous situation, as the ramp was intended for vehicles exiting the freeway, not entering. Fortunately, I was able to take over control of the vehicle and prevent a potential accident, but this was a very close call.
Incident details:
• Date: March 21, 2025
• Time: between 2:00 and 3:00 pm
• Location: South Coast Dr near the I-405 on/off-ramps (Costa Mesa, CA)
• Route: Costco Fountain Valley to Best Buy Costa Mesa
• Navigation mode: Full Self-Driving (Beta) enabled throughout
• Conditions: daylight, dry roads, moderate traffic
What happened:
• FSD turned the car onto a road segment that functions as a freeway exit.
• The vehicle entered the lane facing oncoming traffic exiting from I-405.
• I immediately disengaged FSD and took manual control to correct the situation.
Multiple instances of phantom braking: the car will suddenly slam on the brakes without any indication of danger or anything to avoid. This has happened at least a dozen times, and we are fearful that it will cause an accident. The vehicle will also alert for lane departure and attempt to automatically adjust, turning toward another lane while in the appropriate lane. Multiple incidents, at least 30 in 6 months of ownership. No collisions as of yet, but we are concerned there could be.
Tesla has told me the AP4 computer has failed in my car, which has resulted in them having to replace it. This issue impacts emergency braking, blind-spot detection, lane keeping/centering, parking assistance, automatic high beams and wipers, and cruise control/adaptive cruise control. It causes the vehicle's built-in GPS to be stuck at one location every time, has made my touchscreen not automatically brighten or dim, makes my auto-dimming mirrors too dim in most driving conditions, and causes constant failures to install system improvements and safety updates; because my vehicle keeps trying to install them, it has caused increased battery drain as well. This has increased the safety risk of my vehicle tremendously. Tesla has not seemed to publicly acknowledge the issue yet, either.
After installing Tesla OTA software update 2024.45.32.15, my vehicle experienced critical system failures. Upon entering the car the next morning, I noticed that all cameras were non-functional and navigation had completely stopped working. This posed a serious safety concern, as features like Autopilot, parking assistance, and basic visibility enhancements (such as rearview camera functionality) were unavailable.
Full Self-Driving is very dangerous. I had several events that put me in danger: going into the wrong lane, going the wrong way, and changing lanes so that it almost hit a car; good thing I took over right away. It made the other driver upset, and they honked at me.
My Model 3 failed at under 50,000 miles and gave out a series of failure warnings. It was unable to drive anymore and steering became stiff and unresponsive. In researching this I see it is a common issue with hundreds of thousands of cars affected. And the NHTSA has an open investigation into it. Tesla as usual is denying the problem exists.
See all problems of the 2020 Tesla Model 3.
My Tesla has Full Self-Driving (Supervised). When I use this feature, since about three updates ago, my car has started braking for no reason. It hasn't happened on a freeway, but on secondary roads it will brake and try to come to a stop. It seems to do this when there is a shadow across the road, or when there is a puddle in or near the road. Sometimes it will brake for no reason at all. I have come close to being hit from behind. I think this is a dangerous situation that needs to be looked into.
I was driving east on [xxx] in Bourne, MA using the Tesla adaptive cruise control system called "Autopilot" in very light traffic conditions. The sun was low in the sky (behind and to the right). The system automatically engaged the brakes while crossing the bridge without obvious reason, decreasing the car's speed by about 10 mph in less than a second before I overrode it by pressing the accelerator. I believe the car's vision system mistook a shadow cast onto the road by the vertical girders on the bridge for an obstacle. The sudden deceleration was a hazard to those behind who had to brake, but there was no collision or loss of control. The manufacturer seems to acknowledge and accept that the defect exists, but has not been able to resolve the problem. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The car consistently has phantom braking. The speed will drop by around 15 mph immediately. It will also slow on rolling hills and random long turns on cruise control. Most recent date listed.
On 12/26/2024, around 5-6 pm, Full Self-Driving abruptly tried to accelerate into the intersection while sitting at a stop light. I stopped it immediately but was very surprised, as FSD v13 has been exceedingly reliable and a pleasure to drive. This is the 2nd time this has happened since the holiday update to FSD 2024.45.25.5 (v13.2.2). Both times, mine was the lead car at the light with no others ahead of me. This behavior has been reported multiple times in social media groups this past week by other FSD users. I entered a Tesla service request but have not heard back, other than an automated appointment scheduled at the service center 8 weeks from my request and the incident.
On December 24, 2024, I was driving on Freeway 5 south when all of a sudden my vehicle braked abruptly without any reason. The traffic ahead of me was normal. This abrupt braking happened twice within a 5-minute span. I notified the Tesla service center, and they gave me a quote of $137.50 for diagnostics and to schedule service. I feel this issue should be taken care of by Tesla at no charge to me. This is a safety issue. I have experienced abrupt braking with my vehicle on many occasions since I bought the vehicle in 2018. Each time, I was told there's no problem with the vehicle when I notified the Tesla service center.
Two major safety issues need to be reported: 1) The left front lower control arm fell off in the middle of driving, leaving a car wheel stuck on a traffic street (luckily it was off the freeway, which I had exited 5 minutes earlier). The long bolts connecting the lower control arms came loose due to the defective design (the bolts are screwed in upward, vertically, so falling off is just a matter of time). I found there was a class action against Tesla due to this problem (it referred to the Model X and S). 2) The other major safety issue I encountered: with the "advanced Autopilot" feature turned on, the car could brake suddenly without warning on the freeway. There were no cars or obstacles in front. I researched online and found many Tesla owners have experienced this so-called "phantom braking" problem as I have (I have experienced this unsafe automatic braking randomly for as long as I have driven this car).
Hi. My issue is partly related to case PE-22-002 and may also contribute to it. I will forward my dashcam videos (the false detection of cars in front of me for 63.5% of one drive is included) to your email as support for the investigations. I have been noticing warnings that a car is in the lane I am changing into when there is no car present. This has happened several times, despite the camera showing no cars and my own checks showing no cars. The first time it happened, it scared me so badly I jerked back into my lane and could have lost control of staying safely in the lane. I filed a service request and was told it would cost me an estimated $195.00 to have them look at it, but they said they think it is software related and tried to say they don't deal with software issues. The car has also decelerated for no cause on many occasions when using Full Self-Driving, Autopilot, and cruise control. It would cause me to have to disengage them, as I felt it was affecting my distance efficiency. I found a new issue last night that caused concern. We have been getting frequent reports of following too closely, so I drove from Plainview, TX to Amarillo, TX in complete control myself, driving 100% perfectly, and the system claimed 63.5% of that drive was "following too closely," when in fact any time a vehicle was in front of me by (I think) 30 car lengths, I changed into the other lane, where there were no vehicles at all. It is my personal opinion that it is a combination of the cameras not being infrared and not seeing well enough in low-light situations, and the software being unable to tell the difference between darkness/shadows and objects that should be avoided. These false alerts occur more for me in low-light conditions, but the unneeded deceleration has occurred in all conditions.
The car randomly brakes or swerves when using cruise control, when there is nothing ahead of me. I have called Tesla and been told it is called "phantom braking" and that it is normal. It happens multiple times a day and almost causes collisions. Very dangerous.