Tesla Model Y owners have reported 160 problems related to assist (under the lane departure category). The most recently reported issues are listed below. Also please check out the statistics and reliability analysis of Tesla Model Y based on all problems reported for the Model Y.
The vehicle's autopilot/full self-driving (hw3) computer has failed at approximately 70,000 miles. Tesla service has diagnosed the vehicle as requiring replacement of the hw3 computer. As a result of this failure, multiple safety related systems are inoperative, including collision avoidance, lane assist, lane departure warning, automatic wipers, and other driver assistance features including adaptive cruise control. The vehicle is also unable to receive over-the-air software updates, including potential safety updates. In addition, the vehicle experiences system lag, including delay when shifting into drive or reverse. The center display is slow to respond, and certain media and software features are no longer functional due to inability to update. This condition increases safety risk because active safety systems designed to assist with collision prevention are disabled. The failure appears to be hardware-related and not due to normal wear. Tesla has confirmed the need for computer replacement but has declined coverage as the vehicle is outside warranty. The failed component should be available for inspection upon request.
See all problems of the 2021 Tesla Model Y.
1. The front passenger-side wheel and auto-steering system malfunctioned; yes, it is available for inspection. 2. My car was not drivable and I was stuck in the leftmost lane of the highway with fast traffic passing by. I had to call 911 to move myself to safety. 3. Not yet. 4. The vehicle has been inspected by the police. 5. There were no warning lamps or messages prior to the failure.
See all problems of the 2023 Tesla Model Y.
Autopilot suddenly slams on the brakes going [xxx] on [xxx] just north of Ankeny, IA, when you hit the underpass at [xxx] [xxx], [xxx]). This has happened every time for about the past year. No software update has fixed this yet. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
The incident occurred while the vehicle was being operated with Tesla driver-assistance features enabled, including lane assist and cruise control. The driver experienced unexpected vehicle behavior that felt inconsistent with normal steering response. Although the driver maintained hands on the steering wheel, the vehicle appeared to react in a manner that was not anticipated, contributing to loss of control and a collision resulting in total vehicle loss. No bodily injury occurred. After the incident, the insurer assigned fault to the driver but declined to investigate any potential vehicle system or software-related issues, directing us to the manufacturer. When attempting to report a potential malfunction to Tesla, the manufacturer declined to open a report because the totaled vehicle had been removed from the Tesla app, preventing further system or telemetry review. This report is submitted to document the incident and the lack of access to any system-level review following the loss.
Navigation is frozen. No cameras work. No full self driving features work. No cruise control.
See all problems of the 2024 Tesla Model Y.
Computer system failure causing all cameras to fail, mirrors to stay darkened even in daylight, the navigation system not to update, and the collision and lane departure features to fail. The issues were reported to Tesla and an AP4 computer replacement was scheduled; however, service has been delayed for several weeks (now 9 Jan 2025). The car does not feel sufficiently safe to drive with the impaired mirror visibility coupled with the lack of any working cameras or safety features.
See all problems of the 2025 Tesla Model Y.
All cameras stopped working with the latest software update. I was told I need a new AP4 computer on a 2024 Model Y that is two months old. The Tesla service advisor states the car is totally safe to drive with just mirrors until the appointment in a few weeks. No backup camera, no navigation, and the battery is dying faster. It seems like a liability to me to drive this way, and very negligent.
Model Y s/w version 12.5.4.1. The Model Y can be made to oscillate in the travel lane at highway speeds. I experienced this at 70-75 mph, but it would seem to apply at any capable speed. Proper behavior: 1) driving on the highway in FSD; 2) trigger the left or right turn signal to initiate an automated lane change. What actually occurs: 1) driving on the highway in FSD; 2) hold the turn signal in the desired up or down position to initiate a lane change; 3) the Tesla begins to switch lanes, and will continue its lane change provided you continue to hold the turn signal (up/down); 4) if you release the turn signal input "early", that is, prior to the Tesla determining that the new lane is the current lane, it will immediately begin reverting to the original lane that you attempted to leave; 5) it then overcorrects, requiring maybe 5 damped oscillations to settle. It's very uncomfortable, and it exposes a vehicle on the edge of control; the vehicle is not traveling neutrally on its suspension. To be clear, the oscillation is damped, but I don't see how it could be described as stable at highway speeds.
Tesla FSD v12.5.4.1 tried to run a red light without stopping. Although FSD has a multitude of other issues when it comes to breaking traffic laws, running red lights shouldn't be one of them. Luckily I was able to override the system; no one was in the area to get hurt and there was no property damage.
When driving on highways the car will jerk the steering wheel and cause the car to turn when the car is in the center of the lane. This causes the driver to counter the force from the steering wheel to keep the car on the road. This only happens when autopilot and full self driving are not being used. I have taken it to the service center twice. The first time they said it was an issue with the car alignment so they realigned the tires and the second time they said it was a problem with the car’s computer so they replaced the computer. This continues to happen and is dangerous. Sometimes it will give a lane departure warning while other times there are no warnings.
On [xxx], south of Austin, north of [xxx], in both directions, there are many turn lanes on either side. These turn lanes fork from either of the two highway lanes and do not have clear lane markings (by design, not due to fading) until the lane reaches full width. Tesla Autopilot with lane assist will, at multiple points along this route, center the car between the current lane and the emerging turn lane. Once the lane markings begin, the car violently jerks back into a lane. Avoiding these situations requires a takeover. Along this route, this is particularly dangerous on banked sections, on hills, and in traffic (even with the maximum following distance configured) due to reduced visibility of the start of the lane markings. One example is at [xxx], although there are many others along this route in both directions. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
Tesla’s “autosteer” function inexplicably and suddenly shifted lanes and sideswiped the other car that was passing me on the right. This defect of the “autosteer” function took control of the steering of the car, even with my two hands on the wheel. I do not know why the Tesla suddenly shifted in the intersection or, more importantly, why it did not detect the other car. I did see the other car approaching before the Tesla swerved into it. I wanted to remain in the left lane because it was the correct and safest thing to do. There were no obstacles ahead of me, and the left lane continued normally.
See all problems of the 2022 Tesla Model Y.
I have Tesla's FSD software, which consistently tries to enter closed highway ramps where the flags, lights, and barriers are clearly marked "do not enter". I have reported this to Tesla a dozen times with no improvement or response. I'm sure there will be retaliation for this, but I am sick and tired of swerving to avoid a fatal collision. There are too many other safety issues with FSD to document.
See all problems of the 2020 Tesla Model Y.
What component or system failed or malfunctioned, and is it available for inspection upon request? The system computer. After an over-the-air (OTA) update, most driver-assistance functions are inoperative. The system feedback display sometimes goes black (speedometer, controls, backup camera, etc.). After the manufacturer serviced the car to verify and resolve the issue, the issue was not corrected. The screen continues to error, asking for a software update. Per the manufacturer, this is a computer issue with the cache, and I am being told to replace the computer. How was your safety or the safety of others put at risk? Since the driver feedback display and cameras are sometimes non-functional, the backup camera can be unusable, and lane keeping and cruise control are inoperative. Has the problem been reproduced or confirmed by a dealer or independent service center? Yes. Has the vehicle or component been inspected by the manufacturer, police, insurance representatives, or others? By a manufacturer repair center. Were there any warning lamps, messages, or other symptoms of the problem prior to the failure, and when did they first appear? No; they appeared after an OTA update on 9 Aug 2024.
We were on the freeway with cruise control on and the car slammed on the brakes. It automatically slowed 15mph while in front of another vehicle. The cause was that it hallucinated a person in the lane of the road, I could see the "person" on the screen of my car. However there was no person or hazard in the way.
Since the Autopilot recall earlier this year, I now have a car that is actually more dangerous, as it emits a very jarring audible warning when I engage the autosteer option. This has actually caused me to jump in my seat and turn the wheel, creating a dangerous situation. Not only is this a potential accident waiting to happen, it is also superfluous, as the driver is the one who pulls the turn signal stalk to engage Autopilot and the autosteer functionality. I do not need a very loud and startling tone advising me that the autosteer function is engaged. Please remove this, as it is creating a dangerous driving environment. I have friends who also have Teslas, and they confirm this hazardous condition exists in their cars as well. Thanks.
I was on [xxx] heading towards Chicago and following traffic. I started changing lanes to pass the vehicle in front of me and suddenly my Tesla completely lost control and started drifting to the sides. I released the accelerator right away, but that made it even worse. It seems that acceleration continued to happen, and that is what made the car drift. Eventually the car slowed down and I was able to make it back to my lane. I don't understand why lane assistance didn't work to keep the vehicle stable, or why, after I released the accelerator, the vehicle continued to gain speed. It was a sunny day and a dry road. There were no foreign objects on the highway. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
Tesla FSD consistently ignores temporary lane closures and will cheerfully try to drive over the barrier poles. The locale is the San Diego Pure Water project construction zones. They set up lane closures that change every week or so. They use a line of red poles along the middle of the road to block off a lane and direct both lanes of traffic into one lane. Sometimes there is also a large "closed" sign in the middle of the closed lane. The Tesla never sees the red poles and often does not see the closed sign. I attach several photos, each showing both the Tesla screen and reality in the same image. (Unless Tesla fixes this soon) I can send you lots more photos if you wish.
I took Tesla to arbitration over defects and deceptive practices, and their legal representative committed fraud by playing my own dashcam video muted so you couldn't hear the auditory warnings the car makes when it makes errors. In this case the car was driving itself into oncoming traffic and stopping on the highway for nonexistent stop signs. I have reported this to the Florida Bar and the office of Florida Attorney General Ashley Moody, which is now investigating.
Tesla has a bug in their basic Autopilot software that they refuse to fix and that I believe is causing a safety issue. They advertise Autopilot as standard on all Teslas, as an active safety feature that helps reduce driver stress and improve safety on the road. Here is the issue. Basic Autopilot has a feature built into it that limits the speed to 5 mph over the speed limit. This is a great idea! However, since a software update released early last summer, the car struggles to read the proper speed limit. For instance, many roads in my area are posted at 65 mph. The car doesn't read these and defaults the speed limit to 50 mph, meaning I can only use Autopilot at 55 mph max. When most people are driving 70 mph, this causes a safety issue. Tesla is aware of the problem. I have filed many bug reports and even contacted a service center about the issue. The service center claims to be aware of it and said they escalated the problem to the software team, but no fix has been issued in over a year. Either they just don't care about this issue and are too busy working on FSD and robotaxi to fix it, or they are purposely not fixing it to force people to subscribe to FSD, as FSD doesn't have this issue.
I've had Enhanced Autopilot since June 24, 2022. Since that time I've accumulated approximately 15,000 miles, with approximately 80% being Autopilot miles. On May 14th my car was upgraded to "Full Self-Driving (Supervised)" without my consent. I was given a 30-day FSD trial, with no way to opt out. I was presented with a software upgrade that looked like a normal upgrade; however, after the upgrade I was told I had FSD. I immediately disabled FSD, enabling only Autopilot (highway only), and attempted a drive from Fairfield, CA to Concord, CA along I-680. During the drive, the car was not able to maintain lane control and would suddenly apply hard braking randomly. On two occasions the car accelerated towards stopped traffic and I had to disengage Autopilot. I ended up manually driving the car, as it was unsafe to use the assistive systems. I contacted Tesla, who said they were unable to remove the FSD trial and I was stuck with the upgrade, even though it made the car unsafe to drive.
Two incidents. 1) Non-accident: my wife was slowing to make a right turn at a stop light and the steering locked up for about 3 seconds, during which she wasn't able to regain manual steering control. We did not have Full Self-Driving mode at the time. Luckily, she was only going about 10 mph and was able to brake before running into oncoming traffic. 2) Accident: our son was driving on a country road, along a section with a slight curve. We had a free trial of Full Self-Driving mode during the previous month, but it ended two weeks before the accident, so he could not have had it in self-driving mode. It appears that he didn't make the curve but, as evidenced by a single long tire skid mark, continued almost perfectly straight, across the oncoming lane, and ran into a redwood tree. The car rolled and ended up upside down, against the bank back on the right side of the road, with the lithium batteries on fire. Our son and his passenger sustained severe injuries, and neither has a memory of what caused the accident. Naturally we suspect the possibility that the steering locked up again. But it also seems likely that the AEB and ABS were not functioning correctly; if they were, there should have been several short skid marks from the tires on both sides of the car, rather than only a single long skid mark. Additionally, there's no shoulder on that road, and a clump of bushes comes very close to the white line, so it's possible that a steering auto-correct initiated the incident. We have not yet been able to recover EDR or log data from the vehicle.
On Sunday, May 12, I was driving to work when there was a moment where the steering wheel felt very stiff, to the point where it felt weird and hard to control. The two front wheels felt a bit odd, but then it went back to normal and I paid no attention to it afterward. As I was driving on Highway 820 West, my car started to automatically move to the left, and I tried steering the wheel to get back in my lane, but the steering wheel got so stiff that even using all my might I could not steer it. Eventually the car slammed against the cement wall three times as it spun. Afterward, the police came on site and saw that the two front wheels were pulled inward. I believe it was a software/technology malfunction.
The front triple camera glass was found to be dirty on the interior side of the glass. I had camera blocked/blinded alerts and also several instances of emergency corrective maneuvers, including braking, lane assists, and forward collision warnings. Tesla cleaned the interior side of the glass.
I used the Full Self-Driving software Tesla provided us in April. On April 15th, the car did not detect a curb, so the car went right over it, which popped both right-side tires and the front rim. The damage was $1,800. I brought this up to Tesla and they informed me that it was my fault for not taking over, and refused to help with the repairs. I would understand taking over if there were a 3-foot shoulder, but on roads with no shoulder, such as this one, I found it impossible to correct in time. Also, at the very last split second, the torque on the wheel was too tight to correct (and there was not enough time anyway). This is a busy state-maintained road in the middle of Columbia, Missouri, near a college campus, and my safety was put at risk while I had to find the next side street to pull over. This is dangerous software that is not ready yet.
Unwanted lane assistance in my Tesla Model Y made the car swerve to the side. When I tried correcting it, the steering got hard to control, and the whole car spun 360 degrees and hit the shoulder rail. The brakes didn't slow the car.
I am using Full Self-Driving (FSD). When I look at my screen to control the car's map, it gives me a strike. If I get 5 strikes, it removes FSD, but it also removes Autopilot, which has lane centering, and removes the camera, so if I were to have a stroke, tonic-clonic seizure, heart attack, or any other issue, the vehicle would not pull over like it would in FSD/Autopilot. This makes it very unsafe, and I could argue it violates the ADA! For privacy reasons I will just say I have had one of these previous issues and bought the Tesla with FSD just in case it ever happens again. Now I don't have that safety measure for a week, and I drive a lot on two-lane 75 mph highways. The basic lane assist will even shut off if I hug the lane! How dangerous is that? Now I have absolutely no safety features.
The new safety requirement for Tesla to update its autosteer to be more stringent and align with safety regulations is a complete disaster. This requirement results in multiple false alarms and creates more distractions than before the update. Furthermore, it's upsetting that this department has an apparent obsession with Tesla's technology. Rather than promote and work with Tesla to create a future where technology can meaningfully assist drivers on the road and reduce fatalities and car crashes, this department is doing its best to ensure that no progress is made. Lastly, it would be nice to see some actually meaningful and relevant statistics regarding Tesla Autopilot incidents. Rather than reporting a meaningless number of incidents with no context, try comparing the number of Tesla incidents to non-Tesla incidents, and analyzing whether there is a statistically significant difference between the two groups.
The NHTSA changes that forced the Tesla update as part of campaign #23V-838 do not make the vehicle's Autopilot system safer. Moreover, the car is actually now more unsafe and distracting, with disruptive beeping and alerts. The NHTSA should have Tesla reverse the changes to basic Autopilot in this update. Furthermore, the NHTSA should pivot its focus to other manufacturers' Autopilot-equivalent software features, as many of them are even less restrictive or even more dangerous. For example, Ford's BlueCruise can disengage without any warning to the user, and there have been no recalls relating to this.
Due to NHTSA recall number 23V-838, the car has become more dangerous than before, often making me apply more force on the wheel when it constantly alerts me, which has inadvertently made me swerve into oncoming lanes. Not only has this mandatory recall made the Autopilot experience much worse, but it seems highly politicized by the previous leader of NHTSA and should be reinvestigated. This is a failure of overregulation under biased leadership and should not be tolerated.
Ever since a recent update to my car, it has become far more difficult to keep the adas features active. It is incredibly intrusive, even just glancing at the screen to see what message it's telling me leads to the car threatening to cut off all driver assistance features. This is so dangerous. The car is a better driver on the highway than I am. It is better at holding speed, it never veers out of its lane, and it never misses someone attempting to impinge on my lane. I think something needs to be done to make this system more lenient. It used to be better but something changed in late December. People in other cars seem to be able to stare at their telephones with impunity while their car continues on without any issues. All I want is to be able to glance away from the road to see what the screen is trying to tell me.
ChatGPT. Subject: safety concerns regarding recent Tesla Autopilot changes (campaign #23V-838). Dear NHTSA, I am writing to express my concerns regarding the recent required changes to Tesla's Autopilot system under campaign #23V-838. While I understand the importance of updates for safety, it appears that these modifications have had an adverse impact on the vehicle's safety and increased the potential dangers of driving. I urge the NHTSA to thoroughly investigate and address these issues to ensure the continued safety of Tesla vehicles on the road. Your prompt attention to this matter is crucial for the well-being of Tesla drivers and the public at large.
I received the new OTA recall update. The driver monitoring is now too sensitive, and I cannot check cross traffic when approaching intersections without receiving alerts to pay attention to the road. The car then requires me to apply varying amounts of turning force to the steering wheel, which can sometimes cause accidental disengagement, which also disengages TACC and auto braking. This is dangerous.
After Tesla software update 2023.44.30.2, the rear backup camera no longer works; it's just a black screen. The failure first occurred on 12/20/2023. Also, Tesla Autopilot (cruise control and lane assist) no longer works. The support team could not fix either issue. I have a service appointment scheduled for 12/27/2023. Thank you and happy holidays.
To whom it may concern at the national highway traffic safety administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y’s autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. I urge the NHTSA to investigate this situation and cancel the recall.
| Problem Category | Number of Problems |
|---|---|
| Assist problems | 160 |
| Blind Spot Detection problems | |
| Warning problems | |
| Lane Keep Automatic Steering problems | |
| Lane Keep Steering Assist problems | |
| Lane Departure problems | |