294 problems related to adaptive cruise control have been reported for the 2022 Tesla Model Y. The most recently reported issues are listed below. Please also check out the statistics and reliability analysis of the 2022 Tesla Model Y based on all problems reported for the 2022 Model Y.
When using Tesla autopilot (adaptive cruise control), Tesla curvature assist will engage on straight roads in nice weather with great forward visibility. When curvature assist engages, the car brakes and slows dramatically. I have had drivers behind me think I was brake checking them or messing with them. This feature is dangerous, and there is no option to disable it.
Accident involving my Tesla while the full self-driving (fsd) feature was engaged. Vehicle information: [2022, model - y, ]. Incident summary: while driving with fsd engaged, the vehicle hydroplaned and the steering wheel started spinning uncontrollably; despite my attempts, I couldn't regain control of the car, and it spun while crossing lanes. A truck hit the front of the car, and the car ended up on the shoulder.
Tesla's fsd functionality has slowly been reduced over the last few years. First, Tesla disabled the double-pull function, which took away the standard cruise control function. Lately, the fully paid ($12,000) fsd is no longer recognizing street signs, which worked perfectly before. This function is crucial for managing speeds, lanes, pedestrians, and school zones, or simply for paying attention on public roads! The Tesla service center refuses to fix this issue or offer any additional solutions! This is very concerning, and vehicles with full fsd are not safe to operate on public roads anymore! Thank you.
None of the categories match my issue. On my 3 1/2 year old, 64,000-mile Tesla Model Y, the ecu, which controls full self-driving and the driver assist system, failed right after the warranty expired. The replacement cost is $2,800. Does it make sense that an ecu fault falls on the owner? It is only a 3 1/2 year old car. It is definitely a defective part, but Tesla service will not replace it for free. I am concerned about what happens if the ecu fails while I am driving.
Vehicle was stopped at a red light. On its own, it began driving, accelerating forward from a complete stop, ramming the car in front, and then continuing into an active intersection. Thankfully the steering was responsive, but the brake was not until a few seconds after the incident began. Miraculously no one was injured. We have Tesla's dashcam video of the incident but are unable to upload it onto this platform (it only takes photos).
"max speed offset" 6 months ago Tesla updated fsd to v 12 on hw3 vehicles. Although I think v 13 on hw 4 vehicles are also affected just not quite as much. Immediately driving through any 25 or 35 zones the cruise would actually set dangerously high. Complained to dealer. Claimed couldn't do anything as it was a software issues, would eventually get corrected. It did a little, happening less often, but still happened every time in small towns around me. When they changed it, the recommendation on the screen said go 40% over limit. I kept lowering my offset, no help. Lot of other people have been complaining on Tesla message boards. One person there said use 0% and that did work, but is annoying. I have included some pics. One showing the car set is self to 54 mph when I drove into a 35 zone. The car read the 35 as it is on the screen. One with message about why they changed. They want the car to keep up with "flow of traffic". Behind the pop up window you can see their recommended 40% over. Two pics messaging from Tesla. Despite setting appt to fix it they claim nothing they could do. When I asked why mine did it all the time in some areas, no response. Last pic, my spouse doesn't use fsd and the old system is still active for autopilot and work perfectly. They could easily switch back. That area is still there, but greyed out in fsd screen. But they want their aggressive driver "keep up with traffic flow".
On [xxx], I was traveling westbound from [xxx] on [xxx] using adaptive cruise control. A pick-up truck rapidly pulled up behind me, and the cruise control shut off, stated "unavailable," and the Tesla started to brake. I manually overrode it. I traveled another 1/4 mile and re-engaged cruise control after pulling away from the following pickup. The pickup caught up and the cruise control shut off again. At the 4-way stop on [xxx], I rechecked the system and then headed for [xxx] on adaptive cruise control. The pickup came up again and the cruise control shut off. At the next intersection I turned right and the pickup turned left. It had an "electronics" sign on the side of the truck; I was not able to get the full name. Question: are there electronic systems out there that can take control of Tesla adaptive cruise control? I tried to pass this on to Tesla, but they wouldn't or couldn't take the above message. Information redacted pursuant to the Freedom of Information Act (FOIA), 5 U.S.C. 552(b)(6).
I engaged Tesla autopilot (fsd) as I left my house to pick up my son. The car started accelerating, swerved sharply left, and ran over one of our neighbors' front yards, leaving damage to their yard as well as to my car. All 4 tires and wheels are damaged because it ran over the concrete curb, and the front bumper of the car is damaged.
Tesla's autopilot/full self-drive feature is reading minimum speed limit signs as the speed limit and slowing to 40 mph on 70 mph highways. This is happening to many Tesla owners, who are talking about it on social media platforms like Reddit. The cameras normally read speed limit signs, and the car either slows or accelerates to match the posted speed limit. With the most recent software update it is now reading minimum speed limit signs, and sometimes random highway signs such as GA 25, and slows to 25 mph. It was dangerous for me specifically when I was passing a slow-moving big rig this past weekend. As I passed the truck doing roughly 75 mph, the car saw a minimum speed limit sign and immediately slowed to 40 just as I was getting back into the right lane. That's downright dangerous.
Tesla will not fix this: Tesla pushed a spring software update which introduces risk to pedestrians and risk of accidents, including rear-ending. In the past, an image of the car was on the left side of the screen and a large map was on the right side. With this update, the map is removed from the right side of the screen when the car is not in gear; if the car is in gear, it remains on the screen. To set up an adas destination, it now works best if you have the car in gear. If the car is in park, you have only the full-screen car image and extra steps to get the large map. If you voice-command a destination, you may never see the map until driving on the road, and you should be driving at that point, not interacting with the screen. Accidents and pedestrians are the risk while using the screen. For instance, if you get in the car and issue a voice command for a destination with only one location, the tiny map image in the far upper right corner of the screen displays a route so small and completely without detail that it could be wrong, but you just assume it is correct because it takes extra user-interface interaction to make it large enough to tell what it is actually going to do. You put the car in reverse and back into the street, put the car in drive, and now you get the full-size map, route, and destination for the first time, and the route is wrong. When in park, while interacting with the map, you don't really know what will cause the map to go away and be replaced by the car image; it can happen if you change the destination entry. The map is removed from the screen every time you put the car in park. Now a big problem: if you are driving and decide maybe you should go to a different destination first, and you pull over and put the car in park, the map is removed from the screen again while the destination remains unchanged. You either fight the propensity to replace the map with the car image or don't put the car in park. To prevent the screen from replacing the map unpredictably, you have to keep the car in gear.
I took Tesla to arbitration over defects and deceptive practices, and their legal representative committed fraud by playing my own dashcam video and muting it so you couldn't hear the auditory warnings the car makes when it makes errors. In this case the car was driving itself into oncoming traffic and stopping on the highway for nonexistent stop signs. I have reported this to the Florida Bar and the office of Florida Attorney General Ashley Moody, which is now investigating.
Safety issue with Tesla's autopilot braking without cause in the middle of the freeway. Many times accidents were about to happen before I intervened and pressed the gas pedal. It seems like there are other complaints, and I am not sure why NHTSA is not doing anything about it. Is Tesla so powerful that the government does not want to disturb anything?
Phantom braking occurred when the adaptive cruise was engaged and a truck entered the road from the right side. The car abruptly slowed when the truck crossed my lane, although there was no imminent danger. The car behind us nearly collided with us due to the sudden braking. This is the third sudden braking incident; the others were related to a bridge or overpass.
The adaptive cruise control doesn't function correctly. I was on the freeway, where the speed limit was 70 mph, and all of a sudden the vehicle braked abruptly while in autopilot. I saw that it had registered the max speed as 40 mph. It almost caused a collision.
I've had enhanced autopilot since June 24, 2022. Since that time I've accumulated approximately 15,000 miles, with approximately 80% being autopilot miles. On May 14th my car was upgraded to "full self driving (supervised)" without my consent. I was given a 30-day fsd trial, with no way to opt out. I was presented with a software upgrade that looked like a normal upgrade; however, after the upgrade I was told I had fsd. I immediately disabled fsd, enabling only autopilot (highway only), and attempted a drive from Fairfield, CA to Concord, CA along I-680. During the drive, the car was not able to maintain lane control and would suddenly apply hard braking at random. On two occasions the car accelerated toward stopped traffic, and I had to disengage autopilot. I ended up manually driving the car, as it was unsafe to use the assistive systems. I contacted Tesla, who said they were unable to remove the fsd trial, and I was stuck with the upgrade even though it made the car unsafe to drive.
Autopilot began beeping with messages to pay attention while on the highway. Moving the wheel did not clear the message, to the point that the car was shaking in its lane from turning the wheel. The warning beeps and messages got louder and more distracting and would not go away until the brake pedal was used at highway speeds to disengage autopilot.
Incident report: Model Y autopilot anomaly. Incident details: date and time: April 18th, 2024, approximately 11:44 am PST; location: I-405 northbound; vehicle: 2022 Model Y; autopilot setting: full autopilot engaged at 65 miles per hour. Description: while driving my Model Y on I-405 northbound, I experienced a sudden and unexpected event. As I reached for a bag containing an apple, the car abruptly veered left toward the middle concrete divider that separates the southbound and northbound lanes. The impact resulted in a noticeable scratching sound as the car brushed against the divider. During this incident, I felt like I was wrestling with the autopilot system; it took approximately 7 seconds of back-and-forth struggle before I regained control of the vehicle. Fortunately, there were no other cars nearby, and both the car and I remained unharmed. The outcome felt like a true miracle. In summary, the autopilot system executed a sudden maneuver without any prior warning. The experience left me shaken, and I continue to have nightmares about what happened. I have saved 4 videos which recorded the accident.
Tesla has failed to provide a software update to address both recalls, and has made it impossible for me to get support to have this rectified. They cancelled a service appointment I made and are no longer responding to my inquiries.
In the Tesla full self-drive adaptive cruise control system, the set speed arbitrarily and abruptly drops to 40 mph on an interstate highway. In other words, if it is set to 70 mph, it will drop instantly to 40 mph for no reason.
I was using autopilot (adaptive cruise control) in the toll lane. The toll lane has stretches with double white lines that separate it from the other, regular freeway lanes; drivers are not supposed to cross those double white lines. In some parts of the toll lane, there are sections where drivers can merge into or out of the toll/carpool lane from or into the regular freeway lanes. As I was using autopilot in the toll/carpool lane, the car suddenly veered at high speed to merge into the regular freeway lane rather than staying straight in the toll lane. I had to take control of the wheel to keep the car from veering into the adjacent regular freeway lane.
Vehicle had a software update. After the software update, the car can see all speed limit signs while driving except for 65 mph signs. Traveling into a town, the car will register 45 mph, 35 mph, and 25 mph signs and update the max speed to reflect the speed limit; however, 65 mph signs are completely ignored, resulting in the max speed being the speed of the last sign passed. For example, leaving a town where the speed limit goes from 35 to 45 and then finally to 65, the car will only register the 45 sign, and thus the driver is unable to set the cruise or use autopilot at the correct speed. Contact was made with the service department, but I was told there was an inconsistency between what the camera is seeing and what is posted in the navigation maps package. It was submitted for review and will be fixed in the future, but until then the only way to use cruise control at 65 is to get off onto a side road where no speed limit sign is posted, turn around, and get back onto the highway. I have since had 2 more software updates to the car and it is still broken.
Phantom braking while driving with adaptive cruise control or on autopilot.
Autopilot will suddenly start to slow down at highway and freeway speeds with no warning; this has happened more than 4 times. The last time this happened, my partner and I almost got rear-ended.
While using cruise control at highway speeds, the car abruptly brakes for no reason. If another car had been following too closely, it would have rear-ended us. It happens at unexpected times and speeds. It has happened multiple times, and I am at the point of not using it because of the danger it causes. It has not been inspected by any authority because of its randomness. No warnings were given prior to or during the abrupt braking.
The contact owns a 2022 Tesla Model Y. The contact stated that while driving at approximately 70 mph with the adaptive cruise control activated, the vehicle decelerated while driving at night in the rain. The contact was concerned about being rear-ended. There were no warning lights illuminated. The contact used the app to communicate with an unknown dealer, who responded and informed the contact that there were no hard-wire issues. The contact took control of the vehicle and used the accelerator pedal to drive as needed. The failure might have been caused by false-positive conditions in the surrounding area. The mechanic informed the contact to reboot the vehicle. The vehicle was not diagnosed or repaired. The contact stated that after the computer was rebooted, the lane departure unavailable message was displayed. The manufacturer was not contacted. The failure mileage was approximately 23,000.
Due to NHTSA recall number 23V-838, the car has become more dangerous than before. It often makes me apply more force on the wheel while constantly alerting me, which has inadvertently made me swerve into oncoming lanes. Not only has this mandatory recall made the autopilot experience much worse, but it seems highly politicized by the previous leader of NHTSA and should be reinvestigated. This is such a failure of overregulation and biased leadership and should not be tolerated.
Ever since a recent update to my car, it has become far more difficult to keep the adas features active. It is incredibly intrusive; even just glancing at the screen to see what message it is showing me leads to the car threatening to cut off all driver assistance features. This is so dangerous. The car is a better driver on the highway than I am: it is better at holding speed, it never veers out of its lane, and it never misses someone attempting to impinge on my lane. I think something needs to be done to make this system more lenient. It used to be better, but something changed in late December. People in other cars seem to be able to stare at their phones with impunity while their car continues on without any issues. All I want is to be able to glance away from the road to see what the screen is trying to tell me.
To whom it may concern at the national highway traffic safety administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y’s autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. I urge the NHTSA to investigate this situation and cancel the recall.
Subject: safety concerns regarding recent Model Y autopilot update to whom it may concern at the national highway traffic safety administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y’s autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system’s activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system’s requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature’s intended purpose. I urge the NHTSA to investigate this and stop enforcing ridiculous changes to Tesla.
The contact owns a 2022 Tesla Model Y. The contact stated that while her husband was driving at 60 mph, they crashed into a large plastic object that was in the roadway. The contact stated that there was no warning from the forward collision avoidance system or any other safety system; the cruise control was engaged at the time of the crash. The contact stated that her husband swerved to the left to avoid the object, and the right front bumper, right corner of the hood, and headlight crashed into the plastic object and were broken and crushed. The contact stated no warning light was illuminated and the airbags had not deployed. The contact stated that there were no injuries and that the police were on scene and wrote a report. The vehicle was drivable, and the contact drove back to her residence with her husband. The contact had not taken the vehicle to a local dealer or repair shop. The vehicle was not repaired. The manufacturer was informed of the failure. The failure mileage was approximately 26,000.
"phantom breaking" issue. When the vehicle is placed in cruise control the vehicle just randomly slams on the breaks; this typically happens when on the freeway/highway at speeds of 55 mph or greater. My wife and I have experienced this extremely dangerous issue from day 1 of taking possession of our Tesla over a year ago. I've reported it to Tesla and also spoken to the repair technicians that work on the Tesla vehicles. They're aware of the issue and haven't done anything to help or even offered to fix the issue. The technicians advised me that they are fully aware of the issue, and Tesla has no fix at this time. This is totally unacceptable and extremely dangerous. We've nearly gotten into a couple of major accidents because of this uncorrected issue.
Date: 12/1/2023; time: approximately 5:20 am; location: intersection of Springfield Avenue and Summit Avenue, Summit. Vehicle 1: Tesla Model Y, in full self-driving (fsd) mode, eastbound on Springfield Avenue at approximately 30 mph. Vehicle 2: sedan (make and model unknown), southbound on Summit Avenue, turning right. Incident description: while operating a Tesla Model Y in full self-driving mode, I was traveling eastbound on Springfield Avenue in Summit at around 30 mph. The incident occurred at the intersection of Springfield Avenue and Summit Avenue. At this intersection, the traffic light for Springfield Avenue was flashing yellow, indicating caution, while the light for Summit Avenue was flashing red, signaling a stop requirement for vehicles on that road. As I approached the intersection, a sedan traveling southbound on Summit Avenue initiated a right turn onto Springfield Avenue. Despite the flashing red light for the sedan's direction, the Tesla's fsd system, which was controlling the vehicle at the time, appeared to misinterpret the right-of-way. The Tesla continued at the same speed without slowing, shifting lanes, or showing any indication of yielding to the turning sedan. Recognizing the impending risk of collision, the sedan driver executed a sharp right turn, narrowly avoiding an accident but coming close to the curb in the process. Realizing the Tesla's fsd system was not responding appropriately to the traffic situation, I quickly disengaged fsd mode and manually took control of the vehicle to avert a potential collision. Remarks: this incident highlights a concern regarding the current capabilities of the full self-driving system in complex traffic scenarios, particularly at intersections with flashing traffic signals. It underscores the necessity for continuous driver attention and readiness to assume control, even when fsd is engaged.
Driving on a clear day, with no traffic on an empty highway, the car suddenly stopped very hard for no reason. Phantom braking. A few minutes later it happened again, suddenly and violently stopping; it seems the cameras thought something was in front of the car when there was nothing. Like I said, it was an empty highway.
Full self driving (fsd), when engaged on the freeway, occasionally causes needless hard braking (phantom braking). This happens randomly and without warning. When it occurs, other road users and I are at increased risk of an accident and serious injury and/or death. Since there has been no accident, the vehicle has not been inspected.
When using cruise control, my Model Y will randomly slam on the brakes for no reason. The latest incident was on a two-lane highway (Hwy 395 in CA) at 1 pm, sunny with perfect visibility. The road condition was no problem, there were no visual obstacles anywhere that would confuse the cameras, and there was zero reason for the brakes to be slammed. Luckily there were no cars around me, especially behind me. This is a common occurrence and very dangerous. It could easily cause someone to rear-end me, as it happens in spots where it shouldn't and when you least expect it. This has been an issue people have been complaining about for a while now with no fix. Please help.
Related problem categories for the 2022 Tesla Model Y:
- Adaptive Cruise Control problems
- Automatic Emergency Braking problems
- Warnings problems
- Forward Collision Avoidance problems
- Automatic Emergency Steering problems