111 problems related to adaptive cruise control have been reported for the 2019 Tesla Model 3. The most recently reported issues are listed below.
This morning, I was driving using FSD Beta and the car swerved and veered almost head-on into the curb. I captured the incident with the car's built-in dashcam and saved the footage, though I can't upload it to this site. I was able to slam on the brakes at the last minute, and luckily no one else was around, as it could easily have injured someone and/or damaged my vehicle.
Since the radar was disabled, the car very frequently brake-checks on the highway when there are no cars in front of it, apparently due to reflections off very flat highway surfaces. This happens at least once every hour or two of driving.
The car has repeatedly slammed on the brakes while using Tesla's full self-driving functionality, making it unusable at times. I've had many near accidents, the most recent on 1/8/23, when the vehicle slammed on the brakes on the freeway in front of a big rig, which almost rear-ended me while my wife and two children were in the car. The vehicle has also nearly hit pedestrians while using FSD Beta, which I managed to avoid on 1/8/23 only because I was paying attention. Had I not been, I'm sure we would have struck a pedestrian walking across the street.
On 12/24, between 7:54 am and 8:55 am, while driving on the 10 West from Arizona to California with auto drive engaged, the car applied the brakes on its own four times within one hour while traveling 80 mph on the highway. All four times there was nothing in front of or beside the vehicle, and the car gave no sensor alert that something was in the way. Fortunately we reacted quickly and pressed the accelerator, so we didn't stop abruptly on the highway. Two of the four times it was a very hard brake; the other two were hard but not nearly as forceful. Again, there was nothing in the road, the sun was behind us, and there were no indications of obstructed cameras. We hadn't been on a long trip with the car in a long time, but my son drove from Orange County, CA to Los Angeles one evening about six weeks ago and said the same thing occurred three times on the highway during that trip. I googled the situation, and it sounds like I am not the only one who has experienced this phantom braking. We disengaged auto drive and Autopilot for the remainder of the trip.
Phantom braking occurred regularly on a road trip from CA to LA. It has happened on clear roads in sunny conditions with no obstructions, as well as in shadowy or dark conditions. It only occurs with cruise control on.
On a two-lane highway, I was in the left lane with cruise control on (not full self-driving). In the right lane was a semi, which I came up alongside on the left while driving 70 mph (the speed limit). The car suddenly hard-braked on its own. The car behind me had to brake to avoid an accident and went into the dirt. Later on the same drive, there were at least six more instances of phantom braking: traveling over a wash bridge, going under a highway sign, driving alongside another semi, and once on an open road with no one near. This also happened 3 months ago, and when I reported it to Tesla they said the problem should be corrected by the full self-driving download, which I received the next day. I also reported the previous problem to you, case # 11488958.
Was using autopilot feature on the vehicle on thanksgiving. Was going about 75 miles per hour when suddenly autopilot failed and the car lost control. I had to quickly regain control of the vehicle before I crashed. I took photos of the error messages and reported the incident to Tesla via the Tesla app. Typically autopilot fails gracefully and provides notifications. I’ve never seen it fail with no warning and loss of control of the vehicle.
While freeway driving with Autopilot (adaptive cruise control) engaged, the car will suddenly brake very hard when there is no reason to. This is extremely dangerous when traveling at 70 mph with someone behind me. This is a known issue (phantom braking) that has been a problem for years and is not resolved.
"Full self-driving" (FSD) 10.69.3.1 disconnect events lock out the driver after five 'strikes.' This makes it impossible to diagnose repeatable faults until a future version unlocks FSD. My testing suggests there are problems with making a left turn at controlled intersections; however, the lockout prevents documenting the problem(s). Locking out the software also makes it impossible for NHTSA, IIHS, and even the National Transportation Safety Board to identify and document failures with FSD. It also impacts car reviewers as well as this owner.
The contact owns a 2019 Tesla Model 3. The contact stated that while driving at various speeds with the autonomous self driving or adaptive cruise control feature activated, the vehicle experienced phantom braking, causing the vehicle to abruptly stop in the middle of the roadway. The contact stated that the failure had occurred on several occasions without any other vehicles or objects nearby. The cause of the failure was not yet determined. The local dealer and manufacturer were not yet notified of the failure. The failure mileage was 28,000.
While the vehicle has adaptive cruise control engaged, there are times it applies braking (sometimes aggressive braking) for no reason, commonly referred to as phantom braking. It is almost as if the vehicle thinks there is an object in front of it and applies emergency braking, although there is nothing on the road. In my experience, it happens mostly on the highway at speeds of 60-80 mph, on clear days with no weather concerns. This is a severe issue that will cause accidents (if people are following closely behind me, there is a risk of them hitting me). It is scary and concerning; this needs to be immediately addressed and corrected.
On a recent cross-country trip in our Tesla, we experienced "phantom braking" a number of times when we had the adaptive cruise control engaged. We were not using more advanced features on any of these occasions.
I was traveling approximately 70 mph in the passing lane with (standard) Autopilot engaged, approaching to pass a vehicle, with another vehicle following me in the same lane. My vehicle phantom-braked so hard that the wheels began to squeal slightly, and the vehicle behind me had to brake hard to avoid hitting me. I almost immediately applied pressure to the accelerator pedal to avoid a complete stop, and the vehicle responded by accelerating. I then disengaged Autopilot immediately with an upward motion of the gear stalk.
Tesla software update 2022.24.6 forces the vehicle to use Tesla Vision for Autopilot adaptive cruise control. The driver does not have the option to disable the auto high beam feature, and the software is not accurate enough to prevent flashing and high-beaming other drivers. This hazard has occurred at least 5 times within two days of receiving the update. A hazard is created in two different ways:
- blinding approaching vehicles, including turning on the high beams when an approaching vehicle is in the direct area of glare
- flashing the high beams when following another vehicle at a close distance, distracting and blinding the lead vehicle
While this problem has not been verified by a dealer, auto high beams cannot be disabled while using Autopilot, and many other drivers are reporting similar issues in online forums. There is no warning or way to adjust the feature. This will likely lead to a collision because approaching drivers cannot see through the glare.
The car was on cruise control on a two-lane highway, posted speed 55 mph. It makes ghost emergency stops on cruise control, not Autopilot (my car doesn't have Autopilot). It renders a simple feature unusable.
Yesterday I had the most frightening driving experience of my entire life. It was raining heavily and I was going about 35 mph on the local road in my Model 3 standard, 3 years old. Suddenly I lost all traction and was slipping side to side +/-90 degrees from the direction of traffic. I immediately took my foot off the accelerator and tried to slow down but didn’t brake. I swung 180 degrees each way about 5 times. Luckily no one hit me but I was taking up all lanes as I careened all over the road. I can’t believe how easily the Model 3 loses traction in the rain. This is unacceptable.
While driving on the freeway tonight at about 11:45 pm with no one nearby and no shadows or debris in the road, the car on autopilot slammed on the brakes, taking the car from 68mph to below 45mph in less than a second, which could have resulted in a rear end collision had someone been behind the car. The car has done this many times in the past but this time was unusual in that three beeps occurred right as the car slammed on the brakes. I saved the video and will see if the video shows the incident.
1. Phantom braking while on Autopilot on the freeway (I-15, I-215, etc.). 2. Autopilot disengaging while taking a curve from one freeway to another (San Diego Freeway).
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The Tesla was in traffic waiting to turn left at an intersection. As the vehicle made the turn it decelerated from 18 mph to 10 mph at a secondary red light, for no reason. The driver had to take control of the vehicle in order to maintain progress and not cause a hazard to other road users. A link to the clip of the incident can be found below: https://vimeo.com/729935224/032493c7fd.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. While FSD was engaged, the driver approached a road closure due to road works. Ignoring the well-marked signs, which clearly said 'road closed', the Tesla proceeded past them until the driver had to stop the vehicle to prevent it from colliding with the road works. A link to the clip of the incident can be found below: https://vimeo.com/733970645/9ef2ad21d2.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. As the driver approached an intersection, FSD stopped correctly to yield to oncoming traffic on a one-way street. According to its navigation, the vehicle was to travel straight across the intersection, and there was a cyclist ahead of the Tesla at the intersection. As the cyclist entered the intersection, the Tesla followed and attempted to go around the cyclist to continue straight ahead. However, the Tesla then deviated from its course and instead turned left at the intersection, travelling the wrong way up the one-way street and forcing the driver to perform an emergency stop and reverse out of the dangerous situation. The Tesla appeared to panic at the presence of the cyclist when attempting to travel around it, and headed directly into oncoming traffic while travelling the wrong way up the one-way street. A link to the clip of the incident can be found below: https://vimeo.com/733972674/ce244b3a69.
I was driving on Tesla "Autopilot" when, instead of recognizing the freeway exit it was supposed to take, the car behaved as though it were still on the freeway. It went into the exit at full speed and did not brake or slow down; then, just before the sharp exit turn, at 65 mph, it disengaged Autopilot with a "system error" message. Luckily I was alert and able to avoid a major crash. If I had been a second late, the car would have gone through the road barrier. I'm sure if I had crashed, Tesla would have found reasons to blame me even though it was its software. Autopilot is a junk piece of software with little regard for customer safety. I am not sure how something like this is even allowed to be sold and used on public roads.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. As the vehicle approached a rail crossing, with lights indicating it was safe to cross, the vehicle rapidly reduced speed from 21 mph, creating a hazard for the vehicles behind. The driver had to take back control of the car to mitigate the risk of a collision. A link to the clip of the incident can be found below: https://vimeo.com/732400249/c1dbfe39dd.
My friend was driving the vehicle (a 2019 Tesla Model 3) while full self-driving beta 10.12.2 mode was engaged. The Tesla made a left turn at an intersection on a protected left, but after making the turn it slowed down considerably just before a set of traffic lights that did not apply to it (they governed traffic from the opposite side of the intersection), stopping in the middle of the lane. The driver disengaged and accelerated to complete the manoeuvre and avoid a rear collision. The Tesla may have mistaken the red light showing for other traffic as one it should obey, almost causing a collision and displaying the inadequacies of its vision and cameras. A link to the clip of the incident can be found below: https://vimeo.com/732401882/8837a257a0.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The Tesla failed to slow down when approaching a curve in the road, taking the corner far too quickly despite a roadside sign warning drivers to slow to 15 mph. The Tesla also failed to recognise the speed change required by the approaching blind turn: the car continued at a rapid 30 mph and even steered left when the road curved to the right, leading the driver to take control to bring the car back on path. During the turn, with FSD beta 10.12.2 engaged, the Tesla crossed the dividing line into the opposite lane. Had another vehicle been approaching, this would have resulted in a collision, as the Tesla strayed into the path of oncoming traffic. A link to a clip of the incident can be found below: https://vimeo.com/730269602/ae8bd9f3b1.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. Whilst travelling at 24 mph, FSD informed the driver it would "avoid traffic cones" and braked to 12 mph for no apparent reason; the video of the incident clearly shows there were no traffic cones to avoid. The vehicle then automatically disengaged from FSD mode whilst crossing an intersection, creating a hazard for other road users. A link to the clip of the incident can be found below: https://vimeo.com/730258398/e0f8b28291.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The vehicle was travelling at approximately 35 mph, too fast for a narrow and winding road. The vehicle crossed the central line into the oncoming lane and a moment later veered towards the right, nearly making contact with the curb and causing a collision. The driver had to take back control of the vehicle. A link to the clip of the incident can be found below: https://vimeo.com/731689228/485219aa71.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. Travelling on a narrow and winding road, the driver had set the max speed to 20 mph. After re-engaging FSD mode, the Tesla "reset" the max speed to 35 mph. This was clearly too fast for the road conditions; the vehicle had difficulty negotiating the road safely and almost veered onto the wrong side of the road. A link to a clip of the incident can be found below: https://vimeo.com/731691467/e336068d60.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The vehicle was stationary at a "no right turn on red" intersection. In spite of the sign, in FSD, the vehicle took the turn in "Autopilot creeping forward" mode, making an illegal turn at a busy intersection. Please see a link to a clip of the incident below: https://vimeo.com/731427985/e4d022bdea.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. At 10 mph the vehicle attempted to change lanes whilst there was another vehicle in the other lane. If the driver had not intervened, there might have been a collision, or the other vehicle would have had to brake hard to avoid contact. There is GoPro video of the incident here: https://vimeo.com/736107290/2a01155c84.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. At approximately 20 mph the vehicle entered a stretch of road works with temporary signage and cones. The vehicle performed an emergency stop for no reason, to the frustration of other road users. The driver had to take control of the vehicle. There is GoPro video of the incident here: https://vimeo.com/736110202/bcedc1c516.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. At 25 mph the vehicle veered across two solid white lines and into the cycle lane. There is GoPro video of the incident here: https://vimeo.com/736112946/2d0d7a4ed3.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The vehicle failed to recognise a speed limit change approaching a sharp left turn. It took the turn too fast at 29 mph and veered into the oncoming lane. The quick reactions of the driver, who took control, averted a collision. There is GoPro video of the incident here: https://vimeo.com/736115753/289c3413e9.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. After successfully crossing a junction, the vehicle accelerated too quickly on a narrow residential road with another vehicle approaching from the opposite direction. The vehicle was travelling at 27 mph when it was forced to brake hard. There is GoPro footage of the incident here: https://vimeo.com/738677992/0a3c3d4713.
My friend was driving the vehicle (a 2019 Tesla Model 3) while it was engaged in full self-driving beta 10.12.2 mode. The vehicle stopped short of a stop sign at a junction, then slowly moved to the middle of the junction, failing to make the right turn. After manual re-routing, the vehicle braked for no reason other than that a pedestrian was close by. The vehicle then failed to make a left turn successfully, cutting the corner. There is GoPro video here: https://vimeo.com/738680260/7c4eeba7e2.