Tesla Model Y owners have reported 137 problems related to assist (under the lane departure category). The most recently reported issues are listed below. Please also check out the statistics and reliability analysis of the Tesla Model Y based on all problems reported for the Model Y.
The front triple camera glass was found to be dirty on the interior side of the glass. I received camera blocked/blinded alerts and also experienced several emergency corrective maneuvers, including braking, lane assists, and forward collision warnings. Tesla cleaned the interior side of the glass.
See all problems of the 2022 Tesla Model Y.
I used the full self-driving software Tesla provided us in April. On April 15th, the car did not detect a curb, so it went right over the curb, which popped both right-side tires and damaged the front rim. Damage was $1,800. I brought this up to Tesla and they informed me that it was my fault for not taking over and refused to help with the repairs. I understand taking over if there were a 3-foot shoulder, but on roads with no shoulder such as this one I found it impossible to correct in time. Also, at the very last split second, the torque on the wheel was too tight to correct (not enough time anyway). This is a busy state-maintained road in the middle of Columbia, Missouri, near a college campus, and my safety was put at risk while I had to find the next side street to pull over. This is dangerous software that is not ready yet.
See all problems of the 2023 Tesla Model Y.
Unwanted lane assistance in my Tesla Model Y made the car swerve to the side. When I tried correcting it, the steering became hard to control, and the whole car spun 360 degrees and hit the shoulder rail. The brakes didn't slow the car.
I am using full self-driving (FSD). When I look at my screen to control the car's map, it gives me a strike. I now have 5 strikes, so it removes FSD, but it also removes autopilot, which has lane centering, and it removes the camera, so if I were to have a stroke, tonic-clonic seizure, heart attack, or any other issue, the vehicle would not pull over like it would in FSD/autopilot. This makes it very unsafe, and I could argue it violates the ADA! For privacy I will only say I have had one of the previous issues and bought the Tesla with FSD just in case it ever happens again! Now I don't have that safety measure for a week, and I drive a lot on two-lane 75 mph highways. The basic lane assist will even shut off if I hug the lane! How dangerous is that? Now I have absolutely no safety features.
The new safety requirement for Tesla to make its autosteer more stringent and align with safety regulations is a complete disaster. This requirement results in multiple false alarms and creates more distractions than before the update. Furthermore, it's upsetting that this department has an apparent obsession with Tesla's technology. Rather than promote and work with Tesla to create a future where technology can meaningfully assist drivers on the road and reduce fatalities and car crashes, this department is doing its best to ensure that no progress is made. Lastly, it would be nice to see some actually meaningful and relevant statistics regarding Tesla autopilot incidents. Rather than reporting a meaningless number of incidents with zero context, try comparing the number of Tesla incidents to non-Tesla incidents and analyzing whether there's a statistically significant difference between those two groups.
The NHTSA changes that forced the Tesla update as part of campaign #23V-838 do not make the vehicle's autopilot system safer. Moreover, the car is actually now more unsafe and distracting, with disruptive beeping and alerts. The NHTSA should have Tesla reverse the changes made to basic autopilot in this update. Furthermore, the NHTSA should pivot its focus to other manufacturers' autopilot-equivalent software features, as many of them are even less restrictive or even more dangerous. For example, BlueCruise can disengage without any warning to the user. There have been no recalls relating to this.
Due to NHTSA recall number 23V-838, the car has become more dangerous than before. It often makes me apply more force on the wheel while constantly alerting me, which has inadvertently made me swerve into oncoming lanes. Not only has this mandatory recall made the autopilot experience much worse, but it seems highly politicized by the previous leader of NHTSA and should be reinvestigated. This is such a failure of overregulation and biased leadership and should not be tolerated.
Ever since a recent update to my car, it has become far more difficult to keep the ADAS features active. It is incredibly intrusive; even just glancing at the screen to see what message it's showing me leads to the car threatening to cut off all driver assistance features. This is so dangerous. The car is a better driver on the highway than I am: it is better at holding speed, it never veers out of its lane, and it never misses someone attempting to impinge on my lane. I think something needs to be done to make this system more lenient. It used to be better, but something changed in late December. People in other cars seem to be able to stare at their phones with impunity while their car continues on without any issues. All I want is to be able to glance away from the road to see what the screen is trying to tell me.
Subject: safety concerns regarding recent Tesla autopilot changes (campaign #23V-838). Dear NHTSA, I am writing to express my concerns regarding the recent required changes to Tesla's autopilot system under campaign #23V-838. While I understand the importance of updates for safety, it appears that these modifications have had an adverse impact on the vehicle's safety and increased the potential dangers of driving. I urge the NHTSA to thoroughly investigate and address these issues to ensure the continued safety of Tesla vehicles on the road. Your prompt attention to this matter is crucial for the well-being of Tesla drivers and the public at large.
See all problems of the 2020 Tesla Model Y.
I received the new OTA recall update. The driver monitoring is now too sensitive, and I cannot check cross traffic when approaching intersections without receiving alerts to pay attention to the road. The car then requires me to apply varying amounts of turning force to the steering wheel, which can sometimes cause accidental disengagement, which also disengages TACC and automatic braking. This is dangerous.
After Tesla software update 2023.44.30.2, the rear backup camera no longer works. It's just a black screen. The failure first occurred on 12/20/2023. Also, Tesla autopilot no longer works (cruise control and lane assist). The support team could not fix either issue. I have a service appointment scheduled for 12/27/2023. Thank you and happy holidays.
See all problems of the 2021 Tesla Model Y.
To whom it may concern at the National Highway Traffic Safety Administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y's autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system's activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system's requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature's intended purpose. I urge the NHTSA to investigate this situation and cancel the recall.
This is a complaint regarding intermittent camera blindness, driver assistance failure, and dismissed diagnostics on a 2023 Tesla Model Y. These issues have impacted critical safety systems including:
• pillar and rear-view cameras
• autopilot and traffic-aware cruise control (TACC)
• park assist and visualization systems
Despite multiple service visits and clear documentation, Tesla has failed to diagnose or remedy the malfunction, attributing the issues to external weather factors despite my repeated statements that the problems occurred in clear conditions. The vehicle has also generated camera blocked/blinded alerts even when the lenses were clean, and service technicians confirmed the B-pillar camera was blocked during 8.7% of the drive cycle, a serious figure given that Tesla's driver-assist systems depend entirely on vision-based processing. The vehicle has shown alerts related to automatic emergency braking being disabled, which Tesla attributed to a "false trigger." Regardless, the vehicle may have operated without key crash-avoidance systems enabled at unknown times. I live near Santiago Canyon, a winding road I specifically raised concerns about in relation to Tesla's camera-only system. A recent fatal crash in that same area involving a Tesla with severe post-collision fire damage has made this complaint more urgent. My concerns appear predictive and ignored. Timeline of complaints:
• 12/20/2023: rear camera issues begin.
• 12/22/2023: initial service request opened.
• 12/31/2023 – 1/4/2024: repeated documentation of pillar camera blindness and system failures. Tesla dismisses the issue as fog-related. Invoice #3000s0008955083 issued (Irvine, CA).
• 1/4/2024: I confirm continued camera blindness after service.
• 1/5/2024: I formally tell Tesla I believe the issue is due to a software update and is broader than one sensor.
• 7/11/2024: new concerns about GPS failure and the power trunk documented in invoice #3000s0010537222 (Lake Forest, CA).
The updated autopilot nag requires too much steering wheel torque, often to the point of disengaging the steer assist, making the car jerk sideways. I believe this makes the interaction less safe.
Upon use and observation, the latest recall OTA update has had the opposite effect of what NHTSA was attempting to achieve. The update has made the autopilot/lane-keeping feature too easy to disengage via the weight-based steering wheel presence notification/reminder. This can lead to a mis-input by the driver, causing an accident or collision. Please work to ensure that this is addressed or rolled back for further evaluation.
Subject: Safety concerns regarding the recent Model Y autopilot update. To whom it may concern at the National Highway Traffic Safety Administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y's autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system's activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system's requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature's intended purpose. I urge the NHTSA to investigate this and stop enforcing ridiculous changes to Tesla.
Subject: Safety concerns regarding the recent Model Y autopilot update. To whom it may concern at the National Highway Traffic Safety Administration (NHTSA), I am writing to urgently express my concerns about the latest software update for the Tesla Model Y's autopilot system, which, in my view, significantly compromises safety. As a Model Y owner for over two years, I have generally found the vehicle and its features to be reliable and safe. However, the recent update has introduced an overly stringent hands-on-wheel detection mechanism that is not only inconvenient but also potentially hazardous. The new update requires frequent and often forceful interaction with the steering wheel to assure the system of driver presence. This change is drastically different from my previous experience, where I received only one hands-on-wheel violation in two years. The current sensitivity of the system disrupts the smooth operation of the autopilot, leading to frequent and abrupt disengagements. I have found myself struggling to maintain the system's activation, inadvertently causing the vehicle to exit autopilot mode multiple times. This issue goes beyond mere inconvenience; it actively detracts from driving safety. The need for constant and sometimes aggressive adjustments to satisfy the system's requirements is distracting and can lead to erratic vehicle behavior. The irony is stark: a system designed to enhance driving safety and ease is now a source of potential danger and stress. The unpredictability and over-sensitivity of the updated system could lead to dangerous situations, especially on highways or in heavy traffic, where sudden disengagement of the autopilot can be particularly risky. As a driver, I now find myself more focused on keeping the autopilot engaged than on the actual driving conditions and surroundings, which is surely contrary to the feature's intended purpose. I urge the NHTSA to investigate this.
Tesla software updates aren't transparent about the changes they will cause to the car's behavior. They hide behind the vague term "minor fixes" with no way to find out what that means. They are changing how the adaptive cruise control and lane keep assist features work without specifying it. The only way to find out if something changed is to re-read the whole manual every time there is an update and look for changes. The erratic behavior these updates are causing is mostly unpredictable braking. The unpredictable braking would be predictable if it were disclosed what the updates are actually doing. I believe they are slowly moving enhanced autopilot features to regular autopilot without disclosing it. The car will now slow down early and aggressively before a turn if navigation is turned on. It will also randomly slow down if traffic in the other lane is going slower than the speed limit. There is one thing my car never did before that isn't disclosed in the manual: when driving past semis on the interstate, the autopilot feature causes the car to nearly drive on the lane line opposite the semi. Their unwillingness to be fully transparent about what the updates are doing is making the cars unsafe and will cause an accident, in my opinion. There have been several updates that have been especially bad about adding new functionality without disclosing it.
My vehicle beeps liberally when parking in tight spots, anywhere near a line of any sort, etc. A few weeks after getting my vehicle in late November (I don't remember the exact date), I was driving through construction at night, and without warning the steering "corrected" itself and I hit a cone, causing two scuffs on my new vehicle. The steering locked, but thankfully I was able to get the vehicle back into the lane. When I called Tesla a few weeks later, they asked for a specific date and time, which I didn't have. They said they were unable to help without that. Thankfully there was nothing more serious, but I would like my scuffs and steering fixed.
These are observations over the past couple of months that I have owned the car. They have been consistent and have occurred at speeds between 35-70 mph. There are four main issues I'm having with my Tesla. 1) I cannot use cruise control because it requires automatic windshield wipers. The automatic windshield wipers cannot sense when it's raining and just turn on randomly when it's not raining. 2) The car brakes on its own randomly on both the highway and city streets. I have almost been rear-ended several times because of this. 3) The car thought another car was coming into my lane on Interstate 95 and violently switched lanes without warning, causing me to oversteer and almost wreck. I had my family in the car with me and it was terrifying. 4) The two times I used full self-driving, the car actually accelerated to make it through a yellow light. I know these are all beta features of the car, but in my opinion they are all dangerous to use on the roads and should be disabled until they are ready for prime time.
I was driving at 64 mph using the adaptive cruise control and autosteer (beta). I was in the first lane of a two-lane divided highway with no vehicles within 1,000 feet in front of me and several cars passing me on my left. Fortunately, I had both hands on the wheel at the 10 and 2 positions and have good arm strength. Suddenly, abruptly, and violently the car took over the steering and braking, swerving to the left about a foot or so, then applying the brakes hard while simultaneously turning the front wheels about 65 degrees to the right. I believe another vehicle without the Tesla's low center of gravity would surely have rolled over. I was now heading across the breakdown lane towards the guardrail. With great effort I took over control of the steering just 1-2 feet from the guardrail, pulling hard left and pressing down on the accelerator (to release the brakes), causing a second rollover feeling. I then turned right to straighten the car out and continued down the highway in the same lane without contacting any objects. I felt the violent nature of the swerving in my lower back and left arm. The car did not want to release the steering to me. I have no evidence other than the experience of my wife and me. I was coming off a white concrete bridge deck with a slight left curve without any lane markings. At the moment of the incident, the bridge deck was covered with a large area of black tar. I believe these two factors caused the car to go into some kind of avoidance maneuver. I also believe a less capable driver might have been killed.
I was driving my Tesla Model Y on the highway and suddenly the steering wheel turned 90 degrees, crashing my driver's side into the back of the car in the lane to the right. The night before the accident there was a system update for the car. No warnings were given by the car.
I was driving southbound on Hwy 90 at about 78 mph using navigate on autopilot. Without warning, the car swerved into a left-turn lane and then slammed on the brakes when the turn lane came to an end. This caused a number of personal belongings to be damaged. This happened as I was on my way home from a service center visit that concluded the autopilot systems were all functioning correctly. Tesla was notified immediately and has not responded to my complaints.
My Tesla Model Y has been braking out of nowhere, and very hard, while on autopilot. I have noticed this has been happening more when I am next to a semi truck or other large vehicle, where the Tesla will for whatever reason slam on the brakes even though there is no vehicle or object in front of me and no vehicle merging. This is a major safety issue and can lead to a loss of property and, worst of all, life. Please do a recall on these vehicles and make Tesla fix these issues before they lead to further accidents and injuries or deaths. Thank you.
I have owned the latest Tesla Model Y for one month. I have tried all means over the last 3 days to submit several concerns, but Tesla has made it impossible to file written complaints. 1. Automatic emergency braking was applied with the kids in the back seat on the way to school, on an internal 35 mph road in broad daylight, with a solid cement divider on the left and no vehicle around anywhere. Everyone was hurt in some way due to this phantom braking, and it would have been much worse on the highway, as there are many trucks on this side. The accelerator also seemed not to work for a few seconds. A software flicker is seen in the video seconds before this; not sure if it was a hack, a bug, or intentional. Major safety issue; I have video from the dashcam. 2. FSD is not ready, whatever Tesla says. It doesn't care about road conditions, speed, or other surroundings. It drives like an aggressive kid who just learned to drive and doesn't care about anyone or anything. That's the reason I have stopped using it. 3. The automatic lane departure assistance is very dangerous, as it takes control of the vehicle and tries to steer away from the lane line as soon as you are near the border, without considering speed, other vehicles in the side lanes, road conditions, or road hazards. Several times it became like a tug of war: other vehicles were too close, like a truck in their lane to the left and right, so I was trying to stay on the edge of the lane while this system was yanking the car away from the line, causing shaking, and I was about to hit other vehicles. This may have caused accidents; I had to turn this off as well. 4. The estimated range is wrong, off by 70-100 miles, and this could cause safety issues for a family getting stuck at night or in the cold, or anxiety that can cause emotional trauma and safety issues. 5. Other small stuff: wipers turn on without rain or water; sentry mode doesn't work if charge is below 65 miles, which could be a security risk; random door-open notifications; bluetooth issues; the car doesn't unlock; costly supercharging; etc. Sorry, but I had no other way.
The facts of the incident: On October 20, 2023, at approximately 3:54 pm, I, Howard, was driving our 2023 Tesla Model Y on Arizona State Route 64 (SR-64), a two-lane highway, en route from Orange, California, to the Grand Canyon South Rim. The vehicle carried a total of four passengers, including my wife, myself, and another couple from San Diego. During our journey, an alarming and life-threatening situation unfolded. Without warning, our 2023 Tesla Model Y abruptly veered to the left, dangerously encroaching into oncoming traffic. Despite my immediate attempts to regain control by steering the vehicle away from the oncoming traffic, the steering wheel became unresponsive, effectively locking in place. This mechanical or software malfunction prevented me from correcting the perilous situation, resulting in a head-on collision with a large Ford 350 truck, which was also traveling at a high speed, estimated to be more than 60 miles per hour. Fortunately, while there were injuries, there didn't appear to be any life-threatening injuries. Because of the resources required to further investigate and analyze the vehicle, as well as the challenge of dealing with Tesla Inc., law firms have declined to pursue further investigations. Morgan and Morgan initially started an investigation with Knott, LLC, Arizona, but then dropped the pursuit due to costs. The vehicle is still on hold by my insurance company, USAA, but they may not hold it much longer before placing it at auction. However, it is urgent that NHTSA take possession and complete further analysis of this vehicle. Your assistance and consideration are appreciated, and I look forward to assisting with this investigation.
During the week of October 09, 2023, I started receiving two failure alerts on my 2023 Model Y. The first was that lane keeping assist was unavailable. In spite of this alert, the system continued to function properly. The second warning was a steering assist reduced warning. At first, this also did not seem to correlate with a noticeable issue with the vehicle. On the evening of October 15, when leaving a supercharger station, I noticed a momentary stutter in the steering wheel that accompanied the above-mentioned alert. No other issue was observed. On the afternoon of October 16, when leaving work, after putting the vehicle in gear, I had no steering ability at all; it was a similar feeling to the steering wheel lock on older vehicles. The wheel would move approximately 1" in either direction. I immediately shut the vehicle down and performed a reset by holding in both scroll wheels on the steering wheel. This did not clear the code, but it did return steering ability. Since I could not predict when the steering might go out again, I contacted Tesla and had the vehicle towed to the nearest service center.
My Tesla Model Y is installing the same updates multiple times. The repeated installations are causing erratic functionality issues with the car's software. Tesla has been unwilling to resolve the issue and claims the car is working just fine. When I asked the local service tech, I was told the car shouldn't be installing the same update multiple times. I don't feel safe in my car when I don't know when it is going to act up next.
There was an update last night on my Tesla Model Y. I noticed two episodes of phantom braking after this update this morning when I was driving to work. There was a truck in the next lane, but the car hit the brakes all of a sudden on the highway while autopilot was active. It was at a curve on the highway. I'm assuming the car thought that the truck was moving into my lane, leading to sudden activation of the brakes. This could have been dangerous. Fortunately, there was nobody behind me and we were safe.
On a round trip from Albuquerque, NM to Tucson, AZ, we experienced 8-10 instances of phantom braking in our 2020 Tesla long-range Model Y. The occurrences happened when the car was in adaptive cruise control and lane keeping assistance. Highway speed was typically 75 mph, weather was sunny, and roads were clear. Overall, an unnerving experience!
While I was driving my car earlier today, my Tesla Model Y experienced an unexpected steering freeze right in the middle of the freeway. This incident occurred while I was accompanied by my 2-year-old son. Initially, I was quite alarmed and attempted to exert force to guide the car towards the slower lane, but unfortunately, the steering remained unresponsive. After trying the brakes a few times and executing forceful maneuvers, I was fortunate that the steering finally began to respond. Thanks to these actions, both my son and I are safe and well today. It's worth considering the safety implications of Tesla vehicles on the road, and some individuals believe that the NHTSA should intervene by imposing fines and addressing the use of the autopilot feature promptly. Thanks, Dr. Sujata Mathur.
At approximately 11:26 am on Thursday, July 27, approximately 110 miles east of Gallup, New Mexico, I experienced a series of extreme phantom braking events while driving a 2021 Tesla Model Y. One phantom braking event resulted in extreme braking and the car gave a loud alarm signal. This happened while passing two rather closely spaced semi-trailer trucks that were in the right lane of the Interstate 40 roadway going westbound. Both trucks were traveling without making any unusual movements. They were steadily traveling between 70 and 80 mph. There was a large SUV following our vehicle. This event resulted in an extremely hazardous condition. A high-speed rear-end collision was imminent. I, as the driver of the Model Y, immediately accelerated and steered over to the left-lane shoulder. No collision occurred and I was able to pass the two trucks and return to the right lane. Clearly, this behavior of the Model Y is extremely hazardous and must be corrected. It is my recollection that I had disengaged the autosteer mode prior to passing the two trucks and then reengaged the autosteer after passing the two trucks and moving to the right lane. I have experienced this behavior before, as I have been driving the vehicle for the past two years of my three-year lease. Strangely, it always seems to occur when driving on rural interstates, such as in this case. Fortunately, these experiences helped me to instantly respond by accelerating and moving onto the left-lane shoulder to avoid a disastrous accident. It is appalling that this has been happening. The Tesla gives absolutely no indication of the cause of the urgent braking, either before or after the event. Also, there have been no notifications from Tesla about this dangerous phantom braking. I am sending this report to Tesla and the NHTSA.
I had my front windshield replaced by the dealership. While driving home on the highway from the dealership, I turned on super cruise and the car no longer followed the center of the lane. It moved over to the left and nearly ran into a neighboring car before I overrode the lane following with the steering wheel. On an empty section of the highway later, I tested it, and it was now driving with its wheels on the left lane line, body and mirror extending over the line into the next lane. The dealership should have checked that or at least warned me to look out for it when I picked up the car. I called Tesla roadside assistance and the person on the phone told me to recalibrate the camera, which I did, and it fixed the problem. Calibration of the front camera was thrown off when it was remounted on the new windshield. Obvious to me in hindsight, but it did not occur to me at all when I picked up the car.
My family and I were driving from San Diego to Arizona on Interstate 8. I was using autopilot (adaptive cruise control with lane keeping assistance), and 4 different times the car slammed on the brakes from approximately 75 mph to 60 mph. We were driving on a bright, sunny, extremely hot day (116 degrees). Luckily there were no cars behind us, or we could have been rear-ended. It's also noteworthy that there were no cars/vehicles in front of us or objects in the road when the phantom braking occurred. I now no longer feel safe using autopilot on our Model Y. No warning messages appeared and no additional information was provided by the car. We have used autopilot on several other road trips and never experienced this issue.
While I was driving a Tesla Model Y in self-driving mode at night inside Yellowstone NP, there was another car on the opposite side with its high beams on, and just near that car there was a bison on the road. The self-driving computer failed to detect the bison at first, and it was too late to emergency brake; the car hit the bison after passing that car. The bison died, but the driver is safe; the front-right headlight assembly and right repeater camera were missing. It's very hard to see a bison in the dark when the other car has its high beams on, for both humans and cameras; self-driving without radar or lidar is not able to detect an obstacle in the blind spot. I want to report this accident as a reference; the car's self-driving software helped me avoid serious injury. The Tesla FSD (full self-driving or autopilot) is great, but I still hope more sensors such as radar or lidar can be added to the car for better night-vision self-driving. The car was deemed a total loss, and there is a video from the front dash cam recording how this accident happened.
Problem Category | Number of Problems
---|---
Assist problems | 137
Blind Spot Detection problems |
Warning problems |
Lane Keep Automatic Steering problems |
Lane Keep Steering Assist problems |
Lane Departure problems |