Is Tesla FSD Ready For UNSUPERVISED Driving? You'll Be Surprised
- Published Feb 9, 2025
- Welcome to my channel dedicated to all things Tesla Full Self-Driving (FSD) technology. Stay updated on the latest news, updates, and demonstrations of FSD as I explore the future of autonomous driving. Don't miss out - Subscribe to stay informed!
In today's video, I take FSD V13.2.2 through 'unsupervised' scenarios with real-time re-routing. There was a critical disengagement that I felt was very concerning.
What are your thoughts on this situation?
If you want to reach me, please send an email to contact@driveelectrictoday.com
3:15 Correction: The unsupervised pilot program in Austin will leverage Model 3s and Model Ys. No RoboTaxi vehicles confirmed.
I watched the disengagement at 22:34 in slow motion. The steering wheel was already correcting to the left when you took over so I’m pretty sure it wouldn’t have crashed. But I still agree with the fact that you took over.
I’ve had the 2025 MYLR now for 10 days. I’ve got around 1,300 miles on the car and 80-90% of those miles have been with FSD. I’ve had two incidents that I had to intervene because of lane selection but other than that it’s been great.
Awesome man! Glad to hear you’re enjoying the new Y! You definitely racked up those miles quick haha
"Amazed at times, then disappointed." Absolutely my experience. FSD negotiates complex rotaries, then out of the blue makes a basic error.
I'm amazed at some comments blaming the driver for the near accident due to the unsafe lane change. Where is it written that you can't edit your route while driving? It isn't. The system would lock out route edits while moving if that were the case. Besides, isn't "do not hit something" the most basic rule of any ADAS system? How is it that the camera saw the oncoming car, flashed a blind spot warning, and still swerved right to change lanes? There is some horrible disconnect in the system. I don't call it latency. Look before you act is a basic human rule and should be reflected in the system.
Hell yes, that's a safety intervention. If the route planner screws up the driving, then route planning needs to be modified. E.g., no changes to the path until 30 seconds after a route change.
Yes! Some sort of buffer period. Once the route is adjusted, continue down the current path, then adjust accordingly 15-30 seconds afterwards. Couldn't agree more here.
Super uncomfortable, but that is what you call a preroll. FSD was not going to turn into the other car. If you slow this video down and go frame by frame, 2 frames after the driver turns the steering wheel, the other car can already be seen in the front windshield.
Also take a close look at the pillar blind spot warning. It kind of flickers, not steady. Keep in mind that white vehicle was also moving at a much faster rate, so I'm wondering if it's a latency issue.
A part of me wonders if FSD would have indeed waited for that minivan to pass before turning or if it had no intention of waiting. Surely a test I do not want to run.
@ It seemed to brake fast to let the car go and then do the turn at a VERY close margin. I'm sure it calculated the speed of the car, but I'm not gonna ask you to put your life on the line on my assumption.
For anyone who was caught by the clickbait where it seems FSD almost crashed into a vehicle: the driver manually deleted a stop during the trip, so FSD had to reroute, and in that second there was some hesitation by FSD. Solution: don't allow manual rerouting unless you are at a stop or in a safe situation. I can't wait for FSD to be implemented, removing the human factor, which is always responsible for accidents and deaths on the road.
Agreed... Must've been pretty close, though; the driver seems shaken for the rest of the video. Don't blame him.
Yes, rerouting should probably only be done during a stop. I don't know, but you've got to remove the human factor. Agreed.
Yup, as mentioned in my voiceover in the very beginning of the video - I was actively modifying the trip in the GPS to have FSD reroute in real time during that incident.
Also make rerouting easier
Shouldn't need to stop, but it doesn't need to react so quickly in that situation.
Here is what I noticed from the video. After the driver changed the destination, FSD started braking, and it did identify the car on the right ahead of time. I can't tell if FSD actually accelerated before the driver took over, but it did identify that there was a car there. It's possible that it was still braking and was just going to wait for the car to pass. It's also possible that even though it had already identified the oncoming car, it was still going to turn because it wasn't done calculating, so it just didn't react fast enough. Good corner case. I would suggest not rerouting in the middle of the road with cars close by.
I'm also on 13.2.2 and I'm curious what the latest version would do differently.
I think the latest versions are super tiny updates. Haven't heard anything notable in the release notes either. I'm patiently waiting for those to pop up in my app though.
Woah on that reroute "almost" side swipe. Glad it didn't turn out worse. I would think that when a driver manually disengages FSD with a wheel move, the system would automatically send a report to the developers.
I do know that they give you the ability to send a voice note at the time of disengagement. But hopefully some level of data gets sent to the team. This was good data to have.
@DriveElectricToday Agreed...there have been times I have missed a voice record send also and have thought, "they have to be receiving data still regardless of the voice memo."
I Love Your Videos Thank You for being here 🙂
Great to hear from you! Glad you enjoy the content!
Super uncomfortable, but that is what you call a preroll. FSD was not going to turn into the other car. If you slow this video down and go frame by frame, 2 frames after the driver turns the steering wheel, the other car can already be seen in the front windshield.
I've also reviewed it quite a few times now, and I noticed that too. Surely not a test I want to put myself through. I am wondering, though: if I reroute again in this same area, will it do the same exact thing, or will it learn not to?
@ Hmm, I don't think FSD updates on the fly for any interventions. Possibly the next training run will be updated for disengagements. Not sure if that would be a point release; more likely it wouldn't be updated based on disengagements until the next major version (i.e., version 14). Also, the Tesla AI team could make manual adjustments, in which case it could easily be fixed in the next point release.
@@DriveElectricToday Certainly not your fault either way. I'm not sure if I could have grabbed the wheel that quickly. Great job!
Opinion: FSD has the autonomy to do quick reroutes like that, but this seems to indicate a real problem in safety prioritization. The blind spot warning was clearly on before the car moved to the right. That tells me the blind spot warning system itself may well be only a visual display for the driver, and not an input to FSD, which makes me wonder what is going on with balancing safety and navigation there. Turning from the left lane was dodgy to start with. I will give FSD a very small chance that it was truly threading the needle, making the turn within inches of the passing car... That was the most serious-looking disengagement I've ever seen with v13.2.2, which is the same version I've been using for 5 weeks.
Absolutely agreed. After this situation, it definitely seems like the blind spot monitoring attached to the side mirrors is decoupled from FSD’s visibility.
I'm glad you mentioned that FSD has the capability to reroute. I was a bit unsure if the software can accommodate that, and I started wondering if I had just pushed it outside of its limits. But it really should be able to reroute in real time without issue.
This was, by far, the most serious disengagement I have had to perform.
Agree. I've never seen a maneuver like that in 13.2.2
I've used this update for more hours than most, and it can be very aggressive at times. I highly doubt it was going to run you into that van. It was going to turn right as it passed. I've experienced something similar, and I've also noticed the system will make aggressive moves right after it's turned on sometimes. I entered a destination, and it immediately started crossing several lanes of freeway to make a quickly coming exit. There was a different freeway I wasn't even thinking about at the time that was a similar distance to the pin, and I couldn't figure out why it was suddenly darting all the way over. The lanes were clear of traffic, but it put me on notice to be more thoughtful when I start it.
They definitely still have some cooking to do, but it knew that vehicle was there. There are redundancies that would have to be breached for the worst possible outcome there. They need to fix that either way; it obviously shouldn't have done that.
Agree - I, too, have noticed the abrupt decisions FSD makes when enabled in the middle of a drive. I must say the "start from park" function does not act like that; it takes its time to get its bearings first.
It makes sense that FSD would need to adapt quickly when enabled in the middle of a drive. But it would be great to have some sort of "buffer" period to allow the onboard cameras to get their bearings before making a jerky, possibly dangerous, move.
Thanks for your comment. This is a really interesting point you bring up in terms of enabling FSD and the abrupt nature it displays. I may use this as a talking point in my next video.
@@DriveElectricToday I've come close to 4 vehicles on the latest update. I allow FSD quite a bit of freedom for the sake of my curious nature and experimentation. lol
On 3 of those occasions, I was holding on to the wheel and ready to intervene once it became absolutely necessary, but one of them happened so fast that I didn't have time to reach for the wheel. After watching your situation a couple more times, I'm sure I would have reacted similarly. My examples were a bit different, but my Y did make the correction at the very last moment of all 4 of these various close calls.
I have a 2023 Model Y LR. I love FSD and use it daily, but I don't believe it is ready for unsupervised prime time yet. There are some really important things that need to be resolved. I was in a gated community facing a cul-de-sac at the back of the neighborhood. FSD was going to go up a driveway, and who knows how far it would have proceeded. There was a major road it was trying to get to. Not gooooooood. So I turned the car around, then re-engaged FSD. The car was going to run into the exit gate of the community! For the most part FSD is simply amazing, but there are definitely safety issues that need to be resolved!
100% - It’s funny because I’m amazed, shocked, disappointed and upset with FSD all within a 30 minute window lol
Hold on here: in your initial intro there was noticeable tire squealing when you disengaged, but later in the video there was no sound at all. So which is it? Did you add the tire squealing for drama? I don't appreciate that kind of disingenuous clickbaiting.
It's taking the far left lane on a left turn because that is the law. On right turns it should use the nearest right lane.
Yup, it should absolutely be taking the right lane to make that turn. Which is another reason I would have preferred it to continue straight then reroute later.
2025 M3LR RWD, v13.2.x. We drove from Pensacola to Orlando and back on FSD a few weeks ago. Minimal interventions. We did have some issues with attention monitoring and sunglasses when driving into the sun later in the day, heading west on I-10. It would give nags even when I was facing forward; with the glasses off it did not. We plan to purchase a subscription in the months we take big trips, as it really reduces driver fatigue.
That’s awesome! I can definitely see the value in subscribing to FSD before a longer road trip. I hope they don’t remove the subscription pricing model.
How’s the LR? I kind of regret not going with the long range myself.
@@DriveElectricToday We love it. I had a NEMA 14-50 plug run to the garage and we use the mobile connector for home charging. Keep it plugged in at 55-60% SOC for most days and then use up to 95% for trips out of town. It predicts trip usage so well, we don't have range worries anymore after several multi-day work trips.
Interesting that it was a white vehicle. I had a similar issue in a roundabout where it was like it didn't see the white vehicle. In your case it had the red indicator showing, so it clearly recognized it in one of the systems.
Yes, that is the most bizarre part. A certain component/stack of the system visualized and displayed the white minivan but still proceeded to make its potentially faulty decision.
I take the same routes you do in the video and many more around CBUS. I would love to collaborate in the near future. I'll also say I have had near misses as well with quick edits like you did with FSD (MYLR).
That would be awesome! Definitely open to that idea. How is your MY holding up in the winter here?
Maybe Tesla is just too busy, or there are legal issues, but I wish Tesla would get back to people on incidents like these. LET'S KEEP TESTING TILL WE GET THIS RIGHT!
Agreed - I really wish there was a better feedback loop for these types of situations. Hopefully it’s something we’ll see evolve as time goes on.
Based on a prior incident that I experienced and what you experienced, I am beginning to suspect that the FSD hardware/software is not responsive enough to handle sudden unusual incidents in some cases. Yes, the vehicle on the right appeared on the screen, but that does not mean the FSD hardware and software were able to recognize it in a timely fashion, determine that it was a threat, and then take appropriate action quickly enough. To go into slightly more detail: there are multiple processors running and multiple software threads, where each thread is responsible for performing some task. Threads suspend while waiting to be unblocked, such as waiting for a message sent by another thread. There are inherent latencies with respect to when a particular thread can run. The threat identification, threat assessment, and threat mitigation may be separate threads, in a very simple explanatory model. This is just a hypothesis, since none of us really know the innards, and that is the problem when sudden unexpected incidents occur. Only controlled tests can expose issues in a reasonably safe manner without driving a few hundred miles for such incidents to occur.
Yes, I have absolutely observed this type of behavior with FSD many times. It’s almost as if when a sudden or unusual incident occurs, FSD doesn’t have a great “buffer” period to assess the situation properly.
In the context of this situation with the white minivan and the GPS rerouting, FSD should have continued straight, deprioritizing the faster route and prioritizing safety instead.
If there are different threads as you describe, wouldn't engineering have the sense to make the visible rendering on the screen the last priority? What am I missing?
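The thread-pipeline hypothesis above can be made concrete with a toy latency model: if detection, threat assessment, and mitigation run as separate stages handing off messages, the end-to-end reaction time is roughly the sum of the stage latencies plus a queueing delay per hand-off. The function and all numbers below are purely illustrative assumptions, not measurements of Tesla's system:

```python
def reaction_time_ms(stage_latencies_ms: list[float], handoff_ms: float) -> float:
    """Toy model: total reaction time for a pipeline of threads,
    where each hand-off between stages adds queueing latency."""
    hops = len(stage_latencies_ms) - 1  # hand-offs between consecutive stages
    return sum(stage_latencies_ms) + hops * handoff_ms


# e.g. detect -> assess -> mitigate, with 10 ms of queueing per hand-off
total = reaction_time_ms([50.0, 30.0, 40.0], 10.0)  # 140 ms in this toy model
```

The point of the sketch is only that hand-off delays compound: even if each stage is fast, a fast-approaching vehicle can close a lot of distance during the pipeline's total latency.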
Should FSD be able to handle modifying a trip in the middle of a drive? What are your thoughts? Let me know down below👇
Tesla should have hard rules that the car can't make a right turn from a left lane; that's just common sense. First switch lanes, then turn. If there's no room to switch lanes, just miss the turn.
Clearly trying to make a last-second right turn from the left lane, and it certainly saw the other vehicle, because the blind spot warning was on. Very bad.
Ah yes, I hadn't even noticed that the blind spot monitor light was red. Good callout, and it makes the situation even more confusing.
It's simple: when you reroute, it has to restart itself. It can't do that within .01 seconds. You should have rerouted when you knew you had time.
Absolutely great point here. This is what I have been pondering as well. Though I do wonder how unsupervised vehicles will handle GPS rerouting such as this.
@@DriveElectricToday I don't think they would make it able to use rerouting; it's just for A to B. That's how it's supposed to be used. If you think about it, rerouting in ANY GPS nav takes a second to figure out. Same with a Tesla. It shut down and restarted while that car was already coming fast. It was your fault, not FSD's.
Yes that is a great point. Navigation really should be used from point A to point B. So I do agree that forcefully rerouting is tinkering with the navigation.
What are your thoughts on unexpected road closures in terms of GPS rerouting? I wonder how it would handle those types of situations.
@@DriveElectricToday I actually had a situation with my Tesla for this EXACT thing. I had to take over because even with the stop sign guy it didn't register it. So that has to be worked out. I drive a 2024 Model 3 on 13.2.2.
Very irresponsible to reroute mid-trip instead of waiting for a safe area, a red light, or something like that. Even so, I see a red warning in the camera view when the car is about to turn. I can't tell, but even if the guy hadn't touched the wheel, I think it was stopping because it detected the car coming. Again, a very irresponsible way of rerouting. Probably the best fix is not allowing a manual reroute unless you are in a safe situation.
Looks like it's like us humans making a last-minute reroute: it wanted to turn right, saw the fast vehicle coming, and positioned itself to turn. It wasn't going to turn until the fast vehicle passed, and then it would make that turn very quickly. That's what learning from humans gets you.
Couldn’t agree more here. It is, at the end of the day, trained in human behavior. This is a classic human decision that I’m sure we have all seen many times. Surely not a test I want to observe with FSD though.
My opinion is that FSD behaves "too human." The AI is trained on human driving data; that's why it's inconsistent and unpredictable, just like a human. My 2025 MX on v13.2.2 never turns into my garage the same way twice. Same when pulling out of the garage; sometimes it goes the opposite way it should. It is, like I said, way "too human."
Yup, this is what I thought too when reviewing this incident. It’s almost as if a human driver wanted to reroute themselves and they made a knee jerk reaction on the road. This situation at 22:22 definitely resembles human like behavior.
Yep, not good!😮 It happened to me where a human driver turned just like FSD did there. In your case, FSD should have reacted to the vehicle on your right.
Good point here. At the end of the day, these FSD training models are mirrored on human drivers, and this is certainly something a human would do. I have seen many people try to turn at the very last second, spot a car in the neighboring lane, then stop. Not sure if FSD was going to stop or not. Surely not a test I want to run.
No, but I’m optimistic 😅
lol same!
That is definitely not good behavior this far into the software. There should be no obvious safety errors like this when we are months away from real unsupervised trials. Did you snapshot or record it for Tesla AI?
Unfortunately, I missed the window in which you can send a recording back to Tesla. In the moment I was too stunned to send that back. I wish I had though. Perhaps if this video gets enough engagement, it might catch Tesla’s attention and they could retroactively pull the logs. Just my guess though.
22:32 FSD attempted murder and failed. This was really bad.
Agreed, probably my most critical disengagement. Though I was modifying the trip, which caused that to happen. Many other comments are saying there should be a buffer period before FSD makes any route changes after the maps are updated, which makes a lot of sense.
Still love your commentary; still hate that FPV. It's too shaky and unnecessary, and it made me dizzy.
I was hoping the settings I enabled made it better, but sounds like it’s not any better 😞
Thank you for the feedback. I will look into either changing the settings again, maybe trying out a new camera, or omit the POV camera entirely.
I apologize, thank you for the feedback. I’ll look into it further.
@@DriveElectricToday Thank you for the kind reply, but I think when FSD is going for an unprotected turn, the FPV shot is very helpful to check left and right. It's only unnecessary when things are going smoothly and straight. Just don't use it too frequently.
@@BerlinBK2662 Yes, that makes sense. I'll follow that guidance in my next video. I also like to capture people's attention with it at the very beginning of the video. Thank you for the feedback. Stay tuned for my next video.
Definitely don’t use two cameras. It makes viewers think you are doing it so you can edit out important events in post production.
@RaisingAwesome Ah yes, I can see that happening. Thank you all for the great feedback. I will take this seriously and implement it in my future posts.
Not this year, maybe next.
I think so as well. We’ll see how the unsupervised testing goes, but I am thinking next year is more in line too.
Suggestion: maybe talk a little less during your video. That's a tremendous amount of information to take in while we're trying to watch how FSD performs. At least spread out the commentary.
Understood, good feedback here. I’ll implement that suggestion into future videos. Thanks for the feedback!
Needs Lidar.
At 3:12 he says 'Its going to be leveraging their robotaxi design and fleet of vehicles'.
But there is no working, tested robotaxi, nor is there a fleet of vehicles. It's all just talk and vaporware. Waymo has a working robotaxi that gives 600,000 rides per month. Tesla's robotaxi has nothing.
Remember, Elon said there would be 1 million robotaxis by the year 2020. And we all know how that turned out .
Yup, great correction here! In their pilot program this year, they'll be launching it with Model 3s and Model Ys. Thanks for the correction!
The white car should have slowed down, but he was being an asshole.
My M3 HW4 LOVES to turn the very moment a car passes, even if there are no cars coming after. It looks like it was going to do that, but I would have taken over as well.
Agreed, after all FSD is trained on human behavior which is pretty apparent in this situation
12:29, you cut the video and then say it did great. I suggest not editing out such things, as it appears disingenuous. Unless you do that to boost comments, lol.
Nope, I’ve got two cameras rolling concurrently and then switch between them while I edit. I don’t cut anything out from the drive. I can see how that can be interpreted differently though. I will incorporate one camera view more often in future videos. Though I may continue using the FPV camera for unprotected turns where the main camera can’t see left and right. Thoughts on that?
He says, 'FSD is amazing at times, and other times it's not ready for unsupervised.'
Amazed because Drive Electric is testing the buggy FSD software in very easy rural conditions. FSD in a busy big city is nowhere even close to Level 5 full autonomy. That is years away.
Totally agree! I see a lot of room for improvement when it comes to city driving with FSD. Many many flaws with downtown/city areas even today.