I've learned so much from these interviews with James. He's such an intelligent, humble, and great explainer of everything. Thank you Dave for spreading this knowledge.
Removing the sensors prior to having the software ready is a huge mistake. And removing radar was a mistake too: since Tesla turned my radar off, I have way more phantom braking events. Vision-only will never work without more sensors. Tesla removing things is all about reducing their costs, yet it never lowers the price of their vehicles for consumers. But this was a good interview.
I too am having many situations where I get "phantom braking" from overhead shadows on the road, several times on my way to town in the mornings. The cameras can get a film over them where they sort of still work but miss a bunch of stuff. Crud and even rain drops mess with the cameras too. The cameras are also placed pretty high up and miss things that are low down that the car can still hit. This is a mistake.

For people like me who are Full Self-Driving beta testers, we are so far back on the base release of software that we don't get to keep up with all the new features released to the non-beta testers. When will the base software of the beta release be closer to the software being released to the rest of the Tesla community? I get tired of reading and seeing all the great features being released to the non-beta users and having to wait for months until the beta FSD community gets to try them out. This is especially bad when we lose radar and ultrasonic sensors and they say that software will eventually fix this loss. For us beta FSD users, this "eventually" is not good enough. I've considered many times dropping out of the FSD beta just because of this software release lag, but then why did I pay so much extra for FSD if I wouldn't be able to use it until someday? This story is getting old.
@@bruceperson7820 I hear you! I'm also getting tired of missing out on the new software updates. It took forever just to get the update that fixed the USB Unavailable issue. I'm not super impressed with the FSD beta anyway, so I'm really considering dropping out because I hate waiting months for bug fixes for other issues because beta users don't get the updates like everyone else does. And I wish Tesla would stop adding games and just fix the Dashcam Viewer!
@@Resist4 omg the dashcam viewer is completely bricked in my car, is it like this for everyone? It takes 5+ minutes to load anything. I entirely stopped looking at sentry recordings. I just hope if anything happens to my car I'll be able to pull the SD card and read it elsewhere.
@@bruceperson7820 I didn't pay for FSD, but removing radar made my car worse too. I get about 3000% (3/day vs 0.5/week) more false positive forward collision warnings now. I am seriously considering setting it to late warning, but that feels like disabling a safety feature.
@@jonathansage2147 The Dashcam Viewer has not worked right since day one, and Tesla has done nothing to fix it. I rarely get full 10-minute clips; most of the time they are one-, two-, three-, four-, or five-minute clips. Often the video skips around or doesn't finish. Moving the slider often causes the video to not finish and to loop. This can't be that difficult for Tesla to fix. I think the Dashcam Viewer is more important than adding more games to my car. Elon, just fix it and give us better resolution cameras, for crying out loud! And don't tell me that the cameras were never designed to be used as a dashcam, because if so then you guys weren't thinking ahead and aren't as smart as you claim to be.
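For what it's worth, the false-positive rates quoted a few replies up (about 0.5 forward collision warnings per week before the radar cutover, roughly 3 per day after) can be checked with trivial arithmetic; the jump is about 42x, so the "3000%" figure is, if anything, an understatement:

```python
# Back-of-the-envelope check of the warning rates quoted in this thread:
# ~0.5 forward collision warnings per week before the radar cutover,
# ~3 per day afterwards.
before_per_week = 0.5
after_per_week = 3 * 7  # three warnings a day

ratio = after_per_week / before_per_week
increase_pct = (after_per_week - before_per_week) / before_per_week * 100

print(f"{ratio:.0f}x the old rate (a {increase_pct:.0f}% increase)")
```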
I'd like to push back a little on the assertion that the current vision-based system is better than the radar system was. Would appreciate hearing the basis for this claim, because it sure doesn't seem to be based on the experience of many of my friends. (I have an older car with radar.)
I'm still waiting for my Autopilot to be better… to drive as fast as it did with radar, to follow as closely… I'm still waiting for the automatic wipers to be on par with other cars, same with the automatic high beams. If you're a Tesla owner you get promised a lot, but what you actually get…
@@TC-V8 Probably. In my Model X Plaid I get high-beamed all the time, mostly by semis on the interstate, because the auto high beams are so bad: they turn on way too often. Not too big of a deal, except that in my $150,000 car I have to manually turn the high beams on and off because it can't tell if someone is in front of me!
I can relate to that... I just got my 2023 MY without USS and it is a disappointment. Even forward parking is a struggle. I don't understand why they removed the USS without replacing its function.
I picked up my new Model Y LR on March 1st of this year (2023) and its parking assistance is non-existent. ZERO warnings when parking. I even set up an obstacle with padded garbage cans and reversed into them; I bumped them with no warnings. Same thing goes for the front. I should have paid more attention to that stuff before buying the car. I would have waited. I heard the sensors were removed, but from what I was reading it seemed like "Tesla Vision" was going to be better. Honestly, I don't even think that has been rolled out software-wise, and if it has, it's absolute GARBAGE. My 22k 2014 Honda has far superior parking assistance than this 55k car. And it pisses me off that Tesla fans like the one you were talking to keep saying it's going to be better as if they are sure. When you asked him about front parking you should have pressed him on that. The Tesla camera in the front is too high up to see anything right in front of the bumper. I would believe it could be better if there were a camera on the bumper, but there is not. And knowing how cheap Elon is, there is no way I see them putting one in.
especially living here in LA, parallel parking is super important.. a tight squeeze, I'm not gonna rely on the blind spot of the camera to tell me to keep going...
I'm not against Tesla trying different things; however, it should ensure a better experience, not a worse one, after the transition. Contrary to what he says, Tesla Vision is not better in every way after the removal of the radar. After my radar was removed I am experiencing dangerous "phantom braking" in situations where it never happened before.
You know how "good" a camera is in snow/slush/rain conditions. I know for myself I had to remove dirt quite often from all the cameras, because no company has made anything to clean the damn cameras automatically.
Good video. But what evidence do you have that Autopilot is now working better with vision only than it was with vision and radar? I'm still seeing major complaints about phantom braking, and lawsuits to boot. Personally I believe having two redundant systems is always better than removing one that is known to work better in some scenarios, and I believe removing the radar was a major mistake.
My car is worse without radar. My incidence of forward collision warnings has increased several hundred percent. I used to have one every two weeks or so; I now have 3/day. That is too many false positives for a safety system. It's now the boy who cried wolf, and I will probably set it to late warning, but that feels like I'm choosing to compromise my safety for the sake of not being annoyed.
@@jonathansage2147 From what I've been reading, that's true for the majority of Tesla Model Y and Model 3 owners. Phantom braking is the reason I sold my 2020 Model Y after radar was removed permanently when I signed up for FSD beta. Vision alone is way worse than radar and vision combined. Since I wrote this reply I looked into it more; it's gotten very bad for many, but not all, Tesla owners.
@@t1328 Ouch. What are you moving to? Radar was software-removed only like a month ago (they made the announcement and cut over for new cars many months ago); you jumped ship super quickly. I'm going to hold out longer, but a Rivian R1S is super tempting.
The biggest shortcoming will be a 3'2" blind spot in front of the car and 2'8" blind spot in the rear. The limitation on both will be the camera angle shooting forward and down. Without additional cameras or new locations (Tesla said the locations will not be moving), it will be interesting to see what the solution will be or if that will be just an acceptable shortcoming moving forward... (pun intended)
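The blind-spot sizes quoted above fall straight out of the camera's mounting geometry. Here is a minimal sight-line sketch; the mounting height, bumper setback, and maximum downward view angle are illustrative assumptions of mine, not published Tesla specs:

```python
import math

# Illustrative geometry for a windshield-mounted forward camera.
CAM_HEIGHT_M = 1.3       # assumed camera height above the road
CAM_SETBACK_M = 1.0      # assumed horizontal distance from camera to bumper
MAX_DEPRESSION_DEG = 25  # assumed steepest downward angle the camera sees

def blind_zone_beyond_bumper(obj_height_m: float) -> float:
    """Distance past the front bumper at which an object of the given
    height is still invisible (its top sits below the lowest view ray)."""
    ray_reaches = (CAM_HEIGHT_M - obj_height_m) / math.tan(
        math.radians(MAX_DEPRESSION_DEG))
    return max(0.0, ray_reaches - CAM_SETBACK_M)

# A curb-height (~15 cm) object vs. something lying flat on the road:
print(f"curb hidden for {blind_zone_beyond_bumper(0.15):.2f} m past the bumper")
print(f"flat object hidden for {blind_zone_beyond_bumper(0.0):.2f} m past the bumper")
```

With these made-up numbers the hidden zone is on the order of 1-2 m, the same ballpark as the 3'2" figure above; the exact value depends entirely on the real mounting geometry.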
Thanks for the insight. I have owned and driven a Model S for five years. Love the car, but the software has been all over the place. I have a history in software and think it's safe to say that Tesla has not delivered on its promises. The whole story about removing sensors because they believe cameras will do a better job has either not been explained to the public very well, or it's a lot of wishful thinking that might work one day. In the meantime Tesla is still delivering cars with incomplete software, which can lead to users making assumptions that could end up hurting them.
Looking forward to the whole interview! Don't worry about making the segments seem like they are from independent tapings. Just release the clips, we all know it was from one sit down. That is what Lex and Rogan do.
I'm guessing this was a really long, more structured conversation, and Dave is releasing it in small parts. My first thought is that it will be good for the channel and good for topic organization.
High curbs cost me $1,800 twice: once at a new Starbucks, once on a narrow out-of-town residential street making a Y turn with my ex "distracting me"... right after just getting it fixed! The first time, car insurance covered it and dinged me for three years with higher premiums; the second time I just said old car, let it hang down... until I get my Cybertruck, and then let FSD use "vision" to figure it out. Of course, the Cybertruck won't need to worry about high curbs! Lol. Who needs ultrasound?... oh yeah, bats! Lol. SoCalFreddy
Ultrasonic sensors are inexpensive, just bought a 4-pack for my SUV for $25. And yes, they work fine. Radar is probably expensive, however. (Which my Model 3 has)
Thinking about smart summon and other self driving scenarios, what about a cat or a dog that takes a nap near or under the car where the cameras cannot see them? Since the front facing cameras would be blocked by the frunk, might the sonar detect an animal under or near the front bumper better than the cameras?
The issue is where the cameras are mounted on Tesla vehicles. The angle from the front camera creates a blind spot for objects about 12-inches tall, three feet or less from the front bumper. I hope they can fix that with the new 5MP cameras.
The camera resolution doesn't fix that. What fixes that is remembering what was there, even if you turn the car off. Of course, that only helps for things that don't move, like curbs, not for things like dogs or your kid's scooter.
@@pfcrow Or I put a non-standard pair of wheels on my car and now all of its math is wrong. Or the object moves. Or I jockey the car several times with the object out of frame and the map progressively drifts further and further from reality. Or I turn the car off and it forgets everything that was out of frame. Or I install a software update and it forgets everything out of frame. Or ANY number of other totally plausible edge cases.
@@pfcrow Or your kid. Kids running around, sleeping animals, etc. are not exactly edge cases; they are real-world everyday occurrences which the old system would have coped with fine. I have a theory that American parking bays are huge compared to the rest of the world, so Elon and the rest are not that concerned about it, or at least they don't appear to be moving very fast on this. Doing this internally while beta testing is fine. Rolling out the "feature" to the whole world and inconveniencing people who have purchased your product, making it far less safe, is not.
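The nonstandard-wheel objection raised in this thread is easy to quantify: dead-reckoned distance scales linearly with the tire diameter the software assumes, so even a small mismatch drifts the remembered map. A sketch with made-up diameters:

```python
import math

def odometry_distance_m(revolutions: float, tire_diameter_m: float) -> float:
    """Distance the car computes from counted wheel revolutions."""
    return revolutions * math.pi * tire_diameter_m

ASSUMED_DIAMETER_M = 0.70  # diameter the software was calibrated for
ACTUAL_DIAMETER_M = 0.67   # slightly smaller aftermarket tire (hypothetical)

revs = 10.0  # roughly one short parking maneuver
believed_m = odometry_distance_m(revs, ASSUMED_DIAMETER_M)
actual_m = odometry_distance_m(revs, ACTUAL_DIAMETER_M)

drift_cm = (believed_m - actual_m) * 100
print(f"car believes {believed_m:.2f} m, actually moved {actual_m:.2f} m "
      f"({drift_cm:.0f} cm of error in one maneuver)")
```

Nearly a meter of error over a ~22 m maneuver, far too much for centimeter-scale parking, unless the system continuously recalibrates the effective wheel size.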
What I want to know about is the cameras they are using. How do they deal with darkness, rain, snow, or any vision impairment? Do they incorporate night vision or FLIR? Part of making FSD better than a human driver is making it see better than a human. I have thrown these questions out there many times and have yet to get an answer. I would love to know your thoughts, Dave.
Paul... unfortunately it is very hard to get responses to YouTube videos. Some producers are very good at it, but they are the exception! I have never gotten an answer from Dave Lee, sadly enough.
@@TeNgaere Thanks Marvin. I guess I will keep plugging away at it. So far the only response I have gotten was from Warren R, but he would answer my questions with other questions, in a way I perceived as condescending. I am trying to ask questions that I personally have not heard asked before. If I could get some of these influencers to put these questions to the masses, it might be a benefit.
Paul, I am disappointed that some of the well-known Tesla YouTube influencers are not asking for these simple software additions. They spend a lot of time talking about video game or music additions but nothing about completing features like Bird's Eye View or cross traffic alerts. Go figure. I have texted Elon many times, but no response, of course.
@@TeNgaere one thing i have been pondering is the feasibility of dynamic inductive charging for EVs built into roads. I have no idea what the costs would be but it would be interesting to have EVs that weigh much less as they will not need as large of a battery pack therefore more efficient. Just thought I would run it by you
Hi David, I like and follow your channel, and this is the first time I post a response. I think that here you are not asking the critical questions. The camera on a Tesla sits at the top of the windscreen and cannot possibly see closer than about one meter in front of the car at the very bottom. How can this be better than today's ultrasonics? Another thing: not everyone who buys a Tesla lives in California. What about those who drive where it rains or snows a lot? The camera becomes quite useless then, doesn't it?
There are a million cases cameras cannot handle at all, or not as well as sensors and radar, and those are not corner cases. For example: if the camera is dirty, if it is completely dark, if strong light shines into the camera, anything less than 1 m in front of the car, an object emerging from behind another object in that blind spot, pulling out of a garage (pretty much all modern cars have radars in the corners telling you if something is coming), etc. Those are daily situations.
Those who complain over removing the radar will complain over removing the ultrasound sensors. They say things like the radar works in fog, and that's true. But they forget that you can't drive with only radar! So it's an incomplete, faulty argument. If you can't see, you can't drive. It's as simple as that. I would complain that the cameras will be covered with dirt in some situations (like winter, when they spread salt to melt the ice away) and need some cleaning mechanism. That problem might be too difficult for Tesla's engineers, though! Our eyes can blink; perhaps Tesla's cameras could do that too! Of course cameras can do just as good a job as ultrasound sensors. Any engineer would think so!
So, the vision-only solution is still well inferior to the radar solution. James said they caught up, but that's simply not true: ACC has a lower speed limit and a much longer minimum following distance! Tesla just forced this vision-only update onto my car with radar and I lost functionality, which also cost me about 30 miles of range. Radar can also see a car stopping several cars ahead, which vision will never be able to do, so it also affects safety. Radar should be an option.
Technically, the human eye's electromagnetic frequency range is limited at the ultraviolet and infrared ends of the spectrum. Camera sensors can be made to extend beyond those limits, even into night vision. In intense rain or snow storms, radar may have some advantages in detecting road edges, obstacles, etc.
The update on vision is great, but they still need the sensors for other advancements, like emergency braking and getting close to things. Self-parking will surely not be a thing anymore. They have removed functionality that is not replaced by vision, and that's ridiculous.
I would strongly disagree with the current layout of Tesla's cameras. The only chance they can pull it off with this camera layout is to map and remember the area before you park, for example. I just can't imagine how precisely they can monitor the front right and left corners of the car.
I live in a snowy/slushy area in winter with salted roads. The front cameras are cleared by the wipers. The left/right rear-looking cameras are well protected and almost never need cleaning. The left/right front-looking cameras can get dirty, but it doesn't seem to happen often; occasionally I will give those a wipe. The back camera gets dirty all the time, covered in salt; I clean that camera at least once a week. The operator will likely need to clean a little periodically. It seems to take a few trips to obscure the rear camera, so it doesn't get clouded in a single trip.
This guy speaks with such confidence about things that are so obviously untrue. There is no defence for removing what is, for some, a critical aid BEFORE you have the replacement ready. It just shows utter contempt for your customers. There's no way camera-based systems can be better than USS and/or radar. It's cost cutting, pure and simple.
I get that as the car moves it can assume an object is still there, but what about when you've been parked and the objects around you (e.g. garbage cans) have moved by the time you get back?
Yes, fellow viewers, it is in fact becoming more and more difficult to determine if the story is true or not. You see, fellow viewers, “the people who make cars sometimes have lapses in their own reasoning, regarding safety features, unfortunately”.
Still no update for the removal of USS! I don't really care if the vehicle will have HW3 or HW4. I just want some form of USS to be installed back into these cars! I will not be a FSD adopter when I purchase a MSP and I don't have faith that if I purchase a MSP now Tesla will offer a free or paid retrofit to return some kind of USS back into these vehicles that have already been delivered.
I think the flaw in the logic here is that people DON'T just judge distance with their eyes. It may seem like we do, but our other senses play into how we judge things, too. There are very few times when multiple senses aren't used. For example, what you taste is influenced as much by your sense of smell and sight as by your actual taste buds. Soldiers have found their sense of hearing is affected by wearing a helmet, even when it doesn't cover their ears (sound reverberates off your skull, too). I think Tesla is going to find (the hard way) that multiple signals are really needed for the kind of fidelity they want. And personally, I think that where safety features are concerned, more input signals are better than fewer. But let's be clear: ultrasonics are going away to reduce cost and complexity. And it's happening now because they already made the decision to drop ultrasonics, and they can't change their mind and get the parts now... suppliers have reallocated supply elsewhere. So now they're forced to go without and have the vision software catch up.
I don't need a sensor to tell me how close my front bumper air dam is to a curb. After ramming my 2000 BMW Z3's air dam over a 6+" parking curb ($900), I always stop short of curbs.
Not necessarily. They get parallax by comparing subsequent frames when the car is moving, so any error is quickly corrected. I have a family member with only one eye, and he drives just fine.
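The motion-parallax point in this reply is real geometry: under pure forward motion, a point off the optical axis drifts radially outward between frames, and its depth follows from the pinhole model by similar triangles. A toy sketch with made-up pixel values (this is the textbook structure-from-motion relation, not Tesla's actual pipeline):

```python
def depth_from_forward_motion(r1_px: float, r2_px: float, travel_m: float) -> float:
    """Depth of a point at the time of the FIRST frame.
    r1_px, r2_px: the point's radial distance from the image center in
    two frames taken while driving straight toward it; travel_m: the
    distance driven between frames (the effective 'baseline')."""
    # From the pinhole model r = f*X/Z:  Z1 = travel * r2 / (r2 - r1)
    return travel_m * r2_px / (r2_px - r1_px)

# Synthetic example: a feature 200 px off-center drifts to 250 px
# after the car rolls forward 1 m.
z1 = depth_from_forward_motion(200.0, 250.0, 1.0)
print(f"estimated depth at first frame: {z1:.1f} m")
```

It also shows the catch the thread keeps raising: with no motion (or an object that appeared while the car sat parked) there is no parallax, hence no depth.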
What FSD is missing which would be the most enabling is networking with other vehicles and infrastructure, and even potentially pedestrians, which would give precise feedback and enable coordination. The network would necessarily be incomplete when first activated, but even occasional inputs would be helpful in identifying errors, and in some cases correcting them in real time. The data requiring integration and interpretation would be orders of magnitude lower in volume than for the vision-based portion of the system, and would normally be precise and reliable.
I feel like if Dave invited me over for a family dinner I would be just finishing up my meal and he would abruptly just say "alright, thanks." and stand up to usher me out the door haha.
I'll be releasing the rest of this interview probably tomorrow morning. It's almost 2 hours. James and I are also recording a more technical video as well going over AI Day slides. That will come out in a few days. 😉
I can bet my money Tesla must have tested and verified that they can do it before physically removing the sensors.
There was news the other day that GM cannot deliver cars due to a shortage of these sensors. So Tesla hit with the same shortage just said, cancel the features for now. We can enable it later without them.
Hi Dave, I am a long time Tesla investor and owner and visiting Austin next week, do you have any tips on how to get a peek inside Giga Austin? Please let me know if you have any tips, thank you!
@@sudeeptaghosh For sure. Tesla's primary concern is safety. Also, they are driven purely by profits, like most other corporations. However, in this case safety and profit coincide, as the added software capability has little or no added cost. It is a double win.
Can't wait. I always enjoy the conversation with James!
Apparently Elon has never got the “Camera obscured” message that I ALWAYS get while driving at night in high humidity areas.
I get a similar message sometimes when a camera is lined up with the sun. The camera over-loads and Tesla is blinded.
@@alldeeplearning949 Maybe apply some heat to the glass in front of the camera.
Maybe install a sensor that can see more than the human eye?
And I expect he’s never driven country roads in snowy weather…
@@SetTheCurve Name one?... I'm sure NASA will be wanting it.
7 MINUTES? DAVE WE NEED A 4 HOUR COME ON!
Hopefully this is just a teaser 😅
Can’t get enough of this double intellect
😆😆😆
1 minute ought to be enough to say: it's not needed, we dismissed it and saved a lot of money.
@@dewiz9596 lol… don't overload the data centers with stuff that uses more energy to run.
I wish Tesla had cross traffic alerts when backing up.
Many other cars have it and it is a great feature to reduce accidents when reversing.
So true. Can't figure why he doesn't add that with Bird's eye view for parking. It would take 1/1000th of the software engineering effort of FSD and would be something my wife would really use!!
They don't? I mean, I own a Mazda it has that...
@@quixomega A lot of OEMs have that, but for some unexplained reason, Tesla doesn't. No excuse, as they have the software expertise to do it.
@@quixomega They have the most advanced tech... like a valve that has no use to the driver. But the more important tech they are actively deleting.
The reason is that Elon is cheap and is maximizing profit margins on each car. The fact that he removed the USS without having a viable alternative at all was a ridiculous decision. Not having cross traffic backup alerts and bird's eye view are results of this cost-savings mentality they have. It is frustrating to pay this much for a car and not get basic features other cars have in that price range.
Ultrasonic sensors were giving accuracy in centimeters, which was really useful. I doubt cameras can do that accuracy atm.
At the moment
@@reggiebuffat then why are they doing it now ?
@@sebastiangeorge7714 Money saving.
No, ultrasonic sensors are "guessing" distance, and not very accurately. Vision will be millimeter-accurate, as they say in this video.
@@jesperalstrup2985 For now vision is not working at all; that's way more guessing.
I love these in person interviews! Can't wait to see the rest of these conversations.
I have missed James. Such a great guy. Thank you Dave.
I am impatiently waiting for his assessment of AI Day 2.
Me too can’t wait.
2 issues:
1) The steering projection lines in the backup camera are terrible at tracking where I’m going. It needs to be recalibrated.
2) None of the cameras actually show me the immediate surrounding of my vehicle.
That's one thing I never understood: how a car with cameras all around it still cannot do that, let alone give us that 360-degree bird's-eye view.
I feel a little smarter every time you have James on your podcast
Thanks Dave
Lol, James has no clue what he's talking about. Sounds like a marketing guy discussing engineering.
James is clueless. How the heck can a camera be better than USS that's in the bumper, when the cameras can't even see anything below the bumper?
No, the cameras will NOT be better inching towards a curb than ultra-sonic (your second question) because they cannot see objects lower than a foot, up to 3 feet in front of the car. This is really about saving money and arguably materials / resources, but let's not say it's about a superior product. It's also worth noting the cameras are being upgraded from 1.2MP to 5MP and it's not clear whether those cameras will be needed to replace all of the functionality of the sensors being removed, or if it can be achieved with cameras on "legacy" vehicles. Tesla Vision doesn't do a better job than radar for AP especially at night and when it's raining (I'm in the FSD Beta and my car is from 2018).
Actually, if you are moving toward a curb, and were more than 3 feet away when you started, the measured wheel rotation in conjunction with the camera's last identified location of the curb, will work. However, if an object, like a bike or skateboard, was placed within the camera’s blind spot (below a foot in elevation up to three feet in front of the car) while the car was parked, it would have no "memory" of the object to work with.
@@markwilliams4671 unless it is in active tracking mode during the time
@@markwilliams4671
Rain?
And that's why a person gets a lower safety score on FSD driving at night: the cameras simply can't see!
@Lashiv b I wouldn't bet on them. I am driving a 2023 MY without USS and it is a disappointment; forward parking is a struggle.
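The "remember the curb, then dead-reckon with wheel rotation" behavior described earlier in this thread can be sketched as a tiny state machine. Names and numbers here are mine for illustration, not anything from Tesla's software:

```python
class CurbMemory:
    """Track distance to a curb after it drops below the camera's view.
    Works only if the remembered object doesn't move; it cannot know
    about anything placed in the blind zone while the car was parked."""

    def __init__(self):
        self.remembered_range_m = None  # last camera-measured distance

    def camera_update(self, curb_range_m: float) -> None:
        """Called while the curb is still visible to the camera."""
        self.remembered_range_m = curb_range_m

    def odometry_update(self, travel_toward_m: float) -> None:
        """Called with wheel-odometry travel toward the curb."""
        if self.remembered_range_m is not None:
            self.remembered_range_m -= travel_toward_m

    def estimated_range_m(self):
        return self.remembered_range_m

mem = CurbMemory()
mem.camera_update(1.2)    # curb last seen 1.2 m ahead, then leaves view
mem.odometry_update(0.5)  # wheels report 0.5 m of forward creep
print(f"{mem.estimated_range_m():.2f} m of curb clearance left")
```

The limitation is exactly the one raised in the replies: the estimate is only as good as the assumptions that nothing entered the blind zone and that the odometry is calibrated.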
It's crazy how much more enjoyable these are to watch when you are both talking in person. Really feels different, in a good way.
2 Problems: 1) You don't have 4 cameras at bumper level. 2) You never are going to get superior with camera alone to camera + ultrasonic. More inputs is always going to be better than fewer. ESPECIALLY in inclement weather. Ultrasonic function normally when covered in water drops. Cameras, not so much.
Tesla removed the sensors so they don’t have to worry about using extra computer chips, and so they can ship more cars.
Yes! More sensors will always be better and smarter… a car can fly too if strapped to a rocket booster… but you have to consider the business side too… are cameras enough? Yes! Humans can use their eyes to drive! So cameras are enough for most cases. I think they want to reduce the manufacturing cost.
@@stevenlee1726 I understand why they did it, I just think it's short sighted. Auto makers have been installing ultrasonic sensors in cars for years just to aid human drivers, and they are cheap. Every time I drive my Tesla in anything more than a drizzle full self driving goes bye-bye (sometimes the side cameras fog up just from the cold, and Tesla tries to tell me they are working as designed) No way in hell they get to level 5 with cameras alone.
@@daveinpublic ultrasonic sensors are very cheap, including their chips; they never had supply problems and can be manufactured in any country. Unlike CPUs and GPUs that need state-of-the-art fabs.
@@stevenlee1726 as J.P.M already said. All cars nowadays have ultra sonic sensors at least as an option and they are dirt cheap.
Cameras are enough for most situations, yes, but what about the situations that aren't? How much is most? The cameras and auto options in my father's Tesla stop working in rain, snow and thick fog, in situations where the radar and ultrasonic sensors would work perfectly fine.
The radar works in all conditions unlike a camera, might as well use it.
One of the best things about our 2012 Nissan Leaf was the 360 degree overhead view cameras. They had cameras on each side of the car and in the front and back. On the main screen in the dash, they had an overhead picture of the car, and real time video facing all 4 directions. It was brilliant. It made parking, backing up, etc. a breeze. It is amazing to me that here it is 10 years later and this isn't available on $60,000 vehicles. The back up camera on the Tesla is excellent, but the side cameras face the rear, and the only front camera is behind the rear view mirror, so it can only see things about 4 ft away from the car, so you have no idea how far away from the curb you are, or if you're about to drive over something breakable.
My Range Rover had it and it was great. I just expected the Model Y to have it because the system is so much smarter, but seems not haha
So true. We have the old Leaf and 2024 TMY. It's amazing how the Leaf can be so much better at parking. And it has no sensors whatsoever!
@@derwaechter01 Fortunately, we bought our Y in 2022 when they still had the ultrasonics. Still, doesn't come close to the aerial view of the LEAF.
Love the studio, looks great!!
glad to see and hear James in HD!
I was wondering where Douma was. He needed a week to absorb AI day 🤣🔥
😅
I've learned so much from these interviews with James. He's such an intelligent, humble and great explainer of everything. Thank you Dave for spreading this knowledge.
Yeah, as a Tesla owner I can guarantee you together with all Tesla owners that this is not knowledge, it's bullshit.
Removing the sensors prior to having the software ready is a huge mistake. And removing radar was a mistake too, since Tesla turned my radar off I have way more phantom braking events. Vision only will never work without more sensors. Tesla removing things is all about reducing their costs, yet it never lowers the price of their vehicles for consumers. But this was a good interview.
I too am having many situations where I get "phantom braking" from overhead shadows on the road, several times on my way to town in the mornings. The cameras can get a film over them where they sort of still work but miss a bunch of stuff. Crud and even rain drops mess with the cameras too. Also the cameras are placed pretty high up and miss things that are low down that the car can still hit. This is a mistake. For people like me who are Full Self-Driving beta testers, we are so far back on the base release of software that we don't get to keep up with all the new features released to the non-beta testers. When will the base software of the beta release be closer to the software being released to the rest of the Tesla community? I get tired of reading and seeing all the great features being released to all the non-beta users and having to wait for months until the beta FSD community gets to try them out. This is especially bad when we lose radar and ultrasonic sensors and they say that software will eventually fix this loss. For us beta FSD users, this "eventually" is not good enough. I've considered many times dropping out of the FSD beta just because of this software release lag, but then why did I pay so much extra for FSD if I wouldn't be able to use it until someday? This story is getting old.
@@bruceperson7820 I hear you! I'm also getting tired of missing out on the new software updates. It took forever just to get the update that fixed the USB Unavailable issue. I'm not super impressed with the FSD beta anyway, so I'm really considering dropping out because I hate waiting months for bug fixes for other issues because beta users don't get the updates like everyone else does. And I wish Tesla would stop adding games and just fix the Dashcam Viewer!
@@Resist4 omg the dashcam viewer is completely bricked in my car, is it like this for everyone? It takes 5+ minutes to load anything. I entirely stopped looking at sentry recordings. I just hope if anything happens to my car I'll be able to pull the SD card and read it elsewhere.
@@bruceperson7820 I didn't pay for FSD, but removing radar made my car worse too. I get about 3000% (3/day vs .5/week) more false positive forward collision warnings now. I am seriously considering setting it to late warning, but that feels like disabling a safety feature.
@@jonathansage2147 The Dashcam Viewer has not worked right since day one. And Tesla has done nothing to fix it. I rarely get full 10 minute clips. Most of the time they are one minute, two minute, three minute, 4 minute or 5 minute clips. Often the video skips around or doesn't finish. Moving the slider many times causes the video to not finish and loop. This can't be that difficult for Tesla to fix. I think the Dashcam Viewer is more important than adding more games to my car. Elon just fix it and give us better resolution cameras for crying out loud! And don't tell me that the cameras were never designed to be used as a dashcam, because if so then you guys weren't thinking ahead and aren't as smart as you claim to be.
I'd like to push back a little on the assertion that the current vision-based system is better than the radar system was. Would appreciate hearing the basis for this claim, because it sure doesn't seem to be based on the experience of many of my friends. (I have an older car with radar.)
I'm still waiting for my Autopilot to be better… to drive as fast as with radar, to drive as close… I'm still waiting for the automatic wipers to be on par with other cars, same as the automatic high beams.
If you’re a Tesla owner you get promised a lot but what you actually get…
I often find I get dazzled by Tesla M3s - do you think this is due to their automatic high beams?
@@TC-V8 Probably. In my Model X Plaid I get blinded all the time, mostly by semis driving on the interstate, because the auto high beams are sooo bad they turn on wayyy too much. Not too big of a deal, except that in my 150,000 dollar car I have to manually turn the high beams on and off because it can't tell if someone is in front of me!!
I can relate to that ... I just got my 2023 MY without USS and it is a disappointment. Even forward parking is a struggle. I don't understand why they removed USS without replacing its function.
I picked up my new Model Y LR on March 1st of this year (2023) and its parking assistance is non-existent. ZERO warnings when parking. I even set up an obstacle with padded garbage cans and reversed into them. I bumped them with no warnings. Same thing goes for the front. I should have paid more attention to that stuff before buying the car. I would have waited. I heard the sensors were removed, but from what I was reading it seemed like "Tesla Vision" was going to be better, but honestly I don't even think that has been rolled out software-wise. If it has, then it's absolute GARBAGE. My 22k 2014 Honda has far superior parking assistance than this 55k car.
And it pisses me off that these Tesla fans like the one you were talking to keep saying that it’s going to be better as if they are sure. When you asked him about the front parking you should have pressed him on that. The Tesla camera in the front is too high up to see anything right in front of the bumper. I would believe that would be better if there was a camera on the bumper, but there is not. And knowing how cheap Elon is there is no way I see them putting one in.
There's a 3-foot blind spot in front of the car, and there's also an area behind the car where those sensors may help with backing up into a rock.
especially living here in LA, parallel parking is super important.. a tight squeeze, I'm not gonna rely on the blind spot of the camera to tell me to keep going...
Very timely and informative. Thank you. James answered the question I had (as I'm sure many others did) since the news broke.
I'm not against Tesla trying different things, however it should ensure a better experience, not a worse one, after the transition. Contrary to what he says, Tesla Vision is not better in every way after the removal of the radar. After my radar was removed I am experiencing dangerous "phantom braking" in situations where it never happened before.
ooooh.. I sense a long conversation with James split out into daily chunks coming! Cant wait. Thanks as always for doing this.
You know how good a camera is in snow/slush/rain conditions. I know for myself I had to remove dirt quite often from all the cameras, because no company makes something to clean the damn cameras automatically.
This is just mind blowing. The confidence in vision to make such a huge decision. With existing HW, not even hi res cameras like Nio, or HW4 FSD
Great stuff guys. Can't wait until "tomorrow" 🙂
Thanks Dave and James
Good video. But what evidence do you have that Autopilot is working better with vision only than it was with vision and radar? I'm still seeing major complaints about phantom braking, and lawsuits to boot. Personally I believe having two redundant systems is always better than removing one that is known to work better in some scenarios, and I believe removing the radar was a major mistake.
The only real reason it was removed was probably the chip supply shortage.
@@nulian I think it was just cost cutting personally.
My car is worse without radar. My incidence of forward collision warnings has increased several hundred percent. I used to have one every two weeks or so. I now have 3/day. That is too many false positives for a safety system. It's now the boy who cried wolf, and I will probably set it to late warning, but that feels like I'm choosing to compromise my safety for the sake of not being annoyed.
@@jonathansage2147 from what I’ve been reading, that’s true of the majority of Tesla Model Y and Model 3 owners. Phantom braking is the reason I sold my 2020 Model Y after radar was removed permanently when I signed up for FSD beta. It’s way worse than radar and vision combined. Since I wrote this reply I looked into it more. It’s gotten very bad for many, but not all Tesla owners.
@@t1328 ouch. What are you moving to? Radar was software removed only like a month ago (they made the announcement and cut over for new cars many months ago), you jumped ship super quickly. I'm going to hold out for longer, but a Rivian R1S is super tempting.
Dave, how can I get to see inside Giga Austin next week any tips?
Ok, ok,ok! Next they will get rid of the headlights, I suppose?
very cool to see another in person interview with you two!
The biggest shortcoming will be a 3'2" blind spot in front of the car and 2'8" blind spot in the rear. The limitation on both will be the camera angle shooting forward and down. Without additional cameras or new locations (Tesla said the locations will not be moving), it will be interesting to see what the solution will be or if that will be just an acceptable shortcoming moving forward... (pun intended)
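The geometry behind blind-spot figures like these can be approximated with similar triangles: a camera mounted high up has a lowest viewing ray, and anything short enough to sit entirely below that ray is invisible. A rough sketch; the camera height, ground-hit distance and bumper setback here are made-up numbers, not Tesla's actual specs:

```python
# Similar-triangles model of the forward blind spot. A camera at height
# cam_height_m first sees the ground at ground_hit_m (measured from the
# camera; flat ground and a straight-line ray assumed). An object of height
# obj_height_m is fully below that lowest ray, hence invisible, while it is
# closer than the returned distance. All constants below are illustrative.

def hidden_distance(cam_height_m: float, ground_hit_m: float,
                    obj_height_m: float) -> float:
    """Distance from the camera within which the object is entirely hidden."""
    return ground_hit_m * (1.0 - obj_height_m / cam_height_m)

CAM_HEIGHT_M = 1.3      # assumed windshield-camera height
GROUND_HIT_M = 4.0      # assumed point where the lowest ray meets the ground
BUMPER_SETBACK_M = 1.0  # assumed camera position behind the front bumper

for h in (0.0, 0.15, 0.30):
    d = hidden_distance(CAM_HEIGHT_M, GROUND_HIT_M, h) - BUMPER_SETBACK_M
    print(f"object {h:.2f} m tall: hidden within ~{d:.2f} m of the bumper")
```

Note how taller objects poke above the ray sooner, so the hidden zone shrinks as object height grows; the curb-height stuff discussed in this thread sits near the worst case.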
"The best part is NO part"
Thanks for the video, very informative
Another great job, thanks.
It's always nice to see James, but I wish it wasn't a "quick" video, but rather a LONG video discussing AI Day 2022.
Kepp me from curbing the wheels - I'll be ecstatic.
Dave being back
Thanks for the insight. I have owned and driven a Model S for five years. Love the car, but the software has been all over the place. I have a history in software and think it's safe to say that Tesla has not delivered on its promises. The whole story about removing sensors because they believe cameras will do a better job has either not been explained to the public very well, or it's a lot of wishful thinking that might work one day. In the meantime Tesla is still delivering cars with incomplete software, which can lead to users making assumptions that could end up hurting them.
Looking forward to the whole interview! Don't worry about making the segments seem like they are from independent tapings. Just release the clips, we all know it was from one sit down. That is what Lex and Rogan do.
I think many will agree with me that right after A. I. Day Part 2 we were waiting to have you talk with James. The more conversations the better!!
You two are going to chat on a daily basis?
AWESOME 👍
I'm guessing this was a really long, more structured conversation, and Dave is releasing it in small parts. My first thought is that it will be good for the channel and good for topic organization.
Yeah, tbh for me something an hour long is usually a no on the whole watch. 5-20 minutes is more like it
Cool new format!
High curbs cost me $1,800 twice, once at a new Starbucks, once on a narrow out-of-town residential street making a Y turn with my Ex 'distracting me'... after just getting it fixed! First time car insurance covered it and dinged me for three years with higher premiums , second time I just said old car, let it hang down... until I get my Cybertruck and then let FSD use 'vision' to figure it out... of course, the Cybertruck won't need to worry about high curbs! Lol Who needs ultrasound?... oh yeah, bats! Lol SoCalFreddy
Thanks, Dave and James. Very helpful. Looking forward to the next talk 👍
Biggest impact I see is repair time and cost! Sensors are why a bumper cover replacement can be thousands of dollars.
Ultrasonic sensors are inexpensive, just bought a 4-pack for my SUV for $25. And yes, they work fine. Radar is probably expensive, however. (Which my Model 3 has)
Thinking about smart summon and other self driving scenarios, what about a cat or a dog that takes a nap near or under the car where the cameras cannot see them? Since the front facing cameras would be blocked by the frunk, might the sonar detect an animal under or near the front bumper better than the cameras?
The issue is where the cameras are mounted on Tesla vehicles. The angle from the front camera creates a blind spot for objects about 12-inches tall, three feet or less from the front bumper. I hope they can fix that with the new 5MP cameras.
The camera resolution doesn't fix that. What fixes that is remembering what was there, even if you turn the car off. Of course, that only helps for things that don't move, like curbs, not for things like dogs or your kid's scooter.
@@pfcrow Or I put a non-standard pair of wheels on my car and now all of its math is wrong. Or the object moves. Or I jockey the car several times with the object out of frame and the map progressively drifts further and further from reality. Or I turn the car off and it forgets everything that was out of frame. Or I install a software update and it forgets everything out of frame. Or ANY number of other totally plausible edge cases.
@@pfcrow Or your kid. Kids running around or sleeping animals etc are not exactly edge cases, they are real world everyday occurrences which the old system would have coped with fine. I have a theory that American parking bays are huge compared to the rest of the world so Elon and the rest are not that concerned about it, or at least they don't appear to be moving very fast on this. Doing this internally while beta testing is fine. Rolling out the 'feature' to the whole world and inconveniencing people that have purchased your product, making it far less safe is not.
What I want to know about is the cameras they are using. How do they deal with darkness, rain, snow or any vision impairment? Do they incorporate night vision or FLIR? Part of making FSD better than a human driver is making it see better than a human. I have thrown these questions out there many times and have yet to get an answer. I would love to know your thoughts Dave.
Photon counting - ua-cam.com/video/CX6WX1ONwpA/v-deo.html
Paul... Unfortunately it is very hard to get responses to UA-cam videos. Some producers are very good at it, but they are the exception! I have never gotten an answer from Dave Lee, sadly enough.
@@TeNgaere Thanks Marvin. I guess I will keep plugging away at it. So far the only response I have gotten was from Warren R but he would answer my questions with other questions in what I perceived as condescending. I am trying to throw questions out there that I personally have not heard asked before. If I could get some of these influencers to get these questions to the masses, it may be a benefit.
Paul
I am disappointed that some of the well known Tesla UA-cam influencers are not asking for these simple software additions. They spend a lot of time talking about video game or music additions but nothing about completing features like Bird's Eye View or cross traffic alerts. Go figure. I have texted Elon many times but no response of course.
@@TeNgaere one thing i have been pondering is the feasibility of dynamic inductive charging for EVs built into roads. I have no idea what the costs would be but it would be interesting to have EVs that weigh much less as they will not need as large of a battery pack therefore more efficient. Just thought I would run it by you
Thanks Great discussion
Best intro of all Tesla content providers, no contest
Hi David,
I like and follow your channel, and this is the first time I've posted a response. I think you are not asking the critical questions here. The camera on a Tesla sits at the top of the windscreen and cannot possibly see closer than one meter in front of the car at the very bottom. How can this be better than today's ultrasonics? Another thing: everyone who buys a Tesla doesn't live in California. What about those who drive where it rains or snows a lot? The camera becomes quite useless then, doesn't it?
Seems like the Shure SM7B is just the best on the market; the sound is just perfect.
There are a million cases cameras cannot handle at all, or not better than sensors & radar. Those are not corner cases. For example: if the camera is dirty, if it is completely dark, if strong light shines into the camera, anything less than 1 m in front of the car, an object coming from behind another object in that blind spot, getting out of a garage (pretty much all modern cars have radars in the corners telling you if there is something coming), etc. Those are daily situations.
I think they will need another camera in front like other vehicles have had for some time now.
What happens when it rains? I always get a message that at least one of the camera is blocked. MX 2018
You guys rock, thnx for all the high quality conversations!
Those who complain about removing the radar will complain about removing the ultrasound sensors.
They say things like the radar works in fog, and that's true. But they forget that you can't drive with only radar!
So, it's an incomplete faulty argument. If you cant see, you can't drive. It's as simple as that.
I would complain that the cameras will be covered with dirt in some situations (like winter, when they spread salt to melt the ice away) and need some cleaning mechanism.
That problem might be too difficult for Tesla's engineers, though!
Our eyes can blink. Perhaps Teslas cameras could do that too!
Of course cameras can do just as good job as ultrasound sensors. Any engineer would think so!
So, the vision-only solution is still well inferior to the radar solution. James said they caught up, but that's simply not true. ACC has a lower speed limit and a much longer minimum following distance! Tesla just forced this vision-only update onto my car with radar and I lost functionality, which also cost me about 30 miles in range. Radar can also see a car stopping several cars ahead, which vision will never be able to do, so it also affects safety. Radar should be an option.
Technically the human eye has electromagnetic frequency range limitations that end at the ultraviolet end and the infrared end of the spectrum. The camera sensors can be made to extend beyond these end limit ranges even into night vision. In intense rain or snow storms radar may have some advantages in detecting road edges and obstacles etc.
Nice studio and video.
what about when my car wakes up near a large hump not visible and I try to go forward??
The update on vision is great but they still need the sensors for other advancements, like emergency braking and getting close to things. Self-parking will surely not be a thing anymore. They have removed functionality that is not replaced with vision, and that's ridiculous.
Sick studio!
I would strongly disagree with the current layout of Tesla cameras. The only chance they can pull it off with this camera layout is to map and remember the area before you park, for example. I just can't imagine how precisely they can monitor the front right and left corners of the car.
Bats are WAY better at navigation in low light than any pair of eyes even those of owls
Very helpful thanks
Do the Tesla cameras ever get dirty? Are they unaffected by snow, frost, rain, dust, mud, etc.? Will Teslas only be sold in sunny areas?
I live in a snowy/slushy area in winter with salted roads. Front cameras are cleared by the wipers. The left/right back looking cameras are well protected and almost never need cleaning. The left/right front looking cameras can get dirty, but it doesn't seem often; occasionally, I will give those a wipe. The back camera gets dirty all of the time, covered in salt. I regularly clean that camera at least once a week. The operator will likely need to periodically clean a little. It seems to take a few trips to obscure the rear camera, so it doesn't seem to get clouded in a single trip.
Great new look! Keep up the awesomeness 👌
You guys are great together 😎😎
There's no front camera on the bumper. That's the problem I see. The cameras can't see down by the lower bumper.
The cameras determine the distance; the wheel sensors allow the car to move enough to avoid the obstacle. Great engineering solution..
Nope!!!
This guy speaks with such confidence about things that are so obviously untrue. There is no defence for removing what for some is a critical aid BEFORE you have the replacement ready. Just shows utter contempt for your customers. There’s no way camera based systems can be better than USS with / or Radar. It’s cost cutting pure and simple.
Nice set
My guess is that the new occupancy model makes ultrasonics superfluous, and since they are not perfect they might even confuse the AI.
Thank you Dave.
James D is such an excellent interviewee; I only wish he'd go to work at Tesla 😳
Cameras also don't work at certain solar angles; the sun blinds the car. FSD or Summon won't work.
My only concern is when my rear camera is full of snow and dirt in winter.
the whole point of having those USS is my eyes can't see what's around the bumpers....
Seems Tesla needs 1 more camera on the front bumper to do all the ultrasonics did :)
yep, I think so too
Very interesting! Thank you so much
I get that as the car moves it can assume an object is still there, but what about if you've been parked and the objects around you have moved when you get back? E.g. garbage cans.
Yes, fellow viewers, it is in fact becoming more and more difficult to determine if the story is true or not. You see, fellow viewers, “the people who make cars sometimes have lapses in their own reasoning, regarding safety features, unfortunately”.
Saying what you are going to do, is skin in the game.
2023 Model 3's in Australia and NZ still have the ultrasonic sensors installed and working.
Great new format!
3:35 EXACTLY! Ultrasonics on my brand new Y were telling me the distance from a pole, NOT the curb, so I drove over the curb, scraping it.
Still no update for the removal of USS! I don't really care if the vehicle will have HW3 or HW4. I just want some form of USS to be installed back into these cars! I will not be an FSD adopter when I purchase an MSP, and I don't have faith that if I purchase an MSP now Tesla will offer a free or paid retrofit to return some kind of USS back into these vehicles that have already been delivered.
I think the flaw in the logic here is that people DON'T just do things like judge distance with their eyes. It may seem like we do, but our other senses play into how we judge things, too. There are very few times where multiple senses aren't used. For example, what you taste is influenced as much by your sense of smell and sight as it is by your actual taste buds. Soldiers have found their sense of hearing is affected by wearing a helmet, even when it doesn't cover their ears (sound reverberates off your skull, too). I think Tesla is going to find (the hard way) that multiple signals are really needed for the kind of fidelity they want to go for. And personally, I think that where safety features are concerned, more input signals are better than less.
But let's be clear, ultrasonics are going away to reduce cost and complexity. And it's happening now because they already made the decision to drop ultrasonic, and they can't change their mind and get the parts now ... suppliers have reallocated supply elsewhere. So now they're forced to go without and have the vision software catchup.
The problem is they have not perfected the vision camera system and are pulling out the ultrasonic sensors.
1:20
Laughs in fender benders the world over
What happens if someone decides to just lay in front of your car while the car is off?
I’m still waiting to get functionality back on my’21 sr+. But I remain optimistic 😎
What?! That's it? That's not enough time with James!
I don't need a sensor to tell me how close my front bumper air dam is to a curb. After ramming my 2000 BMW Z3's air dam over a 6+" parking curb ($900), I always stop short of curbs.
Doesn't a camera system need parallax to be very accurate (mm scale)?
Not necessarily. They get parallax by comparing subsequent frames when the car is moving, so any error is quickly corrected. I have a family member with only one eye, and he drives just fine.
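The frame-to-frame parallax idea mentioned here can be illustrated with the standard pinhole-stereo relation Z = f*B/d, where the baseline B comes from how far the car moved between frames and d is the pixel disparity of a tracked feature. This is a simplified sketch assuming a sideways baseline; real forward motion needs fuller structure-from-motion math, and the focal length and disparity values below are invented numbers:

```python
# Motion-stereo sketch: two frames taken as the car moves give a baseline,
# and the pixel shift (disparity) of a feature between the frames gives
# depth, much like two eyes would. Values are illustrative, not Tesla's.

FOCAL_LENGTH_PX = 1200.0  # assumed focal length expressed in pixels

def depth_from_motion(baseline_m: float, disparity_px: float) -> float:
    """Classic stereo relation Z = f * B / d (pinhole model, lateral baseline)."""
    return FOCAL_LENGTH_PX * baseline_m / disparity_px

# Car moved 0.10 m between frames; the tracked feature shifted 24 px:
z = depth_from_motion(baseline_m=0.10, disparity_px=24.0)
print(f"estimated depth: {z:.2f} m")
```

This also shows why the one-eyed-driver analogy in the reply holds up: motion supplies the second viewpoint, so accuracy degrades mainly when the car is stationary or the feature barely shifts between frames.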
What FSD is missing which would be the most enabling is networking with other vehicles and infrastructure, and even potentially pedestrians, which would give precise feedback and enable coordination. The network would necessarily be incomplete when first activated, but even occasional inputs would be helpful in identifying errors, and in some cases correcting them in real time. The data requiring integration and interpretation would be orders of magnitude lower in volume than for the vision-based portion of the system, and would normally be precise and reliable.
I feel like if Dave invited me over for a family dinner I would be just finishing up my meal and he would abruptly just say "alright, thanks." and stand up to usher me out the door haha.
Yeah the endings always roll in a little abrupt 😂