Autopilot vs Humanpilot: A Dark Reality
- Published Sep 28, 2024
- Sometimes truth can be... Uncomfortable.
Consider supporting this channel on Patreon: / aidrivr
Equipment used for filming:
GoPro Hero 10 Black (Visualizations): amzn.to/3kGze2I
GoPro Hero 11 Black (Interior): amzn.to/3H7Xsuq
GoPro Enduro Batteries: amzn.to/3wrzhSK
GoPro Dual Battery Charger: amzn.to/3XBxnLi
Fat Gecko Triple Mount (Exterior): amzn.to/3R8J0GQ
Fat Gecko Double Suction Mount (Interior): amzn.to/3HyW0Tj
Sony A6400 w/ 18-135mm Lens: amzn.to/3RauluT
Sigma 30mm F1.4 Lens: amzn.to/3HvOU1C
Insta360 One RS: www.insta360.c...
Insta360 X3 (free selfie stick!): www.insta360.c...
Insta360 X2 (free selfie stick!): www.insta360.c...
Equipment used for editing:
16” MacBook Pro M1: amzn.to/3j71X04
Magic Trackpad: amzn.to/403P8UV
Roost V3 Laptop Stand: amzn.to/3JgqBX4
Philips Fidelio X2HR: amzn.to/3wvuVd7
Dragonfly Cobalt: amzn.to/3kHLPCR
Shure SM7B Microphone: amzn.to/3wxz9kK
Gator Frameworks Mic Arm: amzn.to/3XGGE4E
Elgato Wave XLR Preamp: amzn.to/3XEZpW2
I earn from purchases you make via Amazon Associates & Insta360 affiliate links - thank you!
Professional level production here. 👍
I put my Model 3 in drive the other day when I meant to put it in reverse to back out of a parking space. The car didn't allow me to drive forward into a bad situation.
That is amazing! And provided by an OTA software update, which is awesome. Thank you for the kind words :]
Statistically, you currently save the Tesla much more often than it saves you.
Humans are still safer than FSD Beta by itself.
Human supervised FSD Beta seems to be safer than both.
@@LightAndShaddow5 Source?
@@TrueFerret I think what he’s referring to are disengagements and it’s true.
Watching some videos, you'll see a couple of disengagements per ride. How often during those rides did a near-accident happen in which Tesla Autopilot saved the situation? Close to zero.
Let's say 5 disengagements and 0 scenarios in which Autopilot had to take action in a 30-minute drive through a city.
So he's technically right: the driver just saved the Tesla 5 times, and the car didn't save him once.
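The commenter's back-of-the-envelope comparison can be sketched in a few lines (all figures are the comment's hypothetical numbers, not measured data):

```python
# Hypothetical figures from the comment above: in one 30-minute city drive,
# the human intervened 5 times and the car initiated 0 emergency saves.
human_saves_car = 5   # disengagements (driver corrected the system)
car_saves_human = 0   # emergency actions initiated by the car

drive_minutes = 30
interventions_per_hour = human_saves_car / (drive_minutes / 60)
saves_per_hour = car_saves_human / (drive_minutes / 60)

print(f"Human interventions per hour: {interventions_per_hour:.0f}")
print(f"Car-initiated saves per hour: {saves_per_hour:.0f}")
# With these illustrative numbers, the human "saves the Tesla" 10 times
# per hour, while the car saves the human 0 times.
```

As the next reply points out, the catch is that most disengagements happen well before a situation becomes unrecoverable, so the raw count overstates how often the human was truly needed.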
@@LightAndShaddow5 Maybe. It's difficult to quantify, for the simple reason that disengagements happen long before they're necessary. In other words, a self-driving car is disengaged long before the situation becomes a problem, simply because the human behind the wheel feels uncomfortable, not because the situation has become unrecoverable or unsafe. Tesla specifically chose testers for its Beta program who are more cautious than the average driver.
I have made serious mistakes on the road where I was saved by other drivers paying attention. I have a feeling that in 90% of the disengagements that looked serious, something similar would have happened.
Also to consider: If one person learns to be a better driver after a mistake (if they survive it), it's just that one driver. If a self-driving car makes a mistake and that mistake is sent to the software developers to investigate, *all* cars with that software will learn from it after an update.
This topic aside: if this weren't about driving and avoiding deaths, it would sound like an endorsement campaign a superintelligence would run on why it's okay to replace humans on Earth with robots. I just thought that was funny; btw, I totally agree with you.
Yes, you're right, but on the other hand humans are much faster at learning, so it's hard to say whether this is really an advantage for self-driving cars.
If a human makes a mistake, he learns from it and applies the lesson to various scenarios that look similar, whereas a self-driving car has to make that mistake x times across all similar scenarios in order not to repeat it.
Not to mention humans can also learn from the mistakes of others, e.g. by watching dashcam videos or hearing advice from other drivers. So they also benefit from a crowd-sourced database of ways to improve.
@@polosh100 Nvidia made an AI that learned to play Minecraft by watching tutorial YouTube videos, so maybe the standards agencies could make a central edge-case video database for AI to learn from.
It's almost like cars are an inherently dangerous and deadly form of transportation.
+1 to this. I wonder how many deaths per year are related to trams or trains
All this effort and tech for a fundamentally destructive and wasteful form of transportation. It's a bummer
It's almost like it would be best to invest in public transport and probably just abandon cars entirely.
@Almarca The CIA website has a World Factbook that records information from around the world and provides it publicly. If you wanna check, it's in there. From what I've seen, it's far, far less than cars.
Almost like a system where all the transport is organized to run on a set timetable, on rails away from roads to ensure safety, in train cars designed for human safety, with accommodations such as food, water, and backup generators in the rare case the train gets stranded for some reason. (Won't ever happen.) In the end, the train is safer. And vehicle deaths are on the rise, along with pedestrian deaths, due to a lack of government intervention and a lack of proper driving from the average driver who woke up late for work and is now speeding through a red light because they can't be more than 5 minutes late.
In the end, a society without cars, but with a great public transport system, would function better than one with them. If we had 2 bullet trains that ran all across the states, or 2 that connected southern California with northern, and every major city had a good train network to move people around, and those cities got rid of highways and roads that aren't really necessary, things would be safer.
You would get to work faster, or be able to visit family, for a smaller price than owning a car.
This would reduce the strain on the population and actually fund a city properly with minimal risk.
@@playstation8779 unfortunately this will never happen in the US. i think the only reasonable and likely way to accomplish a car-less system in the US is with an entirely new city. i'm almost certain that city would be in california, no other state cares enough. maybe when world war 3 comes and all the big US cities get nuked, we will rebuild them car-less.
You hit the nail on the head by calling out the "emotional response" as a lot of progress (not just regarding self-driving) is held up by the very same thing.
Last year I was driving between northern Norway and Sweden in my Model 3, just using traffic-aware cruise control.
During winter, we have no sunlight for about 1 month, so it's all night driving. This can really get to you. I didn't really perceive how sleepy I was becoming.
I suddenly sprung back to awareness when I realized the car was beeping the alarm and had corrected my course back onto the road, as I was about to drive straight off.
Gave me a chance to learn something about driving and rest without earning a horrible injury in the process.
My Volvo does that too. It also detects when you are driving in a sleepy driving pattern.
@@Xanthopteryx Volvo and others have "features". Tesla has the hardware and software to be L4. And they will achieve it in 1-2 years.
@@archigoel Yeah I doubt that.
@@archigoel well they've been saying that it will be next year for the past 7 years.
My car has these systems too. It doesn't let me do anything stupid if I fall asleep, stops me when I'm about to crash into something, and prevents collisions when someone is being stupid around me. It's saved me like 2 times, because people on the highway are morons. I still wouldn't use full self-driving; I like to pay attention to everything around me, and I don't like driving when I wouldn't be able to react immediately. I also enjoy driving, so I want to keep doing that. But those safety systems are awesome, and all cars should have them. It would prevent tons of crashes. I also almost never use cruise control, only when I need to drink some water, just so I don't spill it all over me while still trying to drive.
I so appreciate your effort at setting the record straight! Keep up the good work.
Set the record straight with regard to a human/AI cyborg car being safer than a human car, but not regarding a human car vs. an AI car. In the latter case, the human is safer, despite the video using human-supervised AI data interchangeably with AI-only data.
Great video and insight! Framing the accidents prevented on a macro level is something often missed by the masses.
I sometimes lose sight of how this isn't known to some people, or I guess even commonly understood. You probably have to deal with a great many comments that probably aren't so kind. Hope you're doing okay, and that people will continue to learn from you. Love the videos!
This is the most important video on this channel. More people need to know this
FSD and AP save lives TODAY. It drives me absolutely crazy seeing hit pieces against this amazing technology.
People aren't going to mention that it stopped in the middle of the lane because the person fell asleep, or had something happen that made them unable to take over after 3 loud warnings. Instead of drifting into the wall with other cars spinning out, wrecking way more wallets and people and blocking the whole road, it made a bad pile-up, yeah, but that's it. Cars were still moving around it and going on, not dying. It's just a fact: Tesla's AI is already better than humans.
Phantom braking is real. Tunnels are a common location. I'm always prepared when coming up on overpasses. The other drivers were responsible for hitting the car in front of them. It doesn't matter why the Tesla slowed unexpectedly.
@Scott Garee Phantom braking was fixed, and it doesn't brake down to 0 mph, just a sudden slow-down, which is FIXED
@@lavaphoenix753 It may not slow to 0, but it still occurs. I'm not trying to diagnose that particular incident.
Well spoken. I never understood why computers need to be 100x better than a human to be considered safe. Even if the system is just twice as good, that's half as many dead people.
It’s literally immoral to NOT use it at that point!
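The "twice as good means half the deaths" point above can be made concrete with a toy calculation (the ~40,000 annual US road deaths figure is cited elsewhere in this thread; the safety multiples are hypothetical):

```python
# Toy illustration of the comment above: if a system is N times safer
# than human drivers, expected fatalities scale down by 1/N.
annual_human_deaths = 40_000  # approximate US road deaths, cited in this thread

for safety_multiple in (2, 10, 100):
    expected_deaths = annual_human_deaths / safety_multiple
    print(f"{safety_multiple:>3}x safer -> ~{expected_deaths:,.0f} deaths/year")
# Even a modest 2x improvement would mean ~20,000 fewer deaths per year.
# That is the commenter's point: demanding 100x safety before deployment
# has a cost measured in lives.
```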
It’s always the most reckless drivers complaining about how “dangerous” autopilot is
I'm not a reckless driver and your comment doesn't make sense.
@@pruthvinedunuri2983 When did he say you're a reckless driver?
I’m a reckless driver and I love autopilot, even tho I don’t have any ability to use it 💀
@@Twistiry what I mean is I'm not a reckless driver and still believe autopilot is dangerous.
@@pruthvinedunuri2983 I do agree autopilot is still dangerous but it has learned within the past few years
I'm someone who loves driving. Especially old cars. But even a car enthusiast like myself appreciates the value and opportunity to save lives that comes with the expansion of self driving technology. Sure it isn't perfect yet, but it is way better that most people behind the wheel in most scenarios.
I would seriously like to see some form of self driving only highway lane in high congestion areas, much like how many HOV lanes are currently set up. It is unrealistic to expect a widescale mass transition to self driving vehicles over night, but self driving only lanes are something that could be made reality right now.
I have never been in a self-driving or semi-autonomous vehicle, but I have owned plenty with adaptive cruise control. It is amazing how upset people get when the car leaves a safe following distance behind the car in front, especially in very heavy traffic. We truly do need to change our mindset behind the wheel and realize not every car on the road today has an aggressive human at the controls.
Seriously! I always have to change my AutoPilot follow distance in traffic just so people don’t get mad at me for being safe 😂
(Though even its closest follow distance is still safe, especially with instant reaction time. Some people still get upset behind me lol)
Thank you for making this, the problem with many humans is they’re not willing to understand. I hope this video gets seen by lots of people so there can be less hate for an obviously good thing.
Also love the content btw, really appreciate what you do.
7:16 perfect reaction from the motorcycle cop
Perfect summary. Thanks for putting this important piece out there.
Props to you for trying to educate people who do not question their beliefs. It might not change their mind this time but like you said in your intro, it might if we can restore the balance of coverage to real life events.
Very well put together video. Hit the nail right on the head.
All points well taken. As a Tesla owner without FSD or Enhanced Autopilot, I'm happy to have a performance car that allows me to drive in my usual defensive mode but is able to help me avoid erratic drivers. It will make sense for everyone to use FSD or equivalent software to drive, but we are not there yet. I greatly appreciate all you FSD Beta testers for doing the hard work needed to achieve fully autonomous driving.
In its current state, would you really trust autopilot by itself, over a human driver by itself?
Autopilot needs human interventions; without human supervision, it's still way below the human level of safety.
FWIW, data point of 1; in a year I’ve put about 30k highway miles on AP driving between SF, LA, and LV (traveling nurse) and literally never had a problem. It must have saved my butt a hundred times. However FSD and especially navigate on autopilot are an absolute cluster f*ck mess of garbage programming. It’s bad. It’s really really bad. I know the software is not single-stack, but since enabling FSD I’ve noticed AP has started misbehaving, making mistakes and acting unusual where it didn’t used to. Again, I’m just 1 person, but that’s been my experience.
@@youtubesucks8024 I totally agree. I’ve had FSDb for 3 months now. It simply cannot handle city driving without endangering me and those around me. As I say, FSDb is about as good as a teenager with 5 hours of driving experience.
Idea for a future video: more examples of Tesla ADAS features preventing accidents. I'm very interested in understanding what the capacity of the various levels of software (e.g. Autopilot, Enhanced AP, FSD, and FSD Beta) to actively prevent collisions and other accidents. I'm having a very difficult time understanding this based off watching Teslacam footage.
Aside from the things you mentioned, Tesla has another really cool feature coming up (in regards to preventing accidents)
I’m not sure exactly what they will call it, but it’s like that unintended acceleration prevention example on steroids
Right now (while under manual driving), Teslas can prevent a limited set of accidents from happening. But they are currently working on a feature that will automatically avoid all accidents no matter what you do (unless you somehow put it in a situation where all options are bad, or someone were to instantly jump out in front of you, etc.)
There was a demo of this at Tesla AI Day 2022, I believe. Basically, you could literally floor the accelerator and let go of the steering wheel, and it would still find a safe path that avoids all obstacles (this is completely separate from FSD, so it would not be actively following your intended route or otherwise take over normal driving, just preventing an accident until you were safe). Of course, it would not only steer, but also make acceleration/deceleration decisions (so the fact that you happen to be flooring it would just be ignored)
@@Muhahahahaz I'll have to rewatch AI Day 2022.
tesla autopilot is a god send. i get easily fatigued driving even on highways and in case i lose focus tesla beeps at me hard enough to jolt me awake
Maybe your single most important video ever. Thanks a lot.
Excellent video! If only I had a following, I'd pass this on to everyone. Thank you.
💯% on point. I had a lady pull out in front of me after that accident. It's the other drivers behaving badly, not the Tesla, that you have to worry about. I wish there were some kind of OBD2-port communication device that could talk to other cars to help prevent accidents. In my case, it would have told her car to stop before she pulled out in front of me making a left onto a highway.
I'm still in awe of the 40-pedal-misapplications-per-day statistic from Tesla. You'd think someone who buys a $50k+ vehicle would be able to differentiate between only TWO pedals. That means if a car had only one pedal, there would be 40 people a day pressing their floor mat wondering why the car isn't going. Insane.
They can. There isn't a similar number of crashes like this happening with ICE vehicles, because it ISN'T HUMAN ERROR IN MOST CASES…
It's a glitchy car flooring you into danger. Period.
@@wyattnoise there are crashes for this constantly in all sorts of vehicles. Stop spewing unbacked bs
The moral of the story seems to be that Tesla drivers are worse than regular drivers and they need software aids to shore up their lack of ability.
FSD saved me from a head-on with a large construction truck that came over the centerline yesterday morning.
Great video and spot on! People should be more aware of this before making judgments on FSD, just because media like negative news on Tesla because of the clicks it generates
Pedal misapplication. I'd never heard of anything like that before. It's something so trivial, I thought it almost never occurs, but some drivers....
I love this video... It really puts things in perspective. 👍
Im taking a robot intelligence course right now, so this channel is particularly interesting 😁
What the fuck why does this have only 32k views in a year?! This is one of the most grounding videos about EVs and SD out there!
Thanks again for pointing out the reality of FSD beta v. Human drivers. One has to wonder how some of these “meat sacks” make it through every day. Beta continues to improve while, it seems, the humans are getting worse and worse.
Great video. Captures the insanity that is human drivers!
💯 true! This is very promising tech. One day autonomous driving will surpass human drivers.
The pileup accident looks like Autopilot coming to a stop because the driver didn't touch the steering in a while. The driver is still responsible for taking over during FSD Beta, the *Beta* terms and conditions clearly state that. I bet the statistics for FSD crashes will be far lower once it's actually out of Beta.
We are quickly approaching the point where all input is error :)
That seems unlikely. Autopilot gets pretty nasty before it starts braking, but even then I don't think it slams the brakes, and should instead stop fairly gracefully.
This phantom braking is a real thing, which I have encountered myself, and I do have to take over sometimes. I don't recall it ever stopping that suddenly, but it does have to in real emergency situations so if it is mistaken about an emergency, then it almost certainly can do what was shown in the video. It is also vaguely plausible that the car beeped for the driver to take over and the driver slammed the brakes in response, but that should have shown up in the telemetry data.
How the hell do you press the wrong pedal down? Great video as always!
It was a major problem for Toyota a decade or so ago as well. I've had isolated instances where my foot was at an odd angle and the accelerator was pressed when I applied the brake (e.g. side of my foot was off the side of the brake).
Great video. We have to get messages like this out there as much as possible. I wonder why there is more pedal misapplication in a Tesla? Is it because Autopilot and regen braking result in drivers using the pedals less and getting rusty? That could be a concern. Which brings up another thought: once we get to Level 4, people will drive manually a lot less but may still need or want to take over; if they've become rusty, this could create a more dangerous environment as well.
I think pedal misapplication is just as common in other cars. But since they don't accelerate as fast, the accidents are less spectacular. Add to this the laser focus on Tesla from click-bait media.
That's a huge problem, and it's why some developers want to skip some levels: the driver's concentration just isn't there anymore. There are examples of people who didn't watch the road, or were even doing a lot of other stuff.
Tesla has no interest in Level 3 or 4, which is probably one big reason that the public seems to constantly misunderstand FSD. Once they got their basic (“Level 2”) AutoPilot working, they had a good starting point to work with, towards developing full autonomy. It would only be a waste of time for them to develop the extra features that a partial-autonomy Level3/4 system would need, especially when it will be obsolete so quickly
(Indeed, Tesla is about to release FSD Beta 11 in the coming weeks, which is a single-stack solution for all driving situations, including the highway. In other words, their AutoPilot software will become obsolete and no longer used on any Tesla that has the FSD computer)
Had that Tesla stopped in traffic because the driver in front left a child seat on the roof and it fell, it would have been a much different story.
Humans suck at driving. FSD will/ has saved lives more than seatbelts.
The ironic thing here is, if all the cars around the Tesla were using FSD, the pile-up would most likely not have happened.
If you can't drive a car, don't drive.
Great episode. We all needed that.
I once accidentally tapped the wrong pedal when I was parking my friend's Model 3. I immediately switched to the brake pedal and nothing bad happened. I don't understand how one could "accidentally" keep pushing the wrong pedal down. They must have been on drugs.
THIS!
I fail to understand how you can press the wrong pedal and just keep pressing it expecting the car to do something different.
Yeah....
Other people on the road is why I don't care for driving all that much.
I do as much as possible to keep myself out of their way.
Autopilot isn't capable of driving by itself.
100% spot on. I couldn’t agree more. I don’t know why the Tesla braked there, and maybe it was an autopilot fault. However, a human car driver can brake hard for a multitude of reasons, most of which will not be obvious to other drivers. In all of the countries I have driven in, it is always the driver’s responsibility to maintain a safe distance from the car ahead and the excuse that “He just braked for no reason” doesn’t cut it with the traffic police or your insurance company. Watching that clip of the Tesla braking and then any number of morons ploughing in behind him just shows why autonomy can’t come fast enough. Imagine if all of those cars following had been Teslas with autopilot engaged, there would not have been a single collision, guaranteed. Thanks for highlighting the inadequacies of human drivers. As a career military pilot, flight safety was our top priority, same in the commercial flying sector. Over the years, flying has become safer and safer even though it is impossible to completely eradicate human error. We would spend much time preparing and learning from others’ mistakes. The same is not true on the roads. There are more distractions than ever, smart phones being the number one, and the only way to make driving safer is to accelerate automation.
You are such a nice guy! I love watching your videos.
This "pedal misapplication" problem really raises the question: do we need cars on public roads that can do 0-60 mph in 3 seconds?
Great video; right message! Thank you!
Excellent video
8:16 especially when it is a Polestar 2. That news article went from "Tesla" to "Car" really quick :P.
Every Tesla tuber should make several videos like this one and publish them over a 2 to 3 week period - flooding the channels with the real Tesla safety info. That should do a lot to counter the misinformation.
Yeah, my dad accidentally hit the accelerator instead of the brake in a Tesla while parking 😅
100 thumbs up AI Driver!! One of your most reasoned videos ever. All you have to do is watch the Wham Bam Tesla Cam channel to see how bad human drivers are.
Yeah, autopilot is better than human driver, but what's even better? Not putting everyone into deadly 2 ton metal boxes going at fast speeds.
Could not agree more.
There is a special place in hell for the NHTSA employees who get to see these statistics every day, showing how many innocent people die on the roads in crashes caused by human drivers compared to driver-assisted cars, but still choose to ignore them because of conflicts of interest, or probably even direct financial gain from competitors without these safety features. Just like the tobacco, pharma, and alcohol companies used to do before medical studies exposed them, and it became common knowledge how they manipulated the population using media that profited greatly off people's health.
Being trusted with the safety of all and choosing to sell the lives entrusted to them for personal gain: true scum of humanity.
The numbers are rather amazing….. good job
I wish more people were able to think logically like this 😇
The 8 car pile up at the start, what was the driver doing? Anyone that was awake and had a few brain cells would take over and not let FSD/AP come to a complete stop on a highway like that.
6:15 OK, that is nifty.
I've literally never heard of pedal misapplication before. How does that even happen? Do the same people accidentally shove their fork in the eye, and when they realize it's in their eye not their mouth, they shove it deeper, hoping it will end up in their mouth?
You may not have heard of it, but you've certainly read about many accidents caused by it. The most common one is when a car suddenly smashes through a storefront.
I really want to recommend the YouTube channel "notjustbikes" to all of you guys! He fundamentally changed my opinion on cars, and he shows how deep the rabbit hole actually goes! It really is like taking the red pill from The Matrix: once you have seen it, there is no coming back from it…
There was a media report last week that a Tesla crashed into a house. They posted images of it, and it turned out it was a polestar 2. The media outlet then changed the title from "Tesla crashed into house" into "Car crashed into house" and removed the image.
It's kinda like a city would benefit from a public transport system without any roads in the mix. Fewer roads, fewer cars, more trains, and more room for shops. More shops means more work; more work means more income for the average person; more income means more money and less stress. Less stress because you don't need to own a car: you can walk to a station, hop on the train, and be at work when the train arrives on time.
No worries, you can even grab a bite to eat at the station. Maybe you meet a friend and chat on the train. Who knows. In the end, why confine yourself to a car and get stuck on a highway that is backed up for miles and miles without end? Waste money on payments, waste money on insurance, waste money on repairs and service, all so I can get stuck in traffic and spend 50 bucks to fill up my tank halfway. XD Man, the more I think about it, the more I want to move somewhere that has transport figured out.
Also, it takes up way less space than roads.
More space for humans, less space for cars.
Heat, snow, rain, limited mobility. No thanks. Cars are ubiquitous because humans like them. They weren't forced on us. What gets forced on us are buses and trains: they have to take someone else's money to pay for them so that others will ride them. In dense metro areas they find some success, but they are in no way universal. New York City still has plenty of cars. Please remember your Utopia might be someone else's Purgatory. If you find you have to force your Utopia on people who don't want it, then you are the problem.
Always remember: with humans, you can't fix stupid, but AI mistakes can be fixed.
Excellent video. Thank you
Wait... why are people holding their accelerator pedals down?
To go faster.
@@scottgaree7667 yea but getting in a crash makes you slow down, which is the opposite of go faster. You feel 🤔
@@kylewatson6890 It does kill the buzz, doesn't it? 🤐
@@scottgaree7667 I would imagine, don't know by experience though. I've never crashed when I'm driving like a maniac 😈
All one has to do is to take a drive between 4 and 7 pm in Los Angeles to understand why humans aren't good at this
Thanks for making a video about this.
My issues with autopilot or FSD, isn’t the safety of the system; it’s the absolutely insane number of dumb sh*t unforced errors of the software. I know to be aware during an on ramp or off ramp exit, lane changes, turns, unprotected intersections, etc… but when the car randomly hard brakes at the apex of a turn or on a flat straight away, randomly resetting the speed limit on the highway, brake checking semis etc. is infuriating and keeps me in a constant elevated state of stress. I know I know beta I know. That’s not an excuse and it really shouldn’t be allowed on the road until it can perform 99% of basic driving sh*t reliably and consistently.
People will always find a way to hate on a Tesla just because it's a Tesla. Love the content!
I am a retired police officer accident reconstructionist. I have looked at the numbers and it is extraordinary how much safer Tesla vehicles are than others by any metric. I do not yet own a Tesla, but every time I have countered the hate mob, I am accused of owning one and exhibiting a prejudice. People mention the 1 incident every news channel talks about, yet ignore the 40,000 deaths and countless other incidents. Too many people defend their position rather than look at evidence. It is another of the human factors that lead to traffic collisions to begin with. I don't use the word "accident" because they often are not - they're a collision due to intentional acts by the operator, which are eliminated in a Tesla FSD. FSD will not show off, care if you are late to work, think it can beat a red light, try to race another vehicle etc. which accounts for a lot of other collisions.
Couldn't agree more. Accident reconstructionist seems like it would be a very interesting job! Thank you for your service
@@AIDRIVR The math and science were interesting, and the specialty has lots of retirement jobs. The downside is that most of the time there is a death, and the children who died because parents didn't take the time to buckle them up still cause nightmares. Nothing is more heart-wrenching than a tiny lifeless body, or telling the parents of a teen who died rushing to a movie theater that their 17-year-old is dead after we flew them to a trauma center. If you can handle that, the practical application of physics and math to derive a speed from yaw marks or other tire marks is quite cool. We then have to do scale drawings for the inevitable lawsuits that follow between involved parties. A couple of colleges, such as Northwestern University, have full programs, but usually it is taught regionally by police academies.
Is that using data from Teslas crashing on Autopilot, or FSD, or both? And if everyone had FSD Beta, there would be 0 crashes ever.
Some crashes are caused by externalities. Nature and humans will continue to invent things the AI can't cover.
THANK YOU! This should be required viewing for EVERYONE! Of course, there would still be a lot of deniers, but at least some people would get the message.
And the NHTSA has the audacity to hold back this safety technology from being released, knowing what terrible drivers humans are.
Great Video! Numbers do not lie. 👌
I mean yes the AI would be overall safer but imagine the feeling when your car crashes and you just can't do anything about it.
That must be such a strange experience.
Do you think it would be different than sitting in the passenger seat when a human is driving?
the 981 clip 💀
Fantastic video!
The difference is that people are in control of their own fate, as it stands. However, when a computer is in control of all of the driving... no number of deaths would be acceptable. Granted, as long as humans drive alongside computer drivers, there will always be unpredictability. But when it's only CPU drivers, it will be a closed loop system... and there shouldn't be any accidents that aren't attributable to acts of god, at that point.
5:15 hmmm, backward speed not capped, reverse long drive glitch found
My thoughts exactly YES!!!!;) 💋💥❤️
I agree with a lot that you say, but I think your robo-taxi comment was a bit misguided. There already is a cheaper alternative to driving for people who don't really want to drive but are forced to, it's just super awful at this point in time; we need public transit to improve so we can remove drivers and cars from the road in general.
When a human crashes and survives they hopefully learn from their mistake and won't repeat it. When a neural network operated vehicle crashes, the entire fleet of vehicles learns how to act in that scenario going forward.
Thank you for the unique video
More video
A Tesla driver's approach to reading statistics...
1. More crashes in total includes the crashes from AI cars.
2. If you want to compare the number of crashes per year between AI and manual drivers, you have to normalize by the number of cars on the road (AI and manual counted separately).
3. Human error occurs more often in Teslas because drivers think the car drives itself. Tesla itself says otherwise. Drivers like you and me know this, but we are not the majority.
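The normalization in point 2 can be sketched with a quick calculation. All numbers below are made-up placeholders for illustration, not real crash statistics:

```python
# Sketch: raw crash counts are meaningless without normalizing by fleet size
# (or, better still, by miles driven). The figures here are invented examples.

def crashes_per_100k_vehicles(crashes: int, vehicles_on_road: int) -> float:
    """Crash rate normalized per 100,000 vehicles on the road."""
    return crashes / vehicles_on_road * 100_000

# Hypothetical numbers, chosen only to show the comparison:
ai_rate = crashes_per_100k_vehicles(crashes=50, vehicles_on_road=400_000)
human_rate = crashes_per_100k_vehicles(crashes=35_000, vehicles_on_road=250_000_000)

print(f"AI-assisted:  {ai_rate:.1f} crashes per 100k vehicles")
print(f"Human-driven: {human_rate:.1f} crashes per 100k vehicles")
```

The point is that a small AI fleet can produce far fewer total crashes yet still have a comparable (or higher) per-vehicle rate, so the raw totals alone settle nothing.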
I have no issues with AI drivers. The only thing I want is a mechanical override switch for the car's computers. Something that just shuts off everything and requires an easy manual operation to restore normal operation. In an electric car, that requires:
1. A mechanical shut-off switch that:
   - shuts off the motor,
   - shuts off the steering motor (the one responsible for power steering),
   - shuts off brake pedal overrides (including ABS),
   - shuts off the battery if need be.
2. A mechanical brake override that:
   - applies the brakes at a "high" level,
   - is activated mechanically,
   - note: this can be a mechanical parking brake lever.
Those are my only requirements.
Same as trains, whose brakes are active by default and require force to be actively applied to release them. I want cars where, by default, everything is off and the driver can apply mechanical braking if needed.
Why? None of this is necessary for safety. Why would you ever disable ABS? That's illegal in many countries.
Braking is mechanical. You have a brake booster, but it works even if that system dies.
@@logitech4873 ABS can render the brake pedal useless. That's why I'd want to disable it if really needed.
The brake booster can be disabled using the mechanical parts of the ABS.
There are a few videos online (maybe the YouTube ones are still available) showing ABS being overridden so the driver is unable to brake (done by hacking into the car's CAN bus).
Ever heard of 600 hp not being "streetable"? I think most of these guys aren't able to drive a sports car…
a price to pay for innovation
I don't know how you were able to make this video without being furious. Just the fact that people are so incredibly dumb, and then are too arrogant to admit they're wrong, it just infuriates me. And then imagine: these people have the right to vote... It's absolutely insane how amazing people think they are, especially compared to how absolutely awful they are.
To be fair, if there were a choice, I wouldn't consent to be on the road with most people who think they're good drivers... let alone those who know they're not. Having said that, driving and roads use in general are statistically much safer in my country. The shift of being able to work from home (for a significant percentage of jobs now, and hopefully going forward) some people can exercise that choice. In a similar vein, here's hoping that one day the US invests in public transport (and more human-centric city design) and might stand a greater chance of more people at least having the option to not drive/own a car if they don't want to.
"Breaking News: Orcs are raiding a small village. Will the fat, lazy heroes finally intervene and save the innocent Villagers?"
"Breaking News: Homicidal Heroes murder innocent Orcs visiting a local Village."
Welcome to Tesla.
How do people put all their weight onto the pedal?
They think it's the brake pedal
Love the video, very well explained and shown what's real.
I shared this with some friends who still hate Tesla hardcore and say FSD is a joke.
The tunnel accident is just the latest BS that is being spread about FSD.
It could have been a medical emergency, with the car slowing in the lane, that caused the pile-up; no big deal.
You know, because a Tesla was involved, it's instantly at fault and must have been using FSD!
No way the 6 or 7 other cars are at fault, especially the closely following truck that pulled the bait and switch.
I don't wanna keep bringing this up in the comments because I really love AI DRIVR, probably more, but Dirty Tesla went in depth on that tunnel accident. It would be funny to see your friends' reaction to that video as well… lol
I think the tunnel accident is an anecdote that showed that without competent human supervision, autopilot is still much less safe than humans.
@@LightAndShaddow5 but still safer than the 7-8 drivers that ran into the Tesla because they were tailgating and not paying attention to the road.
While pedal misapplications are definitely under the category of human error, there is probably a design reason that leads to it happening more often in Teslas, compared with other vehicles. If I were in charge, I would be looking into why this is happening, even if it's not autopilot's fault.
The design reason is that Teslas accelerate really damn fast, and don't give people as much time to react.
Tesla doesn't advertise with these news organizations. Of course they're going to bash Teslas.
5:10
The sheer level of stupidity one has to possess to hit the wrong pedal. Have Autopilot save you, then shift gears and floor the wrong pedal again. That person should not have a license.
Robotaxis won't happen with the current hardware in Tesla cars... it isn't capable of being fully autonomous!