Hey, that was my silver Model 3 at 3:30... yes, RIP. It was totaled when we were rear-ended by the guy behind us, who must have forgotten his Nissan van didn't have any driver-assist features.
8:24 to 8:34 looks more like collision avoidance than "aggression" to me. While borderline blowing through a yellow doing 46 in a 35, a car pulls out in front and it's like "uh-oh, speed differential." Then the other car speeds up and you see the path planner go "oh, okay."
One of the side effects of using ML-based algorithms is that seemingly insignificant variations in conditions (of which there are plenty while driving) can result in different behavior. To a driver, that looks like inconsistency. It's why it's so important to get enough randomized training data to average out these variations. In other words, I think the inconsistency you're noticing is due to their stated lack of training on normal driving.
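A toy sketch of why tiny input variations can flip an ML model's behavior (purely illustrative; the weights, features, and threshold here are invented, not anything from Tesla's stack): a learned decision boundary has no buffer zone around it, so two nearly identical scenes can land on opposite sides of it.

```python
# Toy 2-feature "classifier": (gap to lead car in meters, closing speed
# in m/s) -> brake-or-cruise. All numbers are made up for illustration.
import math

W = (1.0, -2.0)  # hypothetical learned weights
B = 0.05         # hypothetical learned bias

def brake_score(gap, closing):
    # Logistic model: squashes a weighted sum into (0, 1).
    z = W[0] * gap + W[1] * closing + B
    return 1.0 / (1.0 + math.exp(-z))

def decide(gap, closing, threshold=0.5):
    # Below threshold -> the model "wants" to brake.
    return "brake" if brake_score(gap, closing) < threshold else "cruise"

# A 0.05 m/s difference in closing speed flips the decision:
print(decide(10.0, 5.05))  # -> "brake"
print(decide(10.0, 5.00))  # -> "cruise"
```

Near the boundary, inputs a human would call "the same situation" produce different behavior, which is exactly what reads as inconsistency from the driver's seat.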
More randomized normal-driving data is just going to shift the inconsistency to the edge cases. ML training has never been shown to work on models this large, large in the scope of the problem; ML models are always specific and limited in scope. No amount of data or training will fix the inconsistency/hallucinations, because they are inherent features of these fast but non-robust models.
Most of the driving people do is local. When we drive in unfamiliar areas we make loads of mistakes: driving in the wrong lane for a turn-off, missing turn-offs, pissing off locals... FSD has to drive like this all the time. I see loads of testers say things like: it's slowing down too aggressively, it's approaching the stop sign too slowly, it's stopping too far back, it's not pulling into the lane quickly enough to make the next turn, etc. This is probably all down to the tester knowing the local traffic behaviour. It will be a while before Tesla starts including local data for these drives. Until then, drivers will be critical. Passengers probably won't care, though, so we need to jump to the point where supervision is not needed and passengers don't notice if the car misses a turn or some other triviality.
Has Tesla said they will eventually add local data? Seems like a no brainer at this point? Why haven’t they done it yet? Explain like I’m 12 yrs old pls
@@SoccerScienceLab Probably because every car would have to get its own training-data module. Impractical at this point in the game. Maybe once the wide-release model is good enough, some of the Dojo compute power can be utilized for tailored applications.
As long as it doesn't crash without the driver and arrives at the destination, it's good enough. But that is not the case right now, so it's just a driver assist. Comfort is a nice-to-have, not a concern.
Passengers very much notice if other drivers are honking at or crashing into the car, so drastically uncommon acceleration is a problem. This is why NHTSA mandating a full stop at stop signs is probably the most dangerous thing they ever did.
@@SoccerScienceLab Specialized training for local areas is a last step. We're still in the general training phase, where the overall behavior is changing drastically. Trying to add finishing touches on an unstable foundation would be counter-productive - like localizing the language for each country on a website that is still in development, with the text constantly changing. Much less work to wait until it's stable.
What a great summary of how FSD is working today. I have 12.3.6 on my daily driver, and 12.4.3 on my other Tesla. AI DRIVR sums up my experience much better than I ever could. AI DRIVR's video overlay of FSD graphics with view out the windshield is by far my favorite format to watch FSD, and his commentary is absolutely top notch.
Thanks for another great vid. I think sometimes you are giving it undue credit for thinking. For example, at 4:41, it's simply not turning left because the light is green and it doesn't have right of way. I also don't think it understands about in-n-out waiters taking orders. It's just hesitant to move in case the pedestrian is going to walk into the road.
Someone internal at Tesla mentioned Tesla employees have been "crafting" FSD for Musk and "celebrities", so I am highly skeptical of some of these videos. There is a guy driving in Chicago with FSD and it is complete trash. The car does almost nothing by itself.
There is a big enough body of evidence to allay any fear of some sort of "fix" going on. There is a free month for every Tesla driver in America. If it really had problems, there would be a mountain of evidence...
ChatGPT hallucinations are not fixed by more data but by stacking different models: chain of thought, RAG, even calling Python code. Hallucination is a feature of these algorithms, not a bug to iron out; it's inherent to the math functions. More advanced models can fix it, but those models are slow, and FSD can't sit and wait for answers.
@@AudiTTQuattro2003 The computational resources to handle a more complex array of NN models, like ChatGPT uses. When FSD comes up to a Chuck Cook-style left turn, it needs to switch to a NN built for that type of turn only. You can't train those turns into one large NN and not expect it to mess up other types of turns. So the onboard system needs to be powerful enough to run many models in parallel; at least 10x the current system.
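The "specialist models" idea in this comment can be sketched as a dispatcher that routes each scenario to a model trained for just that scenario type. This is purely illustrative of the commenter's proposal, not Tesla's actual architecture; every name here is hypothetical, and real systems would use a learned classifier rather than a hand-labeled scene type.

```python
# Hypothetical mixture-of-specialists dispatcher. Each "model" is a
# stand-in for a separately trained network with a narrow scope.
def unprotected_left_model(scene):
    return "creep, wait for gap, then commit"

def protected_left_model(scene):
    return "proceed on green arrow"

SPECIALISTS = {
    "unprotected_left": unprotected_left_model,
    "protected_left": protected_left_model,
}

def plan(scene):
    # In a real system, a scenario classifier would pick the specialist;
    # here the scene type is supplied directly for simplicity.
    specialist = SPECIALISTS[scene["type"]]
    return specialist(scene)

print(plan({"type": "unprotected_left"}))  # -> "creep, wait for gap, then commit"
```

The trade-off the comment points at: specialists avoid one model's updates regressing another scenario, but every specialist kept resident multiplies the compute and memory the onboard hardware must supply.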
I think it’s also worth mentioning the recent revelation that Tesla’s been manually tracking and updating areas where influencers drive to skew what audiences are seeing
@@itsjustmeweiss Kay, Grace. "Tesla Prioritizes Musk's and Other 'VIP' Drivers' Data to Train Self-Driving Software." Business Insider, July 2024.
I've experienced extremely hard braking on v12.3.6 when approaching an intersection and the traffic signal changes to yellow. Extremely hard braking. I was lucky no one was behind me because I would have experienced a rear-end collision and/or a pissed off driver behind me. I've seen others report this as well. From a technical perspective, FSD may be correct in stopping however hard is necessary when approaching an intersection with a yellow light, but it should also take into account the g-force that would result if it decides to come to a stop instead of continuing on thru the intersection. I hope v12.5 fixes this issue.
1:40 This is one of the most annoying things my car does nowadays. I hate cutting in line, but it often waits until the last minute to make lane changes even though there were good times to make the change earlier. 2:20 The unusual lane changes also bother me. My 12.4.3 still does both of these.
Love your videos! Great that you’re highlighting the good and the bad including the multiple WTF scenarios, instead of just the maneuvers it handles well. I have 12.4.3 myself and have encountered many of the issues you indicated. Nicely done!
For the first time in 4 years of FSD, 12.3.6 read route signs on a night drive as speed limit signs, slowing me down to 30 mph in a 60 mph zone. It did this multiple times, forcing me to disengage FSD.
You should make a video about Tesla workers putting extra work into areas where FSD influencers like yourself live, to make FSD look better than its general performance actually is.
I’m on 12.3.6 and the other day it completely blew a normal left hand turn that other versions have never had trouble with. The car went WAY wide and was headed off of the road so I had to take control. Pretty freaky.
I've now been driving 12.4.3 a lot and you sum it up well. It's mostly OK but there are times that make you scratch your head and say "WTF... REALLY?" For me overall, 12.4.3 is worse than 12.3.6. 12.4.3 hesitates way too much turning from one road onto another and the auto speed control is TERRIBLE now. Most of the time it'll go way under the speed limit and you have to remind it to go faster with the accelerator and THEN it'll maintain the speed limit. Then in other sections, the speed limit goes from 60 to 45 and it'll just keep driving 65. Looking forward to 12.5 as this is 1 step forward, 2 steps back.
I expect we just got lucky; it almost certainly has no understanding that a person standing next to a car at a drive-through is taking an order and you should wait until they move on. But it is interesting. I wonder what the rationale was; maybe the pedestrian was just too close to the car for it to feel confident moving? 🤷‍♂️
Given the reports that Tesla specially trains FSD for influencers and people showing off FSD, this showcase is extremely biased and probably not representative of the model's actual performance. You would need to drive in a completely new area every time to get a real picture of its performance.
@@AIDRIVR Yes, this is definitely a thing, as we non-influencers are very aware. We do not see this level of full self-driving. You can see this in action in Chuck Cook's videos showing Tesla testing the unprotected left in his area.
One problem I've noticed is that the cars don't seem to recognize when they are out of calibration. Having driven 4 different Model Xs in the past month, the variation in calibration state has severe impacts on FSD performance. I realize I can remedy this, but robocars would not.
Which brings the point, who is responsible if the calibration is off in a robotaxi? The owner? Tesla? If it causes an accident, who pays for damages? Why can't the system self calibrate?
There are some scenarios and intersections where even school buses cross double yellows, particularly when there are two sets of double yellows with usable pavement in between that is needed to keep traffic flowing.
There is a "Do Not Block" sign and clear pavement markings at a fuel station I pass leaving my neighborhood. V12.4.3 was ready to totally ignore the signs and markings two different times, forcing me to hit the brake and disengage.
For seven years now the improvements in FSD have been huge, awesome, fantastic. Still, we are not done: there is not one driverless Tesla to date. Six years ago I bought a Model S with FSD (in Europe); I sold it before I could ever see FSD in action. I no longer drive a Tesla (I now drive an Xpeng; its FSD is free because, although it is just as powerful as Tesla's, they do not consider it complete). I do like to watch your videos, though, and keep up to date on the everlasting experiment. Being frustrated about the past 6 years, though, the powerful words you use about the progress... hurt. Suggestions for better words: tiny improvements, some improvements but nowhere near the end product, baby steps as usual. In other words, a little less positive would be fair, I think. Unless... you get paid by Elon; in that case, continue with the big words.
It isn't necessarily being paid, but the fact that the Tesla influencers might lose access to God (oops, I mean Elon), which would destroy their ability to produce "breaking" content.
Great summary of these last few weeks filled with MONTHS worth of FSD updates. Truly looking forward to 12.5, Robotaxi roadmap and FSD release in Europe. I suggest you could also do a robotaxi-ready timeline prediction based on your tons and tons of experience with the software. Anyways, always a pleasure to learn from the master!
Machine learning is just a black box: input goes in and output comes out, but we have no clue why it chose the output it did, and that's also why it's so inconsistent. Nothing smart is happening inside that silicon, and who knows why it does what it does.
I wouldn't want ChatGPT driving for me. Too many examples of it producing lists of words that end with an n, but actually don't. And then fixing the list incorrectly when that's pointed out.
The inconsistency seems logical. The system is tested in scenarios where sometimes users disengage (10:31) and sometimes they don't. Our reactions and/or its own results in navigating feed the net.
But we now know that Tesla specifically targets influencers with better self-driving AI training to give them a better experience than the average owner.
Just moved out of the neighborhood you always drive around in Berkeley / Kensington. So nice to have normal streets now lol those ones are ridiculous even for normal drivers
It drives OK but cannot drive properly in heavy traffic. Most importantly, it still doesn't know that a blinking green means a protected left turn, and it also makes turns on a straight arrow, which is illegal. It can't get more basic than that.
My Y running 12.4.3 just learned how to turn into the driveway and line up with the garage but as of yet, won't go in by itself. Maybe I can train it to pull into the garage.
FSD works well in city driving. Probably because all the testers drive these areas. They need to spend some time on rural roads where it consistently fails to handle basic traffic scenarios. An unprotected left turn on a 25-35 MPH road is not the same as a 55 MPH road. More than anything else, they need to work on consistency. I should be able to know how the car is going to react so I am prepared to react. 12.4, for me, is a big step backward from 12.3 in most ways.
I'm still on FSD 12.3.6 and I have been trying to use it for four days in the neighborhood, but yesterday I got out of my comfort zone and tried it on a main road, kind of highway-like. So when will I get 12.4 and 12.5?
This is probably the biggest issue FSD still has. I love it, but slowing down to 40 on a 75+ mph road because there was a sign for "Route 404" is just something that can't happen. That and stopping for the wrong red light, e.g. one on an overpass running parallel to the road the car is actually on, are the two things making me really nervous for the RoboTaxi. Both of these things have happened to me in my Model 3! It's only that last
If it's truly AI-based, then the thing to remember is that there's generally a certain level of noise/randomness built into all its decision making. It's why AI image generators will create a different image, and LLMs a different answer, when supplied with the same prompt. Humans do the same thing, of course, but we expect a computer to get it right when the answer should be obvious; yet we get confused, and AIs do too.
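The built-in randomness this comment describes can be sketched with temperature sampling, the mechanism LLMs and image generators commonly use: the next output is drawn from a probability distribution rather than picked deterministically, so the same "prompt" can yield different results. The token names and probabilities below are invented for illustration.

```python
# Toy temperature sampler over a next-token distribution.
import random

def sample_next_token(probs, temperature=1.0, rng=random):
    # Higher temperature flattens the distribution (more randomness);
    # lower temperature sharpens it toward the most likely token.
    scaled = {tok: p ** (1.0 / temperature) for tok, p in probs.items()}
    total = sum(scaled.values())
    r = rng.random() * total
    acc = 0.0
    for tok, p in scaled.items():
        acc += p
        if r <= acc:
            return tok
    return tok  # guard against floating-point rounding

# Same distribution, repeated sampling: the outcome varies run to run.
dist = {"yield": 0.6, "go": 0.3, "brake": 0.1}
print([sample_next_token(dist) for _ in range(5)])
```

At a very low temperature the sampler almost always returns the top choice, which is why chatbots get more repeatable (but less creative) as temperature drops; a driving system presumably cannot afford much of the high-temperature kind of variety.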
Earlier 12s were way better. I subscribed last month and it has gotten so bad I had to stop using it. Would just sway back and forth like it’s drunk on the road and wouldn’t pick a lane and constantly tried to get in the right lane that was ending.
Exactly. It may work flawlessly by then, or we might come to find out it can't, because of too much bounce between updates: new weights in the algorithm keep causing undesirable consequences elsewhere.
I think Tesla needs to give testers like you the ability to save both good and bad data, and then add a few labels yourself: group similar situations together (like being in the intersection on a green, or that intersection that failed on the first go but succeeded on the second) and label some as good and some as bad. That way they have the good data to counteract the bad data.
Not sure if it was a mic change or something else, but I just noticed your commentary style changed a lot the last couple of videos. It used to be a lot more relaxed, easy to listen to. Now it's way more pronounced but at the same time sounds more "yelling"-like and is tougher on the ears, at least for me.
As good as it is, would you want to insure a car with FSD and no human to supposedly take over in case it screws up? It has to be better than a sober, attentive human in every way, or it isn't ready.
The craziest part is 12.5 will probably release this year, and that's going to be another big shift as it moves FSD's highway stack from hand-coded to AI.
Anyone who has FSD here: it would be awesome if you could report your data to the FSD community tracker! It is the only way we can continue to follow the improvement trend as it gets better and better!
The fact that it has completely ignored solid street lines and dropped my tires off the edge of a mountain road means that going purely inference-only is a horrible decision. Some of those 50k lines need to come back in to consistently prevent that kind of behavior.
This video seems really interesting. I'm currently looking at buying one of these. It doesn't really seem worth the extra $12,000. Maybe in the future it will be, but right now it seems like you are clearly buying for the future.
I'm still on HW4 and 12.3.6, and it's pretty flawless. Zero safety interventions. It seems to have learned and improved. Could just be psychological, but I'm not sure.
Gotta stop you there, just two minutes in. To say, great intro, beautifully done, tho' I would expect no less from you. Can't wait for the rest, ok onwards...!
12.5 will hopefully be next-level, haha. With cross-car communication it could mean another step change; it looks like they are starting to work on an "FSD map".
I think Tesla is honing these releases to excel in areas traveled by reviewers. I can’t go half a mile from my house without an intervention. Indecision, accelerating through dips, aggressive acceleration, frequent lane changes (still there for me), inability to make decisions when no other cars around. It’s not good. Better. But not good.
At least when FSD does kill some beta tester or innocent bystander, the lawyers will have mountains of evidence to win civil suits. The old "the beta tester is responsible" argument will eventually wear thin.
It stops for a single older white guy in uniform holding a tablet walking up to the driver-side door, which is cool, but now I'm curious whether the following make a difference: neighborhood, age, gender, ethnicity, or the number of people approaching the driver's side carrying potentially dangerous items.