- cross-posted to:
- [email protected]
cross-posted from: https://lemmy.bestiver.se/post/443781
That’s pretty good, they’re fairly small targets.
Great, they’ve invented Christine.
It’s fine, they’ll fix these issues in time for the robotaxi rollout ten years from now.
What’s that? They’re planning on launching the robotaxis at the end of this month? Well then.
I’m reading the comments about this video, and I think people are missing the point.
It’s not about the Tesla running into the kid. It’s about the Tesla completely ignoring the FLASHING FUCKING STOP SIGN at the side of the bus, which resulted in it hitting the kid dummy.
This could have been a pedestrian crossing, railroad stop, intersection, etc.
These vehicles aren’t “smart” and should not be allowed on the road. Any idiot can have greater awareness than a Tesla.
No ‘smart’ device is smart.
you wouldn’t say that to his face, would you? 🥺
With a hammer to the camera
Yeah, it might kill the kid, it might not.
I’m still gonna stick to my Ford F50000 Fleshreaper (BLOOD FOR THE GAR GOD!™), driven by a good old-fashioned human, to get the job done.
Besides, it avoids the whole mess of theological issues about who gets Moloch’s love.
Oh, now I get it. Didn’t know it’s not allowed to pass the bus even when it’s on the other side of the street. In our country we teach the kids not to run across the street when they get out of the bus.
Kids will do stupid things sometimes, no avoiding that. In Germany you can pass a stopped bus on the other side of the road, but if it has its hazards on, you can’t go faster than walking speed.
Still safer than a human driver tbh
I have killed 0 children in 25 years.
That’s shrimply not true. The numbers Tesla releases are heavily cooked.
Had a quick look around, but I didn’t manage to find any numbers that weren’t either using Tesla’s numbers or guessing.
But it’s pretty well known that FSD sucks (I’ve been in a car using it … terrifying af) and that it’ll turn itself off before an accident to pass accountability to the driver.
I love how this keeps getting repeated by everyone everywhere
> it’ll turn itself off before an accident to pass accountability to the driver.
But both Tesla (5 seconds) and the NHTSA (30 seconds) count any incident where an L2 system was on shortly before the accident as having happened with the system active. So no, they do not use it for that purpose.
You know that video going around a few weeks ago where some dude with FSD on darted across the road into a tree? Well, he got the car’s data, and it turns out FSD was disabled due to enough torque on the wheel, which is one of the ways you disable it. He probably nudged the wheel too hard by mistake, or there was a mechanical failure that disabled it. But in the report he got from Tesla, the accident still counted as FSD ON, even though it was OFF at the moment he started going out of his lane.
So please just stop it with that nonsense.
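For anyone curious what that counting rule actually amounts to, here’s a rough illustrative sketch in Python. The window lengths are from the public reporting rules mentioned above; everything else, including the function name, is made up for illustration and is nobody’s real code:

```python
# Illustrative sketch only -- not Tesla's or NHTSA's real code. Both count a
# crash as "system on" if the L2 system was active at any point within a
# fixed window before impact.

TESLA_WINDOW_S = 5    # window Tesla uses in its self-reported safety stats
NHTSA_WINDOW_S = 30   # window NHTSA uses in its crash reporting

def attributed_to_system(disengage_time_s, crash_time_s, window_s):
    """True if the system counts as active for this crash.

    disengage_time_s: when FSD/Autopilot turned off (None = still on at impact).
    crash_time_s: time of impact, on the same clock.
    """
    if disengage_time_s is None:
        return True  # system never disengaged before the crash
    return (crash_time_s - disengage_time_s) <= window_s

# The tree-crash example: FSD disengaged (wheel torque) ~2 s before impact,
# so it still counts as ON under both windows.
print(attributed_to_system(10.0, 12.0, TESLA_WINDOW_S))  # True
print(attributed_to_system(10.0, 12.0, NHTSA_WINDOW_S))  # True
```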
what about this one? https://youtu.be/V2u3dcH2VGM
I don’t know anything about self-driving, but I can’t imagine why it would turn off right before a crash instead of keeping the brakes held.
(Also, I know the driver is a total idiot and it’s 100% their fault; I just want to know why it turned off.)
My guess is that Automatic Emergency Braking (AEB) is a separate feature from Autopilot.
AEB is something that is always on and watching even if AP/FSD is off, but it’s only intended (originally, anyway) to kick in when a crash is imminent, to reduce impact forces rather than prevent the crash. So my guess is AP fails to detect the cop car because radar couldn’t see it, vision finally sees it and tries to brake but it’s too late, then AEB detects a crash is inevitable and kicks in, which turns AP off.
It’d actually be nice to get a straight answer from Tesla as to why it happens this way. This would have been reported as ON since it was within 5 seconds, and obviously the NHTSA was investigating it as ON.
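If that guess is right, the layering would look something like this sketch. All class and function names here are hypothetical, just to show the idea of an always-on AEB watchdog that overrides and disengages the autopilot layer; this is not Tesla’s actual architecture:

```python
# Hypothetical sketch of the layering guessed at above. Names are invented
# for illustration; this is not Tesla's actual architecture.

class Autopilot:
    def __init__(self):
        self.engaged = True

    def control(self, scene):
        # Normal lane-keeping/planning; may miss an obstacle the sensors
        # can't resolve in time.
        return {"brake": 0.0, "steer": scene.get("lane_correction", 0.0)}

class AEB:
    """Always-on watchdog, independent of whether Autopilot is engaged."""

    def collision_imminent(self, scene):
        # Fires only at the last moment; meant to reduce impact speed,
        # not necessarily to avoid the crash entirely.
        return scene.get("time_to_collision_s", float("inf")) < 1.0

def drive_step(ap, aeb, scene):
    if aeb.collision_imminent(scene):
        ap.engaged = False  # AEB takes over and AP drops out, so the log
        return {"brake": 1.0, "steer": 0.0}  # shows AP off at impact
    if ap.engaged:
        return ap.control(scene)
    return None  # driver in full control

ap, aeb = Autopilot(), AEB()
print(drive_step(ap, aeb, {"time_to_collision_s": 0.4}))  # full braking
print(ap.engaged)  # False -- AP disengaged moments before the crash
```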
I imagine the driver was startled and pressed the brake or turned the steering wheel, either of which will cancel FSD
I may be buying the foolishness of the masses, but your anecdotes are only as good as mine.
Which part don’t you like?
I could source the crash report and a video explaining what likely happened, but if you simply don’t believe that Tesla truly stands by the 5-second rule in their self-reported data even with the crash report, then that’s another matter entirely.
Despite doors blowing off, Boeing planes are safer than human drivers tbh. You’d think tech fans would understand the importance of logic in computers. Red means stop.
That’s fine because those were non-fascist kids
If I worked at Tesla, I would very much be doing a crappy job and slipping bad ideas into what looks like good code. The Lord’s work.
How would you know where to put it among all the other shitty code?
> I would very much be doing a crappy job and slipping bad ideas into what looks like good code.
No need to: Tesla makes extensive use of the H1B program. You’d be more likely to accidentally make the code better.
Because, as we know, US developers are so much better than anyone else around the world. Which is why they gave us excellent software like Windows 11, MS Teams, Google Hangouts, and so on.
Take your xenophobia and stick it where the sun doesn’t shine.
Hell yeah
Didn’t I just read this like a few weeks ago? But there’s a Jun 15 date in the article. So did this happen again?
It’s the same story making the rounds.
Edit: They also did it in Austin and somewhere else, so same situation in 2 different spots, generating like 4-5x the stories as each one gets repeated in the news cycle
I’ll give you my uneducated findings: self driving cars are not ready.
I doubt they will ever be truly ready; they’ll eventually just be considered “ready enough.” No software will ever work without flaws, and when that software controls a car, a minor flaw might mean 20 deaths.
40,000 deaths from traffic accidents per year (in the US). Only 20 deaths would be a major improvement. Obviously “cars” is a highly irrational topic of discussion, though.
And it’s not just the victims whose lives could be spared; it’s also the mental toll on those who kill people by accident. Being able to blame it on a flaw in the software, one that can be improved and permanently fixed, is great.
I say let the mechanized reduced slaughter begin!
Isn’t Waymo in San Francisco completely self driving? And if their own recently released data is anything to go by, it would seem self driving cars are more ready than manually controlled cars. Because people are absolutely awful at driving.
Waymo cars use much better technology than Tesla.
Nobody is disputing that a machine that is never distracted and has reaction times down to fractions of a second would make a better driver than even the most skilled human, but Tesla’s FSD hardware and software aren’t there yet and probably never will be.
Comparing self-driving cars to American driving standards is kind of a moot point, because American safety standards are so low that death and injury are considered the cost of doing business.
I’d be curious to see how well waymo performs compared to a country with far safer road designs and drivers that are better trained and respect rules of the road more frequently.
Waymo is also operating in a fairly small, fixed area that is highly mapped.
Not saying that’s a bad thing, they are doing things the right way, slowly and cautiously.
Pretty normal for a Tesla.
I’m a school bus driver. This is also totally normal for human-driven vehicles.
That makes sense: Tesla uses real driver data to train the cars. The cars ignore the traffic controls humans ignore and follow the rules humans follow.
They try to fix bad behaviour, but I bet there haven’t been enough human-driven Teslas illegally passing school buses and causing collisions for Tesla to notice that FSD ignores a rule it shouldn’t.
Well, there’s the problem right there: FSD shouldn’t be doing these tests in the first place! How else is Tesla supposed to get their amazing cyber taxi out if it has to follow all these dumb rules?
Not American, but I think FSD stands for Full Self-Driving, not an organization.
I’ve seen some comments elsewhere about how it can be trusted.
Edit: like really, straight-up trusted. The delusion is unreal.
It’s not that the car is programmed poorly; it just really hates children.
Hunter Seeker Mode: School Busses. Easy Prey.