This is what I read in the article you refute: "In a follow-up video, the two Tesla influencers confirmed that the Model Y had a broken sway bar bracket and damaged suspension components. The vehicle is also throwing out a lot of warnings." Yes, I read it, too. It doesn't change my statement. The guy openly told the cop it looked like a piece of paper in the road. My question is whether FSD should, in the future, try to dodge what the driver perceives as a piece of paper in the road.
Had the driver been in charge and dodged the "piece of paper" in the road, the problem would have been averted. The same would be true if FSD had dodged it.
There is no reason to anticipate FSD will outperform the human eye. Taking it one step further, what would FSD equipped with LiDAR have done? Moved into the oncoming traffic lane to avoid the piece of paper? Properly perceived it as a steel piece of truck frame? Or would it have had a crisis of sensor contention? (LiDAR says do this, vision says do that?)
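To make that "sensor contention" worry concrete, here is a minimal, hypothetical sketch of how fusion stacks commonly arbitrate disagreement. This is not Tesla's or Waymo's actual logic; the class, function, and threshold are invented for illustration. The idea is that the two sensors don't deadlock: each casts a confidence-weighted vote, and a high-confidence LiDAR "hard object" return can override a vision guess of "paper".

```python
# Hypothetical arbitration sketch -- NOT Tesla's or Waymo's real logic.
# Illustrates one standard answer to "LiDAR says do this, vision says do that":
# fuse confidences instead of letting one sensor veto the other.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str          # "lidar" or "vision"
    is_solid: bool       # does this sensor think the object is a hard obstacle?
    confidence: float    # 0.0 .. 1.0

def classify_object(detections: list[Detection], solid_threshold: float = 0.6) -> str:
    """Combine per-sensor opinions into one decision.

    Simple illustrative rule: weight each sensor's vote by its confidence,
    so disagreement becomes a score, not a stalemate.
    """
    total = sum(d.confidence for d in detections)
    if total == 0:
        return "unknown"
    solid_score = sum(d.confidence for d in detections if d.is_solid) / total
    return "solid_obstacle" if solid_score >= solid_threshold else "ignorable_debris"

# Example: vision mistakes the truck frame for paper; LiDAR gets a hard return.
readings = [
    Detection("vision", is_solid=False, confidence=0.55),
    Detection("lidar",  is_solid=True,  confidence=0.90),
]
print(classify_object(readings))  # -> "solid_obstacle"
```

The point is that sensor disagreement isn't inherently a crisis; it's an expected input the planner resolves with weights and thresholds. Picking those thresholds so the car neither swerves for every plastic bag nor plows into truck frames is, of course, the hard part.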
It is an interesting case study, I guess. But the reality is that these circumstances will occur, and over time the systems will learn how to handle them, whether it is a Tesla or a Waymo. For now, this is why users are instructed that the system is an "assist", not a self-driving system. I'm pretty sure that is clear to anyone using the system.
So, all the way around, the driver was at fault here, not Tesla's in-development self-driving system.
We bought a car about 6 months ago that has some kind of driver "assist". Basically, it is like FSD for straight roads only. If you turn it on and the car comes to a curve, it will attempt to "pull over" at the curve. Totally useless crap, but someone might think it is useful if all their roads are straight, lol. It is up to me, the driver, to know that if I turned it on and came to a curve, I shouldn't expect it to make the turn. About the dumbest "feature" imaginable in a car, but I was warned and could figure out for myself its fitness for use.
That incident was the driver's fault, if anyone's.