Technology Stocks : Tesla EVs - TSLA
To: A.J. Mullen who wrote (26403)9/22/2025 3:42:27 PM
From: i-node
This is what I read in the article you refute:

"In a follow-up video, the two Tesla influencers confirmed that the Model Y had a broken sway bar bracket and damaged suspension components. The vehicle is also throwing out a lot of warnings."

Yes, I read it, too. It doesn't change my statement. The guy openly told the cop it looked like a piece of paper in the road. My question is whether FSD should, in the future, try to dodge what the driver perceives as a piece of paper in the road.

Had the driver been in charge and dodged the "piece of paper" in the road, the problem would have been averted. The same would be true if FSD had dodged it.

There is no reason to anticipate that FSD will outperform the human eye. Taking it one step further, what would an FSD system equipped with LiDAR have done? Moved into the oncoming traffic lane to avoid the piece of paper? Correctly perceived it as a steel piece of truck frame? Or would it have had a crisis of sensor contention? (LiDAR says do this, vision says do that.)
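To make the "sensor contention" question concrete, here is a purely hypothetical toy sketch of how a planner might arbitrate when two sensors disagree about an object in the road. None of this reflects Tesla's or Waymo's actual logic; every name and rule here is an assumption for illustration only.

```python
# Toy arbitration sketch (hypothetical, not any real system's logic):
# when vision and LiDAR disagree about whether an object is solid,
# one fail-safe policy is to believe whichever sensor reports danger.

def arbitrate(vision_says_solid: bool, lidar_says_solid: bool,
              can_change_lane_safely: bool) -> str:
    """Return a driving action given possibly conflicting sensor reports."""
    if vision_says_solid or lidar_says_solid:
        # Treat any "solid obstacle" report as real: dodge only if the
        # adjacent lane is clear, otherwise brake in the current lane.
        return "change_lane" if can_change_lane_safely else "brake"
    # Both sensors agree the object is harmless (e.g., paper): drive on.
    return "continue"

print(arbitrate(False, True, False))  # LiDAR flags debris, no safe lane -> brake
print(arbitrate(False, False, True))  # both say "just paper" -> continue
```

The design choice in this sketch (believe the more alarming sensor) trades false braking events for safety; the opposite choice (require agreement before acting) would have driven over the steel piece here just as vision-only did.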

It is an interesting case study, I guess. But the reality is that these circumstances will occur, and over time the systems will learn what to do with them, whether it is a Tesla or a Waymo. For now, this is why users are instructed that the system is an "assist," not a self-driving system. I'm pretty sure that is clear to anyone using the system.

So, all the way around, the driver was at fault here, not Tesla's in-development self-driving system.

We bought a car about six months ago that has some kind of driver "assist." Basically it is like FSD for straight roads only: if you turn it on and come to a curve, the car will attempt to "pull over" when it reaches the curve. Totally useless, but someone may think it is useful if all their roads are straight, lol. It is up to me, the driver, to know that if I turn that on and come to a curve, I shouldn't expect it to make the turn. About the dumbest "feature" imaginable in a car, but I was warned and could judge for myself its fitness for use.

That incident was the driver's fault if anyone's.