Technology Stocks : Tesla EVs - TSLA


To: A.J. Mullen who wrote (26403), 9/22/2025 3:42:27 PM
From: i-node (1 Recommendation, recommended by longz)
This is what I read in the article you refute:
In a follow-up video, the two Tesla influencers confirmed that the Model Y had a broken sway bar bracket and damaged suspension components. The vehicle is also throwing out a lot of warnings.
Yes, I read it, too. It doesn't change my statement. The guy openly told the cop it looked like a piece of paper in the road. My question is whether FSD should, in the future, try to dodge what the driver perceives as a piece of paper in the road.

Had the driver been in charge and dodged the "piece of paper" in the road, the problem would have been averted. Same is true if FSD dodged it.

There is no reason to anticipate FSD will outperform the human eye. Taking it one step further, what would FSD equipped with LiDAR have done? Moved into the oncoming traffic lane to avoid the piece of paper? Or properly perceived it as a steel piece of truck frame? Or would it have had a crisis of sensor contention (LiDAR says do this, vision says do that)?
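
As an aside on the "sensor contention" point, here is a minimal, purely hypothetical sketch of how a planner might arbitrate when a camera classifier and a LiDAR detector disagree about debris in the lane. This is not Tesla's or Waymo's actual logic; the Detection type and the arbitrate() policy are invented for illustration only.

```python
# Hypothetical illustration only -- not Tesla's or Waymo's real code.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str       # what the sensor thinks the object is, e.g. "paper"
    is_hazard: bool  # does this sensor consider the object damaging?

def arbitrate(vision: Detection, lidar: Detection) -> str:
    """Pick a maneuver when the two sensors disagree about lane debris.

    Deliberately conservative policy: if either sensor reports a hazard,
    slow down and reassess rather than swerve into oncoming traffic.
    """
    if vision.is_hazard and lidar.is_hazard:
        return "avoid_within_lane_or_brake"
    if vision.is_hazard or lidar.is_hazard:
        # Sensor contention: lean toward the sensor reporting a hazard,
        # but never choose a maneuver riskier than the hazard itself.
        return "slow_and_reassess"
    return "continue"

# Vision sees a harmless "piece of paper"; LiDAR sees a raised metal object.
print(arbitrate(Detection("paper", False),
                Detection("metal_debris", True)))  # -> slow_and_reassess
```

The point is simply that some explicit policy has to resolve the disagreement; neither "always trust LiDAR" nor "always trust vision" is obviously right.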

It is an interesting case study, I guess. But the reality is that these circumstances will occur, and over time the systems will learn what to do with them, whether it is a Tesla or a Waymo. For now, this is why users are instructed that the system is an "assist", not a self-driving system. I'm pretty sure that is clear to anyone using the system.

So, all the way around, the driver was at fault here, not Tesla's self-driving in development.

We bought a car about 6 months ago that has some kind of driver "assist". Basically it is like FSD for straight roads only. If you turn it on and it comes to a curve, the car will attempt to "pull over" when it gets to the curve. Totally useless crap, but someone may think it is useful if all the roads are straight, lol. It is up to me, the driver, to know that if I turned that on and came to a curve, I shouldn't expect it to make the turn. About the dumbest "feature" imaginable in a car, but I was warned and could figure out for myself its fitness for use.

That incident was the driver's fault if anyone's.



To: A.J. Mullen who wrote (26403), 9/22/2025 5:18:43 PM
From: Selectric II (1 Recommendation, recommended by i-node)
I've experienced my Tesla braking hard for a newly painted sign in the road. I reported the event on this thread. Tesla's system has problems interpreting some images.

Not to belittle the issue or doubt your experience, but people have problems interpreting some images, too.

For some reason, your story reminded me of a late night TV comedian from decades ago, maybe Johnny Carson, reading an auto insurance company's customer damage claim narratives. Some were very creative. One was to the effect, "I was driving down the road minding my own business, when out of nowhere, the telephone pole jumped in front of me."

I guess the question today is whether these systems and sensors are better or worse than any given average human driver, or drivers taken as a whole. And, over time, technology improves. People's aging senses and reflexes generally don't.

Are any computerized systems 100% error-free?

I hope your Tesla didn't get splattered by wet yellow lane marker paint!