When Autosteer is in use, it measures the amount of torque that you apply to 
the steering wheel and, if insufficient torque is applied, an escalating 
series of audible and visual alerts reminds you to place your hands on the 
wheel. This helps ensure you are attentive and trains good driving habits. 
If you repeatedly ignore these warnings, you will be locked out from using 
Autopilot for the duration of that trip.
FSD adds other features built on Autopilot, such as Navigate on Autopilot, 
but it still includes the functions above.
https://www.tesla.com/support/autopilot
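Tesla has not published the details of this logic; a minimal sketch of how an escalating hands-on-wheel check might work, with invented thresholds, stage names, and strike counts, could look like:

```python
# Hypothetical sketch of the escalation described above.
# All thresholds and stage names are invented; Tesla's real values
# and implementation are not public.
TORQUE_THRESHOLD_NM = 0.3                     # assumed "hands-on" torque
STAGES = ["visual", "visual+chime", "loud chime"]
MAX_STRIKES = 3                               # ignored cycles before lockout


def process_samples(samples):
    """Feed periodic torque readings (one per alert interval).

    Returns (alerts issued, whether Autopilot locked out for the trip).
    """
    alerts, stage, strikes = [], 0, 0
    for torque in samples:
        if torque >= TORQUE_THRESHOLD_NM:
            stage = 0                          # driver responded; reset
            continue
        alerts.append(STAGES[stage])           # escalate one step
        if stage == len(STAGES) - 1:
            strikes += 1                       # full cycle ignored
            stage = 0
            if strikes >= MAX_STRIKES:
                return alerts, True            # locked out for the trip
        else:
            stage += 1
    return alerts, False
```

Feeding it nine consecutive no-torque samples walks through three full alert cycles and then locks out; any sufficient torque reading resets the escalation.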


    On Tuesday, February 8, 2022, 09:08:07 AM CST, Peri Hartman via EV 
<[email protected]> wrote:  
 
 Technically, you're right, Paul. That is the primary point I made a 
while back, and then I asked the question: how long would it take for you 
to respond if FSD drops out? Someone mentioned that, at least, it 
gives an audible warning, so I'm assuming that "dropping out" is easy to 
notice. But someone else pointed out that if nothing goes wrong for a 
long time, it's hard or impossible to keep a high level of alertness. In 
other words, you need stimulation to stay alert.

So, even if Tesla says you must be instantly ready to regain control, is 
that a realistic position? I think I have solidly convinced myself that 
it is not. Tesla absolutely must modify FSD in some way to 
ensure the driver is alert. I'm not sure how that might work, but here 
are a few simple (and possibly stupid) ideas.

- vibrate the steering wheel from time to time
- intentionally veer out of the lane from time to time (only while safe)
- intentionally have FSD drop out from time to time and require a simple 
action to reengage
- show a blank screen on the console (maybe this would be more 
distracting than helpful)
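One way to implement the "require a simple action from time to time" idea is a randomized attention check; the random spacing matters, because a prompt that arrives on a fixed schedule could be answered by habit rather than attention. This is a hypothetical sketch, not anything Tesla ships:

```python
import random


def schedule_attention_checks(trip_minutes, mean_gap_min=8.0, seed=None):
    """Return randomized times (in minutes) at which to prompt the driver
    for a simple confirming action, e.g. a stalk tap or wheel nudge.

    Exponentially distributed gaps make the next check unpredictable,
    so the driver cannot anticipate it. mean_gap_min is an assumed
    tuning parameter, not a known value.
    """
    rng = random.Random(seed)
    t, checks = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_gap_min)  # random gap, mean 8 min
        if t >= trip_minutes:
            return checks
        checks.append(round(t, 1))
```

A supervising loop would issue the prompt at each scheduled time and escalate (as with the torque warnings) if the driver fails to respond within a short window.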

-------------

Another point I want to bring up: there has been discussion about having 
redundant systems. While that's a good idea, I don't think it addresses 
the problem at hand. FSD didn't mechanically or electrically fail. The 
software failed. Having software that goes into "I give up" mode is not 
acceptable.

The software needs to try to make the best decision it can, always. If 
one of the redundant systems "gives up," then even with 
software trained on different data and produced by different development 
teams, you don't really have good redundancy. If both keep trying rather 
than give up, then, at least, you have two candidate solutions at every 
moment to compare.
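The "both must try" idea can be sketched as an arbiter that always receives a best-effort plan from each of two independently developed planners and compares them, rather than letting either one abstain. All names, fields, and thresholds here are illustrative assumptions:

```python
# Hypothetical arbitration between two redundant planners that must
# each always produce a best-effort plan. Field names and the 0.05 rad
# agreement tolerance are invented for illustration.
from dataclasses import dataclass


@dataclass
class Plan:
    steering: float     # commanded steering angle, radians
    braking: float      # brake effort, 0..1
    confidence: float   # planner's self-assessed confidence, 0..1


def arbitrate(plan_a: Plan, plan_b: Plan, agree_tol=0.05) -> Plan:
    """Prefer agreement; on disagreement take the more confident plan,
    but always keep the harder braking as the conservative choice."""
    if abs(plan_a.steering - plan_b.steering) <= agree_tol:
        # Plans agree: blend steering, keep the stronger braking.
        return Plan((plan_a.steering + plan_b.steering) / 2,
                    max(plan_a.braking, plan_b.braking),
                    max(plan_a.confidence, plan_b.confidence))
    # Plans disagree: follow the more confident planner's steering,
    # but report the lower confidence so supervision can escalate.
    chosen = plan_a if plan_a.confidence >= plan_b.confidence else plan_b
    return Plan(chosen.steering,
                max(plan_a.braking, plan_b.braking),
                min(plan_a.confidence, plan_b.confidence))
```

The point is that disagreement between two always-trying planners is itself a useful signal, which a "one gives up" architecture throws away.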

Peri

<< Annoyed by leaf blowers ? https://quietcleanseattle.org/ >>

------ Original Message ------
From: "paul dove via EV" <[email protected]>
To: "Electric Vehicle Discussion List" <[email protected]>
Cc: "paul dove" <[email protected]>
Sent: 08-Feb-22 05:10:41
Subject: Re: [EVDL] Request Tesla crash data

>When and where did Tesla say you can let the car drive and play with your 
>radio?
>They say stay alert and keep your hands on the wheel. Sorry, it is still 
>people who don't obey who are the issue here.
>
>
>Sent from AT&T Yahoo Mail for iPhone
>
>
>On Tuesday, February 8, 2022, 5:22 AM, Peter Eckhoff via EV 
><[email protected]> wrote:
>
>All these stories remind me of the one where there was a road with a
>sharp left turn.  One foggy night, some kid repainted the yellow line at 
>the apex of the "L" so it ran straight up the proverbial oak tree.  In 
>the morning, he came back and found he had "caught" a car: someone had 
>followed the yellow line right up and into that tree.
>
>That story has me thinking that Tesla, Waymo, etc. need to pass
>a Prankster Obstacle Situation (POS) test in order to be certified as
>anywhere near Level 5.
>
>At a T-intersection, Tesla missed a stop sign??  Why did it not
>recognize that it was at an intersection that called for a stop sign 
>and come to a complete (okay, "the coast is clear" rolling) stop?  What 
>would happen if a drunk took out the sign the day before and it had not 
>been replaced?  What happens if a kid places a garbage bag over the stop
>sign in an impromptu POS test?
>
>In an interview between Sandy Munro and Elon Musk, Musk was saying
>that road line painting was not standardized and Tesla was having a
>hard time navigating through road construction areas.  Here you have
>an area where people are working and are expecting to go home at
>shift's end instead of to a hospital or morgue.  Okayyy, Tesla is
>working on it.
>
>Last night, I put my Tesla Model Y into Autopilot a little before a
>construction area.  It was a slow drizzle/rainy night.  Autopilot lost
>it and I had to take control.  I'm grateful that no one was beside me
>and it didn't go the other direction and into a Jersey Barrier.
>Autopilot had grabbed the "wrong lines"?
>
>The more questions I ask of FSD and its kin, the more I am convinced
>that the sensors should be beefing up the driver's skills and alerting
>them to possible bad situations instead of trying to play God with
>your safety and mine.
>
>I think we have to ask questions of any FSD system.  FSD is not an
>airliner being able to fly autonomously between point A and B in a
>well controlled airspace or SpaceX nailing a landing on a barge
>somewhere out in the middle of the Atlantic.  We are dealing with
>situations where people, cars, and environmental factors interact in a
>myriad of ways every second of the day.  You don't have a kid kicking
>a ball out onto a runway or a commuter/pedestrian with "get
>home-itis" in the middle of the Atlantic.  When was the last time a
>deer crossed the path of a SpaceX rocket?
>
>I think it is past time to rethink pursuing Level 5 and to
>concentrate on beefing up driver awareness of potential problems,
>letting the computer between our ears try to handle these situations.
>We are far better at reading the intentions of other drivers and
>pedestrians than any FSD package that may have trouble recognizing
>and then displaying the silhouette of a pedestrian.
>
>Peter Eckhoff
>
>PS  I love my Model Y and I am not going to sell it.  She's like a
>beloved girlfriend with a few bad habits.
>_______________________________________________
>Address messages to [email protected]
>No other addresses in TO and CC fields
>UNSUBSCRIBE: http://www.evdl.org/help/index.html#usub
>ARCHIVE: http://www.evdl.org/archive/
>LIST INFO: http://lists.evdl.org/listinfo.cgi/ev-evdl.org
>
>
>

  
