  1. #11
    Senior Member
    Join Date: Mar 2014 | Posts: 560 | Thanks Given: 213 | Thanked 357 Times in 194 Posts
    Quote Originally Posted by jdong
    That guy was Joshua Brown, who's posted other Autopilot YouTube videos before. So, for sure they have dashcam footage and can confirm whether or not he was distracted: http://electrek.co/2016/06/30/tesla-...ught-on-video/


    The article has the details on the crash (divided highway, he had right of way, oncoming truck made a left turn assuming other traffic would stop for him). However, not being at fault != could not have prevented the accident. It's unclear whether or not he was distracted because of trusting the Autopilot…



    Either way, it's tragic, but it's a cautionary tale that traffic conditions can change at a second's notice. You can use automation features like Autopilot to let yourself be more alert to the big picture around you…. Or you can use them to pay less attention and play Angry Birds while your car keeps you in your lane. Obviously, one is less moronic than the other, but human temptation can push you toward the latter.



    (To directly answer your question: AP's camera is in a firewalled, separate module; it does not record video and cannot transmit video to the rest of the car's electronics. So no, they would not have a video recording of what happened in the car. They do, however, have full logs of the car's settings, speed, driver input, the digital representation of what the AP interpreted on the road, etc.)
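
    (Purely as an illustration of what those logs might contain, here is a hypothetical event record; every field name and value below is an assumption made up for this sketch, not Tesla's actual log schema:)

    Code:
    # Hypothetical illustration only: field names and values are assumptions,
    # not Tesla's real logging format.
    autopilot_event = {
        "timestamp_utc": "2016-01-01T00:00:00Z",     # generic placeholder, not crash data
        "vehicle_speed_mph": 65,                     # assumed value
        "autopilot_engaged": True,
        "autosteer_active": True,
        "tacc_set_speed_mph": 65,
        "driver_steering_torque_nm": 0.0,            # no recent driver input
        "driver_brake_applied": False,
        "hands_on_wheel_warning_active": False,
        "perceived_objects": [                       # the AP's digital interpretation of the road
            {"type": "lane_line", "side": "left", "confidence": 0.97},
            {"type": "lane_line", "side": "right", "confidence": 0.95},
            {"type": "vehicle", "lane": "same", "range_m": 48.0},
        ],
    }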
    With semi-autonomous cars, I can see the same issues arising that led to this mid-air collision over Germany: https://en.wikipedia.org/wiki/%C3%9C...-air_collision. If the autopilot tells the driver to do one thing and the driver does otherwise, thinking he knows better based on what he can see, while another driver does exactly what his autopilot says, there will still be a collision (in the accident mentioned, one pilot obeyed TCAS while the other obeyed ATC, and they collided with no survivors).

    Today, pilots are given instructions to obey TCAS at all times, regardless of what ATC says. The same type of training would be required for human car drivers.
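
    (As a toy sketch of why the "always obey the automated advisory" rule matters; this is not real TCAS logic, just two aircraft with complementary climb/descend advisories and a conflicting controller instruction:)

    Code:
    # Toy illustration, not real TCAS logic. Two aircraft at the same altitude get
    # complementary resolution advisories; ATC, unaware of them, tells one to descend.

    def chosen_rate_fpm(advisory_fpm, atc_fpm, obeys_automation):
        """Vertical rate the crew actually flies."""
        return advisory_fpm if obeys_automation else atc_fpm

    def separation_after_ft(seconds, rate_a_fpm, rate_b_fpm):
        return abs(rate_a_fpm - rate_b_fpm) * seconds / 60

    # Aircraft A is advised to climb (+1500 fpm), aircraft B to descend (-1500 fpm).
    # ATC separately tells aircraft A to descend (-1500 fpm).
    for a_obeys_automation in (True, False):
        rate_a = chosen_rate_fpm(+1500, -1500, a_obeys_automation)
        rate_b = -1500  # B simply follows its descend advisory
        sep = separation_after_ft(30, rate_a, rate_b)
        print(f"A obeys its advisory: {a_obeys_automation} -> "
              f"~{sep:.0f} ft of separation gained after 30 s")

    # When both crews follow their advisories they diverge (~1500 ft in 30 s); if one
    # crew follows the conflicting ATC call instead, both descend and gain nothing.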
    Last edited by awj223; 06-30-2016 at 08:09 PM.

  2. #12
    Senior Member
    Join Date: Apr 2014 | Posts: 58 | Thanks Given: 43 | Thanked 31 Times in 18 Posts
    Quote Originally Posted by awj223
    With semi-autonomous cars, I can see the same issues arising that led to this mid-air collision over Germany: https://en.wikipedia.org/wiki/%C3%9C...-air_collision. If the autopilot tells the driver to do one thing and the driver does otherwise, thinking he knows better based on what he can see, while another driver does exactly what his autopilot says, there will still be a collision (in the accident mentioned, one pilot obeyed TCAS while the other obeyed ATC, and they collided with no survivors).

    Today, pilots are given instructions to obey TCAS at all times, regardless of what ATC says. The same type of training would be required for human car drivers.

    Yeah, there are a lot of dilemmas that these semi-autonomous cars can cause. Off the top of my head:

    (1) Driver falsely lured into thinking the car is fully autonomous, and ends up paying less attention because his past experience leads him to believe the car is infallible.
    (2) The car gives an alarm but it catches the driver off guard, and the driver cannot take in enough context to take over. Human -> AP and AP -> Human transitions can be incredibly dangerous, especially when they're triggered by an imminent collision. There was a near miss in aviation where a napping pilot woke up, mistook the planet Venus for the lights of an oncoming plane, and commanded a nosedive. A lot can change on the road if you look away for a short period of time. If you look up and suddenly see the side of a semi truck, how long would it take to make sense of the situation? (A rough sketch of the timing involved follows this list.)
    (3) Driver mistakes the driver-assistance system for a more skilled driver than himself. There have been a few plane crashes where, after the plane gave a stall warning, the pilots tried to use the AutoThrottle/AutoPilot to apply the corrective inputs.
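
    (Rough back-of-the-envelope numbers on point 2; the 2.5-second takeover time and the braking figures here are assumptions, not measured values:)

    Code:
    # Back-of-the-envelope sketch of why last-second human takeovers are dangerous.
    # The takeover time and deceleration below are assumed round numbers.

    def distance_to_stop_m(speed_mps, takeover_s=2.5, decel_mps2=7.0):
        """Distance covered while the driver re-orients, plus hard-braking distance."""
        reorient = speed_mps * takeover_s
        braking = speed_mps ** 2 / (2 * decel_mps2)
        return reorient + braking

    for mph in (45, 65, 75):
        mps = mph * 0.44704
        print(f"{mph} mph: roughly {distance_to_stop_m(mps):.0f} m needed if the driver "
              f"has to look up, make sense of the scene, and then brake hard")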


    Driver-assistance systems are going to be a cat-and-mouse game between the tech improving and human behavior regressing. But it seems fair to give the benefit of the doubt in this case: it was an out-of-the-ordinary traffic situation that could easily catch human drivers off guard as well.


    Oh yeah, as an aside: currently Tesla OS 7.1 doesn't recognize oncoming cars or perpendicular cross traffic reliably. Right now, it mostly understands the pack of traffic you're traveling with, to help with lane and distance regulation. However, just today, rumors came out about Tesla OS 8.0, which does add recognition of those types of traffic…. I think Tesla deserves some credit here for constantly improving their semi-autonomous capabilities for existing owners instead of just adding new features to next-model-year cars and keeping older car owners less safe when it's just a software change.
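
    (To give a concrete sense of what "recognizing oncoming or perpendicular cross traffic" means, here is a toy classifier based only on relative heading; this is just an illustration, not how Tesla's vision stack actually works, and the angle thresholds are arbitrary:)

    Code:
    # Toy illustration: classify another vehicle by its heading relative to the ego car.
    # Not Tesla's actual approach; the angle thresholds are arbitrary.

    def classify_traffic(ego_heading_deg, target_heading_deg):
        diff = abs((target_heading_deg - ego_heading_deg + 180) % 360 - 180)
        if diff < 30:
            return "same-direction traffic (the pack you travel with)"
        if diff > 150:
            return "oncoming traffic"
        if 60 < diff < 120:
            return "perpendicular cross traffic"
        return "oblique / merging traffic"

    print(classify_traffic(0, 5))     # same-direction traffic
    print(classify_traffic(0, 178))   # oncoming traffic
    print(classify_traffic(0, 92))    # perpendicular cross traffic, e.g. a truck turning across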
