  1. #1
    Senior Member RedRocket's Avatar
    Join Date
    Nov 2013
    Location
    North America
    Posts
    2,057
    Thanks Given
    1,081
    Thanked 1,543 Times in 846 Posts
    Quote Originally Posted by jdong View Post
    Tesla responded by claiming that the vehicle logs show the car was in Drive and the driver slammed the accelerator down full force. Since all the Model Xs delivered at the time were 600+ hp P90Ds, that is a fatal mistake and will end up in a wall 100% of the time until Elon invents mid-air reverse thrusters for SUVs! - lol, good one!


    I think it's troubling that in modern cars, accident investigations put your word against the manufacturer's, since they designed the logging system and are the ones interpreting it. It's the Toyota incidents all over again: does the car have a faulty accelerator sensor, or did the guy's wife, on one of the very first drives of a brand-new supercar, hit the wrong pedal and launch it into a building? I would honestly say the latter is more probable.

    With the ridiculous power of the Model S / Model X, we are seeing a lot more "ordinary" drivers with supercar performance at their disposal... for better or for worse!


    EDIT: It's worth noting that the owner's other claims are a bit dubious. Tesla's UI will not allow engaging cruise control or Autopilot below 18 mph unless you are following another car, which was not the case here. And of course, collision-"avoidance" systems only kick in under limited circumstances, usually just inattentive rear-endings. Cars are not smart enough yet to prevent 100% of accidents, and no automaker is ballsy enough to program a car to disobey a driver's full-throttle inputs. I would guess most of the folks on this forum would oppose a car that overrides the driver's intentions because the car's firmware thinks the action is unsafe!
    Agree fully. It's a "he said, she said" situation when only one side gets first look at the in-car data logging; the potential for manipulation of recorded evidence is ever present at this time.
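
The engagement rule described in the post (an 18 mph floor with an exception while following another car) boils down to a simple predicate. A minimal sketch of that gate, assuming hypothetical names throughout; this is not Tesla's actual API or logic, just the condition as the post states it:

```python
# Sketch of the engagement gate described in the post above.
# The 18 mph floor and the follow-car exception come from the post;
# the function and constant names are hypothetical.

MIN_ENGAGE_SPEED_MPH = 18

def may_engage_autosteer(speed_mph: float, following_lead_car: bool) -> bool:
    """Allow engagement above the speed floor, or at any speed
    when the car is already tracking a lead vehicle."""
    return speed_mph > MIN_ENGAGE_SPEED_MPH or following_lead_car
```

Under this reading, a car alone on the road at 10 mph could not engage, but the same car trailing a lead vehicle could.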

  2. #2
    Senior Member
    Join Date
    Apr 2014
    Posts
    58
    Thanks Given
    43
    Thanked 31 Times in 18 Posts
    Quote Originally Posted by RedRocket View Post
    Agree fully. It's a "he said, she said" situation when only one side gets first look at the in-car data logging; the potential for manipulation of recorded evidence is ever present at this time.
    Yeah, I think before long the NHTSA may need to take on an NTSB-like role, similar to aircraft black-box interpretation… The manufacturer has every incentive to design the logging system and interpret the data to its own benefit. I think in this particular case it's probably 99% driver error, but that's not good enough. If I were ever involved in a situation like this, I would want a neutral third party examining the data and the design of the logging system.
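
The worry about one-sided access to logs is exactly what tamper-evident logging is meant to address: if each entry commits to the previous one via a hash, a neutral auditor can detect after-the-fact edits. A minimal sketch of the idea; the record format and function names are illustrative assumptions, not any real EDR format:

```python
import hashlib
import json

def append_entry(log: list, record: dict) -> None:
    """Append a record whose hash covers the previous entry's hash,
    so retroactive edits break the chain."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log: list) -> bool:
    """A neutral third party recomputes every hash; any tampered
    record (or re-ordering) makes verification fail."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True
```

This only makes tampering detectable, not impossible; in practice the chain head would also need to be anchored somewhere outside the manufacturer's control.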

  3. #3
    Senior Member
    Join Date
    Mar 2014
    Posts
    560
    Thanks Given
    213
    Thanked 357 Times in 194 Posts
    Quote Originally Posted by jdong View Post
    Yeah, I think before long the NHTSA may need to take on an NTSB-like role, similar to aircraft black-box interpretation… The manufacturer has every incentive to design the logging system and interpret the data to its own benefit. I think in this particular case it's probably 99% driver error, but that's not good enough. If I were ever involved in a situation like this, I would want a neutral third party examining the data and the design of the logging system.
    Will be interesting to see how they analyze the logs from this crash. Do they have a camera/visual light capture system so that they can "see" what the driver would have seen?
    http://www.theverge.com/2016/6/30/12...nomous-model-s

  4. #4
    Senior Member
    Join Date
    Apr 2014
    Posts
    58
    Thanks Given
    43
    Thanked 31 Times in 18 Posts
    Quote Originally Posted by awj223 View Post
    Will be interesting to see how they analyze the logs from this crash. Do they have a camera/visual light capture system so that they can "see" what the driver would have seen?
    http://www.theverge.com/2016/6/30/12...nomous-model-s


    That guy was Joshua Brown, who has posted other Autopilot YouTube videos before. So they surely have dashcam footage and can confirm whether or not he was distracted: http://electrek.co/2016/06/30/tesla-...ught-on-video/


    The article has the details of the crash (divided highway, he had right of way, an oncoming truck made a left turn assuming other traffic would stop for it). But not at fault != could not have prevented the accident. It's unclear whether he was distracted because he trusted Autopilot…



    Either way, it's tragic, but a cautionary tale that traffic conditions can change at a second's notice. You can use automation features like Autopilot to be more alert to the big picture around you… or you can use them to pay less attention and play Angry Birds while your car keeps you in your lane. Obviously, one is less moronic than the other, but human temptation can push you toward the latter.



    (To directly answer your question: AP's camera is in a firewalled, separate module; it does not record video and cannot transmit video to the rest of the car's electronics. So no, they would not have a video recording of what happened in the car. They do, however, have full logs of the car's settings, speed, driver inputs, the digital representation of what AP interpreted on the road, etc.)
    Last edited by jdong; 06-30-2016 at 06:15 PM.

  5. #5
    Senior Member
    Join Date
    Mar 2014
    Posts
    560
    Thanks Given
    213
    Thanked 357 Times in 194 Posts
    Quote Originally Posted by jdong View Post
    That guy was Joshua Brown, who has posted other Autopilot YouTube videos before. So they surely have dashcam footage and can confirm whether or not he was distracted: http://electrek.co/2016/06/30/tesla-...ught-on-video/


    The article has the details of the crash (divided highway, he had right of way, an oncoming truck made a left turn assuming other traffic would stop for it). But not at fault != could not have prevented the accident. It's unclear whether he was distracted because he trusted Autopilot…



    Either way, it's tragic, but a cautionary tale that traffic conditions can change at a second's notice. You can use automation features like Autopilot to be more alert to the big picture around you… or you can use them to pay less attention and play Angry Birds while your car keeps you in your lane. Obviously, one is less moronic than the other, but human temptation can push you toward the latter.



    (To directly answer your question: AP's camera is in a firewalled, separate module; it does not record video and cannot transmit video to the rest of the car's electronics. So no, they would not have a video recording of what happened in the car. They do, however, have full logs of the car's settings, speed, driver inputs, the digital representation of what AP interpreted on the road, etc.)
    With semi-autonomous cars, I can see the same issues arising as in this https://en.wikipedia.org/wiki/%C3%9C...-air_collision mid-air collision over Germany. If the autopilot instructs the driver to do one thing and the driver does otherwise, thinking he knows better based on what he can see, while another driver does what his autopilot says, there will still be a collision. (In the collision linked above, one pilot obeyed TCAS while the other obeyed ATC, and the two aircraft collided with no survivors.)

    Today, pilots are instructed to obey TCAS at all times, regardless of what ATC says. The same type of training would be required for human car drivers.
    Last edited by awj223; 06-30-2016 at 07:09 PM.
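
The post-Überlingen rule described above ("obey TCAS regardless of ATC") is an arbitration policy: when the automated avoidance system issues an advisory, it wins over any conflicting instruction. A toy sketch of that rule transplanted to a car context; the enum values and policy here are illustrative assumptions, not any real vehicle or avionics system:

```python
# Sketch of the "always obey the avoidance advisory" rule described
# above. All names and the policy itself are hypothetical.

from enum import Enum
from typing import Optional

class Command(Enum):
    BRAKE = "brake"
    ACCELERATE = "accelerate"
    MAINTAIN = "maintain"

def arbitrate(avoidance_advisory: Optional[Command],
              driver_input: Command) -> Command:
    """When the collision-avoidance system issues an advisory, it
    overrides a conflicting human instruction; otherwise the driver's
    input passes through unchanged."""
    return avoidance_advisory if avoidance_advisory is not None else driver_input
```

The Überlingen point is that such a rule only prevents collisions if every participant follows the same priority; a mixed fleet where some drivers obey the machine and others override it recreates the TCAS-versus-ATC conflict.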

  6. #6
    Senior Member
    Join Date
    Apr 2014
    Posts
    58
    Thanks Given
    43
    Thanked 31 Times in 18 Posts
    Quote Originally Posted by awj223 View Post
    With semi-autonomous cars, I can see the same issues arising as in this https://en.wikipedia.org/wiki/%C3%9C...-air_collision mid-air collision over Germany. If the autopilot instructs the driver to do one thing and the driver does otherwise, thinking he knows better based on what he can see, while another driver does what his autopilot says, there will still be a collision. (In the collision linked above, one pilot obeyed TCAS while the other obeyed ATC, and the two aircraft collided with no survivors.)

    Today, pilots are instructed to obey TCAS at all times, regardless of what ATC says. The same type of training would be required for human car drivers.

    Yeah, there are a lot of dilemmas these semi-autonomous cars can cause. Off the top of my head:

    (1) The driver is falsely lured into thinking the car is fully autonomous and ends up paying less attention, because his past experience leads him to believe the car is infallible.
    (2) The car sounds an alarm but catches the driver off guard, and the driver can't gather enough context to take over. Human -> AP and AP -> Human transitions can be incredibly dangerous, especially when they're triggered by an imminent collision. There was a near miss in aviation where a napping pilot woke up, mistook the planet Venus for the lights of an oncoming plane, and commanded a nosedive. A lot can change on the road if you look away for even a short time. If you look up and suddenly see the side of a semi truck, how long would it take to make sense of the situation?
    (3) The driver mistakes the driver-assistance system for a more skilled driver than himself. There have been a few plane crashes where, after the plane gave a stall warning, the pilots tried to use autothrottle/autopilot to apply the corrective inputs.


    Driver-assistance systems are going to be a cat-and-mouse game between the tech improving and human behavior regressing. But it seems fair to give the benefit of the doubt here: this was an out-of-the-ordinary traffic situation that could easily catch human drivers off guard as well.


    Oh yeah, as an aside: currently, Tesla OS 7.1 doesn't reliably recognize oncoming cars or perpendicular cross traffic. Right now it mostly understands the pack of traffic you're traveling with, to help with lane keeping and distance regulation. However, just today, rumors came out about Tesla OS 8.0, which does add recognition of those types of traffic… I think Tesla deserves some credit here for constantly improving their semi-autonomous capabilities for existing owners, instead of just adding new features to next-model-year cars and leaving older-car owners less safe when it's only a software change.
