Mid70's Chrysler Fanatic
Active Member
Tesla recalls 362,758 vehicles, says Full Self-Driving Beta software may cause crashes
And Elon Muskox got pissed off at the word “recall.”
Who is to blame in an accident when the driver is not actually driving? I have a hard time wrapping my head around that scenario.
I'm skeptical it will ever be solved. When I last checked on the technology two years ago, the "trolley problem" hadn't been solved.
Do you let the CAR (its AI system) decide to "hit the child that ran in front of it," OR "wreck itself to avoid hitting the child but maybe injure the occupants"?
Before that decision, is the AI/electronics even capable of the situational awareness (the vision systems, the integration with vehicle systems, etc.) needed to make the decision to "hurt the pedestrian" or "the occupant"?
Maybe it's NOT binary, but sometimes it could be that kind of decision - the Trolley Problem. Someone/something is standing by the lever ... hurt five, or hurt one?
All I can think of is the movie "I, Robot," where the robot decided who to save in the water. Yes, I agree -- I do NOT think the "AI" can develop the "situational awareness" to decide who gets hurt (its "sensing" won't be good enough).
Let's say we CAN improve its sensing to equal or better a human's; the rest of "situational awareness" is whether it "appreciates" the trolley dilemma.
The trolley is gonna kill somebody: five people, or one. What if the "one" is somebody the lever puller knows and all the "five" are strangers?
With a vehicle, the situation could be this: the stationary "one" in the street with his back turned is the car owner's brother (would the AI even know that?), while the "five" are a crowd of strangers standing on the curb. Somebody is gonna get hit by the car. Who?
We can "scenario" dozens of situations: animate things vs. inanimate, people vs. animals, elderly vs. kids, and so on.
How many people do we have to actually imperil in order to "teach" (code) the AI how to decide? Even if teaching it were possible, do we want it (a machine WE created) to decide "life and death" for us (the creators)?
Under any circumstances?
Sticking to cars ... I don't think such a decision should be left up to the car. I, too, don't even think it can be done!
So why are we trying this? That's rhetorical. I have a theory for another thread at another time.
Too many computer geeks watched Knight Rider as kids.
I wonder what Musk would say…er, tweet…if Cadillacs with Super Cruise needed a major software fix and GM brass took exception to the word “recall.”
Musk may have a point about "recall." Regulators' terminology may NOT yet distinguish between "software downloads" as fixes vs. taking the car into a repair facility to replace a "hard" part, as has been the case for 100 years.