PayPal sold for a billion bucks, the largest sale ever at the time. Now it's just integrated into eBay, which also isn't going anywhere, so I have no idea what you're implying. Did I miss something?
Autopilot turns off because the car doesn't know what to do and the driver is supposed to take control of the situation. Autopilot isn't a true autopilot, it's driver assistance, and you want it to turn off if it doesn't know what it should do.
Sure, what I meant though was that Tesla doesn't have self-driving cars the way they try to market them. They're no different from what other car manufacturers have, they just use a more deceptive name.
If an incident is imminent within the next ~2 seconds or so, the autopilot must take action or assist in taking one. A manual override can happen at any time, but on that timescale it's unlikely and only the autopilot has any real chance of reacting, so it can't just turn off and absolve itself of liability.
It seems reasonable for the autopilot to turn off just before a collision; my point was more along the lines of "You won't get a penny from Elon".
People who rely on Full Self Driving, or whatever it's called now, should be liable for letting a robot control their cars. And I also think that the company that develops and advertises said robot shouldn't get off scot-free, but it's easier to blame the shooter than the gun manufacturer.