If someone steals your car and kills someone with it, then disappears without ever being identified, the car owner doesn’t assume liability. Liability falls on whoever was operating it at the time. If software was driving, then the software company assumes the liability.
But you bought the driverless car and turned it on. You never agreed to the thief’s joyride. Where do you draw the line for “operation”? Is it more like operating a steering-assist car, or like operating a Roomba?
Doubt it. I mean, any self-driving car is going to make the driver agree to responsibility for what the car does, and ensure the user has a manual override available just in case.
No company is going to ship fully autonomous driving software (for example, for fully driverless taxi fleets) without contractually making the fleet owner responsible for its cars.
You treat it like any other traffic accident, except that if a self-driving car is at fault, the responsibility lies with the vehicle’s owner.
It would have to be the manufacturer.