I think we're at cross purposes. I'm sitting in my self-driving Tesla, or a passenger in someone else's. A toddler runs out into the road. The Tesla's AI has to evaluate -- and I'm not sure how it would do this -- various ways of avoiding hitting the toddler (brake, swerve, etc) and their likely outcomes. Let's assume that, if it swerves, it's going to run into oncoming traffic and if it slams on the brakes a large truck that's tailgating me is going to plough into it, with unforeseeable consequences. What does it do?
Tesla's probably a bad example, because its self-driving tech has garbage image recognition.
I would argue that proper AI tech would have seen the toddler before it ran into the road, and if the toddler was close enough to be a problem, it would have slowed down and assumed the toddler might randomly run into the road.
If it's at the point where it has to swerve into oncoming traffic to avoid a toddler, the tragedy of the situation sucks, but the car should not have been on the road, because the AI sucks: it was going too fast to simply stop.
But oh, what if the brakes fail?
That falls into the same area. The AI car should check its brakes before each trip, and if it detects potential for failure, refuse to drive. Period.
Human drivers can be, and are, negligent about maintenance; an AI should be hard-coded to NEVER be negligent. Humans can't be assed to take five minutes to check over their vehicle before each trip. An AI can run a sensor check in seconds. If a sensor fails to report, assume the brakes (or whatever it monitors) are bad.
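That fail-safe default ("no report = assume bad") is the whole policy. A minimal sketch, with a made-up sensor-report format and system list purely for illustration:

```python
# Hypothetical sketch of a fail-safe pre-trip check: a missing sensor report
# is treated exactly like a reported failure, and the car refuses to drive.

def pre_trip_check(sensor_reports):
    """sensor_reports maps system name to 'ok', 'fail', or None (no report)."""
    required = ["brakes", "steering", "tires", "cameras"]  # illustrative list
    for system in required:
        # Anything other than an explicit 'ok' means: stay parked.
        if sensor_reports.get(system) != "ok":
            return False  # refuse to drive
    return True

# The brake sensor failing to report at all is enough to refuse the trip:
print(pre_trip_check({"brakes": None, "steering": "ok",
                      "tires": "ok", "cameras": "ok"}))  # False
```

The point of the design is that silence is never interpreted as "fine" — the burden of proof is on the sensor, not the passenger.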
Basically, the AI should be programmed to always always err on the side of safety. Swerving to avoid toddlers should never need to be accounted for, because the AI will do anything to avoid being in such a situation to start with.
If it needs that coded in, the action should be to avoid the toddler. It's the path most likely to avoid loss of life or injury. A bare toddler struck by a vehicle is almost certain to be killed. A person encased in a metal enclosure, striking a car or a wall, is much more likely to be protected and survive.
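Put another way, the rule is just "pick the maneuver whose harm lands on whoever is best protected." A toy sketch of that comparison — the severity weights are invented illustrative numbers, not real crash statistics:

```python
# Hypothetical sketch of harm-minimizing maneuver choice. Unprotected road
# users are weighted far above occupants protected by the vehicle's shell.
# All numbers are made up for illustration.

HARM = {
    "hit_pedestrian": 100,    # bare toddler: near-certain fatality
    "hit_oncoming_car": 30,   # occupants have crumple zones and airbags
    "hit_wall": 20,
    "rear_ended_by_truck": 40,
}

def choose_maneuver(options):
    """options maps each maneuver name to the outcome it would produce."""
    return min(options, key=lambda m: HARM[options[m]])

print(choose_maneuver({
    "brake_hard": "rear_ended_by_truck",
    "swerve": "hit_oncoming_car",
    "continue": "hit_pedestrian",
}))  # swerve
```

With weights like these the car never chooses the toddler, which is the whole argument: the tie-breaker is built in before the emergency, not improvised during it.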
If a truck is tailgating, it's still the same idea. The tragedy sucks, but that's the fault of the truck driver for being a poor driver. If it's an AI truck, once again, it should and would be coded to always choose safety and would not be tailgating so closely. And in an actual perfect world of AI, it's already stopping, because "your" car has told every car around it: "There is a toddler in the road here."
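That last part is vehicle-to-vehicle messaging. A minimal sketch of the idea, with an invented message format and a "brake on any hazard" reaction purely for illustration (real V2V protocols like DSRC/C-V2X are far more involved):

```python
# Hypothetical sketch of a hazard broadcast: the car that sees the toddler
# alerts nearby cars, and each receiver brakes preemptively.
import json

def make_hazard_alert(hazard, lat, lon):
    # Invented message shape, not a real V2V format.
    return json.dumps({"type": "hazard", "what": hazard,
                       "lat": lat, "lon": lon})

class Car:
    def __init__(self, name):
        self.name = name
        self.braking = False

    def on_alert(self, message):
        alert = json.loads(message)
        if alert["type"] == "hazard":
            self.braking = True  # stop first, ask questions later

alert = make_hazard_alert("toddler_in_road", 51.5007, -0.1246)
nearby = [Car("tailgating_truck"), Car("oncoming_car")]
for car in nearby:
    car.on_alert(alert)
print(all(car.braking for car in nearby))  # True
```

Notice that the tailgating truck brakes because of the broadcast, not because it saw the toddler itself — that is what makes the scenario in the original question stop being a dilemma at all.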