• merc · 1 year ago

    Those are pretty basic conditions that I hope are already in the training data.

    What about a wildfire evacuation? Police might have people driving on the wrong side of the highway to make use of all the lanes. Smoke might be obscuring everything. A human driver would know not to pay attention to any of the road signs in that situation without ever having been trained on it, but would a self-driving car?

    Or, how about any situation where a police officer has to have a driver roll down the window to give them instructions for dealing with some unusual situation, like a chemical spill or a landslide?

    Or, what about highway signs that have been shot up with a shotgun so that they’re hard to read? Or novelty highway signs that a business might put up as a joke?

    Self-driving cars definitely need to be tested against a much wider range of situations than a human driver. Much as we might be baffled by their lack of common sense, the common sense of an average 16-year-old is still off the charts compared to an AI. Having said that, I know how bad many drivers are, and I wouldn’t be surprised if the competent self-driving car organizations (Cruise, Waymo, etc.) are already better than an average driver in 99.9% of common scenarios.