It is no secret that the safe operation of a motor vehicle depends on the human visual system. Simply put, driving is first and foremost a visually guided behavior; some have estimated that up to 90% of the information used in driving is visual [1]. Vision therefore plays a central role in the tasks a driver must successfully perform to operate a motor vehicle safely [1]. This places a continuing demand on our sight and insight, even as the world around us adopts more and more technological change.

The challenge posed to the visual system while driving is amplified at night, when the human visual system has a reduced capacity for distinguishing between objects in the visual field. Contrast sensitivity is what enables a driver to distinguish objects at night: to be seen, an object must be sufficiently brighter or darker than its background, and it must be conspicuous. An object must reach sufficient intensity for a driver to identify it and process the information; if the stimulus is not strong enough, the driver will either not respond, or the response will be delayed. Factoring in the limited time available to respond, the driver must see the hazard, whether pedestrian or object, far ahead of the point of conflict, as “vehicle stopping physics consumes distance” [1].

Given the complexity of the human visual system and the challenges posed by nighttime driving (or inclement weather, for that matter), it is natural to wonder whether self-driving vehicles, with their arrays of sensors, could do a better job than the human eye. Early news reports suggest this is not yet the case. In March of 2018, what is believed to be the first fatal crash between a self-driving car and a pedestrian was reported in Arizona. The crash occurred at night, around 10 PM, when the self-driving vehicle struck a pedestrian as she walked her bicycle across the street [2][3]. Although the vehicle was in autonomous mode, a safety driver was sitting in the driver’s seat [3]. It was later reported that the driver behind the wheel of the autonomous Uber vehicle was watching “The Voice” via a streaming service in the minutes leading up to the crash [2]. As of March 5, 2019, Arizona prosecutors stated they had not found evidence to charge Uber with a crime in connection with the incident [3].

It is unfortunate that the crash might have been prevented had the safety driver been paying attention, but the incident also suggests that, at least in its current state, autonomous technology cannot replace the human visual system, particularly in night conditions. For more information on the role of Human Factors in accident investigation and reconstruction, click here: https://www.liskeforensics.com/human-factors/

LISKE Reconstructs all Accidents and Injuries, Anywhere.


[1] Green, M. (2017). Roadway Human Factors: From Science to Application. Lawyers & Judges Publishing Company, Inc.