Aerial grayscale illustration of autonomous cars at an urban intersection

Crossing the road in the world of autonomous cars.

VR prototyping for communicating autonomous vehicle intent to pedestrians. How will the chicken cross the road in an autonomous world?

Crossing the road is a social interaction. Eye contact, hand-waving, the head nod — these are all social norms pedestrians and drivers use to communicate intention. Take the driver out of the vehicle, and this subtle, very human etiquette becomes a new challenge. As a first step in exploring this space, we set out to design for the seemingly simple scenario of a pedestrian crossing the road in an autonomous world.

Our design hypothesis is as follows:

Autonomous vehicles will need to communicate their behavior and intentions to human drivers, pedestrians, and potentially other autonomous vehicles. These means of communication will need to become as familiar as looking at a crosswalk signal or traffic light.

Designing for a future always requires setting some self-evident truths, so we developed the following assumptions:

  • Autonomous vehicles will always obey traffic laws
  • Autonomous vehicles will do their best to not injure humans
  • Pedestrians will still rely on signals when crossing a busy street
  • Pedestrians will jaywalk if they decide that it’s safe to do so

Building a virtual world of autonomous vehicles.

Woman crossing in front of a static autonomous car prototype and woman experiencing traffic in VR
Crossing in front of a static prototype (left) and experiencing traffic in VR (right)

Before we jump into our process and findings, we want to call out a few things we learned along the way. In a prior project, we explored a pedestrian crossing scenario using a scale but stationary prototype. It had some empathetic merit, but it struggled to deliver the feeling of crossing through real traffic. So, in this project, we used VR instead. While still not actual traffic, it effectively drove empathy without risking life and limb. VR also let us build an experience at scale; experiencing streets filled with self-driving vehicles that collaborate and express intent as an intelligent whole was similarly impactful.

Vehicle behavior and traffic flow created in Unity
We used Unity to design individual vehicle behavior (left) and create traffic flow (right).
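The actual behavior lived in Unity scripts, but the core yielding logic each vehicle ran every frame can be sketched in a few lines. The class name, speeds, and distance thresholds below are illustrative assumptions, not our production values:

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    """Simplified stand-in for a scripted Unity vehicle on a 1-D lane."""
    position: float            # meters along its lane
    speed: float               # m/s
    cruise_speed: float = 8.0  # target speed with a clear road
    stop_margin: float = 4.0   # stop this far before a pedestrian

    def braking_distance(self, decel=3.0):
        # v^2 / (2a): distance needed to stop at a comfortable deceleration.
        return self.speed ** 2 / (2 * decel)

    def step(self, pedestrian_positions, dt=0.1):
        # Yield if the nearest pedestrian ahead is within stopping range.
        ahead = [p for p in pedestrian_positions if p > self.position]
        must_yield = bool(ahead) and (
            min(ahead) - self.position < self.stop_margin + self.braking_distance()
        )
        accel = -3.0 if must_yield else 1.5
        self.speed = min(self.cruise_speed, max(0.0, self.speed + accel * dt))
        self.position += self.speed * dt

# A vehicle approaching a pedestrian 20 m ahead brakes to a stop short of them.
car = Vehicle(position=0.0, speed=8.0)
for _ in range(200):
    car.step([20.0])
```

The interesting design lever turned out to be the deceleration profile: braking early and smoothly reads as intentional yielding, while a late, hard stop reads as a malfunction.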


Crosswalk scenarios.

Autonomous vehicles will no doubt bring changes to urban infrastructure. However, people will still rely on cognitive, environmental, and instructional cues when crossing the road. Using our VR prototype, we tested three crossing scenarios, each requiring a different combination of cues and representing a type of crosswalk that could exist in a driverless future:

Marked crosswalk with traditional pedestrian traffic signal
Marked crosswalk with traditional pedestrian traffic signal.

Pushing the call button to request safe crossing at a designated spot is a common interaction at a crosswalk. This type of crossing is likely to still be present in the autonomous future.

Marked crosswalk without pedestrian traffic signal
Marked crosswalk without pedestrian traffic signal.

Similar to the first scenario, pedestrians will request to cross at a marked crosswalk by pressing a call button. Unlike the first scenario, however, the request is sent directly to the vehicles, and the pedestrian crossing signal is then displayed across the vehicles themselves instead of on a dedicated traffic light. The vehicles effectively replace part of the urban infrastructure.
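In simulation terms, this scenario is a broadcast: the button sends one request, and every nearby vehicle both stops and takes over the signaling role. A minimal sketch, with class and message names that are our own illustrative assumptions rather than any real V2X protocol:

```python
class Vehicle:
    """A car that can show a crossing signal on its exterior."""
    def __init__(self, vehicle_id):
        self.vehicle_id = vehicle_id
        self.display = "DRIVE"  # what the car's exterior currently shows

    def receive(self, message):
        if message["type"] == "CROSSING_REQUESTED":
            self.display = "WALK"   # stop and show the walk signal
        elif message["type"] == "CROSSING_CLEARED":
            self.display = "DRIVE"

class CallButton:
    """Stands in for the push button at a marked, signal-less crosswalk."""
    def __init__(self, nearby_vehicles):
        self.nearby_vehicles = nearby_vehicles

    def press(self):
        # Broadcast to every approaching vehicle, so the cars collectively
        # replace the dedicated pedestrian traffic light.
        for vehicle in self.nearby_vehicles:
            vehicle.receive({"type": "CROSSING_REQUESTED"})

cars = [Vehicle(i) for i in range(3)]
CallButton(cars).press()
```

The key property we wanted to prototype is that the signal is redundant across vehicles: the pedestrian sees the same instruction on every stopped car, not on a single pole.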

Unmarked crosswalk
Unmarked crosswalk.

In this scenario, pedestrians cross a smaller street wherever they please, relying on the cars to recognize their intent and stop in both lanes. The vehicles signal their awareness of the pedestrian and their intended actions. This notion of an autonomous vehicle signaling awareness and intention led to a lot of iterations.
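Two pieces make this scenario work: recognizing crossing intent, and coordinating a stop across both lanes. One hypothetical sketch of each, where the heuristic, coordinate convention, and state names are all illustrative assumptions rather than a real perception pipeline:

```python
def shows_crossing_intent(samples, curb_x=0.0, window=3):
    """Heuristic: a pedestrian whose last few sampled positions move
    steadily toward the curb is treated as intending to cross."""
    recent = samples[-window:]
    if len(recent) < window:
        return False
    gaps = [abs(x - curb_x) for x in recent]
    return all(prev > nxt for prev, nxt in zip(gaps, gaps[1:]))

def yield_both_lanes(vehicles):
    """Every vehicle, in either lane, stops and acknowledges the pedestrian."""
    for v in vehicles:
        v["state"] = "yielding"
        v["signal"] = "pedestrian_seen"  # exterior cue acknowledging the person

# A pedestrian walking from 3 m away toward the curb at x = 0:
track = [3.0, 2.2, 1.4, 0.7]
vehicles = [{"lane": "north"}, {"lane": "south"}]
if shows_crossing_intent(track):
    yield_both_lanes(vehicles)
```

The point of the both-lane coordination is the VR finding above: a pedestrian only feels safe when the street yields as a whole, not one car at a time.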


Communicating intent to pedestrians.

We explored different ways of communicating with pedestrians using abstract visuals, text, numbers, and icons. Through quick VR prototypes, we eliminated all text and numbers: they were difficult to read while moving, created visual noise, and had obvious language-barrier issues. From the remaining ideas, we narrowed down to three forms of visual communication that best reflect our design hypothesis. These are not to be understood as stand-alone concepts, but as building blocks that can be combined for compound effects.

Communicating an autonomous car's intent to pedestrians through front brake lights
Front brake lights.

The brake lights of current vehicles are relocated to signal to pedestrians, rather than to other cars, when the car is braking.

Communicating an autonomous car's intent to pedestrians through a traffic light ring
Traffic light ring.

The traffic light ring expresses crossing instructions to the pedestrian through the use of color and animation similar to current traffic lights. We found more users understood what the vehicle was doing when we leaned on the established traffic signal behavior and colors. A more nuanced and novel signal language can be revisited as people become more comfortable with autonomous vehicle behavior and the resulting infrastructure.

Communicating an autonomous car's intent to pedestrians through an integrated walk sign
Integrated walk sign.

‘Walk’ and ‘do not cross’ symbols clearly tell the pedestrian when they should or shouldn’t cross. The symbols are legible from a distance and eliminate any remaining ambiguity. Lastly, pictographs stand a far stronger chance of being understood across cultures.

What’s next?

We want to expose even more people to these scenarios. Stay tuned, as we will be posting our findings and evolving our designs and prototype.