Mobility
Prototyping in-car user experiences in virtual reality.
We have been working on connected and autonomous car projects at TEAGUE for a while now, and we have found physical prototypes indispensable for testing our designs with users.
Building a physical mock-up, equipping it with hardware, and creating a digital infrastructure for the components to work together requires a great deal of work. This exposes a prototyping gap between digital tools and physical mock-ups: sketches and renderings are relatively quick to make and allow for great flexibility and many iterations, but they are hard to use for user testing. Physical mock-ups are great for user testing but are time- and labor-intensive and limit concept iteration.
We decided to build a VR environment for autonomous car prototyping to bridge this gap and complement our existing physical approach. Unity, with its graphical management of 3D models, interactivity, and code, is the perfect tool for us: it lets people with different skill sets collaborate within one program. It also supports live editing, which can be extremely useful in prototyping sessions.
Car Model & UI.
We built a simple car model with a focus on the interior. While we wanted it to feel like a car, we also wanted it to be as simple as possible to allow maximum flexibility in the prototyping phase. Simple forms and minimal colors created a car that is more a canvas for future ideas than a finished product. It includes the three screens typically found in cars: dashboard, central screen, and HUD, each with simple content (for now).
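To give a sense of how little is needed for that "simple content", here is a minimal sketch of a dashboard speed readout in Unity. The names are placeholders, and we assume the car exposes its speed through a Rigidbody; the central screen and HUD can be fed the same way.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of "simple content" on the dashboard screen:
// a Text element on a world-space canvas showing the car's current speed.
// (Hypothetical names; assumes the car's motion comes from a Rigidbody.)
public class DashboardSpeed : MonoBehaviour
{
    public Rigidbody carBody;  // the moving car
    public Text speedLabel;    // Text on a world-space canvas in the dashboard

    void Update()
    {
        float kmh = carBody.velocity.magnitude * 3.6f; // m/s -> km/h
        speedLabel.text = Mathf.RoundToInt(kmh) + " km/h";
    }
}
```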
Endless highway.
It is challenging to give test subjects a sense of motion in physical mock-ups, since they are typically stationary. We used the Curvy plugin to build a generative, endless highway for the autonomous car to drive along. While there is no traffic (yet), the car moves through a repeating landscape, which creates the sense of an ongoing journey rather than a carousel ride.
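The Curvy-based implementation is specific to our project, but the underlying idea is simple tile recycling. Here is a minimal, plugin-free sketch, assuming straight road tiles of fixed length; the class and field names are placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of the endless-highway idea: the car drives forward along +z
// while a small pool of road/landscape tiles is recycled from behind it to
// the front, so the highway never ends. (Our actual platform builds the road
// from splines with the Curvy plugin; this only illustrates the recycling.)
public class EndlessRoad : MonoBehaviour
{
    public Transform car;          // the moving car rig
    public GameObject tilePrefab;  // one straight stretch of road + landscape
    public float tileLength = 100f;
    public int tileCount = 5;

    private readonly Queue<GameObject> tiles = new Queue<GameObject>();
    private float nextSpawnZ;

    void Start()
    {
        for (int i = 0; i < tileCount; i++)
        {
            tiles.Enqueue(Instantiate(tilePrefab, new Vector3(0f, 0f, nextSpawnZ), Quaternion.identity));
            nextSpawnZ += tileLength;
        }
    }

    void Update()
    {
        // Once the oldest tile is a full tile length behind the car, move it to the front.
        GameObject oldest = tiles.Peek();
        if (car.position.z - oldest.transform.position.z > tileLength)
        {
            tiles.Dequeue();
            oldest.transform.position = new Vector3(0f, 0f, nextSpawnZ);
            nextSpawnZ += tileLength;
            tiles.Enqueue(oldest);
        }
    }
}
```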
Once the basic platform was complete, we held a few informal testing sessions with designers in the studio. While we collaborated with each participant to redesign the interior in an interactive VR brainstorming session, we were really looking for missing elements of the platform that could improve the process, and we made changes after every session to incorporate what we learned. Not only was the overall experience fun, we also learned a great deal, both about autonomous cars and about prototyping them in VR. What follows are our main learnings about prototyping in-car experiences in VR. (We will dive into the autonomous car learnings in a separate article.)
Live modification.
Unity allows for live changes while the program is running, which makes real-time experimenting with different physical layouts easy and effective. “What if this seat was over there?” — “Let’s just move it and find out!”
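Dragging objects in the Scene view while the program runs covers the designer outside the headset; to let the person inside the headset move things too, a small grab script is enough. A minimal sketch, assuming a tracked controller transform from your VR rig, a "Movable" tag on the props, and "Fire1" standing in for the SDK's trigger input:

```csharp
using UnityEngine;

// Minimal sketch: let the person in the headset grab and reposition props
// (seats, screens, ...) with a tracked controller. How you read the
// controller pose and trigger depends on your VR SDK; both are assumed here.
public class GrabAndMove : MonoBehaviour
{
    public Transform controller;       // tracked controller transform (from the VR rig)
    public float grabRadius = 0.15f;   // how close the controller must be, in meters

    private Transform held;

    void Update()
    {
        bool triggerDown = Input.GetButton("Fire1"); // stand-in for the SDK's trigger

        if (triggerDown && held == null)
        {
            // Grab the first movable prop within reach.
            foreach (var hit in Physics.OverlapSphere(controller.position, grabRadius))
            {
                if (hit.CompareTag("Movable"))
                {
                    held = hit.transform;
                    held.SetParent(controller, worldPositionStays: true);
                    break;
                }
            }
        }
        else if (!triggerDown && held != null)
        {
            // Release: drop the prop where it is.
            held.SetParent(null, worldPositionStays: true);
            held = null;
        }
    }
}
```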
3D sketching.
In many cases, a designer would say, "What if there was something here?" while waving at the approximate area. So we wrote a simple script that lets the designer sketch their ideas directly in space.
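Our script adds brush and color controls on top, but its core fits in a few lines: while the trigger is held, drop points at the controller tip and render them with a LineRenderer. A minimal sketch, with "Fire1" again standing in for the SDK's trigger input:

```csharp
using UnityEngine;

// Minimal sketch of the in-space drawing tool: while the trigger is held,
// points are added at the pen tip and rendered as a stroke.
// (Hypothetical names; the real script layers color and width controls on this.)
public class SpaceSketcher : MonoBehaviour
{
    public Transform penTip;            // e.g. the controller's tip transform
    public Material strokeMaterial;
    public float strokeWidth = 0.01f;   // 1 cm strokes
    public float minPointSpacing = 0.005f;

    private LineRenderer currentStroke;

    void Update()
    {
        bool drawing = Input.GetButton("Fire1"); // stand-in for the SDK's trigger

        if (drawing)
        {
            if (currentStroke == null)
                currentStroke = StartStroke();

            // Only add a point once the pen has moved a little, to keep strokes light.
            Vector3 last = currentStroke.positionCount > 0
                ? currentStroke.GetPosition(currentStroke.positionCount - 1)
                : penTip.position + Vector3.one; // guarantees the first point is added
            if (Vector3.Distance(last, penTip.position) > minPointSpacing)
            {
                currentStroke.positionCount++;
                currentStroke.SetPosition(currentStroke.positionCount - 1, penTip.position);
            }
        }
        else
        {
            currentStroke = null; // finish the stroke; the line stays in the scene
        }
    }

    LineRenderer StartStroke()
    {
        var go = new GameObject("Stroke");
        var line = go.AddComponent<LineRenderer>();
        line.material = strokeMaterial;
        line.startWidth = line.endWidth = strokeWidth;
        line.useWorldSpace = true;
        line.positionCount = 0;
        return line;
    }
}
```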
Quick experiments.
Many questions are more complex to explore than redesigning the physical layout inside the car: "What would it feel like to blur the windows in autonomous mode?" "Could you drive the car from the back seat?" "Would it be weird to see the car from a third-person perspective?" Our platform allowed us to get early insights into these questions with very short turnaround times.
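For the perspective questions in particular, the turnaround is short because an experiment can be as small as this hypothetical sketch: number keys that jump the whole VR rig between viewpoints (driver seat, back seat, a third-person chase position) placed as empty GameObjects in the scene.

```csharp
using UnityEngine;

// Minimal sketch of a quick perspective experiment: the facilitator presses
// a number key and the whole VR rig snaps to a predefined viewpoint.
// (Hypothetical names; viewpoints are empty GameObjects placed in the scene.)
public class ViewpointSwitcher : MonoBehaviour
{
    public Transform vrRig;        // root of the camera rig
    public Transform[] viewpoints; // e.g. [driverSeat, backSeat, thirdPerson]

    void Update()
    {
        for (int i = 0; i < viewpoints.Length && i < 9; i++)
        {
            if (Input.GetKeyDown(KeyCode.Alpha1 + i))
            {
                vrRig.SetPositionAndRotation(viewpoints[i].position, viewpoints[i].rotation);
            }
        }
    }
}
```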
Improvements.
Our current platform and the state of VR are not yet well suited for everything. Passengers normally experience g-forces when a car accelerates, hits a pothole, or takes a turn. In VR you don't get that sensation, which can make it hard to test such scenarios credibly. Even simple contraptions like a manually powered stage could help with that.
Working in VR usually feels like sketching with elephant feet rather than fingers, which can make interacting with small buttons on virtual screens frustrating. We played with the Leap Motion hand rig, but a better solution might be the drumstick approach by Daydream Labs.
The missing depth of field is another challenge: with current VR devices, everything is always in focus, which makes some optical phenomena hard to test. A HUD, for example, exists precisely to display information at a focal distance close to that of the street, so the driver does not have to refocus. Future light field display technologies will make VR a more viable option for these scenarios.
The lack of haptic feedback can confuse a participant's sense of touch and space. We found that matching the virtual seating position with a real low seat (or propped-up feet) and a steering wheel helps overcome the initial disconnect. In the future, we are interested in using our large hot-wire foam cutter to make physical models that mirror the virtual shapes.