In Autonomy We (Don’t Quite Yet) Trust.
In recent years, designers around the world have crafted their versions of an autonomous future. We’ve seen stunning renders of reimagined vehicle interiors depicting people lounging in bed on the way to work, enjoying private dining experiences while cruising around town, or working in mobile conference rooms. But when you remove the human driver, many of the interactions we take for granted disappear. Replacing “Can you drop me on that corner instead?” or “Lyft for Sam, right?” with digital alternatives is a real design challenge that must be solved before we can achieve the luxurious futures painted above.
To take some of those first steps, Intel approached Teague to explore how people would realistically interact with autonomous vehicles five to ten years down the road. Intel’s engineers needed an in-depth creative perspective to understand what sensing and computing capabilities to build into their future platform. They also needed a partner that could make this future tangible by building immersive, evolvable physical and digital prototypes. With decades spent pushing the boundaries of the future of travel, Teague was a natural fit for the challenge.