Mobility
The driverless data conundrum.
Over the past few decades, Teague has designed at the intersection of travel and technology, helping bring both physical and digital solutions to life.
In recent years, consumer data privacy has been cropping up more and more as something we're asked to help tackle, or at least stay cognizant of, as we design new products. It's become something of an obsession for our team.
For Seattle Design Festival, we gathered a panel of experts from Microsoft, Glympse, HERE Technologies, and Mighty AI to discuss consumer data privacy as it relates to AI and autonomous data collection, and to find out how they and their companies are tackling this massive, industry-shifting challenge.
The wrench in the cogs: GDPR
Diving into the difficulty of sorting through the massive amounts of data collected not only by the companies represented but by businesses worldwide, the panelists discussed the challenge of adapting their business and data-collection policies to the newly introduced GDPR regulations.
“It’s a real challenge for us as far as trying to protect pedestrian identity while still providing the level of high-quality, labeled data where we have to be pixel-accurate around outlining someone on the sidewalk,” states Daryn Nakhuda, Co-Founder and CEO of Mighty AI, when discussing how his technology, while training systems to track the alertness of drivers, also picks up data on passing civilians who have no idea they’re being recorded. “This means being able to see them in a recognizable way, and so we built some technology to blur faces so we can detect pedestrians and blur the parts that might be recognizable.”
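Mighty AI hasn't published the internals of that blurring pipeline, but as a rough sketch of the general idea, the snippet below uses OpenCV's bundled Haar-cascade face detector to blur only the detected face regions of a street-scene frame, leaving the rest of the pedestrian visible for pixel-accurate labeling. The file names and blur parameters here are illustrative assumptions, not Mighty AI's actual values.

```python
# Minimal sketch (not Mighty AI's actual pipeline): detect faces in a street-scene
# frame and blur them so pedestrians stay detectable but not recognizable.
import cv2

# OpenCV ships a pre-trained Haar cascade for frontal faces.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame.jpg")  # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect candidate face regions; these parameters are illustrative defaults.
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Blur only the face region, leaving the pedestrian's outline intact
    # so downstream labeling and detection can remain pixel-accurate.
    roi = frame[y:y + h, x:x + w]
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

cv2.imwrite("frame_anonymized.jpg", frame)
```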
“The point of being able to have a central authority for your data store, but also have controls around the scope, who gets to use it, and for how long, is really important,” adds Cyra Richardson, General Manager for AI and IoT at Microsoft, though representing her own views that evening. “A lot of times we think about this as just being your license plate or your face, but really, if there's a series of data that may not be intuitively obvious as uniquely identifiable, there may be a constellation of data that becomes uniquely identifiable.”
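None of the panelists described a concrete implementation that evening, but as a purely hypothetical sketch of what Richardson's "controls around the scope, who gets to use it, and how long" could look like when attached to records in a central data store, something like the structure below might serve; the names (DataGrant, allows) are invented for illustration.

```python
# Hypothetical sketch (not any panelist's actual system): a grant attached to a
# stored dataset capturing scope, authorized users, and retention, so a central
# data store can arbitrate every access request against it.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Set

@dataclass
class DataGrant:
    scope: str                  # what the data may be used for
    authorized_teams: Set[str]  # who gets to use it
    granted_at: datetime
    retention: timedelta        # how long it may be kept

    def allows(self, team: str, now: datetime) -> bool:
        # Access is permitted only for listed teams and within the retention window.
        return team in self.authorized_teams and now < self.granted_at + self.retention

grant = DataGrant(
    scope="pedestrian-detection-training",
    authorized_teams={"labeling", "perception"},
    granted_at=datetime(2018, 9, 1),
    retention=timedelta(days=90),
)
print(grant.allows("perception", datetime(2018, 10, 1)))  # True: listed team, inside window
print(grant.allows("marketing", datetime(2018, 10, 1)))   # False: not an authorized team
```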
The implications of GDPR, all panelists noted, aren't limited to Europe and its residents or businesses; they reach worldwide. Chris Ruff, CEO and President of Glympse, gave an example of working with a European company that was setting up a roadside assistance service in Mexico; even there, ticking the GDPR compliance box was required.
“That's the biggest challenge: setting one level of security that may not be a global standard yet. Regardless of where the data is coming from, you're now trying to keep your company in the right position liability-wise, risk-wise, and so on. The standards we must meet are all different in different parts of the world,” he continued.
Companies, governments, consumers: who’s responsible?
When it came to responsibility for data collection and usage, the panelists discussed a future where governments, companies, and individuals themselves would all need to take responsibility.
“To me, it really reminds me of a lot of the tax laws,” Richardson notes, regarding the different rules and regulations across the globe. “When you start working in commerce, you see all these different tax rules about where the transaction originated and where it cleared, and so I think we're going to start to see something really similar in the privacy space. And it really speaks to this need to have a really flexible platform that lets you arbitrate lots of different aspects: scope, who can look at it, who can use it, how you can consume it. I'm expecting to see some of this legislation in the U.S. sooner rather than later as well,” she concludes.
Rohan Thomas, Global Director of Partnerships for HERE Technologies, argues that companies need to be fully responsible for the data they're collecting and managing, and to relieve the burden on users, who are starting to experience fatigue in this arena.
“When we design a product in today's world, what’s easier?” he asks. “To give a consumer a way to go and control all their data and show them how to switch it off, or for them to just go into settings and say ‘turn off all location data all the time’? Privacy is such a cumbersome, burdensome topic that is becoming so convoluted that people are now saying, ‘Hey, I'll just switch off everything rather than fine-tune it.’”
So, how do we get there?
Our final question, which our panelists (and the world) could only speculate on, was, “How do we get there?” As a society, how do we start to formalize these policies in a natural way? What does the United States really want when it comes to netting out the whole privacy policy issue?
“From my perspective as an American,” notes Nakhuda, “what I would like is transparency and the ability to control my destiny. I have to understand the benefit, so I think it's important to say, ‘Here's what we're tracking and why,’ and then I could decide, ‘Is that worthwhile to me? Is there a value?’ Certainly, there are things beyond that that are for the greater good.”
“I have a kind of schizophrenic answer to this,” chimes in Richardson. “On the one hand, I think about all the data that we have in medical records. If we could harvest the data in medical records, we could test drugs before they've gone out. Think about the $866 billion a year we pay for chronic disease management; imagine using data to cut that in half.
“But then the leftist inside of me says, ‘A lot of companies are getting free rides. They're using our data and they're not paying us for that damn data.’ Imagine Google giving us a share of every search that our data is included in, like a rerun of The Beverly Hillbillies; if they get residuals, I want my residuals. I have this whole range of reactions, and I think if I had something like a credit card where I could release my data, track who's using it, but also get paid each time it's getting used, that'd be a great design project.”
“Younger generations, they'd do that in a heartbeat,” agrees Ruff. “If the trade-off is there, older people may still question what it’s doing or what’s really happening, but younger generations will sell or trade that for current gain very quickly today. They see the world more openly than any of us did when we started our lives without even computers.”
“In a study that HERE did with 8,000 people in eight countries, one of the findings was that only 20% were comfortable enough to say, ‘I control my own data,’” states Thomas. “But what was interesting was that 70% said, ‘I am willing to share my data if there is a way for me to know who is using it, how I get control of it, how I own it, and how I have a say in who uses it and how.’
“Something we think about often is privacy-as-a-service. If you have that kind of information available, designed in a consumer-friendly way, I think that would be a good start,” he concludes.
Design and transparency hold the key.
By the end of the evening, it was clear our panelists agreed on one thing: The future of consumers feeling comfortable with their data being harnessed and used by companies or governments begins, fundamentally, with transparency and great design. We couldn’t have said it better ourselves.
You can watch the full recording of the evening’s discussion above, or here on Vimeo. And, if you want to learn more about Teague’s ever-evolving thoughts on data and design, you can get in touch with our team at hello@teague.com.