by Kathryn McElroy, Creative Director, argodesign
A recent article by John Pavlus, 3 reasons why Tesla’s dashboard touch screens suck, got me thinking about the state of UX design in cars. He concludes that large screens are “on the wrong track,” citing three usability issues from a recent Nielsen Norman Group report. As a product designer who considers UX on a daily basis, I think he’s got it wrong.
Screens in cars are not the problem. Poor UX and UI design is — and that can be changed.
The future of cars, and of nearly all other interactions, is software. It can be iterated on, updated, and streamlined after delivery to improve the experience for all users. It opens the door to more and richer features, no longer limited by the physical size of the dashboard. In Tesla’s case specifically, the interface offers multiple pages of settings that let users fine-tune their driving experience. Cameras give the driver new vantage points and improve safety and parking. As society shifts and users change, software-based interfaces can change in kind, meeting users where they are instead of requiring them to buy the newest model of car.
The alternative to screens and software isn’t knobs; it’s smartphones. Drivers aren’t going back to a fully physical interface. So how might designers make safer interactions easier to use and preferable to the driver’s smartphone? By designing thoughtful software, prioritizing driving as the action at hand, and providing multiple ways to interact with the car beyond just touch.
I’m a Tesla owner and product designer. During the day, I work on future-facing technology and interaction models for augmented reality and wearable tech. To get to that job, I commute 45 minutes each way in my Tesla Model X, which I’ve been driving for the past 2.5 years.
Although I miss some of the haptic feedback and tactility of my previous Toyota, and I have to stretch my short arms to reach some of the buttons on the center console (both issues in the Nielsen report), I believe the interactions with my Tesla have made me a safer driver and more focused on my driving.
I love the features my car affords me: streaming music, direct navigation, and accepting and starting phone calls. In my previous car I performed these tasks on my smartphone. Now I handle all three interactions with only my voice, my eyes glued to the road. And I’m motivated to do it this way because it’s not only safer but easier than using my smartphone.
Drivers expect and demand more features in their cars today than ever before. They will go out of their way to buy the slickest, largest-screened car they can afford. Tesla leads the industry with one of the most intuitive, easy-to-use systems for those features. The car balances multiple ways for users to interact and makes certain tasks easier with safer input methods.
You can tell that all of this has been deliberately designed into the Tesla driving experience. It’s the responsibility of designers to have the best interest of their users in mind as they create an interaction system. Instead of making interfaces based on what users say they want, designers must create them for how users will act in the moment. That means designing the path of least resistance toward safety when it comes to car interfaces. That path must be easier than a smartphone, and the feature set must work better than any app the user already has.
One area the Nielsen report missed is how Tesla offers multiple ways to interact with its controls. The report notes that some of the most-used features, such as the phone app, are buried in an app menu rather than available with a single tap. What it fails to mention is that the user doesn’t need to touch the screen to use those features; they’re available through other means that are far less distracting than the main screen.
Though users can adjust the climate, start a phone call, or engage navigation with the touch screen, they can also use the physical buttons and dials on the steering wheel, their voice, or the smartphone app (if they’re away from the car) to perform these actions.
If a user needs haptic feedback for changing the temperature, they can use the dial on the steering wheel, with the satisfying “click click click” as they dial the temp up and down. And with the visual display on the dashboard instead of the center console, it’s easy to glance down, taking less time away from the road than other interaction methods.
Speech is a huge area of opportunity for designers today. As users become more accustomed to interacting with Siri, Alexa, Cortana and the rest, designers can leverage that familiarity to create safer interactions that don’t require any hand movement or eye-time. As designers incorporate robust voice commands into their car-based interaction models, users will naturally use the touch screens less and learn to lean on this direct interaction.
Tesla has a few things figured out: providing multiple ways to interact, making them unobtrusive, and putting them at the user’s fingertips. As designers, we can take inspiration from this and apply it to both car-driving experiences and other products. First, focus on what the user should be doing in the moment. If it’s driving, make sure that action is prioritized and easy to accomplish. Then make secondary actions quick, direct, and minimally distracting. Finally, ensure that users have a variety of inputs available, including voice, to streamline their tasks and help them stay focused on the primary action.
Designers need to take ownership of creating intuitive systems for their users.
By leveraging the potential of software, we have the opportunity to do this in an iterative way, testing and updating those interfaces for users as capabilities improve. Because if we get it wrong, users already have an easy way to fix the problem: going back to their distracting smartphone.
Kathryn McElroy is an award-winning Creative Director at argodesign, a design and invention agency in Austin, TX, where she envisions the future and develops products and strategies for a wide variety of clients. She is the author of Prototyping for Designers (O’Reilly) and has employed user-centered methodologies to create and iterate on impactful experiences in health wearables, AI interaction patterns, AI image recognition and training interfaces, and cloud development tools, while working on world-class design teams like IBM Watson Visioneering and IBM Mobile Innovation Lab.