One of the most powerful aspects of driving a car is the feeling of independence and control: being able to go anywhere at any time, listen to whatever you like, and have a space that is your own. Autonomous cars have the potential to bring these joys of driving to a wider set of people without their having to drive.
​
To approach this challenge, we built on the well-established area of accessible interface design for smartphones.
We are currently developing the first versions of a design system for control and communication that builds on the accessibility features of the mobile phone.
Interface Features Walkthrough
Accessible Onboarding
Choose your accessibility requirements as you sign into the app
Order A Ride
Order a ride from marked accessible locations
Verify your ride and destination with scan or voice (for visual impairment)
Choose seating side and deploy ramp if needed
During A Ride
Get notified of ride status updates
Control home page and seating controls
More controls, also activated by voice
Dropping Off From Ride / Next Destination
Set pickup location, next destination or end
User feedback
Inclusive Design for Autonomous Vehicles
Research Study for Department of Transportation Design Challenge
Timeline
Tools Used
February - December 2021
Figma
Skills
User Research
User Testing & Interviews
Collaborative Designing
Storyboarding
Wireframing
Collaborators
Nik Martelaro and research team
Roles
UIUX Designer
Researcher
The Challenge
How might we design vehicle transportation interfaces to be more accessible to all?
Designing interfaces for accessible autonomous ride-share experiences
Autonomous vehicles point toward a future where more and more modes of transportation, including our personal vehicles, drive themselves. We want to extend this powerful experience of driving to more people by creating an interface designed with people living with disabilities in mind.
​
Oftentimes, people don't realize a design is inaccessible because they lack exposure to, and therefore knowledge of, disability. As a designer on this team, I hoped not only to learn more about designing for accessibility, but also to raise awareness of the importance of creating products that account for differences between people.
​
I worked with researchers to conduct interviews with people in our local community who had experience with disability, analyzed the findings, and built out the preliminary structure of our interface.
Research
Community Interviews
Every month we held a community meeting where we talked with our users and asked for feedback
Communicating with the research team
We gathered information on the accessibility features already available on major phone operating systems such as iOS and Android
Using this information, we mapped these accessibility features onto various usage scenarios
​
Journey mapping
What users are currently experiencing
As a design team, we collectively created journey maps of what users with disabilities currently experience in different contexts, based on our research, to build a holistic understanding of what users would want. The maps also helped us identify the points in a trip where our interface could intervene and significantly improve an accessible autonomous car experience.
While the research team focused on collecting data from external sources and organizing it into textual form, I translated that information into visual scenarios to better understand the situation and what users might want, or what problems they might encounter, during a car ride.
Together with the researchers, I developed a series of use cases to more fully understand what our interface needed, and we interviewed others to gather feedback. The researchers compiled our process into a spreadsheet as well as a guideline document for designing the interface.
Prototyping Interface
As I created more scenarios, I began translating them into voice flows, which I used to guide the design of the initial interface.
​
Initial frames for onboarding
First time user selection for personal autonomous car screens
Transitioning to car-share interface instead of personal car with more user feedback
After talking with more users, our team shifted from designing for a personal autonomous car to a car-share service, since most interviewees expected car-sharing to be the more common and widely used option, largely because it is more affordable.
With shared-use cars, we would need to ask more questions at the beginning of travel to determine which accessibility features a rider requires, such as the size of their wheelchair. We therefore later created a series of required onboarding questions.
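To make this concrete, here is a minimal sketch of how an onboarding profile might drive those follow-up questions. All type names, fields, and question wording are illustrative assumptions, not the project's actual data model.

```typescript
// Sketch of an onboarding profile for a shared autonomous vehicle.
// Field names and question text are illustrative, not from the project.

type AccessibilityNeed = "wheelchair" | "low_vision" | "hearing" | "cognitive";

interface OnboardingProfile {
  needs: AccessibilityNeed[];
  // Asked only when relevant, e.g. for wheelchair users.
  wheelchairWidthCm?: number;
  preferredSeatingSide?: "left" | "right";
  needsRamp: boolean;
}

// Derive the follow-up questions a shared vehicle must ask,
// given the needs selected during sign-in.
function followUpQuestions(needs: AccessibilityNeed[]): string[] {
  const questions: string[] = [];
  if (needs.includes("wheelchair")) {
    questions.push("What are the dimensions of your wheelchair?");
    questions.push("Should the ramp deploy automatically at pickup?");
  }
  if (needs.includes("low_vision")) {
    questions.push("Enable voice narration of ride events?");
  }
  if (needs.includes("cognitive")) {
    questions.push("Show a step-by-step log of the trip?");
  }
  return questions;
}
```

Deriving the questions from the selected needs keeps onboarding short: riders only answer questions relevant to them.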
Emphasize allowing users to visualize the status of the car
I suggested step-by-step notifications of the vehicle's status for people living with sight impairment. The user can listen to and visualize where the car is and when it is approaching, so they feel in control during the entire process.
​
A log at the bottom lists the processes and steps so that people living with cognitive impairment, or anyone who wants to confirm that everything is proceeding accurately, can refer back to it.
I emphasized the redeploy button because, in our interviews, people who regularly used ramp-equipped vehicles often had to redeploy the ramps due to how finicky they were.
​
During the ride
In a car-share scenario, users wondered whether there was a way to indicate that a drop-off/pick-up zone was accessible. This was a common problem our users faced, so I explored letting users rate drop-off/pick-up zones and view a live satellite view of the area.
​
Clear and consistent communication of travel
Knowing that different users rely on either sound or visuals to input and understand information, I designed the navigation page to be as visual as possible at major points in the process, such as showing the car in real-time space and having voice narration announce significant events in the journey.
​
In our interviews, I found that users want to know what is happening during the ride, but they also don't want to be overloaded with information. Striking a balance in what the voice feedback includes, and what it leaves out, was an important aspect I focused on exploring.
​
Important events that should trigger voice updates include:
• Traffic, delays in travel time
• Changes in road condition (series of speed bumps, etc.)
• Rerouting
• Emergencies (low fuel, malfunction, etc.)
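The filtering described above can be sketched as a simple rule: the four event categories are announced aloud, while routine progress updates stay visual-only. The event names and message wording below are assumptions for illustration, not the project's specification.

```typescript
// Sketch: filter ride events for voice announcements so riders hear
// important updates without information overload. Event kinds and
// message text are illustrative assumptions.

type RideEvent =
  | { kind: "traffic_delay"; extraMinutes: number }
  | { kind: "road_condition"; detail: string }   // e.g. a series of speed bumps
  | { kind: "reroute"; reason: string }
  | { kind: "emergency"; detail: string }        // e.g. low fuel, malfunction
  | { kind: "minor_progress" };                  // routine updates stay visual-only

// Returns the spoken message for important events, or null for
// routine events that should not interrupt the rider.
function announcement(event: RideEvent): string | null {
  switch (event.kind) {
    case "traffic_delay":
      return `Traffic ahead. Arrival delayed about ${event.extraMinutes} minutes.`;
    case "road_condition":
      return `Heads up: ${event.detail}.`;
    case "reroute":
      return `Rerouting: ${event.reason}.`;
    case "emergency":
      return `Attention: ${event.detail}. The vehicle is responding.`;
    default:
      return null;
  }
}
```

Keeping the filter as a single function makes it easy to tune the balance between informing and overloading as user testing continues.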
​
At any time during the ride, the user can also use their voice or tap the controls tab to change conditions in the car, such as the seat position and temperature.
​
Ending the ride
At the end of the ride, just as during the ride, the user is updated on where they are being dropped off. This includes a live view of the area to help them understand the outside conditions, as well as an update on the environment and weather, which I learned is especially relevant to wheelchair users.
​
The interface then confirms where the user would like to be picked up next and remains parked in case of any mistakes.
​
Mitigating user mistakes
A recurring topic in our discussions was how to handle mistakes when they happen. In my interface concepts, I therefore focused on building in points for the user to intervene at crucial moments. For example, if the destination the car arrives at is not what the user wants, they can tap or tell the interface to reroute, stay parked, or return to their original location (also accessible at any time during the ride).
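The arrival-correction options above can be sketched as a small command handler over the ride state. The command names and state shape are hypothetical, chosen only to illustrate the interaction.

```typescript
// Sketch of the arrival-correction commands: if the destination is
// wrong, the rider can reroute, stay parked, or return to the origin,
// by tap or voice. All names here are illustrative assumptions.

type ArrivalCommand = "reroute" | "stay_parked" | "return_to_origin";

interface RideState {
  status: "parked" | "driving";
  destination: string;
  origin: string;
}

function handleArrivalCommand(
  state: RideState,
  cmd: ArrivalCommand,
  newDestination?: string
): RideState {
  switch (cmd) {
    case "stay_parked":
      return { ...state, status: "parked" };
    case "return_to_origin":
      return { ...state, status: "driving", destination: state.origin };
    case "reroute":
      // Stay parked until the rider has confirmed a new destination.
      return newDestination
        ? { ...state, status: "driving", destination: newDestination }
        : { ...state, status: "parked" };
  }
}
```

Treating each correction as an explicit command keeps every recovery path reachable by both touch and voice, matching the multimodal design goal.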
Creating a user community
While talking with users with disabilities in our community, I realized that each person was passionate about the topic and had many personal experiences and ideas they wanted to share. As our interface becomes further developed, it would be valuable to foster a community where users can interact with each other, share information, and help each other with autonomous-vehicle-related problems.
Refinement
Consolidating designs with cross-functional teams
We collaborated with the research team to create a finalized design system for designers and researchers to be on the same page. This included explanations of the reasoning and importance behind chosen colors, fonts, sizes, consistent user interactions and more.
Reflection & Next Steps
My major takeaways
• I really appreciated how our advisor gave us control over how we managed our work and our teams. This let me practice my leadership and organizational skills, working in smaller teams while contributing to the larger one.
​
• This was such a large project, and I am very lucky to have joined the team at an early stage, so I was able to help guide the project from the beginning to a well-designed state. I learned the full process of how communication designers take an app interface from concept to hand-off to developers.
​
• I really enjoyed how closely we interacted with community members as well. I learned how to lead interviews, debrief, and take away key insights for improving an interface design.
Interface implementation & further design refinement
• Mapping the voice flow with user interactions for interface
• Refinement of user interface design to be more succinct and for better flow
• Testing with our users of the current prototype
• Begin developing the interface in collaboration with developers