
On-road Driver Support

Amazon Logistics

Summary

While delivering customer parcels, Amazon Logistics drivers contact Driver Support to resolve delivery issues. Resolving a single contact could require an associate to switch among many separate systems; the new Driver On-road Support UX integrates those systems into a single UI, reducing contact handling times and getting drivers back on the road faster.

Design team

  • Solo designer (me)

My role

  • Product definition
  • Qualitative user research
  • Quantitative user research
  • Workflow analysis
  • Interaction design
  • Prototyping

TL;DR


WHY: The problem

Drivers call in for help delivering packages. A customer service associate must use 3 to 14 different systems to resolve a single driver contact, all while the driver waits on the phone.


HOW: The process

Greenfield development: the Amazon Logistics customer service organisation was formed in January 2017.

Research-driven design: time-on-task studies, stakeholder interviews, customer service associate (CSA) feedback on initial sketches, concept studies on a mid-fidelity prototype, and usability testing on a high-fidelity prototype.


WHAT: The solution

Single-page UI optimized for the top 5 use cases (~80% of contacts in 2016).

Problem

Like many internal tools, the potential value lay in improving efficiency and reducing operating costs.

In 2016 the Last Mile Transportation Operations Center (TOC) handled 4.5MM phone contacts, and capacity planning forecast year-over-year growth of 151%, which the existing systems and staff could not have absorbed. At the beginning of 2017 the Amazon Logistics Customer Service organisation was created to support drivers and recipients.

Customer service associates used as many as 15 different systems to handle an on-road driver contact. Not only did context switching increase the associate's cognitive load, but the systems themselves were slow: switching to a new UI could add as much as 90 seconds.

Based on time-on-task studies conducted by leaders in the call center, a single UI that eliminated transition times between tools would save ~3,100 work hours in 2017.
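The study's arithmetic isn't shown here; as a rough illustration of the shape of such an estimate, the sketch below multiplies per-contact savings by contact volume. Both inputs are hypothetical placeholders chosen to land near the cited figure, not numbers from the actual study.

```typescript
// Shape of a transition-savings estimate: seconds saved per contact x volume.
// Inputs are illustrative placeholders, NOT the team's actual study data.
function workHoursSaved(affectedContacts: number, secondsSavedPerContact: number): number {
  return (affectedContacts * secondsSavedPerContact) / 3600;
}

// For example, ~370,000 affected contacts each saving ~30 seconds of tool
// transitions works out to roughly 3,100 hours:
console.log(workHoursSaved(370_000, 30)); // ~3083
```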

Personas

Cecelia, an Amazon Logistics Customer Service Associate (CSA)

Pain points

Cecelia must use from 3 to 14 different systems to resolve a single driver contact.

Each tool has a separate UI, and finding the necessary information in a tool can take 9-90 seconds per task.

Moving between applications adds 10-90 seconds to the contact for each transition.

Raul is an Amazon Logistics driver

Pain points

He has to stay on the call for 3-5 minutes or more when he has a problem with a delivery.

Every delay makes it harder for him to meet his delivery quota.


Frank, an Amazon customer whose purchase is being delivered by AMZL

Pain points

In the past some of his deliveries have been late, delayed by a day or more.

Scenario

Unable to Access (UTA) is one of the top 5 reasons a driver contacts on-road support. UTA means the driver cannot physically access the delivery location (e.g., a secure apartment building).

Walkthrough

Before

To help visualise the difference between before and after, I've created a journey counter: each system has a circle, and the numbers underneath represent the pages visited in that system. I chose UTA because it's the simplest Driver contact use case.

This is the screen in Customer Service Central (CSC) that loads for Cecelia when a new contact is routed to her. She asks Raul why he's calling (it could be one of more than 20 reasons).

She clicks on the tracking number,

which opens a new tab in the Package Search tool (PSUI), showing that transaction.

This system doesn't have any driver profiles, so she'll have to wrap up the contact under the customer's account. She copies the tracking ID and returns to the CSC.

She searches by the tracking ID to find the related order.

If contacts aren’t auto-authenticated (via a logged-in account on the web or in the mobile app) they must be manually authenticated before any customer service associate can see details.

But because she’s speaking with the driver, not the customer, she first has to bypass authentication. (I’m sure you can imagine how much the Chief Security Officer’s staff loves this standard procedure).

Now CSC loads the customer's order detail page for that shipment. Cecelia knows that Raul can't get into the building, so she switches back to the PSUI

and looks for the notes field, which might contain access codes,

then views the customer notes.

There are codes in the notes! She tries each code with Raul, and neither works. Time to try another tool…

This scenario requires Cecelia to access 12 different screens in 3 separate systems, which, excluding time on the phone with Frank, took approximately 3.5 minutes. This is the simplest Driver contact use case.

After

We’ll walk through the exact same UTA scenario, using the same journey counter.

This new UI is integrated into the CSC. The package tracking ID is passed from the Driver's app, based on the delivery that was active when they tapped the "get help" button, and is displayed immediately: Cecelia doesn't have to search for the package. Each Driver has their own profile, so authentication is automatic.
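The integration mechanics aren't described in detail, but the hand-off implies the driver app attaches contact context when "get help" is tapped. A minimal sketch of what that payload might look like, with hypothetical field names (not Amazon's actual API):

```typescript
// Hypothetical context the driver app could attach to a "get help" contact,
// letting the support UI load pre-authenticated with the active delivery.
interface DriverSupportContext {
  driverId: string;        // known driver profile, so authentication is automatic
  trackingId: string;      // the delivery active when "get help" was tapped
  contactReason?: string;  // e.g. "UNABLE_TO_ACCESS"
}

// The support UI can render immediately: no package search, no auth bypass.
function loadContact(ctx: DriverSupportContext): void {
  console.log(`Shipment ${ctx.trackingId} for driver ${ctx.driverId}`);
}
```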

The most current available access code is displayed on load.

Additional codes are a single click away. Remember, this is the exact same scenario, so Cecelia needs to contact Frank to get an updated access code.

Cecelia clicks on the call link,

which triggers a one-click outbound call accelerator, prepopulated with Frank's telephone number.
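The case study doesn't say how the accelerator dials; the real tool presumably went through the contact-center telephony stack. As the simplest web-flavored stand-in, a pre-populated tel: link captures the one-click idea:

```typescript
// Minimal click-to-call sketch: a link whose number is already filled in,
// so dialing is a single click. Name and number are made up.
function renderCallLink(name: string, phone: string): HTMLAnchorElement {
  const link = document.createElement("a");
  link.href = `tel:${phone}`;
  link.textContent = `Call ${name}`;
  return link;
}

document.body.appendChild(renderCallLink("Frank", "+1-555-555-0123"));
```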

As she's talking to Frank, Cecelia switches to edit mode

and edits the access code in place. She saves the changes while switching over to give Raul the new code.
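What makes this step fast is that the save doesn't block the conversation: Cecelia fires the update and turns back to Raul. A sketch of that non-blocking save, with a hypothetical endpoint and payload (not the actual CSC API):

```typescript
// Hypothetical non-blocking save of the edited access code.
async function saveAccessCode(trackingId: string, code: string): Promise<void> {
  const res = await fetch(`/api/shipments/${trackingId}/access-code`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ code }),
  });
  if (!res.ok) throw new Error(`Save failed: ${res.status}`);
}

// Fire and monitor: the associate keeps talking while the save completes.
saveAccessCode("TBA123456789", "4321#").catch(console.error);
```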

Success – Raul can get into the building and the contact is complete. Cecelia completed the task on a single screen. In usability testing, a simulated contact using the prototype averaged only 1 minute 30 seconds, less than half the average real-life UTA contact.

Process

Discover

Foundational user research

To add context to the quantitative time-on-task studies from the call center operations team, I conducted foundational qualitative research: driver ride-alongs, listen-ins and observations with customer service associates, and associate interviews.

Process mapping

As part of my analysis I mapped the processes for the top 5 use cases. In the UTA process I found a segment (the outbound call) that took 7 steps and 30-45 seconds. The solution was a single click.

Quantitative analysis

For each of the top 5 use cases I did a deeper quantitative analysis (based on activity logs) to identify key opportunities for intervention. System performance was a common issue, but its impact varied by use case; the systems used in UTA contact resolution were particularly slow.

Ideate

Product strategy

As a project team we identified the most important flaws in the system, and then brainstormed the characteristics of a good system. The product manager and I turned that brainstorm into a today-tomorrow statement. Our new UI would be fundamentally designed for customer service associates, with a single UI that integrates the 15 backend systems and uses smart defaults and accelerators to reduce time on task.

UX strategy and design constraints

To keep the team aligned and focused I explicitly defined the attributes for the new system as well as the design constraints. Because the platform, tech stack, and design system (AUI) were hard constraints, this project was all about innovating and optimising while staying inside the lines.

Collaborative explorations

As a team we did many rounds of whiteboard sessions, from analysis of data elements to rough wireframes.

Card-based architecture

We landed on a card-based UX architecture focused on the three pieces of information most important to the customer service associate:

  • Who: Driver banner is the contextual anchor at the top of the screen
  • What: Shipment details are the core card at the center of the screen
  • What’s next: The Itinerary section at the bottom of the page

Although it was not yet a pattern used in the Customer Service platform, we chose a card-based framework (sketched in code after this list) because:

  • It's common: cards are a well-understood pattern, increasingly used in other Amazon tools, particularly the retail mobile app
  • It's flexible: it provides a consistent framework while permitting variations in content type and display
  • It matched our users' mental model: cards imply fluidity of display, which allowed us to optimise screen real estate
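As that sketch, the page can be modeled as an ordered set of typed cards; the names and fields below are hypothetical, not the production AUI components.

```typescript
// Hypothetical model of the card-based page: a contextual anchor (who),
// a core card (what), and a forward-looking section (what's next).
type Card =
  | { kind: "driverBanner"; driverName: string; vehicleId: string }      // Who
  | { kind: "shipment"; trackingId: string; accessCodes: string[] }      // What
  | { kind: "itinerary"; remainingStops: number; nextAddress: string };  // What's next

// The fixed top-to-bottom order mirrors the associate's priorities, while
// each card varies its own content and density.
const page: Card[] = [
  { kind: "driverBanner", driverName: "Raul", vehicleId: "VAN-042" },
  { kind: "shipment", trackingId: "TBA123456789", accessCodes: ["1234#"] },
  { kind: "itinerary", remainingStops: 12, nextAddress: "123 Main St" },
];

console.log(page.map((card) => card.kind).join(" / "));
```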

Iterate

CSA feedback

Using high-fidelity prototypes, we iterated based on feedback from customer service associates. Our constant question to them was "What can we remove?" For example, associates found photos distracting and were concerned they could lead to unconscious bias.

The evolution of the shipment card

The shipment card had to accommodate all 9 delivery methods: Parcel (AMZL), with sub-types for single-family and multi-family dwellings; Locker; Prime Now; Restaurant; Fresh; Hub lockers; In-home; and In-car.

This version of the card has shipment split into order details and delivery status.

We learned that order details weren't necessary: they were only ever used for Restaurant and Fresh orders, each less than 2% of contacts.

We expanded and enriched the shipment card. Then we learned that tracking details were only used as a reference and should be collapsed: still available, but keeping attention on the shipment card.

For clarity (both for developers and stakeholders), I refactored the prototype UI to match the latest version of the design system, Amazon UI (AUI).

We continued exploring levels of detail and placement for each data element, including elevating the detail about the delivery location type.

Build

Collaborative roadmap

This is what our final design looked like. We then sat down together and prioritised implementation based on user value and technical feasibility.

Outcomes

Six weeks after the project kicked off, the product went into A/B testing with 50 associates (about 15% of the team). After a week, those associates had seen nearly a 50% increase in efficiency, and the tool was so successful that it was rolled out to the rest of the team 3 weeks early.