
Building a body of research for a new ESI dashboard experience

My role: Researcher (final design proposal included)


Note: As a UX researcher at Express Scripts, I performed all parts of the research process: research planning, writing protocols, conducting research, data collection, data synthesis, presenting, and making recommendations to stakeholders. In my work, I apply myriad methodologies including, but not limited to: moderated in-depth interviews, contextual inquiry, diary studies, usability testing, surveys, RITE testing, tree tests, and card sorts. 


Context & The Problem

A business product owner pulled me into an effort to revamp the Express Scripts post-auth dashboard experience. When he brought me in, we had ample time to conduct multiple studies and still meet the deadline for hand-off to the development team.


From the Adobe Analytics data on site visits, we knew that our rate of return visits was very low. We also knew that the majority of returning users were home-delivery members, doing little other than checking their order status. Finally, the current dashboard made mail-order tasks more accessible than benefit-related tasks. Our hope was that by revamping the dashboard, we could:

1. increase the rate of return visits, which could, in turn, lead to more home delivery conversions, and

2. increase members' awareness of ESI's role as a PBM, and associated benefits. 


My Approach

I'd planned to host an assumption-mapping workshop for the research kick-off of Dashboard 2.0, had one not already been conducted by another researcher prior to my joining the project. Collaborative assumption-mapping sessions at the beginning of projects help ensure all stakeholders are aligned on our "knowns" and "unknowns" as well as our priorities. Reviewing our collected assumptions together also helps us collectively identify outstanding questions and form our objectives for a research study.


Sample of collaborative notes (knowns & unknowns) from kick-off meeting with stakeholders

Secondary research

In the assumption-mapping session, some of my stakeholders had posed questions that I knew had been addressed in previous internal research, so I turned to secondary research, compiling the relevant prior studies. I then summarized the takeaways and presented the collected research, highlighting the answers to the questions that had surfaced during the assumption-mapping session. Reviewing the validation of these "knowns" helped us further refine our outstanding questions and objectives.

Image of some secondary research, including a tree test I'd previously conducted to gain insights into users' mental models around the site's information architecture.


From collecting and studying the research already conducted, I was able to sufficiently answer three of my stakeholders' questions:

1. What are methods that members use to research the meds they've been prescribed?

2. What is the threshold of savings that impacts whether cost overrides convenience?

3. What impacts member preferences in terms of home delivery vs retail vs a mix?

To address the questions still outstanding, I knew more than one study would be necessary. Because we had a forgiving timeline, my proposal was:

Phase 1: an interactive dashboard co-design study

Phase 2: a series of interviews with new members to gain insights into their jobs to be done (JTBD)

Phase 3: a diary study with members


Phase 1

For the interactive dashboard co-design study, I recruited 12 participants who:

  • were a mix of ESI members and other health insurance members

  • managed at least 3 long-term medications

  • had searched for either a claim or benefit info online in the last month (I wanted a mix of people who would be not only mail-order-minded but PBM-minded as well)


Our objectives for the dashboard co-design study were:

1. What features are top priority for participants? Which are lowest priority?

2. How confident are users in knowing what they need/want from an online pharmacy and PBM?

3. What's the right amount of "bite-size" and/or "scannable" information to answer questions without the user needing to click deeper?

4. Do members clearly understand what our components represent?

5. What are members' expectations of the search functionality?

6. What types of alerts do members expect to find in situ vs via a communication channel?

I find that a visual activity helps people articulate why they make the decisions they make. For the dashboard co-design study, I collaborated with a designer to create a blank dashboard template and an assortment of features. During each session, I asked the participant to pull components from the right onto the blank dashboard and tell me why each feature was important to have on the dashboard.

Watching and listening to people talk about their preferred features on the dashboard helped us get closer to answering the questions above. 

[Image: dashboard co-design session template]

Together with the team, I created a FigJam board notating the planned research and its correlating objectives.

[Image: FigJam board of planned research and objectives]

[Image: partial view of my notes from 5 participants]

 

I presented my findings in Figma. Pictured is a small portion of my presentation.

[Image: excerpt from my Figma findings presentation]


Significant Actionable Insights

Among our most note-worthy findings were:

1. Participants wanted to be notified of any and all actionable items (with the exception of medications with savings available).
My recommendation: surface all actionable medications on the dashboard; if there are too many to surface, provide a notification or alert informing the member that more actions are hidden.

2. All participants reported Order status and My Meds to be top priorities (even those who don't typically receive medications via mail order).

My takeaway: this research validates what we already do: give both features prime real estate on the site.

3. There was some perception that Order status and My Meds were similar and sentiment that they could be consolidated into one.

My recommendation: we could consider merging these products into one in a future MVP.

4. As expected, high value was placed on the search bar; however, perceptions of its functionality varied.
My recommendation: sample terms (ghost text) in the search field could help mitigate uncertainty about how the search bar can be used.

5. Most participants perceived savings opportunities as "advertising" and did not want them on the dashboard.

My recommendation: if placed on the dashboard for MVP1, members should be able to opt out (remove from the dashboard) for a better experience.


Phase 2

We wanted to better understand members' jobs to be done at all points in their journeys managing medications. But I also wanted to talk to new members about their onboarding experience and initial visits to the dashboard, as I believed these conversations could generate rich insights. I decided to tackle both the JTBD objectives and the new-member onboarding experience in a single in-depth interview study with contextual inquiry.

I worked with an analytics partner to run a query for members who had joined our service on September 1st (2-3 weeks prior to interviews) so that the onboarding journey would be fresh in their minds. I sent a Qualtrics survey to everyone in the list pull and ultimately recruited 6 new members for interviews. I drafted my research proposal, which included the research goal, objectives, recruitment details, materials, and protocol.

[Images: excerpts from my research proposal]

Storytelling & Triangulation

Once I completed my interviews, I spent a few days synthesizing my findings, then began work on my presentation. While I always try to use an abundance of quotes to enrich the anecdotal aspect of my presentations, I felt that effective storytelling was especially critical to presenting this research. Because the onboarding process is a member's first impression of the service, I believed that intimate relatability of the pain points was paramount. Below are some slides from my final presentation. 

[Image: slide from my final presentation]

For each member I interviewed, I wanted to clearly depict the timing of the pain points they experienced over the 2-3 week span between the date their benefits became effective and the time of our interview. I created a timeline to show the sequence of events, and I included quotes and videos of each member talking about any frustrations with the onboarding process. The orange exclamation-point icons indicate pain points; the blue phone icons indicate each point at which they called Express Scripts.

[Images: member onboarding timelines showing pain points and calls to Express Scripts]

Triangulation in research is the practice of using multiple methods or sources of research to support a finding.

Two of the most prevalent themes I heard expressed in interviews were:

1. The prior authorization process is a major pain point for members

2. The inability to use coupons on ESI's site does not meet members' expectations

I implemented triangulation by finding native mobile app feedback (received via a 'contact us' link in the native app) that supported these themes. Below is a slide from my presentation which details an insight from my interviews alongside correlating, supporting feedback I pulled from the native app (via a report generated from Jira).

[Image: slide pairing an interview insight with supporting native app feedback]

An objective I had been given was to gain insights around how well ESI members understand the service's full benefits, as well as their confidence in their understanding. The below slide depicts how I presented findings on this objective. Since this was a qualitative study with just 6 people, I couldn't use a graph or chart as I would for a larger quantitative study, so I used quadrants to represent the participants' confidence levels as well as actual comprehension.

[Image: quadrant view of participants' confidence vs actual comprehension]

Significant Actionable Insights

Among my most note-worthy actionable insights were:

1. Calls - All six members called the ESI call center multiple times during the first 2-3 weeks of their membership (number of calls ranged between 2 and 8). Note: Because staffing the call center is so expensive, our business partners have an ongoing initiative to reduce incoming call volume. Some of the calls from these six members were related to needing help to log in for the first time.

My recommendation: we enable new members to access an activation code (for first-time log in) via an automated email, rather than having to call the call center. 

2. PA - Another major pain point and purpose for calling ESI was to request, or check on the status of, prior authorization (PA). The prior authorization process has long been a known pain point for members (validated again in this study).

My recommendation: we might consider:

     a. messaging the member before membership becomes effective (but as soon as medical records are obtained) alerting them if any of their medications will require prior authorization, so that they can begin the process of obtaining it.

     b. providing more detailed status updates, e.g. a time and date stamp next to every prior auth status update (the current experience does not include enough truly helpful status details).

3. Coupons - Two of the members interviewed had planned to use coupons when purchasing their medications and were disappointed to learn that ESI did not accept coupons. This also contributed to the high volume of calls to ESI upon onboarding.

My recommendation: because the inability to use coupons is a known pain point, we might consider informing members earlier in their onboarding journey (e.g., in the welcome packet) that ESI does not accept coupons, so that we can better manage members' expectations.


Phase 3

At the time of writing this case study, the diary study is in planning. High-level objectives include:


1. How do members expect and / or need the dashboard to change over time?

2. When do members' rx- and benefit-related tasks occur to them?

3. Are our communications helpful? Too few? Too many?

4. Do all members of a family have the same view / visibility needs?

5. How else are members saving money on prescriptions?


Redesign Proposal of ESI's Dashboard


While I was not the designer on the project, I created my own redesign of the dashboard based on the user feedback received during testing.

During the 1st research phase, I discovered that our users' top priorities (in no particular order) were:

  • Order Status

  • My Medications

  • Notifications

More importantly, participants consistently emphasized that they wanted to be notified of any important updates on medications or payment issues on the dashboard. 

 

Following (in red) are my proposed revisions (see the correlating numbers in the diagram). First pictured is the current dashboard, for comparison.


[Image: current dashboard, followed by my annotated redesign]

1. Since participants unanimously said they wanted and expected to see important notifications immediately upon login, I added a notifications section. Keeping in mind the "Z Rule," I placed the notifications in the upper-left part of the screen, since several medication-order-related notifications would inevitably be important and need immediate attention. Participants had voiced that they would be overwhelmed if many (more than 7-8) notifications appeared on the dashboard, so I opted for a carousel view with an X close button on each notification, so users would know they could easily dismiss unimportant notifications or click important ones to learn more.

2. Since some important notifications would be hidden, I wanted to show the total number of notifications/alerts the user has next to the header.

3. The tiles (My Medications, Active Orders, and Make a Payment) as well as the drop-down menus were intuitive to the members we tested and were reported as priority features on the dashboard, so I didn't want to remove or change them. (Additionally, the business stakeholder wanted the revamp of the dashboard to happen gradually, to avoid shock or disorientation for regular members.) But since the notifications I added took up real estate, I made the tiles slightly smaller than in the current dashboard experience.

4. In testing, participants had been divided in their responses about savings opportunities; some participants did not want to see savings content at all because it felt spammy and they didn't trust it. So I removed the savings opp from the side panel and placed it in a notification that the user could choose to explore or ignore.

5. In testing, I'd found that a significant number of users liked the Price a Medication feature, so I placed it in the space previously occupied by the savings opp widget. While the CTA competes for attention, Price a Medication was a more popular feature than Request an Rx, so it warranted top placement.

6. The Request an Rx feature tested relatively well so I left it on the side panel beneath the Price a Med feature. 

7. One of our objectives in rethinking the dashboard was to consider how we might promote ESI's role and benefits as a PBM. In general, our users were highly aware of ESI's offering as an online pharmacy but less aware of its role as a PBM. I added an interactive deductible widget reflecting the user's up-to-date deductible status, which the member can click for more details. I added this to showcase ESI's role as a PBM and entice members to explore the other benefits available to them.

8. The Quick links don't compete for attention with other CTAs and are consistent with competitors in the space, so I opted to keep them. 


[Image: final redesigned dashboard]