Redesigning BevSpot’s Sales Reports

Making complicated reports more self-serve


Role

UX Designer, UI Designer, Research, QA

Timeline

April 2019 - September 2019

Team

Erie Burkland (Product Manager), Nichole Mace (Product Manager), Anna Lee Barber (Developer), Minshu Zhan (Developer)

Overview

BevSpot allows bars and restaurants to take inventory, place orders, and track all of their data in one place. Its sales reporting feature maps customers’ inventory items (often the raw ingredients) to sales items (the final product that goes on a plate) using data imported from the customer’s POS system, surfacing valuable insight into menu performance, usage, and cost.

Problem

There were a number of problems with the existing sales reports, including:

  • The sales reports were run manually by our customer support team, consuming a huge share of their caseload. Reports took multiple days to process and return, costing customers valuable time.

  • The reports were too complex for users to read on their own - they often needed the support team to walk them through the data being presented.

  • At the time, we offered only a few direct POS integrations. We wanted to increase this number so more customers could have sales data flow directly into BevSpot and run reports almost instantly.

One tab of the old sales reports: there are so many column headers that it is hard to remember what each one means, and every single column title has a hover with a long description of its calculation.

Goals

  1. Make running reports more self-serve, meaning we needed to create a way for users to “map” their inventory items in BevSpot to the correct sales items from their POS.

  2. Reduce the customer support team’s time working on reports by 80%.

  3. Increase the number of users actively running sales reports, since we knew customers who used the feature had a much higher retention rate.

Process

Initial Discovery

To start, I needed to dig in and understand how this process worked in its existing state. Users would often export a “PMIX” from their POS system: a spreadsheet of all of their sales items with sales price, menu group, POS ID (a unique identifier used by the POS), and quantity sold. They would upload this spreadsheet to BevSpot, where a support team member would process the report, mapping any new sales items to their corresponding items in BevSpot. This process was tedious and took too long for many of our customers.

There are so many different parts and pages that you need to know exactly what you’re about to click into
— Quote from an actual customer during initial interviews

User Flows

Once I had a good base understanding, I met with the product manager to go over the initial user flows they had put together.

The initial user flow put together by the product team. It covers the high-level points of the journey, but I knew we needed to dig much deeper to understand the complexity of this project.

Working from the product team’s user flow, I created my own, identifying the key decisions a user would need to make along the way and beginning to define the different kinds of users who would use the reporting feature.

Once I had created a user flow and shared it with product and engineering, we started to home in on the different kinds of users. This was important because, while we wanted most of our customers to have a direct integration between their POS system and BevSpot, we knew realistically that some would still need to upload spreadsheets in order to run a report. We identified three user types: those with integrations, those who would upload a spreadsheet, and those who wanted to enter their sales information manually.

Wireframing and Testing

Once we had the user flows built out, I was ready to jump into wireframing and building a prototype for testing. I moved quickly into higher-fidelity wireframes because our product team had conducted a number of customer interviews before I was brought into the project, so we already had a good base of requirements and assumptions to test.

For testing, we wanted to understand what information was most important to users and what they would want to see first and foremost. We also wanted to make sure the entire process, from starting a report to viewing the results, made sense. During this phase I gathered inspiration from complex software such as Google Analytics and HubSpot, both of which feature interactive tables and key insights surfaced on dashboards.

I created a prototype in Sketch to share and run through during our customer interviews and usability testing.

Key Takeaways from Testing

During testing we talked to different customers about their experiences with sales reporting and I moderated the usability testing of my prototype. Some of our key learnings were:

  1. The “mapping” process was understood by all users without any explanation. They knew that they were seeing their POS items and needed to match those to BevSpot items. Success!

  2. Variance was key for almost everyone we talked to. Variance is the difference between what your costs should have been based on inventory and ordering data, and what they actually were given your sales data. BevSpot is able to break down variance to the item-level and this is what users wanted to see in order to make key decisions about their menu and operations.

  3. Easy-to-find, digestible insights were the most valuable thing to our users - which meant revamping the summary page of the report to surface that information was key.

  4. Users were confused by the navigation, especially when creating a report. There were too many options and they weren’t confident they were going where they wanted to go.

Iterating

Based on the feedback from testing, I stepped back and returned to the user flows. Because the navigation was confusing in testing, I took a pass at a flow split by whether or not it was a user’s first time running a report. That made a huge difference in the steps of the process.

User flow for users who are running reports for the first time, and who will have more setup to do.

User flow for a returning user running a subsequent report. We really wanted to make sure we got the navigation and overall process right.

Once I had a good handle on the navigation and flow, I iterated on some of the key screens to simplify and make sure it was clear how to move through the process.

The Run Report screen was very important, especially for first-time users, so we wanted to make sure it was inviting and simple to understand.

The mapping screen was another critical piece. We wanted to make it clear which information was coming from the POS, how to search for items to map, and how to move from one item to the next so users could work through the process quickly.

There were many, many component pages like this to hand off to engineering, and plenty of iteration to get them right.

Final Mocks

In the end, we made huge improvements to the feature as a whole and moved it to a much more self-serve experience.

New “Report Summary” tab, with high level totals at the top, a table to quickly see items with the highest variance and sort by type of variance, and a table to see insights into the performance of sales items.

New “POS Data” tab to see all of the data coming in from the POS that is included in the report, and to manage the sales data.

Variance tab with detailed information about items, and hovers for sales information, cost %, and usage data.

Profitability tab that shows sales and cost information for sales items and allows users to rank them across all items, or by the menu groups they have assigned to items (such as Beer, Wine, Spirit, etc.).

New Run Report screen - we were able to offer 14 direct integrations by the end of the project!

Final mapping screen.

A selection of mobile screens. All of the screens in this project were responsive.

Lessons and Outcomes

BevSpot’s product was a complex one, with multiple intricate calculations behind the scenes - many of which I did not fully understand going in. A better grasp of them up front would have let us move more quickly and better judge what was possible, and what workarounds existed.

In the end, we reduced support tickets by 75%, and drove product design efforts that transformed the product from tech-enabled to a self-serve, no-touch, consumer-level experience, reducing outsourced support costs by 40%.

Retention for users of sales reports rose to 95%, and more users than ever were using the feature. Ultimately, a customer could be up and running with reports in hours instead of days.