Usage Analytics: Continuously Proving ROI
Learn how we built the Usage Analytics dashboard to provide a seamless way to view your team's usage and engagement within Pathlight.
The new Usage Analytics dashboard provides a platform for leaders to gain insight into their organization’s engagement within Pathlight. The feature measures user engagement on a daily basis, recording primary actions such as messages sent, meetings scheduled, mini goals created, and much more.
By providing our customers with engagement data directly, leaders are able to see the positive impact of the Pathlight product on their teams. Before we shipped this feature, the Pathlight team needed to manually run internal reports to share with customers, and our customers would only see engagement data in quarterly business review (QBR) meetings.
In this article we will discuss the reason for building out an in-house engagement measurement tool, our approach to the problem, and some challenges encountered along the way!
The Need for Usage Analytics
Engagement in Pathlight is high, and by building Usage Analytics, we can continuously show our customers just how active their teams are within our web and mobile applications. As shown in the chart above, the average user logs into Pathlight 5 days a week, and nearly 20% of users use Pathlight each and every day. These users do not just log in: they check their metrics, send messages, react to comments, and schedule 1-on-1s. Highlighting these extraordinary levels of user engagement plays a pivotal role in both driving customer retention and attracting new customers.
Making it easier to view engagement within Pathlight has done more than prove the pivotal role our product plays in our customers’ day-to-day operations. We have also seen downstream effects that save our team time and help guide product investment. Before each quarterly business review, we traditionally spent a large amount of time building custom reports to highlight just how ingrained our product is in a customer’s daily routine. With Usage Analytics, we have replaced that extremely manual report generation process, which consumed a significant amount of our customer success team’s time, with an easy way to view engagement statistics across the organization.
Moreover, Usage Analytics is a low-barrier-to-entry tool that enables both technical and non-technical leaders to understand how their teams use Pathlight. For us, this means being able to direct product and engineering focus toward the most-used features, allowing our development to have a focused impact. Additionally, by giving our customers access to their own usage data across the application, we can highlight user engagement in the Pathlight web and mobile applications and demonstrate return on investment (ROI) on a continuous basis.
Challenges of Building Usage Analytics
In building a tool such as Usage Analytics, we faced a variety of challenges that heavily shaped our approach to the problem. The first challenge was measuring actions consistently.
Consider the following examples:
- Mike sends a message
- Mike sends a message at 6pm on a Tuesday but quickly unsends the message
- Mike sends a message at 6pm and unsends the message the following Monday
Clearly the first instance counts as a message being sent. The interesting question arises between scenarios 2 and 3: how do we strike a balance between what counts as activity and what does not? The second case is relatively straightforward, as most would agree that a message promptly deleted after sending should not count as user activity.
However, in the third case, the user’s motivation for deletion is not clear. Mike may just be clearing out his previously sent messages after the recipients read and interacted with the original message. The original sent message should therefore be counted in user activity metrics. We chose to build Usage Analytics to count activity that is performed and not undone for an extended period, while not counting activity that is promptly undone.
To address this problem, rather than recording engagement whenever an action is performed or undone, we chose to take an activity snapshot approach. Every day at midnight in the organization’s local time, we automatically compare two activity snapshots taken 24 hours apart; each daily snapshot records all actions from the previous day that still exist. This lets us skip actions that are promptly undone while still counting actions that survive at least overnight.
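As a rough illustration of the comparison step (the snapshot representation and names here are assumptions for the sketch, not the production code), the nightly job can be thought of as a set difference between consecutive snapshots:

```python
def daily_engagement(prev_snapshot: set, curr_snapshot: set) -> set:
    """Actions present in the current snapshot but absent from the previous
    one were performed in the last 24 hours and survived until midnight.

    An action created and undone before the snapshot never appears, so it
    is never counted; an action that survives overnight is counted even if
    it is deleted days later.
    """
    return curr_snapshot - prev_snapshot


# Scenario 2: a message sent at 6pm and promptly unsent never reaches the
# midnight snapshot, so it is not counted.
# Scenario 3: a message sent at 6pm and deleted the following Monday is
# present at midnight, so it is counted.
monday_midnight = {"message:41"}
tuesday_midnight = {"message:41", "message:57"}  # message:57 sent Tuesday
print(daily_engagement(monday_midnight, tuesday_midnight))  # {'message:57'}
```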
Another interesting problem was how to store this data in a scalable way. Initially, we thought to store all analytics data for a given person on a given day in a JSON object. However, we decided against this for a few reasons:
- Because users do not perform each type of engagement every day, this would result in a large number of fields set to zero on the JSON object. We could omit zero-valued fields to slim down the JSON, but checking for missing fields would complicate the aggregation logic.
- JSON is not easy to extend if we want to support more metric types in the future.
- JSON fields don’t give strong guarantees about value types (e.g. string vs. number) or allow us to set default values, leading to potential errors if the dictionary contains unexpected data.
To avoid these possible issues we adopted the following data model:
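The actual implementation is a Django model, which is not reproduced in this text; the following plain-Python sketch shows the shape of the data model (field names beyond usage_count and UsageCategory are assumptions):

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class UsageCategory(Enum):
    # A few of the 18 supported metric types (illustrative names).
    MESSAGES_SENT = "messages_sent"
    MEETINGS_SCHEDULED = "meetings_scheduled"
    MINI_GOALS_CREATED = "mini_goals_created"


@dataclass(frozen=True)
class UsageStats:
    """One row per user, day, and metric category.

    Rows are stored only when the count is nonzero, and usage_count is
    always an integer.
    """
    user_id: int
    day: date
    category: UsageCategory
    usage_count: int


row = UsageStats(user_id=1, day=date(2022, 6, 7),
                 category=UsageCategory.MESSAGES_SENT, usage_count=3)
```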
We made the UsageStats model deliberately lightweight: it stores a single data value (usage_count). Each row also records a UsageCategory denoting which metric type the data is associated with. We currently support 18 different metric types and can easily add more categories.
The UsageStats model also guarantees that the stored data is always an integer, and a row is only created when there is nonzero data for that specific metric. We can therefore keep the overall database smaller by saving only rows with nonzero data.
Django’s built-in database functions then make it easy to aggregate across individual rows, with any missing metric types assumed to have a data value of zero:
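The ORM query itself is not reproduced in this text, but in plain Python the aggregation logic is equivalent to the following sketch (row shape and names are assumptions), with absent categories defaulting to zero:

```python
def aggregate_usage(rows, categories):
    """Sum usage_count per category across rows.

    Any category with no rows defaults to zero, since rows are stored
    only when counts are nonzero.
    """
    totals = {category: 0 for category in categories}
    for row in rows:
        totals[row["category"]] += row["usage_count"]
    return totals


rows = [
    {"category": "messages_sent", "usage_count": 3},
    {"category": "messages_sent", "usage_count": 2},
    {"category": "meetings_scheduled", "usage_count": 1},
]
totals = aggregate_usage(
    rows, ["messages_sent", "meetings_scheduled", "mini_goals_created"])
print(totals)
# {'messages_sent': 5, 'meetings_scheduled': 1, 'mini_goals_created': 0}
```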
And since we store each metric type in a separate row, it becomes trivial to support filtering or ordering the data by a specific metric type:
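As a plain-Python sketch of that filter-and-order step (the category name "gifts_sent" and the row shape are illustrative assumptions), ranking users on a single metric type looks like:

```python
def rank_users_by_metric(rows, category, descending=True):
    """Filter rows to one metric type and order users by their total count,
    mirroring a per-category filter plus order_by in the dashboard."""
    totals = {}
    for row in rows:
        if row["category"] == category:
            totals[row["user"]] = totals.get(row["user"], 0) + row["usage_count"]
    return sorted(totals.items(), key=lambda item: item[1], reverse=descending)


rows = [
    {"user": "ana", "category": "gifts_sent", "usage_count": 4},
    {"user": "ben", "category": "gifts_sent", "usage_count": 7},
    {"user": "ana", "category": "messages_sent", "usage_count": 9},
]
print(rank_users_by_metric(rows, "gifts_sent"))
# [('ben', 7), ('ana', 4)]
```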
We have built the Usage Analytics dashboard to easily support ordering by ascending or descending values for each metric type. Internally, we often use the dashboard to recognize the team members who have sent and received the most gifts over various time periods.
The Future for Usage Analytics
Usage Analytics is still in its early stages, and we have exciting plans to make the product even more powerful. It has already laid the foundation for another new product, Analytics Explorer, which allows viewing metric data over week-, month-, and quarter-long periods across groups and individuals, further bolstering the power of Pathlight.
If you are excited about building 0-to-1 products like this, we are hiring here at Pathlight. Learn more about us, what we are building, and our various opportunities here!
Thank you to Daniel Lipkin, Stephanie Rogers, Jacob Chan, and Vivian Qu for reviewing this blog post and providing suggestions.