Analytics dashboard as first user story

Problem

After a go-live, the delivery team finds measuring success harder than expected.
Building the analytics dashboard reveals gaps in tracking capabilities and overlooked external variables.

Context

When planning a new initiative, success criteria seem clear.
Usually, the initiative is part of a well-defined business objective with quantified key results.
Everything looks straightforward because, at this stage, the initiative is considered in isolation.

Once live, things get harder.
The initiative is now part of a complex system with many variables affecting the same metrics.
For example, data tracking can be too coarse-grained, or an ongoing marketing campaign can skew the results.

Solution

The first user story of an initiative is to build the analytics dashboard. This dashboard encodes the success criteria of the initiative.
The benefits are:

  • Hard conversations start earlier, for example about how to detect revenue cannibalization
  • Tracking gaps surface sooner, so they do not unexpectedly increase scope just before go-live
  • Success criteria are more trustworthy, since they are not retrofitted to whatever happens to be measured after go-live
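To make "the dashboard encodes the success criteria" concrete, here is a minimal sketch in Python. The metric names, targets, and the checkout-redesign initiative are hypothetical, invented only for illustration; in practice the same idea would live in a BI or product-analytics tool rather than code.

```python
from dataclasses import dataclass

@dataclass
class SuccessCriterion:
    """One quantified key result the dashboard must be able to answer."""
    metric: str    # name of a tracked metric (hypothetical)
    target: float  # value that counts as success
    actual: float  # value read from the tracking data

    def met(self) -> bool:
        return self.actual >= self.target

# Hypothetical criteria for a checkout-redesign initiative.
criteria = [
    SuccessCriterion("checkout_conversion_rate", target=0.12, actual=0.13),
    SuccessCriterion("avg_order_value_eur", target=45.0, actual=42.5),
]

for c in criteria:
    print(f"{c.metric}: {'met' if c.met() else 'not met'}")
```

Writing the criteria down in this explicit form is exactly what surfaces the hard questions early: if you cannot fill in the `metric` field, you have found a tracking gap before go-live instead of after it.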

A/B testing

Sometimes the difficulty of defining the analytics dashboard makes it clear that an A/B test is needed.
Understanding this close to go-live, or worse after it, can be very costly.
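One reason an A/B test is costly to discover late is that the required sample size is often larger than expected. A back-of-the-envelope power calculation, sketched below with Python's standard library (the baseline rate and effect size are made-up numbers), can be done as soon as the success criteria are drafted:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(p_base: float, mde: float,
                        alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate per-arm sample size for a two-proportion z-test.

    p_base: baseline conversion rate.
    mde: minimum detectable effect, absolute (0.10 -> 0.12 means mde=0.02).
    """
    p_variant = p_base + mde
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    return ceil((z_alpha + z_beta) ** 2 * variance / mde ** 2)

# Detecting a 2-point lift on a 10% baseline needs a few thousand
# users per arm, which may take weeks of traffic to collect.
n = sample_size_per_arm(p_base=0.10, mde=0.02)
```

If this number, multiplied by two arms, exceeds the traffic the initiative will realistically see, that is worth knowing in the first user story rather than at go-live.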

Notes

Analytics dashboard for tech initiatives

Analytics dashboards are as fundamental for tech initiatives as for product ones.
In terms of tools, product analytics dashboards often take the form of something like Mixpanel, while tech dashboards take the form of something like Datadog.

User behavior as success criteria

For product initiatives, the best success criteria are the ones proving a change in user behavior.
Generating revenue is just a side effect of changing user behavior, hopefully for the better.
On top of that, profitability also needs to be accounted for.