We want our work to have an impact. Beyond personal fulfillment, analytics that don't have an impact are less likely to lead to more trust or budget, and by extension, career safety and growth.
One of the main ways people expend effort on analytics without having an impact, or at least with less impact than they could have, is failing to understand the business goals behind what they are tracking and reporting on, or losing sight of them along the way.
And this isn't just a personal observation of mine. I asked around among some leading voices in the analytics industry about what problems most hold back analytics teams from having an impact. Not aligning what we are doing to the underlying business goals was a recurring theme.
In this post, we'll look at why this happens, and what we can do to better align our analytics work to business goals.
It seems obvious that we'd need to understand exactly what someone is trying to accomplish if we are going to give that person the right data and recommendations to help them succeed. Unfortunately, with all of the moving pieces that go into everything from gathering data through to presenting it, it's very easy to lose the forest for the trees. This can happen for several reasons, but two major ones are overreliance on tools and overreliance on processes.
Donât Let the Tool Do the Work
One major pitfall that can cause analytics work to become misaligned from business goals is an overreliance on the metrics and reports automatically provided by analytics tools.
Loren Hadley, who has a wonderful understanding of how to help organizations succeed with data, had the following to say about why the right metrics aren't always included:
It's often "what does my tool give me?" CRM systems and analytics tools (I'm looking at you, GA) tend to provide what seems meaningful and is easy to calculate without much context. Not that they are bad metrics in any way. Just that they may over promise. I don't fault marketers for reaching for this or what an agency did 4 years ago. I'd just like to help them make sure they are focusing on what really matters.
It's very easy to let automatically gathered and calculated metrics, presented prominently in the pre-generated reports offered by analytics tools, get the lion's share of the attention. Most businesses don't really live or die based on the total number of people who come to their website or how many pages they look at, but tools put a lot of emphasis on users, visits, and views, so those metrics often get more prominence than they need in reporting.
Engagement rate is another metric that people like to use, even though remembering what it actually represents off the top of your head can be tricky, let alone educating stakeholders on it. Some might argue that the point is to give an idea of what portion of traffic is engaged. I'd argue back that the threshold to be considered an engaged visitor is too low for a lot of use cases, and that a more tailored approach to measuring engagement would be useful.
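If you have access to raw session data, a tailored engagement definition can be as simple as a small function you control. Here's a minimal sketch in Python; the `Session` fields and every threshold are hypothetical assumptions, to be replaced with whatever "engaged" actually means for your business:

```python
from dataclasses import dataclass

@dataclass
class Session:
    duration_seconds: int
    page_views: int
    key_actions: int  # e.g. searches, downloads, form starts (illustrative)

def is_engaged(session: Session) -> bool:
    """A stricter, business-specific definition of an engaged session.

    The thresholds here (120 seconds, 3 page views, or any key action)
    are invented for illustration -- the point is that you choose them,
    rather than accepting a tool's default definition.
    """
    return (
        session.duration_seconds >= 120
        or session.page_views >= 3
        or session.key_actions > 0
    )

def engagement_rate(sessions: list[Session]) -> float:
    """Share of sessions that meet our tailored engagement definition."""
    if not sessions:
        return 0.0
    return sum(is_engaged(s) for s in sessions) / len(sessions)
```

The win here isn't the code, it's that the definition is explicit, documented, and tied to behaviors your stakeholders agree actually signal engagement.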
Tools will inherently push us towards defaults instead of the specificity that will enhance our ability to really support decision making, and we need to guard against that. Processes, on the other hand, can either hurt or help us align our work to business goals.
Process: Finding Your Groove vs Getting Stuck in a Rut
We all know that we can't afford to reinvent the wheel for every ask that comes in. We use repeatable processes for efficiency and reliability, and rightly so. The right processes serve as crucial guidelines to keep things understandable and on track. Overly restrictive processes, or an overreliance on processes, however, can lead people to just repeat what's worked before.
If your website/app has a lot of funnels, you may get into the swing of things and have a process you follow when setting up tracking and reporting for a new funnel. That process can help you be efficient when planning, tracking, and creating reporting for each new funnel that comes your way. You need to know where the funnel is, what the steps to track are, and establish technical details, like whether the funnel is in a single-page app. And you'd certainly be wasting a lot of time if you started the reports or dashboards from scratch every time.
The danger is deciding that the system will work as-is for every funnel we are asked to track. If we assume that completion rate is the primary KPI for anything we do involving a funnel, our reporting for some funnels won't be telling a very useful story. Maybe one project is to modify a funnel to introduce a "You may also want to consider" element to the checkout.
Completion rate is still very important, but figuring out whether this change is working means considering at least average order value. And you'll want information on interactions with the new element, and on changes to items in the cart, so that you can segment metrics based on different behaviors.
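As a rough sketch of what that segmentation might look like once the data is collected, here's some hypothetical Python; the field names (`saw_recommendations`, `completed`, `order_value`) are invented for illustration and would map to whatever your tracking actually captures:

```python
def segment_funnel_metrics(orders: list[dict]) -> dict:
    """Compare completion rate and average order value (AOV) for sessions
    that did vs. didn't interact with the new recommendation element.

    Each order dict is assumed to carry:
      'saw_recommendations' (bool), 'completed' (bool), 'order_value' (float).
    """
    segments = {
        "interacted": [o for o in orders if o["saw_recommendations"]],
        "did_not": [o for o in orders if not o["saw_recommendations"]],
    }
    results = {}
    for label, group in segments.items():
        completed = [o for o in group if o["completed"]]
        results[label] = {
            "completion_rate": len(completed) / len(group) if group else 0.0,
            "avg_order_value": (
                sum(o["order_value"] for o in completed) / len(completed)
                if completed
                else 0.0
            ),
        }
    return results
```

Comparing the two segments side by side is what lets you say whether the new element lifted order value without hurting completion, which is the actual business question behind the change.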
Another funnel is to open a support ticket, and success for the business looks very different here. Ideally, a content recommender will show the user some help content that resolves their problem without having to open a ticket.
Analysis, reporting, and by extension, tracking, on this funnel are going to be different than how we handle purchase checkouts. Or at least they should be. Treating a funnel drop-out as success doesn't account for people who just get fed up and leave.
We can ask that the funnel be set up with distinct steps so we can clearly track people who drop out while being shown help content, as opposed to those who drop out after indicating that they still have an issue and want to contact support. Now we can start to tell the difference between someone dropping out because they solved their problem, and someone dropping out because our contact form has issues.
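A rough sketch of that classification logic might look like the following; the step names and outcome labels are hypothetical, and the real value comes from agreeing on them with stakeholders before tracking is built:

```python
# Distinct funnel steps, tracked separately so drop-outs can be classified.
SUPPORT_FUNNEL_STEPS = [
    "open_support",        # user lands on the support flow
    "shown_help_content",  # recommender surfaces help articles
    "still_need_help",     # user indicates the articles didn't solve it
    "contact_form",        # user starts the ticket form
    "ticket_submitted",    # ticket actually opened
]

def classify_exit(last_step_seen: str) -> str:
    """Interpret where a user left the support funnel.

    A drop-out after 'shown_help_content' may be a success (problem
    solved by the recommender), while a drop-out after reaching the
    contact form suggests a problem with the form itself.
    """
    if last_step_seen == "ticket_submitted":
        return "ticket_opened"
    if last_step_seen == "shown_help_content":
        return "possibly_self_served"
    if last_step_seen in ("still_need_help", "contact_form"):
        return "abandoned_while_seeking_help"
    return "left_early"
```

Without the distinct steps, all four of those outcomes collapse into one "did not complete" number, and the difference between success and failure is invisible.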
Without going further down the example rabbit hole, we can see how getting too comfortable treating similar asks as identical can lead to reporting and analysis that isn't aligned to business goals.
Making Analytics Discovery Processes That Align to Business Goals
An overreliance on an inflexible process can lead us right past crucial information we need to make sure that the analysis we eventually provide will be valuable. But, we can also adapt our processes to make sure that we gather that information before we make any other decisions.
I've written about this a bit before, when discussing requirement gathering for analytics projects using the 5 Ws. The difference is that in that post I tried to look at all the information you'd want to bring together when figuring out what will go into fulfilling a request. And yes, "Why" was presented first. But with five more years of experience under my belt, I feel like the Why questions deserve a deep dive.
You Need to Know Why They Are Doing It If You Want to Help
If you'll allow an anecdote, I lived in a bit of a rough neighborhood while in university. One day I was out jogging, and a very large friend of mine happened to spot me from a nearby balcony.
He looked concerned, and shouted "Are you running on purpose?"
It took me a second to grasp his meaning, and I answered "Yup – I'm just working out. No one is chasing me!"
My friend, understanding that I was running to improve my health, and not fleeing danger, encouraged me to keep it up. Had someone been chasing me, this friend was the type who would have raced downstairs to help.
Point being, knowing what someone is doing isn't enough to render the right assistance. You have to know why they are doing it.
Make "Why" Questions a Discovery Conversation Priority
To fully understand the goals of a project, especially a larger one, you can't just ask people what they are and write down the answers. You need to ask the right questions to get people talking about their goals, and have a conversation, both about their project and about how analytics can support its success. And conversations are arguably the most important part of your job.
An analytics leader whose LinkedIn posts are laden with insight, Tris J Burns, recently shared the following:
The most powerful analysis tool we can ever hope to master and possess is:
→ CONVERSATION 🗣️ (use your words)
Every analysis must begin and end with conversation.
We use conversation to understand the problem we are hoping to solve.
We use conversation to gather the initial data points, whether it be to gather context on the problem at hand, or to gather highly valuable qualitative data.
And finally, we use conversation to deliver the insight, recommended action, and estimated business impact of the analysis we've performed.
– Tris J Burns
If you don't like having conversations about analytics with people who don't know as much about analytics as you, you are going to have a hard time figuring out how to best help them with data.
We'll tackle the first two use cases of conversation here, and talk more about recommendations towards the end.
Why Are You Doing It?
I start every discovery with "Why are you doing it?"
If I already have a pretty good idea of why, based on what I know going in, I'll confirm that understanding and ask stakeholders to expand if I missed any points or nuances. Note that we are talking about why they are doing the underlying work, not why they want analytics for it. If someone asks for analytics help with a funnel, for now, we ask why the funnel exists, not why they want help analyzing it.
With that broad understanding in place, I move on to defining what the desired outcomes are, in two contexts.
What does success with this feature/asset look like for the end user?
What did the user come here to do? Get information? Buy something? Return something? Apply for a job? Post an ad or message? What does a successful outcome for the individual end-user involve?
What does success with this project look like for you?
This is where we get more specific about why we are doing this project. If the point of updating the checkout funnel is to reduce the exit rate on the shipping step, this is where we get into "what is the rate currently?" and "is there a target improvement we are trying to achieve, or a threshold we want to reach?"
Analytics Discovery Continued: Questions About Questions
By now, we have a good high-level understanding of what people are trying to achieve, and depending on our knowledge of the business and stakeholders, we could start planning a solution. But we'd not have as much detail as we could benefit from. And we'd be missing a chance to understand and manage expectations.
Liz Oke, a stellar marketing strategist, wisely suggests the question:
"What do you want your analytics to answer?"
– Liz Oke
In some cases, particularly with stakeholders who are reasonably data-savvy, this will yield some great ideas you can include in your solution with little modification. In other situations, you'll be able to have important conversations about which questions will be possible and useful to answer.
What Inexperienced Stakeholders Want Answered
People who don't really know what's possible with analytics will generally give one of two types of answer:
"I don't know, I was hoping you could tell us what would be most helpful."
This can be a very good situation to be in, in that you are being trusted to recommend what analytics can do for the stakeholders. As long as you have the right domain expertise around what the business does, you can lead them in the right direction. If you aren't sure whether your domain knowledge is sufficient, ask the stakeholders questions until it is.
"I have no idea, but it would be cool if we could know this thing YOU'VE NEVER DREAMED OF INCLUDING BEFORE."
These can be frustrating, but they can also be incredibly fun. A lot of the time, it's just a matter of interpreting the desires of someone who isn't used to analytics jargon and doing a bit of translation. Sometimes, though, the ask will be for something you've never considered reporting on before.
The temptation here is to dismiss it as an impossible ask from someone who doesn't understand the limitations of the technology. And that might be the right response, in the end. But if we throw out every crazy-sounding idea a stakeholder has without at least giving it some thought, we do them, and ourselves, a disservice.
People who don't know what to expect from analytics, but know the business well, don't have the same habits and biases baked into their ideas that we do. And that can be a fantastic source of outside-the-box thinking on what analytics can do for the organization.
Is the idea actually impossible because it's asking for something that is more magic than technology? Or does it seem impossible because it's asking for something our tools don't already do, or, maybe more personally, something that you've never done before?
Depending on how valuable the information would be, and the resources available, maybe you need to think about extending the abilities of yourself or your tools, or adding a new tool to your stack.
What Semi-Experienced Stakeholders Want Answered
This is the stakeholder who has been exposed to analytics enough to understand what is possible in a technical sense, and to get comfortable with some common measurements. But they have trouble focusing on what they actually need in a business sense. Instead of focusing on their own business priorities, they put on their analytics hat and try to think of everything that might be useful.
If they are used to seeing it in the tools, they'll ask for it. If they are used to seeing it in dashboards and reports, they'll ask for it. If it was useful to them in a completely different context, they may very well, still, ask for it.
This is, in many ways, the "overreliance on tools and what we did in the past" problem, but this time it is coming from the stakeholders instead of our team. Which is entirely forgivable, as it's not their job to avoid these issues; it's ours.
Let's look at some techniques to help us eliminate, or at least deprioritize, the questions that are less relevant.
Prioritizing Requirements In Analytics Discovery
Veteran voice of the analytics community, Jim Sterne, has a great method to help discern idle curiosity and old habits from the burning questions that will have an impact:
Dive into their goals and find out what they will change based on the results being above or below expectations.
If someone has a laundry list of things they want to know, you can get a feel for how important each item is by asking stakeholders what they will do with that information. If someone insists that engagement rate (or anything else) be reported prominently, ask them what they will do if that metric were to increase or decrease by 10% – Nothing?
Ok, what about 20% – Still nothing? What about 30%, 40%, 50%?
If something can change drastically without eliciting a response from the stakeholders, it stands to reason that reporting on it won't have an impact on the business. No action is being taken based on the information.
You can use this to help the stakeholders prioritize their questions in terms of value to them. Maybe you've only got enough resources to get to the most important ones. Or maybe you have plenty of resources, and they've asked for really basic stuff that's built into the tools and won't even use that many resources.
Either way, you need to help them focus on what's important, both right now through prioritization, and down the road by giving them analysis and dashboards that focus on those priorities.
Moving Forward
It may seem like we are skipping a lot in the middle, but solution design and implementation depend too much on the specifics of the project to get into here. And, to be fair, we have laid the foundation we need to be successful in doing those things in a way that will lead to having an impact on business decisions.
As long as you design a solution that delivers on what stakeholders want to know, as established in the questions above, you can follow that map. Especially with a larger project, though, you can get lost in technical, documentation, or taxonomy details along the way, so it's good to revisit the summaries you made of your discovery notes periodically.
I make it a point to do this when I move to the visualization phase of a project, and again when I move to the analysis phase. Especially on a longer timeline, I want to make sure I remember what the point of all the work was. It would certainly be unfortunate to have done everything right up until that point, then drop the ball when actually sharing the output.
Wrapping It Up
Before we can call it a day, we need to finish strongly. We need to make sure that what we get back into the hands of our users is actually going to help them achieve the goals we asked about on day 1.
Jason Thompson is not just a renowned analytics leader, but someone who often reminds us that we are human beings first, and analytics professionals second. He also had a fantastic response to my LinkedIn post asking about what prevents analytics teams from making an impact:
The biggest issue I observe in analytics teams that are struggling to make an impact comes down to struggling to move beyond data collection and reporting.
To make a true impact, analytics teams must be about more than capturing data, sharing numbers, and observing trends. To make a true impact, analytics teams must not only do meaningful analysis but must make informed business recommendations based on that analysis.
❌ Here is some data we collected
❌ Here is a chart attached to some data we collected
❌ The chart line for the data we collected goes up and down
💡 When we increase paid search budgets, the line goes up
✅ Our analysis shows a strong correlation at a 28-day lag, which highlights a monthly cycle in user buying behavior. This means for some high-value purchases, customers may take up to a month from their first visit to finalize their decision and place an order. Recognizing that customers may take several weeks to decide on a purchase, we should test nurturing campaigns that present return website visitors, specifically those entering our web store, valuable information and potential incentives through a 28-day decision window…
Like most things, this is a people issue.
– Jason Thompson, 33 Sticks
The point isn't that analysis needs to be complicated; the point is that it needs to lead to recommendations that will help the business achieve its goals. Especially in predominantly self-serve analytics environments, it's easy to overlook that the output needs to be focused on the business questions at hand.
Go back to those discovery summaries and remember why you are collecting this data in the first place. Even in self-serve environments, we need to provide guidance and help people find the deeper insights they won't get from looking at an automated report or dashboard.
Don't just be a conduit for data. Where can you use what you know about the reasons for individual projects, and the business overall, to turn data into actionable recommendations?
How do you keep your analytics work aligned to stakeholder goals and your recommendations relevant? Leave a comment and let me know!