5 Key Characteristics of a Solid Analytics Foundation
January 24, 2023
As data analytics becomes more important for businesses and organizations, it is essential to have a strong foundation in place to ensure your analytics are both compliant with current regulations and able to grow alongside your business's goals. But there's a third quality essential to a strong foundation, one that's often overlooked: being future-proof.
Below we’ve identified five key characteristics of a solid analytics foundation that are as much principles of design as practices in execution. Paying attention to each of these ensures you can best leverage analytics data to generate meaningful and actionable insights for your digital teams.
Together, these characteristics are ultimately about resilience to change and ensuring your foundation can evolve with the changing needs of your business, the regulatory landscape, and unforeseen changes in business direction.
And, in order for these to be truly successful, they all need to be SMART goals (Specific, Measurable, Achievable, Relevant, and Time-Bound). For example, to answer the question "How do I know I have a complete setup?", you will first need to define the questions you are trying to answer and what your business goals are, and have a strategy for turning those goals into measurable and impactful insights.
When developing a strategy for a solid analytics foundation, it is crucial that it overarches everything you are doing.
Completeness

Often, when people think of completeness, they think it means they should be measuring everything. In today's reality, what it really means is having coverage across all of the touchpoints where you expect your customers and prospects to interact with your brand, AND having the necessary data points to generate actionable insights based on your goals for the brand.
To achieve this characteristic you need to be able to answer “yes” to this question:
“Do I have coverage that allows me to collect data to answer the kinds of questions I’m trying to answer, and to achieve the kinds of goals the business is trying to achieve?”
Until you know what questions you are trying to answer, or what goals you’re trying to achieve, you’ll never be able to answer “yes”.
It’s also very important to remember that with the recent regulatory and industry concerns about end-user privacy, it’s no longer free to just measure everything; there’s a cost. You not only have to have a reason for collecting data, you have to be able to demonstrate that reason. And not just internally: increasingly, due to regulations, you need to be able to tell your end user why you’re collecting data. So completeness is a little more specific than just full coverage of everything.
With a “complete” collection process, businesses can gain insights into customer behaviors and preferences as well as develop a more comprehensive understanding of their target audience.
Accuracy

The key point of accuracy as a characteristic of a strong analytics foundation is that it needs to be measurable. To ensure this, your data analytics points need to be compared against a reference source of truth. If your analytics data points are digital behaviors like “did you click this button” or “fill in this form”, the digital analytics tool is probably your source of truth: what the digital analytics tool reports is what the number is.
In the case of a form fill, you probably have numbers in your backend system that tell you how many people filled out that form, which makes your backend system a reference source of truth that you can use to measure the accuracy of your analytics deployment.
The same is true for revenue and transactions. Both of these data points are in your order management system, and your organization is more than likely treating the order management system as the reference source of truth, making it the bar against which you measure the quality of your analytics implementation.
In order to be able to measure accuracy effectively, you need to make decisions about what the reference sources of truth are for your brand and then evaluate the quality of your analytics deployment against them.
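To make the idea concrete, here is a minimal sketch of what "measuring accuracy against a reference source of truth" can look like in practice. The data point names, counts, and the 5% tolerance are all illustrative assumptions, not figures from any particular deployment.

```python
# Hypothetical sketch: compare analytics counts against a reference
# source of truth (e.g. a backend form table or order management
# system) and flag data points whose discrepancy exceeds a tolerance.

def discrepancy_rate(analytics_count: int, reference_count: int) -> float:
    """Relative difference between the analytics tool and the reference system."""
    if reference_count == 0:
        return 0.0 if analytics_count == 0 else float("inf")
    return abs(analytics_count - reference_count) / reference_count

def accuracy_report(pairs: dict, tolerance: float = 0.05) -> dict:
    """For each data point, report whether it is within tolerance of its reference."""
    return {
        name: discrepancy_rate(analytics, reference) <= tolerance
        for name, (analytics, reference) in pairs.items()
    }

# Made-up numbers: (analytics count, reference count) per data point
report = accuracy_report({
    "form_fills": (970, 1000),    # backend system is the reference
    "transactions": (880, 1000),  # order management system is the reference
})
# form_fills is within 5% of the reference; transactions is not
```

The useful part of a check like this is not the arithmetic but the decision it forces: you must explicitly name a reference system and an acceptable tolerance for each data point before you can call your deployment "accurate".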
Reliability

Accuracy and reliability as characteristics of a solid data analytics foundation are closely tied together. Reliability is in some ways a measure of your implementation: how you choose to instrument a measurement is going to significantly impact how reliable that measurement is.
Here is an example from the early days of analytics. Back then, different browsers had different behaviors when it came to rendering a page, so the code I used to implement analytics might work one way in one browser, and a different way in another. This meant the data wasn’t necessarily consistent and reliable across all browser experiences, simply because of the way I chose to write the code.
In the above example, accuracy and reliability depended heavily on following best practices for implementation. This is why we use tools like Google Tag Manager and use the design patterns we do in GTM to deploy analytics – to ensure we hit accuracy and reliability standards.
Keep in mind, accuracy and reliability aren’t point-in-time measurements; they’re things that need to be maintained across the inevitable changes people are going to make, for example launching a new website, adding new features, or an app design change. Whatever changes you make, you still need your measurement foundation to have all of these characteristics, and ensuring it does is where design patterns, implementation standards, and processes play a huge role.
At Napkyn, reliability is one of the places where we often introduce business processes around change management to clients, asking questions like:
- “What happens if you need to deploy a new tag?”
- “How is that code reviewed?”
- “How is it tested in lower environments?”
- “How are things promoted to production?”
These types of questions are operational underpinnings to ensure not just reliability, but the other four characteristics in this list as well.
Actionability

Once you have accurate and reliable information gathered from multiple sources, it needs to be turned into something useful: actionable insights from which decisions can be made quickly and efficiently. In other words, actionability is where strategy and business objectives meet measurement.
In order for data to be actionable you need to be collecting data on measures that tie to business objectives. Think of the “So what, now what?” question.
Example of non-actionable data: you launched a new website and measured that 500 users visited on the first day.
So what, now what? What are you going to do with this data now that you have it?
This is where your data analytics foundation has to be tied to the business outcomes and objectives of the organization in order for the things you are measuring to allow you to generate insights you can take action on. This is not just about predictive modeling or regression analysis (though these are patterns you can use with this data), actionable data is data you can tie to business decisions you will make.
Example of actionable data:
- If I see a drop in conversion on my mobile platform,
- then I will look at where there are problems in the conversion funnel in my data,
- and my action will be to remedy the checkout flow in the mobile app.
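The if/then pattern above can be sketched as a simple alert rule. This is an illustrative assumption of how such a rule might be encoded, with made-up session counts and a made-up 20% drop threshold; it is not from any specific analytics product.

```python
# Hypothetical sketch of an "if/then" actionability rule: flag a drop
# in mobile conversion rate that is large enough, relative to a
# baseline, to trigger a funnel investigation.

def conversion_rate(conversions: int, sessions: int) -> float:
    """Conversions per session; zero sessions yields a 0.0 rate."""
    return conversions / sessions if sessions else 0.0

def should_investigate(current: float, baseline: float, max_drop: float = 0.20) -> bool:
    """True when conversion has fallen more than max_drop relative to baseline."""
    if baseline == 0:
        return False
    return (baseline - current) / baseline > max_drop

# Made-up numbers: baseline 5.0% vs. current 3.0% is a 40% relative drop
baseline = conversion_rate(conversions=50, sessions=1000)
current = conversion_rate(conversions=30, sessions=1000)
alert = should_investigate(current, baseline)
```

The point is that the measurement only becomes actionable because a decision ("investigate the mobile funnel, then fix the checkout flow") is attached to it in advance.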
Resilience

In order for data analytics to be successful in driving growth for a business, its foundation must also incorporate resilience. A resilient system is one that is robust to unexpected change. It’s an overarching principle that says when you architect the infrastructure you are going to use to measure things, you’re thinking about:
- What happens if you need to change something you couldn’t predict you would need to change?
- What happens if the behavior of something changes in a way you can’t predict?
- How easily can you recover from that disruptive change and continue fulfilling the business objectives you’re trying to fulfill?
A lot of the operational processes we employ at Napkyn, and the design patterns we use for deploying the mechanics of measurement, improve the resilience of the system.
An example to think of in terms of improving resilience is ensuring you have the BigQuery export for GA4 turned on. If you do, the data doesn’t only reside within GA4; you also have a backup dataset that allows you to do more sophisticated analysis, AND that preserves history. That makes it resilient to disruptions like Google turning off the GA4 interface, or the system going down. Resilience is an intangible that impacts everything from the tactics of how you deploy measurement to how you think about the architecture of your measurement infrastructure. It’s about maintaining continuity of operations in the face of disruptive change.
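The backup-dataset idea behind the GA4 BigQuery export can be illustrated with a small sketch. This is a loose analogy only: a local dated JSON file stands in for an exported daily events table, and the event shapes are invented for illustration.

```python
# Hypothetical sketch of the resilience principle: keep a second,
# dated copy of raw events outside the reporting interface, so that
# history survives even if that interface becomes unavailable.

import json
from pathlib import Path

def archive_daily_events(events: list, day: str, root: Path) -> Path:
    """Write one day's raw events to a dated file (e.g. events_20230124.json)."""
    root.mkdir(parents=True, exist_ok=True)
    path = root / f"events_{day}.json"
    path.write_text(json.dumps(events))
    return path

def load_history(root: Path) -> list:
    """Re-read every archived day, independently of the original tool."""
    events = []
    for path in sorted(root.glob("events_*.json")):
        events.extend(json.loads(path.read_text()))
    return events
```

The design choice being illustrated: because the archive is written daily and read back without touching the original tool, an outage or interface change in that tool disrupts reporting, not the historical record.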
Building a strong foundation for digital data analytics requires completeness, accuracy, reliability, actionability, and resilience, all of which must come together harmoniously if you want to ensure an optimal outcome.
One of the things that sets Napkyn apart is that these characteristics are baked into how we approach every piece of work, whether it’s a GA4 migration, a deployment, or a conversation with our clients about first-party data. We think about and plan for them so that when our clients hit one of these disruption points, like third-party cookie deprecation, they have a solid data analytics foundation behind them.
For more information or if you’d like our assistance building an optimal data analytics foundation, contact us.