Part Three in this series. Read Part 1, Flashback: Data Analytics Infrastructure 1995-2015, and Part 2, Flash Forward: Data Analytics Infrastructure: 2023.
I gave a talk on this subject a couple of months ago to a room of analysts, and I asked for a show of hands from people whose businesses were actively preparing for these changes. One hand went up. Then I asked for a show of hands from people who felt that in the next 18 months, the "oh no, this is bad, what do we do now" problem would get dropped on their desk like a hand grenade. Every single hand went up.
There is a lot of trepidation and concern on both the analyst side and the business side. We know changes are required, but how do we manage the panic, let alone the project planning?
The nice thing is that we have been here before, several times, in our industry, and we will be here again every couple of years as things change and evolve. The cookieless world, regulatory compliance, and changes to infrastructure are daunting and disruptive. People are freaking out! But again, we have seen this before. Remember 2013, when Google moved its search traffic from HTTP to HTTPS?
HTTP to HTTPS
The change was prompted by the Edward Snowden leaks, which showed that US government agencies were intercepting information about individual web searches in transit. Google was legitimately not happy about that, so it made search traffic secure. None of that data was available to be picked up by a government agency anymore, but it was also no longer available to digital analytics tools. If you were an analyst in the early 2010s, the organic search data in Google Analytics, Omniture, or Webtrends was really robust and powerful. It was a critical component of successful SEO, as well as of things like content management and even inventory planning. All of it disappeared in less than a year.
All of our clients at that time completely freaked out, wondering what they were going to do, because this was critical business information. Our response was to have a conversation with those clients: "This is what you used to have. This is what you no longer have, and will never have again. Of the data that used to be available to you, what were the critical business decisions, in terms of reporting, planning, and site optimization, that you needed it for? How can we reconstitute that data in a different way, with a different approach, that gets back certain types of reporting? And what can we do to help you reach those decision points differently?" Because, again, some things are recoverable and some things are not, and that SEO keyword data is gone.
It happened again a few years later with what I call 'The Rise of the Tag Management System (TMS)'.
The Rise of the Tag Management System
By the early 2010s, most businesses had implemented a web analytics tool of record. That tool had barely been touched; it was relatively accurate, not super robust, but it would always work the same way.
When every organization decided to add a TMS to its ad tech and martech infrastructure, they had to take all of their direct deployments off the site, including web analytics, and redeploy them through the tag manager. I do not know a single company that did this properly, because the technology was brand new and people were moving at speed. Fundamentally, every single web analytics deployment either broke or changed in such a way that historical data analysis became difficult. Forget the impact that had on ad tech deployments for things like AdWords, ad serving, and email; it also created huge problems for every digital marketing organization. All of our clients freaked out and called us asking what to do, because all the new stuff was making easy things hard or impossible.
This was even scarier for Napkyn, because we do so much data analysis that a number of customers asked: if their data was no longer accurate, why should they hire a world-class analytics firm? Our response was to build our implementation and technology organization, which is now one of the things we are really well known for, to help clients through something similar to what we did for SEO: What do you have? What do you want? What does 'properly' look like? Then let's get things set up properly in your tag manager so that you are more accurate and better able to make changes moving forward.
Those were two end-of-the-world moments for people working with digital, ad tech, and martech data. We got through them with the same approach, and now we are going through another major set of changes that, again, people are generally not prepared for. So, how do we deal with these changes?
Four Data Foundations
The first big thing that needs to be considered is the act of data creation, otherwise known as data foundations.
If your tools, your infrastructure, and the availability of data are changing, then to stay agile and keep up with those changes you cannot write bespoke custom code for every single thing you want to do. You need proper foundations.
- Data layer. We have done a number of talks on data layer best practices, what they are, and how they work, which can be found in the Napkyn blogs. It is crucial to create data properly, once, so that it stays decoupled from the user experience and is available to every pixel that needs it. This is especially important when dealing with the cookieless world and preparing for what is next.
- Server-side tagging. It is a great, robust way to take certain types of data, primarily engagement data rather than visit data, and make it available in a way that will not break, is accurate, and will scale.
- Best-practice tag management. This is absolutely critical. Even though tag managers created a huge problem in the industry when they first came out, they are really important for getting ad tech and martech work done properly. Best practices in process, documentation, and alignment between IT and business users are critical not just for dealing with the cookieless world, regulatory compliance, and changes to advertising, but equally for time to value. A change is coming, and we need to be prepared for it as quickly and as properly as possible.
- Treating data engineering, and the creation and management of data, like software development, not old-school web development. It needs to be treated like high-end software development because, at this point, that is what it is. We have hundreds of millions of dollars flowing through these systems, yet very few organizations are getting accurate reporting and targeting the right advertising in the right ways at the right times. That is unacceptable. Things need to be documented, governed, and version controlled. These are the kinds of best practices on the technology side that will let you quickly manage current issues and stay iterative in dealing with upcoming ones.
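The data layer idea in the first bullet can be sketched in a few lines. This is a minimal, hypothetical example; the event names, fields, and helper functions are illustrative assumptions, not any vendor's schema (though the shape resembles the common `window.dataLayer` pattern used by tag managers). Page code records a structured fact once, and any tag that needs it reads from the same shared source instead of scraping the user experience.

```typescript
// Minimal data layer sketch. All names and fields here are illustrative
// assumptions, not a specific vendor's schema.

type DataLayerEvent = Record<string, unknown>;

// A single, shared queue of structured facts about what happened on the page.
const dataLayer: DataLayerEvent[] = [];

// Page code describes the business event once, with no vendor-specific code.
function trackPurchase(orderId: string, value: number, currency: string): void {
  dataLayer.push({ event: "purchase", order_id: orderId, value, currency });
}

// Any pixel or tag reads the same decoupled data, so swapping or redeploying
// a vendor tag does not require re-instrumenting the page.
function eventsOfType(name: string): DataLayerEvent[] {
  return dataLayer.filter((e) => e.event === name);
}

trackPurchase("A-1001", 59.99, "CAD");
```

Because every tag consumes the same structured source, a change on the vendor side becomes a tag-manager configuration change rather than a site rebuild, which is exactly the agility the cookieless transition demands.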
Can’t wait for Part 4? Watch the full recording.