Applying Moore’s Law to Analytics: Are We Getting Better, Smarter, Faster? Or Exponentially Dumber?
February 24, 2021
Moore’s Law is the principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain. (Definition from Oxford Languages).
Computers are getting better, smarter, faster (and cheaper). But does that concept apply to analytics? Well, if we think about analytics like computers, then maybe... maybe not... certainly not cheaper. If the transistor powers the computer, then data powers analytics. And there is no doubt exponentially more data being generated today.
Think of the history of web analytics data. In the beginning there were web server logs. They had basic information, very basic. By default, IIS logs did not even capture referrers. It was easy to enable logging for that field once you realized it was missing, but you had lost all of the data up until the day you did.
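As a sketch of what working with those logs looked like, here is a minimal parser for an IIS-style W3C extended log line. The field layout below is hypothetical (real IIS configurations vary), but it illustrates the point: the referrer only shows up if `cs(Referer)` was in your configured fields.

```python
# Minimal sketch: parse an IIS-style W3C extended log line.
# This #Fields layout is hypothetical; real IIS configs vary.
FIELDS = ["date", "time", "c-ip", "cs-method", "cs-uri-stem", "cs(Referer)"]

def parse_log_line(line: str) -> dict:
    """Split a space-delimited W3C log line into a field -> value dict."""
    values = line.split(" ")
    return dict(zip(FIELDS, values))

entry = parse_log_line(
    "2021-02-24 10:15:00 203.0.113.7 GET /pricing https://example.com/blog"
)
# The referrer is only present for days on which logging it was enabled.
print(entry["cs(Referer)"])
```

If the field was never enabled, that column simply is not in the file, and no amount of parsing after the fact can recover it.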
But then, aha, companies realized they could append tracking parameters to the URLs of their online ads, and thus they controlled the data: were the dollars they were spending on those ads actually generating revenue on their site?
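Those appended parameters are what we now know as campaign tags. A minimal sketch, assuming the common `utm_*` naming convention, shows how they can be pulled back off a landing URL:

```python
# Minimal sketch: extract campaign-tracking parameters from an ad's
# landing URL. Parameter names follow the common utm_* convention.
from urllib.parse import urlparse, parse_qs

def campaign_params(url: str) -> dict:
    """Return the utm_* query parameters from a URL as a flat dict."""
    query = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in query.items() if k.startswith("utm_")}

url = "https://example.com/sale?utm_source=newsletter&utm_campaign=spring&ref=abc"
print(campaign_params(url))
# {'utm_source': 'newsletter', 'utm_campaign': 'spring'}
```

Because the advertiser chooses the parameter values, they control what the data says about where a visit came from, which is exactly the control the ad-tagging approach bought them.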
The picture was becoming more complete. So was the complexity. A ChiefMarTec article showed that the number of solutions dealing with this data grew from fewer than 20 in 2011 to over 1,200 in 2020.
Oh wait, there’s more! We’re just talking about online data, and that is not the complete story, is it? Companies, both B2C and B2B (then B2B2C and D2C :) wanted to combine their offline data with their online data. CRM, POS, CDP, oh my!
This adds a whole new layer of complexity. Now you have to verify the data is accurate and consistent, and invariably you will have to do some cleansing and normalizing of some, maybe all, of your data sets.
And then, the fun part – harmonization. Putting it all together so it’s useful – finding the common ‘keys’ to join the data – then verifying, cleansing and normalizing once again. But now you are off to the races!
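The whole cleanse-normalize-join cycle can be sketched in a few lines. This is an illustrative toy, not anyone's production pipeline: it assumes email is the common key (a hypothetical choice) and that trimming and lowercasing are enough normalization.

```python
# Minimal sketch of harmonization: normalize a common key, then merge
# online and offline records on it. Email as the join key and the
# specific records here are illustrative assumptions.
online = [{"email": "Pat@Example.com", "pageviews": 12}]
offline = [{"email": "pat@example.com ", "store_purchases": 2}]

def normalize_key(record: dict) -> str:
    # Cleansing/normalizing: trim whitespace and lowercase so that
    # "Pat@Example.com" and "pat@example.com " join as one person.
    return record["email"].strip().lower()

def harmonize(*datasets: list) -> dict:
    """Merge records from all datasets, keyed by the normalized email."""
    joined: dict = {}
    for dataset in datasets:
        for rec in dataset:
            joined.setdefault(normalize_key(rec), {}).update(rec)
    return joined

print(harmonize(online, offline))
```

Without the key normalization step, the two records would land in separate rows, which is why verifying and cleansing come before, and again after, the join.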
We’ve talked about the evolution of data over, say, the last decade. But data needs humans (or at least humans and machines) to really be useful. What does that look like?
In his 2008 book, ‘Outliers,’ Malcolm Gladwell argued that if a person had the grit to spend 10,000 hours focused on their profession, hobby, or talent, they would become an expert in that field. For our purposes (and simple math), let’s translate 10,000 hours to 10 years, which works out to about 1,000 hours, or 20 hours a week. If you have been doing the same things for the last ten years, do you qualify as an analytics expert?
Sorry to say, but if you are still doing the same old thing you were doing a decade ago, you are probably very good at what you do, but you would not be considered an analytics expert. The role has expanded and atomized into many different roles and fields of expertise, ranging from data scientists to data quality experts to data governance specialists.
In future posts, we’ll meet some of these folks. But in the meantime, take comfort in knowing that we, in analytics, are somewhat better, faster, and smarter, and certainly not exponentially dumber.
Napkyn Analytics offers a wide range of services to both brands and agencies to help understand and address data-centric measurement challenges, including:
– First-Party Data Strategy
– Strategy & Frameworks
– Privacy Assessment & Remediation
– Google Analytics 4
Contact Napkyn for a no-charge 1-hour Q&A with one of our experts on how to approach and create a sustainable first-party data-centric strategy.