Measuring for Success: Analytics for Site Updates, Upgrades, and Redesigns

by Nick Bennett

As organizations come under increasing pressure to provide enhanced digital experiences, investment in updates, upgrades, and redesigns of site content is ramping up in pursuit of success and competitive differentiation. Measuring the impact of these changes requires a consistent measurement strategy.

A good measurement strategy needs time.  Time to review the details of changes being made to the site, time to understand new user flow, time to plan for consistently measuring the launch, and time to understand how success will be measured for this initiative. Unfortunately, we often find ourselves in this scenario:

Part 1: WTF

I was in an account planning meeting a couple of weeks ago when a member of the team raised a concern about one of our clients. The team had just been told that the client planned to launch a major upgrade to their primary site, and asked if we could help with measurement. So far, nothing out of the ordinary: we are an analytics consultancy, after all… Then came the kicker: the site update was scheduled to launch in three weeks.

That’s right: a full site redesign (clearly in the works for some time), complete with new features, user flows and capabilities, to be launched in less than a month. Could we help them clone (read ‘copy/paste’) the existing measurement from the current site over to the new one and QA/validate functionality before the launch?

Unfortunately, this is just one example of an all-too-familiar story, all the more so as industries face increasing pressure for digital transformation. Organizations are focused (and rightly so) on improving their clients’ online experience, providing ever richer content and features. Development teams are under pressure to deliver major updates quickly in service of this goal.

What’s the problem, you might ask? The fundamentals of deploying digital measurement aren’t that complicated, are they? After all, it’s just a few tags… right?

Well, as with so many of these questions, the answer is ‘yes and no’.  Ultimately, it’s true that measuring things using digital analytics isn’t that complicated.  However, measuring the right things on purpose and with consistency across a major change in experience is a different story altogether, and involves quite a bit more than just deploying a few tags and hoping for the best.

We’re going to be able to help our client, of course. But there are a number of opportunities to generate real insights that will be missed simply because of the timelines involved. In the end, we’ll be able to do a great job of instrumenting this new site; it just might not be there right at launch.

Part 2: Retrospective: How It Could Have Gone Better

So what would we prefer to have seen in this scenario?

First and foremost, a good measurement strategy needs time.  Time to review the details of changes being made to the site, time to understand new user flow, time to plan for consistently measuring the launch, and time to understand how success will be measured for this initiative.

Organizations don’t make the investment in a major site upgrade without a need or a goal.  Development resources cost money and every launch carries risk.  Good measurement will help us to understand how well a site relaunch or any other significant effort is doing at addressing these needs or meeting these goals, but only if we understand what they are and agree on what measures best represent them.

Measurement deployment and instrumentation are often not the primary focus, or an area of expertise, for development teams working on new features or redesigns. These teams can benefit from engagement with analytics experts while development is ongoing, making the deployment of instrumentation, as well as the necessary underpinnings (like data layers), a part of the overall development process (but that’s a whole different topic). This also gives the analytics experts additional time for QA and remediation of any issues that are found along the way.
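
For teams using a tag management system such as Google Tag Manager, the data layer is often the natural hand-off point between developers and analysts. As a minimal sketch (assuming GTM’s standard window.dataLayer convention; the event and parameter names are hypothetical), instrumenting a step in a new sign-up flow might look like this:

```typescript
// Minimal sketch of a data layer push, assuming Google Tag Manager's
// standard window.dataLayer convention. The event and parameter names
// ('sign_up_step', step_name, flow_version) are hypothetical examples.
declare global {
  interface Window {
    dataLayer?: Record<string, unknown>[];
  }
}

export function trackSignUpStep(stepName: string, flowVersion: string): void {
  // Create the array if the tag manager snippet hasn't loaded yet.
  const dataLayer = (window.dataLayer = window.dataLayer ?? []);
  dataLayer.push({
    event: 'sign_up_step',      // hypothetical event name
    step_name: stepName,        // e.g. 'enter_email'
    flow_version: flowVersion,  // lets reports distinguish old vs. new flows
  });
}
```

The value of agreeing on a structure like this during development, rather than after launch, is that the analytics team can map tags to it and QA them before the new experience goes live.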

The second area that benefits from additional time is transition reporting. That is: what happens in the minutes, hours, days and weeks following the launch? One of the first questions that is going to get asked (and it will get asked repeatedly) once the launch is underway is ‘How are we doing?’. The quality and sophistication of the answers will depend on data, but early indicators of success (or failure) can help an organization respond quickly to either opportunity or risk. In the end, this is the purpose of data collection and analytics in the first place: helping organizations and stakeholders use data to make informed decisions.
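
As one illustration of what an early ‘How are we doing?’ check can look like in practice, the minimal sketch below assumes a GA4 property and the official Node client (@google-analytics/data); the property ID, launch date and choice of metric are placeholders, and a real transition report would compare these figures against agreed pre-launch baselines.

```typescript
// Sketch of a simple post-launch trend pull, assuming a GA4 property and
// the official Node client (@google-analytics/data). The property ID,
// launch date and metric are placeholders; real transition reporting
// would compare these figures against a pre-launch baseline.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();

async function dailySessionsSinceLaunch(propertyId: string, launchDate: string): Promise<void> {
  const [response] = await client.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: launchDate, endDate: 'today' }],
    dimensions: [{ name: 'date' }],
    metrics: [{ name: 'sessions' }],
  });

  for (const row of response.rows ?? []) {
    console.log(row.dimensionValues?.[0]?.value, row.metricValues?.[0]?.value);
  }
}

// Placeholder property ID and launch date.
dailySessionsSinceLaunch('123456789', '2024-01-15').catch(console.error);
```

Keeping a pull like this running daily through the transition window gives stakeholders a simple, shared view of the trend while deeper analysis catches up.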

Proper planning for the launch transition allows analysts to set appropriate expectations with the business around this change. There’s often a brief negative impact on performance from major site updates: people are surprised by change, and this can be reflected in performance in those first moments. It usually lasts only a short time, after which the real impact of the changes can be properly measured.

Part 3: Best Practices

So when should measurement and analytics become a part of a redesign initiative? Too early in the process, and too many things are in flux; we risk the dreaded analysis paralysis, with endless meetings talking about too many possibilities and going in circles. Too late, and we miss the chance to put together meaningful and deliberate measurement of the initiative’s success.

In our experience, knowing as soon as possible that a redesign initiative is happening allows us to plan accordingly and make sure we’re asking the right questions and having the right conversations. From kick-off until a certain point in the design cycle, though, that’s really all we need.

Once a client is at the point where design comps are more or less finalized, it’s time for us to get more involved. At this stage there are really two parts to our work. First, we review those design comps to understand new features, user flows, and design elements that could benefit from instrumentation, and we work to align recommendations for any new measurement with what’s already deployed. This helps us ensure consistency across the launch, and makes sure that comparative reporting (year over year, vs. previous period) will make sense in the future, regardless of the launch dates; a simple sketch of what that alignment can look like follows below.
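
One lightweight way to keep that alignment explicit is to maintain, as part of the measurement plan, a mapping between the events deployed on the current site and the names proposed for the redesign. The sketch below uses hypothetical names (the proposed names happen to follow GA4’s recommended event naming), but the idea is simply to make the bridge across the launch visible:

```typescript
// Hypothetical mapping between events deployed on the current site and
// the names proposed for the redesign, kept alongside the measurement
// plan so that reporting can bridge the launch. All names are examples.
interface EventMapping {
  currentEvent: string;   // name deployed on the existing site
  redesignEvent: string;  // name proposed for the new experience
  notes?: string;
}

export const eventMappings: EventMapping[] = [
  { currentEvent: 'lead_form_submit', redesignEvent: 'generate_lead',
    notes: 'align with GA4 recommended event naming' },
  { currentEvent: 'pdf_download', redesignEvent: 'file_download',
    notes: 'new site consolidates all document downloads' },
  { currentEvent: 'video_play', redesignEvent: 'video_start' },
];
```

A table like this also doubles as a checklist for post-launch QA and for annotating reports where naming changed.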

Second is a discussion with the client about success measures. This can (and usually does) differ from tactical measurement of updated site content or new features, and focuses more on why the launch is being done in the first place. What outcomes is the business looking to achieve through this investment? What additional tactics are accompanying the launch? The goal of these discussions is to understand how to report on the transition across the launch, and also to determine any longer-term updates, revisions or adjustments to the measurement strategy that should be contemplated.

Getting measurement right for the initiative at hand is important, but it’s equally valuable to understand how any given launch or feature update plays into longer-term goals. If the development team is going to open the hood on the site code, there’s always an opportunity to look at upgrades to measurement that can be accommodated at the same time.

Some key considerations to keep in mind as you plan for a site update:

  1. Define how you will measure the success of your initiatives
  2. Consider how site updates provide opportunities to upgrade your measurement capabilities
  3. Involve your analytics team once design comps are near final
  4. Factor time for implementation and validation of measurement in your timelines
  5. Work with business leaders to appropriately set expectations about transition performance
  6. Upon launch, monitor, interpret and act using data
  7. Document and annotate key changes to ensure continuity in future analysis

Conclusion

Ultimately, the very best organizations and best partnerships for us occur when measurement and analytics are treated as a first-class source of requirements for any launch or upgrade rather than an afterthought or something that’s bolted on at the last minute.  The goal of our client partnerships is to bring an organization to this level of maturity.  Think of it this way:  If these site updates and feature launches are intended to drive success for a business, then measuring the impact of these efforts with the right analytics is how you know if you’re succeeding or not.

Napkyn Analytics offers a wide range of services to both brands and agencies to help understand and address data-centric measurement challenges, including:

– First-Party Data Strategy
– Strategy & Frameworks
– Privacy Assessment & Remediation
– Google Analytics 4

Contact Napkyn for a no-charge 1-hour Q&A with a Napkyn expert on how companies should approach and create a sustainable first-party data-centric strategy.

Nick Bennett

Chief Technologist

With more than 20 years of experience, Nick Bennett is a skilled senior software engineer, digital business consultant and noteworthy leader in the digital data space. As Chief Technologist, Nick works with enterprise and big-brand companies, leading the development of Napkyn's capabilities and services in enterprise digital transformation, sophisticated analytics technology implementations, data integration, and digital data analysis.
