Why Every Side Should Want Log-Level Data

I’ve been feeling a little reminiscent lately, so I thought I’d dust off a post I wrote a year ago. I hope you enjoy it!

    A (fairly) new episode of ‘The Ad Platform: eMarketer Podcast’ tackled the question ‘Why the Buy Side Wants Log-Level Data.’ So I ask you the question: why would the buy side NOT want log-level data? Log files are fantastic. Web server logs, application logs. You name it! But web server logs have an especially big spot in my heart.

    Why, you ask, are you such a fanatic about web server logs? (And I say) ‘Only because they were the proverbial dinosaur egg that spawned WEB ANALYTICS!’

    I won’t go into the (not-so) long history of web analytics. But I could :)


    A little something I put together as part of a Meetup I did a while back.

    (Kind of) Long before there was ‘tagging,’ people looked to the web server logs to figure out what was going on with their websites. Sure, in the beginning (the early 1990s), analyzing these logs had more to do with understanding the actual technical performance of a site, a task that was done by the IT folks.

    But if you’ve ever looked in a web server log, you know they are a treasure trove of useful information. And, sure, they are technically unstructured data. But that shouldn’t stop you. There are programs you can use to make sense of the information. Back in the day, at ClickTracks, the preferred method was to grep the file in Cygwin. But as an Excel purist, I chose to open them in Excel, structure them using ‘text to columns,’ and then slice and dice the data by sorting on fields like IP, then date/time, etc. A poor man’s rudimentary log file analysis, one made fun of mercilessly by the more technically adept :)
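
    For the curious, here is roughly what that poor man’s analysis looks like in code. This is a minimal sketch, not ClickTracks’ method: it assumes an NCSA combined-format access log named access.log (the file name and format are my assumptions) and simply sorts hits by IP and then timestamp, the same slice-and-dice the ‘text to columns’ trick does in Excel.

    ```python
    import re
    from datetime import datetime

    # Parse an NCSA combined-format access log and sort hits by IP, then
    # timestamp (the same thing the Excel 'text to columns' + sort trick does).
    LINE_RE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
        r'(?: "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)")?'
    )

    def parse_line(line):
        m = LINE_RE.match(line)
        if not m:
            return None                      # skip malformed lines
        hit = m.groupdict()
        # timestamp looks like 10/Oct/2000:13:55:36 -0700
        hit["ts"] = datetime.strptime(hit["ts"], "%d/%b/%Y:%H:%M:%S %z")
        return hit

    with open("access.log") as f:
        hits = [h for h in (parse_line(line) for line in f) if h]

    hits.sort(key=lambda h: (h["ip"], h["ts"]))   # visitor, then time
    for h in hits[:20]:                           # peek at the first few rows
        print(h["ip"], h["ts"].isoformat(), h["request"], h["referrer"])
    ```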

    The beauty of the log file was manifold. First, if someone were to question their reports in ClickTracks, I could open their web server log file, do my (not-so) fancy-schmancy analysis in Excel, and show the same results.

    If they questioned why data was not showing up in their reports, I could show them that the data was not being captured by the default logging. Back then, disappointingly, the referrer was not logged by default, so you would have to go into the IIS Manager and add it. It was easy to do, but it was not retroactive, so that was a bummer. Sorry, just an aside. Back to the point.

    Super smart people realized the value of the information in the log files to marketing departments. They created ways to automatically parse the logs and populate reports. Taa-daa. Web Analytics.

    Even smarter people realized how to visualize this data in ways that normal people could understand and use.

    I could show other screenshots, but I may have a bias. Early-day segmentation, path visualization, and funnel reporting.




    So now we come to the buy side request. Again, I won’t go into the long history of analytics evolving into attribution (both view-through and click), but I was pretty stoked when I learned that you could go beyond last-click attribution by stitching together lines of the log file (which, by this time, was created either by the web server or by a JavaScript tag when someone got to your site). ‘What could be better?’ I thought.
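
    To make ‘stitching’ concrete, here is a minimal sketch (field names like visitor_id and source are my assumptions; in practice they would come from the server log or the tag’s payload, tied together by a cookie or similar). It groups hits into per-visitor paths and, as one illustrative alternative to last-click, splits a conversion’s credit evenly across every touchpoint in the path.

    ```python
    from collections import defaultdict

    # Toy hits 'stitched' from log/tag lines. visitor_id, source and converted
    # are assumed fields, not any particular vendor's schema.
    hits = [
        {"visitor_id": "v1", "ts": 1, "source": "display",     "converted": False},
        {"visitor_id": "v1", "ts": 2, "source": "email",       "converted": False},
        {"visitor_id": "v1", "ts": 3, "source": "paid_search", "converted": True},
        {"visitor_id": "v2", "ts": 1, "source": "organic",     "converted": False},
    ]

    # Stitch: group each visitor's hits in time order.
    paths = defaultdict(list)
    for hit in sorted(hits, key=lambda h: (h["visitor_id"], h["ts"])):
        paths[hit["visitor_id"]].append(hit)

    # Credit: linear attribution as one simple alternative to last-click.
    credit = defaultdict(float)
    for path in paths.values():
        if not any(h["converted"] for h in path):
            continue                      # no conversion on this path, no credit
        share = 1.0 / len(path)           # split the conversion evenly
        for h in path:
            credit[h["source"]] += share

    print(dict(credit))   # each of v1's three touchpoints gets a third
    ```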

    Then I heard the podcast. You can go higher in the funnel. There are companies that know what happens when someone does NOT get to your website: the supply-side platform (SSP) and demand-side platform (DSP) vendors.

    The SSPs work with sites and networks that have space to show ads (inventory). The DSPs work with advertisers who have the need (demand) to show ads somewhere. And the pricing is based on bids (an oversimplification; there are other factors).

    Their log files uncover clues about what is NOT working in bidding and bid strategy. Without this information, advertisers would never know (beyond guessing and inferring) why certain strategies work sometimes and not other times.
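
    As a hedged illustration of what that looks like in practice, here is a tiny sketch over a made-up bid log (the field names and numbers are hypothetical, not any vendor’s actual schema). With a row per bid, you can see each strategy’s win rate and how far its losing bids fell below the clearing price, exactly the kind of clue an aggregate report hides.

    ```python
    from collections import defaultdict

    # Hypothetical DSP bid log: one row per bid. Field names are assumptions.
    bids = [
        {"strategy": "A", "bid": 2.10, "won": True,  "clearing_price": 1.80},
        {"strategy": "A", "bid": 1.20, "won": False, "clearing_price": 1.75},
        {"strategy": "B", "bid": 0.90, "won": False, "clearing_price": 1.60},
        {"strategy": "B", "bid": 0.95, "won": False, "clearing_price": 1.40},
    ]

    stats = defaultdict(lambda: {"bids": 0, "wins": 0, "gaps": []})
    for b in bids:
        s = stats[b["strategy"]]
        s["bids"] += 1
        if b["won"]:
            s["wins"] += 1
        else:
            # how far below the clearing price the losing bid landed
            s["gaps"].append(b["clearing_price"] - b["bid"])

    for strategy, s in sorted(stats.items()):
        win_rate = s["wins"] / s["bids"]
        avg_gap = sum(s["gaps"]) / len(s["gaps"]) if s["gaps"] else 0.0
        print(f"strategy {strategy}: win rate {win_rate:.0%}, avg losing gap ${avg_gap:.2f}")
    ```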

    Aha. This answers the question. This is why the buy side is demanding log-level data. But here is the rub. Just like in the early days, these log files have the same issues: a lack of understanding, a lack of people who are comfortable parsing and analyzing these files (although I would argue there are TONS more people comfortable with the task these days), and a bigger problem, data not being logged.

    So it is very heartening to see that log file history is repeating itself. There are efforts to standardize how this data is logged, named, and defined. And these log files are starting to become productized.

    Life, for a log file enthusiast, just keeps getting better.
