In Defense of “Web Reporting”

Avinash’s last post attempted to describe The Difference Between Web Reporting and Web Analysis. While I have some quibbles with the core content of the post — the difference between reporting and analysis — I take real issue with the general tone that “reporting = non-value-add data puking.”

I’ve always felt that “web analytics” is a poor label for what most of us who spend a significant amount of our time with web behavioral data do day in and day out. I see three different types of information-providing:

  • Reporting — recurring delivery of the same set of metrics as a critical tool for performance monitoring and performance management
  • Analysis — hypothesis-driven ad hoc assessment geared towards answering a business question or solving a business problem (testing and optimization fall into this bucket as well)
  • Analytics — the development and application of predictive models in the support of forecasting and planning

My dander gets raised when anyone claims or implies that our goal should be to spend all of our time and effort in only one of these areas.

Reporting ≠ (Necessarily) Data Puking

I’ll be the first person to decry reporting squirrel-age. I expect to go to my grave in a world where there is still all too much pulling and puking of reams of data. But (or, really, BUT, as this is a biggie), a wise and extremely good-looking man once wrote:

If you don’t have a useful performance measurement report, you have stacked the deck against yourself when it comes to delivering useful analyses.

It bears repeating, and it bears repeating that dashboards are one of the most effective means of reporting. Dashboards done well (and none of the web analytics vendors provide dashboards well enough to use their tools as the dashboarding tool) meet a handful of dos and don’ts:

  • They DO provide an at-a-glance view of the status and trending of key indicators of performance (the so-called “Oh, shit!” metrics)
  • They DO provide that information in the context of overarching business objectives
  • They DO provide some minimal level of contextual data/information as warranted
  • They DON’T exceed a single page (single eyescan) of information
  • They DON’T require the person looking at them to “think” in order to interpret them (no mental math required, no difficult assessment of the areas of circles)
  • They DON’T try to provide “insight” with every updated instance of the dashboard
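
The "no mental math" rule is easy to make concrete: compute each metric's status against its target before the dashboard is rendered, so the reader sees a flag rather than doing the comparison in their head. A minimal Python sketch; the metric names, targets, and thresholds below are invented for illustration, not taken from any actual dashboard:

```python
# Precompute an at-a-glance status for each KPI so the dashboard shows
# a flag instead of raw numbers the reader must compare in their head.
# All metric names, targets, and band widths here are hypothetical.

def kpi_status(current, target, warn_band=0.10):
    """Return 'OK', 'WATCH', or 'OH SHIT' for a metric vs. its target."""
    if target == 0:
        raise ValueError("target must be nonzero")
    ratio = current / target
    if ratio >= 1.0 - warn_band:        # within 10% of target (or above)
        return "OK"
    if ratio >= 1.0 - 2 * warn_band:    # within 20% of target
        return "WATCH"
    return "OH SHIT"                    # the so-called "Oh, shit!" flag

# Illustrative weekly numbers: (actual, target)
weekly = {
    "visits": (4200, 4000),
    "registrations": (95, 150),
    "reg_completion_rate": (0.55, 0.60),
}

statuses = {name: kpi_status(a, t) for name, (a, t) in weekly.items()}
```

The point of the sketch is only that the arithmetic happens upstream; on the page, the reader sees the flag and the trend, nothing to compute.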

The last item in this list uses the “i” word (“insight”) and can launch a heated debate. But, it’s true: if you’re looking for your daily, weekly, monthly, or real-time-on-demand dashboard to deliver deep and meaningful insights every time someone looks at it, then either:

  • You’re not clear on the purpose of a dashboard, OR
  • You count “everything is working as expected” as a deep insight

Below is a perfectly fine (I’ll pick one nit after the picture) dashboard example. It’s for a microsite whose primary purpose is to drive registrations to an annual user conference for a major manufacturer. It is produced weekly, and it is produced in Excel, using data from SiteCatalyst, Twitalyzer, and Facebook. Is this a case of, as Avinash put it, us being paid “an extra $15 an hour to dump the data into Excel and add a color to the table header?” Well, maybe. But, by using a clunky SiteCatalyst dashboard and a quick glance at Twitalyzer and Facebook, the weekly effort to compile this is 15 minutes. Is it worth $3.75 per week to get this? The client has said, “Absolutely!”

I said I would pick one nit, and I will. The example above does not do a good job of really calling out the key performance indicators (KPIs). It does, however, focus on the information that matters — how much traffic is coming to the site, how many registrations for the event are occurring, and what the fallout looks like in the registration process. Okay…one more nit — there is no segmentation of the traffic going on here. I’ll accept a slap on the wrist from Avinash or Gary Angel for that — at a minimum, segmenting by new vs. returning visitors would make sense, but that data wasn’t available from the tools and implementation at hand.

An Aside About On-Dashboard Text

I find myself engaged in regular debates as to whether our dashboards should include descriptive text. The “for” argument goes much like Avinash’s implication that “no text” = “limited value.” The main beef I have with any standardized report or dashboard that includes a text block is that, when baked into a design, it assumes there is roughly the same amount to say each time the report is delivered. That isn’t my experience. In some cases, there may be quite a few key callouts for a given report…and the text area isn’t large enough to fit them all in. In other cases, in a performance monitoring context, there might not be much to say at all, other than, “All systems are functioning fine.” Invariably, when the latter occurs, in an attempt to fill the space, the analyst is forced to simply describe information already effectively presented graphically. This doesn’t add value.

If a text-based description is warranted, it can be included as companion material. <forinstance> “Below is this week’s dashboard. If you take a look at it, you will, as I did, say, ‘Oh, shit! We have a problem!’ I am looking into the [apparent calamitous drop] in [KPI] and will provide an update within the next few hours. If you have any hypotheses as to what might be the root cause of [apparent calamitous drop], please let me know.” </forinstance> This does two things:

  1. Enables the report to be delivered on a consistent schedule
  2. Engages the recipients in any potential trouble spots the (well-formed) dashboard highlights, and leverages their expertise in understanding the root cause
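
That delivery pattern is straightforward to support with a small helper: return no commentary at all when nothing has crossed a threshold, and a short alert note when something has. A hedged sketch in Python; the threshold value and the wording are illustrative, not a prescribed implementation:

```python
def companion_note(kpi_deltas, threshold=0.20):
    """Build optional companion text for a scheduled dashboard delivery.

    kpi_deltas maps a KPI name to its week-over-week fractional change.
    Returns None when nothing crosses the (hypothetical) threshold --
    no forced filler text -- otherwise a short note naming the KPIs
    that moved sharply, inviting hypotheses from the recipients.
    """
    flagged = [name for name, delta in kpi_deltas.items()
               if abs(delta) >= threshold]
    if not flagged:
        return None  # dashboard goes out on schedule, with no commentary
    return ("Below is this week's dashboard. Looking into the sharp move in "
            + ", ".join(sorted(flagged))
            + "; update to follow. Hypotheses on root cause welcome.")
```

The report itself ships every week regardless; the text only exists when the (well-formed) dashboard actually has something to flag.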

Which…gets us to…

Analysis

Analysis, by [my] definition, cannot be something that is scheduled/recurring/repeating. Analysis is hypothesis-driven:

  • The dashboard showed an unexpected change in KPIs. “Oh, shit!” occurred, and some root cause work is in order
  • A business question is asked: “How can we drive more Y?” Hypotheses ensue

If you are repeating the same analysis…you’re doing something wrong. By its very nature, analysis is ad hoc and varies from one analysis to the next.

When it comes to the delivery of analysis results, the medium and format can vary. But, I try to stick with two key concepts — both of which are violated multiple times over in every example included in Avinash’s post:

  • The principles of effective data visualization (maximize the data-pixel ratio, minimize the use of a rainbow palette, use the best visualization to support the information you’re trying to convey, ensure “the point” really pops, avoid pie charts at all costs, …) still need to be applied
  • Guy Kawasaki’s 10-20-30 rule is widely referenced for a reason: violate it if needed, but be strongly biased against doing so (aka, slideuments are evil)

While I am extremely wordy on this blog, and my emails sometimes tend in a similar direction, my analyses are not. When it comes to presenting analyses, analysts are well-served to learn from the likes of Garr Reynolds and Nancy Duarte when it comes to communicating effectively. It’s sooooo easy to get so caught up in our own brilliant writing that we believe every word we write is being consumed with equal care (you’re on your third reading of this brilliant blog post, are you not? No doubt trying to figure out which paragraph most deserves to be immortalized as a tattoo on your forearm, right? You’re not? What?!!!). “Dumb it down” sounds like an insult to the audience, and it’s not. Whittle, hone, remove, repeat. We’re not talking hours and hours of iterations. We’re talking about simplifying the message and breaking it up into bite-sized, consumable, repeatable (to others) chunks of actionable information.

Analysis Isn’t Reporting

Analysis and reporting are unquestionably two very different things, but I don’t know that I agree with assertions that analysis requires an entirely different skillset from reporting. Meaningful reporting requires a different mindset and skillset from data puking, for sure. And, reporting and analysis are two different things, but you can’t be successful with the latter without being successful with the former.

Effective reporting requires a laser focus on business needs and business context, and the ability to crisply and effectively determine how to measure and monitor progress towards business objectives. In and of itself, that requires some creativity — there are seldom available metrics that are perfectly and directly aligned with a business objective.

Effective analysis requires creativity as well — developing reasonable hypotheses and approaches for testing them.

Both reporting and analysis require business knowledge, a clear understanding of the objectives for the site/project/campaign/initiative, a better-than-solid understanding of the underlying data being used (and its myriad caveats), and effective presentation of information. These skills make up the core of a good analyst…who will do some reporting and some analysis.

What About Analytics?

I’m a fan of analytics…but see it as pretty far along the data maturity continuum. It’s easy to pooh-pooh reporting by pointing out that it is “all about looking backwards” or “looking at where you’ve been.” But, hey, those who don’t learn from the past are condemned to repeat it, no? And, “How did that work?” or “How is that working?” are totally normal, human, helpful questions. For instance, say we did a project for a client that, when it came to the results of the campaign from the client’s perspective, was a fantastic success! But, when it came to what it cost us to deliver the campaign, the results were abysmal. Without an appropriate look backwards, we very well might do another project the same way — good for the client, perhaps, but not for us.

In general, I avoid using the term “analytics” in my day-to-day communication. The reason is pretty simple — it’s not something I do in my daily job, and I don’t want to put on airs by applying a fancy word to good, solid reporting and analysis. At a WAW once, I actually heard someone say that they did predictive modeling. When pressed (not by me), it turned out that, to this person, that meant, “putting a trendline on historical data.” That’s not exactly congruent with my use of the term analytics.

Your Thoughts?

Is this a fair breakdown of the work? I scanned through the comments on Avinash’s post as of this writing, and I’m feeling as though I am a bit more contrarian than I would have expected.


Comments

  1. Tim,
    I like to put it this way:
    Analysis is a process and reporting is an outcome/output of analysis.

    If people are piling on reporting and questioning its efficacy, what they’re really doing (whether or not they know it) is challenging the quality of the analysis behind the report.

    Analysis may be light-ish (performance monitoring) or heavy (hypothesis-driven), but the quality of the analysis drives the quality of its output (the report).

  2. Tim, this more or less aligns with how I’ve organized my team at HP. I have four buckets of work that I track:

    Foundation = Classifications, instrumentation, etc. required to enhance and upgrade your infrastructure to provide more robust capabilities. This can be recurring or one time.

    Reporting = straightforward; recurring

    Ad Hocs = random, unstructured requests for further insight that sit between reporting and analytics; usually something like “I saw the numbers in x report; can you split it out y way?” Not quite reporting, but not full-blown analytics either; one time

    Analytics = project-based, structured deep dives into data to uncover insights (sorry); one time

    Unfortunately we spend the majority of our time on the first three and I keep getting hammered to do more of the last one!

  3. Hi Tim – I totally agree with the sentiments expressed in this post; I have had the same niggle every time I read about Avinash’s actionable dashboards.

    Senior management want those summary performance reports/dashboards in their inbox as early as possible Monday morning for weekly summary meetings. They want/need a decent overview, not just 3 or 4 metrics. Commentary is great, but like you said, most weeks there would be nothing to report. The weeks something has changed, the analyst needs time to dig into the data properly.

    I have created a lot of dashboards over the years and have developed a fairly standard format (agree the web analytics vendor dashboards aren’t good enough). These are Excel files containing 3 to 5 dashboards, each one page, containing data tables and visualisations, and usually allowing segmentation via drop-downs. While it is an Excel dashboard, I see it as a very simplified UI – containing all the data a manager needs and allowing them to drill down one or two levels. It can replace the web analytics tool for these users. If they discover an issue, they call the web analyst, who uses the web analytics tool to investigate further.

    With analysis, you are trying to meet different objectives. The actionable dashboard is not a weekly performance dashboard, it is a structure to present insights & recommended actions.

  4. @Clint — It sounds like we do have a very different view of the terminology. My second post on this blog went into wordy detail as to how I viewed the difference between “reporting” and “analysis,” and my position hasn’t really changed. What I find is that the distinction really helps me identify when a data discussion is getting jumbled and combining performance measurement with hypothesis testing with vague “I want the data to tell me something insightful.”

    @Andrew — I like your breakdown (how consistently are you and your team able to do a gentle probe for ‘actionability’ or at least ‘hypothesis clarification’ when the ad hocs come in? Not trying to be difficult to work with, but nudging the requestor to be clear in their own mind what the results will really do for them.). Insights are great — they just can’t necessarily be “scheduled” (much less automated), and they’re a lot harder to get to without clarity on what needles are important to move, right?

    @Peter — thanks for adding on the defense of Excel. I’m convinced that a lot of the knocks it gets are because the people using it — even analysts — aren’t really using it well. I regularly include a dashboard with a second tab or two with pivot tables that allow some basic slicing. The recipients don’t have to understand pivot tables — just dropdowns! And, if they see something or have a question, that’s what the analysts are there for! And…that’s one reason I’m excited to have dinner with Jon Peltier tonight when he’s in town!

  5. Great post Tim – I concur. I had a discussion recently of the “We don’t want reporting, we want analysis” variety, and I essentially explained that reporting is necessary, but not sufficient. It’s impossible to analyse if you don’t understand the status quo, and often any recommendations, deeper dives, etc. come from what you spot on a report. If done well, it has a place and can help inform analyses.

  6. I am not sure if this discussion is kind of theoretical (forgive me, I am German).
    I don’t think there are any doubts that good reports are mandatory for any further analysis. Defining and creating a good, valuable, well segmented, report is nothing one “pukes out”.

    Maybe it also depends on who is in the driver’s seat deciding any further actions based on web data. While an experienced WA probably identifies a problem at a glance just by looking at a (good, valuable) report, and from that derives the analytic insight, or at least has a hint of where to dive in to find out more, others need an interpretation of the data in order to decide on any action.

    The goal is optimization, right? If a team works well with reports, fair enough. If decisions are made at higher levels or by people with fewer online skills, an analysis might be better suited to convey the core message.

  7. @Michele Thanks! I like the “necessary, but not sufficient” phrase. That hits it.

    @Matthias I would *hope* that there weren’t any doubts that good reports are mandatory for any further analysis…but I think there are a lot of misperceptions — both by analysts and by marketers — on that front. The spark for this post was Avinash’s post, in which, early on, he quoted himself: “If you see a data puke then you know you are looking at the result of web reporting.” Right there, there was an *equating* of “reporting” with “data puking,” and I take issue with that.
