Web Analytics Platforms Are Fundamentally Broken


Farris Khan, Analytics Lead at ProQuest and Chevy Volt ponderer extraordinaire, tweeted the question that we bandy about over cocktails in hotel bars the world over at any analytics gathering.

His tweet came on the heels of the latest Beyond Web Analytics podcast (Episode 48), in which hosts Rudi Shumpert and Adam Greco chatted with Jenn Kunz about “implementation tips.” Although not intended as such, the podcast was skewed heavily (95%) towards Adobe/Omniture SiteCatalyst implementations. As the dominant enterprise web analytics package these days, that meant it was chock full of useful information, but I found myself getting irritated with Omniture just from listening to the discussion.

My immediate reply to Farris’s tweet, coming fresh off listening to the podcast, reflected that irritation.

SiteCatalyst throws its “making it much harder than it should be” talent on the implementation side of things, and I say that as someone who genuinely likes the platform (I’m not a homer for any web analytics platform — I’ve been equally tickled pink and wildly frustrated with Google Analytics, SiteCatalyst, and Webtrends in different situations). I’m also not criticizing SiteCatalyst because I “just don’t understand the tool.” I no longer get confused by the distinction between eVars, sProps, and events. I’ve (appropriately) used the Products variable for something totally separate from product information. I’ve used scView for an event that has nothing to do with a shopping cart. I’ve set up SAINT classifications. I’ve developed specs for dynamically triggering effectively named custom links. I’ve never done a stint as an Adobiture implementation engineer, but I get around the tool pretty well.
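For readers who haven’t wrestled with the eVar/sProp distinction: the core difference is persistence. The sketch below is a toy model, not Adobe’s actual processing logic, and the names (eVar1, prop1, the “purchase” event) are purely illustrative; it just shows why a conversion variable and a traffic variable fed the same value can still report different numbers.

```javascript
// Toy model: eVars persist across hits within a visit, so a later success
// event gets credited to the last value set; sProps apply only to the hit
// on which they are sent. (Illustrative only -- not Adobe's backend.)
function processVisit(hits) {
  let persistedEvar = null;   // conversion variable: sticks until overwritten
  const evarCredits = [];     // events attributed to the persisted eVar
  const propHits = [];        // hit-scoped traffic variable values
  for (const hit of hits) {
    if (hit.eVar1) persistedEvar = hit.eVar1;
    if (hit.prop1) propHits.push(hit.prop1);
    if (hit.event) evarCredits.push(persistedEvar);
  }
  return { evarCredits, propHits };
}

// One visit: the campaign value is set on the first page; the success
// event fires two pages later.
const visit = processVisit([
  { eVar1: "email-campaign", prop1: "email-campaign" },
  { prop1: "landing-page" },
  { event: "purchase" },
]);
// visit.evarCredits -> ["email-campaign"]   (the eVar still gets credit)
// visit.propHits    -> ["email-campaign", "landing-page"]   (hit-scoped)
```

Pull “Visits” against each of those two variables in the reporting interface and the persistence difference alone is enough to produce different numbers from identical tagging.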

Beyond that hands-on experience, I’ve also worked with a range of clients who have SiteCatalyst deployed on their sites. As such, I’ve rolled my eyes and gnashed my teeth at the utter botched-ness of multiple clients’ implementations, and, yes, I’ve caught myself making the same types of critical statements about companies’ implementations that were rattled off during the podcast:

  • Failure to put adequate up-front planning into their SiteCatalyst implementation
  • Failure to sufficiently document the implementation
  • Failure to maintain the implementation on an ongoing basis
  • Failure to invest in the people to actually maintain the implementation and use the data (Avinash has been fretting about this issue publicly for over 5 years)

In the case of the podcast, though, I wasn’t participating in the conversation; I was simply listening to others talk. The problem was that I heard myself chiming in. I jumped right on the “it’s the client’s fault” train, nodding my head as the panel described eroded and underutilized implementations. But then a funny thing happened. As I stepped back and listened to what “I” would have been saying, I got a bit unsettled. I realized I’d been seduced by the vendor. Through my own geeky pride at having cracked the nut of their inner machinations, I’d crossed over to vendor-land and started unfairly blaming the customer for technology shortcomings:

If the overwhelming majority of companies that use a given platform use it poorly…shouldn’t we shine a critical light on the platform rather than blaming the users?

I love digital analytics. I enjoy figuring out new platforms, and it’s fun to implement something elegantly and then let the usable data come pouring in so that I can feed it into reports and use it for analysis. But:

  • I’ve been doing this for a decade — hands-on experience with a half-dozen different tools
  • It’s what I’m most interested in doing with my career — it beats out strategy development, creative concepting, campaign ideation, and any and every other possible marketing role
  • I’m a sharp and motivated guy

In short…I’m uniquely suited to the space. I’m neither the only person who is really wired to do this stuff nor even in the 90th percentile of people who fit that bill. But the number of people who are truly equipped to drive a stellar SiteCatalyst implementation is, best case, in the low thousands and, worst case, in the low hundreds. At the same time, demand for these skills is exploding. Training and evangelization are not going to close the gap! The Analysis Exchange is a fantastic concept, but that’s not going to close the gap, either.

There is simply too much breadth of knowledge and thought required to work effectively in the world of digital analytics for a tool to have a steep learning curve with undue complexity for implementation and maintenance. The Physics of the Internet means there is a relatively finite number of types of user actions that can be captured. SiteCatalyst has set up a paradigm that requires so much client-side configuration/planning/customization/maintenance/incantations/prayer that the majority of implementations are doomed to take longer than expected (much longer than promised by the sales team) and then further doomed to be inadequately maintained.

The signals that Adobe is slowly taking steps to erase the distinction between eVars and sProps are an indication that they realize there are cases where the backend architecture needlessly drives implementation complexity. But, just as the iPhone shattered the expectations we had for smartphones, and the iPad ushered in an era of tablet computing that will garner mass adoption, Adobe runs a very real risk of SiteCatalyst becoming the Blackberry of web analytics. SiteCatalyst 15, for all of the excitement Adobe has tried to gin up, is a laundry list of incremental fixes to functional shortcomings that the industry has simply complained about for years (or, in the case of the introduction of segmentation, a diluted attempt to provide “me, too” functionality based on what a competitor provides).

The vendors have to take some responsibility for simplifying things. The fact that I can pull Visits for an eVar and Visits for an sProp and get two completely different numbers (or do the same thing for instances and page views) is a shortcoming of the tool. We’ve got to get out of the mode of simply accepting that this will happen, accepting that a deep and nuanced understanding of the platform is required to understand the difference, and then gnashing our teeth when more marketers don’t have the interest and/or time to develop that deep understanding of the minutiae of the tool.

<pause>

Although I’ve focused on SiteCatalyst here, that doesn’t mean other platforms are beyond reproach:

  • Webtrends — Why do I have to employ black magic to get my analysis and report limits set such that I don’t miss data? Why do I have to employ Gestapo-like processes to prevent profile explosion (and confusion)? Why do I have to fall back on weeks-long reprocessing of the logs when someone comes up with a clever hypothesis that needs to be tested?
  • Google Analytics — Why can’t I do any sort of real pathing? Why do I start bumping up against sampled data that makes me leery…just when I’m about to get to something really cool I want to hang my hat on? Why is cross-domain and cross-subdomain tracking such a nightmare to really get to perform as I want it to?
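To make the Google Analytics grievance concrete: in the classic async (ga.js) version current at the time, cross-subdomain tracking required a stack of configuration calls before the first pageview, and cross-domain tracking additionally required rewriting every outbound link through the linker. A sketch of the setup, with a placeholder account ID and domain:

```javascript
// Classic async Google Analytics (ga.js era) cross-domain configuration.
// 'UA-XXXXX-1' and 'example.com' are placeholders; on a real page the
// ga.js library loads asynchronously and drains the _gaq command queue.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-1']);
_gaq.push(['_setDomainName', 'example.com']); // share cookies across subdomains
_gaq.push(['_setAllowLinker', true]);         // accept linker params from the sister domain
_gaq.push(['_trackPageview']);
// ...and every link over to the second domain must be sent through
// _gaq.push(['_link', url]) so the visitor's cookies transfer. Miss one
// link and the visit splits in two.
```

Get any one of those calls wrong, or in the wrong order, and the data quietly fragments, which is exactly the kind of silent implementation fragility this post is complaining about.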

My point here is that the first platform that gets a Jobs-like visionary in place who is prepared to totally destroy the current paradigm is going to have a real shot at dominating over the long haul. There are scads of upstarts in the space, but most of them are focused on excelling at one functional niche or another. Is there the possibility of a tool (or one of the current big players) really dramatically lowering the implementation/maintenance complexity bar (while also, of course, handling the proliferation of digital channels well beyond the traditional web site) so that the skills we need to develop can be the ones required to use the data rather than capture it?

Such a paradigm shift is sorely needed.

Update: Eric Peterson started a thread on Google+ spawned by this post, and the lengthy discussion that ensued is worth checking out.


13 Comments


  1. Good post, got me thinking. And thanks for listening to the podcast.
    I won’t pretend not to be a “homer” for Omniture, though I enjoy the freedom I have to be critical of things that I wish Omniture did differently. I’ve been unemployed by Omniture for longer than I was employed by them; I am under no obligation to defend the vendor or assign blame. But complaining about shortcomings of the tool does no good to the SiteCatalyst users who don’t have control over those shortcomings. They do, however, have control over documenting, educating, and USING the reports, so that’s what I tend to focus on.
    I know this isn’t what you were saying, but I’d be sad if anyone formed an opinion of SiteCatalyst based on that podcast (though if they decided to hate the Form Analysis plugin- well, no hurt feelings there;)). Unfortunately the majority of my job is fixing broken implementations (from many vendors), so I tend to focus on how things can go wrong and the ways to fix them that ARE within the client’s control. But I wouldn’t be so silly as to blame a disproportionate number of broken SiteCatalyst implementations entirely on the client. It’s a complicated tool. It’s easy to get wrong- or else I may have a little less job security. Nothing would make me happier than for my implementation-fixing skills to become unnecessary because people were no longer getting stumped by their tool- I have other skills and find analysis much more fun than troubleshooting.
    But I’ll admit it is endlessly frustrating to me to see people blame ANY tool, because if you blame someone else for your problems, you are giving away your own responsibility/ability to get things right (that’s the problem with any “blame game”). If you “get” analytics, you can thrive in any tool. If your analytics culture is broken, you WILL fail in any tool- as I’ve seen recently with even the simplest Google Analytics implementations.
    You asked, “If the overwhelming majority of companies that use a given platform use it poorly…shouldn’t we shine a critical light on the platform rather than blaming the users?”
    Because my role means most implementations that come across my desk from most any vendor are broken, I have a skewed view of which platforms perform poorly- in my view, they all do. Depressing, I know, but I have yet to see the idiot-proof analytics tool. Even more depressing, I suspect that if the idiot-proof analytics tool DID exist, there would still be a significant number who did not use the tool “right” (which is to say, using the data in a valuable way). The key issue is definitely not JUST the problem of implementation difficulties, the problem of tool complexity, or the problem of lack of valuable analysis. It’s all of the above. Let’s fix whichever parts we can.

  2. Thanks for the thoughtful comment, Jenn! I was admittedly playing the blame game a bit, which isn’t where I like to be. I hope I was, at least a little, pointing the finger at myself and at “we” (the analytics practitioners) for being enablers of some of the messiness that architectural decisions made many years ago by the different technology platforms have now wrought.

  3. Fantastic post! Even a lot of the smaller firms in the web analytics space are becoming more complex, and I don’t think their customers are embracing the changes. Their growing complexity seems intended to help them grow into the medium-sized business segment, but the move seems based on the premise that marketing departments will have easy access to project managers and engineers. I don’t think so. I like the approach of Spring Metrics for the SMB segment (www.springmetrics.com). They’ve taken a clean approach in the UI, and starting with conversions they work backwards to show you the campaigns, keywords, emails, and sources that are contributing to those conversions — “know what works, so you can do more of it.”

  4. Honestly, I don’t think that this is something that will be solved. I would love it if a vendor came out of the woodwork to do it, but I guess I’m just cynical.

    The main reason for my cynicism is the state of big data overall, which touches fields that have been around a lot longer than web analytics, and none seem to have solved the problem.

    I’ve been out of big data for a while, but I remember SAP, Cognos, and Oracle being even more difficult to implement than Webtrends, Omniture, or Google Analytics. They often require highly trained full-time staff to keep running, and a team of analysts to make the data useful. Heck, they often need to use specialized analysis tools that are different from the data system (e.g. SAS).

    I would love to be pleasantly surprised and hear that the big data issues have been solved, but I doubt it.

  5. Thank you Tim for the insightful post (and the mention).

    Here is my take, from a client, turned vendor (ForeSee Results), then back to a client (ProQuest) perspective.

    1. On the vendor side, I believe there is the following mindset: “Measure more variables because we just figured out how to do so, then make cool visualizations, dashboards, 3D fly-throughs, spinning pie charts :) etc.”

    This works great in helping vendor sales teams sell the sizzle, but the steak is still the same.

    2. Vendors naturally feel that they are the center of the universe. Is it better to integrate ForeSee Results data (disclosure: my previous company) into the Omniture environment? Or is it better to integrate Omniture data into the ForeSee Results environment?

    Depends on which vendor you talk to first. Regardless, you are not going to get a “seamlessly integrated holy grail”… and the business reality is that you need more than one vendor if you truly want to have a credible measurement ecosystem.

    3. On the client side, we do not think enough about the business questions our team needs to answer. We gravitate towards what vendors give us and try to amplify the value of those variables, even if they are not really that actionable. Why? Because it is hard to admit that we do not have the answer to a particular business question in the data, and of course we are constrained by budget, time, bandwidth, etc.

    I like your “Jobs-like Visionary” concept. Apple made itself the hub. It simplified its offerings. It focused. It changed its business model completely.

    3 of the top 4 clickstream analytics tools (Adobe/Omniture, IBM/Coremetrics, and Google/GA) are managed by parent organizations for which #measure is not the main business. (Webtrends is still independent.)

    Is it realistic to think that companies structured in this way can pull a “Steve Jobs” and revolutionize this industry?

  6. Pingback: Data Rich, Optimization Poor

  7. Pingback: Twitted by predictabuy

  8. Pingback: Reflections from the Google Analytics Partner Summit | Gilligan on Data by Tim Wilson


  10. Awesome post Tim. Realize I’m almost 2 years too late to the debate, so not sure if everyone’s too tired now to continue to explore the themes you raised, but they’re as relevant now as they were when you penned this post…

    My own view is that web analytics will always be hard, but architectural decisions taken by the SiteCatalyst tech team mean Sitecat makes it harder than it needs to be (especially on the implementation side). My thoughts in full (including specifics on what the Sitecat guys have messed up): http://snowplowanalytics.com/blog/2013/06/28/is-web-analytics-easy-or-hard-distinguishing-different-types-of-complexity/

  11. Pingback Data Viz News [13] | Visual Loop
