Big Data without Digital Insight Management Is a Big Hot Mess


One of the many exciting aspects of joining a new company is the opportunity for reflection. The lead-up to the job change forced some introspection: what did I most enjoy about my profession, and what would a dream job look like that let me spend as much of each day as possible doing that? And, at a new company, everyone has had to put their heads together to build out the processes needed to bring the vision for the company to life, which has required a different flavor of reflection: reflecting on what has and has not worked in our collective experience when it comes to enabling brands to be as data-informed as possible in their daily processes.

Shortly after joining Clearhead, I attended eMetrics in Boston. The conference, as always, was a great time. And, as often is the case, one of the conversations that stuck with me the most occurred where I didn’t expect it — in the exhibit hall during the sessions with a vendor I’d never heard of before the conference: Sweetspot Intelligence. Sergio Maldonado (@sergiomaldo) explained the vision for Sweetspot, gave me a brief product tour, and handed me a copy of the paper they sponsored Eric Peterson to write: Digital Insight Management: Ten Tips to Better Leverage Your Existing Investment in Digital Analytics and Optimization. The concept of “Digital Insight Management” is intriguing. And, luckily, it’s much more than an abstract idea — it’s real and, I believe, something that all analysts should be striving to implement.

Let’s Start with the Basics — Demystified’s Hierarchy of Analytical Needs

Early in the paper, Eric included Web Analytics Demystified‘s Hierarchy of Analytical Needs:

Experienced analysts look at this diagram and think, “Well…yeah. That’s a good depiction of the battle we fight every day.” If the response is ho-hum, it’s only because we’ve been fighting the battle to move “up the pyramid” for a while, and we often feel undermined by the business environment in which we work. Still, this is one of the more succinct and elegant depictions I’ve seen (not just the labels on the left, but also the assessments in the boxes on the right).

One Step Back Adds Another Element

When viewed through the lens of “what an analyst can do,” the hierarchy is complete. In some respects, the analyst can only lead the proverbial horse to water (clearly communicate a data-informed recommendation). The analyst can’t necessarily make the horse drink (take action). But, still, it’s worth recognizing that, if we take just one step back from this pyramid, we want to see one more level on the hierarchy:

Again, this is somewhat obvious. Yet it’s where “we” (businesses) seem to stumble so often. There is so much data now that marketers are conditioned to prepend any mention of the word “data” with the word “big!” As analysts, few of our reports rely on data from a single source, and we work hard to place the data into meaningful context. But, of course, the further up the pyramid we go, the easier it is to get derailed. Ultimately…limited action.

Pivoting the Process

While the hierarchies above are unequivocally true, the actual process for meaningful analytics — analysis that drives relevant action — actually looks quite different:

Let’s break this down a bit:

  • Everything hinges on having clear objectives and measures of success — it’s scary how often marketers stumble on this, and, as analysts, it behooves us to be skilled in helping marketers get these nailed down (these are soft skills!)
  • Performance measurement is key…but it’s not the source of insights — performance measurement is the alerting system; it tracks the KPIs against targets, as well as some supporting and contextual metrics. But, the reports and dashboards themselves don’t yield insights — they surface problems that then need to be further explored.
  • All analysis starts with a business problem, business question, or business idea — the left-hand column is where the magic happens (or, all too often, doesn’t!).
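To make the second bullet concrete, here is a minimal sketch of “performance measurement as an alerting system”: compare each KPI’s actual value against its target and flag the misses for further analysis. The KPI names, targets, and tolerance are all hypothetical illustrations, not anything from the paper or product.

```python
# Hypothetical KPI targets; a real implementation would pull these
# from wherever objectives and measures of success are documented.
KPI_TARGETS = {
    "conversion_rate": 0.025,   # target: 2.5%
    "revenue_per_visit": 1.80,  # target: $1.80
    "bounce_rate": 0.45,        # target: at most 45% (lower is better)
}

LOWER_IS_BETTER = {"bounce_rate"}


def flag_kpis(actuals, targets=KPI_TARGETS, tolerance=0.05):
    """Return the KPIs that miss their target by more than `tolerance`
    (expressed as a fraction of the target). These flags are problems
    to explore further -- not insights in themselves."""
    flagged = []
    for name, target in targets.items():
        actual = actuals.get(name)
        if actual is None:
            continue  # no data for this KPI in this reporting period
        if name in LOWER_IS_BETTER:
            miss = (actual - target) / target   # positive = worse
        else:
            miss = (target - actual) / target   # positive = worse
        if miss > tolerance:
            flagged.append(name)
    return flagged


print(flag_kpis({"conversion_rate": 0.019,
                 "revenue_per_visit": 1.85,
                 "bounce_rate": 0.52}))
# → ['conversion_rate', 'bounce_rate']
```

The dashboard’s job ends at the `flagged` list; the “why” behind each flagged KPI is what kicks off the analysis in the left-hand column.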

It is impossible to attend any analytics-oriented conference these days without being hit over the head with how critical it is to develop and foster strong relationships with your business partners: regularly communicate, listen for the problems they’re having that your analytical skills can help with, learn how to communicate effectively, etc. That is a recurring theme because actually teasing out the right business questions and problems can be tricky!

Conversely, the back end of the process can be tricky, too. We’ve all had cases where we completed the right analysis and got actionable results…but action never occurred. As I understand it, that is where Sweetspot comes in: technology that supports communication and workflow related to getting actionable information to the people who can take action:
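A toy sketch of that workflow idea (not Sweetspot’s actual product, whose internals I can only guess at): treat each insight as a record with an owner and a status, and route open items to the person who can act on them, so a recommendation becomes an assigned task rather than a slide in a deck. All names here are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class Insight:
    """One actionable finding, with the person responsible for acting on it."""
    finding: str
    recommendation: str
    owner: str
    status: str = "open"
    notes: list = field(default_factory=list)


def route(insights, directory):
    """Group open insights by owner (skipping owners not in the directory),
    so each stakeholder sees only the actions waiting on them."""
    queues = {}
    for ins in insights:
        if ins.status == "open" and ins.owner in directory:
            queues.setdefault(ins.owner, []).append(ins)
    return queues


# Hypothetical stakeholders and findings for illustration.
directory = {"maria": "maria@example.com", "dev": "dev@example.com"}
insights = [
    Insight("Checkout abandonment up 12% week over week",
            "Test a simplified payment step", owner="maria"),
    Insight("Site-search exits spiked after the last release",
            "Review the new results ranking", owner="dev"),
]
queues = route(insights, directory)
```

The point of the sketch is the design choice, not the code: once insights are records with owners and statuses, “did action occur?” becomes a queryable question instead of a vague worry.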

So…Will Tag Management Solve This?

(Blog authors get to crack themselves up with their headings…)

What Eric’s paper, and Sweetspot’s product, got me thinking about are a couple of gaps that, hopefully, I’ve covered in this post:

  • As analysts, we need to develop, implement, and own workable processes within our companies to make analytics truly gain and sustain traction
  • There is an opportunity for better technology to support these processes…and that is analytics technology that has nothing to do with the mechanics of capturing customer data

Is “Digital Insight Management” the next big thing? I think it is. Big Data is just a big hot mess without it.

Comments:

  1. MANY thanks for the dedicated post and thorough analysis, Tim! You did pay good attention 🙂 I am open to answering whichever questions come up about Sweetspot or the whole DIM concept…

  2. Nice post Tim. I really believe this point “As analysts, we need to develop, implement, and own workable processes within our companies to make analytics truly gain and sustain traction.” is the crux of it all, and where most businesses (and analysts) struggle. There is a lot of education to be done – at the organisation I work for, even pinning down the purpose of the website’s existence has been elusive (in the 9 months I’ve been here) – everyone seems to have their own justification for its existence, and the website KPIs tend to be completely disconnected from the organisational objectives and goals – seems they were implemented less around “what should we measure and why?” and more around “what can we measure?”. Difficult then to provide recommendations on improvement when the C-suite can’t even agree upon what the focus for improvement should be. But hey, if it was easy, everyone would do it!

  3. Thanks, Adrian. I totally agree that it’s all too easy for organizations to start focusing on “the metrics that are readily available” rather than “the metrics that matter.” I would argue that there are many more sites that *don’t* have “obvious” KPIs than there are ones that do. It’s definitely a challenge to find ways to focus the organization on what the purpose of those sites are. That makes it all the more gratifying when you have a case where the “analyst trying to nail down KPIs” actually leads to stronger organizational alignment around the goals of a digital initiative. Best of luck to you!

  4. Pingback PEOPLE are a Big Part of Conferences (Including #AdobeSummit) | Gilligan on Data by Tim Wilson

  5. Pingback Gilligan’s Unified Theory of Analytics (Requests) | Gilligan on Data by Tim Wilson

  6. Pingback » PEOPLE are a Big Part of Conferences (Incl. #AdobeSummit) | Tim Wilson's Blog at Web Analytics Demystified

  7. Pingback KPI-powered collaboration and the future of Productivity
