Gary Angel vs.(?) Adam Greco. Love the Debate…not the Tone
By Tim Wilson in Web Analytics
If you are in the habit of reading blogs by digital analytics guys who really know what they’re talking about, then you already read Adam Greco‘s (of Web Analytics Demystified) and Gary Angel‘s (of Semphonic) blogs. The two of them have had an interesting debate over the past few weeks. A quick recap of the sequence:
- June 27, 2011 — Adam wrote a post that listed 10 of his pet peeves with regard to Omniture Sitecatalyst implementations
- July 7, 2011 — Gary posted a response taking issue with some of the pet peeves and, in the process, put forth a theory of the Analyst vs. Implementer in Sitecatalyst; as it turns out, the theory really wasn’t Sitecatalyst-specific, although the post (and the subsequent back and forth) got pretty deep into the details of that specific tool
- July 12, 2011 — Adam responded, both taking issue with the distinction Gary made and taking issue with most of Gary’s Sitecatalyst-specific counterpoints (and, in some cases, simply clarifying his original points, as the two were somewhat in agreement)
- July 17, 2011 — Gary responded to Adam’s response, reiterating his original Analyst vs. Implementer distinction, as well as responding to some of the original pet peeves (again, acknowledging when he felt valid points had been made by Adam)
While it has been a bit cringe-inducing to watch these posts teeter on the brink of out-and-out hostility (the rancor-that-shall-not-be-named between their respective companies not helping things in the least, I’m sure, and, yes, that was a Harry Potter reference!)…the underlying content is fascinating and useful.
Let’s Be Clear: Gary and Adam Know Their Stuff
Any company that has any digital analytics work it needs done — implementation, reporting, analysis, analytics strategy, analytics roadmap development — using any digital or social analytics tool, in any industry, should consider themselves lucky to have either Adam or Gary on the job. They are both sharp, deep, creative thinkers who have an obsessive focus on delivering work that creates business value. They both know the technical side of analytics, they both know marketing inside and out, they both know how to tackle really messy analyses in a disciplined manner, and they both know how to communicate effectively.
That doesn’t mean they’re interchangeable, that they would approach every analytics problem in the same way, or even that they would deliver similar results when presented with the same issue. No two analysts would, but there are scads of analysts who would deliver data that didn’t drive action, and I’d bet good money on either one of these guys to deliver eye-popping goodness.
Are we clear? These guys both really know their stuff.
So…Analyst vs. Implementer?
I like a lot of aspects of Gary’s thinking on the “Analyst vs. Implementer” front. I don’t like those labels being applied as the defining trait of anyone in our field. And, I’m not keen on the labeling of one “school” as superior to the other:
…the Analyst school is ultimately focused on the most important set of problems,…
We all have strengths and blind spots.
The problem with being overly “Implementer”-oriented is the pursuit of the most clever, pristine, elegant, and efficient implementation of tags (be that in Sitecatalyst, Coremetrics, Google Analytics, or Webtrends…though Sitecatalyst certainly sets itself up for some extreme cases here, given the complexity of its site-side implementation). This pursuit can, at times, come at the cost of maintainability, the robustness of the implementation, and the long-term ease of use of the Sitecatalyst interface (…and making such a claim will lead to the “Implementer” donning a highly exasperated expression and launching into an excruciatingly detailed explanation of why everything he/she has proposed is designed precisely to minimize these problems; the caveat being that the site owners and the IT staff supporting them “just” need to have a thorough understanding of the underlying mechanics of the tool).
The knee-jerk knock against the “Analyst” is, “Well, sure, the analyst just expects the data to show up in a usable fashion without really understanding the underlying ins and outs of the tool.” Now, that wasn’t what Gary was saying at all, but the downside of detailed blog posts in a Twitter world is that the sound bite of “Analyst vs. Implementer” leads to oversimplification.
In fairness to Adam’s irked-ness, Gary clearly puts himself (and Semphonic as a whole) in the Analyst role, while he puts Adam in the Implementer role. And…his description paints the Analyst in a superior light. Clearly, his point isn’t that analysts don’t need to have detailed knowledge of the tools they’re using and the ins and outs of Internet physics. The fact that Gary can go toe-to-toe with Adam on his pet peeves illustrates that, I think. He’s absolutely not saying that an implementation can be sloppy and haphazard. At all!
The Distinction — Can We Make It a Situational Lens Rather than a Person Label?
What I don’t agree with is Gary’s take that someone in our field is either an Analyst or an Implementer:
That doesn’t seem right, as it postulates that a person is one or the other, with no gray area in between. So, what if, instead, we considered this as a spectrum, from being focused solely on the analysis, with no consideration for a clean, maintainable, sustainable implementation (Analyst), to being focused solely on the data capture and on making the tool configuration as elegant as possible (Implementer):
So, with that view, what is the ideal? A perfect balance of both (the happy medium)?
I don’t think so. That reeks of unnecessary compromise. Instead, the ideal is “all:”
But…this doesn’t seem quite right, either, as it now simply says we’re looking for Superman. But, what if, instead of making these a person label, we used them more as a situational lens (note the label change at the ends of the spectrum):
In the end, this is the value I got out of the “Analyst vs. Implementer” portion of the exchange. I would not label Adam an “Implementer,” by any means, but I do recognize that one of his irrefutable strengths is in getting Sitecatalyst to do as much as it can possibly be expected to do in the most effective and elegant way possible. I wouldn’t label Gary an “Analys… doh! I would label Gary an Analyst. I’d label Adam an Analyst, too!
But, reflecting on a range of challenging client implementations and analyses, I certainly see the value in considering each implementation decision from both angles that Gary described.
Or, of course, it may be that I simply hate to see rancor in our field and I’m just splitting the baby. 🙂
A Final Note: the Pet Peeves — Where This All Started
Gary noted in his last post:
…it was probably unfair on my part to choose one of [Adam’s] posts to illustrate my [Analyst vs. Implementer] thesis. Truly, my apologies!
On the other hand, I think “Pet Peeves” is far from his finest work – and was much more representative of the “school” than his frequent ability to transcend it. Not a single one of his pet-peeves would have made my list had I tackled a comparable topic. Not one.
I, for one, would love to see Gary’s list. I suspect it will be more challenging for him to come up with a succinct set of Sitecatalyst implementation whiffs. More likely (and this is super-dangerous territory for me to try to tread in, as it’s Wild Speculation of the Highest Degree), it will be more along the lines of, “Failure to think through the two tiers of segmentation of visitors to the site.” No doubt — powerful and useful stuff…but a lot harder to put on a checklist to watch out for.
I’m not equipped to weigh in on the back and forth between Adam and Gary on the specifics as to who is “right” on each of the Sitecatalyst implementation practices. Here’s what happened:
- I read Adam’s initial post and thought, “Hey, that’s a good list of things to keep an eye out for.”
- I then read Gary’s response to that list and thought, “Interesting. Good points. That adds some nuance to consider relative to Adam’s list.”
- After reading Adam’s response to Gary’s response: “Okay…agreement that there is some corner-case functionality in Sitecatalyst that seems like it’s rarely used, but I’ve now got a few more nuggets to be aware of and dig into deeper when I run into cases where they might be applicable.”
- And…my reaction to Gary’s response to Adam’s response to Gary’s initial response: “Wow. These guys really know their stuff. I bet, in any given specific situation, they’d have a healthy debate and ultimately agree on the best approach around each technical point…and they likely would have come up with pretty similar implementations even working wholly independently.”
Basically: any time a generic list gets made, it’s going to open itself up to criticism based on exceptions. It’s not as simple as an “exception that proves the rule,” unfortunately. Rather, it highlights that implementing Sitecatalyst requires some serious thought. Having worked with several Adobiture-generated implementation documents recently, I actually think they do a much better job now than they once did when it came to factoring in the specific site and business needs (and, I suspect, Adam is largely to credit for putting wheels in motion that continued to turn in a positive direction after his departure from the company).
Let’s take pet peeve no. 1 — tracking every eVar as an sProp. It seems safe to say that simply blindly doing this (as one commenter indicated he was told to do by an Adobiture implementation specialist) is a mistake. Likewise, it seems safe to duplicate a value in an eVar and in an sProp in cases where the value is a critical “slicer” of the data…but recognize that there is a risk that someone will drop Visits on the eVar and someone else will drop Visits on the sProp, and you may have to deal with Mark Twain’s “A man with one watch always knows what time it is, while a man with two is never quite sure” conundrum. It’s nuanced. There are no absolutes. There is value in having deep knowledge of the tool, but there is also value in doing a thoughtful assessment of the business environment in which reporting and analyses will be conducted.
I’ve enjoyed digesting the exchange and thinking the multitude of points through!