The Analyst Skills Gap: It’s NOT Lack of Statistics and Econometrics Training


I wrote the draft of this post back in August, but I never published it. With the upcoming #ACCELERATE event in San Francisco, and with what I hope is a Super Accelerate presentation by Michael Healy that will cover this topic (see his most recent blog post), it seemed like a good time to dust off the content and publish this. If it gives Michael fodder for a stronger takedown in his presentation, all the better! I’m looking forward to having my perspective challenged (and changed)!

A recent Wall Street Journal article titled Business Schools Plan Leap Into Data covered the recognition by business schools that they are sending their students out into the world ill-equipped to handle the data side of their roles:

Data analytics was once considered the purview of math, science and information-technology specialists. Now barraged with data from the Web and other sources, companies want employees who can both sift through the information and help solve business problems or strategize.

That article spawned a somewhat cranky line of thought. It’s been a standard part of presentations and training I’ve given for years that there is a gap in our business schools when it comes to teaching students how to actually use data. And, the article includes a quote from an administrator at the Fordham business school: “Historically, students go into marketing because they ‘don’t do numbers.'” That’s an accurate observation. But, what is “doing numbers?” In the world of digital analytics, it’s a broad swath of activities:

  • Consulting on the establishment of clear objectives and success measures (…and then developing appropriate dashboards and reports)
  • Providing regular performance measurement (okay, this should be fully automated through integrated dashboards…but that’s easier said than done)
  • Testing hypotheses that drive decisions and action using a range of analysis techniques
  • Building predictive models to enable testing of different potential courses of action to maximize business results
  • Managing on-going testing and optimization of campaigns and channels to maximize business results
  • Selecting/implementing/maintaining/governing data collection platforms and processes (web analytics, social analytics, customer data, etc.)
  • Assisting with the interpretation/explanation of “the data” — supporting well-intended marketers who have found “something interesting” that needs to be vetted

This list is neither comprehensive nor a set of discrete, non-overlapping activities. But, hopefully, it illustrates the point:

The “practice of data analytics” is an almost impossibly broad topic to be covered in a single college course.

Two things about the WSJ article bothered me:

  • The total conflation of “statistics” with “understanding the numbers”
  • The lack of any recognition of how important it is to actually be planning the collection of the data — it doesn’t just automatically show up in a data warehouse

On the first issue, there is something of an ongoing discussion as to what extent statistics and predictive modeling should be a core capability and a constantly applied tool in the analyst’s toolset. Michael Healy made a pretty compelling case on this front in a blog post earlier this year, arguing for statistics, econometrics, and linear algebra as must-have skills for the web analyst. As he put it:

If the most advanced procedure you are regularly using is the CORREL function in Excel, that isn’t enough.

I’ve…never used the CORREL function in Excel. It’s certainly possible that I’m a total, non-value-add reporting squirrel. Obviously, I’m not going to recognize myself as such if that’s the case. I’ve worked with (and had working for me) various analysts who have heavy statistics and modeling skills. And, I relied on those analysts when conditions warranted. Generally, this was when we were sifting through a slew of customer data — profile and behavioral — and looking for patterns that would inform the business. But this work accounted for a very small percentage of all of the work that analysts did.
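For reference, Excel’s CORREL computes the Pearson correlation coefficient. Here is a minimal pure-Python sketch of the same calculation; the spend/orders figures are made up purely for illustration:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient -- the same thing Excel's CORREL returns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example: weekly ad spend (thousands) vs. weekly orders
spend = [10, 12, 15, 18, 20, 25]
orders = [110, 118, 130, 141, 155, 170]
print(round(pearson_r(spend, orders), 3))
```

The point is that the computation itself is trivial; the analyst’s judgment about whether the relationship means anything is where the work actually is.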

I’m a performance measurement guy because, time and again, I come across companies and brands that are falling down on that front. They wait until after a new campaign has launched to start thinking about measurement. They expect someone to deliver an ROI formula after the fact that will demonstrate the value they delivered. They don’t have processes in place to monitor the right measures to trigger alarms if their efforts aren’t delivering the intended results.

Without the basics of performance measurement — clear objectives, KPIs, and regular reporting — there cannot be effective testing and optimization. In my experience, companies that have a well-functioning, ongoing testing and optimization program in place are the exception rather than the rule. And companies that lack those fundamentals but try to jump directly to testing and optimization find themselves bogged down when they realize they’re not entirely clear what it is they’re optimizing toward.

Diving into statistics, econometrics, and predictive modeling in the absence of the fundamentals is a dangerous place to be. I get it: part of performance measurement and basic analysis is understanding that just because a number went “up” doesn’t mean the change was anything more than noise in the system. Understanding that correlation is not causation is important, and it’s an easy concept to overlook, but it doesn’t require a deep knowledge of statistics to sound an appropriately cautionary note on that front. Nine times out of ten, it simply requires critical thinking.
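And the “is this just noise?” check itself doesn’t take much machinery. As a sketch with entirely hypothetical conversion counts, a simple two-proportion z-test is often enough to tell whether a number that went “up” actually cleared the noise floor:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: conversion "went up" from 2.0% to 2.2%
z = two_proportion_z(200, 10000, 220, 10000)
# |z| below 1.96 means the uptick is not significant at the 95% level
print(round(z, 2))
```

With these made-up numbers the uptick looks real on a dashboard but doesn’t reach significance, which is exactly the kind of cautionary note that critical thinking, plus one formula, can sound.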

None of this is to say that these advanced skills aren’t important. They absolutely have their place. And the demand for people with these skills will continue to grow. But, implying that this is the sort of skill that business schools need to be imparting to their students is misguided. Marketers are failing to add value at a much more basic level, and that’s where business schools need to start.


  1. Hi Tim,

    I can’t agree more with this. I am another web analyst who would consider themselves an advanced user of Excel but I don’t think I have ever used CORREL either. Statistics usage is just not required in most of the web analysis that I perform.

    I can see the benefits of statistics but primarily for answering a specific set of business questions. Most questions I encounter (which part of my site do I need to focus on for optimisation to achieve the biggest uplift in performance?) do not require statistics to answer. I have also encountered various people over the years who have applied statistics to data to arrive at results which can be discarded as soon as any commercial or common sense tests are applied.

There is a place for statistics and statistical approaches in web analytics, but it is not an essential skill for web analysts (the same goes for SQL).



  2. Another non-CORREL user here (though I may have used it a few times, years ago).

    I agree strongly with your statement that the gap is in understanding the fundamentals, not in using whiz-bang stuff that a computer program can kick out for you. Understanding pretty simple ideas, like what normal variation is versus what is genuinely special, is critical to managing, yet it is not understood by most managers.

  3. Is this post about what B-school grads should have learned, or is it about what analysts should know? Sort of different questions, I think. If it is the former, then I mostly agree with you. If the latter, then I’m not sure.
    I think if you are going to play an analytic role in companies that have data at or close to their core (bitly, foursquare, etc.), then you probably need to have some serious data/analytical tech skills – you would probably also be calling yourself a data scientist. What I think MH is trying to get at is: can you, as a web analyst, score yourself a data science job?

  4. Pingback: Invest in Analytics Resources – Digital Transparency – powered by Adversitement | Digital Transparency

  5. Good discussion here. I think an interesting question is how the skills a ‘web analyst’ needs compare with those of an ‘analyst’ who works with call center, retail, or database marketing data. Having done both, my experience has been that for the latter, if you don’t know some SQL and some stats you don’t get hired, because you will use it.

    It is interesting that there is a skill gap between these two fields that are so closely related. I know this is broadly generalized but it is a pattern I have seen often.

    By the way, CORREL is often wrong and should really not be used without supporting statistics to test for significance. My $0.02.
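    One standard companion test, sketched here with hypothetical numbers: the t-statistic for a Pearson r, compared against the two-sided 5% critical value for n - 2 degrees of freedom:

```python
from math import sqrt

def correl_t_stat(r, n):
    """t-statistic testing whether a Pearson r differs from zero, given n pairs."""
    return r * sqrt((n - 2) / (1 - r * r))

# Hypothetical: r = 0.60 looks impressive, but with only n = 8 data points
# t is about 1.84, below the ~2.45 two-sided 5% critical value for
# 6 degrees of freedom -- not statistically distinguishable from zero.
print(round(correl_t_stat(0.60, 8), 2))
```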



  6. Peter and John — great! I’m glad it’s not just me. :-)

    Matt — I think it’s both: “what B-school grads should be *taught*” (not necessarily that they’ll have the aptitude or experience to fully *learn* it — but they need to at least be schooled in a solid approach/framework so they don’t head out into the world with misguided expectations) *and* “what analysts should know.” But, you make an excellent point that there is the “data scientist” role which, in my mind, is an analytics specialization with a deeper knowledge of statistics and modeling. I could be talked out of thinking it’s a specialization — maybe it’s a separate role entirely. Personally, I don’t know that I’m interested in ever scoring a data scientist job…but I think I’m a lifer in the field of analytics (although I feel like the word “analytics” is putting on slight airs).

    Jason — I agree that there is almost always a need for a base-level understanding of “data”: how the data I’m working with is captured (I call this “the physics of the internet” when it comes to digital data, but CRM data, call center data, etc. all carry the same underlying need to understand the processes that generate and capture them) and how data can be joined (relationally through a common key or otherwise).

  7. I will always agree with posts like this.

    I don’t have a statistics background, though I had plenty of stats and math classes back in my day.

    As far as I’m concerned, the numbers help prove or disprove hunches I had all along. I work hand in hand with business and strategy and I don’t rely on the numbers to give me an answer, I rely on them to back up an answer I already had (and no, I don’t go looking for only numbers to support me).

    And to your point earlier about schools not better preparing kids: who cares? Give them some basics and let them earn an entry-level job, learning as they go. Classes aren’t going to give anyone the kind of inquisitive, curious, problem-solving personality that makes you good at this job.

  8. Tim:

    Cool post, and Matt G. certainly knows my basic POV well.

    I highly recommend Thornton May’s ‘The New Know,’ in which he outlines three levels of problems:

    – New problems
    – Problems of a kind solved before
    – Reasonably routine problems

    My vision of the future can be summarized in an anecdote I heard from Bill Gassman at Gartner while attending XChange:

    At a certain amusement park (not sure I can share the name), there was a semi-regular problem:

    – On days it rained the attendance of the park would decline, and the distribution of attendees would shift from lines for the best rides to lines at the best places to eat.

    This was a labor-intensive process, as a small legion of middle managers would descend on the park to point: pointing the employees to their new locations.

    Until an analyst figured out that this was a Gamma-level problem, one with a repeatable solution.

    They created a chart, which hangs in the employee area, with arrows telling people where to go if it is sunny or raining. Think of all the money saved from not keeping all those pointers around for the few rainy days.

    While not everyone has to know advanced econometrics, or be able to debate intelligently the advantages of a NoSQL solution over a SQL solution, knowing whether or not you are a pointer seems like a good idea.

    Michael D. Healy

  9. Hi, all. This does work quite well, with the caveat that all you get for external (and other otherwise missing) link data is ‘when’ and ‘how many’, in contrast to all the data you get with an embedded GA script. But this is plenty for us. We found more accurate counts when the gatag.js event is changed from ‘onclick’ to ‘onmousedown’. John Kwasnik, for the DGS web team
