Social Media Measurement: A Practitioner’s Practical Guide


Connie Bensen has a Social Media Measurement post that is worth a read. While the post is focused on measuring social media specifically, she hits on some areas that, all too often, are overlooked when it comes to developing metrics and then reporting on them over time.

The post includes a lot of resources for measuring social media — going well beyond simply web analytics data — as well as a list of examples of things that can be measured. What really struck me, though, was the list at the end of the post of what a community manager’s monthly report should include. First, the fact that it is a monthly report is somewhat refreshing — real-time on-demand reports are way overrated, and really are not practical when it comes to providing the sort of context that Connie describes.

On to Connie’s list of report elements — the bold text is from her list, and the non-bold description is my own take on the item:

  • Ongoing definition of objectives — the framework of any recurring report should be the objectives that it is attempting to measure, so I love that this is the first bullet on the list. I would qualify it just a bit — it does not seem right to be making the defining of objectives an ongoing exercise; rather, objectives should be established, reiterated on an ongoing basis (so that everyone remembers why we’re tackling this initiative in the first place), and revisited periodically (objectives can and should change).
  • Web analytics — this is the “easy” data to provide on a recurring basis, it’s data that most people are getting comfortable with, and, even though there is a lot of noise in the data, it is still reasonably objective; the key here is to focus on the web analytics data that actually matters, rather than including everything.
  • Interaction – Trends in members, topics, discovery of new communities — this is a somewhat community-specific component, but it’s a good one; the “discovery of new communities” actually implies an objective regarding the role of a community manager; what a great metric, though, to drive behavior within the role.
  • Qualitative Quotes – helpful for feedback & marketing — to broaden this list beyond reporting for social media, let’s change “Quotes” to “Data”; make the report real by providing tangible, but qualitative, examples of what is going well (or not). Reporting on lead generation activity, for instance, can include selected comments made by attendees at a webinar — highlighting what resonated with the audience (and what did not).
  • Recommendations – Based on interactions with the customers — recommendations, recommendations, recommendations! What is the point of pulling all of this information together if nothing gets done with it? I sometimes like to include recommendations at the beginning of a report — they’re a great way to engage the report consumer by making statements about a course of action right up front.
  • Benchmark based on previous report — my preference is to use stated targets (where it makes sense) as the benchmark, rather than simply looking for the delta of the data over a prior reporting period. But, sometimes, that is simply not feasible. Including “here’s the measurement…and here’s the direction it is heading” is definitely a good thing. But, it’s also important to not look at a 2-month span and jump to “we have a trend!”
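The benchmarking preference above — compare against a stated target where one exists, fall back to the prior-period delta otherwise, and resist calling a short span a trend — can be sketched in a few lines. This is a minimal illustration; the function name, parameters, and numbers are hypothetical, not anything from Connie’s post:

```python
# A sketch of target-first benchmarking for a recurring report.
# All names and figures here are hypothetical, for illustration only.

def benchmark(current, target=None, previous=None, history_len=1, min_periods=3):
    """Describe a metric against a stated target if one exists;
    otherwise fall back to the delta versus the prior period,
    flagging spans too short to call a trend."""
    if target is not None:
        return f"{current} vs. target {target} ({current - target:+})"
    if previous is not None:
        caveat = "" if history_len >= min_periods else " (too few periods to call a trend)"
        return f"{current} vs. prior period {previous} ({current - previous:+}){caveat}"
    return f"{current} (no benchmark available)"

print(benchmark(1200, target=1000))                   # measured against a stated target
print(benchmark(1200, previous=1100, history_len=2))  # fallback delta, with the trend caveat
```

The point of the `min_periods` guard is exactly the caution in the last bullet: two data points show a direction, not a trend.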

Having recently relaunched the Bulldog Solutions blog, I’ve got a good opportunity to put Connie’s post into practice. Oh, dear…that’s going to require re-opening the “What are our objectives for this thing…clearly stated, please?!” discussion. Stay tuned…

One Comment

  1. Thanks for your additions, Tim! They are always much appreciated.

    I agree that the results need to be redirected into new action items. That’s how projects work – I’m continually creating, evaluating & adjusting.

    I need to check out your new look on the Bulldog Solutions blog!
