A Simple Process for Establishing Corporate Metrics


Boy, are you ever lucky to be reading this post! I’m going to lay out a very simple framework for developing metrics that are actionable from the highest levels of the organization, all the way down to the individual line workers. Why, with this framework and 15-20 minutes of thought, you’ll be ready to purchase a BI tool and give everyone in the organization a dashboard that they can reference every morning to help set their activities for the day!

It’s really quite simple. First, you have to realize that, the higher up in the organization the dashboard user is, the more inherently strategic he is. Conversely, the farther down the org chart a person is, the more inherently operational he is. Tactics are the bridge between the strategic and the operational, and it’s all one fat, happy continuum that can be neatly placed on top of both your company’s org chart as well as your metrics framework.

The process for developing metrics is pretty simple:

  1. Figure out what the C-level execs and the board have decided as the strategy for the company and pick the metrics to measure them. It could be top line revenue. It could be bottom line profit. It could be growth. It could be a combination. Just ask ’em…and then measure.
  2. Drill down from those metrics to what each VP is responsible for with regards to driving those metrics. Manufacturing has to keep costs down and quality up. Sales has to bring in the business. You get the idea.
  3. In each of the VP’s areas, drill down further. Sales, for instance, may simply get decomposed into geographic territories.
  4. Keep on drilling down until you are at the individual contributor level. You’ve now got nice metrics for that person that can be traced all the way up to the CEO!!! Isn’t that wonderful?! It’s complete alignment of the whole company at all levels!
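Taken at face value, the drill-down above is nothing more than a tree in which every parent metric is simply the sum of its children. A minimal sketch of that "complete alignment" pyramid follows; all of the metric names and numbers are invented purely for illustration:

```python
# A minimal sketch of the metrics pyramid: each metric is a node in a tree,
# and every parent's value is just the roll-up of its children's values.
# All names and figures below are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float = 0.0                # leaf-level (individual contributor) value
    children: list["Metric"] = field(default_factory=list)

    def rollup(self) -> float:
        """Roll leaf values all the way up the org chart, CEO-style."""
        if not self.children:
            return self.value
        return sum(child.rollup() for child in self.children)

# CEO metric -> VP metric -> territory metrics
revenue = Metric("Top-line revenue", children=[
    Metric("Sales - East", children=[
        Metric("Rep A bookings", 1_200_000),
        Metric("Rep B bookings", 900_000),
    ]),
    Metric("Sales - West", children=[
        Metric("Rep C bookings", 1_500_000),
    ]),
])

print(revenue.rollup())  # 3600000
```

In this tidy little world, the CEO "drills down" by walking from the root toward whichever leaf looks sickly. Keep that picture in mind for what follows.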

The figure below illustrates this approach in a pictorial form. It’s a pretty picture — with the use of a gradient fill, no less! — so things should now be perfectly clear. As the picture shows, obviously, there are a lot more metrics at the operational level than there are at the highest level. And, you can see how a CEO would be able to simply “drill down” from his level if he comes in and sees an issue with one of his metrics one morning. Why, if profitability slips, he just may be able to drill all the way down to the PCB technician who is getting sloppy with his solder usage!

[Figure: Metrics hierarchy framework]

Are you still reading this post? If so, I’d bet it’s for one of two reasons:

  1. You are spitting mad and feel like you need to at least scan the rest of the post before ripping me mercilessly for my naiveté
  2. You think this is brilliant, and you’re just itching to print it out to show it to the CEO of your company

If you fall in the latter category, then BEGONE! Do NOT press Print. Do NOT collect $200. As a matter of fact, do your company a favor and send yourself to jail!


Go away.

Stop reading this!

Okay, you’re still with me. And, that little voice in the back of your brain that was whispering, “I think he might have his tongue thoroughly lodged in his cheek, so let’s hear him out before calling him a nincompoop”…was right.

The above “proposed process” is frighteningly close to what many BI vendors and members of the business community actually believe (the BI vendors simply ought to be ashamed of themselves; business managers who believe this…will learn the folly of their ways eventually).

I’m not saying that this approach isn’t nirvana. It’s a lofty ideal that, unfortunately, is almost never attainable. Now, a claim can be made that, attainable or not, if this is what we aim for, then we’ll be heading in a positive direction. As a great professor of mine would say: “Maybe so.”

What’s interesting to me is that I have had two experiences in the last week where sanity and pragmatism have prevailed. One experience was with a high tech client of Bulldog Solutions. The other experience was at a committee meeting for the United Way of Central Ohio. What? A nonprofit?! Actually taking a more viable approach to measurement than many for-profit companies?! Joe! Joe! Say it ain’t so!

It is so.

A High Tech Example

(Understand that I have to speak in generalities here to protect the confidentiality of the client.)

In the high tech case, the organization recognized that there is a measurement disconnect between end-of-the-day, rubber-hits-the-road business unit results and the tactics that they expect to use to drive those results.

What they did — and I played this back to the fellow to be sure I heard him right — was to dive in and understand their business by being in the business, by bringing in customers and stakeholders and listening to them, and by thinking about what their value add was and the complete value chain for their end users. Then, they developed a couple of high-level strategies that, if they had done their homework right, would drive the revenue/growth results they were shooting for. They converted those strategies into tactics. And, here’s where it got interesting. They then focused on measuring the effective execution of those tactics. Now, the knee-jerk response is, “Well, that’s not in conflict with your pyramid framework at all, is it?” But, au contraire! The difference was that they were very much not trying to directly link the tactic to a top line goal. They were saying that, if they missed their high-level goals, then one of two things (or a combination) happened:

  • Their strategy was ill-conceived and, consequently, the tactics did not work
  • The strategy was solid, but the tactics were poorly executed

By focusing on metrics for the tactics that were tied closely to effective execution, they could determine which of these two root causes was really in play.

This makes academics and theoreticians uncomfortable, because it acknowledges that, at the end of the day, there is thought, knowledge, experience…and a little bit of instinct and supposition…that goes into setting a strategy. And, sometimes that strategy is a big, fat whiff.

A good way to increase your chances of swinging for the bleachers and then hearing the thwop! of ball hitting mitt (the catcher’s mitt) is to: 1) spend a lot of energy and resources trying to link tactical execution results directly to top-line strategic targets, and 2) sit back and wait for that linkage to be made rather than driving the business forward in an imperfect world.

But…A Nonprofit!?

Nonprofits regularly get dinged for not running their organizations “more like a business.” And, that’s a fair accusation at times. But, nonprofits also have a really tough row to hoe when it comes to performance measurement. In the good ol’ days when process engineering was limited to Manufacturing, measurement was “easy” — what’s my first pass yield? what’s my waste? what’s my throughput? Then, we started to apply process engineering to other areas of the business. Marketing was the last holdout. “Marketing?! Measurement? But…but…but we’re all about awareness and brand. You can’t measure those!” Well, lots of ways of measuring that sort of thing are cropping up. But it’s still damn tough.

Cut to the nonprofit sector. Do you have any idea how hard it is to count “nots”? For instance, how many homeless people are not staying in shelters? How many people did not become homeless because they received one-time financial assistance to help them out of a tough spot? It’s tough. REAL tough.

United Way — I’ve worked with one extensively (in Austin), and am just starting to work with another one — has always faced that challenge head-on. Their agencies have to link the outcomes (tactical) that they are trying to achieve to high-level goals (strategic). They have to state up front what outcomes they expect from the programs included in their proposal, and they have to identify 2-3 measures that, if not a direct measure of that outcome, are a reasonable proxy. I learned this at the knee of a fellow named Pat Craig, who was the volunteer chair of one of the first committees I ever sat on at the United Way Capital Area. But…it’s a concept that permeates United Way.

Today, I attended a results committee meeting with the United Way of Central Ohio. The lady who was slated to deliver the “approach to identifying performance measures,” Lynette Cook, was out sick, so the material was ably presented by another staff member. But, the process that Cook developed looks to be solid. My understanding at this point is that United Way of Central Ohio has spent a lot of effort getting more focus around the areas of social services that they are going to try to impact. They’ve divided those up into four high-level areas. Within each area, they have a couple of sub-areas. Those sub-areas, then, are going to work to identify the most pressing issues and the outcomes that are most needed…and then identify performance metrics for measuring progress.

Again, they are not trying to put a direct, hierarchical measurement link between these sub-areas and the UWCO overall mission. The metrics simply do not “roll up” like that.

A Final Word

These two examples stand out because they are so much the exception rather than the rule. The problem with simple pictures — constructed in 5 minutes in PowerPoint — is that they can support a simple story. And, that story can sure sound good. But, a good story isn’t necessarily reality. All too often, though, we treat reality as simply “details.”

I am all about having a solid, workable framework for metrics development. But, that needs to be a framework grounded on planet Earth. Business is complicated. It’s getting more complicated every day. Strategy is not reserved for only the highest levels of an organization any more than operational execution is reserved for only the lowest levels. There is a blending across all levels, and that blend varies across departments.

We can collect and report on more data than we have ever been able to. It is a fallacy, though, to believe that more data means that, despite the complexity of the real world, we can fill in all the boxes in a conceptual pyramid of metrics. That’s just not true. I doubt it will ever be true. Trying to fill in all the boxes, and spending endless cycles explaining why it should be doable (if wishes were horses…), is a good way to deliver data with no insights, metrics with no actionability.

