Dashboard Design Part 3 of 3: An Iterative Tale

On Monday, we covered the first chapter of this bedtime tale of dashboard creation: a cutesy approach that made the dashboard a straight-up reflection of our sales funnel. Last night, we followed that up with the next performance management tracking beast — a scorecard with lots of detail (too much, really) and too much equality across the various metrics. Tonight’s tale is where we find a happy ending, so snuggle in, kids, and I’ll tell you about…

Version 3 – Hey…Windows Was a Total POS until 3.1…So I’m Not Feeling Too Bad!

(What’s “POS?” Um…go ask your mother. But don’t tell her you heard the term from me!)

As it turned out, versions 1 and 2, combined with some of the process evolution the business had undergone and with some data visualization research and experimentation, meant that I was only a week’s worth of evenings and a decent chunk of one weekend away from something that actually worked:

Some of the keys that make this work:

  • Heavy focus on Few’s Tufte-derived “data-pixel ratio” – asking the question for everything on the dashboard: “If it’s not white space, does it have a real purpose for being here?” and only including elements where the answer is “Yes.”
  • Recognition that not all metrics are equal – I seriously beefed up the most critical, end-of-the-day metrics (almost too much – there’s a plan to scale down the one bar chart in the future once a couple of other metrics are available)
  • The exact number of what we did six months ago isn’t important – I added sparklines (with targets when available) so that the only specific number shown is the month-to-date value for the metric; the sparkline shows how the metric has been trending relative to target
  • Pro-rating the targets – it made for formulas that were a bit hairier, but each target line now assumes linear growth over the course of the month; the target on Day 5 of a 30-day month is 1/6 of the total target for the month (see the formula sketch just after this list)
  • Simplification of alerts – instead of red/yellow/green…we went to red/not red; this really makes the trouble spots jump out
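
For illustration only, here is a minimal sketch of how the pro-rated target and the red/not-red flag might be expressed as plain Excel 2003 worksheet formulas; the cell references (B1 for the full monthly target, B2 for the month-to-date actual) are placeholders I’m assuming for the example, not references from the actual workbook.

    Pro-rated target for today, assuming the full-month target sits in B1:
    =B1*DAY(TODAY())/DAY(DATE(YEAR(TODAY()),MONTH(TODAY())+1,0))

    Red/not-red test, assuming the month-to-date actual sits in B2 (e.g. as a conditional formatting “Formula Is” rule):
    =B2 < B1*DAY(TODAY())/DAY(DATE(YEAR(TODAY()),MONTH(TODAY())+1,0))

The DATE(YEAR(TODAY()),MONTH(TODAY())+1,0) piece resolves to the last day of the current month, so the ratio is simply days elapsed over days in the month: Day 5 of a 30-day month gives 5/30, or 1/6 of the monthly target.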

Even as I was developing the dashboard, a couple of things clued me in that I was on the right track:

  • I saw data that was important…but that was out of whack or out of date; this spawned some investigations that yielded good results
  • As I circulated the approach for feedback, I started getting questions about specific peaks/valleys/alerts on the dashboard – people wound up skipping the feedback about the dashboard design itself and jumping right to using the data

It took a couple of weeks to get all of the details ironed out, and I took the opportunity to start a new Access database. The one I had been building on for the past year still works and I still use it, but I’d inadvertently built in clunkiness and overhead along the way. Starting “from scratch” was essentially a minor re-architecting of the platform…but in a way that was quick, clean and manageable.

My Takeaways

Looking back, and telling you this story, has given me a chance to reflect on the key learnings from this experience. In some cases, the learning reinforced what I already knew. In others, the ideas were new (to me):

  • Don’t Stop after Version 1 — obviously, this is a key takeaway from this story, but it’s worth noting. In college, I studied to be an architect, and a problem I always had over the course of a semester-long design project was that, while some of my peers (many of whom are now successful practicing architects) wound up with designs in the final review that looked radically different from what they started with, I spent most of the semester simply tweaking and tuning whatever I’d come up with in the first version of my design. At the same time, those peers could demonstrate that the core vision for their projects was apparent in every design, even if it manifested itself very differently from start to finish. This is a useful analogy for dashboard design — don’t treat the dashboard as “done” just because it’s produced and automated, and don’t count it as a “win” simply because it delivered some value. It has to deliver the value you intended, and deliver it well, to truly be finished…and then the business can and will evolve, which will drive further modifications.
  • Democratizing Data Visualization Is a “Punt” — in both of the first two dashboards, I had a single visualization approach, and I applied it to all of the data. That meant the data was shoe-horned into whatever that paradigm was, regardless of whether it mattered more as a trend or as a snapshot, whether it was a leading indicator or a direct reflection of this month’s results, or whether it was a metric that tied directly to the business plan or data that was “interesting” but not necessarily core to our planning. The third iteration finally broke out of this framework, and the results were startlingly positive.
  • Be Selective about Detailed Data — especially in the second version of the scorecard, we included too much granularity, which made the report overwhelming. To get anything useful out of it, the consumers of the dashboard had to take the data and chart it themselves. One of the worst things a data analyst can do is deliver a report that requires additional manipulation before any conclusions can be drawn.
  • Targets Matter(!!!) — I’ve mounted various targets-oriented soapboxes in the past, and this experience did nothing but shore that soapbox up further. The second and third iterations of the dashboard/scorecard included targets for many of the metrics, and this was useful. In some cases, we missed the targets so badly that we had to go back and re-set them. That’s okay. It forced a discussion about whether our assumptions about our business model were valid. We didn’t simply adjust the targets to make them easier to hit — we revisited the underlying business plan based on the realities of our business. This spawned a number of real and needed initiatives.

Will There Be Another Book in the Series?

Even though I am pleased with where the dashboard is today, the story is not finished. Specifically:

  • As I’ve alluded to, there is some missing data here, and there are some process changes in our business that, once completed, will drive some changes to the dashboard; overall, they will make the dashboard more useful
  • As much of a fan as I am of our Excel/Access solution…it has its limitations. I’ve said from the beginning that I was doing functional prototyping. It’s built well enough, with Access as a poor man’s operational data store and Excel as the data visualization engine, that we can use it for a while…but I also view it as the basis of requirements for an enterprise BI tool (in this regard, it jibes with a parallel, client-facing initiative of ours). As it stands, the dashboard only gets updated when either the Director of Finance or I check it out of SharePoint and click a button. It’s not really a web-based dashboard, it doesn’t allow drilling down to detailed data, and it doesn’t have automated “push” capabilities. Those are all improvements that I can’t deliver with the current platform.
  • I don’t know what I don’t know. Do you see any areas of concern or flaws with the iteration described in this post? Have you seen something like this fail…or can you identify why it would fail in your organization?

I don’t know when this next book will be written, but you’ll read it here first!

I hope you’ve enjoyed this tale. Or, if nothing else, it’s done that which is critical for any good bedtime story: it’s put you to sleep!  🙂

11 Comments


  1. Looks like you made a spectacular leap in terms of information visualization between pt. 2 and pt. 3. What was the driver of that change?

  2. Tim,

    >>As much of a fan as I am of our Excel/Access solution…it has its limitations (…) but I also view it as being the basis of requirements for an enterprise BI tool

    With tools like Micro Charts and XLCubed you can turn your Excel dashboard into an enterprise BI solution. XLCubed brings data connectivity to relational and OLAP sources, drill-down capabilities, and the ability to publish Excel dashboards to the web as a scalable multi-user web application:

    http://www.bonavistasystems.com/Products_SparkLiner_Dashboards.html

    http://www.xlcubed.com/en/Products_XLCUbed_Overview.html

    Andreas

    http://blog.xlcubed.com/

  3. @clint It really was something of a lightbulb moment — it just took me a year to find the switch. I kept making additional iterations and knowing immediately that they were another “not it.” By the time I finally *read* Few’s book, it all came together.

    @Andreas It’s almost like you have to be an apologist if you’re “just” using Excel at times. I’ve casually poked around on BonaVista Systems and XLCubed (and Crystal XCelsius for that matter) in the past, but never bitten the bullet and tried using them. Thanks for the reminder that they’re there!

  4. Pingback: Contextures Blog » Excel Twitters 20080830

  5. Here are some additional thoughts …

    The balanced scorecard of Kaplan and Norton tracks business in the categories of financial, customer, internal processes, and learning and growth, where each category is to have objectives, measures, targets, and initiatives.

    In most organizations, a significant change in strategies can occur over time for various reasons; e.g., perhaps there was a change in leadership. Hence, organizations that follow the balanced scorecard and similar strategy-tracking approaches, as described above, do not have a structured framework for orchestrating and making analytically/innovatively derived decisions from an assessment of long-lasting value-chain metrics.

    Balance is important; however, it can be detrimental to have forced balance, as described in the strategy-driven approach above. What is needed is a total integration and balance of selected metrics to avoid conflicting goals. For example, focusing on on-time delivery could lead to a reduction in product quality in order to meet ship dates.

    Also, setting goals without a roadmap for making process improvements can lead to resource-draining firefighting activities; e.g., one week a metric could be red and the next week green just from common-cause process variability – when nothing was done differently to the process between weeks. Goals are not bad, but goals need to be set after analytically/innovatively assessing the organization as a whole – to avoid organizational sub-optimization. When a metric needs to be improved, that metric is “pulling” for the creation of a process-improvement project.

    As part of any such effort, it is important to determine not only what to measure but also how to report it, so that metric performance tracking leads to the most appropriate actions. This is achieved with satellite-level financial and 30,000-foot-level operational metric reporting. These metrics assess process predictability, and, for processes that are predictable, a prediction statement is made.

    A long-lasting value-chain measurement system can quantify how the organization is doing relative to its value chain. Current levels of value-chain satellite-level and 30,000-foot-level metrics can then be structurally and analytically/innovatively assessed as a whole to build targeted strategies, which in turn lead to process-improvement or design projects for achieving those strategies.

    The Integrated Enterprise Excellence (IEE) system provides a methodology to avoid potential conflicts by basing the selection of all metrics on the enterprise value chain and assigning every metric to an owner who is accountable for that metric’s performance. These metrics can be cascaded downward to lower organizational functions, where they are assigned to owners who have performance accountability. With the IEE system, basic value-chain metrics will not change even if there is an organizational change. Only the ownership will change; i.e., the org chart is subordinate to the value chain.

  6. I like the final product, plus the journey sounds familiar. You said what you didn’t use (BonaVista, etc.), but was this all done in Excel? Any tools for the sparklines or were those homegrown?

  7. @Forrest You point out a lot of the challenges when it comes to implementing some of the more traditional BPM approaches, and IEE sounds like it is a methodology worth exploring. However, I’m not sure any of these would have addressed one of the most fundamental challenges we (and many young, small companies) face, which is that the value chain is very dynamic — there is a lot of experimentation involved, especially in a services-oriented business, when it comes to finding the right niche and then filling it effectively.

    @Greg This was all done in Excel 2003, with no add-ons. The only macro in the spreadsheet sits behind a “refresh all” button — it refreshes the underlying data and updates one cell with when the data was last refreshed (it’s mainly there for usability, so that three different people in the company, who all have their ODBC connections set up properly, can refresh the data at any time). As I was working on this, a co-worker was busy developing a separate, external-facing dashboard using Excel 2007 — we would regularly compare notes, and he would point out how much easier much of this was to do in 2007. The sparklines are just small, very simple charts.
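
    (For what it’s worth, a macro along those lines can be tiny. Here is a rough, hypothetical sketch; the sheet name and the cell used for the “last updated” stamp are placeholders for the example, not the actual workbook’s.)

        Sub RefreshAllData()
            ' Refresh every external data range / PivotTable in the workbook...
            ThisWorkbook.RefreshAll
            ' ...then stamp a cell with the time of the last refresh
            ' (sheet and cell below are placeholder names)
            Worksheets("Dashboard").Range("A1").Value = Now
        End Sub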

  8. Tim,

    I loved your post relating your journey. The year you spent finding the light switch is short in comparison to most of us. I use Excel for prototyping but also recognize it’s not for large audiences.

    Somewhere in your blog you mentioned that in your new job you are looking for the platform to take it to the next level, and that you were evaluating vendors.

    I’d like to know your thoughts on MS reporting services (not Analysis… and the cubes). I’m playing with it and looking for a way to recycle the Excel prototypes into it.

    Regards,

    Jose

  9. @Jose Well…my journey with that dashboard ended when I left Bulldog Solutions a couple of months after I wrote this post. For this dashboard, but even more for the external-facing dashboards I referenced in my earlier comment, we had definitely done some exploration of MS Reporting Services. The co-worker I referenced, who was running point…also left Bulldog, and he was the person who had done the bulk of the research. As I recall, MS was promising from an out-of-pocket licensing perspective, as we already had most of what we needed license-wise in-house. The kicker was that we were going to need to stitch together MS Reporting Services, SQL Server data, and SharePoint to get to a fully realized solution, so it was going to take some development chops to get it working and maintain it. We’d also thought about trying to use something like XLCubed…but didn’t really get to the formal exploration stage there.

  10. Jose;

    I am the co-worker Tim speaks of, who also left Bulldog… Tim hits the nail on the head. MS has an extremely promising tool and continues to make its way up the Gartner Magic Quadrant. With SQL Server 2005 and Office 2007, it is becoming a little easier to create solutions (I have done so at two separate companies within the last year). The reason MS is a good solution is that most of us have MS in house, or can purchase these tools more cheaply and easily than other solutions, because most companies will spring for MS tools a lot more readily than for other systems. Plus, you have better support for MS in house right now (I’m assuming). Furthermore, unless you have gone through a good period of prototyping what you want to dashboard out (like Tim has done above), an expensive software system is of little use, especially if you haven’t exhausted MS’s capabilities.

    Having said that, if you are ready to take the next step, MS can pose some problems.

    MS is strong in pulling and pushing data (SQL Server 2005), but the delivery can become an issue. Dashboards in Excel can be super-robust with some good technique, but trying to build on a global scale takes a LOT of time (imagine counting pixels all the time versus having WYSIWYG with another tool) and patience. In short, with MS, as Tim stated, you are stitching pieces together in order to deliver a full solution, and you will eventually need a lot of development to make things work. This is very similar to an open source solution. Therefore, even though the price at the outset is lower than a traditional BI tool (not by a huge amount, either), you have to weigh it against what resources you will need to bring in to make a full solution. Put MS’s notorious reputation for bugs on top of that, and I tend to lean toward a different solution (seriously, if anyone wants to argue this, have them spend every waking moment building a solution in MS and watch them sweat when implementing it!)

    My advice: build prototypes in MS. You can change and update these pretty quickly. At the same time, spec out the requirements you are going to need from the tool to get what you want. If MS can do this for you, by all means use it, but if you find that it can’t, then it’s time to start looking for another tool. The key is to have what you are trying to build spec’d out already. If not, any good salesman will show you a lot of things that have nothing to do with what you are trying to do!

  11. @Tim: Thanks (again) for sharing your experiences / you are really honoring your architecture background in business intelligence.
    SharePoint… Hmmmm! Hadn’t considered how the effort grows in integrating RS vs. just uploading XL… thanks for that tidbit.
    Also, can you please forward my e-mail address to JT.

    @JT: It’s great to have you joining the conversation, and to have your insight. I’m currently switching from MS Access to SQL Server 2008 (Express, of course) to improve scalability. I haven’t used Reporting Services, but it definitely looks good on paper… especially the built-in integration with SAP NetWeaver BI. I wish their report designer was more like Excel and less like a report-writing tool such as Crystal Reports (Darn!)… Nevertheless, I’ll give it a try, but your comment on the extra effort makes a lot of sense.
    That said, I’d love to continue the conversation. If you would, please send me the best way to contact you via e-mail.

    Cheers!

    Jose
