Someone suggested I review the new IBM Center for The Business of Government report, Use of Dashboards in Government by Sukumar Ganapati of Florida International University, pointing out one irony off the bat: there aren’t many examples of dashboard illustrations in the report itself. So I first decided to create a dashboard of this PDF report in my social knowledgebase, use it to analyze the report, and reference all of my own dashboard work relating to most of the examples it covers.
- Recovery.gov
- Data.gov
- IT Dashboard
- Open Government Initiative
- Performance.gov
- The OMB 25-Point Plan
- USAspending.gov
- Human Resources Dashboard
- FDA-TRACK Dashboards
- USPTO’s Data Visualization Center
- USPTO’s Patents Dashboard
- Data quality is key to the credibility of dashboard performance measures
- Best practices resources are necessary in the design and use of dashboards
- Dashboards are only tools; and
- Effectiveness depends on use
- Dashboards should fit on a single page (or screen)
- Dashboards should be simple
- Dashboards should use the best display medium for communicating data effectively.
For example, I used the PDF files of the US EPA Report on the Environment, the DoD Office of Inspector General Semi-Annual Report to Congress, and the US Army’s Weapon Systems 2011 to create dashboards that distilled the findings of each report.
- Mapping a PDF file to a knowledgebase with well-defined web addresses for all the key parts of the content
- Converting the knowledgebase to a spreadsheet so the data are in an interoperable, reusable format for those wanting to use different dashboard tools
- Making it easy to import the spreadsheet into dashboard tools with a wide variety of statistical and visualization functions for processing in-memory data in the cloud, and
- Using dashboard tools that support both conventional business process management and the newer dynamic case management (a methodology beyond so-called agile software development that adds semantics and rules to the data rather than to lines of software code).
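The first three steps above can be sketched in a few lines of code. This is a minimal illustration, not the actual tooling I use: the section data, metric names, and knowledgebase URL scheme are all hypothetical stand-ins for content extracted from a real PDF report.

```python
import csv
import io

# Hypothetical records extracted from a PDF report; in practice these
# would come from parsing the report's sections.
BASE_URL = "https://example-knowledgebase.org/report"  # assumed URL scheme

sections = [
    {"section": "Findings", "metric": "Projects reviewed", "value": 42},
    {"section": "Findings", "metric": "Projects at risk", "value": 7},
    {"section": "Spending", "metric": "Funds awarded ($M)", "value": 310},
]

# Step 1: give each key part of the content a well-defined web address.
for i, rec in enumerate(sections, start=1):
    rec["url"] = f"{BASE_URL}/{rec['section'].lower()}/{i}"

# Step 2: convert the knowledgebase to a spreadsheet (CSV) so the data
# are in an interoperable, reusable format for any dashboard tool.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["section", "metric", "value", "url"])
writer.writeheader()
writer.writerows(sections)
csv_text = buf.getvalue()

# Step 3: the CSV imports cleanly into dashboard tools; a simple roll-up
# here stands in for the tool's in-memory statistical processing.
rows = list(csv.DictReader(io.StringIO(csv_text)))
total = sum(int(r["value"]) for r in rows if r["section"] == "Findings")
print(total)  # 49
```

The point of the CSV step is interoperability: once the data leave the knowledgebase in a plain spreadsheet format, readers are not locked into any one dashboard product.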
The process has reminded me of the importance of the people behind these principles, those who understand and know how to create functional dashboards, and of why many dashboards fall short of the mark.
Aneesh Chopra said recently that “the government needs data science and data scientists.” Well, I would like to volunteer to be his first chief data scientist and improve his dashboards!
_________________________________________________________________
Here are the more detailed summaries of the Four Lessons Learned:
Lesson One: Data Quality is Key to the Credibility of Dashboard Performance Measures
The dashboards in the case studies (especially the cross-agency ones) have faced data quality issues. This compromises dashboard performance measures and could eventually damage the dashboard’s credibility. To overcome some of the data quality issues, standardized data definitions and training of key agency personnel are required. Adopting a standard schema, such as the Extensible Business Reporting Language (XBRL) used in business applications, for federal financial dashboards such as Recovery.gov or USAspending.gov would enhance data quality and reporting efficiency.
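Standardized data definitions can be enforced programmatically before reported records ever reach a dashboard. The sketch below is illustrative only: the field names and rules are hypothetical, not taken from XBRL or any real federal reporting schema.

```python
# Hypothetical standardized data definitions for a spending record;
# a real schema (e.g., XBRL-based) would be far richer.
REQUIRED_FIELDS = {"award_id": str, "recipient": str, "amount": float}

def validate(record):
    """Return a list of data-quality errors for one reported record."""
    errors = []
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}")
    # Business rule: only check the value once the structure is valid.
    if not errors and record["amount"] < 0:
        errors.append("amount must be non-negative")
    return errors

good = {"award_id": "A-001", "recipient": "Acme Corp", "amount": 1200.0}
bad = {"award_id": "A-002", "amount": -5.0}
print(validate(good))  # []
print(validate(bad))   # ['missing field: recipient']
```

Checks like these, applied at the point of submission rather than after publication, are one concrete way agencies could catch the data quality problems that undermine a dashboard’s credibility.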
Lesson Two: Best Practices Resources Are Necessary in the Design and Use of Dashboards
Agencies have different design approaches to dashboards. Whereas the USPTO dashboards are visually rich, the FDA-TRACK dashboards are essentially tables. The Recovery.gov and USAspending.gov dashboards feature maps. Although design may be idiosyncratic and vary based on technical capacity within the organization, a set of best practices or standards would enhance design quality. The Usability.gov website, developed a decade ago, enhanced government websites by providing standardized guidelines. A website for standardizing dashboards or sharing best practices would be equally useful. Focus group feedback would assist in enhancing the usability of the dashboards, as would the creation of communities of practice within government.
Lesson Three: Performance Measures Should Reflect Organization Goals
Performance measures differ based on agency needs. Cross-agency dashboards have common measures. The essential approach should be to align performance measures to organizational goals. This increases the usability of dashboards. Responding to different audiences requires reporting different performance metrics. Indeed, performance measures in some dashboards (e.g., Recovery.gov, USPTO’s Data Visualization Center, FDA-TRACK) evolved in response to different audiences’ needs.
Lesson Four: Dashboards are Only Tools; Effectiveness Depends on Use
Dashboards are only tools to visualize performance data. Their effectiveness depends on how organizations use them to enhance internal performance and external accountability and transparency. Organizations should be cognizant of both the strengths and weaknesses of dashboards. Dashboards need to be useful to the organization’s purposes. In internal organizational management, this implies that dashboards are used in the decision-making process (e.g., the face-to-face sessions based on the Federal IT dashboard and FDA-TRACK to identify weak projects). At the external accountability level, use of dashboards means that agencies are exposing their performance metrics to public scrutiny. In this context, both the dashboard performance measures and the underlying data need to be publicly accessible for credible organizational accountability.