Using analytics to make better decisions is taking root in agencies across the government, notes a new report, which explores how a dozen pioneers did it. The report, by the Partnership for Public Service and the IBM Center, also offers advice from leaders of some of those organizations that shows how others can make better decisions using analytics as well.
Investing in, and using, program evaluation has been a hard sell in many agencies for years. While evaluation is important for long-term program assessments, it can be expensive and take years to complete. But with new technology and greater availability of data, many agencies are beginning to take advantage of the value of existing real-time administrative data. This movement is called “data analytics.” And the immediacy of results is appealing to many executives.
In my blog post last year about an earlier 2011 report, From Data to Decisions: The Promise of Analytics, I talked about the importance of investing in trained data analysts in agencies – “data detectives” – who solve problems and tell stories with the data.
But how do you get started? This new report, From Data to Decisions II: Building an Analytics Culture, addresses that question. Its key points:
- Start with a systematic and disciplined approach
- Make analytics the way you do business
- Get the people piece right
To illustrate these points, the Partnership invited several pioneers from federal organizations it interviewed to come and talk to an invited audience about their experiences.
Federal Emergency Management Agency. Carlos Davila is the director for the business management division of FEMA’s Recovery Directorate. Prior to that, he had earned a PhD in history and anthropology and served as a management consultant in the technology industry. His colleagues call his office “Nerd Palace” because of all the analyses it undertakes.
His division mapped out its operations and intended impact using a logic model. Davila launched an analytics program to determine whether services like debris removal, housing, and financial assistance were achieving the desired results. He created a special team to measure the effectiveness of his division; its initial report focused on the quantity of services delivered. He quickly sent the team back and asked it to focus instead on the quality of services and their impact on people. The team did, and the shift changed what the staff saw as their key mission – making it more people-oriented, toward those who experienced losses from a disaster.
He promotes among his staff the need for “analytic thinking, not just process thinking.” He calls the work he does “performance analysis” – not performance measurement or performance assessment. He says this sends a very different symbolic signal to agency staff by engaging them in problem-solving, and that data analytics should not be seen as a “gotcha” exercise.
Transportation Security Administration. Daniel Liddell is the Federal Security Director for TSA at the Syracuse, New York airport, and also oversees spoke operations at six other airports in the mid-state New York region. Prior to working at TSA, he was a combat-decorated Marine for 23 years.
Five years ago, he set out to link staff training and performance, and to measure their effects on security at airports. His staff identified more than 1,300 specific skills necessary to be a successful screener. He engages his front-line managers in diagnosing problems, collecting real-time data (such as rates of detecting fake bombs in luggage), and then conducting an immediate “lessons learned” feedback briefing with the staff involved.
To some extent, Liddell adapted what he had learned in the Marines, where it is called “hotwashing” – an active, hands-on training and skill-building approach for teams. He said they started with a set of performance standards that described “where we are now” and compared them with training standards that described “where we would ideally like to be.” The gap was where he and his teams targeted their training and analytic efforts. He says he relies on a “bedrock of a systems analysis approach.” He found these efforts significantly improved security officer performance on the front line.
When asked whether his approach was being used elsewhere in TSA, he said it has spread to about 30 other airports so far.
Internal Revenue Service. Dean Silverman is a senior advisor to the Commissioner and is the lead executive for IRS’s new Office of Compliance Analytics. The Commissioner invited him to join the IRS in January 2011 to stand up a new unit focused on data-driven analytics. Prior to that, he was a business school professor and a consultant with a law degree.
He encourages the use of “systems thinking” among staff and pushes them to identify problems, such as identity theft, and to determine when and where they will occur – and what kind of data would be needed to predict and prevent them. The staff then test their hypotheses in the “real world” via pilots or simulations.
Silverman says the new function he leads is not intended to grow into a large, central office. Rather, he envisions a small staff that serves as specialists on initiatives that would be owned by the different IRS divisions. He sees their role as more of a catalyst, and that in the long run, this would begin changing the culture to be more analytic – “data detectives” – in approach toward the huge volumes of data that they already collect.
He says that by focusing on initiatives (which are finite) and topics outside the normal flow of work, others in the IRS should come to see his office as a positive contribution, not a threat. Interestingly, he says the initiatives he takes on are “really big problems, not ones that are easily solved,” in order to demonstrate the power of analytics.
He also said that, when creating an analytics function, it is important to distinguish between analytic skills and problem-solving skills. He sees finding talent with problem-solving skills – people driven by curiosity and a sense of higher purpose – as a key to a successful analytics program. Interestingly, this seemed to be borne out by the three pioneers, none of whom had a heavily statistical or engineering background.
John Kamensky is a Senior Fellow with the IBM Center for The Business of Government, where this article originally appeared. He is also an Associate Partner with IBM’s Global Business Services.