This and other topics were covered on the August 25 episode of “Federal Spending,” a new Web-based show aimed at tracking federal dollars. In addition to examining the salient data sets related to the housing crisis, guests discussed the flawed methods by which congressional organizations monitor the government-sponsored enterprises (GSEs).
In the years prior to the housing collapse, there was no shortage of warning signs. As early as 2004, Alan Greenspan warned that Fannie and Freddie posed a “systemic risk” to the country’s financial system, citing investors’ belief that the government would bail them out if they ran into trouble. And run into trouble they did. As part of the 2008 conservatorship of Fannie and Freddie, the U.S. Treasury was to provide up to $100 billion in capital in exchange for periodic dividends on preferred stock. That was quickly amended to allow for unlimited capital over the subsequent three years.
Dr. Clifford Rossi, Executive-in-Residence and Tyser Teaching Fellow at the Robert H. Smith School of Business, University of Maryland, said that while a number of behaviors contributed to the housing bubble, a lack of data quality was certainly one of them. “Data management and analytics during the years preceding the crisis, I would say, they are a very large part of what caused some of the problems,” he said.
The Congressional Budget Office (CBO) is the agency charged with providing Congress with the information and estimates required for the budget process. While the CBO uses a mortgage cash flow simulation to project GSE commitment costs and the stability of the housing market in general, Rossi said it’s the details around the data that tend to produce inaccurate estimates. “You can have the most analytically complex model,” said Rossi, “but if the data underlying that isn’t very strong, then you really haven’t improved your situation much at all.”
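The CBO does not publish its model as code, but the general shape of such a simulation is easy to sketch, and it makes Rossi’s point concrete. The following is a minimal, illustrative Monte Carlo cash flow simulation in Python; every parameter here (pool size, default sensitivity, house-price drift and volatility, loss severity, discount rate) is a made-up assumption for illustration, not a figure from the CBO’s methodology.

```python
# Illustrative Monte Carlo sketch of a mortgage-pool loss simulation.
# All parameters are hypothetical; the CBO's actual model is far more detailed.
import random

POOL_BALANCE = 100_000_000       # hypothetical pool of mortgages ($)
YEARS = 10                       # projection horizon
N_PATHS = 10_000                 # number of simulated paths
BASE_DEFAULT = 0.02              # annual default rate when home prices are flat
HPA_MEAN, HPA_VOL = 0.03, 0.10   # assumed home-price appreciation drift / volatility
LOSS_SEVERITY = 0.35             # loss given default
DISCOUNT = 0.04                  # discount rate for a rough fair-value figure

def simulate_path() -> float:
    """Return the present value of credit losses along one simulated path."""
    balance = POOL_BALANCE
    hpi = 1.0                    # home price index, normalized to 1 at the start
    pv_losses = 0.0
    for year in range(1, YEARS + 1):
        hpi *= 1.0 + random.gauss(HPA_MEAN, HPA_VOL)
        # Defaults rise as cumulative home-price appreciation falls:
        # a crude stand-in for the house-price "trigger" effect.
        default_rate = BASE_DEFAULT * max(0.5, 2.0 - hpi)
        defaults = balance * default_rate
        pv_losses += defaults * LOSS_SEVERITY / (1.0 + DISCOUNT) ** year
        balance -= defaults      # defaulted loans leave the pool
    return pv_losses

losses = sorted(simulate_path() for _ in range(N_PATHS))
print(f"mean PV of losses: ${sum(losses) / N_PATHS:,.0f}")
print(f"95th percentile:   ${losses[int(0.95 * N_PATHS)]:,.0f}")
```

Change HPA_MEAN or BASE_DEFAULT even slightly and both the mean and the tail estimates move materially, which is exactly why the data underlying the parameters matters more than the sophistication of the machinery around them.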
Originally, the CBO projected that GSE commitment costs through 2020 would total $291 billion, but the latest estimates push the total up 9% to $317 billion. Rossi said that in the grand scheme of things this is not a lot of money given the size of the mortgage crisis to date. “But it does call into question the way in which the CBO, or for that matter any other governmental agency, is going about putting these estimates together.”
The recently created Office of Financial Research (OFR) is one of the other agencies involved in improving the quality of financial data available to policymakers and regulators. Born from the Wall Street Reform and Consumer Protection Act (Dodd-Frank), the OFR seeks to gather critical information and analysis in the hopes of anticipating and responding to emerging vulnerabilities, such as Standard and Poor’s unprecedented recent credit downgrade. Rossi said one of the things that remains to be worked out under Dodd-Frank is a better focus on what the credit ratings agencies bring to the table. “There are often times more subjectivity in preparing ratings by any of those entities than the public has any appreciation for,” he said.
The OFR has the onerous task of not only conducting analysis, but also developing data standards in an attempt to make it easier to sort through and organize data. The lack of agreed-upon data definitions can make it impossible to perform meaningful analysis, said Mark Madsen of Third Nature, Inc. “I would hope that the federal agencies that are supposed to monitor, for example, the finance industry and banking industry would define those standards,” said Madsen, “but it seems like they don’t hire anybody who knows anything about data.”
Finding clean data is no easy matter, despite the drive for transparency in government figures. According to Madsen, the publicly available charts and graphs do little to provide an accurate picture of the underlying data. “It’s very hard to have public transparency over government spending and debt and the financial industry when you don’t have the details, and as it turns out, even the government doesn’t necessarily have those details,” said Madsen.
According to figures provided by the Mortgage Bankers Association, foreclosures are on the decline. That is true if only two consecutive quarters are compared, but looking at the past year, foreclosures are down just 0.01%, hardly enough to call an improvement. Shallow comparisons, misleading numbers and sweeping estimates can paint quite a different picture from reality.
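A toy example makes the point about shallow comparisons. In the sketch below the quarterly foreclosure counts are invented, but they reproduce the pattern Madsen describes: a quarter-over-quarter comparison that looks like progress next to a year-over-year change that is essentially zero.

```python
# Hypothetical quarterly foreclosure counts (illustrative numbers only).
quarters = {
    "2010Q1": 1_000_000,
    "2010Q2": 1_030_000,
    "2010Q3": 1_050_000,
    "2010Q4": 1_020_000,
    "2011Q1":   999_900,
}

def pct_change(new: int, old: int) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

qoq = pct_change(quarters["2011Q1"], quarters["2010Q4"])
yoy = pct_change(quarters["2011Q1"], quarters["2010Q1"])
print(f"quarter over quarter: {qoq:+.2f}%")  # about -1.97%: looks like progress
print(f"year over year:       {yoy:+.2f}%")  # -0.01%: essentially unchanged
```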
“If we had the details, then 100,000 people could be out there poking around, looking at stuff, as opposed to deriving metrics and everybody being in the dark about what’s in there,” said Madsen. “But maybe that’s what everybody’s afraid of.”
And so again, it comes down to data quality, which host Eric Kavanagh said can only be achieved with data that is comprehensive, consistent, relevant and timely. Citing information provided by Rick Sherman of Athena IT Solutions, Kavanagh said it’s important to be aware of the misconceptions surrounding data quality: that data cleansing will solve the issues, that data quality is an IT problem, and that data entry is to blame.
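Those four dimensions lend themselves to simple automated checks. Here is a minimal, hypothetical sketch of what such checks might look like against a handful of loan records; the field names, records and cutoff date are invented purely for illustration.

```python
from datetime import date

# Hypothetical loan records; field names and thresholds are invented
# to illustrate the checks, not drawn from any real system.
loans = [
    {"loan_id": 1, "balance": 250_000, "state": "TX", "as_of": date(2011, 6, 30)},
    {"loan_id": 2, "balance": None,    "state": "tx", "as_of": date(2010, 3, 31)},
]

REQUIRED = ("loan_id", "balance", "state", "as_of")

def incomplete(rows):
    """Comprehensive: every required field is present and populated."""
    return [r for r in rows if any(r.get(f) is None for f in REQUIRED)]

def inconsistent(rows):
    """Consistent: one agreed-upon representation, e.g. upper-case state codes."""
    return [r for r in rows if r["state"] != str(r["state"]).upper()]

def stale(rows, cutoff=date(2011, 1, 1)):
    """Timely: records refreshed recently enough to be usable."""
    return [r for r in rows if r["as_of"] < cutoff]

print("fails completeness:", incomplete(loans))
print("fails consistency: ", inconsistent(loans))
print("fails timeliness:  ", stale(loans))
# Relevance is harder to automate: it depends on whether the fields
# collected actually answer the question being asked.
```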
“With all these mergers of banking institutions, you sure lose a lot of context and insight on data points as they move through system after system after system,” said Kavanagh.
Pre-Show Abstract
The collapse of Fannie Mae and Freddie Mac was a factor in the massive financial meltdown of 2008, though there were many other factors that preceded the insolvency of these Government Sponsored Enterprises (GSEs). Could better data quality have helped avert this disaster? Or better processes? Register for Episode 4 of Federal Spending to learn from Dr. Clifford Rossi, Executive-in-Residence and Tyser Teaching Fellow at the Robert H. Smith School of Business, University of Maryland.
Prior to entering academia, Dr. Rossi had nearly 25 years’ experience in banking and government, having held senior executive roles in risk management at several of the largest financial services companies, including Citigroup, Washington Mutual, Fannie Mae and Freddie Mac. Rossi will take a hard look at the current state of affairs for the GSEs and the impact they are having on taxpayers. He’ll explore how the Congressional Budget Office estimates credit subsidies based on forward-looking projections using a fair value approach, and why those estimates are fraught with data and analytical perils. What kind of scenario modeling can help better predict the ultimate cost of these bailouts? Rossi will also provide a spreadsheet equipped with a Monte Carlo scenario model that you can use to make your own assessment.
Guests
Dr. Clifford Rossi, Executive-in-Residence and Tyser Teaching Fellow at the Robert H. Smith School of Business, University of Maryland
Mark Madsen, President of Third Nature, Inc.
Good Quotes
- “It’s not as though no one saw this all coming.”-Kavanagh
- “You can have the most analytically complex model, but if the data underlying that isn’t very strong, then you really haven’t improved your situation much at all.”-Rossi
- “You could have a simulation process that generates a number of very tightly clustered paths that says things are looking very rosy, yet you miss completely the fact that you could see a huge downturn in the market as a result of looking backward in time over this data.”-Rossi
- “These numbers can be very sensitive to the underlying data that you’re using and the parameterization of the model that you’re employing.”-Rossi
- “Data management and analytics during the years preceding the crisis, I would say, they are a very large part of what caused some of the problems.”-Rossi
- “What that led to was an inability, at least on the mortgage side of these businesses, to integrate their platforms seamlessly, and what that led to was virtually an incapability by risk folks, as well as anybody else for that matter, to really understand the profile of the risk they were putting on.”-Rossi
- “There are often times more subjectivity in preparing ratings by any of those entities than the public has any appreciation for.”-Rossi
- “It’s one of the things that remains to be worked on coming out of Dodd-Frank is a better focus on what the credit ratings agencies bring to the table.”-Rossi
- “Many companies are turning a blind eye to data quality problems.”-Kavanagh
- “If you don’t have agreement on key definitions, you’re going to get wildly different numbers.”-Kavanagh
- “No matter how well you do MDM, you can rest assured you are losing insight and context every time you move data from one system to another.”-Kavanagh
- “People would actually fall behind on their house payment before their car payments because if their car was repossessed, they couldn’t get to work.”-Madsen
- “Improvements in the badness of something is not the same as something getting better.”-Madsen
- “Aggregates hide the details.”-Madsen
- “You can manage anything by averages, but the average doesn’t tell the story because some places will be worse than others.”-Madsen
- “It’s very hard to have public transparency over government spending and debt and the financial industry when you don’t have the details, and as it turns out, even the government doesn’t necessarily have those details.”-Madsen
- “I would hope that the federal agencies that were establishing, that are supposed to monitor, for example, the finance industry and banking industry, would define those standards, but it seems like they don’t hire anybody who knows anything about data.”-Madsen
- “If we had the details, then 100,000 people could be out there poking around in the details, looking at stuff, as opposed to deriving metrics and everybody being in the dark about what’s in there, but maybe that’s what everybody’s afraid of.”-Madsen
Key Questions
How are the costs of the bailouts accounted for, and what are some of the issues associated with them?
What assumptions by the Congressional Budget Office (CBO) drive budget estimates and what are some concerns?
How big an issue was data quality and process quality in trying to merge all these books?
Is there a relationship between unemployment and foreclosures?
Good Insights
- In 2004, Alan Greenspan warned that Fannie and Freddie posed a “systemic risk,” saying both companies had been able to grow so quickly because investors believe the federal government will bail them out if they get into trouble.
- According to Meeker’s report, the real estate bubble was caused by the government’s push for home ownership, declining interest rates and personal savings rates, and aggressive borrowing and lending.
- Home prices rose 7% annually from 1997 to 2007, while building costs stayed relatively stable.
- CBO’s 2009 budget projections for Fannie and Freddie show $291b in combined subsidy costs, projected to reach a total of $389b by 2019.
- As part of the 2008 conservatorship of the GSEs, the U.S. Treasury was to provide up to $100b initially; the agreement was later amended to provide unlimited capital over three years.
- As their balance sheets went negative, the Treasury provided quarterly cash infusions.
- In return, the U.S. gets periodic dividends on the preferred stock it now owns: cash payments go out to cover losses from mortgage defaults, while the dividends received totaled $24b as of March 2011.
- The Office of Management and Budget (OMB) does not record credit subsidies for the GSEs; the GSEs are kept off the federal balance sheet and treated as non-governmental entities.
- The OMB does keep track of cash payments, but it does not account for the long-lived potential losses.
- The CBO estimates credit subsidies based on forward-looking projections using a fair value approach, but these are fraught with data and analytical perils.
- They use fairly standard industry analytics.
- It comes down to key assumptions and modeling, but those can create very, very different views of the outcome in terms of how much exposure, or how much contingent liability, exists for both Fannie and Freddie.
- The approaches are analytically rigorous, but it’s the data and the details around the data that drive the ultimate estimate.
- The CBO uses a mortgage cash flow simulation (a Monte Carlo trial) that estimates changes in the value of a mortgage pool based on assumptions for interest rates, prepayments, house prices and defaults.
- Home prices end up being a key trigger event for mortgage default.
- The CBO actually simulates a number of possible paths of home prices and interest rates.
- One of the tricks is that while it’s a fairly standard methodology, the results hinge on exactly what period of time the home price parameters are drawn from (see the sketch following this list).
- The average losses will change dramatically based on house price appreciation.
- In 2009, projected GSE commitment costs through 2020 were $291b; the latest estimates in 2011 suggest a 9% increase to $317b.
- In the years preceding the collapse, large banking institutions were gobbling up smaller ones, and that led to a huge undertaking of merging data systems.
- Many companies are turning a blind eye to their data quality problems, perhaps because they mistakenly believe that bad data is the only data quality issue they need to worry about.
- According to the Mortgage Bankers Association, fewer mortgages were in foreclosure or delinquent in the first quarter, but the same data shows that foreclosures are down only 0.01% from the same quarter last year.
- Total personal debt is at $11.4 trillion; 71% in mortgages.
- 90% of mortgage holders are current on their home loans.
- Instead of only looking at foreclosures, a leading indicator might be delinquencies, because people who fall three months behind or more tend to default at a much higher rate than those who fall thirty days behind.
- Year-over-year change captures past-to-current performance, but it isn’t an indicator of future performance.
- In 2005, Congress passed a law making bankruptcy harder to declare because of the grief bankruptcies caused the banks, and the number of bankruptcies plummeted.
- The cause of bankruptcies is that people had more debt than they had income.
- Bankruptcies up and foreclosures down can’t last forever, unless most of the bankruptcies are among people who don’t own homes.
- There are high unemployment and low foreclosure rates in Alaska; in the Midwest, it’s the opposite.
- There’s a lot of noise in the data, because much government data consists of estimates.
- The data quality problems in the government and financial sector stem from minimal data standards; agencies and organizations that duplicate work; minimal history or time series; and a lack of detail underlying the aggregates.
- We need to focus on better data, not more data. A lot of the metrics are derived from samples of data.
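As flagged in the home-price item above, the choice of estimation window is easy to illustrate. The sketch below uses invented annual house-price appreciation figures, not CBO data, and simply compares the drift and volatility you would feed into a path simulation depending on whether the historical window stops before the downturn or includes it.

```python
import statistics

# Hypothetical annual house-price appreciation (HPA), in percent.
# A boom-only window misses the 2007-2010 downturn entirely.
hpa_history = {
    1997: 5.1, 1998: 6.0, 1999: 6.4, 2000: 7.2, 2001: 6.8, 2002: 7.5,
    2003: 8.1, 2004: 9.4, 2005: 10.2, 2006: 4.0, 2007: -1.5, 2008: -9.8,
    2009: -4.1, 2010: -3.2,
}

def window_params(start: int, end: int):
    """Mean and standard deviation of HPA over the years [start, end]."""
    values = [v for year, v in hpa_history.items() if start <= year <= end]
    return statistics.mean(values), statistics.stdev(values)

boom_mean, boom_vol = window_params(1997, 2006)   # looking backward, pre-bust
full_mean, full_vol = window_params(1997, 2010)   # includes the downturn

print(f"1997-2006 window: mean {boom_mean:+.1f}%, vol {boom_vol:.1f}%")
print(f"1997-2010 window: mean {full_mean:+.1f}%, vol {full_vol:.1f}%")
# Feeding the boom-only parameters into a path simulation yields the
# "tightly clustered, rosy" paths Rossi describes; the full window does not.
```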
Resources
Timeline of Financial Meltdown:
http://timeline.stlouisfed.org/pdf/CrisisTimeline.pdf
Details on Stock Market Crash Oct. 1-10, 2008
http://www.money-zine.com/Investing/Stocks/Stock-Market-Crash-of-2008/
Fannie Mae Before the Meltdown: The View from August 2002
http://www.washingtonian.com/articles/people/8593.html
Fed Chief Warns of a Risk to Taxpayers
http://www.nytimes.com/2004/02/25/business/fed-chief-warns-of-a-risk-to-taxpayers.html
Contact Info:
Host: Eric Kavanagh – 512.426.7725
Show Manager: Rebecca Jozwiak – 817.320.3495
Robin Bloor, Chief Analyst, The Bloor Group