The past three decades have seen many notable changes in the U.S. banking industry, but chief among those has been bank consolidation, with a disproportionate number of smaller banks disappearing through mergers and acquisitions, as well as failures. There were 14,400 commercial banks at the end of the first quarter of 1984 and 7,022 at the end of the fourth quarter of 2008. By first quarter 2018, only 4,852 remained.
The scale of this consolidation has not been seen since the Great Depression. Economics professor Paul Wilson is using high-performance computing not only to analyze some of the factors driving this consolidation, but also to take some of the mystery and uncertainty out of that historical precedent.
Between 1984 and 2008, the average size of U.S. banks increased fivefold in terms of inflation-adjusted total assets, explains Wilson, who spends about six weeks each year conducting research with David Wheelock, vice president and deputy director of research for the Federal Reserve Bank of St. Louis. Their joint work has been mentioned by Federal Reserve chair Ben Bernanke; a high-level official from the Bank of England recently cited their work in a speech.
Together, Wilson and Wheelock are currently distilling more than three decades of data from thousands of banking institutions to help separate perception from reality when it comes to what banks should look like in the next 30 years.
Changes in regulation along with advances in information-processing technology created the environment for the growing size and diminishing number of banks, according to bank executives and industry analysts. These insiders contend that banks must grow larger to work more economically. Critics say recent policies favor large banks, kill competition and create banks “too big to fail.”
Wilson and Wheelock’s findings strike at the heart of this debate, and their answers are being gleaned from vast data sets consisting of dozens of observable variables: Think observations about consumer loans, business loans and real estate loans. Think observations about securities and the price of labor services. Think observations about income and condition reports, assets, inflation, and much more. Wilson then breaks down all those observations, not just by year but quarterly within those years.
To quantify the data-crunching process, Wilson’s initial sample consisted of 887,369 quarterly observations on all U.S. commercial banks between 1984 and 2006; the second sample was 868,647 quarterly observations on all commercial banks from 1984 to 2000, as well as a mix of commercial banks and bank holding companies comprising the largest top-tier banking organizations from 2001 to 2006.
If it sounds like a lot of data — that’s because it is. And it’s astronomically more than what’s been done to date. Previous highly regarded statistical studies on the topic have used sample sizes between 300 and 441.
With the Palmetto Cluster’s computing capabilities, Wilson can analyze data samples almost 3,000 times larger than those used in some of the most highly regarded statistical studies of commercial banks and bank holding companies. It’s no wonder Wilson and his Federal Reserve colleague’s findings are making waves near and far, the most recent being this: While size limits or “caps” on banks could end the perceived problem of banks “too big to fail,” they could, in reality, have the opposite effect on the economy by increasing the overall cost of banking services and preventing banks from economizing on scale.
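As a back-of-the-envelope check, the “almost 3,000 times larger” comparison follows directly from the figures quoted above. A minimal sketch (using the article’s own numbers; variable names are illustrative):

```python
# Comparing Wilson's largest sample to the sample sizes of earlier studies,
# using the figures quoted in the article.
largest_sample = 887_369          # quarterly observations, 1984-2006 sample
prior_low, prior_high = 300, 441  # range of sample sizes in earlier studies

ratio_vs_smallest = largest_sample / prior_low   # against the smallest earlier study
ratio_vs_largest = largest_sample / prior_high   # against the largest earlier study

print(f"{ratio_vs_largest:,.0f}x to {ratio_vs_smallest:,.0f}x larger")
```

Against the smallest earlier study the ratio works out to roughly 2,958, which is where the “almost 3,000 times” figure comes from; against the largest, the sample is still about 2,012 times bigger.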
That means caps could be more costly to the American public than the occasional bailout. It’s controversial, but it’s based on data, which is making experts pay attention.
The ability to crunch all their data on the Palmetto Cluster was key to the findings. To put it in perspective, one of Wilson’s first papers was published in the Journal of Monetary Economics in 2001. At that time, he was at the University of Texas, with access to one of the fastest machines in the world. Even so, he was able to estimate only one year at a time because of the limits of the computing technology.
Compare that to 2018: Using the Palmetto Cluster, he processed in the realm of a million observations for a paper he co-authored on banks “too big to fail” that appeared in the Journal of Applied Econometrics.
“What was impossible 18 years ago,” Wilson says, “we can now do at Clemson in a matter of weeks.”