Tag Archives: financial crises

Challenging the Role of Capital Adequacy using Historical Data

Bank Capital Redux: Solvency, Liquidity, and Crisis
By Òscar Jordà (Federal Reserve Bank of San Francisco and University of California Davis), Bjorn Richter (University of Bonn), Moritz Schularick (University of Bonn) and Alan M. Taylor (University of California Davis).

Abstract: Higher capital ratios are unlikely to prevent a financial crisis. This is empirically true both for the entire history of advanced economies between 1870 and 2013 and for the post-WW2 period, and holds both within and between countries. We reach this startling conclusion using newly collected data on the liability side of banks’ balance sheets in 17 countries. A solvency indicator, the capital ratio has no value as a crisis predictor; but we find that liquidity indicators such as the loan-to-deposit ratio and the share of non-deposit funding do signal financial fragility, although they add little predictive power relative to that of credit growth on the asset side of the balance sheet. However, higher capital buffers have social benefits in terms of macro-stability: recoveries from financial crisis recessions are much quicker with higher bank capital.

URL: http://econpapers.repec.org/paper/nbrnberwo/23287.htm

Distributed by NEP-HIS on: 2017-05-07

Review by Tony Gandy (London Institute of Banking and Finance)

In 1990-1991 I started a new job, having nearly completed my PhD (which, I fully admit, took longer than it should have). I joined The Banker, part of the Financial Times group, and proceeded to cover bank statistics, research and bank technology (the latter being a bit of a hobby). Thanks to the fine work of my predecessor, Dr. James Alexander, we had been through a statistical revolution and had revamped our Top 1000 listings of the world’s biggest banks, moving to a ranking based on capital rather than assets. This was the zeitgeist of the moment; what counted was capital, an indicator of capacity to lend and absorb losses. We then also ranked banks by the ratio of loss-absorbing capital to total assets to show which were the “strongest” banks. We were modeling this on the progress made by the Basel Committee on Banking Supervision in refocusing banking resilience onto this important ratio, so-called capital adequacy, and acknowledging the development and launch of the original Basel Accord.

All well and good: the role of capital was to absorb losses. On the face of it, whichever bank had the most capital, and whichever could show the best capital adequacy ratio, was clearly the most robust, prudent and advanced manager of risk, and the one able to take on more business.

As the years progressed, Basel 1.5, II, 2.5, III and, arguably, IV have each added to or detracted from the value of capital as a guide to robustness. However, the principle still seemed to stand that, if you had a very large proportion of capital, you could absorb greater losses, making the bank and the wider economic system more robust. Yes, there were weaknesses. Under the original Accord, the only risk being worried about was credit risk, and only in a very rudimentary way. This seemed odd given that one of the events which led to the Basel Accord was the failure of Bankhaus Herstatt and the subsequent market meltdown [1] (Goodhart 2011), but it was hard to see how that was, in isolation, a credit event. Nevertheless, through all the subsequent crises and reforms to the Basel Accords, the principle stood that a higher proportion of quality capital to assets held by a bank was a good thing.

Jordà, Richter, Schularick and Taylor challenge the assumption that greater capital adequacy can deflect a crisis, though they do find that higher initial capital ratios bring great benefit in the post-crisis environment. In this working paper, Jordà et al. create a dataset focusing on the liability side of bank balance sheets, covering a tight definition of Common Equity Tier 1 capital (paid-up capital, retained profit and disclosed reserves), deposits and non-core funding (wholesale funding). This is a powerful collection of numbers. They have collated this data for 14 advanced economies from 1870 through to 2013, and for three others for a slightly shorter period.
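To make the measure concrete, the core capital ratio tracked here is simple balance-sheet arithmetic: the narrow CET1-style aggregate over unadjusted total assets. The sketch below uses invented figures for illustration only; it is not drawn from the paper’s dataset.

```python
# Illustrative sketch of a narrow, CET1-style capital ratio:
# (paid-up capital + retained profit + disclosed reserves) / unadjusted assets.
# All figures are hypothetical.

def capital_ratio(paid_up_capital, retained_profit, disclosed_reserves, total_assets):
    """Core capital as a share of unadjusted (non-risk-weighted) total assets."""
    core_capital = paid_up_capital + retained_profit + disclosed_reserves
    return core_capital / total_assets

# A hypothetical 19th-century bank vs. a hypothetical modern one (same units):
early = capital_ratio(paid_up_capital=20, retained_profit=7,
                      disclosed_reserves=3, total_assets=100)
modern = capital_ratio(paid_up_capital=4, retained_profit=2,
                       disclosed_reserves=1, total_assets=100)
print(f"early-period ratio: {early:.0%}, modern ratio: {modern:.0%}")
```

The two invented banks echo the broad historical pattern the paper documents: roughly 30% ratios in the early period against single digits today.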

One note is that it would have been interesting to see a little more detail on the sources of the data used. Journal papers and academic contributions are acknowledged throughout, but other sources are covered by “journal papers, central bank publications, historical yearbooks from statistical offices, as well as archived annual reports from individual banks”. Bank statistics can be a complex area, and some sources have got their definitions wrong: one annual listing of bank capital had an erratum nearly as long as the original listing (not mine, I hasten to add, and maybe my memory, as a rival to that publication, somewhat exaggerates!). A little more detail would therefore be useful. Further discussion of the nature of disclosed reserves would also be interesting, as one of the key concerns of bank watchers in the past has been the tendency of banks not to disclose reserves or their purposes.

Jordà et al.’s findings are stark. Firstly, and least surprisingly, bank leverage has greatly increased. The average bank capital ratio in the dataset hovered at around 30% of unadjusted assets in the early period, falling to 10% in the post-war years and more recently hovering around 5-10%.


Source: Jordà et al. (2017)

Next, they consider the relevance of capital adequacy as a protection for banks and a predictor of a banking system’s robustness: does a high, prudent level of capital reduce the chances of a financial crisis? The authors note the traditional argument that higher levels of capital could indicate a robust banking system able to absorb unexpected losses, thus reducing the chance of a financial crisis, but also note that high capital levels could equally indicate a banking system taking greater risks and therefore needing greater amounts of capital to survive them. They find no statistical link between higher capital ratios and a lower risk of systemic financial crisis; indeed, they find limited evidence that the reverse may hold. It is worth noting a second time: increasing capital ratios do not indicate a lower risk of financial crisis.

The authors do note, however, that high levels and rapidly increasing loan-to-deposit ratios are a significant indicator of future financial distress. Clearly, funding a bubble is a bad idea, though it can be hard to resist.
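The liquidity indicators the paper does find informative are simple ratios. A minimal sketch, with invented figures, of how a credit boom shows up in the loan-to-deposit ratio and the non-core funding share:

```python
# Hedged sketch of two liquidity indicators discussed in the paper:
# the loan-to-deposit (LtD) ratio and the share of non-deposit
# (wholesale) funding. All numbers below are hypothetical.

def loan_to_deposit(loans, deposits):
    """Loans funded per unit of deposits; a rising ratio signals fragility."""
    return loans / deposits

def noncore_share(deposits, wholesale_funding):
    """Non-deposit funding as a share of total funding."""
    return wholesale_funding / (deposits + wholesale_funding)

# A stylized credit boom: loans grow much faster than deposits, so the
# gap is plugged with wholesale funding and both indicators deteriorate.
years = [(100, 100, 0), (130, 105, 25), (170, 110, 60)]  # (loans, deposits, wholesale)
for loans, deposits, wholesale in years:
    print(f"LtD={loan_to_deposit(loans, deposits):.2f}, "
          f"non-core share={noncore_share(deposits, wholesale):.0%}")
```

The point of the toy series is exactly the paper’s: the warning sign sits on the funding side of a lending boom, not in the capital ratio.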

However, capital can have a positive role. The paper finds that systems which start with higher levels of leverage (and consequently lower capital ratios) find recovery after a crisis harder, as banks struggle to maintain solvency and liquidate assets at a greater rate. Thus, while a high capital adequacy ratio may not be a protection against a systemic crisis, it does provide some insight into the performance of an economy after a crunch: banks with higher capital ratios may not face the same pressure to sell assets and so further deflate asset prices and economic activity.


Source: Jordà et al (2017)

I won’t pretend to understand fully the statistical analysis presented in this paper. Many, including those at the Basel Committee, have recognised the folly of tackling prudential control through a purely credit-risk-focused capital adequacy regime, and have introduced new liquidity, leverage and scenario-planning structures to deflect other routes to crisis. Nevertheless, Jordà et al. provide a vital insight into what is still the very core of the prudential control regime: the value, or not, of capital in providing protection to banks and banking systems. Its role may not be what we expected: its value lies in the post-crisis environment, not the pre-crisis environment where higher requirements could have been expected to head off problems. Instead, they find that credit booms and indicators of them, such as rapidly rising loan-to-deposit ratios, are better signals of a looming crisis, while capital is more relevant to shortening the impact of an unravelling bubble.

On a more practical note, this fascinating paper offers those who teach prudential regulation to bankers or students a wealth of data and challenges to consider: a welcome resource indeed.


[1] The other main response was the formation of the first netting services and then the Continuously Linked Settlement Bank, as a method of improving operations to remove the risk which became known as “Herstatt Risk”.

Goodhart, Charles (2011) The Basel Committee on Banking Supervision: a history of the early years, 1974–1997. Cambridge University Press, Cambridge, UK


Lessons from ‘Too Big to Fail’ in the 1980s

Can a bank run be stopped? Government guarantees and the run on Continental Illinois

Mark A Carlson (Bank for International Settlements) and Jonathan Rose (Board of Governors of the Federal Reserve)

Abstract: This paper analyzes the run on Continental Illinois in 1984. We find that the run slowed but did not stop following an extraordinary government intervention, which included the guarantee of all liabilities of the bank and a commitment to provide ongoing liquidity support. Continental’s outflows were driven by a broad set of US and foreign financial institutions. These were large, sophisticated creditors with holdings far in excess of the insurance limit. During the initial run, creditors with relatively liquid balance sheets nevertheless withdrew more than other creditors, likely reflecting low tolerance to hold illiquid assets. In addition, smaller and more distant creditors were more likely to withdraw. In the second and more drawn out phase of the run, institutions with relative large exposures to Continental were more likely to withdraw, reflecting a general unwillingness to have an outsized exposure to a troubled institution even in the absence of credit risk. Finally, we show that the concentration of holdings of Continental’s liabilities was a key dynamic in the run and was importantly linked to Continental’s systemic importance.

URL: http://EconPapers.repec.org/RePEc:bis:biswps:554

Distributed on NEP-HIS 2016-4-16

Review by Anthony Gandy (ifs University College)

I have to thank Bernardo Batiz-Lazo for spotting this paper and circulating it through NEP-HIS; my interest in it is less research-focused than teaching-focused. Having the honour of teaching bankers about banking, I am sometimes asked questions which I find difficult to answer. One such question has been: why are inter-bank flows seen as less volatile than consumer deposits? In this very accessible paper, Carlson and Rose answer this question by analysing the reality of a bank run, looking at the raw data from the treasury department of a bank which did indeed suffer one: Continental Illinois, which became the biggest banking failure in US history when it flopped in 1984.


For the business historian, the paper may lack a little character, as it rather skimps over the cause of Continental’s demise, though this has been covered by many others, including the Federal Deposit Insurance Corporation (1997). The paper briefly explains the problems Continental faced in building a large portfolio of assets in both the oil and gas sector and developing nations in Latin America. A key factor in the failure of Continental in 1984 was the 1982 failure of the small Penn Square Bank of Oklahoma. Cushing, Oklahoma is, quite literally, the hub (and one-time bottleneck) of the US oil and gas sector. The massive storage facility in that location became the settlement point for the pricing of West Texas Intermediate (WTI), also known as Texas light sweet, oil. Penn Square focused on the oil sector and sold assets to Continental, according to the FDIC (1997), to the tune of $1bn. Confidence in Continental was further eroded by the default of Mexico in 1982, undermining the perceived quality of its emerging-market assets.

Depositors queuing outside the insolvent Penn Square Bank (1982)


In 1984 the failure of Penn Square would translate into the failure of the seventh-largest bank in the US, Continental Illinois. This was a great illustration of contagion, but contagion which was contained by the central authorities and, earlier, a panel of supporting banks. Many popular articles on Continental do an excellent job of explaining why its assets deteriorated and then vaguely discuss the concept of contagion. The real value of the paper by Carlson and Rose comes from their analysis of the liability side of the balance sheet (sections 3 to 6 of the paper). Carlson and Rose take great care in detailing the make-up of those liabilities and the behaviour of different groups of liability holders. For instance, initially during the crisis 16 banks announced they were advancing $4.5bn in short-term credit. But as the crisis wore on, the regulators (the Federal Deposit Insurance Corporation, the Federal Reserve and the Office of the Comptroller of the Currency) were required to step in to provide a wide-ranging guarantee. This was essential, as the bank had few small depositors who could rely on the then $100,000 depositor guarantee scheme.


It would be very easy to pause and take in the implications of table 1 in the paper. It shows that on 31 March 1984 Continental had a most remarkable liability structure. With $10.0bn of domestic deposits, it funded most of its book through $18.5bn of foreign deposits, together with smaller amounts of other wholesale funding. However, the research conducted by Carlson and Rose shows that the intolerance of international lenders did become a factor, but it was only one of a number of effects. In section 6 of the paper they look at the impact of funding concentration. The largest ten depositors funded Continental to the tune of $3.4bn and the largest 25 to $6bn, or 16% of deposits. Half of these were foreign banks, and the rest were split between domestic banks, money market funds and foreign governments.
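The concentration figures above imply some simple back-of-the-envelope arithmetic. The sketch below uses only the rounded numbers quoted in the review ($bn), so the results are approximate rather than taken from the paper’s table:

```python
# Back-of-the-envelope funding concentration, using the rounded figures
# quoted in the review (in $bn). Results are approximate by construction.

def concentration_share(top_holdings, total_deposits):
    """Share of total deposits held by a group of large creditors."""
    return top_holdings / total_deposits

# Review: the top 25 creditors held $6bn, stated to be 16% of deposits.
total_deposits = 6.0 / 0.16  # implied total deposit base, ~$37.5bn
print(f"implied total deposits: ${total_deposits:.1f}bn")
print(f"top-10 share: {concentration_share(3.4, total_deposits):.0%}")
```

Even this crude calculation shows the run dynamic the authors stress: a handful of large, sophisticated creditors controlled a material slice of the bank’s funding.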

Initially, ‘run-off’ from the largest creditors was an important challenge. But this was related to liquidity preference: those institutions which needed to retain a highly liquid position were quick to move their deposits out of Continental. One can only speculate that these withdrawals were probably made by money market funds. Only later, in a more protracted run-off which took place even after the interventions, did the size of the exposure and distance play a disproportionate role. What is clear is the unwillingness of distant banks to retain exposure to a failing institution. After the initial banking-sector intervention and then the US central authorities’ intervention, foreign deposits rapidly declined.

It’s a detailed study, one which can be used to illustrate to students both issues of liquidity preference and the rationale for the structures of the new prudential liquidity ratios, especially the Net Stable Funding Ratio. It can also be used to illustrate the problems of concentration risk, though I would enliven the discussion with the more colourful experience of Penn Square Bank: a bank famed for drinking beer out of cowboy boots!


Federal Deposit Insurance Corporation, 1997. Chapter 7 `Continental Illinois and `Too Big to Fail’ In: History of the Eighties, Lessons for the Future, Volume 1. Available on line at: https://www.fdic.gov/bank/historical/history/vol1.html

More general reads on Continental and Penn Square:

Huber, R. L. (1993). How Continental Bank outsourced its “crown jewels”. Harvard Business Review, 71(1), 121-129.

Aharony, J., & Swary, I. (1996). Additional evidence on the information-based contagion effects of bank failures. Journal of Banking & Finance, 20(1), 57-69.

On Macroeconomics After the Financial Crisis

Short-Run Macro After the Crisis: The End of the “New” Neoclassical Synthesis?

By Oliver Landmann (Albert-Ludwigs-University Freiburg)

Abstract: The Financial Crisis of 2008, and the Great Recession in its wake, have shaken up macroeconomics. The paradigm of the “New” Neoclassical Synthesis, which seemed to provide a robust framework of analysis for short‐run macro not long ago, fails to capture key elements of the recent crisis. This paper reviews the current reappraisal of the paradigm in the light of the history of macroeconomic thought. Twice in the past 80 years, a major macroeconomic crisis led to the breakthrough of a new paradigm that was to capture the imagination of an entire generation of macroeconomists. This time is different. Whereas the pre‐crisis consensus in the profession is broken, a sweeping transition to a single new paradigm is not in sight. Instead, macroeconomics is in the process of loosening the methodological straightjacket of the “New” Neoclassical Synthesis, thereby opening a door for a return to its original purpose: the study of information and coordination in a market economy.

Persistent Link: http://EconPapers.repec.org/RePEc:fre:wpaper:27

Reviewed by Catherine Dorman (final-year BSc Business Economics student, Bangor University, Wales)


This paper was distributed by NEP-HIS on 2014-02-08. It addresses the impact that the recent financial crisis has had upon macroeconomic thought, specifically in terms of how the New Neoclassical Synthesis has held up to scrutiny following the most recent economic debacle. Landmann offers an overview of the history and progression of macroeconomic thought from the “Keynesian revolution” (p. 4) through New Neoclassical Synthesis economics, right up to contemporary economics and its response to current macroeconomic issues.

The purpose of Landmann’s paper is to explain how economics has evolved since the Keynesian school of thought emerged in the aftermath of the 1930s depression, and to show how the macroeconomic community has been left splintered as a result of the recent financial crisis, without a consensus in sight. It asks the questions: Why has this occurred? How did the New Neoclassical Synthesis fail to foresee or explain the worst economic downturn since the 1930s? Finally, it asks the all-important question: Is it necessarily a bad situation to be in? Or has having smashed the previous concept to pieces resulted in an environment in which macroeconomics can really explore and develop itself without the shackles of archaic and contextually inapplicable economic theory?

Prof. Dr. Oliver Landmann

Landmann introduces his paper by assessing the state of macroeconomic affairs, operating within a New Neoclassical Synthesis environment, in the run-up to the financial crisis of 2008. The ‘Great Moderation’ described a period of economic constancy spanning from the 1980s to 2008, characterized by a continually stable business cycle (Davis and Kahn, 2008). Famously, Ben Bernanke, who popularized the phrase ‘Great Moderation’, attributed this period of economic success to structural change, improved macroeconomic policies, and good luck (Bernanke, 2004). Ultimately, Landmann describes a period in which the Great Moderation had lulled the economic community into a false sense of stability, much like that described by Hyman Minsky (Minsky, 1992).

The next section of the paper is dedicated to creating a contextual understanding, and this is achieved through showing the evolution of economics thought from Keynes to the New Neoclassical Synthesis.

Consider Fig 1 for a brief overview of the changes of economic thought from the 1930s to 2008:

Fig. 1

As is evident across each of these theories, their explanatory power tends to be relatively finite. Adam Smith’s and John Maynard Keynes’s theories were deconstructed and meshed in order to explain the economy’s operations at a specific point in time, and this came to be known as the Neoclassical Synthesis, largely credited to the work of Paul Samuelson during the 1950s (Samuelson, 1955). It took the underlying Keynesian idea of underemployment, and the notion that monetary and fiscal policy can be employed to reduce it, while using classical equilibrium analysis to explain resource allocation and relative prices (p. 4). The approach was successfully adopted in developed countries as an effective treatment for the economy after the Second World War.

The stability and growth created through the adoption of this macroeconomic approach helped to develop confidence in the prescriptive capabilities of economic theory. However, as history has taught us, ceteris paribus does not hold in reality. The theory was largely nullified in the 1960s and 1970s because it had been unable to predict stagflation, and the Phillips Curve was completely undermined (Motyovszki, 2013).
Consider Fig 2 for a concise history of the economic theory covered in this paper.

Fig. 2
(Source: Short-Run Macro After the Crisis: The End of the “New” Neoclassical Synthesis? By Oliver Landmann.)

The result of this was a new hybrid economic theory: New Classical economics. From this theory came the Real Business Cycle model, which argued that cycles result from the reactions of optimizing agents to real disturbances, for example changes in technology.
In the 1990s the New Neoclassical Synthesis emerged, combining New Keynesian and New Classical theories, and became the basis of economic practice during the Great Moderation. It was felt amongst policy makers that the short-term interest rate was a sufficient instrument of economic management, and the business cycle was believed to have been overcome (Aubrey, 2013).

Landmann’s paper addresses how the economic crash of 2008 threw macroeconomics into turmoil. The New Neoclassical Synthesis had not fully appreciated the effects of the financial market within its model, and the result was that it was inadequate as a means of remedying problems in the economy (McCombie and Pike, 2012). Landmann makes a good point of acknowledging that although financial economics took great consideration of the behavioural antics of the banking sector, within the actual practised model of the New Neoclassical Synthesis these were fundamentally disconnected.

In light of this, the once unquestioned macroeconomic doctrine was suddenly under scrutiny. One of the greatest criticisms of the New Neoclassical Synthesis is its reliance upon “elegant” (p. 12) mathematical equations, which are often predictively insufficient due to the sheer number of assumptions that have to be made in order to create a working model. It does not fully account for factors such as irrationality and uncertainty (BBC News, 2014), and the result is that its predictions can be wildly inaccurate (Caballero, 2010). It can also mask coordination problems through assumptive behavioural models, such as the Robinson Crusoe model, which become overly stylized to the detriment of economic viability (Colander et al., 2009).

Consequentially, macroeconomics has begun to pay more focus to realistic behaviour, given that information is rarely perfect in actuality (Caballero, 2010; Sen, 1977).

Landmann concludes that out of the financial crisis there has been a flood of new macroeconomic theories, and that the New Neoclassical Synthesis still has pedagogic merit. He does, however, primarily blame the era of the Great Moderation for a period of complacency amongst economic academics. The simple acceptance of one concept of economics based purely on its merit during a stable business cycle, without inquisitive forethought into how it would respond when faced with an exogenous or endogenous shock, is Landmann’s greatest criticism.


This paper is incredibly relevant, and its themes and messages are certainly ones that economists need to be considering in the aftermath of such a fresh and colossal economic recession. There is perhaps an oversimplification of some of the timeline of economics: broadly defining all economists during the Great Moderation as one school of thought is unfair and inaccurate, but for the purpose of the paper it is perhaps forgivable.

Landmann makes little mention of the pattern by which economic thought often evolves. Gul, Chaudhry and Faridi describe economic thought as developing from “quick fixes” (Gul et al. 2014: 11), and this would help to explain why, during the Great Moderation, very little new economic thought was developed: the need wasn’t there. Through their histories of economic development, Gul et al. (2014) and Landmann suggest that macroeconomics is reactionary as opposed to precautionary, despite its attempts to be prophetic.

This echoes the “Lucas Critique”: the understanding that economic equations developed and implemented under one policy regime are unlikely to remain relevant or explanatorily applicable under another (Lucas, 1976).

Finally, the paper does little to explore the external factors that led to the period of Great Moderation. Globalisation had really taken hold during this time, with containerization in full flow (some 90% of all non-bulk cargo worldwide being moved by container ships; Ebeling, 2009), and advances in computation and communication technology (Bernanke, 2004) helped to stabilize inventory stocks, something acknowledged as a contributory factor in cyclical fluctuations (McConnell and Perez Quiros, 2000).

Ultimately, the paper makes the same conclusions that most macroeconomic papers do. There is no definitive explanation for everything that occurs within the economy, and certainly no blanket approach that will procure the most lucrative outcomes on every occasion. This paper goes a step further to explain why it can be damaging to rigidly subscribe to one theory of macroeconomics: it discourages continual change and forethought, which in turn can stunt the evolution of explanatory macroeconomic thought.


Aubrey, T., 2013. Profiting from Monetary Policy: Investing Through the Business Cycle. 1 ed. New York: Palgrave MacMillan.

BBC NEWS, 2014. Did Hyman Minsky find the secret behind financial crashes?. Available at: http://www.bbc.co.uk/news/magazine-26680993 [Accessed 07 April 2014].

Bernanke, B. S., 2004. Remarks by Governor Ben S. Bernanke At the Meeting of the Eastern Economics Association Available at: http://www.federalreserve.gov/Boarddocs/Speeches/2004/20040220/ [Accessed 07 April 2014]

Ebeling, C. E. 2009. Evolution of a Box. Invention and Technology 23(4): 8-9.

Caballero, R. J., 2010. Macroeconomics After the Crisis: Time to Deal with the Pretense-of-Knowledge Syndrome. Journal of Economic Perspectives 24(4): 85-102.

Colander, D. C. et al., 2009. The Financial Crisis and the Systemic Failure of Academic Economics. Kiel: Kiel Institute for the World Economy.

Davis, S. J., and Kahn, J. A., 2008. Interpreting the Great Moderation: Changes in the Volatility of Economic Activity at the Macro and Micro Levels. Cambridge, MA: National Bureau of Economic Research. Available at: http://EconPapers.repec.org/RePEc:nbr:nberwo:14048 [Accessed 07 April 2014]

Gul, E., Chaudhry, I. S. and Faridi, M. Z., 2014. The Classical-Keynesian Paradigm: Policy Debate in Contemporary Era. Munich: Munich Personal RePEc Archive. Available at: http://econpapers.repec.org/paper/pramprapa/53920.htm [Accessed 07 April 2014]

Lucas, R. E., 1976. Econometric Policy Evaluation: A Critique. Carnegie-Rochester Conference Series on Public Policy 1: 19-46.

McCombie, J. S. L., and Pike, M., 2012. The End of the Consensus in Macroeconomic Theory? A Methodological Inquiry. Unpublished. Cambridge Centre for Economic and Public Policy WP02-12, Department of Land Economy: University of Cambridge. Available at: http://www.landecon.cam.ac.uk/research/real-estate-and-urban-analysis/ccepp/copy_of_ccepp-publications/wp02-12.pdf [Accessed 07 April 2014]

McConnell, M. M., and Perez Quiros, G., 2000. Output Fluctuations in the United States: What Has Changed Since the Early 1980s?. Federal Reserve Bank of San Francisco. Available at: http://www.frbsf.org/economic-research/events/2000/march/structural-change-monetary-policy/output.pdf [Accessed 07 April 2014]

Minsky, H. P., 1992. The Financial Instability Hypothesis. New York: The Jerome Levy Economics Institute of Bard College.

Motyovszki, G., 2013. The Evolution of the Phillips Curve Concepts and Their Implications for Economic Policy. Budapest: Central European University.

Samuelson, P., 1955. Economics. 3rd ed. New York: McGraw-Hill.

Sen, A. K., 1977. Rational Fools: A Critique of the Behavioral Foundations of Economic Theory. Philosophy and Public Affairs. 6(4): 317-344.

Must we question corporate rule?

Financialization of the U.S. corporation: what has been lost, and how it can be regained

William Lazonick (University of Massachusetts-Lowell)

The employment problems that the United States now faces are largely structural. The structural problem is not, however, as many economists have argued, a labor-market mismatch between the skills that prospective employers want and the skills that potential workers have. Rather the employment problem is rooted in changes in the ways that U.S. corporations employ workers as a result of “rationalization”, “marketization”, and “globalization”. From the early 1980s rationalization, characterized by plant closings, eliminated the jobs of unionized blue-collar workers. From the early 1990s marketization, characterized by the end of a career with one company as an employment norm, placed the job security of middle-aged and older white-collar workers in jeopardy. From the early 2000s globalization, characterized by the movement of employment offshore, left all members of the U.S. labor force, even those with advanced educational credentials and substantial work experience, vulnerable to displacement. Nevertheless, the disappearance of these existing middle-class jobs does not explain why, in a world of technological change, U.S. business corporations have failed to use their substantial profits to invest in new rounds of innovation that can create enough new high value-added jobs to replace those that have been lost. I attribute that organizational failure to the financialization of the U.S. corporation. The most obvious manifestation of financialization is the phenomenon of the stock buyback, with which major U.S. corporations seek to manipulate the market prices of their own shares. For the decade 2001-2010 the companies in the S&P 500 Index expended about $3 trillion on stock repurchases. The prime motivation for stock buybacks is the stock-based pay of the corporate executives who make these allocation decisions. 
The justification for stock buybacks is the erroneous ideology, inherited from the conventional theory of the market economy, that, for superior economic performance, companies should be run to “maximize shareholder value”. In this essay I summarize the damage that this ideology is doing to the U.S. economy, and I lay out a policy agenda for restoring equitable and stable economic growth.

URL http://econpapers.repec.org/paper/pramprapa/42307.htm.

Review by Bernardo Bátiz-Lazo

As I have noted before (see Bátiz-Lazo and Reese, 2010), the term financialisation has been coined to encompass greater involvement of countries, business and people with financial markets and, in particular, increasing levels of debt (i.e. leverage). For instance, Manning (2000) has used the term to describe micro-phenomena such as the growth of personal leverage amongst US consumers.

In their path breaking study, Froud et al. (2006) use the term to describe how large, non-financial, multinational organisations come to rely on financial services rather than their core business for sustained profitability. They document a pattern of accumulation in which profit making occurs increasingly through financial channels rather than through trade and commodity production.

In contrast, in the preface to his edited book, Epstein (2005) notes the use of the term to denote the ascendancy of “shareholder value” as a mode of corporate governance, or the growing dominance of capital-market financial systems over bank-based financial systems.

An alternative view is offered by American writer and commentator Kevin Phillips, who proposed a sociological and political interpretation of financialisation as “a process whereby financial services, broadly construed, take over the dominant economic, cultural, and political role in a national economy” (Phillips 2006, 268). The rather narrow point I am making here, which I cannot elaborate for reasons of space, is that the essential nature of financialisation is highly contested and in need of attention.

Sidestepping conceptual issues (and indeed ignoring a large number of contributors to the area), in this paper William Lazonick adopts a view of financialization cum corporate governance and offers broad-based arguments (many based on his own previous research) to explore a relatively recent phenomenon: the demise of the middle class in the US in the late 20th century. In this sense, the abstract is spot on and the paper “does what it says on the can”. Yet purists would consider this too recent to be history. Indeed, the paper was distributed by nep-hme (heterodox microeconomics) on 2012-11-11 rather than NEP-HIS. This was out of neglect rather than design, but it goes to show that the keywords and abstract were initially not on my radar.

William Lazonick

Others may find it easy to poke holes in the broad-stroke arguments that support Lazonick’s thesis. Yet the article was honoured with the 2010 Henrietta Larson Article Award for the best paper in the Business History Review and was part of a conference organised by Lazonick at the Ford Foundation in New York City on December 6-7, 2012 (see the program at the Financial Institutions for Innovation and Development website).

Lazonick points to the erosion of middle-class jobs in a period of rapid technological change. This comes at a time when others question whether the rate of innovation can continue (see for instance The great innovation debate). Lazonick implicitly considers our age the most innovative ever. But his argument is that the way in which the latest wave of innovation was financed lies at the heart of the accompanying ever-growing economic inequality.

So for all its shortcomings, Lazonick offers a thought-provoking paper. One that challenges business historians to engage with discussions elsewhere, in particular corporate governance, political economy and the sociology of finance. It can, potentially, launch a more critical stream of literature in business history.


Bátiz-Lazo, B. and Reese, C. (2010) ‘Is the future of the ATM past?’ in Alexandros-Andreas Kyrtsis (ed.) Financial Markets and Organizational Technologies: System Architectures, Practices and Risks in the Era of Deregulation, Basingstoke: Palgrave Macmillan, pp. 137-65.

Epstein, G. A. (2005). Financialization and The World Economy. Cheltenham, Edward Elgar Publishing.

Froud, J., S. Johal, A. Leaver and K. Williams (2006). Financialization and Strategy: Narrative and Numbers. London, Routledge.

Manning, R. D. (2000). Credit Card Nation. New York, Basic Books.

Phillips, K. (2006). American Theocracy: The Peril and Politics of Radical Religion, Oil, and Borrowed Money in the 21st Century. London, Penguin.

“Nobody said it would be easy, and nobody was right.” On the (Im)possibilities of International Policy Coordination

International Policy Coordination: The Long View

Barry Eichengreen (eichengr@econ.berkeley.edu), University of California at Berkeley (United States)

URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:17665&r=his

Abstract: This paper places current efforts at international economic policy coordination in historical perspective. It argues that successful cooperation is most likely in four sets of circumstances. First, when it centers on technical issues. Second, when cooperation is institutionalized – when procedures and precedents create presumptions about the appropriate conduct of policy and reduce the transactions costs of reaching an agreement. Third, when it is concerned with preserving an existing set of policies and behaviors (when it is concerned with preserving a policy regime). Fourth, when it occurs in the context of broad comity among nations. These points are elaborated through a review of 150 years of historical experience and then used to assess the scope for cooperative responses to the current economic crisis.

Review by: Manuel Bautista González

“The question is whether those who talk the talk also walk the walk.” (Eichengreen 2011: 1)

Barry Eichengreen

Financial turmoil in the European Union has been increasing in recent months. According to The Economist, credit in the eurozone is tighter than it was in the worst months after the Lehman bankruptcy. “Forget about a rescue in the form of the G20, the G8, the G7, a new European Union Treasury, the issue of Eurobonds, a large scale debt mutualization scheme, or any other bedtime story. We are each on our own”, wrote Simon Johnson and Peter Boone earlier this week (Johnson and Boone 2012). Paul Krugman has drawn attention to the horrific consequences of the defeat of the European monetary experiment: “Failure of the euro would amount to a huge defeat for the broader European project, the attempt to bring peace, prosperity and democracy to a continent with a terrible history. It would also have much the same effect that the failure of austerity is having in Greece, discrediting the political mainstream and empowering extremists” (Krugman 2012).

It is in this context that this paper, written by Barry Eichengreen and distributed by NEP-HIS on 2012-01-03, offers an opportune “breathless historical review” (Eichengreen 2011: 29) of past attempts at international policy coordination in monetary, fiscal and financial matters from the last quarter of the nineteenth century to our days. In so doing, Eichengreen provides an interesting narrative centered on politics and institutions that neatly complements a reading of his classic work on the history of the international monetary system and global capital markets (Eichengreen 2008), as well as his more recent account of the US dollar as a dominant international currency (Eichengreen 2011b).


The European Debt Crisis in an American Fiscal Mirror

Fiscal federalism: US history for architects of Europe’s fiscal union

By C. Randall Henning (henning@piie.com) and Martin Kessler (mkessler@piie.com)

URL: http://d.repec.org/n?u=RePEc:bre:esslec:669&r=his

Abstract: European debates over reform of the fiscal governance of the euro area frequently reference fiscal federalism in the United States. The “fiscal compact” agreed by the European Council during 2011 provided for the introduction of, among other things, constitutional rules or framework laws known as “debt brakes” in the member states of the euro area. In light of the compact and proposals for deeper fiscal union, we review US fiscal federalism from Alexander Hamilton to the present. We note that within the US system the states are “sovereign”: The federal government does not mandate balanced budgets nor, since the 1840s, does it bail out states in fiscal trouble. States adopted balanced budget rules of varying strength during the nineteenth century and these rules limit debt accumulation. Before introducing debt brakes for euro area member states, however, Europeans should consider three important caveats. First, debt brakes are likely to be more durable and effective when “owned” locally rather than mandated centrally. Second, maintaining a capacity for countercyclical macroeconomic stabilization is essential. Balanced budget rules have been viable in the US states because the federal government has a broad set of fiscal powers, including countercyclical fiscal action. Finally, because debt brakes threaten to collide with bank rescues, the euro area should unify bank regulation and create a common fiscal pool for restructuring the banking system.

Review by: Manuel Bautista González

This paper was included in the NEP-HIS report issued on January 18th, 2012. In it, C. Randall Henning and Martin Kessler contribute to the debate on fiscal solutions to the current European debt crisis by offering insights drawn from the past and present of U.S. fiscal federalism.

Henning and Kessler periodize their historical overview into five moments: the financial reforms enacted after the adoption of the U.S. constitution, the state defaults of the 1840s, the financial troubles of state and local governments during the Reconstruction period, the fiscal instability of the Great Depression, and recent experiences of state and local distress from the 1970s to the current economic recession.

Later, in the analytical section of the paper, the authors study the probable adoption of balanced budget rules in the European Union with regard to their political enactment, their diversity across the Union, and their effectiveness in preventing fiscal disarray. Henning and Kessler assess the need for (federal) countercyclical policies to offset the procyclical fiscal discipline at the state and local levels. They also review the literature on the relationship between state and local debt and capital and banking markets, and offer preliminary conclusions relevant both to policymakers and to scholars of monetary unions and fiscal federalism.
