To prevent the next crisis we need a much better understanding of something economists call 'disequilibrium'. So argues Professor Katarina Juselius, head of Copenhagen's new Soros-backed economics centre, in this featured comment for the University Post
For several decades, theoretical models in macroeconomics and finance have been almost exclusively based on the assumptions of the Rational Expectations Hypothesis (REH), the representative agent hypothesis, and the Efficient Market Hypothesis (EMH).
Somewhat simplistically, one can say that the REH postulates that economic actors have ‘perfect’ information about a predetermined and complete economic structure. In such a context it is ‘rational’ to base expectations about future outcomes on this model. As the structure is assumed to be known, it is ‘rational’ for every economic actor to use the same model, and we end up with the representative agent.
The Efficient Market Hypothesis postulates that financial markets are fully efficient: left undisturbed, they will allocate capital efficiently and bring prices to their equilibrium values. Hence, there is no need to be concerned about the impact of the financial sector on the real economy, and there is no reason to interfere in financial markets by introducing regulations.
So why are these oversimplified assumptions so popular?
The simple answer is ‘mathematical tractability’! For a long period, leading mainstream economists, particularly in the USA, have argued that models have to exhibit the same mathematical precision that characterizes the natural sciences. The price of mathematical tractability is, however, that the models have to be based on many simplifying assumptions like the ones mentioned above.
But economics is a social science, which differs from the natural sciences in that economic outcomes affect, and are affected by, human behaviour, and such ‘reflexive’ behaviour is hard to describe with mathematical precision. To be sure, not everyone in the profession was convinced that these simplified mathematical models described our complex empirical reality in the best way.
Many methodologically oriented scholars vigorously debated the role of theory and empirical evidence in macroeconomics, but these debates seemed to have only a marginal (if any) effect on the theoretical and empirical practice of the economics profession. The popularity of these models in graduate programs and among editors of top economics journals suggested that economics as a science had finally converged to a state of unanimity, both regarding theory and how to apply it to data.
This view was challenged when the financial crisis struck with a suddenness that took most economists, central bankers, and politicians by complete surprise. Obviously, standard models must have lacked important features, as they failed to warn their users about the growing vulnerability of the macro-economy to the behaviour of the financial sector.
After the bankruptcy of Lehman Brothers, and the meltdown of the financial sector that transformed the financial crisis into a deep economic recession and a debt crisis, the chorus of critical voices became a tsunami that washed over newspapers, blogs, etc.
See, for example, the high-level contributions in the blog here. The debate on how to do economics was fresh and alive again. This positive aspect of the crisis was, however, overshadowed by the misery and suffering of the millions of ordinary people who lost almost everything.
Another consequence of the crisis was that the Institute for New Economic Thinking (INET) was established in 2009 in New York with the purpose of broadening economic thinking by facilitating an open debate. It was sponsored by the famous financier George Soros, who had criticized economic and financial theories for being too simplistic to describe the reality he faced in his daily work as a (highly successful) speculator in the financial markets.
The idea of INET is to promote new economic thinking through research funding, community building, and spreading the word about the need for change. The INET advisory board currently involves a large group of well-known economists, including six Nobel Prize winners (George A. Akerlof, James Heckman, James A. Mirrlees, Amartya Sen, A. Michael Spence, Joseph Stiglitz). In only two years the number of people who have joined the INET network has grown to close to 10,000.
INET supports new research through its grant programme, which mostly consists of grants to individual researchers, but in some cases the advisory board identifies critical issues requiring in-depth research. The latter can be funded as a Research Programme, typically running over several years and usually involving a team of researchers led by senior economists, often from the INET Advisory Board. Imperfect Knowledge Economics (IKE) is one of these Research Programmes.
The aim of INET’s new Centre for Imperfect Knowledge Economics (IKE) at the University of Copenhagen is to continue the development of an alternative approach to macroeconomics and finance that builds on imperfect (rather than perfect) knowledge, fundamental uncertainty about future outcomes (rather than insurable risk), and a two-way relationship between the financial and the real economy. Based on these assumptions it is possible to understand the tendency of financial markets to generate persistent deviations from long-run equilibrium values, the basic reason why unregulated financial markets are inherently unstable. Ignoring this very important aspect when building economic models is likely to lead to empirically misleading conclusions and a failure to understand and foresee crises.
Why Copenhagen?
Over the last few decades, the University of Copenhagen has become widely known for its alternative econometric methodology that makes use of the Cointegrated VAR (CVAR) model. This methodology has proven remarkably fruitful in uncovering regularities in how market outcomes unfold over time. In the Copenhagen Centre we are exploring how to take IKE-based models to the data and test their basic assumptions, as well as those of any competing theory model.
The empirical approach follows the general-to-specific methodology, where data are allowed to speak as freely as possible about the empirical relevance of (often competing) theory models. The methodology goes back to Trygve Haavelmo’s probability approach to economics (1944), for which he was awarded the Nobel Prize. In the forties, when his visionary ideas were worked out, the lack of computing technology seriously hampered their empirical realization. This is no longer the case, and the IKE centre in Copenhagen aims to develop this methodology further, exploring how to confront both the IKE models and their competitors with empirical evidence.
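To make this concrete, here is a minimal sketch of a CVAR analysis in Python using the statsmodels library. It is not the Centre’s actual code: the three series are simulated stand-ins for the kind of data we work with, and all lag and deterministic-term choices are illustrative. The Johansen procedure first lets the data determine the number of long-run equilibrium relations, and the model is then estimated subject to that rank:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Hypothetical quarterly data: log house prices, log consumer prices,
# and a long-term bond rate sharing one stochastic trend.
rng = np.random.default_rng(42)
n = 160
trend = np.cumsum(rng.normal(0, 0.01, n))
data = pd.DataFrame({
    "log_house_price": trend + np.cumsum(rng.normal(0, 0.02, n)),
    "log_cpi": trend + rng.normal(0, 0.005, n),
    "bond_rate": 0.05 - 0.3 * trend + rng.normal(0, 0.002, n),
})

# Step 1 (general-to-specific): let the data choose the cointegration
# rank via the Johansen trace test instead of imposing it a priori.
rank_test = select_coint_rank(data, det_order=0, k_ar_diff=2,
                              method="trace", signif=0.05)
print("selected cointegration rank:", rank_test.rank)

# Step 2: estimate the cointegrated VAR (in error-correction form)
# subject to the rank chosen by the data.
res = VECM(data, k_ar_diff=2, coint_rank=rank_test.rank,
           deterministic="ci").fit()
print("beta (long-run equilibrium relations):\n", res.beta)
print("alpha (speed-of-adjustment coefficients):\n", res.alpha)
```

In this framework, small coefficients in alpha are precisely the signature of persistent disequilibria: the variables are tied together in the long run but can drift away from their equilibrium relations for extended periods.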
Why were empirical macro models not able to see the imbalances that had been building up long before the crisis broke out, and why did they perform so badly exactly when reliable advice was most urgently needed?
The main reason these models failed is that they are based on the above-mentioned simplifying assumptions (REH and EMH), which imply that the economic system is always moving towards equilibrium (albeit sometimes sluggishly). In real life we see movements away from equilibrium for prolonged periods of time (5-6 years or even more). One could say that if the REH were a good description of how economic actors behave, data would not have exhibited the persistent movements away from long-run steady states that preceded the crisis in the first place.
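The difference between the two views can be illustrated with a small simulation (a stylized sketch, not an estimated model). Under fast equilibrium correction of the kind the standard assumptions imply, a deviation is halved within a period or two; under the near-unit-root behaviour we typically find in macro data, the same shocks produce swings away from equilibrium that last for years:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 240  # twenty years of monthly observations

def deviation_path(rho, shocks):
    """Deviation from equilibrium following x_t = rho * x_{t-1} + e_t."""
    x = np.zeros(len(shocks))
    for t in range(1, len(shocks)):
        x[t] = rho * x[t - 1] + shocks[t]
    return x

shocks = rng.normal(0, 1, T)
fast = deviation_path(0.50, shocks)  # equilibrium-correcting world
slow = deviation_path(0.98, shocks)  # near unit root: long swings

# Half-life: number of periods until half of a deviation has decayed.
for rho in (0.50, 0.98):
    print(f"rho = {rho}: half-life ≈ {np.log(0.5) / np.log(rho):.1f} months")

print("largest deviation, fast adjustment:", round(np.abs(fast).max(), 2))
print("largest deviation, near unit root:", round(np.abs(slow).max(), 2))
```

With rho = 0.98 the half-life of a deviation is roughly 34 months, so a run of same-signed shocks can keep the system away from its steady state for half a decade, which is exactly the kind of persistence observed before the crisis.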
Another reason why the standard models did so poorly in foreseeing the financial and economic crisis was that they ignored the importance of the two-way interrelationship between the financial sector and the real economy, in particular the one that can lead to self-reinforcing speculative behaviour. In real life, long swings in asset prices (stock prices, exchange rates, house prices, energy prices, etc.) typically influence perceived wealth and hence consumption behaviour. Many years before the bubble burst, our analyses showed that house prices were moving very persistently away from ordinary consumer prices and that this was (primarily) facilitated by the perceived increase in financial wealth.
Prior to the bursting of the bubble, house prices increased almost exponentially, signalling that they were moving very far away from their long-run sustainable levels.
This would have been the very last moment for the authorities to react to prevent the subsequent disaster. But the extraordinarily high level of house prices was possible only because of the enormous expansion of credit caused by exceptionally low levels of nominal interest rates, and this should have been reason for concern much earlier.
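One simple warning indicator in this spirit is to track the log ratio of house prices to consumer prices and flag when it stays far above its historical range for a sustained period. The sketch below uses made-up index series and arbitrary thresholds; it illustrates the idea of a signalling system, not our actual analysis:

```python
import numpy as np
import pandas as pd

# Hypothetical quarterly indices; in practice these would be an official
# house price index and the CPI.
rng = np.random.default_rng(7)
n = 120
cpi = 100 * np.exp(np.cumsum(rng.normal(0.005, 0.002, n)))
house = 100 * np.exp(np.cumsum(rng.normal(0.005, 0.002, n))
                     + np.linspace(0, 0.6, n))  # bubble-like drift

ratio = pd.Series(np.log(house) - np.log(cpi))

# Flag quarters where the ratio sits more than two standard deviations
# above its expanding-window (history-so-far) mean.
mean = ratio.expanding(min_periods=20).mean()
std = ratio.expanding(min_periods=20).std()
alarm = (ratio - mean) > 2 * std

# Require the alarm to stay on for eight consecutive quarters before
# calling it a sustained warning rather than noise.
sustained = alarm.rolling(8).sum() >= 8
print("first sustained warning at quarter:",
      sustained.idxmax() if sustained.any() else "none")
```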
Can we do better next time?
With hindsight it is easy to see that the present crisis (as well as previous ones) was the consequence of imbalances that were allowed to accumulate over long periods of time. Therefore, to prevent the next crisis, we need a much better understanding of the disequilibrium forces at the macro level. Unless we build up a good macroeconomic understanding and a reliable signalling system, it is likely that the next crisis will again come as a surprise.
One way of avoiding this is to learn from the data in a systematic and structured way, using available theories and hypotheses while remaining open to signals in the data suggesting that there are other mechanisms we do not yet understand. By embedding the theory model in a broader empirical framework, the empirical analysis should be able to point to possible pitfalls in macroeconomic reasoning and, at the same time, generate new hypotheses for how to modify too narrowly specified theoretical models.
The clues to the present crisis are hidden in the historical data, and we need to take them much more seriously than economists usually do.
Strong economic priors imposed on the data without testing may say more about the beliefs of the researcher than about the state of economic reality. The ultimate question is, therefore, whether we can afford to let economic policy be guided by beliefs that are not strongly backed up by empirical evidence.
We in the Copenhagen INET Centre hope that our work will help open up extant models to more realistic assumptions, which may improve our understanding of the inherent instability of the financial sector and how it influences the real economy.