Reassessment of the 'Dismal' Science Continues
In May, Knowledge at Wharton reported on the inevitable and painful assessment of the state of economics after its most esteemed practitioners failed to foresee the worst global downturn since the Great Depression. In that article, "Why Economists Failed to Predict the Financial Crisis," one reason cited was that, as computers grew more powerful, academics came to rely on mathematical models to figure out how various economic forces would interact. But many of those models simply dispense with certain variables that stand in the way of clear conclusions, said Wharton management professor Sidney G. Winter. Commonly missing are hard-to-measure factors such as human psychology and people's expectations about the future, he noted.
Now, Nobel economics laureate and New York Times columnist Paul Krugman points an accusing finger in the same direction. Economists, he writes on The Times website, "turned a blind eye to the limitations of human rationality that often lead to bubbles and busts; to the problems of institutions that run amok; to the imperfections of markets — especially financial markets — that can cause the economy’s operating system to undergo sudden, unpredictable crashes; and to the dangers created when regulators don’t believe in regulation."
How can more of the human factor be introduced into economic models? Krugman says he's not sure. "But what’s almost certain is that economists will have to learn to live with messiness. That is, they will have to acknowledge the importance of irrational and often unpredictable behavior, face up to the often idiosyncratic imperfections of markets and accept that an elegant economic 'theory of everything' is a long way off. In practical terms, this will translate into more cautious policy advice — and a reduced willingness to dismantle economic safeguards in the faith that markets will solve all problems."