Channel: February 2014 – Orderstatistic

A Faustian Bargain?


Reflecting on a recent blog post by Simon Wren-Lewis, Paul Krugman argues that the modern insistence on microfoundations has impoverished macroeconomics by shutting down early understandings of financial markets “because (they) didn’t conform to a particular, highly restrictive definition of what was considered valid theory.”  In Krugman’s libretto, the role of Mephistopheles is played by “freshwater” macroeconomists. 

Krugman uses James Tobin as an example of a casualty of freshwater methodology, saying that as far as he can tell, Tobin “disappeared from graduate macro over the course of the 80s, because his models, while loosely grounded in some notion of rational behavior, weren’t explicitly and rigorously derived from microfoundations.” Tobin has not disappeared. In my course, for instance, Tobin shows up in the section on investment, which is centered around Tobin’s Q (my co-author Matthew Shapiro constantly emphasizes that it should be called Brainard-Tobin’s Q). My students (and any graduate student familiar with David Romer’s Advanced Macroeconomics) are well aware of Tobin’s role in this line of work. Tobin’s early ideas on Q-theory were sketches – plausibility arguments – which were subsequently developed in greater detail by Andy Abel, Fumio Hayashi, and Larry Summers (and also Michael Mussa).

Adopting microfoundations does come with a cost. As I mentioned in a previous post, being precise and exact prevents economists from engaging in glib, hand-waving theorizing. Many analysts (and commentators) see this as a serious limitation. But the methodology also has advantages. Being specific allows you to (1) make the theory clear by exposing its necessary components, (2) quantify effects by attaching plausible values to parameters, and (3) learn from the model. This last advantage is one of the biggest benefits of microfoundations. Setting out a list of assumptions and then following them where they lead may expose flaws in your own understanding, lead you to new ideas, and so on. Let me give you two examples.

Suppose someone says that if demand goes up, prices will fall. Here is their argument: if demand goes up, the price is bid up. The price increase reduces demand, and so ultimately the price falls. Every statement in this argument sounds reasonable, but the conclusion is incorrect. The way to find the mistake is with a model – in this case a supply and demand model. (The error is a confusion of movements along the demand curve versus shifts of the demand curve.)
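The point can be made concrete with a minimal numerical sketch. The linear curves and parameter values below are purely illustrative, not part of any particular empirical model:

```python
# Minimal linear supply-and-demand sketch (hypothetical parameter values).
# Demand: Qd = a - b*P.  Supply: Qs = c + d*P.
# The verbal argument above confuses a movement ALONG the demand curve
# (quantity falling as P rises) with a SHIFT of the curve (a rising).

def equilibrium_price(a, b, c, d):
    """Solve a - b*P = c + d*P for the market-clearing price."""
    return (a - c) / (b + d)

p_before = equilibrium_price(a=10.0, b=1.0, c=2.0, d=1.0)  # baseline
p_after = equilibrium_price(a=12.0, b=1.0, c=2.0, d=1.0)   # demand shifts out

# Higher demand raises, not lowers, the equilibrium price.
assert p_after > p_before
```

Solving the model, rather than chaining together individually plausible verbal steps, makes the direction of the price change unambiguous: the outward demand shift raises the equilibrium price.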

Here is another example. In the traditional IS/LM model, investment demand is assumed to depend negatively on the real interest rate. This assumption is important for the functioning of the model – it makes the IS curve slope down. The assumption itself is based on a slight confusion between the demand for capital and the demand for investment. What would happen if we added some microfoundations? Suppose we removed the ad hoc investment demand curve and instead required that the marginal product of capital equal the real interest rate (the user-cost relationship). In this case, there would be a positive relationship between output and the real interest rate (the IS curve would slope up! Higher output would require more employment, which would raise the marginal product of capital and thus the real interest rate). An increase in the money supply would cause the real rate (and the nominal rate) to rise. How should we interpret this? One interpretation is that we need to think a bit more about the investment demand component of the model. An alternative reaction would be to say, “I know that the original IS/LM model is right; I don’t need the microfoundations; they are just preventing me from getting the right answer.”
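Here is a minimal numerical sketch of that user-cost logic. The Cobb-Douglas production function and parameter values are my own illustrative assumptions, not Tobin’s exact specification:

```python
# User-cost sketch (illustrative assumptions): Cobb-Douglas output
# Y = K^alpha * N^(1-alpha) with the capital stock K fixed in the short run.
# The user-cost condition MPK = r then pins down the real rate as
# r = alpha * Y / K, so higher output goes with a HIGHER real rate:
# this "IS curve" slopes up.

ALPHA = 0.33  # capital share (assumed)
K = 10.0      # short-run capital stock (assumed)

def output(n):
    """Output given employment n, with K fixed."""
    return K**ALPHA * n**(1.0 - ALPHA)

def real_rate(n):
    """Marginal product of capital: dY/dK = alpha * Y / K."""
    return ALPHA * output(n) / K

# More employment -> more output -> higher MPK -> higher real rate.
n_low, n_high = 5.0, 8.0
assert output(n_high) > output(n_low)
assert real_rate(n_high) > real_rate(n_low)
```

With the capital stock fixed in the short run, any increase in output must come from higher employment, which mechanically raises the marginal product of capital and hence the real rate consistent with the user-cost condition.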

Who came up with this twisted version of the IS/LM model you might ask? Wait for it …

…yep … James Tobin. (1955, see Sargent’s 1987 Macroeconomic Theory text for a brief description of Tobin’s “Dynamic Aggregative Model.”)

Even today, when we analyze the New Keynesian model, it is often done without any investment (this is like having an IS/LM model without the “I”). Adding investment demand can sometimes produce odd behavior. In particular, you often get inverted Fisher effects, in which monetary expansions are associated with higher output but, strangely, higher real and nominal interest rates. (If you teach New Keynesian models to graduate students, I would encourage you to take a look at Tobin’s model.)

It seems that Paul Krugman wants to revise the history of the field a bit. Reading his post, it almost seems as if he wants us to believe that the Keynesians would have figured out financial market failures if they hadn’t been led astray by microfoundations and rational expectations. This is not true. The main thing New Keynesian research has been devoted to over the past 20 years is an exhaustive study of price rigidity. If anything was holding us back, it was the extraordinary devotion of our energy and attention to the study of nominal rigidities. We now know more about the details of price setting than any other field in economics. As financial markets were melting down in 2008, many of us regretted that allocation of our attention. We really needed a more refined empirical and theoretical understanding of how financial markets did or did not work.

