Latest Posts from Economist's View 
 The Rise of the Renminbi as International Currency: Historical Precedents
 "A Boom in 2013?"
 The New Keynesian IS-LM and IS-MP Models
 links for 2011-10-11
 Fed Watch: Too Early to Sound the All Clear?
 The Nobel Prize in Economics: A Note on Chris Sims' Contributions
 The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel
The Rise of the Renminbi as International Currency: Historical Precedents Posted: 11 Oct 2011 12:51 AM PDT Jeff Frankel: The Rise of the Renminbi as International Currency: Historical Precedents, by Jeff Frankel: All of a sudden, the renminbi is being touted as the next big international currency. Just in the last year or two, the Chinese currency has begun to internationalize along a number of dimensions. An RMB bond market has grown rapidly in Hong Kong, and so has a market in RMB bank deposits. Some of China's international trade is now invoiced in the currency. Foreign central banks have been able to hold RMB since August 2010, with Malaysia going first. Some are now claiming that the renminbi could overtake the dollar for the number one slot in the international currency rankings within a decade (especially Subramanian 2011a, p.19; 2011b). ... The dollar is one of three national currencies to have attained international status during the 20th century. The other two were the yen and the mark, which became major international currencies after the breakup of the Bretton Woods system in 1971-73. (The euro, of course, did so after 1999.) In the early 1990s, both were spoken of as potential rivals of the dollar for the number one slot. It is easy to forget it now, because Japan's relative role has diminished since then and the mark has been superseded. ... The current RMB phenomenon differs in an interesting way from the historical circumstances of the rise of the three earlier currencies. The Chinese government is actively promoting the international use of its currency. Neither Germany nor Japan, nor even the US, did that, at least not at first. In all three cases, export interests, who stood to lose competitiveness if international demand for the currency were to rise, were much stronger than the financial sector, which might have supported internationalization. 
One would expect the same fears of a stronger currency and its effects on manufacturing exports to dominate the calculations in China. In the case of the mark and yen after 1973, internationalization came despite the reluctance of the German and Japanese governments. In the case of the United States after 1914, a tiny elite promoted internationalization of the dollar despite the indifference or hostility to such a project in the nation at large. These individuals, led by Benjamin Strong, the first president of the New York Fed, were the same ones who had conspired in 1910 to establish the Federal Reserve in the first place. It is not yet clear that China's new enthusiasm for internationalizing its currency includes a willingness to end financial repression in the domestic financial system, remove cross-border capital controls, and allow the RMB to appreciate, thus helping to shift the economy away from its export dependence. Perhaps a small elite will be able to accomplish these things, in the way that Strong did a century earlier. But so far the government is only promoting international use of the RMB offshore, walled off from the domestic financial system. That will not be enough to do it. [This perspective note summarizes the argument in "Historical Precedents for the Internationalization of the RMB"...] ... 
"A Boom in 2013?" Posted: 11 Oct 2011 12:42 AM PDT One more from Tim Duy:

The New Keynesian IS-LM and IS-MP Models Posted: 11 Oct 2011 12:33 AM PDT David Romer's name has come up several times in recent discussions of the IS-LM and IS-MP models. This is how Romer's new edition of his graduate-level macroeconomics book derives the IS-LM and IS-MP curves: Assume that firms produce output using labor as the only input, i.e. Y = F(L), F'>0, F''≤0, and that government, international trade, and capital are left out of the model for convenience (so that Y=C+I+G+NX becomes Y=C). Also assume that "There is a fixed number of infinitely lived households that obtain utility from consumption and from holding real money balances, and disutility from working. For simplicity, we ignore population growth and normalize the number of households to 1. The representative household's objective function is":

U = Σ_{t=0 to ∞} β^{t}[u(C_{t}) + Γ(M_{t}/P_{t}) - V(L_{t})], 0 < β < 1

There is diminishing marginal utility (or increasing marginal disutility), as usual. (Note that assuming money is in the utility function is a standard shortcut. See Walsh for a more extensive discussion of this.) Next, let the utility functions for consumption and real money balances take their usual constant relative risk aversion forms:

u(C_{t}) = C_{t}^{1-θ}/(1-θ), θ > 0, and Γ(M_{t}/P_{t}) = (M_{t}/P_{t})^{1-ν}/(1-ν), ν > 0

There are two assets in the model, money and bonds. Money pays no interest, while bonds receive an interest rate of i_{t}. Wealth evolves according to:

A_{t+1} = M_{t} + (1+i_{t})(A_{t} + W_{t}L_{t} - P_{t}C_{t} - M_{t})

where A_{t} is household wealth at the start of period t, W_{t}L_{t} is nominal income, P_{t}C_{t} is nominal consumption, and M_{t} is nominal money holdings. This equation says that wealth in period t+1 is equal to the amount of money held at the end of time t plus (1+i_{t}) times the bonds held from t to t+1 (the term in parentheses is bonds). Households take the paths of P, W, and i as given, and they choose the paths of C and M to maximize the present discounted value of utility subject to the flow budget constraint. The optimization condition (Euler equation) for the intertemporal consumption tradeoff is:

C_{t}^{-θ} = β(1+r_{t})C_{t+1}^{-θ}, where 1+r_{t} = (1+i_{t})P_{t}/P_{t+1} is the gross real interest rate

We now, in essence, have the New Keynesian IS curve. 
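The Euler condition C_{t}^{-θ} = β(1+r)C_{t+1}^{-θ} can be checked numerically. Here is a minimal sketch; the parameter values (β = 0.99, θ = 2, r = 0.02) are purely illustrative and not from the post:

```python
# Illustrative parameter values (hypothetical, chosen only for the example)
beta, theta, r = 0.99, 2.0, 0.02

# Euler equation: C_t^{-theta} = beta * (1 + r) * C_{t+1}^{-theta}
# implies gross consumption growth of (beta * (1 + r))^(1/theta)
growth = (beta * (1.0 + r)) ** (1.0 / theta)

# Plug the implied consumption path back into the Euler equation to verify
C_t = 1.0
C_next = growth * C_t
lhs = C_t ** (-theta)
rhs = beta * (1.0 + r) * C_next ** (-theta)
euler_gap = abs(lhs - rhs)
```

With these values β(1+r) > 1, so optimal consumption grows over time; the gap between the two sides of the Euler equation is zero up to floating-point error.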
To see this, take logs of both sides:

ln C_{t} = ln C_{t+1} - (1/θ)ln(1+r_{t}) - (1/θ)ln β

And using the fact that Y=C, approximating ln(1+r) as r (which holds fairly well when r is small), and dropping the constant for convenience gives:

ln Y_{t} = ln Y_{t+1} - (1/θ)r_{t}

This is the New Keynesian IS curve. It's just like the ordinary IS curve, except for the ln Y_{t+1} term on the right-hand side (in models with stochastic shocks, this becomes E_{t}ln Y_{t+1}, the expected value of ln Y_{t+1} given the information available at time t; often the information set contains only lagged values of variables in the model). Thus, the big difference between the old IS curve and the microfounded New Keynesian IS curve is the E_{t}ln Y_{t+1} term on the right-hand side. (Thus, it's relatively easy to amend the traditional model of the IS curve to incorporate the expectation term.) It can also be shown (e.g. through a variations argument) that the first-order condition for money holding is:

(M_{t}/P_{t})^{-ν} = C_{t}^{-θ} i_{t}/(1+i_{t})

This implies that:

M_{t}/P_{t} = C_{t}^{θ/ν}[(1+i_{t})/i_{t}]^{1/ν}

Money demand is increasing in output and decreasing in the nominal interest rate. The ideas captured by the New Keynesian IS curve are appealing and useful... The LM curve, in contrast, is quite problematic in practical applications. One difficulty is that the model becomes much more complicated once we relax Section 6.1's assumption that prices are permanently fixed... A second difficulty is that modern central banks do not focus on the money supply. The first problem means that the LM curve shifts whenever P changes, so if there is inflation it will be in constant motion, making it hard to use as an analytical tool. That can be overcome, but the second objection is harder to dismiss. However, it is easy to address. Simply assume that the central bank follows a rule for the interest rate such as:

r_{t} = r(ln Y_{t}, π_{t}), with r increasing in both arguments

If the central bank adjusts M to ensure this holds, then the money supply is essentially endogenous (and the interest rate is set through the rule). This is an upward-sloping curve in r-lnY space, and it is called the MP curve (for monetary policy). 
It replaces the LM curve in the IS-LM diagram, giving us the IS-MP model. It would still be possible to do the analysis with the IS-LM diagram (just put a horizontal line at the fixed interest rate and find the money supply that makes this an equilibrium), but as noted above, in the presence of inflation the LM curve shifts out continuously, making the model hard to use. Thus, in the presence of inflation and an interest rate rule, the IS-MP formulation is much simpler to use. But for other questions, e.g. quantitative easing at the lower bound or pedagogically examining a money rule, the IS-LM model is often more intuitive. The main point is that if you start from (very simple) microfoundations, the resulting model looks a lot like the old IS-LM model. The model still needs to handle price changes, so it's necessary to add a model of supply to the model of demand provided by the IS-MP or IS-LM diagrams, and the expectation term on the right-hand side of the IS curve is an important difference from the older modeling scheme, but the two models have a lot in common. 
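To make the IS-MP mechanics concrete, here is a minimal numerical sketch. It assumes a linearized MP rule r = rbar + φ·lnY (holding inflation fixed) together with the log-linear IS curve from the derivation above; all parameter values are hypothetical, chosen only for illustration:

```python
# All parameter values below are hypothetical, for illustration only
theta = 2.0        # coefficient of relative risk aversion from the IS derivation
phi = 0.5          # assumed slope of a linearized MP rule: r = rbar + phi * lnY
rbar = 0.01        # assumed intercept of the policy rule
lnY_future = 0.0   # expected future log output, normalized to zero

# IS curve: lnY = lnY_future - (1/theta) * r
# MP curve: r   = rbar + phi * lnY
# Substituting MP into IS and solving for the equilibrium lnY:
lnY = (lnY_future - rbar / theta) / (1.0 + phi / theta)
r = rbar + phi * lnY

# Check that the solution lies back on the IS curve
is_gap = abs(lnY - (lnY_future - r / theta))
```

The intersection of the downward-sloping IS curve and the upward-sloping MP curve pins down output and the real interest rate; with a positive policy intercept, equilibrium output sits below its expected future level.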
links for 2011-10-11 Posted: 11 Oct 2011 12:06 AM PDT

Fed Watch: Too Early to Sound the All Clear? Posted: 10 Oct 2011 03:24 PM PDT Tim Duy:

The Nobel Prize in Economics: A Note on Chris Sims' Contributions Posted: 10 Oct 2011 11:43 AM PDT Let me talk a bit about Sims' contributions to economics, and if I have time I'll try to cover Sargent later. Prior to Sims' work, in particular his paper "Macroeconomics and Reality," the state of the art in macroeconometrics was to use large-scale structural models. These models often involved scores or even hundreds of equations, essentially a supply-and-demand equation for every important market, identities to make sure things add up correctly, etc. But in order to estimate the parameters of these models, the structural parameters as they are known, you had to overcome the identification problem. Without getting into the details, the identification problem essentially asks whether it is possible to estimate the structural parameters at all. The answer, in general, is no. For example, if every variable in the model appears in every equation, then it won't be possible to estimate the structural model. Let me give an example to illustrate. Suppose that X and Y are the endogenous variables, e.g. price and quantity for some market, and that the structural model is:

Y_{t} = a_{0} + a_{1}X_{t} + a_{2}Y_{t-1} + a_{3}X_{t-1} + u_{t}
X_{t} = b_{0} + b_{1}Y_{t} + b_{2}Y_{t-1} + b_{3}X_{t-1} + v_{t}

The a's and the b's are the parameters that economists are generally interested in, but in this form it is not possible to estimate them. There must be what are known as exclusion restrictions before estimation is possible. In this case, for example, identification can be achieved by making either a_{1} or b_{1} equal to zero (more on this below), i.e. excluding one of the variables from one of the equations. If there is a reason for this, then excluding the variable is okay, but a variable can't be left out simply to achieve identification; there must be good reason for excluding X_{t} from the first equation, or Y_{t} from the second (or both). 
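The identification failure can be demonstrated directly: two entirely different sets of structural coefficients can imply exactly the same reduced form, and hence the same fit to any data set. The sketch below uses made-up coefficient values and the standard mapping from the two-equation structural model to its reduced-form coefficients:

```python
import numpy as np

def reduced_form(a, b):
    """Reduced-form coefficients implied by the two-equation structural model."""
    D = 1.0 - a[1] * b[1]
    c = np.array([a[0] + a[1]*b[0], a[2] + a[1]*b[2], a[3] + a[1]*b[3]]) / D
    d = np.array([b[0] + b[1]*a[0], b[2] + b[1]*a[2], b[3] + b[1]*a[3]]) / D
    return c, d

# A "true" structural model (all coefficient values are made up)
a_true = np.array([1.0, 0.5, 0.3, 0.2])   # a0, a1, a2, a3
b_true = np.array([0.5, 0.4, 0.1, 0.6])   # b0, b1, b2, b3
c, d = reduced_form(a_true, b_true)

# Pick *different* contemporaneous coefficients and solve for the remaining
# structural parameters so that the reduced form comes out identical
a1_alt, b1_alt = 0.8, 0.1
D_alt = 1.0 - a1_alt * b1_alt
a_alt = np.array([0.0, a1_alt, 0.0, 0.0])
b_alt = np.array([0.0, b1_alt, 0.0, 0.0])
M = np.array([[1.0, a1_alt], [b1_alt, 1.0]])
for k, idx in enumerate([0, 2, 3]):
    a_alt[idx], b_alt[idx] = np.linalg.solve(M, np.array([c[k], d[k]]) * D_alt)

c_alt, d_alt = reduced_form(a_alt, b_alt)

# Two distinct structural models, one and the same reduced form
same_reduced_form = np.allclose(c, c_alt) and np.allclose(d, d_alt)
```

Since the data can never distinguish the two structural models, no amount of data solves the problem; only a restriction such as a_{1}=0 or b_{1}=0 can.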
Omitting a variable that ought to be in a model in order to satisfy the identification restrictions results in a misspecified model and biased estimates. In large models, these exclusions are numerous, and many researchers simply assumed whatever exclusion restrictions were needed to achieve identification, and then went on to estimate the model. In "Macroeconomics and Reality," Sims pointed out the problem with this approach. The assumptions that researchers were imposing to achieve identification had no theoretical basis. They were ad hoc and difficult to defend (especially when expectations are in the model; expectations tend to depend upon all the variables in a model, making it difficult to exclude anything from an equation involving expectations). What Sims suggested as an alternative was to drop structural modeling altogether, and to use generalized reduced forms as the basis for estimation. There would be no hope of recovering structural parameters in most cases, but there was still much that could be learned by using reduced forms instead of structural models. For example, the reduced form for the model above is (you can find the reduced form by expressing the endogenous variables Y_{t} and X_{t} in terms of exogenous and predetermined variables):

Y_{t} = [1/(1-a_{1}b_{1})]{(a_{0} + a_{1}b_{0}) + (a_{2} + a_{1}b_{2})Y_{t-1} + (a_{3} + a_{1}b_{3})X_{t-1} + a_{1}v_{t} + u_{t}}
X_{t} = [1/(1-a_{1}b_{1})]{(b_{0} + b_{1}a_{0}) + (b_{2} + b_{1}a_{2})Y_{t-1} + (b_{3} + b_{1}a_{3})X_{t-1} + v_{t} + b_{1}u_{t}}

To estimate this, write it as:

Y_{t} = c_{0} + c_{1}Y_{t-1} + c_{2}X_{t-1} + e_{1t}, where e_{1t} is proportional to a_{1}v_{t} + u_{t}
X_{t} = d_{0} + d_{1}Y_{t-1} + d_{2}X_{t-1} + e_{2t}, where e_{2t} is proportional to v_{t} + b_{1}u_{t}

This is a VAR model. At first, Sims thought we could draw important conclusions from this model, e.g. suppose that X is money and Y is output. 
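A quick simulation illustrates the point that the reduced form, unlike the structural model, is estimable by ordinary least squares: generate data from the structural model with known (made-up) coefficients, then regress each variable on a constant and the lagged values of both variables. The OLS estimates recover the reduced-form coefficients c and d, not the structural a's and b's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Structural coefficients (made-up values, chosen so the system is stationary)
a0, a1, a2, a3 = 1.0, 0.5, 0.3, 0.1
b0, b1, b2, b3 = 0.5, 0.2, 0.1, 0.2
D = 1.0 - a1 * b1

# Implied reduced-form (VAR) coefficients
c = np.array([a0 + a1*b0, a2 + a1*b2, a3 + a1*b3]) / D
d = np.array([b0 + b1*a0, b2 + b1*a2, b3 + b1*a3]) / D

# Simulate the reduced form; note each error is a linear combination of u and v
T = 50_000
Y, X = np.zeros(T), np.zeros(T)
u, v = rng.standard_normal(T), rng.standard_normal(T)
for t in range(1, T):
    Y[t] = c[0] + c[1]*Y[t-1] + c[2]*X[t-1] + (a1*v[t] + u[t]) / D
    X[t] = d[0] + d[1]*Y[t-1] + d[2]*X[t-1] + (v[t] + b1*u[t]) / D

# OLS of each variable on a constant and both lags recovers c and d
Z = np.column_stack([np.ones(T-1), Y[:-1], X[:-1]])
c_hat, *_ = np.linalg.lstsq(Z, Y[1:], rcond=None)
d_hat, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
```

The regressors here are all predetermined, so OLS is consistent equation by equation; that is exactly why the reduced form can be estimated even when the structural parameters cannot.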
Then this model could tell us how a shock to money would change output over time (these are called impulse response functions: you hit the system with a shock, and then use the estimated model to trace out the path of the endogenous variables over time). We could use this model to answer important questions such as whether money causes output (Sims' technique for testing causality was essentially the same as Granger causality, but Sims made an important contribution in extending the causality techniques to systems with three or more variables when he introduced impulse response functions and variance decompositions). But, as Cooley and LeRoy pointed out in an important paper, these models don't avoid structural assumptions after all, at least not if you want to say anything about how variables in the model respond to structural shocks. To see this, note first that the shock we are interested in is the shock to money, v_{t}. Now look at the errors in the two reduced-form equations. We can estimate each equation by OLS, and when we do, the error terms will be estimates of a_{1}v_{t} + u_{t} for the output equation and v_{t} + b_{1}u_{t} for the money equation (up to a common scale factor). Thus, we get estimates of linear combinations of the v_{t} and u_{t} shocks we are interested in, but we don't get the shocks in isolation like we need. And there's no way to isolate the shocks, i.e. to determine their individual values. That's a problem because we need to find the money shock alone if we want to estimate its effect on output. How can we do this? One way is to make either a_{1} or b_{1} equal to zero. Let's set b_{1}=0 because that's the easiest to discuss. In this case, when we estimate the money equation by OLS (the equation with the d parameters), the error will now be an estimate of v_{t}, which is just what we need. However, notice that this is nothing more than an exclusion restriction: by assuming that b_{1}=0, we are excluding Y_{t} from the money equation (see the structural model). 
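This can be seen in a simulation. Below, data are generated from the structural model with the restriction b_{1}=0 actually true (all coefficient values are made up for illustration); the OLS residual from the money (X) equation then recovers the structural money shock v_{t} almost exactly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Structural model with the identifying restriction b1 = 0 imposed as true
# (all coefficient values are made up for illustration)
a0, a1, a2, a3 = 1.0, 0.5, 0.3, 0.1
b0, b2, b3 = 0.5, 0.1, 0.2   # b1 = 0: Y_t is excluded from the X equation

T = 20_000
Y, X = np.zeros(T), np.zeros(T)
u, v = rng.standard_normal(T), rng.standard_normal(T)
for t in range(1, T):
    # with b1 = 0 the X (money) equation is already in reduced form
    X[t] = b0 + b2*Y[t-1] + b3*X[t-1] + v[t]
    Y[t] = a0 + a1*X[t] + a2*Y[t-1] + a3*X[t-1] + u[t]

# OLS on the X equation: its residuals now estimate the money shock v_t itself
Z = np.column_stack([np.ones(T-1), Y[:-1], X[:-1]])
coefs, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
resid = X[1:] - Z @ coefs
shock_corr = np.corrcoef(resid, v[1:])[0, 1]
```

The residual series is almost perfectly correlated with the true money shock, which is what makes impulse responses to a money shock computable once the restriction is imposed.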
Thus, we have come full circle. This is where Sims' structural VARs come into play. The reduced form above is known as a VAR model (in its estimable form, i.e. the second set of reduced-form equations above involving the c and d parameters). It turns out that we can often defend particular restrictions theoretically. For example, if money can only respond to output with a lag, perhaps due to information problems, then there is no reason to have the contemporaneous value of output on the right-hand side of the structural equation for money, i.e. this implies that b_{1}=0. Thus, while this still amounts to an exclusion restriction, the restriction is no longer ad hoc (simply imposed as necessary to achieve identification, as back in the old, large-scale structural model days); it is grounded in theory. And the fact that we insist these restrictions be grounded in theory marks an important difference from the work that came before Sims. Even better, this technique also allows the model to be identified without using exclusion restrictions at all. For example, if we think that some variables in the model have short-run but not long-run effects, e.g. that money can affect output in the short run but only produces price effects in the long run (a standard assumption in most macro models), then the zero impact in the long run can be imposed as an identifying restriction. Exclusion restrictions won't be needed (these are the Blanchard-Quah and Shapiro-Watson techniques). This just scratches the surface of Sims' work (I wish I had time to do more), but *hopefully* this provides a window into one part of Sims' contributions. 
The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel Posted: 10 Oct 2011 10:26 AM PDT I'm late getting to this, but congratulations to this year's recipients of the Nobel Prize in Economics, Chris Sims and Tom Sargent. Here are more details: Empirical Macroeconomics: One of the main tasks for macroeconomists is to explain how macroeconomic aggregates (such as GDP, investment, unemployment, and inflation) behave over time. How are these variables affected by economic policy and by changes in the economic environment? A primary aspect in this analysis is the role of the central bank and its ability to influence the economy. How effective can monetary policy be in stabilizing unwanted fluctuations in macroeconomic aggregates? How effective has it been historically? Similar questions can be raised about fiscal policy. Thomas J. Sargent and Christopher A. Sims have developed empirical methods that can answer these kinds of questions. This year's prize recognizes these methods and their successful application to the interplay between monetary and fiscal policy and economic activity. In any empirical economic analysis based on observational data, it is difficult to disentangle cause and effect. This becomes especially cumbersome in macroeconomic policy analysis due to an important stumbling block: the key role of expectations. Economic decision-makers form expectations about policy, thereby linking economic activity to future policy. Was an observed change in policy an independent event? Were the subsequent changes in economic activity a causal reaction to this policy change? Or did causality run in the opposite direction, such that expectations of changes in economic activity triggered the observed change in policy? Alternative interpretations of the interplay between expectations and economic activity might lead to ...