July 9, 2009

Economist's View - 5 new articles

"The New Kaldor Facts"

What does growth theory need to explain? Has there been progress?:

The New Kaldor Facts: Ideas, Institutions, Population, and Human Capital, by Charles I. Jones and Paul M. Romer, NBER WP 15094, June 2009 [open link]: 1. Introduction ...[I]t is easy to lose faith in scientific progress. ... In any assessment of progress, as in any analysis of macroeconomic variables, a long-run perspective helps us look past the short-run fluctuations and see the underlying trend. In 1961, Nicholas Kaldor stated six now famous "stylized" facts. He used them to summarize what economists had learned from their analysis of 20th-century growth and also to frame the research agenda going forward (Kaldor, 1961):

    1. Labor productivity has grown at a sustained rate.
    2. Capital per worker has also grown at a sustained rate.
    3. The real interest rate or return on capital has been stable.
    4. The ratio of capital to output has also been stable.
    5. Capital and labor have captured stable shares of national income.
    6. Among the fast growing countries of the world, there is an appreciable variation in the rate of growth "of the order of 2–5 percent."

Redoing this exercise nearly 50 years later shows just how much progress we have made. Kaldor's first five facts have moved from research papers to textbooks. There is no longer any interesting debate about the features that a model must contain to explain them. These features are embodied in one of the great successes of growth theory in the 1950s and 1960s, the neoclassical growth model. Today, researchers are grappling with Kaldor's sixth fact and have moved on to several others that we list below.
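
[As a reminder of what "explaining" the first five facts amounts to, here is a minimal sketch of the textbook Solow-Swan setup, written in my own notation rather than the paper's, with a Cobb-Douglas technology chosen purely for concreteness:

    Y_t = K_t^{\alpha} (A_t L_t)^{1-\alpha}, \qquad \dot{K}_t = s Y_t - \delta K_t, \qquad \dot{A}_t / A_t = g, \qquad \dot{L}_t / L_t = n.

On the balanced growth path, output per worker and capital per worker both grow at the exogenous rate g (facts 1 and 2), the capital-output ratio and the real return on capital r = \alpha Y/K - \delta are constant (facts 3 and 4), and the factor shares are \alpha and 1 - \alpha (fact 5). Fact 6, the variation in growth rates across countries, is exactly what this framework leaves unexplained.]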

One might have imagined that the first round of growth theory clarified the deep foundational issues and that subsequent rounds filled in the details. This is not what we observe. The striking feature of the new stylized facts driving the research agenda today is how much more ambitious they are. Economists now expect that economic theory should inform our thinking about issues that we once ruled out of bounds as important but too difficult to capture in a formal model.

Here is a summary of our new list of stylized facts, to be discussed in more detail below:

    1. Increases in the extent of the market. Increased flows of goods, ideas, finance, and people — via globalization as well as urbanization — have increased the extent of the market for all workers and consumers.
    2. Accelerating growth. For thousands of years, growth in both population and per capita GDP has accelerated, rising from virtually zero to the relatively rapid rates observed in the last century.
    3. Variation in modern growth rates. The variation in the rate of growth of per capita GDP increases with the distance from the technology frontier.
    4. Large income and TFP differences. Differences in measured inputs explain less than half of the enormous cross-country differences in per capita GDP.
    5. Increases in human capital per worker. Human capital per worker is rising dramatically throughout the world.
    6. Long-run stability of relative wages. The rising quantity of human capital relative to unskilled labor has not been matched by a sustained decline in its relative price.

In assessing the change since Kaldor developed his list, it is important to recognize that Kaldor himself was raising expectations relative to the initial neoclassical model of growth as outlined by Solow (1956) and Swan (1956). When the neoclassical model was being developed, a narrow focus on physical capital alone was no doubt a wise choice. The smooth substitution of capital and labor in production expressed by an aggregate production function, the notion that a single capital aggregate might be useful, and the central role of accumulation itself were all relatively novel concepts that needed to be explained and assimilated. Moreover, even these small first steps toward formal models of growth provoked substantial opposition.

The very narrow focus of the neoclassical growth model sets the baseline against which progress in growth theory can be judged. Writing in 1961, Kaldor was already intent on making technological progress an endogenous part of a more complete model of growth. ...

Growth theorists working today have not only completed this extension but also brought into their models the other endogenous state variables excluded from consideration by the initial neoclassical setup. Ideas, institutions, population, and human capital are now at the center of growth theory. Physical capital has been pushed to the periphery.

Kaldor had a model in mind when he introduced his facts. So do we. ... In the near term, we believe that this model should capture the endogenous accumulation of and interaction between three of our four state variables: ideas, population, and human capital. For now, we think that progress is likely to be most rapid if we follow the example of the neoclassical model and treat institutions the way the neoclassical model treated technology, as an important force that enters the formalism but which evolves according to a dynamic that is not explicitly modeled. Out on the horizon, we can expect that current research on the dynamics of institutions and politics will ultimately lead to a simple formal representation of endogenous institutional dynamics as well.

...

4. Conclusion ...[T]he facts we highlight ... reveal important complementarities among the key endogenous variables [ideas, population, human capital, and institutions]... Such complementarities exemplify the value of the applied general equilibrium approach. They are the fundamental reason why we seek a unified framework for understanding growth. Going forward, the research agenda will surely include putting ingredients like those we have outlined in this paper together into a single formal model. Further out on the horizon, one may hope for a successful conclusion to the ongoing hunt for a simple model of institutional evolution. Combining that with the unified approach to growth outlined here would surely constitute the economics equivalent of a grand unified theory—a worthy goal by which we may be judged when future generations look back fifty years from now and quaintly revisit our "ambitious" list of stylized facts.


Was it Risk Concentration or Leverage?

Ricardo Caballero hasn't given up on his argument that it was the excessive concentration of risk, not leverage, that caused problems in financial markets (and it's an argument I'm sympathetic to):

Economic Witch Hunting, by Ricardo Caballero, Commentary, Economists Forum: Perhaps one of the economic phenomena most akin to witch-hunting is the diagnostic and policy response that develops during the recovery phase of a financial crisis. Understandably, pressured politicians and policymakers rush to find culprits... All too often they find a ready supply of these in preconceptions and superficial analyses of correlations. This time around the scapegoats are global imbalances and leverage.

Global imbalances are the victim of preconceptions: Many economists and commentators argued before the crisis that large global imbalances would lead to the demise of the U.S. economy... The crisis indeed came, but rather than destabilizing the US economy, capital flows helped to stabilise it, as flight-to-quality capital sought rather than ran away from US assets. ...

The fact that the actual mechanism behind the crisis had nothing to do with that which was used to explain the forecast of doom has long been forgotten, false idols have been erected,... global imbalances have been indicted for witchcraft, and ever more exotic rebalancing and currency proposals make it to the front pages of newspapers around the world.

Leverage is the victim of superficial analyses of correlations: In my view one of the main factors behind the severity of the financial crisis was the excessive concentration of aggregate risk in highly-leveraged financial institutions. Note that the emphasis is on the concentration of aggregate risk rather than on the much-hyped leverage. The problem in the current crisis was not leverage per se, but the fact that banks had held on to AAA tranches of structured asset-backed securities which were more exposed to aggregate surprise shocks than their rating would, when misinterpreted, suggest.

Thus, when systemic confusion emerged, these complex financial instruments quickly soured, compromised the balance sheet of their leveraged holders, and triggered asset fire sales which ravaged balance sheets across financial institutions. The result was a vicious feedback loop between assets exposed to aggregate conditions and leveraged balance sheets.

The distinction emphasized in the previous paragraph may seem subtle, but it turns out to have a first-order implication for economic policy... The optimal policy response to this problem is not to increase capital requirements (or to deleverage), as the current fashion has it, but to remove the aggregate risk from systemically important leveraged financial institutions' balance sheets. This should be done through prepaid and often mandatory macro-insurance-type arrangements, which can accommodate valid too-big or too-complex to fail concerns, but without crippling the financial industry with the burden of brute-force capital requirements. ...

We shouldn't assume that the next potential financial crisis will be identical to this one in terms of how it comes about or how it expresses itself, so we need to ensure that the system can withstand different types of financial shocks. Given that these shocks can come from unexpected places, it's not clear to me that the insurance discussed above will stop all of the ways in which financial market problems can lead to harmful deleveraging. Hence, we may want to put in place the type of insurance plan Ricardo Caballero would like to see, and then buttress that protection with enhanced capital requirements to safeguard against unexpected causes of harmful deleveraging.


One Operating System to Rule Them All?

Google is moving forward with its plans to develop an operating system:

Google Plans a PC Operating System, Helft and Vance, NY Times: In a direct challenge to Microsoft, Google announced ... it is developing an operating system for PCs that is tied to its Chrome Web browser.

The software, called the Google Chrome Operating System, is initially intended for use in the tiny, low-cost portable computers known as netbooks... Google said it believed the software would also be able to power full-fledged PCs.

The move is likely to sharpen the already intense competition between Google and Microsoft... "Speed, simplicity and security are the key aspects of Google Chrome OS," said Sundar Pichai ... and Linus Upson ... in a post on a company blog. "We're designing the OS to be fast and lightweight, to start up and get you onto the Web in a few seconds."

Mr. Pichai and Mr. Upson said that the software would be released online later this year under an open-source license... Netbooks running the software will go on sale in the second half of 2010.

The company likely saw netbooks as a unique opportunity to challenge Microsoft, said Larry Augustin, a prominent Silicon Valley investor...

"Market changes happen at points of discontinuity," Mr. Augustin said. "And that's what you have with netbooks and a market that has moved to mobile devices." ...

Google's plans for the new operating system fit its Internet-centric vision of computing. Google believes that software delivered over the Web will play an increasingly central role, replacing software programs that run on the desktop. In that world, applications run directly inside an Internet browser, rather than atop an operating system, the standard software that controls most of the operations of a PC.

That vision challenges not only Microsoft's lucrative Windows business but also its applications business, which is largely built on selling software that runs on PCs. ... Google said Tuesday night that it still had work to do to develop a full-fledged operating system. ... [Here's Google's announcement.]

I resisted moving from DOS to Windows, and then got stuck on Windows once I did move, so I'm probably not the best judge of whether the model Google is using to challenge Microsoft will be successful, and perhaps both models can survive by serving different needs. However, I've also spent time on mainframe batch and time-share systems where you interact with the mainframe computer through a terminal (screen and keyboard), and Google's vision reminds me of an internet-wide version of that system (if I understand it correctly, and I may not).

If I want to do simulations of a theoretical model, will it be like graduate school, where I had to work very late at night when the system had enough free resources to accommodate my requests without being so slow as to be nearly unusable? PCs freed me from that constraint (but not the late-night work habit). It was hard to work at home then as well. It was possible to connect through a phone, but it was very slow, and this was also something PCs changed: you didn't have to be at school to do computer work. If we go to the Google model, will the internet be available broadly and reliably enough so that there won't be frustrating periods when lack of an internet connection means you can't get things done unless you do the equivalent of "going to school where there's a terminal"?

I also like having data backed up locally on my own disks or other media rather than trusting a centralized system to keep it safe for me, and with sensitive data it feels much more secure that way. I suppose this isn't a problem for people who use their computers mainly to browse the internet or send email. But if you use your PC for tasks that require lots of computing power or use sensitive data, I think you have reason to wonder whether some of the speed, flexibility, and security PCs give you might be compromised with this system. For that reason, I wonder whether Google's model will be able to capture some segments of the market, e.g., those that want lots of computing power to be available nearly on demand. But as I said, if people had listened to me, we'd probably still be using DOS.


"Administrative Costs in Health Care"

A follow-up to this post on administrative costs in health care:

Administrative Costs in Health Care: A Primer, by Ezra Klein: ...Paul Krugman, Greg Mankiw, Tyler Cowen and a handful of others began arguing about [administrative costs]... I'm not convinced any of them have it right.

Administrative costs are ... confusing... What counts as an "administrative cost" for a health insurer? We all agree that paying bills counts. But does ... disease management? Advertising? A nurse who dispenses health advice over the telephone? Hard to say. But all of them get grouped under administrative costs at various times. Indeed,... there's not perfect unanimity on how to measure any of this.

But most seem to think that Medicare's administrative costs are significantly undersold... An apples-to-apples comparison would not leave you with the 2 percent of total Medicare spending often bandied about in debate. That doesn't count, for instance, Medicare's premium collection, which is done through the ... IRS. Nor does it count most of Medicare's billing, which is outsourced -- and this might surprise people -- to private insurers like Blue Cross Blue Shield and listed under vendor services rather than program administration. A more straightforward estimate ... would be in the range of 5 to 6 percent.

Nor is it easy to measure administrative costs among private insurers. For one thing, which private insurers? ... Among employer-based plans, the largest firms had the lowest costs. Plans covering companies with at least 1,000 employees had a mere 7 percent in administrative costs. Those covering companies with fewer than 25 employees spent 26 percent of premiums on administration. And the individual market was a mess: 30 percent.

This tells us ... size matters. The most important predictor of administrative costs is not whether the plan is public or private, but whether it is large. ...

But administrative costs among ... insurers ... are only part of the story. And they may not even be the most important part. The hospitals and physicians ... are spending tremendous sums of money too. Hospitals ... employ people to argue over claims and navigate the rules of the dozen or so different insurance plans they contract with. And here the experts were unanimous: The problem is that the system is fractured. There's no standardization..., every insurer is complicated in its own way. And that complexity costs a lot of money.

As of now, no one I spoke with knew of good data separating the costs of dealing with Medicare and with private insurers. But there are studies comparing Canada and the United States that show a single payer vastly reduces administrative spending. ...

But ... slashing administrative costs ... will never be a panacea... Rick Kronick, a political scientist at the University of California at San Diego,... summed the situation up quite well. "The main question," he said, "is why are health care costs going up at 2.4 percent a year faster than GDP? And most of the answers to that question have nothing to do with administrative costs. The answers are that we do more stuff and have more technology. Even if we could wring administrative savings out of the system, which ... would be a good thing, we'd still be facing the question of how to slow the rate of cost growth."
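
[To put that 2.4 percentage-point gap in perspective, here is a back-of-envelope calculation of my own, not from Klein's article: if health spending grows 2.4 points faster than GDP each year, then the share of GDP going to health care, call it s_t, is multiplied by roughly 1.024 annually, so

    s_t \approx s_0 (1.024)^t \quad\Longrightarrow\quad t_{\text{double}} = \frac{\ln 2}{\ln 1.024} \approx 29 \text{ years}.

Whatever fraction of GDP health care absorbs now, it absorbs roughly double that fraction within about three decades, which is why the growth rate of costs, not administrative overhead, dominates the long-run arithmetic.]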


links for 2009-07-08
