Saturday, February 28, 2009

S&P's response to Siegel on PE ratios - S&P still misses the point

A letter today in the Wall Street Journal claims that S&P does compute earnings correctly. I've copied the letter below. The link to the letters page is here.

S&P Does the Earnings Correctly

In his "The S&P Gets Its Earnings Wrong" (op-ed, Feb. 25), Jeremy J. Siegel claims that Standard & Poor's systematically understates the earnings of the S&P 500. In his view, the recent losses of the financial companies in the S&P 500 should be discounted because of their diminished weights in the index.

His argument, however, fails the simple tests of both logic and index mathematics. A dollar earned or lost is the same, irrespective of whether it is earned or lost by a big index constituent or a smaller one.

Prof. Siegel's example of Exxon-Mobil illustrates why S&P's method of calculating earnings works. If large Exxon-Mobil earned $10 billion and small Jones Apparel lost $10 billion, index investors collectively -- and individually -- would bear a proportionate share of both Exxon's earnings and Jones's loss, despite the fact that the value of Exxon-Mobil's shares in the index portfolio is about 1,381 times the value of the Jones's shares.

To use an analogy, we could hypothetically view the S&P 500 as a single company with 500 divisions, with each division having earnings and an implicit market value. The smallest of these divisions could have an outsized loss that wipes out the combined earnings of the entire company. Claiming that these losses should be ignored or minimized because they came from a less valuable division is flawed.

Prof. Siegel's approach -- applying the weights based on market values to the results based on a company's earnings -- effectively mixes apples and oranges.

David M. Blitzer
Managing Director, Chairman of the Index Committee
Standard & Poor's
New York


I still think that Siegel is correct, and that Blitzer (and S&P) are missing the point. S&P is right that the total earnings of the index are the sum of the earnings of the individual companies. But when computing a PE ratio, this approach is flawed if you want a PE ratio that is comparable to an individual stock's PE. That is because the S&P 500 is NOT a multi-division company in which the earnings of a bad division can wipe out the earnings of the rest of the company. These are separate, individual companies.

When a stock with a tiny market value posts a massive loss, that loss has a disproportionate effect on the overall PE ratio of the index. If you want the PE of the index to provide some indication of the overall valuation of the market, S&P's method will give you too high a PE.

I blogged on this a couple of days back. But my basic argument (and, I think, Siegel's) is that if you have a stock with a tiny value and a huge loss, its impact on the PE should be trivial, because if you buy the index today you'll only be putting a tiny amount of your money in that stock. By not value weighting the earnings (or losses) you are assuming that that stock makes up a much larger chunk of the index than it actually does.

Or, put another way: you have two stocks in the index. One has a PE of 10 and is massive. The other has a negative PE, with earnings equal to minus the earnings of the big firm. Is the PE of the index 10, or infinite (the aggregate earnings sum to zero)? If you bought the big stock, you'd buy it at a PE of 10, and if you bought the tiny, big-loss stock you'd buy it as basically an equity option.

So while S&P is mathematically correct, if they want a PE ratio that is actually economically meaningful, their approach is flawed.

Wednesday, February 25, 2009

S&P Earnings are too low

Edited post...
On Feb 25, I originally posted the following (I subsequently withdrew the post to think about it more):

"The Wall Street Journal has an Op-ed piece by Jeremy Siegel who argues that earnings reported for the S&P 500 are understated because of the goofy way that S&P computes the index's aggregate earnings.

Whereas the returns on the S&P 500 are estimated on a value weighted basis, S&P estimates aggregate earnings by merely adding up the earnings of all the stocks in the index. Of course stocks that are losing lots of money tend to have low values. So the earnings number for the index is artificially reduced by this approach. This means that a) S&P 500 earnings aren't as bad as they look, and b) the P/E ratio for the S&P 500 is actually much lower than reported."


Since posting this, a commenter noted that I had it wrong. Also, several blogs [here, here and here] had come to the same conclusion: Siegel had indeed messed up.

Their logic is simple - the S&P 500 is a value weighted index. The value of the index is basically all the market values added up (and adjusted for float - although that's not important here) and then divided by a fixed divisor.

Therefore a PE ratio of the index = [Sum of market values/divisor] / [Sum of earnings/divisor]

Obviously the divisor cancels, and you are left with the sum of market values divided by the sum of earnings - which is what S&P does, and what Siegel argued was wrong.
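You can check the cancellation numerically. This is a toy sketch with made-up market values, earnings, and divisor (the real S&P divisor is proprietary):

```python
# The divisor cancels out of the index PE: a minimal numeric check
# using made-up market values and earnings for three stocks.
market_values = [300.0, 120.0, 80.0]
earnings = [20.0, -5.0, 10.0]
divisor = 7.23  # arbitrary; the actual S&P divisor is proprietary

index_level = sum(market_values) / divisor
index_earnings = sum(earnings) / divisor

pe_with_divisor = index_level / index_earnings
pe_without = sum(market_values) / sum(earnings)

# The two are identical: the divisor appears in both numerator
# and denominator, so it drops out.
print(pe_with_divisor)  # 20.0
print(pe_without)       # 20.0
```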

After thinking about it more, I think that Siegel is correct, although his point is perhaps not very well made.

Consider a simple value weighted index with two stocks:
A has 1000 shares and a price of 20, and a market value of 20,000.

B has 1000 shares and a price of 0.05, and market value of 50.

A is 99.751% of the index, B is 0.249% of the index.

B used to be a big company, but now isn't! (Think Citigroup).

A has net income of 2000, B has net income of -1000.

Using S&P's method the PE of the index is (20,000+50)/(2000-1000) = 20.05

Using Siegel's method the PE of the index is (20,000*0.99751 + 50*0.00249)/(2000*0.99751 - 1000*0.00249) = 10.013 (the weights are the index percentages above, written as fractions).
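To make the two calculations concrete, here's a quick sketch of both methods in Python, with the numbers taken straight from the example above:

```python
# Two-stock index: A is huge and profitable, B is tiny and losing money.
values = {"A": 20_000.0, "B": 50.0}      # market values
earnings = {"A": 2_000.0, "B": -1_000.0}  # net incomes

total_value = sum(values.values())  # 20,050
weights = {k: v / total_value for k, v in values.items()}

# S&P's method: sum of market values over sum of earnings.
pe_sp = total_value / sum(earnings.values())

# Siegel's method: value-weight both the values and the earnings.
pe_siegel = (sum(weights[k] * values[k] for k in values)
             / sum(weights[k] * earnings[k] for k in values))

print(round(pe_sp, 2))      # 20.05
print(round(pe_siegel, 2))  # 10.01
```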


OK, so what's going on here? It is quite unusual to have very low market cap firms losing huge amounts of money, but with the financials in the S&P 500, that is what we are seeing now. If you bought the S&P 500 today, you are basically buying stock A, plus a tiny stock called B which has a very bad history. But the bulk of your holdings come from stock A. Sure, B has lost money, but if the losses exceed the market value, these aren't losses that you as a stockholder will bear. In fact, the most you can now lose on your investment in B is 50. The losses are what got the stock down to this point.

B has become trivial to your portfolio: you basically own A, and any decision to buy more of this portfolio should be based on whether you think A is fairly valued. I'd argue that assigning a PE of around 10 is far more realistic than a PE of 20. If the PE is a forward-looking measure, then, looking forward, your future contains mostly stock A and hardly any B.

Another way of looking at this is that when a stock's value is close to zero, it behaves like an option. You don't suffer the downside; you only get the upside. Therefore creating a PE that incorporates this huge downside is going to result in a PE that is too high.
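The cap on your downside is easy to see with the numbers for stock B from the example above (a trivial sketch; the variable names are mine):

```python
# Equity has limited liability: an index holder's loss on stock B is
# capped at B's market value, no matter how large B's reported losses.
position_value = 50.0      # value of B in the index portfolio
reported_loss = -1_000.0   # B's net income

worst_case_loss = min(position_value, abs(reported_loss))
print(worst_case_loss)  # 50.0 -- the most the holder can lose on B
```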

This other blogger also thinks Siegel is correct.

Origins of the financial mess

Alan Blinder talks about what went wrong and who screwed up. The answer is basically "a lot" and "pretty much everyone".

HT: Newmark's Door

Tuesday, February 24, 2009

Finance cults

Following on from my post yesterday about Felix Salmon's excellent article on the use of copulas in finance: today Mr. Salmon has posted some excerpts of a response to his article, written by a quant blogger who argues that, at least on the sell side, there is a cult-like mentality on Wall Street. The high priest of the cult is Professor X, who is peddling his latest formula. The quant money managers are the willing sheep.

Monday, February 23, 2009

"The formula that killed Wall Street"

Great article here on the Wired website about the Gaussian copula, which purported to allow investors to simply model the correlation between disparate assets. Well worth a read...

Saturday, February 21, 2009

Dollar Cost Averaging.

Like most folks who have a 401k or equivalent, I have lost a lot of money of late. I don't even look at the statements anymore! But the one comfort I have is that I am buying stocks at bargain prices, and lots of them. This is the genius behind dollar cost averaging (DCA). You put the same dollar amount in the market every month, and at times like this, you get a lot of stock for your money.
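The arithmetic behind DCA is easy to see with made-up numbers: the same fixed dollar amount buys more shares when prices fall, so the average cost per share comes in below the simple average of the prices paid:

```python
# Dollar cost averaging in a falling market: a fixed monthly budget
# buys more shares when prices are low. Prices here are hypothetical.
monthly_budget = 100.0
prices = [20.0, 10.0, 5.0, 10.0]  # made-up monthly prices

shares = sum(monthly_budget / p for p in prices)
avg_cost = (monthly_budget * len(prices)) / shares
avg_price = sum(prices) / len(prices)

print(round(shares, 1))     # 45.0 shares bought in total
print(round(avg_cost, 2))   # 8.89 average cost per share
print(round(avg_price, 2))  # 11.25 simple average of the prices
```

The average cost (8.89) is below the average price (11.25) because the low-price months contribute more shares to the total.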

My favorite blogger, Felix Salmon, discusses the topic here. Felix also goes off on folks who criticize Suze Orman for her support of DCA. He's completely correct. While Suze Orman is a little annoying at times, she offers very sensible, sound advice.

So, don't change your allocation - keep buying these bargain stocks. Oh and hope for brighter days!

Wednesday, February 18, 2009

Replicating economic research

Felix Salmon at portfolio.com has a post about a recent academic paper documenting how hard it is to replicate academic research - specifically in economics.

The paper he references, by B. D. McCullough and Ross McKitrick, is pretty damning, arguing that:
"In practice it is rare for scientists to make their data and code accessible, and it is rare for scientists to replicate one another's work, in part because it can be so difficult to get the data"

Although this all seems pretty bad for academic economics, several of the specific cases cited are well known. For example, one is the Boston Fed study on mortgage redlining. This study led to an expansion of mortgage lending but turned out to be based on very flawed analysis. The case has been well known for a while and is often cited as one of the causes of the current subprime mess.

But the problem extends to other academic papers, such as those appearing in the American Economic Review and the Journal of Money, Credit and Banking. In both cases the authors found it pretty hard to replicate key studies, largely because the original authors wouldn't divulge their programs or data.

So why don't economics (or finance) researchers just post their code on their websites in addition to their working papers?

The answer is simple: a significant amount of the research process is devoted to writing the programs that crunch the numbers. For a complex paper that appears in a top journal, there can be hundreds of hours devoted to coding. This code has the potential to be used for future research projects. For an empirical researcher, the statistical analysis code that he or she writes is basically what pays the bills.

Having said this, I personally am always willing to help others replicate or extend my papers because, among other reasons, that new author will cite my work. But it's pretty rare that I will just hand out programs - they are my intellectual capital.

R-squared and return predictability.

The R-squared of a regression of mutual fund returns on the factors of the so-called 4-factor model is a measure of how much of the fund's returns can be explained by the model. The 4-factor model is pretty close to the "state of the art" for return measurement. A lower R-squared implies that the fund's returns are less well explained by the 4-factor model.
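For the curious, here's a rough sketch of how such an R-squared is computed. The data and factor loadings below are entirely made up (the real model uses the market, size, value, and momentum factors), but the regression mechanics are the same:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 120  # months of synthetic data

# Hypothetical monthly factor returns (stand-ins for market, size,
# value, and momentum).
factors = rng.normal(0.0, 0.04, size=(T, 4))

# A synthetic fund that loads on the factors plus idiosyncratic noise.
betas = np.array([1.0, 0.2, -0.1, 0.05])
fund = factors @ betas + rng.normal(0.0, 0.02, size=T)

# OLS regression of fund returns on a constant and the four factors.
X = np.column_stack([np.ones(T), factors])
coefs, _, _, _ = np.linalg.lstsq(X, fund, rcond=None)
alpha = coefs[0]  # the intercept is the fund's alpha

resid = fund - X @ coefs
r_squared = 1 - resid.var() / fund.var()
print(round(r_squared, 2))  # high: returns mostly explained by the model
```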

A recent working paper argues that this R-squared has useful forecasting ability - specifically, that lagged R-squared is negatively correlated with future fund alphas. Even more surprising, the negative relation also extends to information ratios, where the information ratio is commonly defined as alpha divided by the fund's idiosyncratic risk. This is important because you could boost alpha by just sacrificing diversification. The information ratio result suggests that this is not the case.

The authors interpret their findings as evidence of "selectivity or active management".

Interesting stuff. Especially after I just finished teaching my MBAs that idiosyncratic risk is not priced. Hmmm the tangled web we weave...

The paper in question is...

"Mutual Fund's R2 as Predictor of Performance"

Yakov Amihud, New York University - Stern School of Business

Ruslan Goyenko, McGill University - Faculty of Management


and appears on SSRN here

Hedge fund fees

Rather unsurprisingly, hedge fund fees are getting cut. Are "1 and 10" fee structures (1% management fee, 10% performance fee) the future?

Frontline - inside the meltdown

Frontline on PBS has a great new documentary on the financial meltdown that started with Bear, then Lehman and then AIG. Well worth watching. You can view it online here

Saturday, February 14, 2009

What do economists agree on?

Greg Mankiw lists things that economists agree upon. I have to admit that I agree with all of these. That must make me pretty mainstream...
  1. A ceiling on rents reduces the quantity and quality of housing available. (93%)
  2. Tariffs and import quotas usually reduce general economic welfare. (93%)
  3. Flexible and floating exchange rates offer an effective international monetary arrangement. (90%)
  4. Fiscal policy (e.g., tax cut and/or government expenditure increase) has a significant stimulative impact on a less than fully employed economy. (90%)
  5. The United States should not restrict employers from outsourcing work to foreign countries. (90%)
  6. The United States should eliminate agricultural subsidies. (85%)
  7. Local and state governments should eliminate subsidies to professional sports franchises. (85%)
  8. If the federal budget is to be balanced, it should be done over the business cycle rather than yearly. (85%)
  9. The gap between Social Security funds and expenditures will become unsustainably large within the next fifty years if current policies remain unchanged. (85%)
  10. Cash payments increase the welfare of recipients to a greater degree than do transfers-in-kind of equal cash value. (84%)
  11. A large federal budget deficit has an adverse effect on the economy. (83%)
  12. A minimum wage increases unemployment among young and unskilled workers. (79%)
  13. The government should restructure the welfare system along the lines of a “negative income tax.” (79%)
  14. Effluent taxes and marketable pollution permits represent a better approach to pollution control than imposition of pollution ceilings. (78%)

Greg's blog is well worth reading. His posts are usually short and very focused.