Relevant and even prescient commentary on news, politics and the economy.

Benford’s Law and the Decreasing Reliability of Accounting Data

H/t Mike Kimel

Via Economist’s View

This is from Jialan Wang:

Benford’s Law and the Decreasing Reliability of Accounting Data for US Firms, by Jialan Wang: …[T]here are more numbers in the universe that begin with the digit 1 than 2, or 3, or 4, or 5, or 6, or 7, or 8, or 9. And more numbers that begin with 2 than 3, or 4, and so on. This relationship holds for the lengths of rivers, the populations of cities, molecular weights of chemicals, and any number of other categories. …

This numerical regularity is known as Benford’s Law. Specifically, it says that the probability that the first digit of a number drawn from such a set is d is given by P(d) = log10(1 + 1/d).
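The law is easy to tabulate (a quick sketch; the code is mine, not from Wang’s post):

```python
import math

# Benford's law: P(d) = log10(1 + 1/d) for leading digits d = 1..9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Digit 1 leads about 30.1% of the time, digit 9 only about 4.6%;
# the nine probabilities telescope to log10(10) = 1.
```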

In fact, Benford’s law has been used in legal cases to detect corporate fraud, because deviations from the law can indicate that a company’s books have been manipulated. Naturally, I was keen to see whether it applies to the large public firms that we commonly study in finance.

I downloaded quarterly accounting data for all firms in Compustat… over 20,000 firms from SEC filings… (revenues, expenses, assets, liabilities, etc.).

And lo, it works! Here is the distribution of first digits vs. Benford’s law’s prediction for total assets…

Next, I looked at how adherence to Benford’s law changed over time, using a measure of the sum of squared deviations of the empirical density from the Benford’s prediction…
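Wang doesn’t post her code, but the measure she describes can be sketched along these lines (function and variable names are mine, not hers):

```python
import math
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x: float) -> int:
    """Leading decimal digit of a nonzero number."""
    # Scientific notation puts the leading digit first: e.g. 4.56e-02.
    return int(f"{abs(x):.10e}"[0])

def benford_deviation(values) -> float:
    """Sum of squared deviations of the empirical first-digit density
    from Benford's prediction, over the nine possible digits."""
    digits = [first_digit(v) for v in values if v]
    n = len(digits)
    counts = Counter(digits)
    return sum((counts.get(d, 0) / n - BENFORD[d]) ** 2 for d in range(1, 10))
```

Powers of 2 are a classic Benford-conforming series, so their deviation is tiny; a column of numbers that all start with 5 deviates strongly.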

Deviations from Benford’s law have increased substantially over time, such that today the empirical distribution of each digit is about 3 percentage points off from what Benford’s law would predict. The deviation increased sharply between 1982-1986 before leveling off, then zoomed up again from 1998 to 2002.  Notably, the deviation from Benford dropped off very slightly in 2003-2004 after the enactment of Sarbanes-Oxley accounting reform act in 2002, but this was very tiny and the deviation resumed its increase up to an all-time peak in 2009.

So according to Benford’s law, accounting statements are getting less and less representative of what’s really going on inside of companies. The major reform that was passed after Enron and other major accounting scandals barely made a dent.

Next, I looked at Benford’s law for three industries: finance, information technology, and manufacturing. … [shows graphs] … While these time series don’t prove anything decisively, deviations from Benford’s law are compellingly correlated with known financial crises, bubbles, and fraud waves. And overall, the picture looks grim. Accounting data seem to be less and less related to the natural data-generating process that governs everything from rivers to molecules to cities. Since these data form the basis of most of our research in finance, Benford’s law casts serious doubt on the reliability of our results. And it’s just one more reason for investors to beware….

Comments (3)

Random Notes 3 June 2011

  1. Buce has been on fire recently, so I’ll probably have to do a post about why this post is so off-target, though his conclusion is correct (short version: he’s been misled).
  2. If I’m reading this morning’s SIFMA Brief correctly, Moody’s—whose rating skills Robert has discussed at length—(1) may downgrade US debt if we spend too much and (2) will downgrade US banks unless we spend too much on them. Oh, and the banks object to regulation because it would “artificially” reduce asset values (presumably, many of the same ones Moody’s wants protected).
  3. Relatedly, James Saft (probably h/t Felix) notes that “generous” UK banks are playing reporting games. (The US version is to deny the rework and leave the asset marked at unsustainable levels.)
  4. That this is spot-on would make me sadder if I thought we still lived in anything resembling a meritocracy, or even a developing economy.
  5. If we needed further evidence of that, the state with the best secondary education system in the country is pushing forward with privatize-the-gains.
  6. I’m more and more convinced that China “is different,” but very much not certain the differences will make an ultimate difference. Daniel Gross is inclined to think not. More on this as I finally finish my review of BoomBustOlogy, which you should expect to see some time before the apocalypse.
  7. I assume everyone has already seen this. Just in case, check out the facts, stylised or not.
  8. Oh, and Felix is wrong here. But that’s a post that will probably never be written by me. Someone else want to send it in?

Comments (2)

Liquidity, Markets, and Pricing: A Contemporary Example

A lot of trading in the Fixed Income (and especially FX) market is done for “liquidity” purposes. There is often an underlying goal involved (e.g., push prices higher with small lots, sell large ones at the elevated prices) and frequently such strategies are discussed as “algorithmic trading.” (Example: the algorithm estimates that you will need to buy 5 $100MM lots of JPY at incrementally higher rates to be able to sell $1B USD at the higher JPY level.)

The liquidity of the “markets” is facilitated by algorithmic trading: the seller for the first five trades in the above example doesn’t care about the purpose of the counterparty’s trade, just that the price bid is agreeable.
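A stylized version of the small-lot/large-lot pattern (all prices and sizes below are invented for illustration, not market data):

```python
# Stylized sketch of the pattern described above; not a trading system.
def incremental_buys(levels, lot_size):
    """Buy `lot_size` at each successively higher level; return the
    total quantity bought and the average price paid."""
    return lot_size * len(levels), sum(levels) / len(levels)

# Five small $100MM lots walk the rate up in increments...
levels = [85.00, 85.05, 85.10, 85.15, 85.20]   # JPY per USD, illustrative
qty, avg_cost = incremental_buys(levels, 100e6)
# ...so the large sale can then be done at the elevated final level,
# above the 85.10 average cost of the small lots.
```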

Then there are the times when algorithmic pricing goes terribly wrong:

Eisen began to keep track of the prices until he caught on to what was happening: The two sellers of that particular book — bordeebook and profnath — were adjusting their product prices algorithmically based on competitors:

Once a day profnath set their price to be 0.9983 times bordeebook’s price. The prices would remain close for several hours, until bordeebook “noticed” profnath’s change and elevated their price to 1.270589 times profnath’s higher price. The pattern continued perfectly for the next week.

The biologist continued to watch the prices grow higher and higher until they hit a peak price of $23,698,655.93 on April 19. On that day “profnath’s price dropped to $106.23, and bordeebook soon followed suit to the predictable $106.23 * 1.27059 = $134.97.” This means that someone must’ve noticed what was happening and manually adjusted the prices. [italics mine]

As a mathematical exercise, the climb from $106.23 to $23,000,000 and change is clear: the product of the two multipliers (0.9983 × 1.270589 ≈ 1.268) exceeds 1, so each full repricing cycle raises both prices by about 27%. (If both dealers mark up above the other, you get to the same point or higher even quicker.) Similarly, if both dealers price at a fraction below 1.000 of the other’s, the price converges toward $0.00 as the algorithm progresses.
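The feedback loop is easy to simulate, using the two factors quoted in the story (the starting price and cycle counts below are illustrative):

```python
PROFNATH = 0.9983      # profnath undercuts bordeebook slightly
BORDEEBOOK = 1.270589  # bordeebook marks up over profnath's price

def reprice(bordeebook_price: float, cycles: int) -> float:
    """Run `cycles` full rounds of the mutual repricing described above."""
    for _ in range(cycles):
        profnath_price = PROFNATH * bordeebook_price
        bordeebook_price = BORDEEBOOK * profnath_price
    return bordeebook_price

# Each cycle multiplies the price by 0.9983 * 1.270589 ≈ 1.268,
# so the price grows ~27% per cycle and explodes geometrically.
```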

Consider the implication for a potential third seller, though. Depending on when they check, they may believe they have a book that will make them (if and when sold) rich. But the “market” they see is two computers offering against each other—there is no bid-side shown, and pricing “to sell” (say, $850K when both of the others are offered at around $1.7MM) implies that the third potential seller is carrying that asset at an inflated value.

Market transactions do not require two entities to like each other, or even to understand what the other is trying to do. Indeed, if your algorithm is buying at 85.3 JPY/USD and mine is selling at that level, neither of us necessarily cares why the other is transacting. And the rest of the market sees an actual trade against which they can adjust their pricing.

It’s only when the algorithms are trying to do the same thing that $23MM+ books are offered.

The implication for mark-to-market valuation seems obvious, and is left as an exercise to the reader.

Comments (12)

Economics and Bosses

Peter Dorman at Econospeak, who is smarter and nicer than I am,* boils down the question:

[D]o you believe that managers normally make the right decisions over how to run organizations?

If you believe that premise, please explain:

  1. Why all those great managers of the late 1940s through the mid-1970s ran defined benefit pension plans, while their successors—who are supposedly more capable—are only capable of offering defined contribution plans?
  2. That “underfunded pension benefits” are evil, but “overfunded” pensions led to the LBO (now “Private Equity”) movement of the 1980s?
  3. That, in the 1980s, GM being $1B underfunded caused Congress to pass a bill allowing pensions to become fully funded over 20 years—and that most of those targets were missed?

If bosses are so good at managing “ongoing concerns,” why do they take their payments upfront? What does—and should—this tell us about discount rates?
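One way to make the discount-rate question concrete (every number below is invented for illustration):

```python
def annuity_pv(payment: float, r: float, years: int) -> float:
    """Present value of `payment` per year for `years` years at rate r."""
    return sum(payment / (1 + r) ** t for t in range(1, years + 1))

def implied_rate(lump: float, payment: float, years: int) -> float:
    """Bisect for the discount rate at which the annuity's PV equals the lump sum."""
    lo, hi = 1e-9, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if annuity_pv(payment, mid, years) > lump:
            lo = mid  # PV still too high: discount harder
        else:
            hi = mid
    return (lo + hi) / 2

# A manager who prefers $1M today to $100K/year for 15 years ($1.5M nominal)
# is implicitly discounting the firm's future at more than ~5.6% per year.
r = implied_rate(1_000_000, 100_000, 15)
```

The steeper that implied rate, the less the boss believes in the “ongoing concern” he supposedly manages so well.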

*This is a fairly low standard, outside of people who work in finance.

Comments (70)

Accounting for Scott Sumner

Robert Waldmann

This whole post is after the jump as my accounting is not ready for prime time.

Scott Sumner thinks he is the first to note that the cost to the US government of bailing out the big banks is more likely to be a profit than a cost. Clearly he doesn’t read Angry Bear much, as I have been predicting that for months.

His accounting strikes me as very odd. Last I heard, the total cost of the bailouts (including the GSEs, AIG, GM and Chrysler) was predicted to be $87 billion. This does not include the cost of the FDIC honoring its contracts, which was not discretionary and not a bailout by any normal use of the word.

Now Sumner reports the good news that the cost, not including GM and Chrysler, will be only $158 billion?!?

Huh, what happened? First, I think he forgot about roughly $125 billion when he wrote “Last time I wrote on this subject the eventual cost to the government from bailing out the big banks was estimated at a negative $7 billion–in other words a profit to Uncle Sam of $7 billion.” I believe that when he wrote “the government” and “Uncle Sam” he meant “the Treasury.” Uncle Sam also has this little organization called the Federal Reserve Board. Last I heard, it was predicted to make a profit of $125 billion out of its bailout efforts. Not all of that involved big banks, but I just don’t believe that the government made only $7 billion out of its direct interactions with big banks. In any case, the $125 billion (probably more by now) seems to have escaped Prof. Sumner’s notice entirely.

The news which he reports is that the current guess is that the cost of bailing out AIG is going to be about zero. That is, the amount AIG owes is roughly equal to the expected present value of future repayments.

Sumner gets his huge overall loss because he describes the cost of bailing out Fannie and Freddie as “$165 billion and rising.” I believe this is the amount they owe the Treasury, minus zero expected repayments. Sumner argues that the big banks and AIG were OK investments and the GSEs weren’t, because in one case he includes expected discounted repayments and in the other he decides they are zero.
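Put the quoted figures side by side (my arithmetic and labels, in $ billions, with negative numbers denoting profits; this is a reconstruction, not Sumner’s own table):

```python
# Figures as quoted in this post, in $ billions; negatives are profits.
treasury_big_banks = -7    # Sumner's earlier figure: a $7B profit
aig = 0                    # current guess: roughly break-even
gse_draws = 165            # Fannie/Freddie, counting repayments as zero
fed_profit = -125          # Federal Reserve bailout profit, which Sumner omits

sumner_style_cost = treasury_big_banks + aig + gse_draws
# = 158: the headline "cost not including GM and Chrysler"

with_fed_cost = sumner_style_cost + fed_profit
# = 33: including the Fed's profit moves the headline by $125B
```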

It is worth noting that the GSE rescue involved loans at 10% per year, and the GSE debt is not equal to the money transferred from the Treasury to the GSEs plus the interest the Treasury paid on that extra debt. Oh no. It is the amount transferred plus the penalty interest rate charged on that amount.

Basically, I believe that Sumner did not stick to a consistent definition of “cost” and redefined the word so as to generate meaningless numbers that confirm his prejudices.

Also, he doesn’t understand the extent of the US government and thinks it is just the Department of the Treasury.

One of us is profoundly confused.

Comments (11)

Let’s Play a Game: Connect the Dots

Let us assume:

  1. That there is an equity premium ([PDF] UPDATE: Link modified to Brad De Long posting in which the PDF is embedded. Hat tip: Don Lloyd in comments.)
  2. That the equity premium can be derived from a linear relationship (y = a·x₁ + b·x₂ + …) of the most significant variables.
  3. That equity premia are, to some not-insignificant extent, based on Wealth Inequality [PDF].
  4. That one of the primary variables contributing to the premium is the reliability and quality of the information provided.

Given the above, should we expect the imminent weakening of U.S. accounting rules, discussed here and here, to produce a higher equity premium?*

If so, there are two possibilities. Either,

  1. the firms are perceived to have a higher present value, despite the change being one of accounting, not business flows; or
  2. the stock prices decline in the face of greater uncertainty, leading to an increase in the premium due to a lowering of the price.
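A toy illustration of the second possibility (every number invented): holding expected cash flows fixed, a lower price mechanically means a higher expected return, and so a higher premium over the unchanged risk-free rate.

```python
# Toy example: expected cash flows held fixed, price knocked down.
expected_payout = 5.0   # expected annual cash flow per share, invented
risk_free = 0.03        # risk-free rate, invented

def premium(price: float) -> float:
    """Expected return at this price, net of the risk-free rate."""
    return expected_payout / price - risk_free

before = premium(100.0)  # premium at the old price
after = premium(80.0)    # premium after uncertainty lowers the price
```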

My instinctive answer—which I can probably be convinced is wrong, but it would take some effort**—is that the second would be the result. Leaving only one question:

Why does Christopher Cox (R-CA) hate the Securities Markets?

*Following from [3] above, we can assume that greater information tends to result in more optimal investment practices, and therefore a lower equity premium.

**The argument would have to show that the loosening of accounting practices will result in improvements to the company’s business processes that would not otherwise occur.

Comments (0)