Data inundation constrains thinking on economy

  • By James McCusker
  • Thursday, January 21, 2016 2:32pm
  • Business

The Wall Street Journal recently ran a quiz entitled, “Test Your Data IQ. What Scary Economic Indicators Are We Charting?” Each question consisted of a chart that had a timeline but no further information, and the test taker had to identify what the data represented.

If you correctly identified all five charts, as one test-taker (obviously a ringer) around here did, you should probably get out more. But if you got even just a couple of them right, you probably have a sense of what it must be like to be on the Federal Reserve Board. One set of numbers tells you everything is wonderful and another tells you to put on your helmet and take cover.

The contradictory data lends itself to conspiracy theories, of course, and there is no shortage of them. The president is frequently seen as responsible for manipulating the numbers to create a rosy picture. This branch of thinking assumes that the motive is political, but a related branch believes that instead it is all part of Wall Street’s “rigged game” operation. Alongside the conspiracies, we have the usual civil service incompetence theories, which are aided and abetted by the Smithsonian-grade computers and information systems that government statisticians often have to deal with. None of these causal theories looks all that good under the quality assurance lights, however, and few economists take them seriously. The real reason is scary enough.

If we look past the easiest and least likely causes of the data conflicts we start to get a sense of the fault line that runs underneath modern economic theory and how it threatens its foundations. Fortunately, we also begin to get a sense of what we have to do to fix the problem.

At bottom, the fault is an over-dependence on data, caused by the data glut. When data was scarce, economic theory, seasoned by observation, experience and logic, provided the main ingredient for economic forecasts and the economic policy decisions that stem from them. Today, though, we are awash in data. Other than a few backcountry areas treasured by skiers in winter and backpackers in summer, there is no place in America where you are not going to trip over an old correlation coefficient or a stray pile of data.

The data inundation affects everything we do. Politicians cite data and statistical studies when pushing their ideas and new laws. Sports teams trumpet the stats of their newly acquired players; rotisserie football lives on stats. Hollywood writers who used to have scripts in their back pockets now carry colorful graphs of Big Data on their telephones that prove their new TV sitcom idea can’t miss. And, from a probability standpoint, even barbershop and barstool arguments today are more likely to be settled with a statistics citation than a police citation.

Whether data, data everywhere is a good thing for us or not is best left to history. Its effects on economic theory and policy, though, raise questions that are serious enough to be addressed now rather than left for historians to ponder.

The flood of data into the offices, desks and minds of economic policy makers has a good side, of course. It allows our databases to act as a rigorous method of testing ideas before they are turned loose on the world — much like the legendary description of a good committee: “Few good ideas are born there but a lot of bad ideas die there.”

The problem is that our economic ideas and our economic models, especially those used to determine economic policy, are not just flooded with data; they are constrained by it. These limits to our thinking eventually dominate our thoughts on how the economy works. This effect is amplified by the methodology we use to create and test economic models. They are based on empirical data, which is good, but the higher the probability that a model’s variables are selected from the available data, the more that availability becomes the controlling element.

Where this will ultimately take us we do not know, but it is already affecting the quality of our economic forecasting. Our economic models failed to predict the Great Recession, and it is unlikely that they will predict the next sharp downturn — or upturn, for that matter. What they will predict, with increasing precision, is what will happen under an increasingly narrow set of circumstances; knowing more and more about less and less.

As the fundamental shortcomings of economic models become more visible, economists will be forced to look out the window and see what’s going on in the economy, and begin assembling theories that first describe it … then predict it. We can do that. We have the technology.

James McCusker is a Bothell economist, educator and consultant. He also writes a column for the monthly Herald Business Journal.
