Human element makes predicting downturns tough

  • By James McCusker
  • Thursday, January 28, 2016 1:07pm
  • Business

If a predictive software model can forecast a power company’s equipment failures, or maybe even an oncoming heart attack, why can’t our economic models predict downturns and allow us to short-circuit them? As it turns out, there is a reason.

The Salt River Project supplies electric power to over a million household and commercial customers in the Phoenix, Arizona, area. Although its configuration is, of course, unique, it shares many of the same problems that other energy companies cope with. One of those shared problems is the high cost of outages caused by equipment failures, a cost paid not only in emergency responses and substitute power purchases but also in customers' own costs and dissatisfaction.

It is much less costly and disruptive to replace a piece of equipment before it fails, and in many, if not most, cases the work can be done without interrupting customers' energy flow. There is nothing good or inexpensive about the unexpected failure of a system or a piece of equipment, though, so reliability predictions are important. The Salt River Project, with the assistance of General Electric Corp., connected its equipment and its grid to a predictive software package that continuously monitors the system, looking for anomalies. Any anomaly that shows up in the data is reported, diagnosed, and analyzed for the likelihood that it will lead to an equipment failure, and utility managers can then schedule a repair or replacement that deals with the problem before it actually becomes one.
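The column does not describe the internals of GE's software, but the basic idea of watching a data stream for readings that drift out of their normal range can be illustrated. The sketch below is a minimal, hypothetical example, not the utility's actual system: it flags sensor readings that sit far outside their recent average, the simplest form of the anomaly screening such a monitoring package might perform. All names, data, and thresholds are assumptions for illustration.

```python
# Minimal sketch (not GE's or SRP's actual system): flag readings that drift
# far from their recent average using a rolling z-score. Hypothetical data.
from statistics import mean, stdev

def flag_anomalies(readings, window=24, threshold=3.0):
    """Return indices of readings more than `threshold` standard deviations
    from the mean of the preceding `window` readings."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)  # candidate for diagnosis and follow-up
    return anomalies

# Example: hourly transformer oil temperatures with a sudden jump at the end.
temps = [62.0 + 0.1 * h for h in range(48)] + [75.0]
print(flag_anomalies(temps))  # -> [48]
```

A real package would do far more, diagnosing which component is drifting and estimating how likely the drift is to end in a failure, but the flag-then-investigate pattern is the same.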

This kind of system is a world away from the labor-intensive methods once used to keep critical systems online. As electronics began to play a greater role in our lives, failures plagued systems large and small, public and private. My personal experience with these systems first came with shipboard radar, where critical failures always seemed to involve back-ordered components. In those situations, getting the radar back online meant getting on the radio and finding a ship that had the right kind of klystron tube or some other needed part gathering dust in its spares inventory, and that would trade it for a warm duffel coat or the steaks from a frozen six-way beef collection.

This sub-rosa world of trades and work-arounds sat atop an army of engineers and statisticians who labored to calculate the MTBF (Mean Time Between Failures) of individual components and systems. These MTBFs would eventually be used to set probability-driven inventory levels for replacement parts, theoretically combining the highest level of system reliability with the most efficient budgeting. It worked well, but not perfectly, because it was constantly colliding with the reality of complex logistics systems and, worse, with an x-factor that seemed almost demonic: systems always failed at the worst possible time.
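To see how an MTBF figure turns into a stocking decision, consider a back-of-the-envelope calculation of my own (not from the column): given a component's MTBF, the size of the fleet using it, and a support period, estimate the expected number of failures and pick a spares level that covers demand with a chosen confidence. The sketch below assumes failures arrive as a Poisson process; all the numbers are hypothetical.

```python
# Illustrative only: translate an MTBF into a probability-driven spares level,
# assuming failures follow a Poisson process. All figures are hypothetical.
import math

def spares_needed(mtbf_hours, fleet_size, period_hours, confidence=0.95):
    expected_failures = fleet_size * period_hours / mtbf_hours  # Poisson mean
    cumulative, k = 0.0, 0
    while True:
        cumulative += math.exp(-expected_failures) * expected_failures**k / math.factorial(k)
        if cumulative >= confidence:
            return k, expected_failures
        k += 1

# Hypothetical numbers: 50,000-hour MTBF, 200 units in service, one year of
# round-the-clock operation (8,760 hours).
spares, expected = spares_needed(50_000, 200, 8_760)
print(f"Expected failures: {expected:.1f}; stock {spares} spares for 95% coverage")
```

The arithmetic is simple; the hard part, as the column notes, was that the probabilities feeding it were only as good as the experience behind them.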

The foundation of the MTBF system was a combination of experience-based and theoretical probabilities of when a component or system would fail. As the pace of electronic technology accelerated, though, the mix became more and more theoretical because not enough time had elapsed to develop experience-based data. The growing level of uncertainty that this caused was one factor in the decision to build the Apollo moon-landing program with "off the shelf" technology that came with a known reliability history.

The transition to sensor-based data and analysis (called a predictive software system) requires a lot of computing power and database storage, more than most companies normally have or would be willing to pay for, install, and maintain. "Cloud" computing services are making this type of information system economically and technologically feasible, though, and organizations like the Salt River Project are taking advantage of it.

The Snohomish County Public Utility District has a very different configuration from the Salt River Project, and therefore its use of predictive software focused initially on reliability issues closer to electricity consumers than to production facilities. The first systems, then, addressed substation equipment as well as the organization's pool of outage response vehicles. Further uses of predictive software are planned, but each still has to be designed, evaluated, and prioritized, a complex process in its own right.

Our savings would be huge if we had a system that could accurately forecast economic downturns. Unfortunately, though, economics is a long way from developing anything even vaguely resembling a predictive software system. The reason is that macroeconomics, especially in the short run, is based on predictably unpredictable human behavior and intentions — and there are no sensors or sensor-based data for either of these.

Economics will get predictive software eventually, because the incredible cost savings will provide the motivation. For the near future, though, we will be dependent on human-based judgment for our economic forecasts and policy decisions.

James McCusker is a Bothell economist, educator and consultant. He also writes a column for the monthly Herald Business Journal.
