When Exotic Devices Create Exotic Problems

Vanity Fair has a fascinating article in the latest issue on airplane safety and how it affects pilot error. To make planes safer these days, engineers have designed automated systems that basically allow the planes to fly themselves.

Although the designers of the planes have technically made them as safe as they’ve ever been, accidents still happen every once in a while. But it’s not the planes that cause the problems — it’s human error.

The issue is that there can be confusion between the pilot and the machine. An unintended consequence of increased safety for the plane is increased complexity for the pilots. Because most pilots rarely have to take over from the automated system, it becomes much more difficult for them to handle a crisis if and when one occurs.

Industry experts have warned of the side effects that can arise from this increased complexity for a number of years now:

One of the cautionary voices was that of a beloved engineer named Earl Wiener, recently deceased, who taught at the University of Miami. Wiener is known for “Wiener’s Laws,” a short list that he wrote in the 1980s. Among them:

  • Every device creates its own opportunity for human error.
  • Exotic devices create exotic problems.
  • Digital devices tune out small errors while creating opportunities for large errors.
  • Some problems have no solution.
  • It takes an airplane to bring out the worst in a pilot.
  • Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.

Wiener pointed out that the effect of automation is to reduce the cockpit workload when the workload is low and to increase it when the workload is high. Nadine Sarter, an industrial engineer at the University of Michigan, and one of the pre-eminent researchers in the field, made the same point to me in a different way: “Look, as automation level goes up, the help provided goes up, workload is lowered, and all the expected benefits are achieved. But then if the automation in some way fails, there is a significant price to pay. We need to think about whether there is a level where you get considerable benefits from the automation but if something goes wrong the pilot can still handle it.”

This story has a number of financial implications. Although I’m a huge proponent of automating good decisions with your finances, that doesn’t mean everything can be set-it-and-forget-it forever. Things can get dangerous when something unexpected like the Flash Crash occurs and investors freak out and completely abandon their plan.

Most of the time it’s not the model or automated investment process that causes the problem, but the fact that we can’t keep ourselves from tinkering with it any time things don’t go as planned. Setting up a systematic, rules-based process only works if you follow those rules.

Far too often investors forget this fact and try to make ad hoc changes on the fly, which rarely works out. When quantitative strategies fail, it’s usually not the model that’s the problem; it’s that those running it aren’t able to control their emotions.

Wiener’s Laws could have been written specifically for the financial markets in many ways:

Every device creates its own opportunity for human error. Any financial model is only as good as the person or team using it.

Exotic devices create exotic problems. Complex strategies can create unforeseen complications.

Digital devices tune out small errors while creating opportunities for large errors. Risk comes in many forms, and some models can lead to a false sense of security if you’re not aware of the embedded risks.

Some problems have no solution. You have to choose which form of risk you want to deal with, risk now or risk in the future.

It takes an airplane to bring out the worst in a pilot. Financial markets magnify bad behavior in even some of the most intelligent people.

Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated. There’s no such thing as a perfect portfolio or process. Every strategy involves trade-offs.

There are implications that go beyond the financial world as well. As technology continues to advance we’re all going to have to get used to working with machine-based intelligence both on the job and in our personal lives.

It will be interesting to see how these types of man-machine relationships will evolve.

Read the entire Vanity Fair piece for more:
The Human Factor (Vanity Fair)

Now for the best stuff I’ve been reading this week:

  • Turney Duff: “The single best thing I’ve learned since walking away from Wall Street is: I don’t need that much.” (Café)
  • There are 2 ways to build wealth: (1) Earn more or (2) Save more. Cullen Roche explains why (1) is even more important than (2) (Prag Cap)
  • Things I know to be true (Amni Rusli)
  • 30,000 blog posts later, tons of great lessons from Barry Ritholtz (WaPo)
  • It’s easy being a long-term investor in a bull market. The real test is when things go wrong (Reformed Broker)
  • An argument for international stocks (Novel Investor)
  • A 12 step program for controlling your emotions when investing (AAII)
  • 10 things smart investors never say (Daniel Crosby)
  • What’s on Charlie Munger’s book shelf? (Favobooks)
  • Why economic terrorism is good for the consumer (Leigh Drogen)
  • What’s the half-life on your investment ideas and beliefs? (Research Puzzle)

Subscribe to receive email updates and my monthly newsletter by clicking here.

Follow me on Twitter: @awealthofcs


This content, which contains security-related opinions and/or information, is provided for informational purposes only and should not be relied upon in any manner as professional advice, or an endorsement of any practices, products or services. There can be no guarantees or assurances that the views expressed here will be applicable for any particular facts or circumstances, and should not be relied upon in any manner. You should consult your own advisers as to legal, business, tax, and other related matters concerning any investment.

The commentary in this “post” (including any related blog, podcasts, videos, and social media) reflects the personal opinions, viewpoints, and analyses of the Ritholtz Wealth Management employees providing such comments, and should not be regarded as the views of Ritholtz Wealth Management LLC. or its respective affiliates or as a description of advisory services provided by Ritholtz Wealth Management or performance returns of any Ritholtz Wealth Management Investments client.

References to any securities or digital assets, or performance data, are for illustrative purposes only and do not constitute an investment recommendation or offer to provide investment advisory services. Charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision. Past performance is not indicative of future results. The content speaks only as of the date indicated. Any projections, estimates, forecasts, targets, prospects, and/or opinions expressed in these materials are subject to change without notice and may differ or be contrary to opinions expressed by others.

The Compound Media, Inc., an affiliate of Ritholtz Wealth Management, receives payment from various entities for advertisements in affiliated podcasts, blogs and emails. Inclusion of such advertisements does not constitute or imply endorsement, sponsorship or recommendation thereof, or any affiliation therewith, by the Content Creator or by Ritholtz Wealth Management or any of its employees. Investments in securities involve the risk of loss. For additional advertisement disclaimers see here: https://www.ritholtzwealth.com/advertising-disclaimers

Please see disclosures here.
