“Perfection of planning is a symptom of decay.” – C. Northcote Parkinson
In the lead-up to World War II, the French built an intricate series of concrete fortifications along their border with Germany to prevent an invasion. The Maginot Line was something of an underground fortress, created in response to the brutal trench warfare of the First World War.
It was mostly protected from aerial strikes and tank attacks and boasted state-of-the-art living conditions underground. Many of the troops said their living conditions were better than those in some French cities. There was air conditioning, recreation areas, rail lines for transporting cargo, and walls thicker than anything seen in World War I.
France was outnumbered by the Germans but felt pretty safe with this modern form of trench warfare.
Unfortunately, the Germans simply went around these fortifications, attacking through Belgium. Since the Germans attacked from behind the Maginot Line, all of the cannons the French had set up were pointed the wrong way.
It was only then that the French generals realized that the cannons couldn’t be turned around, rendering their defenses useless.
This is an obvious case of fighting the last war (literally) but it’s also a lesson in the law of unintended consequences.
When you implement any sort of complex system such as this, there are bound to be trade-offs. Nothing functions perfectly in these situations because if it did, that’s just what everyone would do. There would be no need for strategy if the perfect solution was readily available.
In his book, The Systems Bible, John Gall shares more examples of plans gone awry in unexpected ways:
- Insecticides, introduced to control disease and improve crop yields, turn up in the fat pads of Auks in the Antipodes and in the eggs of Ospreys in the Orkneys, resulting in incalculable ecologic damage. Meanwhile, the insects develop immunity to insecticides, even learning to thrive on them.
- The Aswan Dam, built at enormous expense to improve the lot of the Egyptian peasant, has caused the Nile to deposit its fertilizing sediment in Lake Nasser, where it is unavailable. Egyptian fields must now be artificially fertilized. Gigantic fertilizer plants have been built to meet the new need. The plants require enormous amounts of electricity. The dam must operate at capacity merely to supply the increased need for electricity which was created by the building of the dam.
- The Queen Elizabeth II, greatest ocean liner ever built, has three separate sets of boilers for safety, reliability, and speed. Yet on a recent cruise, in fine weather and a calm sea, all three sets of boilers failed simultaneously.
- The largest telescope in the world, a 230-inch reflector, takes so long to reach thermal equilibrium with the surrounding air that the night is over before it can focus a star image.
Not only are complex systems difficult to plan for, but adding the human element complicates matters even further.
Tyler Cowen recently interviewed Marc Andreessen and Ben Horowitz, co-founders of the venture capital firm a16z. Cowen asked Horowitz what the best predictor of managerial intelligence is. Horowitz described two skills: (1) being a systems thinker and (2) understanding other people:
It’s two skills that normally don’t go together. It’s systems thinking. Most people are not systems thinkers, meaning they cannot think about, “OK, if I change this here then it’s gonna affect that over there.”
Can you actually see the people in your organization? Do you know who they are? As opposed to you’re talking to them like they’re you. Meaning: do you understand their motivation? Do you understand what they would think about something if they were in the room and you’re making a decision? Can you interpret them well enough so it’s as though they’re there? And can you understand the implications through the eyes of the people who work for you?
Horowitz wrote one of the best business and managerial books I’ve ever read, The Hard Thing About Hard Things. My main takeaway from the book is that none of this stuff is ever easy:
There’s no recipe for really complicated, dynamic situations. There’s no recipe for building a high-tech company; there’s no recipe for leading a group of people out of trouble; there’s no recipe for making a series of hit songs; there’s no recipe for playing NFL quarterback; there’s no recipe for running for president; and there’s no recipe for motivating teams when your business has turned to crap. That’s the hard thing about hard things—there is no formula for dealing with them.
The hard thing about hard things is true in business and in life. There’s no handbook that tells you:
- when to give up and when to be patient.
- when to do more and when to do less.
- when to say yes and when to say no to opportunities.
- the difference between skill and luck.
- when it makes sense to take the job with more money and when it makes sense to take the one with more meaning.
- what’s a fat pitch and what’s a falling knife.
- when to put your foot on the gas and when to pump the brakes in your business.
- when to fire someone and when to give them a second chance.
- when to follow a model religiously and when to make some tweaks.
There are no shortcuts. There isn’t a textbook that can tell you exactly what to do in every situation. The conventional wisdom is often wrong. Making decisions based on past risks could work, but it could also blow up in your face.
Maybe half the battle is understanding that there’s always going to be a battle in whatever it is you do and embracing that fact instead of shying away from it. There’s also something to be said for tackling uncertainty head on and building it into your expectations about the future.
Systems aren’t perfect because the human element often renders them useless at the extremes. And people aren’t perfect, which is why systems are useful in the first place: they allow you to make decisions without emotions getting in the way.
Successful organizations require a combination of well-thought-out systems and people skills. But individuals also need to integrate these two skills to be successful.