Book Review: “Thinking in Systems” by Donella Meadows

I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated! – a quote from the book

The book “Thinking in Systems” by Donella Meadows starts unconventionally, right from the usually staid preface, and keeps the reader engaged till the end. Published posthumously in 2009, edited from a draft the author had written in 1993, this book offers a lot of insights to unpack and, I believe, can change one’s perspective on complex systems. Naturally, the examples used in the book seem “dated”, yet the key insights are practical and relevant even 30 years later. As the author herself writes in the preface – “One of my purposes is to make you interested. Another of my purposes, the main one, is to give you a basic ability to understand and to deal with complex systems.” By the time you finish the book, you’d probably agree with her!

Here are the key learnings for me:

  1. A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time
    1. A system must consist of three kinds of things: elements, interconnections, and a function or purpose
    2. The system, to a large extent, causes its own behavior! Different systems produce different results or exhibit different behavior for the same event
    3. The structure of a system is its interlocking stocks, flows, and feedback loops
      1. A stock is the memory of the history of changing flows within the system
    4. The behavior of a system cannot be known just by knowing the elements of which the system is made
    5. Systems can’t be controlled, but they can be designed and redesigned
    6. It’s easier to learn about a system’s elements than about its interconnections i.e. relationships that hold the elements together
    7. Is there anything that is not a system? Yes e.g. a conglomeration without any particular interconnections or function. Sand scattered on a road by happenstance is not a system. If you remove some of the sand, you still just have sand on the road!
    8. Not all systems have feedback loops. Also, the presence of a feedback mechanism doesn’t necessarily mean that the mechanism works well
      1. A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock
      2. The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback
      3. Balancing feedback loops are goal-seeking or stability-seeking e.g. thermostat controlled home
        • A delay in a balancing feedback loop makes a system likely to oscillate
      4. Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself e.g. population of a country
      5. Systems with similar feedback structures produce similar dynamic behaviors
      6. No physical system can grow forever in a finite environment
      7. A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time
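That last point about exponential growth is easy to check numerically. Here is a minimal sketch (the limit and starting value are my own toy numbers, not the book’s) of a reinforcing loop doubling a stock toward a fixed constraint:

```python
# Toy illustration (my numbers, not the book's): a reinforcing loop
# doubles a stock each period until it hits a fixed limit.
limit = 1_000_000   # hypothetical carrying capacity
stock = 1.0         # the stock starts at a millionth of the limit
doublings = 0
while stock < limit:
    stock *= 2      # the stock reproduces itself each period
    doublings += 1

print(doublings)    # 20 doublings take the stock from 0.0001% to 100%
```

Halfway through those 20 periods the stock is still at roughly a thousandth of the limit, which is exactly why the approach to a constraint feels so sudden.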
  2. The systems-thinking lens allows us to reclaim our intuition about whole systems and
    1. hone our abilities to understand parts,
    2. see interconnections,
    3. ask “what-if” questions about possible future behaviors, and
    4. be creative and courageous about system redesign
  3. How to know whether you are looking at a system or just a bunch of stuff:
    1. Can you identify parts? … and
    2. Do the parts affect each other? … and
    3. Do the parts together produce an effect that is different from the effect of each part on its own? … and perhaps
    4. Does the effect, the behavior over time, persist in a variety of circumstances?
  4. Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate
    1. Changing interconnections in a system can change it dramatically
    2. A stock is the memory of the history of changing flows within the system
      1. The human mind seems to focus more easily on stocks than on flows. On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows
      2. A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate. There’s more than one way to fill a bathtub!
      3. Stocks generally change slowly, even when the flows into or out of them change suddenly. Hence, stocks act as delays or buffers or shock absorbers in systems
      4. The presence of stocks allows inflows and outflows to be independent of each other and temporarily out of balance with each other (as you can sense, this is the core argument of 99% of FinInfluencers exhorting the creation of an emergency fund before doing anything else as part of an individual’s FIRE journey!)
    3. If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems
      1. Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows
      2. When a systems thinker encounters a problem, the first thing she does is look for data, time graphs, the history of the system
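The bathtub metaphor from this section translates directly into a few lines of code. A minimal sketch (the function name and numbers are mine, purely illustrative):

```python
# Bathtub sketch: a stock integrates (inflow - outflow) over time.
def simulate(inflow, outflow, stock=50.0, steps=10):
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow  # the stock changes only gradually
        history.append(stock)
    return history

# Two ways to fill the bathtub: raise the inflow, or lower the outflow.
print(simulate(inflow=8, outflow=5))  # net +3 per step
print(simulate(inflow=5, outflow=2))  # identical trajectory, net +3 per step
```

Both runs trace exactly the same path, which is the “more than one way to fill a bathtub” point: the stock responds to the net of its flows, and it is the stock that keeps the two flows decoupled in the meantime.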
  5. The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior
    1. Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems
  6. If information-based relationships are hard to see, functions or purposes are even harder
    1. The word function is generally used for a nonhuman system, the word purpose for a human one, but the distinction is not absolute, since so many systems have both human and nonhuman elements
    2. Purposes are deduced from behavior, not from rhetoric or stated goals
    3. A change in purpose changes a system profoundly, even if every element and interconnection remains the same
  7. Non-renewable resources are stock limited while renewable resources are flow limited
  8. Resilience, self-organization, and hierarchy are what make systems work well
    1. Resilience is a measure of a system’s ability to survive and persist within a variable environment
    2. Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic
    3. A set of feedback loops that can restore or rebuild feedback loops is resilience at a still higher level—meta-resilience
    4. Loss of resilience can come as a surprise, because the system usually is paying much more attention to its play than to its playing space
  9. The capacity of a system to make its own structure more complex is called self-organization
    1. Self-organization produces heterogeneity and unpredictability
    2. Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability
    3. Even complex forms of self-organization may arise from relatively simple organizing rules—or may not
    4. Complex systems can evolve from simple systems only if there are stable intermediate forms
    5. Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers
  10. Systems keep surprising us; there are three simple truths to keep in mind for getting surprised less often!
    1. Everything we think we know about the world is a model. Every word and every language is a model. Inference: being aware of this limitation is powerful to have skepticism about one’s mental models and their efficacy in real world
    2. Our models usually have a strong congruence with the world. That is why we are such a successful species in the biosphere. Inference: your intuition is powerful!
    3. Our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. Inference: keep revising your understanding of the world
    4. One reason systems of all kinds surprise us: we are too fascinated by the events they generate and pay too little attention to their history
    5. Beware of clouds! They are prime sources of system surprises. Clouds stand for the beginnings and ends of flows
  11. Event, behavior, structure. System structure is the source of system behavior. System behavior reveals itself as a series of events over time
    1. The behavior-based models are more useful than event-based ones, but they still have fundamental problems. First, they typically overemphasize system flows and underemphasize stocks. Second, and more seriously, in trying to find statistical links that relate flows to each other, they search for relationships that the system’s structure does not guarantee; flows are driven by the stocks and structure underneath, not by other flows
    2. Behavior-based econometric models are pretty good at predicting the near-term performance of the economy, quite bad at predicting the longer-term performance
  12. There is no single, legitimate boundary to draw around a system. We have to invent boundaries for clarity and sanity; and boundaries can produce problems when we forget that we’ve artificially created them
    1. Boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose
  13. There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed
  14. When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem
    1. Overshoots, oscillations, and collapses are always caused by delays
    2. What is a significant delay depends—usually—on which set of frequencies you’re trying to understand. Perhaps explains longevity of redundant corporate reviews wherein the same information is presented to stakeholders but at different frequency!
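The oscillation-from-delay point (also flagged back in the feedback-loop section) is easy to reproduce. A toy thermostat sketch, where the gain, delay, and temperatures are my own assumptions:

```python
# A balancing loop correcting a temperature toward a target of 20.
# When the correction uses a stale (delayed) reading of the stock,
# the loop overshoots the target and oscillates around it.
target = 20.0
delay = 3  # the controller sees a reading that is 3 steps old

temp, history = 10.0, [10.0]
for _ in range(40):
    stale = history[max(0, len(history) - 1 - delay)]
    temp += 0.5 * (target - stale)   # correction based on old information
    history.append(temp)

# The same loop with fresh information settles smoothly, never overshooting.
fresh, peak = 10.0, 10.0
for _ in range(40):
    fresh += 0.5 * (target - fresh)  # correction based on current information
    peak = max(peak, fresh)
```

With the delay, the trajectory climbs past 20, swings back below it, and keeps cycling; with current information it glides up to the target and stays there. Slowing the correction rate, as the book suggests, is often easier than shortening the delay itself.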
  15. Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information
    1. Quick and easy behavior changes can come, with even slight enlargement of bounded rationality, by providing better, more complete, timelier information
  16. Some systems are more than surprising. They are perverse. These are systems structured in ways that produce truly problematic behavior; such structures are called archetypes
    1. Some of the behaviors these archetypes manifest are addiction, drift to low performance, and escalation
    2. Policy resistance
      1. Such resistance to change arises when goals of subsystems are different from and inconsistent with each other
      2. One way to deal with policy resistance is to try to overpower it e.g. the ever-increasing fines by Green Tribunals in India, which don’t seem to stop the annual pollution spikes in the northern parts of the country
      3. The alternative to overpowering policy resistance is so counterintuitive that it’s usually unthinkable. Let go. Give up ineffective policies. How to increase the birth rate of a shrinking country? Would coercion work? 
      4. The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality
      5. Harmonization of goals in a system is not always possible, but it’s an option worth looking for
    3. The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource. Three ways to overcome the tragedy of the commons:
      1. Educate and exhort. Help people to see the consequences of unrestrained use of the commons
      2. Privatize the commons. Divide it up, so that each person reaps the consequences of his or her own actions. Can effluent pollution in rivers be solved by forcing cities to set up water treatment plants and draw their own supply from downstream of their discharge?
      3. Regulate the commons –  mutual coercion, mutually agreed upon. RWA rules in urban housing societies!
    4. Drift to low performance i.e. some systems not only resist policy and stay in a bad state, they keep getting worse. Electoral discourse in democracies, perhaps!
      1. In this system, there is a distinction between the actual system state and the perceived state. The actor tends to believe bad news more than good news
      2. There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst
      3. In short, we know what to do about drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute!
    5. Escalation comes from a reinforcing loop set up by competing actors trying to get ahead of each other. The goal of one part of the system or one actor is not absolute
      1. Escalation, being a reinforcing feedback loop, builds exponentially. Sibling fights in households!
      2. The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation
    6. Success to the successful or rich getting richer or competitive exclusion
      1. Species and companies sometimes escape competitive exclusion by diversifying. A species can learn or evolve to exploit new resources
    7. Shifting the burden to the intervenor, or addiction
      1. The problem can be avoided up front by intervening in such a way as to strengthen the ability of the system to shoulder its own burdens
      2. If you are the intervenor, work in such a way as to restore or enhance the system’s own ability to solve its problems, then remove yourself
    8. Rule beating i.e. evasive action to get around the intent of a system’s rules—abiding by the letter, but not the spirit, of the law
      1. Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above
      2. To solve this – design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules
    9. Seeking the wrong goal
      1. Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result
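The drift-to-low-performance archetype above lends itself to a tiny simulation. This is my own toy model, not the book’s formal one: perceived performance discounts the actual state (bad news weighed more heavily), the goal erodes toward the perceived state, and the state then follows the lowered goal:

```python
# Eroding-goals sketch (toy numbers of my own choosing).
def drift(absolute_standard, steps=50):
    state, goal = 100.0, 100.0
    for _ in range(steps):
        perceived = 0.95 * state              # bad news weighed more heavily
        if not absolute_standard:
            goal += 0.5 * (perceived - goal)  # the goal erodes
        state += 0.5 * (goal - state)         # the state follows the goal
    return state

print(drift(absolute_standard=False))  # drifts well below 100
print(drift(absolute_standard=True))   # stays at exactly 100.0
```

The `absolute_standard` flag is the first antidote from the book; the second (pegging the goal to the best past performance) would replace the `perceived` term with a running maximum instead of freezing the goal.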
  17. Leverage points – how to intervene in the system
    1. Leverage points are places in the system where a small change could lead to a large shift in behavior
    2. Leverage points are points of power and are not intuitive 
    3. Here are the tools to intervene in the system to change structure of the system such that we have more of what we desire and less of what we don’t
      1. Numbers – constants and parameters such as subsidies, taxes, and rates. Powerful in the short term perhaps
      2. Buffers: the size of stabilizing stocks relative to their flows
        • You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly
        • There’s leverage, sometimes magical, in changing the size of buffers
      3. Stock and flow structures – physical systems and their nodes of interaction
        • The only way to fix a system that is laid out poorly is to rebuild it
      4. Delays – the length of time relative to the rates of system changes
        • Delay length can be a high leverage point, except for the fact that delays are not often easily changeable. Things take as long as they take
        • It’s usually easier to slow down the change rate, so that inevitable feedback delays won’t cause so much trouble
      5. Balancing feedback loops – the strength of the feedbacks relative to the impacts they are trying to correct
        • The strength of a balancing feedback loop is important relative to the impact it is designed to correct. If the impact increases in strength, the feedbacks have to be strengthened too
      6. Reinforcing feedback loop – the strength of the gain of driving loops
        • Reducing the gain around a reinforcing loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening balancing loops, and far more preferable than letting the reinforcing loop run
      7. Information flows – the structure of who does and doesn’t have access to the information
        • Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure
        • There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops!
      8. Rules – incentives, punishments and constraints
        • Rules are high leverage points, power over rules is real power!
        • If you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them!
      9. Self-organization – the power to add, change or evolve system structure
        • The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself
        • Insistence on a single culture shuts down learning and cuts back resilience
      10. Goals – the purpose or function of the system
        • Changing the players in the system is a low-level intervention, as long as the players fit into the same old system. The exception to that rule is at the top, where a single player can have the power to change the system’s goal
      11. Paradigms – the mindset out of which the system (its goals, structure, rules, delays, parameters) arises
        • How do you change paradigms? You keep pointing to the anomalies and failures in the old paradigms or by building a model of the system which takes us outside the system and forces us to see it as a whole
      12. Transcending paradigms i.e. keeping oneself unattached in the arena of paradigms, staying flexible, realizing that no paradigm is “true”
        • If no paradigm is right, you can choose whatever one will help to achieve your purpose. If you have no idea where to get a purpose, you can listen to the universe
        • The higher the leverage point, the more the system will resist changing it
  18. How to live in a world of systems?
    1. Get the beat of the system. Before you disturb the system in any way, watch how it behaves
      1. Listen to the wisdom of the system
      2. Locate responsibility in the system
      3. Celebrate complexity 
    2. Expand the time horizon. The longer the time horizon, the better the chances of survival
      1. In a strict systems sense, there is no distinction between the short term and the long term
    3. Starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution
    4. Remember, always, that everything you know, and everything everyone knows, is only a model
      1. Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior
    5. Honor, respect and distribute information. Most of what goes wrong in systems goes wrong because of biased, late, or missing information
      1. Honoring information means above all avoiding language pollution. Second, it means expanding our language so we can talk about complexity
    6. Use language with care and enrich it with systems concepts
    7. Pay attention to what is important, not just what is quantifiable
      1. If something is ugly, say so. Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy
    8. Make feedback policies for feedback systems
      1. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties
    9. Go for the good of the whole
      1. Don’t go to great trouble to optimize something that never should be done at all. Elon Musk has referenced something similar in one of his very popular interviews!
      2. Defy the disciplines. See things as interdisciplinary!
    10. Expand the boundary of caring
      1. Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all, it means expanding the horizons of caring
    11. Don’t erode the goal of goodness

If you were to read only one paragraph from the book, this is probably it:

The bounded rationality of each actor in a system—determined by the information, incentives, disincentives, goals, stresses, and constraints impinging on that actor—may or may not lead to decisions that further the welfare of the system as a whole. If they do not, putting new actors into the same system will not improve the system’s performance. What makes a difference is redesigning the system to improve the information, incentives, disincentives, goals, stresses, and constraints that have an effect on specific actors

The book can be ordered from Amazon here!
