- Systems, big or small, can behave in similar ways, and understanding those ways is perhaps our best hope for making lasting change on many levels.
- Once we see the relationship between structure and behavior, we can begin to understand how systems work, what makes them produce poor results, and how to shift them into better behavior patterns.
- A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time.
- The system, to a large extent, causes its own behavior!
- Every person we encounter, every organization, every animal, garden, tree, and forest is a complex system.
- The behavior of a system cannot be known just by knowing the elements of which the system is made.
- A system isn’t just any old collection of things. A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
- Systems can be embedded in systems, which are embedded in yet other systems.
- A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.
- Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate.
- A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system.
- The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
- An important function of almost every system is to ensure its own perpetuation.
- Changing elements usually has the least effect on the system.
- The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.
- A stock is the memory of the history of changing flows within the system.
- If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.
- All models, whether mental models or mathematical models, are simplifications of the real world.
- A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate.
- A stock takes time to change, because flows take time to flow. That’s a vital point, a key to understanding why systems behave as they do. Stocks usually change slowly.
- Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems.
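A minimal sketch of this buffering behavior, in Python (the bathtub numbers are hypothetical, not from the book): the inflow doubles abruptly at t = 5, yet the stock's level moves only one step's worth of net flow at a time.

```python
# A minimal sketch (hypothetical values): a stock integrating its flows.
# The inflow doubles suddenly at t = 5, but the stock adjusts only
# gradually, one step's net flow at a time: stocks buffer sudden changes.

stock = 100.0           # e.g., liters in a bathtub
outflow = 10.0          # constant drain per time step

for t in range(10):
    inflow = 10.0 if t < 5 else 20.0   # sudden step change in the inflow
    stock += inflow - outflow          # net flow accumulates into the stock
    print(f"t={t}: inflow={inflow:.0f}, stock={stock:.0f}")
```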
- Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.
- Human beings have invented hundreds of stock-maintaining mechanisms to make inflows and outflows independent and stable.
- Most individual and institutional decisions are designed to regulate the levels in stocks. If inventories rise too high, then prices are cut or advertising budgets are increased, so that sales will go up and inventories will fall.
- Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows. That means systems thinkers see the world as a collection of “feedback processes.”
- When a stock grows by leaps and bounds or declines swiftly or is held within a certain range no matter what else is going on around it, it is likely that there is a control mechanism at work. In other words, if you see a behavior that persists over time, there is likely a mechanism creating that consistent behavior. That mechanism operates through a feedback loop. It is the consistent behavior pattern over a long period of time that is the first hint of the existence of a feedback loop.
- A feedback loop is formed when changes in a stock affect the flows into or out of that same stock. A feedback loop can be quite simple and direct.
- Not all systems have feedback loops. Some systems are relatively simple open-ended chains of stocks and flows. The chain may be affected by outside factors, but the levels of the chain’s stocks don’t affect its flows.
- A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
- Remember—all system diagrams are simplifications of the real world.
- Balancing feedback loops are goal-seeking or stability-seeking.
- A balancing feedback loop opposes whatever direction of change is imposed on the system.
- Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
- The second kind of feedback loop is amplifying, reinforcing, self-multiplying, snowballing—a vicious or virtuous circle that can cause healthy growth or runaway destruction. It is called a reinforcing feedback loop.
- A reinforcing feedback loop enhances whatever direction of change is imposed on it.
- Reinforcing loops are found wherever a system element has the ability to reproduce itself or to grow as a constant fraction of itself.
- Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.
- Because we bump into reinforcing loops so often, it is handy to know this shortcut: The time it takes for an exponentially growing stock to double in size, the “doubling time,” equals approximately 70 divided by the growth rate (expressed as a percentage).
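The rule of 70 follows from solving for the doubling time of exponential growth; a sketch of the derivation, with r the fractional growth rate per period:

```latex
% Doubling time t of a stock growing at rate r per period:
(1+r)^{t} = 2
\quad\Longrightarrow\quad
t = \frac{\ln 2}{\ln(1+r)} \approx \frac{0.693}{r}
\quad \text{for small } r
```

With the rate expressed as a percentage, this gives t ≈ 70/r; for example, a stock growing at 7 percent per year doubles in about 70/7 = 10 years.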
- One good way to learn something new is through specific examples rather than abstractions and generalities.
- The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct the behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.
- The specific principle you can deduce from this simple system is that, in thermostat-like systems, you must take into account whatever draining or filling processes are going on. If you don’t, you won’t achieve the target level of your stock.
- A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
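A minimal sketch of that principle (hypothetical values, not from the book): a thermostat-like balancing loop warming a room that steadily leaks heat. Aiming exactly at the target leaves the stock short by drain/gain; raising the goal by that amount compensates.

```python
# A minimal sketch (hypothetical values): a balancing loop raising a stock
# toward a goal while a constant drain lowers it every step. The stock
# settles at goal - drain/gain, so the goal must be set above the target.

def settle(goal, drain=2.0, gain=0.5, steps=50):
    temperature = 10.0                         # the stock being regulated
    for _ in range(steps):
        heating = gain * (goal - temperature)  # loop acts to close the gap
        temperature += heating - drain         # the drain acts regardless
    return temperature

print(settle(goal=20.0))  # settles near 16.0, four degrees short of target
print(settle(goal=24.0))  # goal raised by drain/gain = 4: settles near 20.0
```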
- Dominance is an important concept in systems thinking. When one loop dominates another, it has a stronger impact on behavior. Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine the behavior.
- Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
- System dynamics models explore possible futures and ask “what if” questions.
- A model’s utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.
- Physical capital is drained by depreciation—obsolescence and wearing out.
- Systems with similar feedback structures produce similar dynamic behaviors.
- A delay in a balancing feedback loop makes a system likely to oscillate.
- Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may (or may not, depending on the type of delay and the relative lengths of other delays) make a large change in the behavior of a system.
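A minimal sketch of a delay-induced oscillation (hypothetical values): the same kind of goal-seeking loop, except that it reacts to a reading of the stock taken two steps earlier. The stale signal causes repeated overshoot, and the stock oscillates around the goal before settling.

```python
# A minimal sketch (hypothetical values): a balancing loop acting on a
# delayed reading of its stock. Over-correction from the stale signal
# makes the stock overshoot and oscillate around the goal as it settles.

from collections import deque

goal, gain, delay = 100.0, 0.4, 2
stock = 50.0
readings = deque([stock] * delay, maxlen=delay)  # readings 'delay' steps old

for t in range(25):
    perceived = readings[0]             # what the decision-maker sees now
    readings.append(stock)              # today's level, seen 'delay' later
    stock += gain * (goal - perceived)  # correct toward the goal
    print(f"t={t}: stock={stock:.1f}")
```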
- Economies are extremely complex systems; they are full of balancing feedback loops with delays, and they are inherently oscillatory.
- But any real physical entity is always surrounded by and exchanging things with its environment.
- Therefore, any physical, growing system is going to run into some kind of constraint, sooner or later. That constraint will take the form of a balancing loop that in some way shifts the dominance of the reinforcing loop driving the growth behavior, either by strengthening the outflow or by weakening the inflow.
- In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.
- Profit is income minus cost.
- A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.
- Nonrenewable resources are stock-limited. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource.
- Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
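A minimal sketch of a flow-limited resource (hypothetical values, assuming logistic regeneration, which the text does not specify): regrowth peaks at r × capacity / 4, here 25 per year. Harvesting below that peak can continue indefinitely; harvesting above it drives the stock to zero.

```python
# A minimal sketch (hypothetical values, logistic regeneration assumed):
# regeneration peaks at r*capacity/4 = 25 per year. A constant harvest
# below that peak is sustainable; one above it collapses the stock.

def remaining(harvest, stock=500.0, capacity=1000.0, r=0.1, years=100):
    for _ in range(years):
        regeneration = r * stock * (1 - stock / capacity)
        stock = max(stock + regeneration - harvest, 0.0)
    return stock

print(remaining(harvest=20.0))  # persists, approaching a stable ~724
print(remaining(harvest=30.0))  # exceeds peak regeneration: falls to 0.0
```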
- If pushed too far, systems may well fall apart or exhibit heretofore unobserved behavior. But, by and large, they manage quite well.
- Placing a system in a straitjacket of constancy can cause fragility to evolve.
- Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.
- There are always limits to resilience.
- Systems need to be managed not only for productivity or stability, they also need to be managed for resilience—the ability to recover from perturbation, the ability to restore or repair themselves.
- Awareness of resilience enables one to see many ways to preserve or enhance a system’s own restorative powers.
- This capacity of a system to make its own structure more complex is called self-organization.
- Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder. These conditions that encourage self-organization often can be scary for individuals and threatening to power structures.
- Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify. Even complex forms of self-organization may arise from relatively simple organizing rules—or may not.
- Science knows now that self-organizing systems can arise from simple rules.
- Complex systems can evolve from simple systems only if there are stable intermediate forms.
- When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization.
- Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
- Everything we think we know about the world is a model.
- Our models usually have a strong congruence with the world.
- We can improve our understanding, but we can’t make it perfect.
- Our models fall far short of representing the real world fully.
- The behavior of a system is its performance over time—its growth, stagnation, decline, oscillation, randomness, or evolution.
- When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system. That’s because long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.
- The structure of a system is its interlocking stocks, flows, and feedback loops.
- System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
- A linear relationship between two elements in a system can be drawn on a graph with a straight line. It’s a relationship with constant proportions.
- A nonlinear relationship is one in which the cause does not produce a proportional effect. The relationship between cause and effect can only be drawn with curves or wiggles, not with a straight line.
- Nonlinearities are important not only because they confound our expectations about the relationship between action and response. They are even more important because they change the relative strengths of feedback loops. They can flip a system from one mode of behavior to another.
- Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.
- Everything, as they say, is connected to everything else, and not neatly. There is no clearly determinable boundary between the sea and the land, between sociology and anthropology, between an automobile’s exhaust and your nose. There are only boundaries of word, thought, perception, and social agreement—artificial, mental-model boundaries.
- The greatest complexities arise exactly at boundaries.
- Everything physical comes from somewhere, everything goes somewhere, everything keeps moving.
- The lesson of boundaries is hard even for systems thinkers to get. There is no single, legitimate boundary to draw around a system. We have to invent boundaries for clarity and sanity; and boundaries can produce problems when we forget that we’ve artificially created them.
- There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.
- The right boundary for thinking about a problem rarely coincides with the boundary of an academic discipline, or with a political boundary.
- It’s a great art to remember that boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose.
- Systems surprise us because our minds like to think about single causes neatly producing single effects. We like to think about one or at most a few things at a time. And we don’t like, especially when our own plans and desires are involved, to think about limits. But we live in a world in which many causes routinely come together to produce many effects. Multiple inputs produce multiple outputs, and virtually all of the inputs, and therefore outputs, are limited.
- At any given time, the input that is most important to a system is the one that is most limiting.
- Insight comes not only from recognizing which factor is limiting, but from seeing that growth itself depletes or enhances limits and therefore changes what is limiting.
- Any physical entity with multiple inputs and outputs—a population, a production process, an economy—is surrounded by layers of limits.
- Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
- No physical entity can grow forever.
- There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
- Delays are ubiquitous in systems.
- Changing the length of a delay may utterly change behavior.
- Delays determine how fast systems can react, how accurately they hit their targets, and how timely is the information passed around a system.
- When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.
- Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
- We are not omniscient, rational optimizers, says Simon. Rather, we are blundering “satisficers,” attempting to meet (satisfy) our needs well enough (sufficiently) before moving on to the next decision. We do our best to further our own nearby interests in a rational way, but we can take into account only what we know. We don’t know what others are planning to do, until they do it. We rarely see the full range of possibilities before us.
- We often don’t foresee (or choose to ignore) the impacts of our actions on the whole system. So instead of finding a long-term optimum, we discover within our limited purview a choice we can live with for now, and we stick to it, changing our behavior only when forced to.
- We misperceive risk, assuming that some things are much more dangerous than they really are and others much less. We live in an exaggerated present—we pay too much attention to recent experience and too little attention to the past, focusing on current events rather than long-term behavior. We discount the future at rates that make no economic or ecological sense. We don’t give all incoming signals their appropriate weights. We don’t let in at all news we don’t like, or information that doesn’t fit our mental models. Which is to say, we don’t even make decisions that optimize our own individual good, much less the good of the system as a whole.
- If you become a manager, you probably will stop seeing labor as a deserving partner in production, and start seeing it as a cost to be minimized.
- Seeing how individual decisions are rational within the bounds of the information available does not provide an excuse for narrow-minded behavior.
- Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires.
- It’s amazing how quickly and easily behavior changes can come, with even slight enlargement of bounded rationality, by providing better, more complete, timelier information.
- The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.
- The world is nonlinear. Trying to make it linear for our mathematical or administrative convenience is not usually a good idea even when feasible, and it is rarely feasible.
- Balancing loops stabilize systems; behavior patterns persist.
- Policy resistance comes from the bounded rationalities of the actors in a system, each with his or her (or “its” in the case of an institution) own goals.
- One way to deal with policy resistance is to try to overpower it. If you wield enough power and can keep wielding it, the power approach can work, at the cost of monumental resentment and the possibility of explosive consequences if the power is ever let up.
- The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing.
- Harmonization of goals in a system is not always possible, but it’s worth looking for. It can be found only by letting go of more narrow goals and considering the long-term welfare of the entire system.
- The trap called the tragedy of the commons comes about when there is escalation, or just simple growth, in a commonly shared, erodable environment.
- The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
- To be effective, regulation must be enforced by policing and penalties.
- Some systems not only resist policy and stay in a normal bad state, they keep getting worse. One name for this archetype is “drift to low performance.”
- There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst.
- Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
- Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
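A minimal sketch of both drifts (hypothetical numbers, not from the book): performance varies around the standard and is perceived with a negative bias. Anchoring the standard to perceived past performance ratchets it downward; anchoring it to best-ever performance ratchets it upward.

```python
# A minimal sketch (hypothetical values): eroding vs. enhancing goals.
# Performance varies around the standard; perception is biased low.
# Following perceived past performance drifts the standard downward;
# following the best performance ever achieved drifts it upward.

import random

def drift(enhance, goal=100.0, steps=30, seed=1):
    rng = random.Random(seed)
    for _ in range(steps):
        performance = goal * rng.uniform(0.85, 1.05)  # varies around goal
        perceived = 0.95 * performance                # negative bias
        goal = max(goal, performance) if enhance else perceived
    return goal

print(f"standard follows perceived past: {drift(False):.1f}")  # erodes
print(f"standard follows best-ever:      {drift(True):.1f}")   # ratchets up
```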
- Escalation, being a reinforcing feedback loop, builds exponentially. Therefore, it can carry a competition to extremes faster than anyone would believe possible. If nothing is done to break the loop, the process usually ends with one or both of the competitors breaking down.
- When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever.
- The best way out of this trap is to avoid getting in it. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing loop. Or one can negotiate a new system with balancing loops to control the escalation.
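A minimal sketch of escalation (hypothetical numbers): each side's goal is to be 10 percent ahead of the other, and each closes half the gap every round. The two coupled goal-seeking loops form a single reinforcing loop, so both stocks grow exponentially.

```python
# A minimal sketch (hypothetical values): escalation. Each side seeks to
# be 10% ahead of the other; the coupled goal-seeking loops form one
# reinforcing loop, and both stocks spiral upward without limit.

a, b = 100.0, 100.0
for round_ in range(12):
    a += 0.5 * (1.10 * b - a)   # A closes half the gap to 110% of B
    b += 0.5 * (1.10 * a - b)   # B closes half the gap to 110% of A
    print(f"round {round_}: A={a:.0f}, B={b:.0f}")
# Refusing to compete (freezing one side's goal) interrupts the spiral.
```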
- Success to the successful is a well-known concept in the field of ecology, where it is called “the competitive exclusion principle.” This principle says that two different species cannot live in exactly the same ecological niche, competing for exactly the same resources. Because the two species are different, one will necessarily reproduce faster, or be able to use the resource more efficiently than the other. It will win a larger share of the resource, which will give it the ability to multiply more and keep winning. It will not only dominate the niche, it will drive the losing competitor to extinction. That will happen not by direct confrontation usually, but by appropriating all the resource, leaving none for the weaker competitor.
- The trap of success to the successful does its greatest damage in the many ways it works to make the rich richer and the poor poorer.
- Species and companies sometimes escape competitive exclusion by diversifying. A species can learn or evolve to exploit new resources. A company can create a new product or service that does not directly compete with existing ones.
- The success-to-the-successful loop can be kept under control by putting into place feedback loops that keep any competitor from taking over entirely.
- The most obvious way out of the success-to-the-successful archetype is by periodically “leveling the playing field.”
- If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
- Diversification, which allows those who are losing the competition to get out of that game and start another one; strict limitation on the fraction of the pie any one winner may win (antitrust laws); policies that level the playing field, removing some of the advantage of the strongest players or increasing the advantage of the weakest; policies that devise rewards for success that do not bias the next round of competition.
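A minimal sketch of the compounding (hypothetical numbers): a fixed pool of rewards is split in proportion to each player's current holdings while both pay the same fixed cost, so a small starting edge doubles every round.

```python
# A minimal sketch (hypothetical values): success to the successful.
# Each round splits 100 units of reward by current share of holdings,
# while both players pay an equal fixed cost. The gap between them
# doubles every round, so the early leader pulls away.

a, b = 51.0, 49.0                      # a nearly even start
for round_ in range(5):
    total = a + b
    a += 100.0 * (a / total) - 50.0    # size-weighted reward, equal cost
    b += 100.0 * (b / total) - 50.0
    print(f"round {round_}: A={a:.0f}, B={b:.0f}")
# Ends at A=82, B=18; one more round would wipe B out entirely.
```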
- Addiction is finding a quick and dirty solution to the symptom of the problem, which prevents or distracts one from the harder and longer-term task of solving the real problem. Addictive policies are insidious, because they are so easy to sell, so simple to fall for.
- It’s worth going through the withdrawal to get back to an unaddicted state, but it is far preferable to avoid addiction in the first place.
- THE TRAP: SHIFTING THE BURDEN TO THE INTERVENOR. Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in motion. The system deteriorates; more and more of the solution is then required. The system will become more and more dependent on the intervention and less and less able to maintain its own desired state.
- THE WAY OUT: Again, the best way out of this trap is to avoid getting in. Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring.
- Wherever there are rules, there is likely to be rule beating. Rule beating means evasive action to get around the intent of a system’s rules—abiding by the letter, but not the spirit, of the law.
- Notice that rule beating produces the appearance of rules being followed.
- Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above. There are two generic responses to rule beating. One is to try to stamp out the self-organizing response by strengthening the rules or their enforcement—usually giving rise to still greater system distortion. That’s the way further into the trap. The way out of the trap, the opportunity, is to understand rule beating as useful feedback, and to revise, improve, rescind, or better explain the rules.
- THE TRAP: RULE BEATING. Rules to govern a system can lead to rule beating—perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.
- THE WAY OUT: Design, or redesign, rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.
- Although there is every reason to want a thriving economy, there is no particular reason to want the GNP to go up. But governments around the world respond to a signal of faltering GNP by taking numerous actions to keep it growing. Many of those actions are simply wasteful, stimulating inefficient production of things no one particularly wants.
- Seeking the wrong goal, satisfying the wrong indicator, is a system characteristic almost opposite from rule beating. In rule beating, the system is out to evade an unpopular or badly designed rule, while giving the appearance of obeying it. In seeking the wrong goal, the system obediently follows the rule and produces its specified result—which is not necessarily what anyone actually wants.
- You have the problem of wrong goals when you find something stupid happening “because it’s the rule.” You have the problem of rule beating when you find something stupid happening because it’s the way around the rule. Both of these system perversions can be going on at the same time with regard to the same rule.
- THE TRAP: SEEKING THE WRONG GOAL. System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.
- THE WAY OUT: Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.
- Leverage points are points of power.
- The world’s leaders are correctly fixated on economic growth as the answer to virtually all problems, but they’re pushing with all their might in the wrong direction.
- Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.
- As systems become complex, their behavior can become surprising.
- You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly.
- Delays in feedback loops are critical determinants of system behavior. They are common causes of oscillations.
- A complex system usually has numerous balancing feedback loops it can bring into play, so it can self-correct under different conditions and impacts.
- A balancing feedback loop is self-correcting; a reinforcing feedback loop is self-reinforcing. The more it works, the more it gains power to work some more, driving system behavior in one direction.
- Reinforcing feedback loops are sources of growth, explosion, erosion, and collapse in systems. A system with an unchecked reinforcing loop ultimately will destroy itself. That’s why there are so few of them. Usually a balancing loop will kick in sooner or later.
- Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure.
- There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops—and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen (or go around them and make it happen anyway).
- Constitutions are the strongest examples of social rules. Physical laws such as the second law of thermodynamics are absolute rules, whether we understand them or not or like them or not. Laws, punishments, incentives, and informal social agreements are progressively weaker rules.
- Power over the rules is real power.
- The most stunning thing living systems and some social systems can do is to change themselves utterly by creating whole new structures and behaviors.
- The ability to self-organize is the strongest form of system resilience.
- A system that can evolve can survive almost any change, by changing itself.
- Magical leverage points are not easily accessible, even if we know where they are and which direction to push on them. There are no cheap tickets to mastery. You have to work hard at it, whether that means rigorously analyzing a system or rigorously casting off your own paradigms and throwing yourself into the humility of not-knowing.
- Social systems are the external manifestations of cultural thinking patterns and of profound human needs, emotions, strengths, and weaknesses. Changing them is not as simple as saying “now all change,” or of trusting that he who knows the good shall do the good.
- Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way.
- Before you disturb the system in any way, watch how it behaves.
- Starting with the behavior of the system forces you to focus on facts, not theories.
- Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own.
- Thou shalt not distort, delay, or withhold information.
- You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
- Information is power. Anyone interested in power grasps that idea very quickly.
- Our information streams are composed primarily of language. Our mental models are mostly verbal. Honoring information means above all avoiding language pollution—making the cleanest possible use we can of language. Second, it means expanding our language so we can talk about complexity.
- In fact, we don’t talk about what we see; we see only what we can talk about.
- The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible—part of the job of keeping information streams clear. The second step is to enlarge language to make it consistent with our enlarged understanding of systems.
- Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure.
- Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models.
- Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector.
- Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole.
- The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error.
- Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes.
- No part of the human race is separate either from other human beings or from the global ecosystem.
- A system is more than the sum of its parts.
- Many of the interconnections in systems operate through the flow of information.
- The least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.
- System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
- A stock is the memory of the history of changing flows within the system.
- If the sum of inflows exceeds the sum of outflows, the stock level will rise.
- If the sum of outflows exceeds the sum of inflows, the stock level will fall.
- If the sum of outflows equals the sum of inflows, the stock level will not change — it will be held in dynamic equilibrium.
- A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate.
- Stocks act as delays or buffers or shock absorbers in systems.
- Stocks allow inflows and outflows to be de-coupled and independent.
- A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
- Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
- Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time.
- The information delivered by a feedback loop—even nonphysical feedback—can affect only future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback.
- A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
- Systems with similar feedback structures produce similar dynamic behaviors.
- Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
- A delay in a balancing feedback loop makes a system likely to oscillate.
- Changing the length of a delay may make a large change in the behavior of a system.
- System dynamics models explore possible futures and ask “what if” questions.
- A model’s utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.
- In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no system can grow forever in a finite environment.
- Nonrenewable resources are stock-limited.
- Renewable resources are flow-limited.
- There are always limits to resilience.
- Systems need to be managed not only for productivity or stability, they also need to be managed for resilience.
- Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify.
- Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
- Many relationships in systems are nonlinear.
- There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion.
- At any given time, the input that is most important to a system is the one that is most limiting.
- Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
- There always will be limits to growth.
- A quantity growing exponentially toward a limit reaches that limit in a surprisingly short time.
- When there are long delays in feedback loops, some sort of foresight is essential.
- The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.
- Everything we think we know about the world is a model.
- Our models do have a strong congruence with the world.
- Our models fall far short of representing the real world fully.
THINKING IN SYSTEMS: A PRIMER by Donella H. Meadows, Diana Wright