  • COMPLEXITY THEORY

    BASIC CONCEPTS AND APPLICATION TO SYSTEMS THINKING

    March 27, 1994

    John Cleveland
    Innovation Network For Communities

  • A Short Introduction to Complex Adaptive Systems On the Edge of Chaos

    The field of complex adaptive systems theory (also known as complexity theory) seeks to understand how order emerges in complex, non-linear systems such as galaxies, ecologies, markets, social systems and neural networks. Complexity scientists suggest that living systems migrate to a state of dynamic stability they call the edge of chaos. Mitchell Waldrop provides a description of the edge of chaos in his book, Complexity:

    The balance point -- often called the edge of chaos -- is where the components of a system never quite lock into place, and yet never quite dissolve into turbulence either. . . The edge of chaos is where life has enough stability to sustain itself and enough creativity to deserve the name of life. The edge of chaos is where new ideas and innovative genotypes are forever nibbling away at the edges of the status quo, and where even the most entrenched old guard will eventually be overthrown. The edge of chaos is where centuries of slavery and segregation suddenly give way to the civil rights movement of the 1950s and 1960s; where seventy years of Soviet communism suddenly give way to political turmoil and ferment; where eons of evolutionary stability suddenly give way to wholesale species transformation. The edge is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive and alive.

    Systems on the edge are notable for a hunger for novelty and disequilibrium that distinguishes them from rigidly ordered systems. At the same time, however, they also possess a deep underlying coherence that provides structure and continuity, and distinguishes them from chaotic systems. Theorists use words like integrity, identity, persistent structure and self-reference to describe this opposite characteristic. Systems that evolve along the edge of chaos periodically re-integrate into structures with temporary stability, which bear recognizable resemblance to the string of predecessor structures. They are free enough to change, but stable enough to stay recognizable.

    Complexity scientists have identified several characteristics that distinguish edge of chaos systems from systems that are either locked in rigid order, or too chaotic for any stability to emerge. These include:

    Autonomous agents. Like a swarm of bees, a flock of birds, or a healthy market, these systems are made up of many individual actors who make choices about how to act based on information in their local environment. All the agents make choices simultaneously (parallel processing), both influencing and limiting each other's actions.

    Networked structure. The agents don't act randomly. They share some common rules about how they decide what to do next. At the level of matter, these common rules are the laws of nature (gravity, electromagnetism, etc.). At the level of conscious actors, these are decision-making rules (preferences, interests, desires, etc.). These rules connect the agents together and allow a global coherence to emerge without any central source of direction -- the swarm has velocity, shape, direction and density that do not reside in any individual agent. The rules used by agents evolve based on their successfulness in the changing environment. The connections between agents in edge of chaos systems are moderately dense -- not so interconnected that the system freezes up, and not so disconnected that it disintegrates into chaos. (A minimal illustrative sketch of this kind of local-rule coherence appears after this list.)

    Profuse experimentation. These edge of chaos systems are full of novelty and experimentation. They have a quality of dynamic stability that is characterized by occasional rapid and unpredictable shifts in shape and direction. They can react to small changes in big and surprising ways (rumors fly like lightning; a mob forms; the market crashes; the hive swarms). Such systems can communicate almost instantaneously, experiment with dozens of possible responses if they encounter a roadblock, and rapidly exploit solutions when one is found.
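
    The sketch below is not part of the original document; it is a minimal, hypothetical "boids"-style simulation in Python illustrating the point above: each agent follows only local rules (drift toward nearby agents, match their heading, avoid crowding), all agents update in parallel, and a coherent group heading emerges that resides in no individual agent. The rule weights and the coherence measure are illustrative assumptions.

```python
import math
import random

random.seed(1)

class Agent:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(agents, radius=25.0):
    """All agents decide at once from local information only (parallel processing)."""
    updates = []
    for a in agents:
        neighbors = [b for b in agents
                     if b is not a and math.hypot(b.x - a.x, b.y - a.y) < radius]
        vx, vy = a.vx, a.vy
        if neighbors:
            # cohesion: drift toward the local center of the neighbors
            cx = sum(b.x for b in neighbors) / len(neighbors)
            cy = sum(b.y for b in neighbors) / len(neighbors)
            vx += 0.01 * (cx - a.x)
            vy += 0.01 * (cy - a.y)
            # alignment: nudge the heading toward the neighbors' average heading
            vx += 0.05 * (sum(b.vx for b in neighbors) / len(neighbors) - a.vx)
            vy += 0.05 * (sum(b.vy for b in neighbors) / len(neighbors) - a.vy)
            # separation: back away from any neighbor that is too close
            for b in neighbors:
                if math.hypot(b.x - a.x, b.y - a.y) < 5.0:
                    vx += 0.05 * (a.x - b.x)
                    vy += 0.05 * (a.y - b.y)
        speed = math.hypot(vx, vy)
        if speed > 2.0:                              # keep speeds bounded
            vx, vy = 2.0 * vx / speed, 2.0 * vy / speed
        updates.append((vx, vy))
    for a, (vx, vy) in zip(agents, updates):         # apply all updates simultaneously
        a.vx, a.vy = vx, vy
        a.x, a.y = a.x + vx, a.y + vy

def coherence(agents):
    """Length of the average heading vector: 1.0 means every agent points the same way."""
    sx = sum(a.vx / math.hypot(a.vx, a.vy) for a in agents)
    sy = sum(a.vy / math.hypot(a.vx, a.vy) for a in agents)
    return math.hypot(sx, sy) / len(agents)

flock = [Agent() for _ in range(30)]
for t in range(201):
    if t % 50 == 0:
        print(f"step {t:3d}  coherence {coherence(flock):.2f}")   # typically rises as the swarm aligns
    step(flock)
```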

    In describing the edge of chaos, complexity scientists have documented and analyzed qualities that humans have sought in their systems for some time. A vibrant democracy is an edge of chaos form of governance; a healthy market is an edge of chaos form of economics; a flexible and adaptive organization is an edge of chaos institution; and a mature, well-developed personality is an edge of chaos psyche.

    In many of our systems, however, we have created forms of organization that are locked in rigid order and incapable of adaptable evolution (e.g. bureaucracies, monopolies, dictatorships). These forms of social control were often responses to situations that were previously too chaotic. In multiple sectors of society, we now see a migration from both extremes of incoherent chaos and rigid order towards the middle edge of chaos where systems have the capacity to grow, learn and evolve.

    The attached materials describe some of the basic concepts of complex adaptive systems theory.

  • 1. TYPES OF SYSTEMS

    The Basic Concept:

    Scientists and others use many different labels to describe different kinds of systems. These terms can be confusing to the non-specialist. It is helpful to understand the "taxonomy" of system types in order to understand what complex adaptive systems are and are not.

    Discussion:

    These are the most commonly used terms for different kinds of systems. They are loosely listed in order from least complex to most complex.

    Entropy -- No System. Entropy is the condition in which there is no usable energy in the system, no connections between the elements of the system, and no observable structure.

    Closed Systems. A closed system is a system that does not import or export energy across its boundaries. The only truly closed system is the universe as a whole. Traditional physics deals with systems that are presumed to be closed systems. In closed systems, the final state of the system is determined by its initial conditions. All closed systems move to a condition of equilibrium and maximum entropy.

    Open Systems. The term "open system" is a general term given to any system which exchanges matter, energy or information across its boundaries, and uses that exchange of energy to maintain its structure. All living systems, including all complex adaptive systems, are open systems. Not all open systems, however, are complex and adaptive. (For instance, a burning candle is an open system.)

    Self-Organizing Systems. The term "self-organizing" refers to the spontaneous emergence of new forms of order. Self-organization is distinguished by the fact that there is no external agent that designs, constructs or maintains the system. Structure freely emerges from the internal interactions of the system itself. Many open systems display qualities of self-organization. The phenomenon of self-organization only occurs in systems that are "far from equilibrium" -- where there is continuous flux and vigorous exchange of energy between the parts of the system. (See Self-Organization.)

    Dissipative Structures. This is the term the chemist Ilya Prigogine gave to self-organizing systems. The terms "self-organizing system" and "dissipative structure" mean essentially the same thing. The term "dissipative" refers to the fact that these systems consume energy and "dissipate" it into the environment (thereby creating entropy). Dissipative structures maintain a form of global structure and stability by a constant pattern of internal fluctuations. Through the process of autocatalysis, small fluctuations are often magnified into large disturbances that either cause the system to disintegrate, or to reorganize into a new form. (See Feedback.)

    Autopoietic Systems. The term "autopoietic" is used to refer to any system that renews itself and regulates the renewal process in such a way that its overall structure is preserved. The terms "autopoietic", "self-organizing" and "dissipative" are generally meant to mean the same thing when referring to systems. An autopoietic system (whose only purpose is self-preservation and renewal) can be distinguished from a machine, whose purpose is geared toward the production of a particular output.

    Natural Systems. This is the term that the general system theorist Ervin Laszlo gives to self-organizing systems. The four characteristics of natural systems as Laszlo identifies them are: 1) they are wholes, with irreducible properties; 2) they maintain themselves in a changing environment; 3) they create and recreate themselves in response to the challenges of the environment; and 4) they mediate interaction between the subsystems that make them up, and the larger "supra-systems" of which they are a part. Again, the term "natural system" can, for all practical purposes, be seen as synonymous with open, self-organizing, dissipative and autopoietic systems.

    Classes of Systems. The complexity scientists have adopted a classification of systems based on the work of the physicist Stephen Wolfram. The four classes are based on the behavior of the system: Class I systems move quickly to a single point, and stay there (vaguely equivalent to a closed system moving to equilibrium); Class II systems oscillate between a limited number of end states; Class III systems are "boiling" -- totally chaotic, with no stability or structure; Class IV systems are "on the edge of chaos", "alive" -- they have enough structure to create patterns, but the patterns never really settle down. Class IV systems are what is meant by "complex adaptive systems." (See Classes of Systems.)

    Complex Adaptive Systems. Complex adaptive systems are open, self-organizing systems that have the added capacity to conserve and process high levels of information. They live on the "edge of chaos" where the system maintains enough structure to process information, but fluctuates enough that new information (in the form of new patterns and structures) is always being created. (See Complex Adaptive Systems.)

    Relevance for Thinking About Complex Systems:

    The terms and jargon used to talk about systems are often confusing for three reasons: authors often do not clearly define the terms that they are using; many terms that mean basically the same thing are developed by different authors in different disciplines; and the distinctions between different kinds of systems are often vague and imprecise. Throughout these materials, our focus is on complex adaptive systems, which are differentiated from other systems by their capacity for learning.

  • 2. CLASSES OF SYSTEMS

    The Basic Concept

    Systems fall into various "classes" of behavior. One classification used by some complexity scientists puts systems into four categories (Class I, II, III, and IV) according to the nature of their global dynamics, and the shape of their attractor. (See Strange Attractors.) In this scheme, complex adaptive systems are referred to as "Class IV" systems.

    Discussion:

    Stephen Wolfram of the Institute for Advanced Study developed four "universality classes" to describe the various kinds of rules governing the behavior of cellular automata (computer creations that are used to model the behavior of complex systems). Chris Langton of the Santa Fe Institute used this classification system to categorize systems according to their global behaviors:

    CLASSES OF SYSTEMS

    Class   Behavior                                             Attractor                Description
    I       Always returns to a single equilibrium state         Single point attractor   Ordered
    II      Oscillates between a limited number of states        Periodic attractor       Ordered
    III     Never establishes a coherent pattern -- "boiling"    Strange attractor        Chaotic
    IV      Growing and changing patterns that never settle      Strange attractors       Complex (Edge of Chaos)
            down -- regions of connected order in a sea of
            chaos
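
    These classes of behavior can be seen in a very small computer experiment. The sketch below is not part of the original document; it is a minimal Python run of Wolfram-style elementary cellular automata. The rule numbers are standard Wolfram codes, and the class attributions in the comments follow common ones (rule 254 tends toward Class I, rule 30 toward Class III, rule 110 toward Class IV); it is a toy illustration under those assumptions, not a classification tool.

```python
def step(cells, rule):
    """Apply one synchronous update of an elementary cellular automaton with wraparound."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right   # neighborhood as a 3-bit number
        out.append((rule >> index) & 1)               # look up that bit of the rule table
    return out

def run(rule, width=64, steps=24):
    cells = [0] * width
    cells[width // 2] = 1                             # start from a single "on" cell
    print(f"Rule {rule}")
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    for rule in (254, 30, 110):                       # roughly: ordered, chaotic, complex behavior
        run(rule)
```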

    Other terms have been used for some of these classes. Among them:

    * Static equilibrium. This is another term for Class I systems that quickly return to a state of equilibrium and high entropy.

    * Dynamic equilibrium. This is another term for Class II systems, where there is a lot of local fluctuation, but the global behaviors oscillate between a limited number of end states.

    * Near-equilibrium. This is another version of Class II systems, but with a wider range of end states.

    * Far from equilibrium. This is another term for chaotic Class III systems.

    Langton noted the similarity between the three basic categories of dynamic systems (ordered, complex and chaotic) and the categorization of other physical phenomena, including phase transitions in matter, the capacity for computation in computers, and the emergence of life-like behaviors in cellular automata systems:

    Phenomena            Categorization
    Cellular automata    Class I, II    Class IV                   Class III
    Dynamical systems    Ordered        Complex (Edge of Chaos)    Chaotic
    Matter               Solid          Phase transitions          Fluid
    Computation          Halting        Undecidable                Non-halting
    Life                 Too static     Life/Intelligence          Too noisy

    Relevance for Thinking About Human Experience:

    The "classes of systems" categorization is a useful way to understand the differentkinds of behaving systems in our world. It gives us a framework for looking at thephenomena around us. This categorization again emphasizes that complex, life-likebehavior occurs "on the edge of chaos" where order and chaos are sufficientlyintermingled to create coherent patterns, but never to let them "freeze" or "boil away."

  • 3. COMPLEX ADAPTIVE SYSTEMS

    The Basic Concept:

    The phenomena of life and evolution can be understood as the phenomena of complex adaptive systems. These systems are complex (they have many parts interacting with each other in many different ways); self-organizing (they spontaneously emerge, without being designed from the outside); adaptive (they change their behavior based on experience); dynamic (they are poised on "the edge of chaos" -- stable enough to maintain their structure, but sensitive enough to external changes that they can undergo rapid and unpredictable periods of change); and co-evolving (they evolve together with the systems that they interact with).

    Discussion:

    Complex Adaptive Systems Distinguished from Other Systems. Complex adaptive systems are distinguished by their capacity to conserve and process information, and their ability to evolve new forms of behavior based on that information. A weather system, for instance, is a complex system, but not a complex adaptive system. (See Types of Systems.)

    Complexity. "Complexity" is a difficult concept to precisely define. In the context of complex adaptive systems, it generally means that there are a large number of elements interacting in many diverse ways in the system.

    Self-Organizing. Complex adaptive systems are "emergent" phenomena. (See Emergence.) The particular form of their structure emerges from the patterns of interaction between the elements making up the system. They are not designed from the "outside" (in contrast to a machine and some other human systems), and you cannot determine the shape of the system from the characteristics of the elements. (Just as knowing the characteristics of bricks doesn't tell you whether they will be used to build a wall or a cathedral.)

    Adapting and Learning. Because of their ability to conserve, process and create information, complex adaptive systems have the ability to change their behavior and adapt to new relationships with the environment. They display the basic elements of a learning process: rules that govern their relations with the environment; feedback loops that tell them about how the rules are performing; and the ability to form new rules from combinations of old rules and new information from the environment. (A minimal sketch of such a learning loop appears below.)

    Dynamic. While they maintain stability in the midst of fluctuation, complex adaptive systems are sensitive enough to the environment that they can undergo rapid and unpredictable transformations as they adjust to internal and external fluctuations.

    Co-evolving. Complex adaptive systems both change and are changed by their environments. They and their environments co-evolve. The patterns of relationship between themselves and other systems form a "fitness landscape" that is constantly changing as they change.
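
    The sketch below is not from the original text; it is a minimal, hypothetical Python illustration of the learning loop described above. The agent holds a small pool of candidate rules (the rule names are invented for the example), receives feedback from a toy environment on how its chosen rule performed, and updates its estimates; when the environment changes, its preferred rule eventually changes with it.

```python
import random

random.seed(0)

# the agent's current estimate of how well each candidate rule performs
estimates = {"rule_a": 0.0, "rule_b": 0.0, "rule_c": 0.0}   # hypothetical rule names
epsilon, learning_rate = 0.1, 0.1

def environment_payoff(rule, t):
    """Toy environment: which rule actually works changes halfway through."""
    best = "rule_a" if t < 500 else "rule_c"
    return 1.0 if rule == best else 0.0

for t in range(1000):
    if random.random() < epsilon:                 # occasionally experiment (novelty)
        chosen = random.choice(list(estimates))
    else:                                         # otherwise use what has worked (confirmation)
        chosen = max(estimates, key=estimates.get)
    feedback = environment_payoff(chosen, t)
    # feedback loop: move the estimate for the chosen rule toward the observed result
    estimates[chosen] += learning_rate * (feedback - estimates[chosen])

print(estimates)   # the highest estimate has shifted to the rule that now fits the environment
```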

  • 4. THE EDGE OF CHAOS

    The Basic Concept:

    Systems fall into three broad categories of behavior: ordered, complex and chaotic. The term "edge of chaos" is used to describe complex systems that lie in the region between ordered and chaotic systems. Living systems migrate to the edge of chaos because that is where the opportunity for information processing is maximized.

    Discussion:

    The image of "life on the edge" has an intrinsic appeal to human beings who live in a complex, turbulent and unpredictable world. The term is more than just a fanciful image, however -- it is also a precise scientific description of the condition of all living systems. Scientists who study whole system behavior have observed that systems tend to fall into four categories. (See Classes of Systems.) Two of these classes are ordered systems, where the behavior of the system rapidly moves to a predictable and repetitive cycle of behavior. Another class of systems is chaotic and never really settles down into any observable pattern. The final class of systems (Class IV systems) are complex systems that have many areas of order, but also many areas of flux and chaos. Chaos and order weave together in a complex and always changing dance of connections and fluctuations. There is enough stability for the system to store information, but also enough fluidity for rapid and intense communication to occur. This region is called the "edge of chaos."

    Systems on the edge of chaos have the ability to learn. Areas of stability allow for the storage of information, and areas of flux and change allow for communication. The combination allows the system to store information, receive communication "signals", process information and act in response to it. (See The Nature of Information.) Because this region is the region most favorable to life, living systems tend to adjust their parameters so that they remain on the edge of chaos. Chris Langton, one of the pioneers of complexity science, expresses his vision of "...life as eternally trying to keep its balance on the edge of chaos, always in danger of falling off into too much order on the one side, and too much chaos on the other." (Complexity, P. 235.) The process of evolution is the process of systems adjusting their parameters (e.g. reproduction rate; environmental temperature, etc.) to keep the system on the edge of chaos.

    Relevance for Thinking About Complex Systems:

    The edge of chaos is the desired state for systems that want to avoid the two kinds of death -- death by freezing into rigid patterns, and death by "boiling alive" in the fire of chaos. The edge of chaos is where life occurs, so we should seek those "rules" of interaction (complexity scientists call them "parameters") that will produce an emergent structure that lives on the edge of chaos.

  • 5. SELF-ORGANIZATION

    The Basic Concept:

    Self-organization refers to the spontaneous emergence of order in a system. This occurs when the conditions in the system allow for the formation of steady patterns of relationships between elements of the system. Self-organization is another way of looking at the emergence of order out of chaos.

    Discussion:

    The idea of self-organization, or the spontaneous emergence of self-maintaining order, is in direct contrast to a machine orientation of the world, which assumes that the appearance of order requires the imposition of a design from outside the system, just as an engineer designs and builds a machine. Examples of self-organization can be found in chemical systems, weather systems, natural systems, and human systems. Increasingly, the phenomenon of self-organization is seen (along with the process of natural selection) as one of the two primary forces underlying all natural life processes in our known universe: "...there is a natural tendency for self-organized wholes to form. The wholes retain their identities, return to maximum stability after they have been disturbed, and even to a certain degree regenerate their form when these have been fractured." (The Unfinished Universe, P. 41.)

    A self-organizing structure emerges when many different parts form a steady pattern of relationships over time. This might be molecules in a chemical solution; species in an eco-system; cells in an organ; stars in a galaxy, or human beings in an organization. In all cases of self-organization, you can observe:

    A very large number of elements that are richly connected with each other (obviously, without any connections, no stable patterns of interaction can develop);

    Various forms of rapid feedback and iteration occurring between the parts of the system (particularly the phenomenon of autocatalysis, where an element duplicates itself by iteration with other parts of the system);

    Sufficient sources of negative feedback that allow for the maintenance of a condition of stability (referred to as "homeostasis"); and

    Rich exchange of matter, energy and information with the external environment.

    The self-organized (or "self-ordered") system is a stable structure that exists in a condition of "far from equilibrium". The external, or "global", stability is maintained by the patterns of fluctuation within the system. Rhythmic movement creates stable structure -- structure "emerges" from the interactions of the parts of the system. The form of the structure is impossible to predict from the characteristics of the individual parts, but is instead a function of the patterns of interrelationship that they engage in.

    Relevance for Thinking About Complex Systems:

    The phenomenon of self-organization is a key characteristic of complex adaptive systems. The capacity for self-organization suggests that we should seek to work with the self-ordering properties of complex systems rather than attempting to impose structure on them from the outside.

  • 6. EMERGENCE

    The Basic Concept:

    Structure in complex systems is an emergent phenomenon, meaning that it arises out of the interactions within the system, rather than being imposed on it from the outside.

    Discussion:

    We are used to thinking about structure as something that is "designed into" a system, and that is regulated by some kind of central control function. In contrast, in complex systems, structure emerges from the many interactions of the elements in the system. The phenomenon of emergence has several dimensions:

    Bottom-Up. The structure that we observe in complex systems arises out of the "local rules" that parts of the system use to guide their interactions with each other. Observance of these rules leads to recurring patterns of relationships between the parts in the system. These recurring relationships are what constitute the "structure" of the system. (See Feedback.)

    Complexity From Simplicity. Very complicated structure can emerge from a small number of elements, and only a few rules. (A minimal sketch illustrating this appears after this list.)

    Unpredictability. It is not possible to predict what structure (if any) will emerge in complex systems from any given set of rules. The only way to find out is to let the system play itself out. This reality is what makes reductionist analysis of limited utility in understanding the behavior of complex systems. It is not possible to understand the characteristics of higher levels of organization from the characteristics of the smallest parts. Understanding the nature of bricks doesn't tell you whether the building is a cathedral or a bus depot.

    Stratified Autonomy. Emergence occurs in layers. Individual elements interact and form integrated wholes; these wholes then interact with each other and form large wholes, and so on. At each scale, the parts retain high levels of autonomy, but also enter into patterns of relationships with other parts. Each layer includes all the layers below it, and leaves their individual functions intact. The part is not controlled by the system that it is a part of. (See Stratified Autonomy.) Note that these "nested" systems have the fractal structure that is characteristic of complex systems. (See Fractals.)
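
    As a small illustration of complexity from simplicity (this sketch is not part of the original document), the Python fragment below runs Conway's Game of Life: a handful of local rules per cell, applied to all cells in parallel, generate moving structures such as "gliders" that are nowhere specified in the rules themselves.

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y) cells."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # a cell is alive next step if it has exactly 3 live neighbors,
    # or 2 live neighbors and is already alive
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # a well-known 5-cell pattern
cells = set(glider)
for generation in range(8):
    cells = life_step(cells)
print(sorted(cells))   # the same glider shape reappears, displaced across the grid
```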

    Relevance for Thinking About Complex Systems:

    As we better understand the nature of emergent structure in complex systems, we can begin to replace our machine-oriented ideas about top-down design with a more natural approach that allows ever higher levels of integration to emerge from the bottom up.

  • 7. FEEDBACK

    The Basic Concept:

    The term "feedback" is used to describe the information a part of a system (or a wholesystem) gets back from its environment about something that it has done. Feedbackconnects elements in a system to each other, and is their means of "communication."There are several different kinds of feedback, and the combination present in any onesystem will determine the overall structure and dynamics of that system.

    Discussion:

    What is Feedback? Feedback is information that is "fed back" to a system or subsystem from outside its boundaries. A "feedback loop" is the full cycle of:

    1) Inputs -- taking in matter, energy or information;
    2) Processing -- transforming the matter, energy or information in a way that is useful to the system;
    3) Output -- affecting the environment in some way; and
    4) Feedback -- having the output affect the new inputs the system processes.

    In a strict sense, the term "feedback" refers exclusively to inputs from the environment that have been influenced by or stimulated in some way by the system's own activity. Terms like "stimulus" or "external forces" are used to describe external influences on the system that the system does not affect in any way (e.g. the force of gravity on an object). However, the more densely connected the parts of a system are, the more vague this distinction becomes. The concept of co-evolution, for instance, postulates that microsystems create their own macrosystem environments, and co-evolve with them. (See Evolution and Coevolution.) The quantum physics theory of non-local reality also postulates that all particles in the universe are in constant contact with all other particles in the universe, therefore blurring the distinction between feedback and stimulus. (See Non-Local Reality -- Bell's Interconnectedness Theorem.)

    There are a number of different kinds of feedback that are important to the understanding of complex adaptive systems.

    Positive feedback encourages the system to do more of what it was doing before. Uninhibited positive feedback can lead to exponential rates of growth in output, and an "exploding" of the system into chaotic behavior. Positive feedback is an important source of growth and change in systems.

    Negative feedback tells a system to stop doing what it was doing. It "negates" the previous action. The ultimate in negative feedback leads to equilibrium, system "death", and no activity at all ("entropy"). Negative feedback is an important source of stability in complex systems.

    Iteration is a term often used to mean two things: 1) one cycle of a feedback loop, as in "The system went through one iteration"; and 2) a feedback loop where the output of one cycle is the exclusive input for the next cycle.

    Autocatalysis is a special form of positive feedback in which a catalyst produces more of itself by being in the presence of other elements. All forms of sexual procreation are forms of autocatalysis. Autocatalysis can lead to exponential rates of growth, as is observed in natural populations when limiting conditions (such as predators or food scarcity) are not present.

    Feedback and Structure

    In a literal way, feedback creates structure. The "coupling" or stabilizing of feedback loops between elements in a system is what creates the stable patterns represented by structure. Some are rigidly structured (such as atoms and atomic elements) and others are more loosely coupled (such as species in an ecosystem). To say that structure exists is just another way of saying that a stable pattern of interaction between elements has been achieved. In system language, stability is often referred to as homeostasis. This refers to conditions where a balance between positive and negative feedback has been achieved. (A very simple example of a device designed to create homeostasis is the thermostat, which alternately sends positive ("turn on") and negative ("turn off") feedback to the furnace, in order to maintain a steady temperature in the house; a minimal sketch of this loop follows.)
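
    The fragment below is an assumed illustration, not code from the original text: a Python toy version of the thermostat loop just described, in which a negative feedback rule keeps a simulated room temperature oscillating in a narrow band around the setpoint. The heating and heat-loss rates are arbitrary assumptions.

```python
setpoint, temperature, furnace_on = 20.0, 15.0, False

for minute in range(120):
    # decision rule: turn the furnace on below the band, off above it (negative feedback)
    if temperature < setpoint - 0.5:
        furnace_on = True
    elif temperature > setpoint + 0.5:
        furnace_on = False
    # toy physics: the furnace adds heat, the room always leaks a little
    temperature += (0.3 if furnace_on else 0.0) - 0.1
    if minute % 15 == 0:
        print(f"minute {minute:3d}  temperature {temperature:5.2f}  furnace {'on' if furnace_on else 'off'}")
```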

    Feedback and Decision Rules

    Every system has certain "rules" it uses to decide how to respond to external influences, including feedback loops. These rules determine whether or not feedback from the environment is interpreted as "negative" or "positive" feedback. The rules control the nature of the system's relationships with its environment. Scientists use different words to describe these rules (in machine intelligence, they are called algorithms; in learning theory they are referred to as "mental models"). These rules vary in how "tight" and "loose" they are -- in other words, in what level of input a system needs before it responds in either a negative or positive fashion. This determines how sensitive a system is to external influences.

    Feedback, Delay and Response Time

    There are two time-related dimensions of feedback that warrant mention. The first is the amount of delay between when a system influences its environment, and when a response is fed back to the system (for instance, between the time that we spray CFC's into the air, and the time we are affected in the form of ozone depletion). The more delay, the less likely it is that the system can respond in a way that maintains its equilibrium. The second is the response time, or the time elapsed between when the system receives feedback, and when it acts in response to it (for instance, between when I touch a hot object and when I respond by withdrawing my hand). The slower the response time, the less likely the system is to be able to maintain equilibrium with its environment.

    Feedback and Information

    Feedback loops provide a system with information about its environment. As systems become more sophisticated at information processing (i.e. become more "intelligent") they build the capacity to anticipate events by linking one form of feedback with future events. (For instance, we see signs of bad weather and prepare for it before it comes; animals sense a hard winter and grow extra thick fur; businesses interpret signs of a changing market and respond in advance.) The development of information conservation devices allows systems to accumulate experience about the kinds of feedback that improve their odds of survival. In other words, they learn. A distinguishing feature of the brain is its ability to build symbolic representations of feedback loops, and to "learn" from the iteration of these symbolic feedback loops as opposed to learning from direct experience.

    Feedback and the Edge of Chaos

    The combination of internal and external feedback loops, and the speed of iteration, will determine the dynamics of the system. If there is an excess of negative feedback, the system will usually fail to adapt to environmental changes and will eventually die. If there is an excess of positive feedback, the system will eventually approach chaos and disintegrate. The "edge of chaos" is a balance between these two extremes where life occurs, because there is enough negative feedback and structure to process information, and also enough positive feedback to drive growth, change and adaptation.

    Relevance for Thinking About Complex Systems:

    The concept of feedback and feedback loops is central to an understanding of complex systems. Feedback is the core "life process" of complex systems. It is by understanding the dynamics of feedback loops that we can eventually understand the dynamics of complex systems.

  • 8. THE NATURE OF INFORMATION

    The Basic Concept:

    A key characteristic of complex adaptive systems is their ability to process information. Being "on the edge of chaos" maximizes this capacity. The ability to conserve and process information, and use it for adaptation, is characteristic of all "living" systems. To understand the relationship between complexity and information, it is useful to review some of the many meanings of the term "information."

    Discussion:

    Information as Structure and Order

    At its most basic level, information is synonymous with order, with the presence of distinguishable patterns and relationships. "Random" systems contain no information, in the sense that it is not possible to distinguish patterns in the system. (Erich Jantsch quotes a definition of information as "...any non-random spatial or temporal structure or relation." (P. 50)) Structure, patterns and differentiation are synonymous with information. When we say a system contains information, we mean that it is possible to make distinctions within the system. Information is structure, and structure is information. (Ervin Laszlo refers to information as "encoded patterns of energy.")

    The original sources of "information" and structure in the universe were the four basic forces (the weak nuclear force, the strong nuclear force, gravitation, and the electromagnetic force). These forces created the first differentiated structure of matter after the undifferentiated "singularity" of the big bang. (All four forces are theorized to have appeared within 10^-12 seconds after the big bang.) Subsequent sources of information (i.e. the creation of new patterns of matter and energy) derive from this original structuration.

    Information as Representation of Order

    The more common meaning of "information" is as a signal or representation of an order that is different from the information itself. All complex systems, even chemical ones, display rudimentary forms of "memory". Living systems are characterized by their ability to conserve representations of order, pattern and process, and therefore make available in the present the accumulated experience of the past. The information conserving function of DNA is a primary example, as well as the "social DNA" in human systems of culture, books, stories, rituals and customs. The memory made possible by the neural patterning of the human brain allows us to store representations of patterns and relationships we have observed in the environment, or created through our own mental processes. The representation of information in symbols and symbol systems (e.g. speech, writing, mathematics) is a distinction of the higher computational capabilities of human consciousness.

    Information as Both Novelty and Confirmation

    The concepts of "novelty" and "confirmation" can be thought of as two aspects of information: "Pure novelty, that is to say, uniqueness, does not contain any information; it stands for chaos. Pure confirmation does not bring anything new; it stands for stagnation and death." (The Self-Organizing Universe, P. 51.) The process of life (and the region where information processing occurs) falls between these two extremes -- where there is enough novelty to create new information, but enough confirmation to carry out the process of computation. This territory is the "edge of chaos" where maximum capacity for information computation occurs.

    Systems as the Self-Organization of Information

    If information is seen as the representation of structure, pattern and differentiation, then it is possible to see dynamic systems and the phenomenon of self-organization as the self-organization of information itself. Information is not different from the patterns of matter and energy. Thus Erich Jantsch can say that the brain is "...a communication mechanism which is used and directed by the self-organization of information."

    Information and Communication

    Communication is the interaction of one system with another, when the "pattern" of one system is "recognized" by another through the sending and receiving of "signals." (The field of Information Theory is the study of this process.) Communication occurs when there is enough similarity between the two systems for "recognition" to occur -- in other words, when one system can "recognize" the signals sent from another system. The most basic "signals" are those associated with the four basic forces -- strong and weak nuclear, electromagnetic and gravitational. The gravitational "signal", for instance, is "recognized" by all matter in the universe, regardless of its structure.

    The nature of communication changes as structures become more complicated, as in the evolution from inorganic to organic systems. Erich Jantsch defines four different types of communication that are characteristic of biological systems. These forms of communication are differentiated by the speed with which the signals travel:

    Genetic communication, in which the "signal" is the gene (e.g. DNA). This is the slowest of the four forms of biological communication, and occurs over generations of the species.

    Metabolic communication is communication that occurs in chemical processes, as chemicals react in the presence of each other. This includes all exchanges of energy and matter in production-consumption cycles (such as the digesting of food; photosynthesis; etc.) and chemical regulatory processes, such as hormone regulation. Since the "signaling" consists of the movement and transformation of matter, this is a relatively slow form of communication and can take anywhere from minutes to days.

    Neural communication occurs when information is transmitted by electrical impulse over the central nervous system and between nervous systems (organism to organism) through the senses (auditory, touch, smell, cognition, etc.). Neural communication is measured in tenths or hundredths of a second.

    Biomolecular communication occurs in relatively small volumes between individual atoms, and is the most rapid form of biological communication, occurring in milliseconds.

    Information, Consciousness and the "Ecology of Ideas"

    Information processing reaches its known apex in the structure of the human brain and our varying levels of consciousness. (See The Nature of Consciousness.) With the development of the neocortex layer of the brain, we evolved the capacity to store and create information in symbol systems such as language, mathematics and art. This process is made possible by the self-organizing dynamics of the neural networks in our brain. (See Neural Darwinism.) These symbols now create complex adaptive systems of their own, in which the elements interacting in the system are the symbols and patterns of symbols (an "ecology of ideas"). The combining and recombining of these ideas and concepts creates new information that we conserve in symbolic structures such as books, mathematical formulae, video, and more recently, ordered electrons on optical disks. The development of the computer in turn allows us to direct machines to create new combinations of information that supplement the capacity we have in our own brains.

    With the combination of our own capacity for memory, thinking and imagination, the ability to store information in physical artifacts, and now our ability to have machines augment our capacity for the creation of new information, we have increasingly made our capacity for learning independent of contact with the external world we are learning about. We are no longer entirely dependent on experience itself for learning: "...the processing and organization of information becomes independent...of direct sensory impact." (Jantsch, P. 164) This capacity is reaching a new level of independence from sensory reality in the evolution of "virtual reality" technology.

    All forms of self-reflective information conservation reduce the role of DNA in our species' evolution. As we evolve more flexible, creative and adaptive forms of information conservation and creation, we increase our ability to "evolve" without being dependent on the very slow process of DNA selection. This capacity is reaching a new level in the current development of knowledge about the structure of DNA itself, and our ability to change that structure to alter the shape of individuals formed by it.

    Learning as the Creation of New Information

    In its most rudimentary form, learning is a process of taking in new information (novelty), selectively combining it with conserved information (confirmation) and creating more new information. (See Adaptation and Learning.) All "dissipative structures", "open systems" or "complex adaptive systems" engage in this process. (Note the difference between learning as the creation of information, and the more common usage in education as the "receiving" of information.)

    The Subjective Nature of Information

    All information is by definition subjective, because it is dependent on both what kinds of signals the receiving system can recognize, and what kinds of patterns in those signals they are capable of recognizing. Depending on your structure, you receive some signals and you don't receive others. For instance, the human eye is structured to detect a certain range of frequencies of light waves. Anything outside of this range is "invisible" to us. Assuming that I can receive the signal in the first place, I will then vary with others about how I interpret the signal. For instance, we all may see an object, but we may differ on the shades of color in the object, depending on our interpretations of it. (This is the distinction between innate and contextual attributes. See Subjectivity.) When we use the term "objective" in reference to information, we simply mean that we have identified a large group of pattern recognition devices (e.g. human beings and their senses, perhaps supplemented by mechanical devices such as microscopes or lasers) that have similar capacities for signal detection, and similar rules for interpretation of the signals. The "information" is contained in the relationship between the observer and the observed and does not exist independent of this relationship. (Nick Herbert quotes the physicist Niels Bohr as saying: "Isolated material particles are abstractions, their properties being definable and observable only through their interactions with other systems." Quantum Reality, P. 161.)

    Signal Detection and the Collapsing of the Probability Wave

    There is a parallel between how we collapse the potential information in a collection of signals into specific "knowledge", and how quantum physics understands the collapse of a "probability wave" into a specific particle. (See Wave-Particle Duality.) In both cases, the act of detection and observation "collapses" a broad range of possibilities into a more discretely bounded "thing." We take a collection of signals from the environment, and we interpret it in a specific way -- we make it "mean" something. Our meaning, of course, is only one of a possible range of meanings. In quantum physics, the "probability wave" of the particle, which represents all possible paths of the particle, and all possible mixes of attributes, is "collapsed" into a specific particle in a specific location. We need to be aware, therefore, that every time we create information in the form of meaning and interpretation, we are also destroying the potential for a broad range of other meanings and interpretations.

    Yes/No or Maybe?

    Traditional information theory is based on "digitized" order in which all information is expressed as yes/no decisions or strings of "0's" and "1's". While this is useful for a wide variety of applications (the evolution of cybernetics from information theory forms the basis of much of our computer programming) it misrepresents some of the "fuzziness" of reality as represented by quantum probability and uncertainty. (See Heisenberg's Uncertainty Principle.) The emerging field of "fuzzy logic" is building programming capacity based on the "shades of gray" represented by the region between the 0 and the 1.
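
    As a small, assumed illustration of this yes/no view of information (it is not part of the original text), the Python fragment below computes Shannon entropy, the classical measure of information in bits: a perfectly predictable signal (pure confirmation) carries no information, while an evenly uncertain one carries the most.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))    # a fair coin toss carries exactly 1 bit
print(entropy_bits([0.9, 0.1]))    # a mostly predictable signal carries less
print(entropy_bits([1.0]))         # pure confirmation: no new information at all
```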

    Relevance for Thinking About Complex Systems:

    Complex adaptive systems appear to be drawn to the "edge of chaos" because that is where their capacity for information processing is maximized. As we begin to see information as the source of structure in complex systems, we become extremely sensitive to the role of information flow, processing, and creation in human systems. And as we understand the inherently subjective nature of information, we begin to take care that the information potential in signals we receive is not too narrowly "collapsed", thereby destroying our capacity to create new information.

  • 9. LOCAL RULES

    The Basic Concept:

    The nature of the rules that govern the interaction of parts in a system determines both the global structure (what its shape is) and the global dynamics (how it changes over time) of the system. Different "rules" will produce different system "shapes" and different patterns of change (e.g. ordered, complex, or chaotic).

    Discussion:

    "Local rules" are the decision criteria that parts of a system use to decide how torespond to different kinds of feedback. (Link to parameter space?) The term "decide" isused in a broad sense here. In relatively simple inorganic systems, the "local rules" arepurely physical rules, dictated by the basic physical forces of the universe(electromagnetic, gravity, strong and weak nuclear.) There is no "choice" involved. Assystems evolve and become more complex, and gain the capacity for informationconservation and processing, local rules become more sophisticated and the systemsare freed from the constraints of response to immediate physical forces. (This shiftbetween domination by physical forces and capacity to process information is where thedifference between living and non-living things begins to be drawn. And this differenceoccurs on the edge of chaos: " So one of the interesting things we can ask about livingthings is, Under what conditions do systems whose dynamics are dominated byinformation processing arise from things that just respond to physical forces? Whenand where does the processing of information and the storage of information becomeimportant? Complexity, P. 232. "The edge of chaos is where information gets its foot inthe door in the physical world, where it gets the upper hand over energy." Lewin, P. 51.See The Nature of Information.)Simple Rules, Complex Behavior. One of the important insights of complexity scienceis that fairly simple rules can generate enormously complex patterns. The almostinfinitely complex shape of the Mandelbrot Set fractal is created by one simpleequation. (See Fractals.) As one complexity scientist framed it: "Hey, I can start withthis amazingly simple system, and look -- it gives rise to these immensely complicatedand unpredictable consequences." (Complexity, P. 329) This is possible because of therapid iteration and feedback in the system. (See Feedback.)Change the Rules, Change the Dynamics. As you change the rules in the system, youget very different kinds of system behavior. A slight change in the reproduction rate fora species, for instance, can result in stable population patterns becoming completelyunpredictable. One way of classifying rules is according to what kinds of systemdynamics they produce. In this scheme, rules fall into one of four "universality classes"that result in ordered, complex, or chaotic behavior. (See Classes of Systems.) StuartKauffman contends that living systems adjust their rules (in his language, their"parameters") until they "find" the rules that keep them in the territory of complexbehavior (on the "edge of chaos"), since that is the system state most conducive to

  • information processing and hence life. Kauffman refers to this process as the processof systems taking "adaptive walks through parameter space to find 'good' dynamicalbehavior." (The Origins of Order, P. 181)Human Rules. As the parts that are interacting with each other in the system becomemore complex (e.g. from atoms to molecules to protiens, to cells, to organisms, to self-conscious beings) the rules and feedback loops within the system also become morecomplex. The emergence of self-reflective consciousness in human beings introducesa new phenomenon in "rule making." We have the capacity to reflect on and predict theconsequences of different kinds of "local rules", and deliberately choose the rules wethink will lead to desired future system states. In human systems, our rules are themental models we use to decide how to behave. Some of these are driven by biology(e.g. hormones.) Others are built out of our social and historical experiences. These"rules" are encoded in our culture in the form of values, beliefs, customs, laws, theories,assumptions and morals. It is this human capacity that allows us to create "unnatural"and "designed" systems. Science can be understood as our attempt to understand the"rules" or "laws" of the universe, partly for the purpose of just knowing, but also for thepurpose of being able to predict and control the results those laws produce.
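
    The sketch below is an assumed illustration (the original text does not name a specific model): the logistic map, x -> r * x * (1 - x), is a one-line "local rule" often used as a toy model of a population with reproduction rate r. Changing that single parameter moves the long-run behavior from a steady state, to oscillation, to chaos.

```python
def long_run(r, x=0.2, warmup=200, keep=8):
    """Iterate the rule x -> r * x * (1 - x) and report the long-run values."""
    for _ in range(warmup):
        x = r * x * (1 - x)          # the entire "rule" of the system
    values = []
    for _ in range(keep):
        x = r * x * (1 - x)
        values.append(round(x, 3))
    return values

print("r = 2.8:", long_run(2.8))     # settles to a single fixed point (ordered)
print("r = 3.2:", long_run(3.2))     # oscillates between two values (periodic)
print("r = 3.9:", long_run(3.9))     # never settles down (chaotic)
```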

    Rules, Adaptation and Learning. Adaptation is the capacity to change behavior based on feedback. Learning is the capacity to change our "local rules" based on experience. In Kauffman's language, learning is the ability to wander through parameter space.

    Relevance for Thinking About Human Experience:

    When we seek to change human systems, our tendency is often to try and "restructure" and "design" systems from the top down. A proper understanding of the role of local rules in determining global shape and dynamics would lead us instead to look for the simple rules we would need to change that would then drive change in shape and dynamics. And we would seek to understand in particular those rules that will keep the system on the "edge of chaos" where it is most able to adapt, learn and grow.

  • 10. SELF-ORGANIZED CRITICALITY

    The Basic Concept:

    Complex systems naturally bring themselves to a condition which is stable, but fragile enough that small disturbances can cause large "avalanches" of change. The physicist Per Bak refers to this state as "self-organized criticality". Self-organized criticality is another way of understanding systems that are poised on the "edge of chaos." (See Edge of Chaos.)

    Discussion:

    Self-organized criticality is best understood by thinking of the example of a pile of sand that is created by dropping grains of sand one by one on top of another: "...the resulting sand pile is self-organized in the sense that it reaches the steady state all by itself without anyone explicitly shaping it. And it's in a state of criticality, in the sense that sand grains on the surface are just barely stable." (Complexity, P. 304) Like other systems at the edge of chaos, the sand pile has a balance of stability and fluidity.

    As grains of sand are dropped on the pile, unpredictable "avalanches" will occur, where a single grain of sand will cause a major reshaping of the pile. The size and frequency of these avalanches conforms to what is called a "power law." (See Transition Laws.) "Big avalanches are rare, and small ones are frequent. But the steadily drizzling sand triggers cascades of all sizes -- a fact that manifests itself mathematically as the avalanches' 'power law' behavior: the average frequency of a given size of avalanche is inversely proportional to some power of its size." (Complexity, P. 305) The key to the behavior of self-organized criticality is the way in which the steady input of energy (in this case the grains of sand) drives the system to be structured in such a way that you have many small subsystems and elements in a state of delicate interconnection with each other.
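
    The sketch below is an assumed Python illustration of the sandpile picture described above (a standard Bak-Tang-Wiesenfeld toy model, not code from the original document): grains are dropped one at a time, any site holding four or more grains topples onto its neighbors, grains at the edge fall off, and the number of topplings in each avalanche is recorded. Small avalanches vastly outnumber large ones.

```python
import random
from collections import Counter

random.seed(0)
N = 20
grid = [[0] * N for _ in range(N)]   # grains of sand currently at each site

def drop_grain():
    """Drop one grain at a random site, relax the pile, return the avalanche size."""
    x, y = random.randrange(N), random.randrange(N)
    grid[x][y] += 1
    topplings = 0
    unstable = [(x, y)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:                        # already relaxed by an earlier toppling
            continue
        grid[i][j] -= 4                           # topple: shed four grains to the neighbors
        topplings += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:       # grains pushed past the edge fall off
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return topplings

sizes = [drop_grain() for _ in range(20000)]
histogram = Counter(sizes)
for size in (1, 2, 4, 8, 16, 32, 64):
    print(f"avalanches of {size:3d} topplings: {histogram.get(size, 0)}")
```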

    Self-organized criticality is another way of talking about systems at "the edge of chaos." Power laws are a special case of small inputs causing large changes in non-linear systems -- in other words, another example of sensitive dependence. (See Sensitive Dependence on Initial Conditions.)

    Relevance for Thinking About Complex Systems:

    Complexity scientists suspect that self-organized criticality and power laws are features of many kinds of complex systems, including earthquakes, traffic jams, stock market fluctuations, species extinctions, and economies. One distinction is important -- self-organized criticality does not require complex, adaptive behavior. Complex adaptive systems are systems that have the capacity for information manipulation and conservation, which is a feature that cannot be identified with piles of sand and earthquakes. Thus, while complex adaptive systems might display features of self-organized criticality, self-organized criticality does not mean the system is a complex adaptive system.

  • 11. STRATIFIED AUTONOMY

    The Basic Concept:

    Complex systems tend to arrange themselves in "layers" of integration, such that each system is part of a larger whole, which in turn is part of an even larger system, which is part of a larger system, and so on. At each level, each system is simultaneously autonomous and integrated with the systems at its level, below it, and above it.

    Discussion:

    There is a distinct hierarchical structure in nature, but it does not operate anything like the top-down control model that is associated with human hierarchies. Instead, nature's form of hierarchy (referred to as "stratified autonomy") is a system of emergent layers of system integration that is driven from the bottom up, not the top down. (See Emergence.):

    "...stratified order...The tendency of living systems to form multi-leveled structures whose levels differ in their complexity is all-pervasive throughout nature and has to be seen as a basic principle of self-organization. At each level of complexity we encounter systems that are integrated, self-organizing wholes consisting of smaller parts, and, at the same time, acting as parts of larger wholes." (Capra, P. 280)

    Even to speak of "layers" and "top" and "bottom" implies the wrong kind of relationships. In the stratified arrangements of natural systems, the "levels" refer not to who controls whom, but to the scale at which system integration occurs. (In this sense of scale, the stratified autonomy of complex systems forms a fractal structure in which there is a high degree of self-similarity at different scales of the system. See Fractals.) At each level of integration, each individual system maintains a high level of autonomy, but also establishes stable patterns of direct relationships with other systems. (Predators and prey, for instance, establish stable patterns of relationships with each other over time.) This set of relationships then comes to constitute a system of its own, which in turn develops stable patterns of relationships with other systems of similar scale. (See Cosmogenesis chart showing communication and coordination patterns.)

    In their stratified order, complex systems proceed through a dual process of individuation and integration. Individuation creates the parts that can form and reform patterns of relationship. These patterns of relationship then can constitute the process of integration. When the integration is so complete that the parts become dependent on each other for their very existence, individuation has again occurred, but at a higher level of complexity. Thus the building of complex stratified order has a temporal dimension to it, in that each level of integration is built on the level that came before it: "Systems on a given level are aggregates of systems at the preceding level." (Cosmogenesis, P. 213)

    The dynamics of stratification through differentiation and integration at progressively higher levels of complexity can be seen in many different kinds of complex systems, including biological ecosystems; human social systems; the structure and function of DNA; and even the process of language acquisition (see Cosmogenesis, P. 251). Different authors have constructed a number of different charts demonstrating the hierarchic structure of natural systems. Several are attached.

    Relevance for Thinking About Complex Systems:

    Understanding the stratified autonomy of complex systems gives us a way to think about the building of layers of complexity and integration without the need to "design" the system from the top down, and without the need to control the parts from a central location.

  • 12. PHASE TRANSITIONS

    The Basic Concept:

    A phase transition is a radical shift of a system from one condition to another condition with highly different properties. The transformation from steam to water to ice is an example. The term "phase transition" describes the intermediate transition period between the two different conditions. Material substances, physical systems, and organic systems all exhibit a common pattern when they transition from one state to another. They move from order to complexity to chaos. The middle condition is called the point of "phase transition". It is also sometimes referred to as the "edge of chaos". (See Edge of Chaos.)

    Discussion:

    A phase transition is a form of bifurcation, a change from one state to another. (See Bifurcation.) This particular kind of bifurcation, however, is one where the fundamental properties of the material or system change. Examples from the material world include: liquid to vapor; nonmagnetic to magnetic; fluid to superfluid; conductor to superconductor; and fluid to solid. Complexity scientists noticed that there seemed to be similarities in the phase transition, regardless of the system involved. The exact point of transition is one characterized by highly non-linear phenomena.

    There are two kinds of phase transition -- first order phase transitions, where the shift is abrupt, and the system jumps directly from one state to another; and second order (or "continuous") phase transitions, where the shift is more gradual. In first order transitions, the elements (e.g. molecules) are forced to make an either/or choice between two states. In second order transitions, right at the transition, there is a fluid balance between order and chaos: "Order and chaos intertwine in a complex, ever-changing dance..." (Complexity, P. 230) On one side stands the rigid pattern of solid order; on the other, the swirling, boiling turbulence of chaos. In the middle, at the point of transition, stands a region of mixed order and chaos -- enough order to have form, but enough chaos never to get rigid. This point of "phase transition" is the "edge of chaos", and is the condition that all living systems evolve towards.

    (A field of study that is related to first order phase transitions is "catastrophe theory", which develops models of the radically abrupt change from one condition to another (e.g. the popping of a balloon). See P. 84-87 of Turbulent Mirror.)
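
    The Python sketch below is an assumed illustration of a second-order transition (percolation is not mentioned in the original text): sites on a grid are occupied at random with probability p, and as p crosses a critical value (roughly 0.593 for a large square lattice) the chance that an occupied cluster spans the grid rises sharply from near zero to near one.

```python
import random

random.seed(1)

def spans(grid):
    """Depth-first search: does an occupied cluster connect the top row to the bottom row?"""
    n = len(grid)
    stack = [(0, j) for j in range(n) if grid[0][j]]
    seen = set(stack)
    while stack:
        i, j = stack.pop()
        if i == n - 1:
            return True
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                stack.append((ni, nj))
    return False

def spanning_probability(p, n=30, trials=200):
    hits = 0
    for _ in range(trials):
        grid = [[random.random() < p for _ in range(n)] for _ in range(n)]
        hits += spans(grid)
    return hits / trials

for p in (0.45, 0.55, 0.59, 0.65, 0.75):
    print(f"p = {p:.2f}  spanning probability = {spanning_probability(p):.2f}")
```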

    Relevance for Thinking About Complex Systems:

    Phase transitions have two implications for how we think about complex systems, including human ones:

    Systems have the potential for radically changing from one state to another (e.g. the collapse of the Soviet Union); and

    Living systems are systems that exist in the constant tension between order and disorder, in the territory of "second order" phase transitions.

  • SELECTED BIBLIOGRAPHY

    Popular Integrators

    Fritjof Capra, The Turning Point
    Margaret Wheatley, Leadership and the New Science
    Marilyn Ferguson, The Aquarian Conspiracy
    Michael Talbot, The Holographic Universe
    Ken Wilber, ed., The Holographic Paradigm and Other Paradoxes
    Anna Lemkow, The Wholeness Principle
    George Land and Beth Jarman, Breakpoint and Beyond
    George Land, Grow or Die
    John Briggs and David Peat, Looking Glass Universe
    Louise Young, The Unfinished Universe
    David Layzer, Cosmogenesis
    John Casti, Paradigms Lost
    Michael Rothschild, Bionomics

    Quantum Physics

    Nick Herbert, Quantum Reality
    Danah Zohar, The Quantum Self
    David Bohm and David Peat, Science, Order, and Creativity
    Fritjof Capra, The Tao of Physics
    Heinz Pagels, The Cosmic Code
    Bruce Gregory, Inventing Reality -- Physics as Language
    Fred Alan Wolf, Taking the Quantum Leap
    Paul Davies, ed., The New Physics
    David Bohm, The Undivided Universe
    Danah Zohar, The Quantum Society
    Ken Wilber, ed., Quantum Questions

    Chaos Theory

    James Gleick, Chaos
    John Briggs and David Peat, Turbulent Mirror
    John Briggs, Fractals -- The Patterns of Chaos
    Stephen Kellert, In the Wake of Chaos
    Robert L. Devaney, A First Course in Chaotic Dynamical Systems

    Systems Theory

    Ludwig von Bertalanffy, General System Theory
    Ervin Laszlo, The Systems View of the World
    Erich Jantsch, The Self-Organizing Universe
    Peter Senge, The Fifth Discipline
    Arthur Young, The Reflexive Universe

    Complexity Theory

    Mitchell Waldrop, Complexity
    Roger Lewin, Complexity -- Life at the Edge of Chaos
    Ilya Prigogine, Order Out of Chaos
    William Roetzheim, Enter the Complexity Lab
    John Casti, Complexification
    Murray Gell-Mann, The Quark and the Jaguar
    Lawrence Slobodkin, Simplicity and Complexity in Games of the Intellect
    Stuart Kauffman, The Origins of Order
    Ellen Thro, Artificial Life Explorer's Kit
    Kevin Kelly, Out of Control

    Neuroscience and Learning

    Renate Caine, Making Connections -- Teaching and the Human Brain
    Howard Gardner, Frames of Mind
    William Glasser, The Quality School
    Jacqueline Brooks, The Case for Constructivist Classrooms
    Gerald Edelman, Bright Air, Brilliant Fire
    Charles C. Bonwell, Active Learning
    Lewis Perelman, School's Out
    Leslie Hart, Human Brain and Human Learning
    Bobbi DePorter, Quantum Learning

