CEPA eprint 1696 (HVF-110)


Foerster H. von (1987) Cybernetics. In: Shapiro S. C. (ed.) Encyclopedia for Artificial Intelligence. John Wiley and Sons, New York: 225–227. Available at http://cepa.info/1696
The phrase “control and communication in the animal and the machine” can serve as a definition of cybernetics. Although this term was used by André-Marie Ampère about 150 years ago (1) and its concepts were used by Heron of Alexandria more than 1500 years ago (2), it was the mathematician Wiener who, in 1948, with the publication of Cybernetics (3), gave name and meaning to this notion in the modern context. The name cybernetics is derived from the Greek word for steersman, κυβερνήτης, which in Latin became gubernator, governor in English. The concept associated with this term was to characterize a mode of behavior that is fundamentally distinct from the customary perception of the operations of machines with their one-to-one correspondence of cause-effect, stimulus-response, input-output, and so on. The distinction arises from the presence of sensors whose report on the state of the effectors of the system acts on the operation of that system. Specifically, if this is an inhibitory action that reduces the discrepancy between the reported state of the effectors and an internal state of the system, the system displays goal-oriented behavior (4); that is, if perturbed by any outside means, it will return to some representation of this internal state, the goal. Although this scheme specifies neither the physical nature of the states alluded to nor that of the signals reporting about these states (whether they are electric currents, mechanical or chemical agents, abstract symbols, or whatever), the biological flavor of the language used is apparent. This is no accident; in the formative years of this concept the close cooperation of Wiener with the neurophysiologist Rosenblueth created a physiological context. Moreover, this cooperation stimulated the philosophical inclination of these two men, and together with Bigelow they set the stage for still ongoing epistemological inquiries with the publication in 1943 of “Behavior, Purpose and Teleology” (5).
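The regulatory scheme just described, a sensor reporting the state of the effectors and an inhibitory action reducing the discrepancy from an internal goal state, can be sketched as a minimal negative-feedback loop. The thermostat-like numbers and the `gain` parameter below are illustrative assumptions, not taken from the article:

```python
def feedback_loop(goal, state, gain=0.5, steps=20):
    """Negative feedback: each step the sensed discrepancy between
    the state and the internal goal drives an inhibitory correction."""
    trajectory = [state]
    for _ in range(steps):
        error = state - goal          # the sensor's report, compared with the goal
        state = state - gain * error  # inhibitory action reduces the discrepancy
        trajectory.append(state)
    return trajectory

# A perturbation drives the state away from the goal;
# the loop returns it to (a representation of) the goal.
path = feedback_loop(goal=20.0, state=35.0)
print(round(path[-1], 3))  # 20.0
```

Whatever the initial perturbation, the trajectory ends near the goal; it is exactly this indifference to the starting point that the article goes on to discuss under "purpose."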
Another fruitful ménage à trois of philosophy, physiology, and mathematics was the collaboration of McCulloch, philosopher, logician, neurophysiologist, or “experimental epistemologist,” as he liked to call himself, with a young, brilliant mathematician, Pitts; together they published two papers of profound influence on this emerging mode of thinking. The titles of these papers almost give away their content: “A Logical Calculus of the Ideas Immanent in Nervous Activity” (6), written in 1943, and “How We Know Universals: The Perception of Auditory and Visual Forms” (7), published in 1947. Then von Neumann’s fascination with seeing a parallelism of the logical organization of computations in nervous tissue and in constructed artifacts (8) brought him close to McCulloch (9) and the people around him. The underlying logic of these various ideas and concepts was the topic for 10 seminal conferences between 1946 and 1953, bringing together mathematicians, biologists, anthropologists, neurophysiologists, logicians, and so on, who saw the significance of the notions that were spelled out in the title of the conferences: “Circular Causal and Feedback Mechanisms in Biological and Social Systems” (10). The participants became the catalysts for the dissemination of cybernetic concepts into the everyday vernacular (e.g., “feedback”), for epistemological inquiries regarding mentality, and of course “mentality in machines” (11). Should one name one central concept, a first principle, of cybernetics, it would be circularity: circularity as it appears in the circular flow of signals in organizationally closed systems, or in “circular causality,” that is, in processes in which ultimately a state reproduces itself, or in systems with reflexive logic as in self-reference or self-organization, and so on.
Today, “recursiveness” may be substituted for “circularity,” and the theory of recursive functions (see Recursion), calculi of self-reference (qv) (12), and the logic of autology (13), that is, concepts that can be applied to themselves, may be taken as the appropriate formalisms.
Consider again systems with a functional organization whose operations diminish the discrepancy between a specific state and a perturbation. The system’s tendency to approach this specific state, the “goal,” the “end,” in Greek τέλος (hence “teleology”), may be interpreted as the system “having a purpose” (14). The purpose of invoking the notion of “purpose” is to emphasize the irrelevance of the trajectories traced by such a system en route from an arbitrary initial state to its goal. In a synthesized system whose inner workings are known, this irrelevance has no significance. This irrelevance becomes highly significant, however, when the analytic problem (the machine identification problem) cannot be solved, because, for instance, it is transcomputational (15) in the sense that with known algorithms the number of elementary computations exceeds the age of the universe expressed in nanoseconds. Hence, the notion of purpose can become effective when dealing with living organisms whose goals may be known but whose behavioral trajectories are indeterminable. Aristotle juxtaposes the “efficient cause,” that is, when “because” is used to explain the flow of things, with the “final cause,” that is, when “in order to” is used for justifying actions. In the early enthusiastic stages of cybernetics, language appropriate for living things, like desires, wants, ethics, thought, information, mind, and so on, was sometimes used in talking about synthesized behavior.
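The scale of the transcomputational barrier can be checked with a back-of-the-envelope calculation. The figures below, a machine of 100 two-valued elements and a universe roughly 13.8 billion years old, are illustrative assumptions, not drawn from the article or from Bremermann's original bound:

```python
# Age of the universe in nanoseconds (assuming ~13.8 billion years).
SECONDS_PER_YEAR = 365.25 * 24 * 3600
age_ns = 13.8e9 * SECONDS_PER_YEAR * 1e9   # roughly 4.4e26 ns

# Exhaustively identifying a machine built from a mere 100 binary
# elements means distinguishing among 2**100 possible configurations.
candidates = 2 ** 100                       # roughly 1.3e30

# Even at one elementary computation per nanosecond since the
# beginning of the universe, the search could not be completed.
print(candidates > age_ns)  # True
```

This is why the behavioral trajectories of even modestly sized systems can be indeterminable in practice, while their goals remain knowable.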
Traces of this are found today in terms like “computer memory,” “processing of information,” “artificial intelligence,” and so on. The fascination with “bio-mimesis,” that is, “imitating life,” keeps the present-day followers of Aristotle searching for a synthesis of aspects of mentation by using the powers of the large mainframe computers. On the other hand, the analytic questions “what is mind?” and “whence ideas?” in the Platonic sense keep cyberneticians searching for principles of computation and logic underlying sensorimotor competence, thought, and language.
Although in the early phases of this search the notion of purpose appeared in many studies of these processes, it is significant that a completely purpose-free language can be developed for the same type of systems by paying attention to the recursive nature of the processes involved. Of interest are circumstances in which the dynamics of a system transforms certain states into these very states, where the domain of states may be numerical values, arrangements (arrays, vectors, configurations, etc.), functions (polynomials, algebraic functions, etc.), functionals, behaviors, and so on (16). Depending on domain and context, these states are in theoretical studies referred to as “fixed points,” “eigenbehaviors,” “eigenoperators,” and lately also as “attractors,” a terminology reintroducing teleology in modern dress. Pragmatically, they correspond to the computation of invariants, be they object constancy, perceptual universals, cognitive invariants, identifications, namings, and so on. Of course, the classical cases of ultrastability and homeostasis should be mentioned here (17).
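The recursive, purpose-free description of such stable states can be illustrated with a classic fixed-point iteration. The cosine operator below is an arbitrary illustrative choice, not one named in the article: applying an operator again and again to its own output converges, when the orbit is convergent, to a state that the dynamics transforms into that very state.

```python
import math

def eigenbehavior(operator, x0, iterations=100):
    """Recursively apply an operator to its own output; a convergent
    orbit ends at a fixed point, a state the operator maps to itself."""
    x = x0
    for _ in range(iterations):
        x = operator(x)
    return x

x_star = eigenbehavior(math.cos, 1.0)
# The fixed point reproduces itself under the operator: cos(x*) == x*.
print(abs(math.cos(x_star) - x_star) < 1e-9)  # True
```

Note that no goal is mentioned anywhere in the code; the "attractor" at roughly 0.739 is simply what the recursion stabilizes on, regardless of the starting value.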
In thermodynamically open systems a significant extension of circularity is closure, either in the sense of organizational closure as, for example, in the self-organizing system, or in the sense of inclusion as, for example, in the participant observer. Self-organizing systems are characterized by their intrinsic, nonlinear operators (i.e., the properties of their constituent elements: macromolecules, spores of the slime mold, bees, etc.), which generate macroscopically (meta-)stable patterns maintained by the perpetual flux of their constituents (18). A special case of self-organization is autopoiesis (19). It is that organization which is its own Eigen-state: the outcomes of the productive interactions of the components of the system are those very components. It is the organization of the living, and, at the same time, the organization of autonomy (20). The notion of “organization” carries with it that of order and then, of course, of disorder, complexity, and so on. It is clear that these notions are observer dependent, hence the extension of cybernetics from observed to observing systems and with this to the cybernetics of language (21). Here language is thought to be precisely that communication system that can talk about itself: a language must have “language” in its lexicon. Autology is the logic of concepts that can be applied to themselves (13). Among these are consciousness and conscience: Their corollaries, epistemology and ethics, are the crop of cybernetics.
[1] M. Zeleny, “Cybernetics and general systems: A unitary science?” Kybernetes 8(1), 17-23 (1979).
[2] O. Mayr, The Origins of Feedback Control, MIT Press, Cambridge, MA, 1969.
[3] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, Wiley, New York, 1948.
[4] R. Conant (ed.), Mechanisms of Intelligence: Ross Ashby’s Writings on Cybernetics, Intersystems Publications, Seaside, CA, 1981.
[5] A. Rosenblueth, N. Wiener, and J. Bigelow, “Behavior, purpose and teleology,” Philos. Sci. 10, 18-24 (1943).
[6] W. S. McCulloch and W. H. Pitts, “A logical calculus of the ideas immanent in nervous activity,” Bull. Math. Biophys. 5, 115-133 (1943).
[7] W. Pitts and W. S. McCulloch, “How we know universals: The perception of auditory and visual forms,” Bull. Math. Biophys. 9, 127-147 (1947).
[8] J. von Neumann, The Computer and the Brain, Yale University Press, New Haven, CT, 1958.
[9] J. von Neumann, The General and Logical Theory of Automata, in L. A. Jeffress (ed.), Cerebral Mechanisms in Behavior: The Hixon Symposium, Wiley, New York, pp. 1-41, 1951.
[10] H. von Foerster et al., Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems, Proceedings of the Sixth, Seventh, Eighth, Ninth, and Tenth Conferences on “Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems,” 5 vols., The Josiah Macy Jr. Foundation, New York, 1950-1955.
[11] D. M. MacKay, Mentality in Machines, in Proceedings of the Aristotelian Society, Supplement 1952, pp. 61-86, 1952.
[12] F. J. Varela, “A calculus for self-reference,” Int. J. Gen. Syst. 2, 5-24 (1975).
[13] L. Lofgren, Autology for Second Order Cybernetics, in Fundamentals of Cybernetics, Proceedings of the Tenth International Congress on Cybernetics, Association Internationale de Cybernetique, Namur, pp. 17-23, 1983.
[14] G. Pask, The Meaning of Cybernetics in the Behavioral Sciences (The Cybernetics of Behavior and Cognition: Extending the Meaning of “Goal”), in J. Rose (ed.), Progress of Cybernetics, Vol. 1, Gordon and Breach, New York, pp. 15-44, 1969.
[15] H. J. Bremermann, Algorithms, Complexity, Transcomputability, and the Analysis of Systems, in W. D. Keidel, W. Haendler, and M. Spreng (eds.), Cybernetics and Bionics, R. Oldenbourg, Muenchen, pp. 250-263, 1974.
[16] H. Ulrich and G. J. B. Probst (eds.), Self-Organization and Management of Social Systems, Springer, New York, 1984.
[17] W. Ross Ashby, An Introduction to Cybernetics, Chapman & Hall, London, 1956.
[18] P. Livingston (ed.), Disorder and Order, Stanford Literature Studies 1, Anma Libri, Stanford, 1984.
[19] H. R. Maturana and F. J. Varela, Autopoiesis and Cognition, D. Reidel, Boston, 1980.
[20] F. J. Varela, Principles of Biological Autonomy, Elsevier North-Holland, New York, 1979.
[21] H. R. Maturana, Biology of Language: The Epistemology of Reality, in Psychology and Biology of Language and Thought, Academic Press, New York, 1978.
General References
K. Gunderson, Cybernetics, in The Encyclopedia of Philosophy, Macmillan, New York, Vol. 2, pp. 280-284, 1972.
B. P. Keeney, Aesthetics of Change, Guilford, New York, 1983.
W. S. McCulloch, Embodiments of Mind, MIT Press, Cambridge, MA, 1965.
W. T. Powers, Behavior: The Control of Perception, Aldine, Chicago, 1973.