CEPA eprint 2942

Embodiment of natural and artificial agents

Etxeberria A. (1998) Embodiment of natural and artificial agents. In: Van de Vijver G., Salthe S. N. & Delpos M. (eds.) Evolutionary systems: Biological and epistemological perspectives on selection and self-organization. Kluwer, Dordrecht: 397–412. Available at http://cepa.info/2942
Table of Contents
Introduction
Life and cognition
First problem: Causal systems compute
Second problem: Organisms are adaptations
Third problem: Cognition is an informational process
Internalist and externalist embodiment
Representations in evolution
Acknowledgements
References
Introduction
The term embodiment suggests a return to the body (or to a physical or perceivable realm) of something that was (but should not be) previously separated from it. This phenomenon can be found in a wide range of contexts: for example, abstract entities, such as computer programmes, may acquire dynamics when executed in material devices; theoretical ideas can become operative when put in relation to practical or contingent situations; or, similarly, when considered as properties of bodies (including brains), mental capacities recover a physical nature. The return we refer to has an explanatory character: it is motivated by the assumption that embodiment may throw light upon areas where disembodied explanations are unsatisfactory. Many scientific and philosophical traditions have postulated privileged realms (e.g. Platonic worlds), deprived of materiality, dynamics, interactions or praxis, for explanation, but they prioritise “knowing that” over “knowing how” (Ryle, 1949) and may thus side-step the more complex problems. This is why it is important to explore a differently motivated epistemology, one able to approach phenomena in their original embodied situations. A claim for embodiment would then not be a demand for restitution, but an urge to start from the beginning, from the things themselves.
A proper treatment of embodiment stems from a consideration of the role played by the organization and the physical structure of the body in cognitive processes. Therefore, the task of building artificial systems does not have only a practical goal; it is also a way to evaluate our understanding of the phenomena our models reflect. A motivation underlying all the sciences of the artificial is to test our models by reproducing the complex phenomena they refer to. In the case of natural agents, embodied explanations require a consideration of different stages or scales. Organisms are self-organizing physical things; they may reproduce; they are homeostatic; they have experiences (for example, pain and pleasure help them protect their integrity); some of them can move to find food, escape from predators or look for mates; some can guess what the behavior of others will be and devise strategies accordingly; etc. Many of these capacities are essential to understand life, but it is a matter of inquiry which of them, and to what extent, are important to explain the origin of minds or of cognition. In the case of artificial systems, a proper notion of embodiment poses additional problems related to how to handle complexity. So far, there is no procedure to build living beings and it is doubtful that such a specification will ever be obtained. Our ideas about how possible artificial living creatures would look or behave are fuzzy. While embodiment is relevant to understanding the meaningful interactions of natural systems, it is crucial when we try to build artificial systems that present the same capacities.
An underlying question of this paper will be whether embodiment facilitates the appearance of cognition at different levels, including the higher ones, or whether it is just a set of constraints that shape the behavior of real material systems but could be neglected in formal models. Our argument starts by considering the nature of cognition with respect to life, and the next three sections examine three questions basic to characterizing the approach of embodied cognition in an evolutionary frame. The last two sections identify two different ways to understand embodiment, an externalist and an internalist one, and briefly discuss the role of representations in cognitive processes from an evolutionary point of view.
Life and cognition
Many attitudes to understanding cognition can be classified into two loose groups: one that takes a global organismal perspective and one that focuses on specifically functional organs. In the first case, all cognitive capacities are more or less sophisticated ways to structurally change the organism or the environment so as to articulate the system-environment transaction and possibly improve the viability of the system or the quality of experience. In turn, an approach that focuses on organs – like sensors or the nervous system – will try to identify the components of the relation that act as vehicles of function or meaning: the symbols or representations. The difference between the two attitudes might imply the need to accept complementary ways of explaining a process as complex as the embodiment of cognition. We say more about this in the last section.
In both cases, a naturalist notion of what cognition is depends on judgements of when cognitive phenomena start in the evolution of life on earth or, in the case of artificial systems, on an incremental scale of complexity. From certain viewpoints, cognition already exists in single-celled organisms; others think that it only appears with the development of the nervous system; and still others require the origin of language in hominisation. The definition of what counts as a cognitive process changes in each case.
In recent times there has been some controversy over whether biologically inspired epistemologies can distinguish life and cognition in a consistent way. Defences of the view that they are basically the same process arise from at least two different motivations. One of them, evolutionary epistemology, contends that evolution is the only source of novelty for organisms, and that therefore no cognitive gain takes place in ontogenetic time (for an example, see Heschl, 1990). If the origin of minds in living systems has an adaptive value, then mental capacities, like the other traits, should be a result of evolution by natural selection. A similar position is held by evolutionary psychology. Adaptationist approaches to this question hold externalist views of cognition in which natural selection is the only force considered relevant to guarantee the cognitive adaptedness of the agent.
The enactive approach to cognitive phenomena (Varela et al., 1991; Stewart, 1992) maintains a similar position with respect to life and cognition, starting from a very different background and with a radically different motivation. From this perspective, life is self-production (autopoiesis) in the form of a dynamical organization of component production that defines an identity by separating itself from the environment. This organization is already cognitive, as are others such as the immune system (Varela et al., 1987). If it is the autonomy of the global behavior of the system that makes it unlike inanimate things – for example, if its behavior cannot be characterized by straightforward input-output mappings – then this autonomy implies the emergence of cognitive-like phenomena, for example the attribution of meaning to some of the perturbations detected in the environment. Therefore, even if different kinds of cognitive phenomena can be defined, they all start from and are grounded in the same autonomy.
From another perspective, however, life and cognition seem to be distinguishable according to the properties of different processes that take place at different time scales within the same global system. This point of view maintains that the biological organization pre-exists any cognitive capacity and is a pre-condition for its appearance (Moreno & Etxeberria, 1992). Cognition then deserves a special treatment: the biological basis of the organism adapts to its environment via evolutionary processes (variation and natural selection acting on genotypes), unlike the cognitive capacity, which depends on ontogenetic processes of maturation and learning at the level of the nervous system.
Following a classification by Godfrey-Smith (1994), the first two positions vindicate a strong continuity between life and cognition, saying either that life is a condition of mind (evolutionary epistemology) or that to live implies capacities that are intrinsically cognitive (enactive approach), while the third sees only a weak continuity: life is a condition for mind, but mind implies more than just life. All of them embrace a methodological continuity, because in each case generic properties of life are considered important to explain cognition.
In the following sections three questions important to characterizing an embodied approach to cognition are examined. All three somehow legitimize a conceptual perspective that plays down or suppresses the need for embodiment in explanations of the nature of cognitive systems. The issues to analyze are:
1. The nature of the physical systems able to embody cognitive processes. The relation between causality and computation (between the material and the formal, or the ontology and the description) is, of course, the central issue. A claim of isomorphism between causality and computation often appears in the literature, usually discussed in terms of whether computations can emulate physical processes. Here it will be posed in a slightly different way, by inquiring whether, and to what extent, causal processes embody computations. The notion of constraint appears to be a keyword.
2. Understanding complex properties of living systems as adaptations by natural selection. Explanations of the complex design of living beings as arising from a passive adaptation to the environment hide the question of embodiment, because most of the properties of organisms (bodies as well as minds) are studied from the point of view of the external forces that shape them.
3. The idea that a cognitive process consists mainly in an informational transaction with the environment, so that other considerations of this transaction, generically energetic ones, may be left aside. A consideration of the nature of sensors raises some interesting alternatives.
All three problems might help clarify epistemological assumptions underlying the adoption of disembodied epistemologies, which are obstacles to developing truly embodied ones. The problem of the origin and role of symbols or representations in cognition appears at the end of the paper.
First problem: Causal systems compute
An interesting locus to start the analysis of possible equivalences between formal and causal processes, and to examine artificial embodiments of living systems, is the Church-Turing thesis. It was based upon independent analyses of the general notion of an effective procedure proposed by A. Turing and A. Church in the 1930s. The thesis is a statement of the theoretical limits of computation, but, in some of its interpretations, it also constitutes a claim about embodiment. Originally, the thesis stated that there is no number-theoretic function which cannot be computed by a Turing machine (TM) but can be computed by some other effective procedure. Since then, other interpretations take it to represent an ultimate limit to all possible computations. Hence, it has been construed as applying to functions in general, as opposed to just the number-theoretic functions, but, more strikingly, it has also been maintained that its domain of application extends to the production of physical and/or mental phenomena. Turing himself argued that, because mental processes are procedural, anything that can be produced by a mental process can also be produced by a TM procedure. Finally, the thesis has also been applied to physical systems in general; the formulation with maximum extension would therefore be that any physical system, including the brain-mind, is an embodiment of some TM procedure. Here the original sense, that the TM can compute any function or that computation can emulate any process, is inverted. Now it states that nothing exists which is not TM computable, or that any physical process computes.
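To make the notion of an effective procedure concrete, the following minimal sketch simulates a Turing machine in Python. It is my own illustration, not a construction from the text: the successor machine, the blank symbol "_" and all names are assumptions chosen for the example.

```python
# Minimal Turing machine simulator: every step is a discrete, fully
# specified transition (state, symbol) -> (new state, write, move).

def run_tm(transitions, tape, state="start", halt="halt", max_steps=10_000):
    """Run a TM given as {(state, symbol): (new_state, write, move)}."""
    cells = dict(enumerate(tape))  # sparse tape indexed by position
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")  # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    span = range(min(cells), max(cells) + 1)
    return "".join(cells.get(i, "_") for i in span).strip("_")

# An illustrative number-theoretic function: the successor of a binary
# numeral. Scan to the right end, then add one with carry.
succ = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),  # 1 plus carry: write 0, carry on
    ("carry", "0"): ("halt", "1", "L"),   # absorb the carry
    ("carry", "_"): ("halt", "1", "L"),   # overflow: new leading 1
}

print(run_tm(succ, "1011"))  # 11 + 1 -> "1100" (12)
```

Swapping in a different transition table makes the same simulator emulate a different procedure, which anticipates the notion of structural programmability discussed below.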
Recently, the validity of the thesis for the extreme case of physical systems has been ruled out on the basis of the difference between formal and causal processes (Cleland, 1993). Causal processes, it is argued, are more powerful than TMs; therefore no formal process can perfectly simulate or duplicate a physical system: “we have good reason to suppose that the computational capacities of causal processes exceed those of TMs and we have no idea as to the ultimate capacities of causal processes to compute functions” (Cleland, 1993, p. 287). In order to defend this point, Cleland proposes to consider well-defined procedures that cannot be realized by a TM: mundane procedures, those whose steps involve causal processes. If we compare formal procedures with everyday mundane procedures such as recipes, directions, etc., each step of the latter, and not of the former, initiates a causal process, and its effectiveness depends on it. For example, when following a procedure that indicates how to prepare a Hollandaise sauce, the success of the performance does not depend only on the actions taken (which are not necessarily sequential), but also on the fact that each of them initiates and sustains some causal process. Therefore, causal processes differ from TM actions in that the former, at least as they are ordinarily construed, are neither temporally nor causally discontinuous: “(…) their constituent parts (sub-processes) are not, as in the case of TM actions, thought to be temporally separate (…) Causal processes just do not seem to be procedure-like, (they are) ‘continuously evolving’ (…) ‘self-generating’ and ‘self-determining’” (ibid., p. 296).
I basically agree with this point, insofar as it implies that formal processes may be less powerful than causal processes. Yet the whole discussion seems to avoid an important problem, for it remains unclear who performs the computation and, in general, in what sense the term computation is used, i.e. whether it is a human mental operation or an objective real process of natural systems. For example, it is said: “if, as many believe, there exist genuinely continuous causal processes, then the capacities of actual physical systems to compute (i.e. mirror the structure of) functions exceeds that of TMs (…) we have no idea as to the ultimate limits of causal processes to mirror precisely the structure of functions” (ibid., p. 309). For Turing (1936-37) the action of computing was performed by a person carrying out a calculation. When computing machines appeared, and as their operation emulated the calculus specified by formal procedures (such as those of the TM), the machine itself could be interpreted as physically performing the computation. But how far can physical systems in general be interpreted as computing? No analysis of how causal processes can manifest by themselves as computations is given in Cleland’s paper. The problem is not trivial, though; some authors have considered it to be the core of epistemology: the origin of the modelling relation in which a formalism results from an abstraction of certain causal processes as action steps (see, for example, Rosen, 1985).
For the operation of a physical machine to be equivalent to the calculus (to the formal machine), the causal processes have to be fully constrained, to the extent that each step of the causal process corresponds to a step in the procedure. In effect, the causal processes of physically computing machines are completely constrained, so that all their operations are isomorphic with the steps required by the formal procedure. No knowledge of the causality of the physical system is required to calculate the next state. If it is a special-purpose machine, it may only emulate one specific formal process; if it can potentially emulate any procedure, then it is said to be capable of universal computation. The latter requires the physical system to be structurally programmable (Conrad, 1987), so that it can be modified to adjust the causal processes of the physical device to any formalism. In this case, the constraints are such that they can be specified as a procedure, so that any causal change of state of the physical device always corresponds to a logical step. As a consequence, the physical and the formal aspects of a computing device never interact; they are isolated, except when errors take place. And this constitutes a very special relation indeed. Conrad (1987, 1989) enunciates a trade-off principle between the programmability and the evolvability of a system: structurally programmable systems are not robust in the face of errors, whereas evolvable systems must be capable of integrating slight variations and perhaps taking advantage of them.
At this point, when the initial distinction between a TM procedure and a mundane procedure is revisited, it is readily manifest that embodiments of TMs share the causal characteristics associated with mundane procedures. Nothing can happen in the world that is not causal; therefore the difference between the two, as embodiments, cannot be located in the fact that one involves causal processes while the other does not. Yet we still find a difference between the effectiveness of a computer programme that calculates the identity function, for example, and a recipe for preparing Hollandaise sauce. The difference may have to do with the degree of constraint we would require to artificially emulate a process and with the characteristics of the material substrate that is being constrained. Even if we eliminate the human component of the mundane procedure and consider the operation of a fully automated machine or a robot, capable of performing all the actions specified by the recipe in a highly skilful way, the whole procedure seems to rely on causal processes in a way that the computer does not. The causal structure of a computing device performing a computation can be controlled by technology to an extent that the process of mixing egg yolks with butter to make a Hollandaise sauce might not. As regards being causal processes, then, the difference is only one of degree, not of principle: both are causal processes. Yet the degree of constraint remains an important difference, one that should preclude any affirmation that physical systems “compute” in some general way. Their behavior may be more or less accurately formalized, even emulated by a computational simulation, but we should not speak of computation in the case of a physical system unless we can specify how its behavior is constrained by a formal procedure.
Second problem: Organisms are adaptations
To understand the origin of the constraints of living systems in evolutionary theory, we should refer to discussions about externalism and internalism. Both positions, that organisms are shaped by the environment (externalist) and that adaptivity results from processes within the organism (internalist), can be found in the history of biology (Godfrey-Smith, 1994). Evolutionary materialists, such as Lamarck, emphasize the capacity of living matter to complexify and, thus, to adapt to environmental changes. For Lamarck, increases of complexity are inevitable in nature; their progression is disturbed only by the effects of the environment, and only the adaptability of living systems to different environments can account for departures from the orderly progression. However, others, like for example Spencer (who was otherwise a follower of Lamarck), are externalists. Spencer proposed a general law of evolution that applies to the evolution of solar systems, planets, species, individuals, cultural artefacts, and human social organizations. According to it, there is a universal course of change from a state of indefinite, incoherent homogeneity towards a state of definite, coherent heterogeneity. These ideas were influenced by theories of development and thermodynamics. From this perspective, all increases in complexity are produced by changes in the environment: nothing would happen if the environment remained constant and simple. Natural systems change from a state of little differentiation of parts and little concentration of matter towards a state in which there is a variety of clearly distinguishable parts, where the individual parts differ from each other and are densely structured as an adaptation to the conditions of the environment (ibid., p. 80).
A naturalist view of the origin of cognitive processes regards evolution as the source or process in which the constraints regulating the causal processes of living beings are generated. Externalism is criticized by those who consider that knowledge and cognition need to be explained starting from life, in order to overcome dualistic accounts of the relations between mind and world and artificial separations between them (Godfrey-Smith, 1994). For example, Dewey and other “progressive Darwinians” regarded the organism plus the environment as constituting a single system. Organic activities tend to preserve the pattern of activities between organism and environment, rather than just the organism itself. Cybernetics can be counted among those approaches that focus on this self-maintaining perspective on life: first-order Cybernetics maintained an externalist approach similar to that of Spencer, but second-order Cybernetics was concerned with the identity of artificial systems and preferred internalist ideas, an example being the theory of autopoiesis (Maturana & Varela, 1992).
Evolution by natural selection explains how variations among organisms give some of them advantages to survive or have offspring in certain environments. Selected traits are adaptations, evolved because they provided advantages to their bearers under their environmental conditions (Burian, 1982). The simplified understanding of the evolutionary process as driven solely by natural selection, exemplified by optimality models, is usually called adaptationism (Gould & Lewontin, 1979). The assumption underlying adaptationism is that the properties of organisms can be explained as a sum of atomic features, each of them an optimization resulting from natural selection acting on alternative configurations.
Understanding organisms as a sum of atomic features is problematic in at least two senses. One is that, if the evolution of each feature is not independent of the others, then the constraints imposed by the interactions should be taken into account. This view demands expanding the theory of evolution to take into account evolutionary forces other than natural selection, as well as historical events. Two types of constraints, constraints on adaptation and constraints on the appearance of form (Amundson, 1994), should be added to the standard models of population genetics to obtain a richer image of evolutionary processes. The view of evolution as a process of natural selection of each relevant trait fails to take into account evolutionary forces different from natural selection, such as genetic drift or pleiotropy, which are constraints on adaptation. On the other hand, the influence of developmental processes on evolution requires models of the capacity of matter to self-organize. Natural selection acting on genetic variation is not creative enough to account for the origin of form and organization of living systems (see Kauffman, 1993; Depew & Weber, 1995). Thus, a consideration of both the historical and contingent processes and the intrinsic self-organizing capacities of living systems can expand the explanatory resources of evolutionary biology. Not every force in evolution is natural selection, and not everything on which selection acts is random variation.
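That not every force in evolution is natural selection can be pictured with a toy simulation. The following sketch, mine and not drawn from the text, implements a haploid Wright-Fisher model in Python: in a small population, random drift can fix an allele even though selection acts against it. Population size, selection coefficient and all names are illustrative assumptions.

```python
import random

def wright_fisher(pop_size=50, freq=0.5, s=0.02, generations=2_000):
    """Toy one-locus, two-allele Wright-Fisher model.

    Allele A carries a small selective *disadvantage* s, yet in a small
    population drift alone can still carry it to fixation.
    Returns the final frequency of A (0.0 = lost, 1.0 = fixed).
    """
    for _ in range(generations):
        if freq in (0.0, 1.0):
            break
        # Selection: weight A by (1 - s) relative to the alternative allele.
        p = freq * (1 - s) / (freq * (1 - s) + (1 - freq))
        # Drift: binomial resampling of the next generation.
        freq = sum(random.random() < p for _ in range(pop_size)) / pop_size
    return freq

random.seed(1)
runs = [wright_fisher() for _ in range(200)]
print("A fixed despite selection against it in",
      sum(f == 1.0 for f in runs), "of 200 runs")
```

In a large population the same selection coefficient would almost always eliminate the allele; drift matters because the population is finite.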
Third problem: Cognition is an informational process
In recent work in Cognitive Science, criticisms have appeared against the functional decomposition of cognitive systems into three subsystems: the perceptual, the processing, and the motor. Most theories of cognition tend to focus mainly on the structure of the nervous system and ignore other aspects of the body. Often it is said that the biological system (the body at large) is energetically coupled to its environment, whereas the cognitive system (the nervous system) is informationally coupled. A different way of organizing the internal structure of the organism, according to principles inspired by the incremental organization of evolved systems, was proposed by Brooks (1990). Instead of dividing the work into functional modules, he proposes a behavioral decomposition in which modelling starts from layers of simple complete behaviors on top of which new, more complex behaviors can be added. Thus, behavioral decomposition is a step forward as regards the organization of the structure of a cognitive system, but it is still unsatisfactory as an account of the autonomy of the physical system. Living systems are autonomous because they behave in an articulated way without the interference of any creator; they separate themselves from the outside and internally define ways of transaction with the environment. The relation between energetic and informational (biological and cognitive) capacities is closely entangled. In the organization of perceptual information in living systems, the internal energetic state (hunger, fear, etc.) plays a role in the way sensory stimuli from the environment become relevant (or informational) for the organism. Also, many transactions with the environment involve mainly forces acting on the agent and the environment, rather than information (Smithers, 1994).
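Brooks’ behavioral decomposition, mentioned above, can be sketched as layered, complete sensor-to-motor behaviors with an arbitration rule deciding which layer drives the motors. The sketch below is a simplified priority scheme of my own, not Brooks’ wire-level suppression mechanism; the sensor fields and behavior names are illustrative assumptions.

```python
# Behavioral decomposition: each layer is a complete behavior mapping
# sensors to an action; the arbitration rule lets one layer pre-empt others.

def avoid(sensors):
    """Reflex layer: turn away from imminent obstacles."""
    if sensors["obstacle_distance"] < 0.2:
        return "turn_left"
    return None  # nothing to say; defer to another layer

def seek_light(sensors):
    """Higher layer: steer toward light when it is detected."""
    if sensors["light_direction"] is not None:
        return "steer_" + sensors["light_direction"]
    return None

def wander(sensors):
    """Default layer: keep moving."""
    return "forward"

# Arbitration: the first layer with something to say wins; the reflex
# layer keeps priority so added layers never disable basic competences.
LAYERS = [avoid, seek_light, wander]

def act(sensors):
    for layer in LAYERS:
        action = layer(sensors)
        if action is not None:
            return action

print(act({"obstacle_distance": 0.1, "light_direction": "right"}))  # turn_left
print(act({"obstacle_distance": 0.9, "light_direction": "right"}))  # steer_right
print(act({"obstacle_distance": 0.9, "light_direction": None}))     # forward
```

New behaviors are added as further layers without redesigning the existing ones, which is the incremental, evolution-inspired point of the architecture.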
There are different degrees of relative importance of energy and information in the sensory projection of organisms. This gradation depends on the specificity of the physical element with which the sensor binds or couples. Some types of interactions are made possible by sensors that bind or couple with a given domain detected only by energetic means. Others require more indirect relations in which the outcome is transformed into active information within the organism. Both kinds of couplings may improve the adaptedness of organisms or enrich their experience, but, in the first case, the sensor is specialized to detect specific substances (required by the organism, or ones it has to avoid), while in the second case the detection is only indirect and the detected domain may not be linked in a direct way with the needs of the organism.
Enzymes, which bind with very specific substrates, are an example of the first case. Many regulating factors affect them in subtle ways and determine the metabolic paths into which components enter. Nevertheless, the basic reaction is very specific: the recognition of a specific pattern. The sensors that help bacteria swim in favor of (or against) chemical gradients belong to this very class: protein receptors on the bacterial surface bind with the attractive (or repellent) substance and stimulate the locomotor system. Smell and taste sensors of mammals belong to the same category because, even if the mechanisms involved are more complex, the pattern recognition is highly specific, achieved through comparison and association with previously acquired patterns. For example, rats can eat very different foods and need a variety of different substances to maintain their metabolism, but they do not have a general procedure to help them distinguish beforehand what is convenient. Their method of recognition is a trial-and-error learning that involves the whole organism, not only a single perceptual mode. The procedure acquires specificity because rats try any new food in tiny amounts and, if they feel sick after they have eaten some particular substance, it will always be rejected in the future (Richter, referenced by Bonner, 1980).
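The bacterial case mentioned above can be pictured as a run-and-tumble rule: the receptor signal gates the motor, so the cell tumbles (re-orients at random) less often when the attractant concentration is improving. The following Python sketch is my own illustration; the concentration field, step sizes and tumble probabilities are all assumed numbers.

```python
import math
import random

def concentration(x, y):
    """Illustrative attractant field with a single source at the origin."""
    return math.exp(-(x * x + y * y) / 200.0)

def chemotaxis(steps=500):
    """Run-and-tumble: direct detection biases an otherwise random walk."""
    x, y, heading = 15.0, 0.0, 0.0
    last = concentration(x, y)
    for _ in range(steps):
        x += math.cos(heading)
        y += math.sin(heading)
        now = concentration(x, y)
        # Direct detection: the bound-receptor signal gates the motor.
        p_tumble = 0.1 if now > last else 0.5
        if random.random() < p_tumble:
            heading = random.uniform(0.0, 2.0 * math.pi)
        last = now
    return math.hypot(x, y)  # final distance from the source

random.seed(0)
print(f"distance from source: {chemotaxis():.1f} (started at 15.0)")
```

The walk tends to drift up the gradient even though no single step involves anything beyond the energetic coupling of receptor and substance.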
Indirect detection is a different case, because there we can find a deferred relation between the kind of binding the sensor realizes and the domain with which the organism interacts in a meaningful way. The organism detects spheres of significance after an internal elaboration of data and an interpretation that makes the interaction possible. For example, in many organisms vision does not enable a relation with light, but with something else. This case can be made clearer by using a borderline example: the primitive acquisition of an eye by a paramecium (Wächtershäuser, 1984). Some single-celled organisms can use light as nourishment, via photosynthesis, in which light is absorbed directly as energy. They present forms of phototropism: they can sense light and move towards it. This detection is not usually considered to be vision; it is a case of the mode considered above, of direct detection. Other organisms cannot feed on light directly, but by eating plants they indirectly obtain substances they cannot produce themselves, including vitamins, which enable them to be sensitive to light and to develop eyes. Thus, in evolutionary terms, the search for light starts as a search for nourishment, not for information. Wächtershäuser reports the case of a paramecium, a single-celled creature, which feeds on green algae and uses one of them (a chlorella) both as food and, literally, as an eye. The paramecium sticks the chlorella on itself and maintains a symbiotic relationship with it: as an eye, the chlorella is used to steer the movement of the organism and, as a stomach, it gives back part of its light nourishment to its host. The chlorella couples energetically with the light, but, for the paramecium, the same chlorella is a visual sensor, an eye, used as a means to detect light and thus co-ordinate its locomotion.
This example shows the difference between coupling only with specific substances or specific patterns, and further elaborating the physical, causal effect that certain processes trigger in an organism. The highly evolved perceptual systems of animals can be confusing as to what it is to be a sensor or a perceptual system. As a consequence, perceptual operation is often supposed to be completely separated from the energetic/metabolic structure of the organism. This is not the case for those sensors responsible for direct detection.
In fact, not all highly complex perceptual systems share common signal-processing strategies, and this difference is related to the specificity of the information acquired by the organism. A comparison of the visual and olfactory systems of insects shows many differences between them: different neural processing strategies and different neural architectures (Osorio et al., 1994). Visual perception is influenced by the structure and statistical properties of optical signals, which are complex but highly constrained. Olfactory signals, on the other hand, carry less information; they are less constrained and predictable. Arbitrary patterns of excitation on the olfactory receptors have to be parsed and learnt, and then recognized against a complex background of smells. Vision is hard-wired, with neural circuits tailored to specific behaviors, while olfaction lacks the ordered and highly differentiated neural circuits used for vision and may use an associative network at an early stage to recognize patterns. This difference suggests a further difference in the functions the two systems accomplish. Probably the olfactory system has to recognize specific substances, while the main task of visual elaboration is not to recognize objects but to support different types of behaviors, for example locomotion. Vision has evolved to allow many different behaviors, some related to locomotion, others to the recognition of a variety of features (relevant for the organism but not always specifically directed to the recognition of objects), through an organism-specific elaboration of the properties of one single physical phenomenon: light.
As a consequence, perception cannot be reduced to simple informational tasks; many perceptual couplings are directly involved with the energetic maintenance of the organism. Both kinds of perception are important to understand how organisms engage in meaningful interactions with the environment. Vision provides detailed and coherent ways to interact with the surrounding world through locomotion; it is closely related to motion in the environment and to that of the agent itself. This specialization might have provided structures for the development of anticipatory behavior: motion detection requires high sensitivity to time constraints and perception of the relevant dynamical features of space. Consequently, it can be expected that neural structures have specialized to detect motion in relation to the movements that the agent itself must make to succeed in a moving world. This implies a great amount of embodiment of perception, so that the capacity for anticipation is a consequence of movement: not represented by the agent (in any kind of Cartesian map), but put in relation with the agent’s own movement. If perception is not mainly a matter of object recognition, anticipatory behavior could be explained as a dynamical coupling that tunes the movement of the body to the moving world. This coupling or tuning does not involve representing the world as it will be in the future, but perceiving the movement of the world in relation to one’s own and acting accordingly.
Internalist and externalist embodiment
The moment has now arrived to summarize the consequences of the three problematic aspects considered so far, in order to develop what I would call an epistemology of embodiment for cognition. All three legitimize conceptual perspectives that avoid or suppress the need for embodiment in explanations of the nature of cognitive systems, even in the field of Artificial Life (ALife). ALife is a discipline that studies life by emulating living phenomena in artificial media; it is a “distinctive attempt to explain evolutionary and adaptive systems, including (ultimately) the phenomena that we group together with labels such as intelligence, mind, and cognition” (Wheeler, 1995, p. 65). Some of the features that make it distinctive are its synthetic flavor (opposed to the analytic approach of Artificial Intelligence), the use of evolutionary methods to construct individual cognitive agents or complete artificial worlds, and the attention paid to emergence. Many ALife-related theories of mind and cognitive phenomena are motivated by anti-Cartesian ideas and try to develop embodied models. Nevertheless, many problems, some of them conceptual and related to the aspects rehearsed before, make this task very difficult.
The discussion of these points brings us back to the question of the two ways of facing embodiment: as a restitution or as a start from the beginning. ALife models of cognitive processes have understood embodiment more as a restitution, developing a notion of embodiment that does not require considering living properties in order to understand cognition. Externalism and internalism are polysemic conceptual dichotomies used in several different contexts (one classification appears in Oyama, 1992). So far, the forms of embodiment tried in ALife can be considered externalist in many of these senses.
With respect to the dichotomy causality-computation, a consideration of the complexity of living systems may be a good starting point. Complexity, it is well known, is difficult to measure. However, a notion that seems useful for our purposes is Bennett’s concept of logical depth, defined as “the length of the logical chain connecting a phenomenon with a plausible hypothesis (e.g. the minimum description) explaining it” (Bennett, 1988). From this perspective, complexity would depend on the time it necessarily takes to construct a phenomenon. Complex entities should obey a “slow growth law” (this eliminates many trivial complexities such as smashed glass), and even if it takes a long time to form a given substance in the earth, if it could be made more quickly using shortcuts (as in a laboratory) it would not be complex (see Salthe, 1993, p. 4). Similarly, a minimum description will be complex in this sense if it takes a long time to boil it down from a larger description, and it would describe something complex if it took a long time to synthesize it using the minimum description as instructions.

Can this notion be related to our discussion on embodiment? The machine metaphor has been an inspiration to explain living beings. Polanyi (1968) described the design of machines in this way: “the machine as a whole works under the control of two distinct principles. The higher one is the machine’s design, and this harnesses the lower one, which consists in the physical-chemical processes on which the machine relies (…) the constructor of a machine restricts nature in order to harness its workings” (p. 1308). Machine constraints are built with clear design and, in general, the restriction they exert on motion or change (on the causal processes) is well known (at least while the system works as expected). The complexity of the organization or design of organisms could now be measured according to the constraints limiting or directing underlying causal processes: are these easy to simulate or emulate? If living systems could be viewed as very complex cases of mundane procedures, it would imply that there exist ways in which causal processes can be artificially orchestrated to give rise to living phenomena. Yet the problem rests in the notion of procedure itself: the causal processes of living systems do not follow fixed functions; they vary according to many regulating processes which take place at different levels. These may not be amenable to short description; moreover, it may not be possible to impose them externally on the causal system. An artificial origin of life requires, then, a synthetic method to discover experimentally how to instantiate different causal processes (to discover a mundane procedure) and also a capacity or criterion to acknowledge or recognize the living aspect of the resulting system (Stengers, 1995). As a consequence, to say that living processes embody computations is misleading. Further, computer simulations can produce evolution faster than real evolution, but the complexity of evolution should not be trivialized to the point at which important stages and factors are taken for granted.
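As an aside, Bennett’s notion of logical depth, quoted above, admits a standard formal statement. The rendering below is mine; the symbols U, T and K are not the paper’s notation.

```latex
% Logical depth of a string x at significance level s (after Bennett, 1988):
% the least running time of any program for x within s bits of a minimal one.
\[
  \mathrm{depth}_{s}(x) \;=\;
  \min \bigl\{\, T(p) \;:\; U(p) = x,\; |p| \le K(x) + s \,\bigr\}
\]
% Here U is a universal machine, T(p) is the running time of program p on U,
% and K(x) is the length of a shortest program producing x (the minimum
% description). Deep objects are those that no short program produces quickly.
```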
Externalism also appears in ALife as adaptationism. Anti-adaptationists claim that the theory of evolution should expand its explanatory scope to consider other evolutionary forces such as pleiotropy, allometry or historical constraints. Thus, the dispute between adaptationists and anti-adaptationists involves a methodological aspect: deciding which factors must be included in evolutionary models (Sober, 1993). If factors omitted by adaptationists (history, self-organization) are important, then adaptive forces will not be sufficient to explain evolution. This is an important question when properties of life are to be emulated in artificial media, as in ALife. Until now, most of the artificial models developed to study the evolution of behavior in ALife have had an adaptationist character: their structure mimics the evolutionary explanations of adaptation by natural selection. The reasons for this are diverse: on the one hand, ALife models are often based on research described in textbooks of behavioral ecology, which are clearly adaptationist; on the other, the simplest way to model artificial evolution implies fitness evaluations dependent on objective functions, and thus experiments cannot include phenomena independent of the externally imposed selection.
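The adaptationist shape of such models shows up in the canonical form of a genetic algorithm, where an externally imposed objective function plays the role of the environment. The following is a generic sketch under assumed names and parameters, not a reconstruction of any particular ALife model.

```python
import random

def mutate(genome, mu):
    """Flip each bit independently with probability mu."""
    return [1 - g if random.random() < mu else g for g in genome]

def evolve(fitness, genome_len=20, pop_size=40, generations=100, mu=0.02):
    """Adaptationist sketch: selection is driven entirely by an externally
    imposed objective function; nothing can evolve independently of it."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        pop = [mutate(random.choice(parents), mu) for _ in range(pop_size)]
    return max(pop, key=fitness)

# The "environment" is nothing but this function, fixed by the experimenter:
# here, maximize the number of 1s in the genome ("one-max").
random.seed(0)
best = evolve(fitness=sum)
print(sum(best), "ones out of 20")
```

Whatever is not measured by the objective function simply cannot matter to such an experiment, which is the externalist limitation discussed above.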
As regards the third discussion: because it is considered important that the behavior of artificial agents arise from perception-action cycles, and not be imposed by the programmer, agents are situated in real or simulated environments, so that their actions depend on the perceived state of the environment. Still, they are not autonomous in the strong sense. Simulated agents or robots are built as special-purpose devices, able to measure the relevant parameters to control certain degrees of freedom of the system from the very situation in which the system finds itself (and not from the point of view of an external observer). In this way the outputs of sensors causally influence the motor devices. This strategy can produce situatedness, but can it generate autonomous cognition? In these ALife cases, unlike in organisms, there is no relation with the physical functions responsible for the creature’s capacity to “stay alive”. If, on the contrary, an internalist approach grounded in a notion of biological organization were attempted, situatedness would emerge as a function of the need to energetically maintain the organism, from those processes that endow the agent with identity (i.e. the self-organizing processes that maintain it as a physical system in a physical domain). Thus, behavior would be a manifestation of the self-maintaining structure, and meaningful interactions would emerge from the whole pattern of system-environment relations. In sum, for an externalist perspective a coupling of perception and action might be enough, but radical embodiment demands a broader consideration of the body and its evolutionary history. This type of embodiment is a property of causal, evolvable systems and cannot be achieved by restituting certain capacities to physical devices that are per se devoid of them; embodiment has to be investigated from the very start in the material properties of evolvable systems.
Representations in evolution
There is still a point about the embodiment of cognitive processes that is central to this discussion and has received considerable attention in the literature: the problem of the vehicles of meaning, of the reasons for ascribing it to specific carriers (symbols or representations), and of the relation those carriers maintain with the organism as a whole. Even when meaning is considered an emergent phenomenon, emergence can be studied either at the level of the whole activity of the organism or at the level of the perceptual system itself. The latter requires studying cognition as an activity that involves representations.
In Cognitive Science, the role of internal representations in cognitive processes has raised long debates. Sometimes research has urged grounding the meaning of representations, or finding different explanations of their nature and function; at other times, precluding their use for at least some cognitive tasks. One position that denies the relevance of internal representations comes from research in adaptive behavior, where roboticists claim that robots do not have to be provided with representations in order to perform simple tasks such as navigation. Another comes from criticisms of the Cartesian mind-body duality and its need for a subject-object distinction to explain cognition (for example, Varela et al., 1991). Both allege that representationalism fails to adequately explain phenomena that should be embodied.
Many of the activities that humans and some animals perform consist in the production of representations of the world, with a great variety of purposes: to show the location of things, to alert others, to escape from predators, etc. These are, of course, external representations. We are able to produce representations in a meaningful relation with the world, and our activities require them. Yet meaning itself arises from many other bodily functions, like, for example, the reflections and analogies that can be drawn from the fact that different individuals share the same body structure (Sheets-Johnstone, 1990).
There have been attempts to clarify the nature of internal representations. For example, Clark and Toribio (1994) have tried to do away with many preconceptions of what representation is, namely, that representations are explicit, that they can be manipulated literally as a text, and that they bear familiar intuitive concepts. Thus, they suggest that “it may be fruitful to stop thinking in terms of a dichotomy (representation/no-representation) and instead imagine a continuum of cases” (p. 403). In this continuum, they are ready to concede that no representation is needed “insofar as the bulk of our cognitive activity is (…) defined in terms of ‘perceiving’ and ‘acting’” (p. 426), while the need for representations starts when the system must dilate and compress (select and manipulate) the input space, and is definite when this job is realized by a “systematically related body of intermediate representations” (p. 427). These authors are probably thinking of a continuum of cases within a single organism (probably from the animal kingdom), but it is, of course, possible to analyze the problem starting with all biological beings, and maybe even earlier.
Some of the stages in the evolution of cognitive systems on Earth might have been the following. The first organisms were protocells formed by self-organizing processes, which could not reproduce reliably. The origin of life would imply the slow stabilization of a type of code for self-reproduction and a semantic closure between dynamic and symbolic components to define an identity (Pattee, 1982). The origin of sex would bring about an exchange of genetic material and, in some species, the appearance of dimorphism and reproduction from two parent cells instead of cell division. The origin of pluricellular organisms and the development of the germ/soma duality would be a further step, after which, with the development of the nervous system, animals capable of producing and interpreting external signs would appear. In hominids, natural language evolves as a system of signs shaped by bodily thinking (Sheets-Johnstone, 1990). Finally, formal symbol systems appear as human productions which, insofar as they can be implemented in different materialities and give rise to different embodiments, constitute the most sophisticated natural example of arbitrary symbols. The growth of complexity in evolution can thus be seen as a process of externalization or disembodiment of symbols at different levels, in the form of a transformation of causal relationships into relations mediated by symbols. This process has been called the epistemic cut (Pattee, 1995). This perspective can explain how symbols appear in natural systems; it constitutes an evolutionary way to overcome or dissolve the world-mind (or matter-symbol) duality. The continuous evolutionary process, extended in time, helps to understand levels or scales of complexity without drawing strict discontinuities between different types of systems. In this way, complexity can be given a materialistic explanation, for dualism is often a consequence of conceiving matter as inert and signs as “informing” entities. Organisms are not passive objects of descriptions in terms of symbols; they are active producers of them (Emmeche, 1994).
For example, in Cariani’s (1992) study of the semiotic properties of different subsystems of a cognitive system, sensors transform continuous and non-symbolic physical properties into discrete tokens, due to causal processes taking place in the device. Sensors are naturally occurring measuring devices that produce this transformation from causal processes to signs or symbols (Pattee, 1973). However, the sensors of living systems are not fixed: they can produce a variety of transformations and participate in a number of system-environment interactions. The open-ended capacity of natural sensors should be an evolutionary property. Those who consider that only the behavior of the whole system can be seen as emergent defend an organismal view because, it is alleged, it is not possible to understand emergence in terms of particular functions: these would be too arbitrary, since isolated sensors lack relevance criteria, unless they are considered as prostheses (as in Cariani’s case). Probably both types of emergence have to be considered to understand meaning. Models and theories of hierarchical organization might allow research to flip from functional or semiotic approaches to global operational views of embodied processes, and back.
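Cariani’s point that a sensor cuts a continuous physical magnitude into discrete tokens can be pictured as a quantizing measurement. The sketch below is an illustration of mine; the thresholds and token names are assumed, and in a living system they would not be fixed, which is where the open-endedness discussed above enters.

```python
import bisect

# A sensor as a measuring device: fixed thresholds cut a continuous
# magnitude into discrete tokens (signs available for further processing).
THRESHOLDS = [0.2, 0.5, 0.8]                    # illustrative cut points
TOKENS = ["dark", "dim", "bright", "glaring"]   # illustrative discrete symbols

def sense(intensity):
    """Map a continuous light intensity in [0, 1] to a discrete token."""
    return TOKENS[bisect.bisect_right(THRESHOLDS, intensity)]

for reading in (0.05, 0.3, 0.63, 0.95):
    print(f"{reading:.2f} -> {sense(reading)}")
```

An evolvable sensor would, in addition, be able to move or multiply these thresholds, changing which distinctions in the world exist for the system at all.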
Acknowledgements
I am very grateful to G. Van de Vijver for her kind invitation to attend the ISES Seminar. Comments by R. Lemmen, S. Salthe and G. Van de Vijver helped to clarify some obscure passages of older versions. This research was supported by a contract associated with DGICYT Project Number PB92-0456 from the MEC (Ministerio de Educación y Ciencia, Spain).
References
Amundson R. (1994) Two concepts of constraint: Adaptationism and the challenge from developmental biology, Philosophy of Science, vol. 61: 556–578.
Bennett C. H. (1988) Dissipation, information, computational complexity and the definition of organization. In: Emerging syntheses in science, D. Pines (ed.) Addison-Wesley: 215–233.
Bonner J. T. (1980) The evolution of culture in animals, Princeton University Press, Princeton.
Brooks R. (1990) Elephants don’t play chess, Robotics and Autonomous Systems, vol. 6: 3–15.
Burian R. (1982) Adaptation. In: Dimensions of Darwinism M. Grene (ed.) Cambridge University Press, Cambridge: 287–327.
Cariani P. (1992) Some epistemological implications of devices which construct their own sensors and effectors. In: Toward a Practice of Autonomous Systems F. Varela & P. Bourgine (eds.) Cambridge MA, MIT Press: 484–493.
Clark A. & Toribio J. (1994) Doing without representing?, Synthese, vol. 101: 401–431.
Cleland C. E. (1993) Is the Church-Turing thesis false?, Minds and Machines, vol. 3: 283–312.
Conrad M. (1987) Rapprochement of artificial intelligence and dynamics, European Journal of Operational Research, vol. 30: 280–290.
Conrad M. (1989) The brain-machine disanalogy, Biosystems, vol. 22: 197–213.
Depew D. & Weber B. (1995) Darwinism evolving. Systems Dynamics and the genealogy of natural selection, MIT Press, Cambridge MA.
Emmeche C. (1994) The computational notion of life, Theoria, vol. 9, no. 21: 1–30.
Godfrey-Smith P. (1994) Spencer and Dewey on Life and Mind. In: Artificial Life IV, R. Brooks & P. Maes (eds.) MIT Press, Cambridge MA: 80–89.
Gould S. J. & Lewontin R. (1979) The spandrels of San Marco and the Panglossian paradigm: A critique of the adaptationist programme, Proceedings of the Royal Society of London B, vol. 205: 581–598.
Heschl A. (1990) L = C: A simple equation with astonishing consequences, Journal of Theoretical Biology, vol. 145: 13–40.
Kauffman S. A. (1993) The Origins of Order: Self-organization and Selection in Evolution, Oxford, Oxford University Press.
Maturana H. & Varela F. (1992) The tree of knowledge, Shambala, Boston MA. http://cepa.info/624
Osorio D., Getz W. M. & Rybak J. (1994) Insect vision and olfaction: Different neural architectures for different kinds of sensory signals? In: From animals to animats 3, D. Cliff, P. Husbands, J. A. Meyer & S. W. Wilson (eds.) MIT Press, Cambridge MA: 73–81.
Oyama S. (1992) Ontogeny and phylogeny: A case of metarecapitulation? In: Trees of life: Essays in philosophy of biology, P. Griffiths (ed.) Kluwer Academic Publishers, Dordrecht: 211–239.
Pattee H. (1982) Cell psychology: An evolutionary approach to the symbol/matter problem, Cognition and Brain Theory, vol. 5, no. 4: 325–341.
Polanyi M. (1968) Life’s irreducible structure, Science, vol. 160: 1308–1312.
Rosen R. (1985) Anticipatory Systems, Pergamon Press.
Ryle G. (1949) The concept of mind, Penguin, London.
Salthe S. (1993) Development and Evolution, Complexity and Change in Biology, Cambridge MA, MIT Press.
Sheets-Johnstone M. (1990) The roots of thinking, Philadelphia, Temple University Press.
Smithers T. (1994) What the Dynamics of Adaptive Behaviour and Cognition Might Look Like in Agent/Environment Interaction, Notes of the III Int. Workshop on Artificial Life and Artificial Intelligence, On the Role of Dynamics and Representation in Adaptive Behaviour and Cognition, 9 and 10 December 1994, San Sebastian, Basque Country.
Sober E. (1993) Philosophy of Biology, Oxford, Oxford University Press.
Stengers I. (1995) God’s heart and the stuff of life. Paper read at ECAL-95, European Conference on Artificial Life.
Stewart J. (1992) Life = cognition: The epistemological and ontological significance of Artificial Life. In: Toward a practice of autonomous systems, F. Varela & P. Bourgine (eds.) MIT Press, Cambridge MA.
Turing A. M. (1936/7) On computable numbers, with an application to the Entscheidungsproblem, Proceedings of the London Mathematical Society, vol. 42, no. 2: 230–265; correction, ibid., vol. 43 (1937): 544–546.
Varela F., Coutinho A., Dupire B. & Vaz N. (1987) Cognitive Networks: Immune, Neural, and Otherwise. In: Theoretical Immunology A. Perelson (ed.) Redwood City CA, Addison Wesley, vol. II: 359–376.
Varela F., Thompson E. & Rosch E. (1991) The embodied mind, Cambridge MA, MIT Press. http://cepa.info/1969
Wächtershäuser G. (1984) Light and life: On the nutritional origins of sensory perception. In: Evolutionary epistemology, theory of rationality, and the sociology of knowledge, G. Radnitzky & W. W. Bartley (eds.) Open Court, La Salle IL: 121–138.
Wheeler M. (1995) Escaping from the Cartesian mind-set: Heidegger and Artificial Life. In: Advances in Artificial Life, F. Moran, A. Moreno, J. J. Merelo & P. Chacon (eds.) Springer, Berlin-Heidelberg: 65–76.
Williams G. C. (1992) Natural selection: Domains, Levels and Challenges, New York, Oxford University Press.