SelfAwarePatterns<p><a href="https://sites.google.com/nyu.edu/nydeclaration/declaration" rel="nofollow noopener noreferrer" target="_blank">The New York Declaration on Animal Consciousness</a> has been making a lot of headlines.</p><p>The declaration itself uses fairly careful language about what it’s asserting, <a href="https://www.quantamagazine.org/insects-and-other-animals-have-consciousness-experts-declare-20240419/" rel="nofollow noopener noreferrer" target="_blank">but many of the headlines don’t</a>. The declaration is short, so it’s easy to quote in full.</p><blockquote><p>Which animals have the capacity for conscious experience? While much uncertainty remains, some points of wide agreement have emerged.</p><p>First, there is strong scientific support for attributions of conscious experience to other mammals and to birds.</p><p>Second, the empirical evidence indicates at least a realistic possibility of conscious experience in all vertebrates (including reptiles, amphibians, and fishes) and many invertebrates (including, at minimum, cephalopod mollusks, decapod crustaceans, and insects).</p><p>Third, when there is a realistic possibility of conscious experience in an animal, it is irresponsible to ignore that possibility in decisions affecting that animal. We should consider welfare risks and use the evidence to inform our responses to these risks.</p><p><a href="https://sites.google.com/nyu.edu/nydeclaration/declaration" rel="nofollow noopener noreferrer" target="_blank">https://sites.google.com/nyu.edu/nydeclaration/declaration</a></p></blockquote><p>The ever-growing list of signers of the declaration includes people like Jonathan Birch, David Chalmers, Peter Godfrey-Smith, Simona Ginsburg, Eva Jablonka, Anil Seth, and others whose work I’ve highlighted over the years.</p><p>The declaration was released, I think, on the same day Daniel Dennett died, which is ironic, because I’m sure Dennett would have questioned the premise of the statement. It seems to treat whether certain creatures are conscious as a sharp, precise fact of the matter. The Quanta writeup makes clear the authors are focused on phenomenal consciousness, the “what it’s like” aspect of consciousness, essentially the Cartesian Theater or movie notion I discussed in the prior post.</p><p>To me, this highlights the problem with the concept. The idea is that a creature either has or doesn’t have this form of consciousness. Under this binary view, the consequences of getting it wrong are high, since it might cause us to mistreat animals we mistakenly classify as not conscious, or waste effort on the welfare of creatures we mistakenly classify as conscious.</p><p>However, letting go of this notion frees us up to consider the problem from a different perspective, an incrementalist one. To an incrementalist, it’s clear that many of these creatures have some aspects of consciousness to limited degrees, while lacking others.</p><p>I’ve discussed the idea of <a href="https://selfawarepatterns.com/2021/01/03/hierarchy-of-consciousness-january-2021-edition/" rel="nofollow noopener noreferrer" target="_blank">thinking in terms of functional hierarchies</a> of consciousness many times. A simple version might look like this:</p><ol><li>Automatic behavior (reflexes and fixed action patterns)</li><li>Body and environmental models</li><li>Causal models</li><li>Introspection</li></ol><p>Everything alive has 1, automatic behavior, but so do robotic systems.
</p><p>2, body and environmental models, is implied by anything with distance senses (sight, hearing, smell), which would include any of the animals discussed in the declaration. It also implies bottom-up reflexive attention, since the system needs a mechanism for selecting what to focus its reactions on. All of which dramatically expands the scope of what 1 is reacting to.</p><p>3, causal models, is where some degree of reasoning and scenario prediction starts to come into the picture. It can be thought of as increasing the scope of the reactions in time as well as space. It’s here that reactions become subject to being overridden, based on what a system has learned. It’s also where I think top-down controlled attention starts to come into the picture.</p><p>4, introspection, is a system modeling aspects of its own processing in 1-3. At the simplest levels, it might provide added degrees of control. In humans, it enables symbolic thinking, as well as the communication and sharing of cognitive states in social contexts.</p><p>A hierarchy of this type is admittedly an oversimplification. We could instead take these items and use them as dimensions, and talk in terms of how much of each a particular system has. Still an oversimplification, but one that more clearly recognizes the complexity involved.</p><p>So, many insects clearly have 1 and 2, but any evidence for 3 seems marginal and subject to interpretation. And there is none that I know of for 4. Among invertebrates, only cephalopods (octopuses and similar species) show a strong degree of 3. Many fish do seem to have a little of 3, but from what I’ve read, it amounts to fragmented glimmers compared to what we see in mammals and birds.</p><p>When it comes to 4, there do seem to be limited degrees of metacognition in various mammals, such as dogs and rats, although the evidence again seems open to interpretation. It’s stronger in some monkeys and great apes, who seem able to assess how certain they are of what they know in situations where treats of different desirability are on the line. All of which is very limited compared to humans.</p><p>A key question for many might be when an organism becomes capable of pain and suffering. A lot here depends on what we mean by “pain” and “suffering”. If we mean adverse reactions, then we have it with 1, but we then have to include plants and robots in our definition. If we mean a more sophisticated mental state, then I’m not sure it exists without some degree of 3, that is, without a system capable of making use of the feeling.</p><p>But that ties into a broader question: how do we know these organisms don’t have these feelings, or even self-reflection, but just aren’t showing it in the way more intelligent creatures do? Strictly speaking, we don’t.</p><p>However, this problem looks less pressing when we look at it from an evolutionary perspective. Building environmental, causal, or self models is expensive in terms of energy and development. For these capabilities to be naturally selected, they must provide some fitness benefit. Natural selection can’t select on internal mental states, only on capabilities that affect the organism’s ability to pass on its genes.</p><p>In that sense, if nothing in an insect’s behavioral repertoire shows it making use of a feeling of pain, such as learning from it or behaving flexibly in response to it, if all we get is reflexive withdrawal and avoidance behavior, then there’s no reason for it to have evolved mechanisms we’d be tempted to label “pain”.
I think this is why we can largely dismiss the idea of consciousness in plants; how would it increase their fitness?</p><p>But, as always, it pays not to be dogmatic about any of this. Science is continually turning up new discoveries. And many scientists seem eager to demonstrate conscious capabilities in animals. I’m totally on board with them trying, as long as we’re careful about how we interpret the evidence.</p><p>In any case, I think an incremental view frees us from an either/or determination of whether to care about an animal’s welfare at all, moving us toward asking what considerations we should have for it. Under this view, we should care more about mammals and birds than about fish, but that doesn’t mean we should completely disregard the welfare of those fish. And while I’ll always try to make it quick when killing insects in the house, I’m not going to be too concerned about their welfare beyond that.</p><p>But maybe I’m missing something? Are there reasons to ascribe higher levels of consciousness to many of these animals that I’m overlooking?</p><p><a href="https://selfawarepatterns.com/2024/05/04/consciousness-must-be-adaptive/" class="" rel="nofollow noopener noreferrer" target="_blank">https://selfawarepatterns.com/2024/05/04/consciousness-must-be-adaptive/</a></p><p><a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/animal-cognition/" target="_blank">#AnimalCognition</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/biology/" target="_blank">#Biology</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/consciousness/" target="_blank">#Consciousness</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/evolution/" target="_blank">#Evolution</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/mind/" target="_blank">#Mind</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/philosophy/" target="_blank">#Philosophy</a> <a rel="nofollow noopener noreferrer" class="hashtag u-tag u-category" href="https://selfawarepatterns.com/tag/philosophy-of-mind/" target="_blank">#PhilosophyOfMind</a></p>