Can Brain Evolution Teach Us Anything About Conflict?

The Russia-Ukraine war playing out on television these last two weeks has left me with a feeling of helplessness. Yet it has also motivated me to explore alternative ways to be of use. This essay is a small attempt to use what I know to consider future answers to such conflicts.

The 21st century is well on its way to becoming the era of translational biological information, meaning we are applying the vast amount of knowledge gathered over the last century to change the world for the better. Our modern economy, law, politics, and military reflect this process; they are just a few of the many institutions in society that owe a great deal to the growing understanding of the mind. Advertising, marketing, focus groups, negotiations, ethics, law, and intelligence work—all rely on awareness of how we think and decide. From genetics to personalized medicine, the study of the human mind sits at the edge of a truly transformational time. We know well the link between malnutrition and depression, while we learn more every day about the role depression plays in mild cognitive impairment. Other findings, such as the effects that the bacteria of the gut microbiome have on cognition, are nothing short of extraordinary.

For the past century and a half, we have learned that changes in our brain result in modifications to the mind and to our personality. Tools for studying these brain-mind relationships, such as deep brain stimulation, have moved us from indirect to more direct mapping. We have discovered how people decide and think, and how we communicate with one another. A more recent insight from neuroplasticity is that modifications to our behavior and personality can, in turn, cause changes to our brains. Mindfulness meditation practiced to manage stress, for example, can rewire unhealthy circuits in the brain, such as those driving the HPA axis stress response.

What such insights have produced is not only a better grasp of how information processing affects perception and cognition but also an understanding of how, through extensive training, we can extend our personal and sociocultural boundaries. Neurotechnology, one of the newer sciences of this contemporary world, has developed methods for treating and repairing soldiers injured in battle. It has figured out how to move a cursor across a screen through the power of thought and how to control an advanced prosthetic arm the same way. Neurotechnology can restore the sensation of touch to an individual with a severe neurological injury.

The consequences of this new science, however, are not always positive. In the name of national security and preparation for war, neuropsychological training also eases individuals into controversial tasks, such as killing. Drones can be flown by thought alone. Pharmaceuticals help soldiers forget traumatic experiences or produce feelings of trust to encourage confession during interrogation. The weaponization of biological information thus raises serious ethical concerns.

The dual use of scientific information for good and ill ought not to prevent us from extracting lessons on how to avert conflict. Given all the current clashes, it is an opportune time to ask whether there is anything in this biological treasure trove of knowledge that can help us deal with conflict, or even avoid it altogether.

Optimal prediction in decision-making is one innovative way to prevent conflict. Imagine being able to anticipate the plans of others, especially adversaries, and forestall or prevent those efforts. Could we have stopped the war in Ukraine had we known that Russia would invade? Is it possible to stop any conflict if we know the problem before it happens? The rational answer would seem to be yes. Interestingly, the human brain evolved for “optimal” prediction in decision-making, turning Homo sapiens into one of the most successful species to survive a violent and uncertain world. It seems reasonable, therefore, to ask whether there are lessons in this evolution that we can extract for more general use.

Recent developments in cognitive neuroscience, based on neurologically inspired theories of uncertainty, have led to proposals suggesting human brains are sophisticated prediction engines. This means the brain generates mental models of the surrounding environment to predict the most plausible explanation for what’s happening in each moment and updates the models in real time. According to Andy Clark, a cognitive scientist at the University of Edinburgh in Scotland, “You experience, in some sense, the world that you expect to experience.”

We assume the major function of “looking into the future” through prediction, preparation, anticipation, prospection, or expectation across various cognitive domains is to organize our experience of the world as efficiently as possible. The brain-mind is optimally, not perfectly, designed to cope with both natural uncertainty (the fog surrounding complex, indeterminate human actions) and man-made uncertainty (the fog fabricated by denials and deceptions). It does so by conserving energy while reducing uncertainty. This ability evolved to support human intelligence by continuously matching incoming sensory information with top-down predictions of that input. Analysis of the temporo-spatial regularities and causal relationships in the environment produces these top-down predictions or expectations—a process known as Bayesian inference.
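
To make the Bayesian idea concrete, here is a minimal sketch in Python. It is purely illustrative; the scenario, the probabilities, and the helper name bayes_update are my own inventions, not anything drawn from the research described here.

```python
# Toy Bayesian updating: a prior expectation is combined with repeated evidence.
# All numbers are invented for illustration.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) from a prior and two likelihoods."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / denominator

# Start with a weak prior that a shadow in the fog is a threat (10 percent),
# then observe, three times, a cue that is far more likely if the threat is real.
belief = 0.10
for _ in range(3):
    belief = bayes_update(belief, p_evidence_given_h=0.8, p_evidence_given_not_h=0.2)
    print(round(belief, 3))  # prints 0.308, then 0.64, then 0.877
# Each posterior becomes the prior for the next observation: the continuous
# matching of expectation and input described above.
```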

The brain uses this knowledge of regularities and patterns to make a model, or “best guess,” about which objects and events are most likely responsible for the signals it receives from the environment. This “best guess” then goes through an iterative process of minimizing the mismatch (i.e., correcting the error) between expectancy and reality until it reaches an optimal solution. Mental models shape perception, recognition, inference about the state of the world, attention, and learning, allowing more pertinent reactions to the immediate situation.
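
The iterative mismatch-correction can be sketched the same way. The loop below is my own toy illustration of error minimization under an arbitrary learning rate; it is not a model of any actual neural circuit.

```python
# Toy "best guess" refinement: repeatedly shrink the error between
# expectation and observation. The learning rate and values are arbitrary.

def refine_estimate(observation, prior_guess, learning_rate=0.3, steps=10):
    """Nudge an internal estimate toward an observation, one small step at a time."""
    estimate = prior_guess
    for _ in range(steps):
        error = observation - estimate      # mismatch between reality and expectation
        estimate += learning_rate * error   # correct the internal model slightly
    return estimate

# The prior expects a value near 0; the world delivers 10.
print(round(refine_estimate(observation=10.0, prior_guess=0.0), 2))  # 9.72, close to 10
```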

From this perspective, mental states are predictive states, which arise from a brain embodied in a living body, permeated with affect, and embedded in an empowering socio-cultural niche. The result is the best possible and most accommodating interaction with the world via perceptions, actions, attention, emotions, homeostatic regulation, cognition, learning, and language.

A predictive machine requires a high interdependence of processes, such as perception, action, and cognition, which are intrinsically related and share common codes. Besides the feedforward, or bottom-up, flow of information, there is significant top-down feedback and recurrent processing. Given the levels of ambiguity and noise always present in the environment and in our neural system, prior biases or mental sets become critical for facilitating and optimizing the analysis of current events, whether that means recognizing objects, executing movements, or scaling emotional reactions. This dynamic information flow depends on previous experience and builds on memories of various kinds, though it does not itself involve mnemonic encoding. Indeed, the more ambiguous the input, the greater the reliance on prior knowledge.

The predictive model of the brain has been successful in explaining a variety of mental phenomena, such as inattention and distraction, beliefs and desires, as well as neural data. Sometimes, though, the brain gets things wrong because of incomplete or inaccurate information, and this discrepancy can cause everything from mild cognitive dissonance to learning disorders to anxiety and depression. But our survival is proof positive that whatever strategies we learned are highly effective in navigating a world of uncertainty.

Here are eight lessons regarding the predictive brain that may be helpful in dealing with conflict:

  • Recognize your use of mental models. Dealing with uncertainty in the world requires creating mental models that map our understanding of, and expectations about, cause-and-effect relationships; we then process and interpret information through these models or filters. Mental models become critical for facilitating and optimizing our responses to current events.
  • Understand your mental models. Recognize that complex mental processes determine which information you attend to and, therefore, mediate, organize, and attribute meaning to your experience. Your background, memories, education, cultural values, role requirements, and organizational norms strongly influence this dynamic process.
  • Withhold judgment of alternative interpretations until you have considered many of them. Expertise, and the confidence that attaches to it, is no protection from the common pitfalls endemic to the human thought process, particularly when it involves ambiguous information, multiple players, and fluid circumstances.
  • Challenge, refine, and challenge again all your mental models. Discourage conformity. Use incoming data to reassess the premises of your models. Remain humble and nimble. Be self-conscious about your reasoning powers. Examine how you make judgments and reach conclusions. Encourage “outside of the box” thinking.
  • Value the unexpected. It reveals inaccuracies in your mental models. You cannot eliminate prediction pitfalls because they are an inherent part of the process. What you can do is train yourself to look for and recognize these obstacles, view them as opportunities, and develop procedures designed to offset them.
  • Emphasize factors that disprove hypotheses. Increased awareness of cognitive biases, such as the tendency to see information confirming an already-held judgment more vividly than “disconfirming” data, does little by itself to help deal effectively with uncertainty. Look for ways to disprove what you believe.
  • Develop empathy and compassion. Put yourself in the shoes of others to see the options they face as they see those options. Understand their values and assumptions, and even their misperceptions and misunderstandings. Then, act.
  • Change external circumstances instead of trying to eliminate everyone’s biases. Mental models are resistant to change primarily because they reflect the temporo-spatial regularities and causal relationships found in your environment. Restructure the setting and it will affect your perceptions.

Our Evolving Sense of Awareness

Despite the mountains of information about mind and its relationship to brain, there remains a mystery at the core of our being. The holy grail of this mystery is awareness, the ability to hold something “in consciousness.” Neuroscientists and philosophers have called our first-person experience of the subjectivity that arises from this holding function the “hard problem.” This is because, unlike most other problems in science and life, this one has proven resistant to rationality and the scientific method. Recently, however, one promising approach has helped constrain, at least for me, the multiple ideas about awareness by placing its understanding within an evolutionary context.

In his “attention schema” theory, the neuroscientist Michael Graziano has proposed that awareness evolved in stages. The assumption behind this perspective is that each level in the progression provided fitness value and survival benefit to a species. Initially, according to Graziano, “awareness” involved bottom-up signal-to-noise mechanisms that selectively enhanced certain signals. The existence of some of the earliest neurotransmitter systems that perform such a function, namely the dopamine, norepinephrine, and serotonin systems, is consistent with this idea.

The next step in the progression likely concerned the interaction between signal enhancing mechanisms and top-down biasing and switching mechanisms that developed for greater control of the processing associated with the enhanced signal. The circuit in the basal ganglia, involved in the integration and selection of voluntary behavior, is a good example of this. Here, the neurotransmitter dopamine operates on striatal neurons to perform a switching function, controlling the flow of information in the direct and indirect pathways of the circuit.

According to the principles of control theory, an even more effective way to control a complex variable is to have an internal model of that variable. Such a model allows the system to simulate the variable’s dynamics, monitor its state, and predict its behavior, at least a few seconds into the future. Thus, Graziano suggests that the next critical jump in the evolution of awareness was the development of an internal model of attention (a simulation) that allowed the brain to attribute to itself a “mind” aware of something. I would add that, at this level, evolution moved from nonconscious to conscious control and subjectivity. The awareness that “I am attending to this thing” was born from such bidirectional interactivity.
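
A bare-bones version of this control-theory idea can be written in a few lines. The sketch below is my own simplified illustration, not Graziano’s model or any published controller: an internal model tracks a single variable, corrects itself against noisy observations, and extrapolates its state a couple of seconds ahead.

```python
# Minimal internal model of one variable: simulate its dynamics,
# monitor its state against observations, and predict a short way ahead.
# The gains, data, and time step are invented for illustration.
from dataclasses import dataclass

@dataclass
class InternalModel:
    position: float = 0.0
    velocity: float = 0.0

    def update(self, observed_position: float, dt: float) -> None:
        """Monitor: compare the model's prediction with an observation and correct."""
        predicted = self.position + self.velocity * dt
        error = observed_position - predicted
        self.position = predicted + 0.5 * error   # absorb half the error into position
        self.velocity += 0.5 * error / dt         # attribute the rest to velocity

    def predict(self, seconds_ahead: float) -> float:
        """Simulate forward: estimate where the variable will be shortly."""
        return self.position + self.velocity * seconds_ahead

model = InternalModel()
for observation in [0.0, 1.0, 2.1, 2.9, 4.0]:      # one noisy reading per second
    model.update(observation, dt=1.0)
print(round(model.predict(seconds_ahead=2.0), 2))  # extrapolates about two seconds ahead
```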

Adapting this internal model of attention to social attribution led, at some later stage, to ascribing awareness to other beings. Finally, because of language, culture, and other social developments, humans became extremely good at modeling others, perhaps too readily. Such an ability likely explains our readiness to anthropomorphize or attribute consciousness to characters in a story, puppets and dolls, thunder, oceans, empty spaces, ghosts and gods.

Justin Barrett calls this the Hyperactive Agency Detection Device, or HADD, and it appears to be a consequence of our hyper-social nature. The readiness to simulate and attribute a “mind” to animate and inanimate things may explain the sense some of us have of a rich spirit world surrounding us. Undoubtedly, this aspect of awareness provides side benefits, such as aesthetic experiences, including our sense of wonder about our mysterious world.

Flexibility In Thinking Is Crucial For Survival

Considering the billions of life forms present on planet Earth, there are several reasons why eusocial species, notably humans, have progressed to become a dominant force. Big-brained, eminently social, collaborative in nature, able to communicate complex thoughts—these are just some of the explanations undergirding our achievement. From a cognitive science perspective, one significant outcome of these various factors, and a big reason for our success, is cognitive flexibility. At its most essential, this refers to the ability to control the how, when, where, and why of thought. The human brain allows control, either automatically or deliberately, of what we think, how we think about it, when we think it, and why we are thinking of it. In an ever-changing world where circumstances vary dynamically from moment to moment, a thought-generating process that can adapt and respond just as fast is a decided advantage.

This type of mental flexibility incorporates the rapid analysis of circumstances, the assessment of multiple channels of information, the determination of alternative solutions, the elimination of those that do not work, the recognition of errors, and so on. More than anything, cognitive flexibility requires the “ability to resist the impulse to persevere and keep thinking in a previously active but no longer appropriate way.” Most times this requires the ability to assess the larger context in which such actions are pertinent. More than a century of neuroscience research has shown that the frontal lobes are the cortical regions critically necessary for this amazing flexibility.

What brought clarity to the role the frontal lobes play in higher cognitive functions was the famous case of Phineas Gage. On September 13, 1848, the 25-year-old Gage was preparing a railroad bed, using an iron tamping rod to pack explosive powder into a hole. He struck the powder to tamp it down, but the powder detonated, sending the long rod hurtling upward. The rod penetrated Gage’s left cheek, tore through his brain, and exited his skull. Amazingly, Gage not only survived the horrific accident but could still speak. Following the disaster, he walked to a nearby cart so that he could be taken to a doctor. The injury destroyed an extensive part of Gage’s left frontal lobe, what we now consider the central executive region, and in doing so it changed Gage’s personality completely.

The chief functions performed by the frontal lobe include the intellectual skills responsible for the planning, initiation, sequencing, monitoring, and overall cognitive control of complex goal-directed behavior. Friends of Gage did not recognize him following the accident, for he could no longer perform these skills. Professor of neuroscience Patricia Goldman-Rakic (1937-2003) championed the role of a special part of the frontal lobes, the prefrontal cortex (PFC), in the building blocks necessary for abstract understanding. Abstraction is the unique human ability to uncouple thinking from environmental stimuli – the basis for symbolic deliberation. She showed that impairments in a subdivision of the PFC, the dorsolateral part or DLPFC, contribute to thought disorders, such as those observed in schizophrenia.

Goldman-Rakic’s work, along with others, further showed that the frontal lobes are an important site for inhibitory control. It appears that executive control operates, at one level, in a top-down manner, with the PFC having a leading, controlling role over many lower-level structures. This control over other brain regions is exercised through response inhibition which involves circuits that use chemicals such as gamma aminobutyric acid (GABA), an inhibitory neurotransmitter.

Another remarkable discovery by Goldman-Rakic and others was that the brain matures in an organized way, starting in the back and moving to the front, and these maturational changes do not stop with puberty. This means that the frontal lobes, home to key executive functions like planning, working memory, and impulse control, are among the last areas of the brain to mature. Full maturity of these circuits extends well into the late 20s or 30s. Thus, for the first two to three decades of development, the human mind is in a state of reduced efficiency. During this time, cognitive control is susceptible to impulsiveness (or lack of inhibition) and reduced flexibility, and the personality has a higher likelihood of developing antisocial tendencies, delinquency, spoiled mind syndrome, and other forms of early criminal conduct. What all this suggests is the undeniable importance of flexibility in thinking, as orchestrated by frontal lobe circuits, and how critical it is for our survival.