The Evolution of Multidimensionality

Scientists root the concept of dimensions in the idea of describing space. However, the scientific sense of space as an “extended” void with dimensions did not arise until well into the 17th century, when Galileo and Descartes made space the cornerstone of modern physics by treating it as something physical with a geometry. By the end of the 17th century, Isaac Newton had expanded this vision to encompass the entire universe, which now became a potentially infinite three-dimensional vacuum.

Interestingly, it was the work of artists several hundred years earlier that foresaw and likely undergirded this scientific breakthrough. In the 14th to 16th centuries, Giotto, Paolo Uccello and Piero della Francesca developed the techniques of what came to be known as perspective—a style originally termed “geometric figuring.” By exploring geometric principles, these painters gradually learned how to draw images of objects in three-dimensional space. By doing so, they reprogrammed European minds to see space in a Euclidean fashion.

Descartes’ contribution was to give mathematical relations a visual form and to formalize the concept of a “dimension.” He did so in terms of a rectangular grid marked with an x and y axis, forming a coordinate system. The Cartesian plane is a two-dimensional space because we need only two coordinates to identify any point within it. With this framework, Descartes linked geometric shapes and equations. Thus, we can describe a circle with a radius of 1 by the equation x² + y² = 1. Later, this framework became the basis for the calculus developed by Isaac Newton and G. W. Leibniz.

How does this concept of space affect our lived experience? Imagine living in a Cartesian two-dimensional world in which you are only aware of length and width. You see two objects approach each other at tremendous speed in this flat-plane world. The inevitable outcome will be a crash, you think. When nothing happens and the two objects appear to pass through each other untouched and continue on the other side, you think—it’s a miracle!

The Cartesian plane makes it easy to imagine adding another axis (x, y, z), which allows us to describe the surface of a sphere (x² + y² + z² = 1) and thus to describe forms in three-dimensional space.
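These equations lend themselves to a quick computational check: a point lies on the unit circle or unit sphere exactly when its squared coordinates sum to 1, one coordinate per dimension. The sketch below is purely illustrative (the function name and tolerance are my own, not from the text):

```python
import math

def on_unit_sphere(point, tol=1e-9):
    """True if the point's squared coordinates sum to 1,
    whatever the number of dimensions."""
    return math.isclose(sum(c * c for c in point), 1.0, abs_tol=tol)

# Two coordinates identify a point on the Cartesian plane: x² + y² = 1.
print(on_unit_sphere((0.6, 0.8)))       # on the unit circle: True
# Adding a z axis extends the same equation: x² + y² + z² = 1.
print(on_unit_sphere((0.0, 0.6, 0.8)))  # on the unit sphere: True
print(on_unit_sphere((1.0, 1.0)))       # off the circle: False
```

The same one-line test works in any number of dimensions, which is precisely why the Cartesian framework generalizes so easily.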

Imagine that you are now seeing the same two objects coming towards each other, but able to see them in three dimensions: not only length and width, but also depth. You see the two objects, which are planes, approach each other at tremendous speed. But because you can see that one is significantly above the other, you know they are safe. It is patently obvious, and definitely not a miracle, that there is no crash. All because you can see into this third dimension.

In 1905, an unknown physicist named Albert Einstein published a paper describing the real world as a four-dimensional setting. In his “special theory of relativity,” Einstein added time to the three classical dimensions of space. Scientists mathematically accommodated this new idea easily since all one has to do is add a new “time” coordinate axis within the Cartesian framework.

Events in our world seem to occur in four dimensions (length, width, depth, and time) and we can see into all of them. Hence, when the same two objects, which we recognize as planes, approach each other at tremendous speed and at the same height, there would be a crash, except the two planes are flying in the same space but at different times. Again, it is obvious there is no crash because of our ability to perceive differences in time. This ability, to see in four dimensions, makes interactions between objects appear natural and obvious, whereas for those seeing in fewer dimensions, they would seem miraculous or paradoxical.
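The thought experiment running through these sections can be made concrete. Represent each plane’s state as a point (x, y, z, t) and ask whether two states coincide in every dimension the observer can perceive. The names and numbers below are illustrative, not from the text:

```python
def appears_to_collide(a, b, dims, tol=1e-9):
    """True if points a and b coincide in the first `dims` coordinates,
    i.e., in every dimension this observer can perceive."""
    return all(abs(a[i] - b[i]) < tol for i in range(dims))

plane_1 = (0.0, 0.0, 1000.0, 12.0)  # x, y, altitude, time of day
plane_2 = (0.0, 0.0, 1000.0, 14.0)  # same spot and height, two hours later

print(appears_to_collide(plane_1, plane_2, dims=2))  # flat-plane observer: True
print(appears_to_collide(plane_1, plane_2, dims=4))  # 4-D observer: False
```

Each added coordinate turns an apparent paradox into an obvious non-event, which is the essay’s point about perceiving in more dimensions.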

Einstein’s Theory of General Relativity says we live in four dimensions. String Theory, developed in the 1960s, in contrast, says there are at least 10. In 1919, Theodor Kaluza discovered that adding a fifth dimension to Einstein’s equations could account for the interactions of both electromagnetism and gravity, the fundamental forces that govern how objects or particles interact. The problem was that, unlike the previous four, this fifth dimension did not relate directly to our sensory experience. It was just there in the mathematics. More recently, String Theory scientists have shown that an additional five dimensions can account for the weak and strong nuclear forces, the two remaining fundamental forces of nature. Thus, with 10 dimensions, String Theory can account for ALL the fundamental forces and ALL their interactions. Unfortunately, we have no way of relating our lived experience to this multidimensional mathematical accounting.

We look out at our complex world and try to figure out why accidents happen. Why do young people die? What makes voters attracted to certain ideas? Why do we fall in love with this person and not that one? Why does only one person survive a plane crash? From our human perspective, these are difficult, even incomprehensible, questions because we don’t have full insight into the dimensions at work creating the dynamics of these interactions. If we did, the answers would be as obvious as the sun rising every morning.

Today we know physical interactions result from information exchanged by fundamental particles known as bosons, and mathematically accounted for in 4-D space by Einstein’s equations. Experiments with bosons, however, suggest there may be forms of matter and energy vital to the cosmos that are not yet known to science. There may, in fact, be additional fundamental forces of nature and distinct dimensions of information.

Indeed, different exchanges of information mediate human interactions. These involve intangible dimensions of unity, beauty, sociality, persistence, love, entanglement, etc. Information in these dimensions affects social interactions and relates more directly to our life. Imagine, for example, two individuals running at each other at full speed and crashing. Without contextual information, how do you understand the action? Is it two angry opponents on a battlefield trying to kill one another? Or, is it two players on opposite teams tackling one another in a game of American football? Knowing the intangible dimensions of the interaction and the information provided gives us an obvious answer.

Preliminary attempts by psychologist Sarah Hoppler and her colleagues (2022) give us hope that a combination of factors can describe social encounters. These researchers have identified six dimensions (actor, partner, relation, activities, context, and evaluation) with three levels of abstraction, based on how people describe their social interactions. They have shown that their approach can depict and account for all conceivable sorts of situations in social interactions, irrespective of whether they are described abstractly or in great detail.
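Hoppler and colleagues’ six dimensions can be pictured as the fields of a simple record. The class below is my own illustrative sketch, not the researchers’ instrument; the field names follow the six dimensions listed above, and `level` stands in for their three levels of abstraction:

```python
from dataclasses import dataclass

@dataclass
class SocialInteraction:
    # The six dimensions identified by Hoppler et al. (2022)
    actor: str
    partner: str
    relation: str
    activities: str
    context: str
    evaluation: str
    level: int  # level of abstraction, 1 (most abstract) to 3 (most detailed)

# The football tackle from the earlier example: context disambiguates the collision.
tackle = SocialInteraction(
    actor="player A", partner="player B", relation="opposing teams",
    activities="tackling", context="American football game",
    evaluation="routine play", level=3,
)
print(tackle.context)
```

Filling in the `context` and `relation` fields is exactly the contextual information that turns an otherwise ambiguous collision into an obvious football play rather than a battlefield attack.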

It might be useful, therefore, to ask whether there are limits to our comprehension of additional dimensions of information. Science tells us that physical dimensions may be so tiny and fleeting that we currently can’t detect them. However, in terms of intangible sources of the human experience, we may not be as helpless. We can sense and intentionally improve some of these dimensions (love, sociality, persistence); others we sense and learn to appreciate (beauty, unity); but a vast majority are, for now, still beyond our comprehension (entanglement).

We are still discovering, evolving, and quantifying the full capacity of our human experience. If we accept Pierre Teilhard de Chardin’s statement that “We are not human beings having a spiritual experience; we are spiritual beings having a human experience,” then the voyage of discovery of our multidimensional nature is unlimited and ought to be infinitely enjoyable.

What Is Genius?

Leonardo da Vinci — Albert Einstein

We define genius as original and exceptional insight while performing some art or endeavor that surpasses expectations, sets new standards for future works, establishes better methods of operation, or remains outside the capabilities of competitors. Prior to doing some background reading on the subject, I assumed that at such heights of intelligence, there is little room for fallibility, errors, flubs, mistakes, wrong guesses, etc. I discovered this was wrong, and it made me reconsider what genius really means. It’s not what we’ve been told by mythmakers.

Let’s consider two well-known and uncontested cases of genius, Leonardo da Vinci and Albert Einstein. For many of us, they represent the epitome of creativity in arts and science and a full-flowering of the human potential. Leonardo da Vinci is the embodiment of the Renaissance man. He was a painter, artist, engineer, architect, scientist, inventor, cartographer, anatomist, botanist, and writer. His active imagination conceptualized the tank, the helicopter, the flying machine, the parachute, and the self-powered vehicle. He was a man far ahead of his time and many of his visionary inventions became real only centuries later.

As an engineer, Leonardo sensed more than most that designing machines according to the mathematical laws of physics is better than relying on practice alone. He was the first to design separate interchangeable components deployed in a variety of complex devices. And no one drew machines with more attention to detail and reality. His insatiable curiosity about nature undergirded his efforts to devise flying machines. He sought not only to imitate flying birds, but to understand and apply the principles of flight to endow man with the ability to fly on his own. His genius lay in his mastery of engineering principles, design, and natural law.

Walter Isaacson, author of “Leonardo da Vinci,” describes how Leonardo integrated science with his artistic genius. “Leonardo spent many pages in his notebook dissecting the human face to figure out every muscle and nerve that touched the lips. On one of those pages you see a faint sketch at the top of the beginning of the smile of the Mona Lisa. Leonardo kept that painting from 1503, when he started it, to his deathbed in 1519, trying to get every aspect exactly right in layer after layer. During that period, he dissected the human eye on cadavers and could understand that the center of the retina sees detail, but the edges see shadows and shapes better. If you look directly at the Mona Lisa smile, the corners of the lips turn downward slightly, but shadows and light make it seem like it’s turning upwards. As you move your eyes across her face, the smile flickers on and off.”

While one can sense the inexhaustible work, energy, curiosity, and creative drive behind Leonardo’s genius, the stories insinuate that his creativity unfolded naturally, perfectly, and inevitably. But much of this reality is mixed with mythology, since Leonardo created an endless succession of untested contraptions, unpublished studies, and unfinished artworks. There was no inevitability to his creations. There was a lot of fudge work and many, many steps before the final product. It would appear his genius rested on having an active and creative mind. What motivated such a mind? Undoubtedly, it depended on several personality attributes, but foremost among these was curiosity, his defining trait. Everything seemed to interest Leonardo. Maybe genius is simply an unrelenting curiosity that, in its wake, stumbles unexpectedly upon original, novel, and unique insights. The genius is that all that preparation and work allows one to recognize those marvelous coincidences.

Like da Vinci, Albert Einstein reached the pinnacle of admiration in the modern era as the iconic symbol of genius. His most famous equation is E = mc². Einstein, however, doubted how important this was and dismissed notions that it might one day be at the heart of an alternative energy source. In 1934, he declared that “there is not the slightest indication” that atomic energy would ever be possible. Today his equation is at the heart of over 400 nuclear power stations, which are about to become the world’s leading source of non-carbon-based energy.

Einstein thought his biggest mistake was refusing to believe his equations that predicted the expansion of the Universe. While some astronomers favored the notion that many objects in the sky were actually “island universes” located well outside the Milky Way, most astronomers of his time thought that the Milky Way represented the full extent of the Universe. Einstein sided with this erroneous view, and when he applied his General Relativity theory to the entire universe, he had to add a special type of fudge factor into his equations, a cosmological constant, to make it static and eternal. Unfortunately, later evidence showed that, in fact, the Universe was and is still expanding equally and uniformly in all directions on large cosmic scales. In the 1930s, Einstein referred to his introduction of the cosmological constant to keep the Universe static as his “greatest blunder.”

Today we know that Einstein actually missed out on predicting something even bigger: the existence of dark energy. In the mid-1990s, 40 years after Einstein’s death, astronomers showed that his faith in his beautiful equations had been misplaced. Studies of exploding stars in distant galaxies revealed that the Universe isn’t just expanding, it’s expanding at an ever-faster rate. The cause is a force stronger than gravity but acting in the opposite direction, and with no obvious origin. This is the mysterious dark energy. Einstein’s theory can accommodate it, but at the price of reintroducing the same ugly fudge factor Einstein loathed. Most theorists believe dark energy has its origins in the quantum laws of the subatomic world, which allow even apparently empty space to be seething with energy. Einstein’s antipathy for quantum theory made it unlikely he would have incorporated it into his most cherished work.

In fact, Einstein had to be dragged kicking and screaming to the idea that the Universe began in a Big Bang. After learning from the Belgian priest and physicist Georges Lemaître that General Relativity (GR) predicts the creation of the Universe, he dismissively replied: “Your calculations are correct, but your physics is abominable.” It’s easy to see why he believed this, since the GR equations go haywire at the moment of the Big Bang, giving literally infinite results. What’s needed is something extra to bring the theory back under control. Theorists now believe they know what that something extra is, and, regrettably for Einstein, it’s the very thing he couldn’t accept: quantum theory. Recent calculations by theorists in the US and Europe have shown that combining GR with quantum theory results in a theory of ‘quantum gravity’ that gives insights not only into the Big Bang but also into what came before it.

Although Einstein played a major role in developing quantum theory, he became increasingly suspicious of its fuzzy, ‘probabilistic’ view of particles, which seemed to prevent even their position and speed from being known with complete precision. He summed up his view by saying, “God does not throw dice.” But today, most physicists believe Einstein was wrong. Throughout his career, Einstein had an almost religious belief in the fundamental unity of the Universe, and he spent decades searching for the one true Theory of Everything (ToE), which would describe the Universe and everything in it. His failure did not deter others, and over half a century after his death, the quest for the ToE continues. The bad news is that there may be at least 10⁵⁰⁰ ToEs, and no obvious way of deciding between them. There’s now a growing suspicion that the whole idea of just one true ToE may be a mistake, and that Einstein was naïve to spend his life looking for it.

These brief highlights of Albert Einstein’s miscues and false steps in his theorizing, amidst the many times he proved to be right, reinforce my argument about genius. It is not a natural, perfectly linear, and inevitable unfolding that occurs in the mind of one person. Rather, it reflects circuitous thinking, hard work, and an unrelenting drive that makes mistake after mistake. Yet, somewhere along this messy path, that creativity and uncompromising drive stumbles upon, perhaps unexpectedly, a heretofore unknown jewel of creation.

Unless special circumstances occur, such as injuries or neglect, most children embody the driving curiosity and energy of genius. My argument is that everyone is born with equal potential to realize such a trait. While it is unimpeded in an Einstein or da Vinci, it is blocked, obstructed, discouraged, or derailed in the vast majority of us. It is up to our society at large to identify these reasons and avoid this human misfortune. As Walter Bowman Russell, an American Renaissance man in his own right, put it in 1946, “Mediocrity is self-inflicted; genius is self-bestowed. The choice is yours.”

A Content-Addicted Culture

We live in an exceedingly rich information culture. Indeed, the amount of information exceeds the capacity of our individual brains to process it by orders of magnitude. Not only is fact-based scientific information being collected at prodigious rates, but the amount of non-factual disinformation created by our culture is growing at a matching rate. Reality-based information and its imaginary counterpart are apparently not sufficient content providers, for we are now considering developing a metaverse, a universe of infinite made-up possibilities. While human imagination is the source of all these creative outbursts, problems occur when there are no boundaries to its unconstrained nature, no checks on counterfactual thinking, no testing to see whether ideas are true. When we think up stuff and assume it is real, we create the singular psychological basis for human suffering. Humans have known this insight for thousands of years, and yet we march on, like a devastating tsunami obliterating everything in its path, or like lemmings on the way to our own destruction.

The driving energy behind this is a mind that lives in scarcity, that is constantly dissatisfied, always searching for more to fill a bottomless emptiness. It is this unfulfilled emptiness that makes us addicted to information, to the content of what our mind can experience. For, while we attend to this content, it prevents us from looking at the core of this emptiness. Such a possibility seems too frightening to consider and so we develop more and more content to distract us. The American social scientist Herbert Simon wrote: “The wealth of information means a dearth of something else—a scarcity of whatever information consumes. What information consumes is rather obvious: it consumes the attention of its recipients.”

Human attention is in short supply, and when the information it consumes overwhelms it, there is nothing left to attend to our unfulfilled nature. The only antidote to this devastating tsunami of human experience is to become quieter and more intentional. A time of silence occurs when we stop furiously examining the content of our conscious mind. What we discover when we turn from noise to silence is that what we feared, the emptiness we felt, is not really the bogeyman we envisioned, but a unique source of nourishment unlike anything ever experienced. We briefly touch what Christ called “the living water” or what others refer to as the “I AM.” Silence is one of the few remaining ways we express a widespread, shared experience of sacredness.

The more we drink from this silent emptiness, this living water, the less dissatisfied we become, the less we are interested in the content of our consciousness. Instead, we become more interested in that which contains it all: consciousness itself. We lose our addiction to information, and at the very end of such a process, we identify with the silence itself, which, in fact, contains everything.

The Human Spiritual Experience

“We are not human beings having a spiritual experience; we are spiritual beings having a human experience.” – Pierre Teilhard de Chardin

What if mystics and spiritual masters are right about us being spiritual beings having a human experience? Is there anything in what we know about the brain and its development that may be more associated with such an idea than with the standard materialist explanation? I interpret a 2021 meta-analysis of brain connectivity by Edde and colleagues as evidence that we should not easily dismiss these types of alternative explanations.

A set of biological rules or algorithms governs brain connectivity changes during normal development, maturation, and aging, forming and shaping the macroscopic architecture of the brain’s wiring network, or connectome. The standard model, shown in the Figure below, predicts that as we develop prenatally and during the first few years, local neural connections form first, followed by more global connections. This initial growth occurs in parallel with synaptogenesis, where macroscopic connectome formation and transformation reflect an initial overgrowth and subsequent elimination of cortico-cortical fibers. For most of the middle period of development, we see a plateau in the process and a strengthening of these connections. As we reach old age, the prediction is that things reverse, with global connections losing strength first and local connections being affected last. This inversion of the developmental process during aging accords with developmental models of neuroanatomy, in which the latest-maturing regions are the first to deteriorate. The graph below illustrates this inverted U-function of brain connectivity across a lifetime.

Mostly, studies of the human connectome support this standard model. The quirk in Edde’s recent analysis is the unexpected observation that it is local networks that begin losing strength in aging, while global networks maintain or even increase. The implication is that we lose focused, specialized functions with aging, but maintain or strengthen global, integrative functions. If we are preparing for death, why would this be taking place?

For a potential explanation, let us turn to a simplified spiritual model of development that closely matches these new observations (see Figure below). Initially, an understanding of our true nature (the spiritual path) provides the insight that ego development, while natural, interferes with knowledge of who we really are. We gain insight that we form highly local and specialized functions early in order to protect our fragile sense of being. These processes strengthen during adolescence and early adulthood (ego development). Time and further insights (e.g., identifying exclusively with this ego) during middle age propel us to embark on a process described as ego-death or at least diminishment. We seek to undo the local and highly functional networks that arose during ego development. The outcome of this effort is a switch in perspective, which many term realization. This basically describes a switch from identification with ego-based processes to a sense of the larger unity of which we are part. Realization can occur at any time in development, but usually after some effort in reducing ego-based thinking. Realization means a major strengthening of the global networks associated with the unity experience. This is more in concert with the proposal that “in the fifth decade of life (that is, after a person turns 40), the brain starts to undergo a radical “rewiring” that results in diverse networks becoming more integrated and connected over the ensuing decades, with accompanying effects on cognition.”

Why do we have diminished memory function as we age? While clinical disorders such as dementia exacerbate this condition, the natural trend in aging is toward reduced capability. Why? Perhaps the materialist models are wrong, and what we are seeing, as supported by Edde et al., is a reduction in localized functioning alongside a strengthening of global unity functions. For me, this preparation to join a larger unity in death seems a better explanation for what is happening in aging. It stands in contrast to the idea that “the networking changes likely result from the brain reorganizing itself to function as well as it can with dwindling resources and aging ‘hardware.’”

Human Interconnectedness: Billiard Balls or Gooey Chocolate?

Most of us are quite aware of how connected we are. This awareness has produced several ways to think about, describe, or view this connectivity. The ideas run the gamut from highly interconnected to loosely so. To the extent that the internet has connected us with more people and more information from around the world, this connectivity has become front of mind. Since how we think about nature structures our reality of it, it seems necessary to reconsider how we might view our connections to others.

Many people view these interactions from the perspective that we are more like billiard balls that inevitably come in contact with other billiard balls as we run around the world doing our thing. It recalls the classic view of the structure of reality at the atomic level. The idea that matter comprises extremely tiny particles called atoms originated about 2,500 years ago with Democritus, a Greek philosopher. However, the idea was forgotten for approximately 2,000 years. Then, at the beginning of the 19th century, the English chemist John Dalton revived Democritus’ ancient idea of the atom. Dalton thought atoms were the smallest particles of matter and envisioned them as solid, hard spheres, like billiard balls. Today, conceptions of matter as energy, electron clouds, or probability waves have replaced the old billiard-ball model.

Yet, the classic notion of matter has affected our conceptualization of human interactions. We view the inevitable result of human billiard balls making contact and colliding as producing change, namely, a redirection or reorientation of our own trajectory. If the impact is great, it may cause emotions to engage, and the interaction can rise to another level. Even at this level, we think that our internal environment remains unchanged, except perhaps for a brief flash of exasperation, anger, or resentment.

As the collisions become greater and more pronounced, we may acknowledge that the changes stay with us for a while, perhaps even a lifetime. The accepted modern paradigm to describe this dynamic is what I characterize as independent arising. Each of us is seen as an independent agent, a billiard ball, exerting control over what we experience. Thus, what we feel and how we respond depends on our own mind, allowing itself to experience and determine our actions. There is a level of autonomy, control, and agency we attribute to our behavior.

This model, however, is incomplete and does not explain all of human behavior. Therefore, it is time to recognize alternative explanations. One of these is that we may be more like balls of gooey chocolate when we encounter and collide with others. When we do, we leave a trace—sometimes messy, sometimes not. Therefore, it isn’t unusual to hear someone say, “I needed a shower after meeting that person.” The psychic residue of our interactions can affect our internal being, our psyche and spirit, and can be difficult to wash off. I characterize this as dependent arising.

A sophisticated description of this idea is the doctrine of dependent arising, which stands at the heart of Buddhist doctrine. It describes the principle of conditionality or the links that arise between experiences. My simple understanding of this doctrine is that these links and our experience of those links arise dependent on every other circumstance we encounter. They arise because we have a body, emotional reactivity, perceive incoming sensory information, conceptualize such information, and develop a conscious awareness of these experiences. Or, to put it in more modern terms, we are born with a body whose function is to create connections and links between experiences, add emotional valence, and reflect on them.

Martin Luther King Jr. captured the point I want to emphasize about our deep interconnectedness in what he expressed about injustice. He said, “We are caught in an inescapable network of mutuality, tied in a single garment of destiny. Whatever affects one directly affects all indirectly.” The concept of interbeing introduced by Thich Nhat Hanh also reflects our deep interconnectedness, where everything relies on everything else in order to manifest. Likewise, biologists describe how our human bodies are shared, rented, and occupied by countless other tiny organisms, without whom we couldn’t “move a muscle, drum a finger, or think a thought.” Indeed, our body comprises trillions of bacteria, viruses, and other such organisms. Without them, we wouldn’t be able to operate, think, feel, or speak. In fact, the analogy applies to the entire planet, which can be conceived as one giant breathing entity, with all its working parts connected in symbiosis.

We are not separate entities existing independently; rather, we are a continuation of one another, as Thich Nhat Hanh has argued. This garment of symbiotic mutuality that we represent gives a different perspective on what it means to be in a relationship with others. It thus calls for a rethinking of how much agency, control, and independence we actually have, and for structuring our spiritual path accordingly. By spiritual I mean a recognition of the validity and impact of our deep interconnectedness.

The Pale Blue Dot

By Carl Sagan

Consider again that dot. 
That's here. That's home. That's us. 
On it everyone you love, 
Everyone you know, 
Everyone you ever heard of, 
Every human being who ever was, 
Lived out their lives. 
The aggregate of our joy and suffering, 
Thousands of confident religions, 
Ideologies, and economic doctrines, 
Every hunter and forager, 
Every hero and coward, 
Every creator and destroyer of civilization, 
Every king and peasant, 
Every young couple in love, 
Every mother and father, 
Hopeful child, inventor and explorer, 
Every teacher of morals, 
Every corrupt politician, 
Every 'superstar', 
Every 'supreme leader', 
Every saint and sinner in the history of our species lived there – 
On a mote of dust suspended in a sunbeam.    

Sagan’s beautiful statement was occasioned by seeing the photograph of the earth in space as a pale blue dot. I read it recently, and it affected me as only beautiful poetry can. It also reminded me of what Harold Ramis, the American actor, comedian, director, and writer, said about carrying two notes to remind you of who you are. The first note reads, “The universe was created for my delight.” The second note says, “I am a meaningless speck of dust in the vastness of the universe.”

Ramis’s point was that life occurs in the rhythmic oscillation between these opposite poles, of meaningfulness and meaninglessness. The rhythmic oscillation of this dance occurs outside and within your conscious awareness, but in either case, you are a participant.

Nisargadatta Maharaj, an Indian guru, offered a similar sentiment when he said, “Between looking inside and recognizing that I am nothing and seeing outside and recognizing that I am everything–my life turns.”

You, me, the earth and everyone else born in this speck of dust are both nothing and everything.  

Jumpstarting the Mind to “Wake Up”

This essay discusses jumpstarting the mind to wake up psychologically and spiritually. I extracted parts from my upcoming book, Transforming Anxiety Into Creativity.

Many scholars who study the mind describe individuals as having an original sense of wholeness, or self. As a child, this sense of self is active, adaptable, energetic, curious, and creative. It is unencumbered by problems, with an attitude of openness, eagerness, and lack of preconceptions. Along the way, we develop an ego, and living a joyful, stress-free, and fear-free life turns problematic. We experience worries, anxieties, uninhibited thoughts, fears, and overwhelming feelings, and we see no way out of difficult circumstances. When these negative experiences persist and affect our mood, thinking, and behavior, they disrupt the normal flow, joy, and unity of life and obscure its natural wonder. When the interruptions and disruptions become unmanageable, they form the basis for physical and mental disorders, autoimmune and emotional disturbances, heart problems, addictive behaviors, and suicidal ideation. Even worse, if negative ideations become a recurring issue, the inevitable consequence is psychopathology.

What is the source of this problem? The major reason seems to be that out of the original unity into which we are born, a separate ego crystallizes during development. Individuation, or ego-self differentiation, is a normal aspect of development, but the schism between what is real and what is not creates problems when it is not understood. The ego creates an illusion of separation as it emerges from the normal push-pull dynamics of resisting and accepting what life presents. For many justified reasons, including our initial dependency and the potential harm negative experiences can do to an immature mind, our tendency to resist life is more prevalent than our acceptance of it. Over time, resistance grows stronger, reified, and real, to the point that we identify it as our true self and as our exclusive response to life.

Numerous solutions to the ego-self differentiation problem have been put forth from behavioral, psychological, and spiritual perspectives. The most enduring are those that help us understand the source of the conflict and, from that vantage point, its solution. In my book, Transforming Anxiety into Creativity, I frame the problem by combining a neuroscience background with a personal understanding of the mind grounded in knowledge of and experience with Zen Buddhism. The solutions I present are easy to understand and readily available. Yet they are difficult to put into action, as they call for a genuine change in perception and awareness.

What is the principal message? That awakening and self-realization are the psychological and spiritual solutions to ego-self differentiation. From a spiritual viewpoint, awakening, or “waking up,” refers to understanding one’s true nature, the unity of life, and our role in this greater sense of connection. It means you wake up from the illusion, the dream-like experience in which you feel separated from life, to the recognition that you are, and always have been, integrated with it. This recognition of unity is a remembrance, a going back to what you once knew.

Far too few people recognize the source of their mental suffering and their ability to do something about it. As Henry David Thoreau put it, “The mass of men lead lives of quiet desperation.” Moreover, many descriptions of the spiritual awakening experience are too confusing to be helpful. These descriptions, however, boil down to the notion that one must first recognize “that belief in my thoughts is not in any way definitive of my true inner self.” This simple yet powerful insight starts the process of de-identification with the egoic mind.

When de-identification takes hold and you can honestly say, “I don’t believe that thought,” it is like the bottom of a pail of water suddenly giving way. The filters through which the world is experienced are metaphorically cleared and cleansed. Your ego, or “I,” momentarily disappears in a rush of awareness, liberated from identification with resistance to life. The expansive consciousness, obscured by the limited ego, is suddenly freed and appreciated. You become conscious of being conscious and encounter unlimited awareness. The experience brings a sense of freshness to the wonder of sense perception and of who you are.

This awakening experience is a process, one that can be gradual (including multiple small awakenings) or fast(er). Neither is better or worse; they differ only in the timing of the changes experienced. Our culture makes us skeptical of fast means, since they suggest miracle-like processes with not enough time to understand them. Thus, there is more discussion of, and focus on, slow means. With slow means, like seeing a therapist, we accept that it may take a dozen or more years to resolve our issues. Slow solutions, like mindfulness meditation, spiritual study, ethics, and prayer, seem rational: they appear more likely to lead to a positive outcome, to provide longer-lasting results, and to require less effort. In reality, our deceptive egoic mind creates these “explanations.” De-identification with thoughts means the egoic mind’s own disappearance, so it shrewdly favors a slower demise. Yet, with one immediate change in perspective, which we are all capable of making, we can jumpstart the mind to wake up.

Traditions like Yoga, Vedanta, and Buddhism agree that the end goal of awakening, or enlightenment, is already here and now; it is our true nature, or the true nature of reality. We do not have to achieve or become it; we simply need to remove the obstacles (the egoic mind) to realize its expression. Knowing that you once held this treasure differs from never having possessed it. So, above all else, the path to awakening requires the conviction that what you aspire to is real, since you once had it. While you may no longer identify with such a mind, you have not lost it, and it is possible to recover it. The journey to waking up is a voyage of rediscovery.

There are two broad approaches to removing the obstacle of the egoic mind. The first emphasizes the need to transform and purify the mind, or even to transcend it altogether. This is the gradual approach, carried out through practices such as meditation, spiritual study, ethics, and devotion. The second, fast(er) approach emphasizes the “already present” aspect of enlightenment. Its teachings center on inquiring into your true nature and simply living in the present with non-attachment.

Living in the present with non-attachment provides an immediate doorway back to the extraordinary mind you once had, the one associated with a joyful and creative life. This is not a novel idea, but there are now science-based explanations for why such a switch works. Bringing attention to the immediacy of the moment shifts the focus from the mind’s ruminations about the past and future to awareness of present circumstances, with thoughts simply held in that awareness. Non-attachment means not getting emotionally involved with the thoughts, but observing each mental dustup that arises without judgment. This is the most important action to implement. When done correctly, living in the present with non-attachment stops anxious, unmanageable thoughts in their tracks. The effect is immediate and, with practice, long-lasting.

That turning point, a longer-lasting experience of the present moment, marks the awakening experience and a recognition of the original unity you once had. It is a rebirth in which you find yourself childlike, but with greater appreciation. The opportunity for true living opens up—an ability to see things as they really are, without resisting them, and a genuine enjoyment of life.

If you have questions, direct them to: jpineda@ucsd.edu.  

A Neuroscientist’s Spiritual Journey: The Podcast

My passion at this point in life is to share my professional and spiritual insights with others. This podcast is one such attempt. Here is where you can find it:

Youtube: https://www.youtube.com/channel/UCmga5Z4JdHziQjtCdnVhYuw/videos

Apple Podcasts: https://podcasts.apple.com/us/podcast/the-middle-way-with-dr-matthew-goodman/id1566423470

Spotify: https://open.spotify.com/show/24QlEy5FOCTSQTWjsoOCRZ

In my creative endeavors, I try to explore how true seeing, hearing, feeling, and thinking can quiet the overactive mind and how this allows silence to become the fountain of creative thought. From such silence, I’ve experienced the emergence of a new perception and awareness. This awareness of unity is infused with intrinsic joy, love, and empathy towards others, and filled with a deep and uncompromising dedication to what is true and real.

Professionally, I am Professor Emeritus of Cognitive Science, Neuroscience, and Psychiatry at the University of California, San Diego. Having joined the Department of Cognitive Science at UCSD as a founding faculty member in 1989, I remained for the rest of my 28-year academic career. My hope is that my work impacted and inspired generations of undergraduate and graduate students to take on hard questions in the neurobiology of the human mind. 

I have authored many widely cited papers in animal and human cognitive and systems neuroscience and edited one academic book (Mirror Neuron Systems: The Role of Mirroring Processes in Social Cognition). This book is a collection of research on the functional significance of mirror neurons, which includes my work on autism. It is one of the most cited and downloaded books in the field.

Over the last twenty years, I have become interested in spiritual matters as a bridge to a fuller understanding of the mind. This interest led me to explore Zen Buddhism, train with a master teacher, and develop my creative side, which in turn produced two books of poetry (Quieting of a Mind and Dawning of a New Mind) focusing on mind-brain relationships, with an emphasis on spirituality, mysticism, environmentalism, and social activism. Most recently, I published the story of my journey and of bridging science and spirituality (Piercing the Cloud: Encountering the Real Me).

Can Brain Evolution Teach Us Anything About Conflict?

The Russia-Ukraine war playing out on television these last two weeks has left me feeling helpless. Yet it has motivated me to explore alternative ways to be of use. This essay is one small way of using what I know to consider future answers to such conflicts.

The 21st century is well on its way to becoming the era of translational biological information. This means we are applying the vast amount of knowledge gathered over the last century to change the world for the better. Our modern economy, law, politics, and the military reflect this process. These are several of the many institutions in society that owe a great deal to the growing understanding of the mind. Advertising, marketing, focus groups, negotiations, ethics, law, and intelligence work all rely on awareness of how we think and decide. From genetics to personalized medicine, the study of the human mind sits at the edge of a truly transformational time. We know well the link between malnutrition and depression, and we learn more every day about the role depression plays in mild cognitive impairment. Other findings, such as the effects that gut microbiome bacteria have on cognition, are nothing short of extraordinary.

For the past century and a half, we have learned that changes in our brain result in modifications to the mind and to our personality. Tools to study these brain-mind relationships, such as deep brain stimulation, have moved us from indirect to more direct mapping. We have discovered ways to learn how people decide and think, and how we communicate with each other. A more recent insight from research on neuroplasticity is that changes in behavior and experience can, in turn, modify the brain. Mindfulness meditation practiced to manage stress, for example, rewires unhealthy circuits, such as those driving the HPA-axis stress response.

What such insights have produced is not only a better grasp of how information processing affects perception and cognition, but also an understanding of how, through extensive training, we can extend our personal and sociocultural boundaries. Neurotechnology, one of the new sciences of this contemporary world, has developed methods for treating and repairing soldiers injured in battle. It has figured out how to move a cursor across a screen through the power of thought, and how to control an advanced prosthetic arm the same way. It can even restore the sensation of touch to individuals with severe neurological injury.

The consequences of this new science, however, are not always positive. In the name of national security and warfare preparation, neuropsychological training also eases individuals into controversial tasks, such as killing. Thoughts control the flying of drones. Pharmaceuticals help soldiers forget traumatic experiences or produce feelings of trust to encourage confession in an interrogation. Thus, the weaponization of biological information raises ethical concerns.

The dual use of scientific information for good and ill ought not to prevent us from extracting lessons on how to avert conflict. Given all the current clashes, it is an opportune time to ask whether anything in this biological treasure trove of knowledge can help us deal with conflict, or even avoid it altogether.

Optimal prediction in decision-making is one innovative way to prevent conflict. Imagine being able to anticipate the plans of others, especially adversaries, and forestall or prevent those efforts. Could we have stopped the Ukrainian war had we known that Russia would invade the country? Is it possible to stop any conflict if we know the problem before it happens? The rational answer would seem to be yes. Interestingly, the human brain evolved for “optimal” prediction in decision-making, turning Homo sapiens into one of the most successful species to survive a violent and uncertain world. It seems reasonable, therefore, to ask whether there are lessons in this evolution that we can extract for more general use.

Recent developments in cognitive neuroscience, based on neurologically inspired theories of uncertainty, have led to proposals that human brains are sophisticated prediction engines. That is, the brain generates mental models of the surrounding environment to predict the most plausible explanation for what is happening in each moment, and it updates those models in real time. As Andy Clark, a cognitive scientist at the University of Edinburgh in Scotland, puts it, “You experience, in some sense, the world that you expect to experience.”

We assume the major function of “looking into the future” (through prediction, preparation, anticipation, prospection, or expectation in various cognitive domains) is to organize our experience of the world as efficiently as possible. The brain-mind is optimally, not perfectly, designed to cope with both natural uncertainty (the fog surrounding complex, indeterminate human actions) and man-made uncertainty (the fog fabricated by denials and deceptions). It copes by conserving energy while reducing uncertainty. This ability evolved to support human intelligence by continuously matching incoming sensory information against top-down predictions of that input. Analysis of the temporo-spatial regularities and causal relationships in the environment produces these top-down predictions, or expectations, through a process known as Bayesian inference.

The brain uses this knowledge of regularities and patterns to build a model, its “best guess,” of the objects and events most likely responsible for the signals it receives from the environment. This “best guess” goes through an iterative process of minimizing the mismatch (i.e., correcting the error) between expectancy and reality until it reaches an optimal solution. Mental models inform perception, recognition, inference about the state of the world, attention, and learning, enabling more pertinent reactions to the immediate situation.
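The iterative “best guess” process described above can be sketched as precision-weighted Bayesian updating. The following Python toy model is purely illustrative; the variable names, noise levels, and update scheme are my own assumptions, not a model from this essay or any published brain theory. A vague prior belief about a hidden cause is repeatedly corrected by noisy sensory evidence, and both the estimate and its uncertainty converge.

```python
import random

def bayes_update(prior_mean, prior_var, obs, obs_var):
    """One precision-weighted update: correct the prediction error in
    proportion to how reliable the evidence is relative to the prior."""
    k = prior_var / (prior_var + obs_var)       # trust placed in the new evidence
    mean = prior_mean + k * (obs - prior_mean)  # shrink the expectancy-reality mismatch
    var = (1 - k) * prior_var                   # uncertainty falls as evidence accumulates
    return mean, var

random.seed(0)
true_state = 10.0        # the hidden cause of the sensory signals
mean, var = 0.0, 100.0   # a vague, highly uncertain prior "guess"

for _ in range(50):
    obs = random.gauss(true_state, 2.0)             # noisy sensory sample
    mean, var = bayes_update(mean, var, obs, 4.0)   # iterative error correction

print(round(mean, 1), round(var, 3))  # the belief ends up near the true state
```

Note the design choice: the gain `k` plays the role of attention in predictive-processing accounts, weighting evidence more heavily when the current model is uncertain, which mirrors the essay’s point that ambiguous input increases reliance on prior knowledge.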

In this perspective, mental states are predictive states, arising from a brain embodied in a living body, permeated with affect, and embedded in an empowering socio-cultural niche. The result is the best possible and most accommodating interaction with the world via perception, action, attention, emotion, homeostatic regulation, cognition, learning, and language.

A predictive machine requires a high interdependence of processes, such as perception, action, and cognition, which are intrinsically related and share common codes. Besides the feedforward, or bottom-up, flow of information, there is significant top-down feedback and recurrent processing. Given the ambiguity and noise always present in the environment and in our neural system, prior biases, or mental sets, become critical for facilitating and optimizing the analysis of current events, whether that means recognizing objects, executing movements, or scaling emotional reactions. This dynamic information flow depends on previous experience and builds on memories of various kinds, though it does not itself involve mnemonic encoding. Indeed, the more ambiguous the input, the greater the reliance on prior knowledge.

The predictive model of the brain has been successful in explaining a variety of mental phenomena, such as inattention and distraction, beliefs and desires, as well as neural data. Sometimes, though, the brain gets things wrong because of incomplete or inaccurate information, and this discrepancy can cause everything from mild cognitive dissonance to learning disorders to anxiety and depression. But our survival is proof positive that whatever strategies we learned are highly effective in navigating a world of uncertainty.

Here are eight lessons regarding the predictive brain that may be helpful in dealing with conflict:

  • Recognize your use of mental models. Dealing with uncertainty in the world requires creating mental models in which we map our understanding and expectations about cause-and-effect relationships, and then processing and interpreting information through these models, or filters. Mental models are critical for facilitating and optimizing our responses to current events.
  • Understand your mental models. Recognize that complex mental processes determine which information you attend to and, therefore, mediate, organize, and attribute meaning to your experience. Your background, memories, education, cultural values, role requirements, and organizational norms strongly influence this dynamic process.
  • Withhold judgment of alternative interpretations until you have considered many of them. Expertise, and the confidence that attaches to it, is no protection from the common pitfalls endemic to the human thought process, particularly when it involves ambiguous information, multiple players, and fluid circumstances.
  • Challenge, refine, and challenge again all your mental models. Discourage conformity. Use incoming data to reassess the premises of your models. Remain humble and nimble. Be self-conscious about your reasoning powers. Examine how you make judgments and reach conclusions. Encourage “outside of the box” thinking.
  • Value the unexpected. It reveals inaccuracies in your mental models. You cannot eliminate prediction pitfalls because they are an inherent part of the process. What you do is to train yourself on how to look for and recognize these obstacles, view them as opportunities, and develop procedures designed to offset them.
  • Emphasize factors that disprove hypotheses. Increased awareness of cognitive biases, such as the tendency to see information confirming an already-held judgment more vividly than one sees “disconfirming” data, does little by itself to help deal effectively with uncertainty. Look for ways to disprove what you believe.
  • Develop empathy and compassion. Put yourself in others’ shoes to see the options they face as they see them. Understand others’ values and assumptions, and even their misperceptions and misunderstandings. Then, act.
  • Change external circumstances instead of trying to eliminate everyone’s biases. Mental models are resistant to change primarily because they reflect the temporo-spatial regularities and causal relationships found in your environment. Restructure the setting and it will affect your perceptions.