Addiction to Irrationality

Drunkard’s cloak — Source: Wikimedia.

[updated 5-13-2013]

The typical alcoholic (outside of treatment or AA) often says “I don’t have a drinking problem. I can stop whenever I want.”

In a way, we are all like this about our innate irrationality. We tend to think we can be rational whenever we want. We refuse to admit, to ourselves and others, that we have a problem with spontaneous, compulsive, and unconscious irrationality. This includes, but is not limited to, the matters discussed by Dan Ariely in Predictably Irrational: The Hidden Forces That Shape Our Decisions, in which he challenges assumptions about making decisions based on rational thought. Many of the problems listed below have clinical diagnoses as pathological states, but they are also prevalent (if not ubiquitous) in “normal”, “healthy” people at sub-clinical levels.

Our struggle with irrationality takes many forms (see the list of standard cognitive faults further down, under “The standard brain”).

In evolutionary terms, reason is only an emergent property of the brain. Irrationality is still more the rule than the exception. It is innate in every one of us–even in the best and brightest of our scientists, philosophers, educators, and leaders. Although scientists and scholars take great pains to eliminate irrationality from their work products, it is insidious, and it often still intrudes in subtle ways. Even in our most rational-seeming people, irrationality often runs rampant in areas outside their core competence and in their private lives. Irrationality and bias often arise from a cognitive dissonance between individualism and cooperation (or selfishness and altruism).

It is always popular to minimize irrationality and/or look on the bright side of it. This reminds me too much of the rationalization tricks of alcoholics. Even Dr. Ariely has joined this trend with his newest book, destined to be a smash-hit bestseller, The Upside of Irrationality: The Unexpected Benefits of Defying Logic at Work and at Home. This will be popular in many quarters, but especially among all the boneheads of the world for the ammunition it will give them against their more rational friends, coworkers, and family members.

It is undeniable that evolution has given the brain powerful heuristic tools for making snap judgments. These may serve us well (or not) when circumstances don’t permit more conscious, deliberate, and scientific methods of decision making. It is also undeniable that “facts” are often incomplete or presented in a biased way, that appearances (even “scientific” and “empirical” appearances) may deceive, and that sometimes our contra-factual intuitions turn out right. But no amount of benefit we may derive from irrational thinking and behavior (which can often only be judged in hindsight) in any way changes, diminishes, or even remotely compensates for the harm it does. Of course we wish to keep the cute, irrational baby–but that’s no excuse for not throwing out the toxic bath water. The only rational thing is to do both.

Recognizing and saving the baby (the upside of irrationality) is all well and good. Nevertheless, the downsides of irrationality are accelerating humanity towards a cliff. If we all go over the cliff, what happens to the effing baby? As Richard Dawkins points out in The Selfish Gene (1989, p. 8), things that give selective advantage can, if carried to an extreme, lead to the annihilation of a species.

It is no doubt precisely because irrationality seems so often to bear gifts, especially in the short term, that it is so seductive. It may also have to do with our inclination to be “cognitive misers”.

The problem with irrationality is that it is easy, it is pleasant, and it is reassuring; but it is also an unconscious compulsion or addiction, and we continue to pursue it and defend it way past the point of diminishing returns.

Why? Because irrational behaviors, emotions, and mental states are reinforced by the same neurochemicals that cause other forms of addiction. In An Open Letter to Researchers of Addiction, Brain Chemistry, and Social Psychology, the astrophysicist and author David Brin writes:

Consider studies of gambling. Researchers led by Dr. Hans Breiter of Massachusetts General Hospital examined with functional magnetic resonance imaging (fMRI) which brain regions activate when volunteers won games of chance — regions that overlapped with those responding to cocaine!

“Gambling produces a similar pattern of activity to cocaine in an addict,” according to Breiter.

Moving along the spectrum toward activity that we consider more “normal” — neuroscientists at Harvard have found a striking similarity between the brain-states of people trying to predict financial rewards (e.g., via the stock market) and the brains of cocaine and morphine users.

Along similar lines, researchers at Emory University monitored brain activity while asking staunch party members, from both left and right, to evaluate information that threatened their preferred candidate prior to the 2004 Presidential election. “We did not see any increased activation of the parts of the brain normally engaged during reasoning,” said Drew Westen, Emory’s director of clinical psychology. “Instead, a network of emotion circuits lit up… reaching biased conclusions by ignoring information that could not rationally be discounted. Significantly, activity spiked in circuits involved in reward, similar to what addicts experience when they get a fix,” Westen explained.

How far can this spectrum be extended? All the way into realms of behavior — and mental states — that we label as wholesome? Rich Wilcox of the University of Texas says: “Recovery process in addiction is based to a great extent on cognitively mediated changes in brain chemistry of the frontal/prefrontal cortex system. Furthermore… there is even a surprising amount of literature cited in PubMed suggesting that prayer also induces substantial changes in brain chemistry.”

Clearly this spectrum of “addiction” includes reinforcement of behaviors that are utterly beneficial and that have important value to us, e.g., love of our children. I get a jolt every time I smell my kids’ hair, for instance. The “Aw!” that many people give when they see a baby smile is accompanied by skin flushes and iris dilation, reflecting physiological pleasure. Similar jolts come to people (variously) from music, sex, exercise and the application of skill.

Although a lot of recent research has danced along the edges of this area, I find that the core topic appears to have been rather neglected. I’m talking about the way that countless millions of humans either habitually or volitionally pursue druglike reinforcement cycles — either for pleasure or through cycles of withdrawal and insatiability that mimic addiction — purely as a function of entering an addictive frame of mind.

For a majority, indeed, this process goes unnoticed because there is no pathology! Reiterating: it is simply “getting high on life.” Happy or at least content people who lead decent lives partake in these wholesome addictive cycles that have escaped much attention from researchers simply because these cycles operate at the highest levels of human functionality. (It is easy to verify that there is something true underlying the phrase “addicted to love.”)

This wholesomeness should no longer mask or exclude such powerfully effective mental states from scientific scrutiny. For example, we might learn more about the role of oxytocin in preventing the down-regulating or tolerance effects that exacerbate drug addiction. Does this moderating effect provide the more wholesome, internally-generated “addictions” with their long-lasting power?

Even more attractive would be to shine light on patterns of volitional or habitual addictive mentation that are NOT helpful or functional or desirable.

Gambling has already been mentioned. Rage is obviously another of these harmful patterns that clearly have a chemical-reinforcement component. Many angry people report deriving addictive pleasure from fury, and this is one reason why they return to the state, again and again. Thrill-seeking can also be like this, when it follows a pathology of down-regulating satiability. Ernst Fehr, Brian Knutson, and John Hibbing have written about the pleasure-reinforcement of revenge, which Hollywood films tap incessantly in plot lines that give audiences a vicarious thrill of Payback against villains-who-deserve-it.

The Most Common (but Unstudied) Form of Self-Addiction

So far, we are on ground that is supported by copious (if peripheral) research. If nothing else, at least there should be an effort to step back and notice the forest for the trees, generalizing a view of this whole field as we’ve described it so far: a general paradigm of self-reinforcement.

Only now, taking this into especially important new territory, please consider something more specific. A phenomenon that both illustrates the general point and demands attention on its own account.

I want to zoom down to a particular emotional and psychological pathology. The phenomenon known as self-righteous indignation.

We all know self-righteous people. (And, if we are honest, many of us will admit having wallowed in this state ourselves, either occasionally or in frequent rhythm.) It is a familiar and rather normal human condition, supported — even promulgated — by messages in mass media.

While there are many drawbacks, self-righteousness can also be heady, seductive, and even… well… addictive. Any truly honest person will admit that the state feels good. The pleasure of knowing, with subjective certainty, that you are right and your opponents are deeply, despicably wrong.

Sanctimony, or a sense of righteous outrage, can feel so intense and delicious that many people actively seek to return to it, again and again. Moreover, as Westen et al. have found, this trait crosses all boundaries of ideology.

Indeed, one could look at our present-day political landscape and argue that a relentless addiction to indignation may be one of the chief drivers of obstinate dogmatism and an inability to negotiate pragmatic solutions to a myriad of modern problems. It may be the ultimate propellant behind the current “culture war.”

If there is any underlying truth to such an assertion, then acquiring a deeper understanding of this one issue may help our civilization deal with countless others.

Actually, there are other problems besides the enormous political, social, and personal costs of irrationality. Another is what I would call the “atrocity cost”.  As Voltaire said, “Those who can make you believe absurdities, can make you commit atrocities.”

Well, as they say in all the Twelve-Step programs, the first step to recovery is “admitting that one cannot control one’s addiction or compulsion.”

Hello. I’m Poor Richard, and I’m an irrationalcoholic.

Are you an irrationalcoholic, too?

Poor Richard

by Dave Pollard (howtosavetheworld.ca)

Dr. Gabor Maté ~ Who We Are When We Are Not Addicted: The Possible Human

AHA!

AHA! = Average Humans Anonymous!

(A 12-step program for cognitive enhancement)

What is an “average” human?

Modern humans are known taxonomically as Homo sapiens (Latin: “wise man” or “knowing man”).

Mitochondrial DNA and fossil evidence indicates that anatomically modern humans originated in Africa about 200,000 years ago. (Wikipedia: Homo Sapiens)

Of course, 200,000 years ago we were not nearly as wise or knowing, not nearly as sapient, as we are (or think we are) today.

Behavioral modernity is a term used in anthropology, archeology and sociology to refer to a set of traits that distinguish present day humans and their recent ancestors from both living primates and other extinct hominid lineages. It is the point at which Homo sapiens began to demonstrate a reliance on symbolic thought and to express cultural creativity. These developments are often thought to be associated with the origin of language.[1]

There are two main theories regarding when modern human behavior emerged.[2] One theory holds that behavioral modernity occurred as a sudden event some 50 kya (50,000 years ago), possibly as a result of a major genetic mutation or as a result of a biological reorganization of the brain that led to the emergence of modern human natural languages.[3] Proponents of this theory refer to this event as the Great Leap Forward[4] or the Upper Paleolithic Revolution.

The second theory holds that there was never any single technological or cognitive revolution. Proponents of this view argue that modern human behavior is basically the result of the gradual accumulation of knowledge, skills and culture occurring over hundreds of thousands of years of human evolution.[5] Proponents of this view include Stephen Oppenheimer in his book Out of Eden, and John Skoyles and Dorion Sagan in their book Up from Dragons: The evolution of human intelligence. (Wikipedia: Behavioral Modernity)

Whenever behavioral modernity may have settled upon Homo sapiens, the beginnings of it are lost in prehistory, in past ages long before we have any clear and unambiguous physical or historical evidence. The fields of evolutionary psychology and behavioral genetics promise to shed new light on the origins of modern human behavior, but they are only in the very early stages of their own evolution as scientific disciplines.

Nevertheless, the important point is that the typical, average, or normal human has a brain that is an evolutionary work-in-progress.

We only invented agriculture about 10,000 years ago, which in brain-evolution time is like ten seconds ago. In that 10,000 years (or ten seconds) our brains have not had time to really get it right. Our agricultural methods are still causing too much long-term damage to the very resources we depend on to continue being productive in the future. Instead of improving the resource base over time, as brainless nature does, we are still destroying it faster than ever before. The situation with energy and manufacturing is just as bad. Our technology develops at a far greater pace than the brains we use to plan and manage its applications, hoping to maximize productivity and avoid drastic unintended consequences.

Our track record is not so good.

Interesting times

“May you live in interesting times,” often referred to as the Chinese curse, was the first of three curses of increasing severity, the other two being:

  • May you come to the attention of those in authority
  • May you find what you are looking for

(Wikipedia)

It is only in very recent, recorded history that humanity has come so close to achieving true greatness. Only recently have the consequences of human behavior become so great and so visible.  That makes the present day the most interesting time in all of human history.

In the past, the planetary environment was vast in proportion to all the cumulative impacts of human populations. Over a fairly recent period of time, however, humanity has turned a corner or crossed a tipping point where the environment is no longer large enough to fully absorb and erase all the effects that human activity creates. Those human effects are overtaking the planet’s homeostatic systems and causing ecological processes and environments to degrade or permanently fail. We can see this in species extinctions, failing hydrological systems, changing ocean currents and weather systems, and now even in planetary temperature regulation and rising sea levels.

The most interesting thing about these times is the extent to which the external world has become our mirror. Almost everything that’s wrong with our culture and our environment now is a result of human behavior and can be traced backwards to an evolutionary origin in the normal, anatomically and behaviorally modern, human brain.

Plato’s Allegory of the Cave

Animation of Plato’s Cave

Madness and normality

The problem with the modern human brain isn’t what we call clinical, DSM-level, mental illness–it is sub-clinical. The problem is normality–which includes standard, predictable cognitive faults, irregularities, and distortions that belong to many kinds of so-called “spectrum disorders” but fall below the accepted level of clinical severity or are just too complex to disentangle.

It is the pandemic of typical, sub-clinical mental faults that causes poverty, crime, global warming, oil spills, the Iraq and Afghan wars, financial crises, bad government, etc. Any behavior which produces negative utility is irrational.

The main reason our times have become so “interesting” is not disease, not resource scarcity, not overpopulation. The root problem is our normal thinking and our typical behavior. We could cure all physical illness, all clinical mental illness, all poverty, war, etc., and we would still be hurtling just as fast (and probably even faster!) towards our own self-destruction! The problem is not what we have traditionally seen as illness or scarcity or other external threats. The problem is normality!

The root cause of our threatened survival is installed inside of every “healthy”, “normal” human being.

It’s in our DNA!

The standard brain: our normal cognitive faults, boo-boos, crutches, and placebos:

  • excessive bias towards simplicity and popularity of ideas and beliefs with little regard for accuracy
  • intolerance of ambiguity and cognitive dissonance
  • excessive sensitivity to emotional states and excessive positive bias (leading to addictions)
  • unconscious mental associations, cognitive biases, and behavior patterns
  • black-box (unconscious or pre-conscious) decision making with post hoc rationale
  • reverse-precedence cognitive hierarchy (highest, most recently evolved cognitive functions have lowest precedence)
  • fragmentation/compartmentalization (weak integration) of values, goals, personality, identity, and memory components
  • weak self-observation and attention management
  • automatic thoughts and behaviors (autopilot)
  • dishonesty
  • corruption
  • magical thinking (errors of causal association)
  • unconscious logical fallacies/errors
  • cultural biases (reinforcements for conformity, educational agenda biases, neuro-linguistic “dialects”, memes, etc.)
  • random and inconsistent neural programming (spaghetti code) from random experience/reinforcements
  • inappropriate psychological defense mechanisms (denial, self-delusion, wishful thinking, etc.)
  • linguistic deficiencies (formal thinking requires linguistic/grammatical/logical proficiency)

The 12 Steps of AHA!

By which we attempt to correct as many of the above cognitive boo-boos as possible:

We…

  1. Admit we are powerless over our thoughts, emotions, and moods; and over our sub-clinical neurotic or impulsive behavior disorders and cognitive disorders—that our lives have become unmanageable, and if we don’t fix ourselves, our species will probably hit the wall in fifty years or less.

    “The subjective experience of powerlessness over one’s emotions can generate multiple kinds of behavior disorders, or it can be a cause of mental suffering with no consistent behavioral manifestation, such as affective disorders.” (Wikipedia: Emotions Anonymous)

    “The cognitive mental disorder perspective is the theory that psychological disorders originate from an interruption, whether short or long, in our basic cognitive functions, i.e. memory processing, perception, problem solving and language. In distinction (or in addition) to this perspective are the psychodynamic mental disorder perspective, behavioral mental disorder perspective, sociocultural mental disorder perspective, interpersonal mental disorder perspective and neurological/biological mental disorder perspective. One pioneer of cognitive disorder perspective is Albert Ellis. In 1962, Ellis proposed that humans develop irrational beliefs/goals about the world; and therefore, create disorders in cognitive abilities[1]. Another pioneer of the cognitive disorder perspective is Aaron Beck. In 1967, Beck designed what is known as the “cognitive model” for emotional disorders, mainly depression[2]. His model showed that a blending of negative cognitive functions about the self, the world, and possible selves lead to cognitive mental disorders.” (Wikipedia: Cognitive disorders).

    Nearly all forms of clinical mental illness, such as post-traumatic stress disorder, (PTSD), Attention-Deficit Hyperactivity Disorder (ADHD), Obsessive–compulsive disorder (OCD), and Dissociative identity disorder (multiple personality disorder), have sub-clinical counterparts in nearly all normal individuals.

  2. Came to believe that a higher, more stable, and more consistent level of cognitive integration and functionality could be achieved through work on cognitive modification.
  3. Made a searching and fearless cognitive inventory of ourselves. A cognitive inventory consists of a self-assessment and a coached/group assessment of our cognitive faults (see list of “The standard human cognitive faults, boo-boos, crutches and placebos” above) using various assessment tools, tests, surveys, monitored exercises, etc.

    The Deming System of Profound Knowledge

    “The prevailing [default] style of [cognition] must undergo transformation. A system cannot fully understand itself. The transformation requires a view from outside….”

    “The first step is transformation of the individual. This transformation is discontinuous. It comes from understanding of the system of profound knowledge. The individual, transformed, will perceive new meaning to his life, to events, to numbers, to interactions between people.” (More on this later…)

  4. Admitted to ourselves and to others in our group the exact nature of our cognitive faults.
  5. Were entirely ready to give up all these cognitive defects and shortcomings.
  6. Made a list of all persons we had affected as a consequence of our cognitive defects, and made direct amends to such people wherever possible, except when to do so would harm them or others.
  7. Continued to take personal cognitive inventories and when we discovered faults promptly admitted and modified them via the Deming “PDCA” Cycle for Continuous Improvement:
      Wikipedia: PDCA (plan-do-check-act) is an iterative four-step problem-solving process typically used in business process improvement. It is also known as the Deming cycle, Shewhart cycle, Deming wheel, or plan-do-study-act.

    PDCA was made popular by Dr. W. Edwards Deming, who is considered by many to be the father of modern quality control; however, it was always referred to by him as the “Shewhart cycle”. Later in Deming’s career, he modified PDCA to “Plan, Do, Study, Act” (PDSA) so as to better describe his recommendations.

    The concept of PDCA is based on the scientific method, as developed from the work of Francis Bacon (Novum Organum, 1620). The scientific method can be written as “hypothesis” – “experiment” – “evaluation”; or plan, do, and check… According to Deming, during his lectures in Japan in the early 1950s, the Japanese participants revised the steps to the now traditional plan, do, check, act.

    Deming preferred plan, do, study, act (PDSA) because “study” has connotations in English closer to Shewhart’s intent than “check”.

    Wikipedia: William Edwards Deming “(October 14, 1900 – December 20, 1993) was an American statistician, professor, author, lecturer, and consultant. He is perhaps best known for his work in Japan. There, from 1950 onward he taught top management how to improve design (and thus service), product quality, testing and sales (the last through global markets), through various methods, including the application of statistical methods.

    Deming made a significant contribution to Japan’s later reputation for innovative high-quality products and its economic power. He is regarded as having had more impact upon Japanese manufacturing and business than any other individual not of Japanese heritage. Despite being considered something of a hero in Japan, he was only just beginning to win widespread recognition in the U.S. at the time of his death.” (Wikipedia)


    Though virtually unknown and unappreciated in the US, Deming is almost solely responsible for the transformation of Japanese industry from having, in my childhood, a reputation for manufacturing cheap junk goods to having, by the mid-70s, a reputation as the maker of the world’s highest-quality and highest-value automobiles, electronics, and many other consumer goods. His ideas of continuous improvement were long rejected in the US because they did not fit with autocratic US corporate culture; yet in the 80s and 90s, US industry imported many Japanese manufacturing consultants on the strength of the reputation for quality and efficiency that Japan had gained, ironically, as a direct result of adopting Deming’s ideas.

    Deming’s ideas, rejected by US captains of industry for decades, swept through the entire Asian world and are largely responsible for the fact that Asian manufacturers are still kicking US industry’s ass today in markets as diverse as cars, cell phones, personal computers, and solar cells. Where would American workers be without such enlightened and visionary US corporate management? Perhaps still in the middle class instead of in unemployment lines or among the ranks of the working poor.

    Deming’s PDCA continuous improvement cycle constitutes the next four steps (8 through 11) of AHA!

  8. PLAN

    Establish the objectives and cognitive processes necessary to deliver results in accordance with the expected output. By making the expected output the focus, it differs from other techniques in that the completeness and accuracy of the specification is also part of the improvement.

  9. DO
    Implement the new cognitive processes, often on a small group scale if possible.
  10. CHECK
    Measure the new cognitive processes and compare the results against the expected results to ascertain any differences.
  11. ACT
    Analyze the differences to determine their cause. Each difference will relate to one or more of the P-D-C-A steps. Determine where to apply cognitive changes that will produce improvement. When a pass through these four steps does not result in the need to improve, refine the scope to which PDCA is applied until there is a plan that involves improvement. (A minimal code sketch of this loop appears after this list.)
  12. The final step of the AHA! twelve-step program: Having had a cognitive awakening and transformation as the result of these steps, we try to carry this message to others, and to practice these principles in all our affairs.
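Below is a minimal, illustrative sketch (in Python) of the PDCA loop described in steps 8 through 11, applied to a made-up personal “cognitive inventory” metric. Every name and number in it is hypothetical; it is only meant to show the shape of plan-do-check-act as an iterative cycle, not a real implementation of the AHA! program.

```python
import random

random.seed(1)  # reproducible demo runs

def do(error_rate, trials=200):
    """DO: try the planned change on a small scale and measure the outcome."""
    errors = sum(random.random() < error_rate for _ in range(trials))
    return errors / trials

def check(observed, target):
    """CHECK: compare the measured result against the expected output."""
    return observed <= target

# PLAN: the expected output (a target error rate) and a list of hypothetical
# changes to try, each paired with the error rate we pretend it produces.
target = 0.20
candidate_changes = [
    ("no change (baseline)", 0.30),
    ("write decisions down before acting", 0.24),
    ("ask one other person to review each decision", 0.15),
]

for cycle, (change, simulated_rate) in enumerate(candidate_changes, start=1):
    observed = do(simulated_rate)
    if check(observed, target):
        # ACT: the change met the target, so standardize it and stop iterating.
        print(f"cycle {cycle}: '{change}' -> {observed:.2f}; target met, adopt the change")
        break
    # ACT: the change fell short, so revise the plan and go around the wheel again.
    print(f"cycle {cycle}: '{change}' -> {observed:.2f}; target missed, revise and repeat")
```

The point of the loop, as in Deming’s version, is that the expected output is stated up front and each pass either standardizes a change that worked or sends you back to revise the plan.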

The principles and practices of cognitive modification (cognitive hygiene)

  1. Linguistic education and training. Although humans could think before they developed language, language has now become the brick, mortar, timber, metal and glass of conscious, rational thought and social interaction. Cognitive hygiene requires education and practice in formal language but not merely for the sake of constructing well-formed internal thoughts. Improving cognitive hygiene depends heavily on group dynamics and interaction, which require a common set of effective communication skills. Language is one of the most recently evolved abilities of the brain, and language training reinforces the dominance of the higher reasoning centers and networks over the earlier and more primitive functions of the brain. In addition to vocabulary, grammar, English usage and style, etc. (linguistic prescription), linguistic education involves the skills of well-crafted logic and argument (debate). The recognition of logical fallacies, a major cognitive pitfall, can be taught best in this context as well.
    Wikipedia: The origin of language: The main difficulty of the question [of language origins] stems from the fact that it concerns a development in deep prehistory which left no direct fossil traces and for which no comparable processes can be observed today.[2] The time range under discussion in this context extends from the phylogenetic separation of Homo and Pan some 5 million years ago to the emergence of full behavioral modernity some 50,000 years ago. The evolution of fully modern human language requires the development of the vocal tract used for speech production and the cognitive abilities required to produce linguistic utterances… It is mostly undisputed that pre-human australopithecines did not have communication systems significantly different from those found in great apes in general, but scholarly opinions vary as to the developments since the appearance of Homo some 2.5 million years ago. Some scholars assume the development of primitive language-like systems (proto-language) as early as Homo habilis, while others place the development of primitive symbolic communication only with Homo erectus (1.8 million years ago) or Homo heidelbergensis (0.6 million years ago) and the development of language proper with Homo sapiens sapiens less than 100,000 years ago.

    Wikipedia: History of concepts of the origin of language

    Thomas Hobbes, followed by John Locke and others, said that language is an extension of the “speech” that humans have within themselves as part of reason, one of the most primary characteristics of human nature. Hobbes in Leviathan while postulating as did Aristotle that language is a prerequisite for society, attributed it to innovation and learning after an initial impulse by God:[16]

    But the most noble and profitable invention of all others was that of speech … whereby men register their thoughts, recall them when they are past, and also declare them to one another for mutual utility and conversation; without which there had been amongst men neither commonwealth, nor society, nor contract, nor peace, no more than amongst lions, bears and wolves. The first author of speech was God himself, that instructed Adam how to name such creatures as He presented to his sight; for the Scripture goeth no further in this matter.”

    In Hobbes, man proceeds to learn on his own initiative all the words not taught by God: “figures, numbers, measures, colours ….” which are taught by “need, the mother of all inventions.” Hobbes, one of the first rationalists of the Age of Reason, identifies the ability of self-instruction as reason:[17]

    “For reason, in this sense, is nothing but reckoning … of the consequences of general names agreed upon for the marking and signifying of our thoughts; ….”

    Others have argued the opposite, that reason developed out of the need for more complex communication. Rousseau, despite writing[18] before the publication of Darwin‘s theory of evolution, said that there had once been humans with no language or reason who developed language first, rather than reason, the development of which he explicitly described as a mixed blessing, with many negative characteristics.

    Since the arrival of Darwin, the subject has been approached more often by scientists than philosophers. For example, neurologist Terrence Deacon in his Symbolic Species has argued that reason and language “coevolved“. Merlin Donald sees language as a later development building upon what he refers to as mimetic culture,[19] emphasizing that this coevolution depended upon the interactions of many individuals. He writes:

    A shared communicative culture, with sharing of mental representations to some degree, must have come first, before language, creating a social environment in which language would have been useful and adaptive.[20]

    The specific causes of the natural selection that led to language are, however, still the subject of much speculation, but a common theme, going back to Aristotle, is that the gains to be had from language and/or reason were probably mainly in the area of increasingly sophisticated social structures.

    In more recent times, a theory of mirror neurons has emerged in relation to language. Ramachandran[21] has gone so far as to argue that “mirror neurons will do for psychology what DNA did for biology: they will provide a unifying framework and help explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments”. Mirror neurons are located in the human inferior frontal cortex and superior parietal lobe, and are unique in that they fire when one completes an action and also when one witnesses an actor performing the same action. Various studies have proposed a theory of mirror neurons related to language development.[22][23][24]

  2. Cognitive neuroscience education. General and specific material to support all the other cognitive modification goals and practices.
  3. Unconscious cognitive biases, implicit associations, and repetitive automatic thoughts. These items can be identified and quantified by computer tests and questionnaires. Once identified, they become the subjects of self-observation and cognitive de-conditioning/retraining via cognitive behavioral therapy (CBT), cognitive restructuring, and similar methods. (A simplified sketch of how one such test, the implicit-association test, can be scored appears after this list.)
  4. Attention training. This includes self-observation of mental and physical states and behaviors, focused attention, dual attention, mindfulness meditation, memory practices, brainwave bio-feedback, Tai Chi (proprioceptive awareness), etc.
  5. Behavior modification. Behavioral self-awareness, applied behavior analysis, operant conditioning, etc.
  6. Radical honesty.
  7. Cognitive integration and meta-cognition (thinking about thinking). Methods and practices to improve the integration and coordination of existing neural networks and functional centers, and reverse the bottom-up evolutionary order of precedence so that the highest (most recent) cognitive areas and functions such as the rational neo-cortex take greater precedence over more primitive areas like the “limbic” (emotional) system. This area of work is still largely speculative and experimental and at this date would consist primarily of research.
  8. The Deming System of Profound Knowledge. This is a catch-all category for research into ways that Deming’s theories of quality control and continuous improvement can be applied to cognitive modification.

    “The prevailing [default] style of management [cognition] must undergo transformation. A system cannot understand itself. The transformation requires a view from outside….

    “The first step is transformation of the individual. This transformation is discontinuous. It comes from understanding of the system of profound knowledge. The individual, transformed, will perceive new meaning to his life, to events, to numbers, to interactions between people.

    “Once the individual understands the system of profound knowledge, he will apply its principles in every kind of relationship with other people. He will have a basis for judgment of his own decisions and for transformation of the organizations that he belongs to. The individual, once transformed, will:

    • Set an example;
    • Be a good listener, but will not compromise;
    • Continually teach other people; and
    • Help people to pull away from their current practices and beliefs and move into the new philosophy without a feeling of guilt about the past.”

    Deming advocated that all managers need to have what he called a System of Profound Knowledge, consisting of four parts:

    1. Appreciation of a system: understanding the overall processes involving suppliers, producers, and customers (or recipients) of goods and services (explained below);
    2. Knowledge of variation: the range and causes of variation in quality, and use of statistical sampling in measurements;
    3. Theory of knowledge: the concepts explaining knowledge and the limits of what can be known (see also: epistemology);
    4. Knowledge of psychology: concepts of human nature.

    Deming explained, “One need not be eminent in any part nor in all four parts in order to understand it and to apply it. The 14 points for management in industry, education, and government follow naturally as application of this outside knowledge, for transformation from the present style of Western management to one of optimization.”

    “The various segments of the system of profound knowledge proposed here cannot be separated. They interact with each other. Thus, knowledge of psychology is incomplete without knowledge of variation.

    “A manager of people needs to understand that all people are different. This is not ranking people. He needs to understand that the performance of anyone is governed largely by the system that he works in, the responsibility of management. A psychologist that possesses even a crude understanding of variation as will be learned in the experiment with the Red Beads (Ch. 7) could no longer participate in refinement of a plan for ranking people.”[21]

    The Appreciation of a system involves understanding how interactions (i.e., feedback) between the elements of a system can result in internal restrictions that force the system to behave as a single organism that automatically seeks a steady state. It is this steady state that determines the output of the system rather than the individual elements. Thus it is the structure of the organization rather than the employees, alone, which holds the key to improving the quality of output.

    The Knowledge of variation involves understanding that everything measured consists of both “normal” variation due to the flexibility of the system and of “special causes” that create defects. Quality involves recognizing the difference to eliminate “special causes” while controlling normal variation. Deming taught that making changes in response to “normal” variation would only make the system perform worse. Understanding variation includes the mathematical certainty that variation will normally occur within six standard deviations of the mean. (A small control-chart sketch of this distinction appears after this list.)

  9. Cognitive self-help Group: In addition to serving as one venue or vehicle for many of the preceding methods and practices, the group setting promotes non-verbal communication skills, listening, assertiveness, boundaries, and numerous other social and cognitive skills.
  10. Wikipedia: Mental health self-help groups: In most cases, the group becomes a miniature society that can function like a buffer between the members and the rest of the world.[19] The most essential processes are those that meet personal and social needs in an environment of safety and simplicity. Elegant theoretical formulations, systematic behavioral techniques, and complicated cognitive-restructuring methods are not necessary.[11]

    Despite the differences, researchers have identified many psychosocial processes occurring in self-help groups related to their effectiveness. This list includes, but is not limited to: acceptance, behavioral rehearsal, changing members’ perspectives of themselves, changing members’ perspectives of the world, catharsis, extinction, role modeling, learning new coping strategies, mutual affirmation, personal goal setting, instilling hope, justification, normalization, positive reinforcement, reducing social isolation, reducing stigma, self-disclosure, sharing (or “opening up”), and showing empathy.[5][6][8][11][19][20][21]

    Five theoretical frameworks have been used in attempts to explain the effectiveness of self-help groups.[5]

    1. Social support: Having a community of people to give physical and emotional comfort, people who love and care, is a moderating factor in the development of psychological and physical disease.
    2. Experiential knowledge: Members obtain specialized information and perspectives that other members have obtained through living with severe mental illness. Validation of their approaches to problems increase their confidence.
    3. Social learning theory: Members with experience become credible role models.
    4. Social comparison theory: Individuals with similar mental illness are attracted to each other in order to establish a sense of normalcy for themselves. Comparing one another to each other is considered to provide other peers with an incentive to change for the better either through upward comparison (looking up to someone as a role model) or downward comparison (seeing an example of how debilitating mental illness can be).
    5. Helper theory: Those helping each other feel greater interpersonal competence from changing others’ lives for the better. The helpers feel they have gained as much as they have given to others. The helpers receive “personalized learning” from working with helpees. The helpers’ self-esteem improves with the social approval received from those they have helped, putting them in a more advantageous position to help others.

    A framework derived from common themes in empirical data describes recovery as a contextual nonlinear process, a trend of general improvement with unavoidable paroxysms while negotiating environmental, socioeconomic and internal forces, motivated by a drive to move forward in one’s life. The framework identified several negotiation strategies, some designed to accommodate illnesses and others designed to change thinking and behavior. The former category includes strategies such as acceptance and balancing activities. The latter includes positive thinking, increasing one’s own personal agency/control and activism within the mental health system.[22]

  11. Community. This could range from a community of affiliated cognitive self-help groups to one or more complex, self-reliant, and sustainable communities or “micro-cultures” serving a broad variety of social, educational, and economic functions with cognitive modification at the core of each one. Such a micro-culture could provide a full spectrum of venues, each having appropriate cognitive hygiene processes and objectives at its core in addition to its other activity:
    • cognitive self-help groups
    • skilled trades and professional work groups
    • green agriculture, cottage industries, and commercial enterprises
    • medical, professional, and scientific facilities
    • formal educational venues
    • and many others

    Such a complex, cognitively optimized community might offer the most effective possible matrix for rapid human cognitive development.
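As a concrete illustration of item 3 above, here is a deliberately simplified sketch (in Python, with invented data) of how an implicit-association style test can be scored: the difference between mean response times in the “incongruent” and “congruent” pairing blocks, divided by the pooled standard deviation of all latencies. Real IAT scoring (for example, the Greenwald et al. D-score algorithm) involves more trial filtering and block structure than shown here; treat this only as a toy.

```python
from statistics import mean, stdev

# Invented reaction times in milliseconds for two sorting conditions of an
# implicit-association style task. In the "congruent" block the pairing matches
# the hypothesized association; in the "incongruent" block it conflicts with it.
congruent_ms = [620, 580, 640, 610, 590, 600, 630, 615]
incongruent_ms = [720, 760, 700, 780, 740, 730, 750, 710]

# Simplified D-like score: difference of block means over the pooled SD of all trials.
pooled_sd = stdev(congruent_ms + incongruent_ms)
d_score = (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

print(f"mean congruent {mean(congruent_ms):.0f} ms, mean incongruent {mean(incongruent_ms):.0f} ms")
print(f"simplified D score: {d_score:.2f} (larger positive values suggest a stronger implicit association)")
```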
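And to illustrate the “knowledge of variation” point in item 8, here is a minimal Shewhart-style control-chart sketch in Python, using the conventional center line plus and minus three standard deviations (a six-standard-deviation total band). The numbers are made up; the point is only to show how points inside the limits are treated as common-cause variation to be left alone, while points outside them are flagged as special causes worth investigating.

```python
from statistics import mean, stdev

# Baseline data assumed to reflect the system running normally (common causes only).
baseline = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 9.7, 10.1]
center = mean(baseline)
sigma = stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma

# New measurements to judge against those limits.
new_points = [10.0, 9.9, 13.5, 10.2]

print(f"center {center:.2f}, control limits [{lower:.2f}, {upper:.2f}]")
for i, x in enumerate(new_points, start=1):
    if lower <= x <= upper:
        # Inside the limits: ordinary common-cause variation; per Deming,
        # reacting to these points individually only makes the system worse.
        verdict = "common-cause variation, leave the system alone"
    else:
        # Outside the limits: treat as a special cause worth investigating.
        verdict = "special cause, investigate this point specifically"
    print(f"point {i}: {x} -> {verdict}")
```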

Poor Richard
7/29/2010

Externalizing Reality

In economic theory, an externality is any cost or benefit not accounted for in a calculation of profit or loss. Classic examples are the cost of pollution not included in the price of a manufactured product, the death of coal miners not included in the price of electricity, and the cost of mass murder or the little matter of global warming not included in the price of oil and gasoline.
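For a toy illustration of the arithmetic (all numbers invented): if producing a unit of some good costs the producer $10 but imposes another $3 of pollution damage on bystanders, the market price only has to cover the $10, and the remaining $3 per unit is the externality.

```python
# Toy externality arithmetic with invented numbers: the market price reflects the
# producer's private cost, while the external cost is borne by third parties.
private_cost_per_unit = 10.0   # paid by the producer and passed on in the price
external_cost_per_unit = 3.0   # e.g., pollution damage per unit, paid by no one in the transaction
units_sold = 1_000

social_cost_per_unit = private_cost_per_unit + external_cost_per_unit
unpriced_harm = external_cost_per_unit * units_sold

print(f"social cost per unit: ${social_cost_per_unit:.2f}")
print(f"total harm left out of the market price: ${unpriced_harm:,.2f}")
```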

Economic externalities are only a small subset of a more general category I call cognitive externalities–anything that is filtered out of our mental picture of the world around us.

We all externalize parts of reality, not because they are unknowable, but because they are unpleasant or inconvenient. That is the principal basis of all our corruption, all our dis-enlightenment. We all do it. It’s in our DNA. But the costs or consequences of externalities, in economic models or in any other domain of reality, are disproportionately borne by the poor and powerless. One of the worst examples of externalized reality is this: despite some remnants of local color from country to country, the new world order is a global East India Company with helicopter gunships. A Martian anthropologist studying the last five thousand years or so of human history would have to conclude that the primary industry of our species is conducting mass murder for profit and that the masses, even in the dominant cultures, have all devolved into cargo cults.

If cargo cults are mentioned in anyone’s personal library of mental narratives, they probably take the form of a story about the peculiar behavior of small numbers of islanders somewhere in the South Pacific in some prior era. Am I the only person with a story in her head about how that same behavior shows through in all of us under the euphemistic label of “consumerism”?

People live by stories. Each person’s head holds a library of short and long narratives and we pull one off the shelf that fits something about any particular situation or circumstance we meet from moment to moment. Too often these stories are on the level of children’s picture books, suggesting simple but wrong solutions to complex problems or situations. Most of us have stories about history that are wrong, stories about our families that are wrong, stories about nature that are wrong, and stories about ourselves that are wrong. And anything that doesn’t exist in the current active mental story, right or wrong, is externalized from a person’s reality in that moment.

Sometimes, reality is externalized on purpose. The principal weapon of special interests today is information asymmetry, a simple idea (better known to most of us as fraud, deception, marketing, public relations, spin, infotainment, etc.) that won a Nobel Prize for economics. This has resulted in a vast and thriving industry of disinformation and information pollution that corrupts and perverts every institution of society. But by far the most destructive lies are the ones we tell ourselves.

Our addiction to self-delusion is encouraged and enabled by a liar’s code: if you don’t unmask me, I won’t defrock you. Popes, presidents, senators, CEOs, teachers, and parents set the example for one and all.

Of course there is such a thing as an ethical (justified) lie, a lesser evil than some dire alternative, but self-deception dissolves sanity itself. Identity itself becomes externalized. Self-awareness fails and then, as Yeats said, “Things fall apart; the centre cannot hold; Mere anarchy is loosed upon the world.” This is the truly unpardonable sin. But it won’t be avoided by force of will, strength of character, or high moral ideals. Our cognitive deformity, self-delusion, settled upon us by evolution, will be undone not by willpower, for which humanity is not noted, but mostly by wit, art and innovation–things we are good at.

The opposite of the unpardonable sin of self-deception is liberation from self-imposed delusion–especially delusions about ourselves. The ability to tolerate cognitive dissonance and look clearly at uncomfortable facts is the essence of authentic enlightenment. That imperative was inscribed at the entrance of the ancient Greek temple of the Oracle at Delphi: “Know Thyself.”

Externalizing inconvenient reality (sometimes called denial, self deception, willful ignorance, or preserving cognitive consonance) is a coping mechanism. I would never suggest that we discard a coping mechanism without replacing the truly protective parts of it with something new. In fact with many, many new things.

The Greeks knew what they didn’t know (self-knowledge) but their philosophical methods were empirically weak. Today we know how to come by that knowledge–by the scientific method. We must discover and invent new cognitive prophylactics and prosthetics not as Sir Thomas More invented Utopia or as Reagan-era bean counters invented “Trickle-Down Economics”, but as Eli Whitney invented the cotton gin: with all the real working parts. We need a science and technology of cognitive hygiene and end-to-end information quality control. Despite living in an “age of science,” we still mostly resort to authority and reputation to judge the quality of information. I guess there are many reasons that “fact checking” remains in the dark ages. Information Quality Management is fine for database administrators, but we human beings reserve the right to our own facts, just as we reserve the right to mate with the worst possible partner. Still, without surrendering such rights, it might be nice if the scientific/academic community devoted more effort to producing a science and technology of information quality assurance that we could consult or ignore at our own risk.

In addition to empirical knowledge, like that which we might gain from brain signals, functional MRI pictures, or implicit association tests, enlightenment grows from coaching and practice with the object of re-engineering faulty parts of the operating system of the brain. Unlike genetic engineering, it requires exercise and training much as any physical, athletic ability.

I’m not drumming up a utopia built on some cult of cognitive science. But we MUST discover alternative practical means to protect ourselves from that suffering which we seek to evade by externalizing reality. As we do, we may find that workable solutions to nearly every other problem and crisis are already on the table.

Poor Richard

“The Beginning of Wisdom 3.0”

“The Enlightenment 2.0”

“The Inner Hunchback”

“Is Spiritual the New Supernatural?”