Perennial Wisdom vs. Cultural Graffiti

Many of the earliest known teachings about wisdom and self-knowledge include methods of meditation and breathing. We find these teachings in every culture around the world, past and present, and we can trace them back to every dead culture for which we have evidence. It is possible that these teachings originated before writing, and perhaps before language itself. The wisdom teachings we know today as meditation and breathing methods may go back to our earliest cognitive experiences of self-awareness, even before the dawn of Homo sapiens.

Although such ancient teachings may have had independent origins in many different times and places, I suspect that all these teachings about meditation and breathing may have a common source, and that source might actually be our own DNA.

There are numerous instinctive patterns of breathing and mental states suited to different types of activity.

Although there is probably some cognitive “wisdom” correlated with all such possible states, here I want to focus only on the state of rest or repose prior to sleep, and in particular on the state of intentional self-calming and relaxation on a mental and physical level concurrently. This could be characterized as a process of progressive relaxation combined with a characteristic way of breathing. Although the optimal patterns, processes, and progressions are only vaguely defined, they include:

  • Synchronization of circulation, muscle action, organ functions, neural networks, etc.
  • Process cycles, oscillators, timing and re-timing cascades
  • Self-massage of internal organs and muscle sets. Optimizing circulation.
  • Balance and equilibrium approached progressively, step by step, through repeated cycles of stretching (inhalation) and relaxing (exhalation).

Muscles are like taffy: they tend to stiffen into any long-held position. Stretch-relax cycles plasticize the tissue, allowing stress and balance to redistribute and equilibrium to emerge.

Three stages:

  1. Beginning: The first step is mental; attention turns from thought to quiet observation of the body, muscles, breath, etc.
  2. Middle: As tensions are located and relaxed, the breathing pattern gradually gets deeper, larger, and more synchronized. The deep breathing creates passive mobilization of joints, alternate muscle sets, and organs. All gradually assume their ideal positions and orientations.
  3. Ending: When all the tissues and organs have experienced a sufficient period of rhythmic flexing and have assumed their ideal configuration, the breathing gets shallower until motion is minimized at a level just adequate to provide sufficient blood/air exchange in the lungs to support the resting, or idling, body and brain.

This type of meditation and breathing is ideal for introducing sleep but can also be practiced as a “tune up” in many other situations.

These three stages of progression probably have many subdivisions that could be teased out by subtle physiological measurements.

At each point along the progression of this process, with each breath, there may be an optimal rate of inhalation and exhalation, an optimal volume of expansion and contraction, and an optimal pause in transition. These parameters cannot be prescribed by theory or taught in any exact, static, one-size-fits-all paradigm. We can only anticipate that such optimums exist and that they can, and must, be directly sensed by each individual in each moment.

Each species presumably has its own distinctive set of patterns for this process of retuning or toning the brain and body.

These instinctive patterns have not been invented by us. They have been invented by evolution and are only crudely understood by the mind and by the traditions that attempt to “teach” them.

Our cognitive attempts to recognize, ponder, and practice this process consciously could possibly be the very first example of organic biology percolating up into mindfulness and ultimately taking shape as individual and collective human wisdom.

Cognitive models of these phenomena vary. Many are crude and veer off on tangents influenced by any number of agendas. Most end up in “the weeds”. But as long as the core elements remain, calming thought, turning attention to finding and releasing tensions, and expanding the breathing cycles, each individual has an opportunity to follow a gradient in these perceptual “tastes” towards the “zone” of equilibration and instinctive “master rhythm”.

This process may involve not only joints, tissues, and organs but perhaps even cellular and sub-cellular processes. It may also involve rerouting neural network/module inputs and outputs in novel ways that increase functional neural integration and/or re-order the functional dominance hierarchy of one area or network over another. One example would be new or increased mapping of the motor, parasympathetic, or proprioceptive neural networks onto the neocortex, the newest and outermost layer of the brain, which seems to be all about integrating, organizing, and coordinating older brain areas in new ways. Practices such as yoga and tai chi may also help to increase brain integration in a similar way.
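The idea of following a perceptual gradient towards a “zone” of equilibrium can be sketched as a toy hill-climbing loop. Everything here is hypothetical: the comfort() function, the parameter names, and the numbers are invented for illustration, not drawn from any physiological model.

```python
import random

def comfort(rate, depth):
    # Hypothetical "felt sense" with a single optimum at a slow,
    # deep breathing pattern (6 breaths/min, 0.7 relative depth).
    return -((rate - 6.0) ** 2) - ((depth - 0.7) ** 2)

def follow_gradient(rate, depth, steps=2000, step_size=0.05, seed=42):
    """Blind hill climbing: try a small random adjustment each breath,
    keep it only if it feels better, otherwise stay put."""
    rng = random.Random(seed)
    best = comfort(rate, depth)
    for _ in range(steps):
        r = rate + rng.uniform(-step_size, step_size)
        d = depth + rng.uniform(-step_size, step_size)
        c = comfort(r, d)
        if c > best:
            rate, depth, best = r, d, c
    return rate, depth

# Starting from fast, shallow breathing, the loop drifts toward the optimum.
rate, depth = follow_gradient(rate=12.0, depth=0.3)
print(round(rate, 1), round(depth, 1))  # settles near 6.0 and 0.7
```

The point of the sketch is that no table of “correct” parameters is needed; a sensed better/worse signal at each step is enough to approach the optimum.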

What makes this authentic wisdom is that it is self-knowledge gained through a combination of self-observation and rational thought. It is a combination of instinctive biological activity with conscious cognitive functions in which each enhances the other. The instinctive pattern of breathing becomes associated with a passive but observant mental state and the conscious mind becomes able to initiate and facilitate the instinctive activity. Such combinations promote new and higher cognitive and biological integrations. I think this is the reality base and the functional nucleus of most forms of yoga, meditation, and other “spiritual” practices. Unfortunately, this core reality often gets obfuscated or entirely lost in inappropriate language, ideas, and extraneous associations. Then the potential for wisdom is perverted into its own opposite, magical thinking and folly.

I guess this is my non-mystical version of the “perennial philosophy”. People in all cultures and all ages find a few common truths hidden in plain sight and proceed to embellish them with all sorts of extraneous associations drawn from the culture, natural history, and language of the respective place and time. This process further buries and obscures what was always hidden in plain sight under more and more layers of cultural graffiti.

The human brain is brilliant at connecting dots. List three facts on a blackboard and anybody can instantly make up a story about them. In fact, we can’t help it. We have to make up a story. Our brains are wired that way. The problem is that we go around spontaneously and compulsively connecting all kinds of dots that really shouldn’t be connected. Pretty soon we are all lost in the tall weeds of our own bullshit.

“Just the facts, Ma’am.”

Did Dragnet’s Sgt. Joe Friday really say that, or not?

Poor Richard

Related PRA 2010 posts:

Know Then Thyself

by Alexander Pope

Know then thyself, presume not God to scan;
The proper study of mankind is man.
Placed on this isthmus of a middle state,
A being darkly wise, and rudely great:
With too much knowledge for the sceptic side,
With too much weakness for the stoic’s pride,
He hangs between; in doubt to act, or rest;
In doubt to deem himself a God, or beast;
In doubt his mind and body to prefer;
Born but to die, and reas’ning but to err;
Alike in ignorance, his reason such,
Whether he thinks too little, or too much;
Chaos of thought and passion, all confus’d;
Still by himself, abus’d or disabus’d;
Created half to rise and half to fall;
Great lord of all things, yet a prey to all,
Sole judge of truth, in endless error hurl’d;
The glory, jest and riddle of the world.


AHA!

AHA! = Average Humans Anonymous!

(A 12-step program for cognitive enhancement)

What is an “average” human?

Modern humans are known taxonomically as Homo sapiens (Latin: “wise man” or “knowing man”).

Mitochondrial DNA and fossil evidence indicates that anatomically modern humans originated in Africa about 200,000 years ago. (Wikipedia: Homo Sapiens)

Of course, 200,000 years ago we were not nearly as wise or knowing, not nearly as sapient, as we are (or think we are) today.

Behavioral modernity is a term used in anthropology, archeology and sociology to refer to a set of traits that distinguish present day humans and their recent ancestors from both living primates and other extinct hominid lineages. It is the point at which Homo sapiens began to demonstrate a reliance on symbolic thought and to express cultural creativity. These developments are often thought to be associated with the origin of language.[1]

There are two main theories regarding when modern human behavior emerged.[2] One theory holds that behavioral modernity occurred as a sudden event some 50 kya (50,000 years ago), possibly as a result of a major genetic mutation or as a result of a biological reorganization of the brain that led to the emergence of modern human natural languages.[3] Proponents of this theory refer to this event as the Great Leap Forward[4] or the Upper Paleolithic Revolution.

The second theory holds that there was never any single technological or cognitive revolution. Proponents of this view argue that modern human behavior is basically the result of the gradual accumulation of knowledge, skills and culture occurring over hundreds of thousands of years of human evolution.[5] Proponents of this view include Stephen Oppenheimer in his book Out of Eden, and John Skoyles and Dorion Sagan in their book Up from Dragons: The evolution of human intelligence. (Wikipedia: Behavioral Modernity)

Whenever behavioral modernity may have settled upon Homo sapiens, the beginnings of it are lost in prehistory, in ages far before we have any clear and unambiguous physical or historical evidence. The fields of evolutionary psychology and behavioral genetics promise to shed new light on the origins of modern human behavior, but they are only in the very early stages of their own evolution as scientific disciplines.

Nevertheless, the important point is that the typical, average, or normal human has a brain that is an evolutionary work-in-progress.

We invented agriculture only about 10,000 years ago, which in brain-evolution time is like ten seconds ago. In those 10,000 years (or ten seconds) our brains have not had time to really get it right. Our agricultural methods still cause too much long-term damage to the very resources we depend on to remain productive in the future. Instead of improving the resource base over time, as brainless nature does, we are still destroying it faster than ever before. The situation with energy and manufacturing is just as bad. Our technology develops far faster than the brains we use to plan and manage its applications, hoping to maximize productivity and avoid drastic unintended consequences.

Our track record is not so good.

Interesting times

“May you live in interesting times,” often referred to as the Chinese curse, was the first of three curses of increasing severity, the other two being:

  • May you come to the attention of those in authority
  • May you find what you are looking for

(Wikipedia)

It is only in very recent, recorded history that humanity has come so close to achieving true greatness. Only recently have the consequences of human behavior become so great and so visible. That makes the present day the most interesting time in all of human history.

In the past, the planetary environment was vast in proportion to all the cumulative impacts of human populations. Over a fairly recent period of time, however, humanity has turned a corner or crossed a tipping point where the environment is no longer large enough to fully absorb and erase all the effects that human activity creates. Those human effects are overtaking the planet’s homeostatic systems and causing ecological processes and environments to degrade or permanently fail. We can see this in species extinctions, failing hydrological systems, changing ocean currents and weather systems, and now even in planetary temperature regulation and rising sea levels.

The most interesting thing about these times is the extent to which the external world has become our mirror. Almost everything that’s wrong with our culture and our environment now is a result of human behavior and can be traced backwards to an evolutionary origin in the normal, anatomically and behaviorally modern, human brain.

Plato’s Allegory of the Cave (animated video)

Madness and normality

The problem with the modern human brain isn’t what we call clinical, DSM-level mental illness; it is sub-clinical. The problem is normality, which includes standard, predictable cognitive faults, irregularities, and distortions that belong to many kinds of so-called “spectrum disorders” but fall below the accepted level of clinical severity or are just too complex to disentangle.

It is the pandemic of typical, sub-clinical mental faults that causes poverty, crime, global warming, oil spills, the Iraq and Afghan wars, financial crises, bad government, etc. Any behavior which produces negative utility is irrational.

The main reason our times have become so “interesting” is not disease, not resource scarcity, not overpopulation. The root problem is our normal thinking and our typical behavior. We could cure all physical illness, all clinical mental illness, all poverty, war, etc. and we would still be hurtling just as fast (and probably even faster!) towards our own self-destruction! The problem is not what we have traditionally seen as illness or scarcity or other external threats. The problem is normality!

The root cause of our threatened survival is installed inside of every “healthy”, “normal” human being.

It’s in our DNA!

The standard brain: our normal cognitive faults, boo-boos, crutches, and placebos:

  • excessive bias towards simplicity and popularity of ideas and beliefs with little regard for accuracy
  • intolerance of ambiguity and cognitive dissonance
  • excessive sensitivity to emotional states and excessive positive bias (leading to addictions)
  • unconscious mental associations, cognitive biases, and behavior patterns
  • black-box (unconscious or pre-conscious) decision making with post hoc rationale
  • reverse-precedence cognitive hierarchy (highest, most recently evolved cognitive functions have lowest precedence)
  • fragmentation/compartmentalization (weak integration) of values, goals, personality, identity, and memory components
  • weak self-observation and attention management
  • automatic thoughts and behaviors (autopilot)
  • dishonesty
  • corruption
  • magical thinking (errors of causal association)
  • unconscious logical fallacies/errors
  • cultural biases (reinforcements for conformity, educational agenda biases, neuro-linguistic “dialects”, memes, etc.)
  • random and inconsistent neural programming (spaghetti code) from random experience/reinforcements
  • inappropriate psychological defense mechanisms (denial, self-delusion, wishful thinking, etc.)
  • linguistic deficiencies (formal thinking requires linguistic/grammatical/logical proficiency)

The 12 Steps of AHA!

By which we attempt to correct as many of the above cognitive boo-boos as possible:

We…

  1. Admit we are powerless over our thoughts, emotions, and moods; and over our sub-clinical neurotic or impulsive behavior disorders and cognitive disorders—that our lives have become unmanageable, and if we don’t fix ourselves, our species will probably hit the wall in fifty years or less.

    “The subjective experience of powerlessness over one’s emotions can generate multiple kinds of behavior disorders, or it can be a cause of mental suffering with no consistent behavioral manifestation, such as affective disorders.” (Wikipedia: Emotions Anonymous)

    “The cognitive mental disorder perspective is the theory that psychological disorders originate from an interruption, whether short or long, in our basic cognitive functions, i.e. memory processing, perception, problem solving and language. In distinction (or in addition) to this perspective are the psychodynamic mental disorder perspective, behavioral mental disorder perspective, sociocultural mental disorder perspective, interpersonal mental disorder perspective and neurological/biological mental disorder perspective. One pioneer of cognitive disorder perspective is Albert Ellis. In 1962, Ellis proposed that humans develop irrational beliefs/goals about the world; and therefore, create disorders in cognitive abilities[1]. Another pioneer of the cognitive disorder perspective is Aaron Beck. In 1967, Beck designed what is known as the “cognitive model” for emotional disorders, mainly depression[2]. His model showed that a blending of negative cognitive functions about the self, the world, and possible selves lead to cognitive mental disorders.” (Wikipedia: Cognitive disorders).

    Nearly all forms of clinical mental illness, such as post-traumatic stress disorder, (PTSD), Attention-Deficit Hyperactivity Disorder (ADHD), Obsessive–compulsive disorder (OCD), and Dissociative identity disorder (multiple personality disorder), have sub-clinical counterparts in nearly all normal individuals.

  2. Came to believe that a higher, more stable, and more consistent level of cognitive integration and functionality could be achieved through work on cognitive modification.
  3. Made a searching and fearless cognitive inventory of ourselves. A cognitive inventory consists of a self-assessment and a coached/group assessment of our cognitive faults (see list of “The standard human cognitive faults, boo-boos, crutches and placebos” above) using various assessment tools, tests, surveys, monitored exercises, etc.

    The Deming System of Profound Knowledge

    “The prevailing [default] style of [cognition] must undergo transformation. A system cannot fully understand itself. The transformation requires a view from outside….”

    “The first step is transformation of the individual. This transformation is discontinuous. It comes from understanding of the system of profound knowledge. The individual, transformed, will perceive new meaning to his life, to events, to numbers, to interactions between people.” (More on this later…)

  4. Admitted to ourselves and to others in our group the exact nature of our cognitive faults.
  5. Were entirely ready to give up all these cognitive defects and shortcomings.
  6. Made a list of all persons we had affected as a consequence of our cognitive defects, and made direct amends to such people wherever possible, except when to do so would harm them or others.
  7. Continued to take personal cognitive inventories and when we discovered faults promptly admitted and modified them via the Deming “PDCA” Cycle for Continuous Improvement:
      Wikipedia: PDCA (plan-do-check-act) is an iterative four-step problem-solving process typically used in business process improvement. It is also known as the Deming cycle, Shewhart cycle, Deming wheel, or plan-do-study-act.

    PDCA was made popular by Dr. W. Edwards Deming, who is considered by many to be the father of modern quality control; however, he always referred to it as the “Shewhart cycle”. Later in Deming’s career, he modified PDCA to “Plan, Do, Study, Act” (PDSA) so as to better describe his recommendations.

    The concept of PDCA is based on the scientific method, as developed from the work of Francis Bacon (Novum Organum, 1620). The scientific method can be written as “hypothesis” – “experiment” – “evaluation”; or plan, do, and check… According to Deming, during his lectures in Japan in the early 1950s, the Japanese participants revised the steps to the now traditional plan, do, check, act.

    Deming preferred plan, do, study, act (PDSA) because “study” has connotations in English closer to Shewhart’s intent than “check”.

    Wikipedia: William Edwards Deming “(October 14, 1900 – December 20, 1993) was an American statistician, professor, author, lecturer, and consultant. He is perhaps best known for his work in Japan. There, from 1950 onward he taught top management how to improve design (and thus service), product quality, testing and sales (the last through global markets), through various methods, including the application of statistical methods.

    Deming made a significant contribution to Japan’s later reputation for innovative high-quality products and its economic power. He is regarded as having had more impact upon Japanese manufacturing and business than any other individual not of Japanese heritage. Despite being considered something of a hero in Japan, he was only just beginning to win widespread recognition in the U.S. at the time of his death.” (Wikipedia)

    Though long unknown and unappreciated in the US, Deming is almost solely responsible for the transformation of Japanese industry from having, in my childhood, a reputation for manufacturing cheap junk goods to having, by the mid-’70s, a reputation as the maker of the world’s highest-quality and highest-value automobiles, electronics, and many other consumer goods. His ideas of continuous improvement were widely rejected in the US for decades because they did not fit with autocratic US corporate culture; yet in the ’80s and ’90s US industry imported many Japanese manufacturing consultants, owing to the reputation for quality and efficiency that Japan had gained, ironically, as a direct result of adopting Deming’s ideas.

    Deming’s ideas, rejected by US captains of industry for decades, swept through the entire Asian world and are largely responsible for the fact that Asian manufacturers are still kicking US industry’s ass today in markets as diverse as cars, cell phones, personal computers, and solar cells. Where would American workers be without such enlightened and visionary US corporate management? Perhaps still in the middle class instead of in unemployment lines or among the ranks of the working poor.

    Deming’s PDCA continuous improvement cycle constitutes the next four steps (8 through 11) of AHA!

  8. PLAN
     Establish the objectives and cognitive processes necessary to deliver results in accordance with the expected output. By making the expected output the focus, it differs from other techniques in that the completeness and accuracy of the specification is also part of the improvement.
  9. DO
     Implement the new cognitive processes, often on a small scale at first if possible.
  10. CHECK
     Measure the new cognitive processes and compare the results against the expected results to ascertain any differences.
  11. ACT
     Analyze the differences to determine their cause. Each will be part of either one or more of the P-D-C-A steps. Determine where to apply cognitive changes that will include improvement. When a pass through these four steps does not result in the need to improve, refine the scope to which PDCA is applied until there is a plan that involves improvement.
  12. The final step of the AHA! twelve-step program: Having had a cognitive awakening and transformation as the result of these steps, we try to carry this message to others, and to practice these principles in all our affairs.
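The Plan-Do-Check-Act cycle described above can be sketched as a single generic loop. This is only an illustration of the cycle’s shape; the function names, interfaces, and the numeric “expected output” are invented for the example, not part of Deming’s formalism.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CycleResult:
    observed: float
    expected: float
    action: str

def pdca(plan: str, expected: float, do: Callable[[str], float],
         tolerance: float = 0.1) -> CycleResult:
    """One pass of Plan-Do-Check-Act.
    PLAN:  the objective and its expected output are given by the caller.
    DO:    carry out the process, ideally on a small scale first.
    CHECK: measure the result and compare it against the expectation.
    ACT:   adopt the change if it met expectations, else adjust and re-plan.
    """
    observed = do(plan)                                   # DO
    gap = abs(observed - expected)                        # CHECK
    action = "adopt" if gap <= tolerance else "re-plan"   # ACT
    return CycleResult(observed, expected, action)

# A made-up example: a practice that nearly meets its expected outcome.
result = pdca("daily self-observation", expected=1.0, do=lambda plan: 0.95)
print(result.action)  # prints "adopt" (gap of 0.05 is within tolerance)
```

In real use the “re-plan” branch would feed back into another pass with a refined plan, which is what makes the cycle an engine of continuous improvement.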

The principles and practices of cognitive modification (cognitive hygiene)

  1. Linguistic education and training. Although humans could think before they developed language, language has now become the brick, mortar, timber, metal and glass of conscious, rational thought and social interaction. Cognitive hygiene requires education and practice in formal language but not merely for the sake of constructing well-formed internal thoughts. Improving cognitive hygiene depends heavily on group dynamics and interaction, which require a common set of effective communication skills. Language is one of the most recently evolved abilities of the brain, and language training reinforces the dominance of the higher reasoning centers and networks over the earlier and more primitive functions of the brain. In addition to vocabulary, grammar, English usage and style, etc. (linguistic prescription), linguistic education involves the skills of well-crafted logic and argument (debate). The recognition of logical fallacies, a major cognitive pitfall, can be taught best in this context as well.
    Wikipedia: The origin of language: The main difficulty of the question [of language origins] stems from the fact that it concerns a development in deep prehistory which left no direct fossil traces and for which no comparable processes can be observed today.[2] The time range under discussion in this context extends from the phylogenetic separation of Homo and Pan some 5 million years ago to the emergence of full behavioral modernity some 50,000 years ago. The evolution of fully modern human language requires the development of the vocal tract used for speech production and the cognitive abilities required to produce linguistic utterances… It is mostly undisputed that pre-human australopithecines did not have communication systems significantly different from those found in great apes in general, but scholarly opinions vary as to the developments since the appearance of Homo some 2.5 million years ago. Some scholars assume the development of primitive language-like systems (proto-language) as early as Homo habilis, while others place the development of primitive symbolic communication only with Homo erectus (1.8 million years ago) or Homo heidelbergensis (0.6 million years ago) and the development of language proper with Homo sapiens sapiens less than 100,000 years ago.

    Wikipedia: History of concepts of the origin of language:

    Thomas Hobbes, followed by John Locke and others, said that language is an extension of the “speech” that humans have within themselves as part of reason, one of the most primary characteristics of human nature. Hobbes in Leviathan while postulating as did Aristotle that language is a prerequisite for society, attributed it to innovation and learning after an initial impulse by God:[16]

    But the most noble and profitable invention of all others was that of speech … whereby men register their thoughts, recall them when they are past, and also declare them to one another for mutual utility and conversation; without which there had been amongst men neither commonwealth, nor society, nor contract, nor peace, no more than amongst lions, bears and wolves. The first author of speech was God himself, that instructed Adam how to name such creatures as He presented to his sight; for the Scripture goeth no further in this matter.”

    In Hobbes, man proceeds to learn on his own initiative all the words not taught by God: “figures, numbers, measures, colours ….” which are taught by “need, the mother of all inventions.” Hobbes, one of the first rationalists of the Age of Reason, identifies the ability of self-instruction as reason:[17]

    “For reason, in this sense, is nothing but reckoning … of the consequences of general names agreed upon for the marking and signifying of our thoughts; ….”

    Others have argued the opposite, that reason developed out of the need for more complex communication. Rousseau, despite writing[18] before the publication of Darwin‘s theory of evolution, said that there had once been humans with no language or reason who developed language first, rather than reason, the development of which he explicitly described as a mixed blessing, with many negative characteristics.

    Since the arrival of Darwin, the subject has been approached more often by scientists than philosophers. For example, neurologist Terrence Deacon in his The Symbolic Species has argued that reason and language “coevolved”. Merlin Donald sees language as a later development building upon what he refers to as mimetic culture,[19] emphasizing that this coevolution depended upon the interactions of many individuals. He writes:

    A shared communicative culture, with sharing of mental representations to some degree, must have come first, before language, creating a social environment in which language would have been useful and adaptive.[20]

    The specific causes of the natural selection that led to language are, however, still the subject of much speculation, but a common theme going back to Aristotle is that many theories propose that the gains to be had from language and/or reason were probably mainly in the area of increasingly sophisticated social structures.

    In more recent times, a theory of mirror neurons has emerged in relation to language. Ramachandran[21] has gone so far as to argue that “mirror neurons will do for psychology what DNA did for biology: they will provide a unifying framework and help explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments”. Mirror neurons are located in the human inferior frontal cortex and superior parietal lobe, and are unique in that they fire when one completes an action and also when one witnesses an actor performing the same action. Various studies have proposed a theory of mirror neurons related to language development.[22][23][24]

  2. Cognitive neuroscience education. General and specific material to support all the other cognitive modification goals and practices.
  3. Unconscious cognitive biases, implicit associations, and repetitive automatic thoughts. These items can be identified and quantified by computer tests and questionnaires. Once identified, they become the subjects of self-observation and cognitive de-conditioning/retraining via cognitive behavioral therapy (CBT), cognitive restructuring, and similar methods.
  4. Attention training. This includes self-observation of mental and physical states and behaviors, focused attention, dual attention, mindfulness meditation, memory practices, brainwave biofeedback, Tai Chi (proprioceptive awareness), etc.
  5. Behavior modification. Behavioral self-awareness, applied behavior analysis, operant conditioning, etc.
  6. Radical honesty.
  7. Cognitive integration and meta-cognition (thinking about thinking). Methods and practices to improve the integration and coordination of existing neural networks and functional centers, and to reverse the bottom-up evolutionary order of precedence so that the highest (most recently evolved) cognitive areas and functions, such as the rational neocortex, take greater precedence over more primitive areas like the “limbic” (emotional) system. This area of work is still largely speculative and experimental and at this date would consist primarily of research.
  8. The Deming System of Profound Knowledge. This is a catch-all category for research into ways that Deming’s theories of quality control and continuous improvement can be applied to cognitive modification.

    “The prevailing [default] style of management [cognition] must undergo transformation. A system cannot understand itself. The transformation requires a view from outside….

    “The first step is transformation of the individual. This transformation is discontinuous. It comes from understanding of the system of profound knowledge. The individual, transformed, will perceive new meaning to his life, to events, to numbers, to interactions between people.

    “Once the individual understands the system of profound knowledge, he will apply its principles in every kind of relationship with other people. He will have a basis for judgment of his own decisions and for transformation of the organizations that he belongs to. The individual, once transformed, will:

    • Set an example;
    • Be a good listener, but will not compromise;
    • Continually teach other people; and
    • Help people to pull away from their current practices and beliefs and move into the new philosophy without a feeling of guilt about the past.”

    Deming advocated that all managers need to have what he called a System of Profound Knowledge, consisting of four parts:

    1. Appreciation of a system: understanding the overall processes involving suppliers, producers, and customers (or recipients) of goods and services (explained below);
    2. Knowledge of variation: the range and causes of variation in quality, and use of statistical sampling in measurements;
    3. Theory of knowledge: the concepts explaining knowledge and the limits of what can be known (see also: epistemology);
    4. Knowledge of psychology: concepts of human nature.

    Deming explained, “One need not be eminent in any part nor in all four parts in order to understand it and to apply it. The 14 points for management in industry, education, and government follow naturally as application of this outside knowledge, for transformation from the present style of Western management to one of optimization.”

    “The various segments of the system of profound knowledge proposed here cannot be separated. They interact with each other. Thus, knowledge of psychology is incomplete without knowledge of variation.

    “A manager of people needs to understand that all people are different. This is not ranking people. He needs to understand that the performance of anyone is governed largely by the system that he works in, the responsibility of management. A psychologist that possesses even a crude understanding of variation as will be learned in the experiment with the Red Beads (Ch. 7) could no longer participate in refinement of a plan for ranking people.”[21]

    The Appreciation of a system involves understanding how interactions (i.e., feedback) between the elements of a system can result in internal restrictions that force the system to behave as a single organism that automatically seeks a steady state. It is this steady state that determines the output of the system rather than the individual elements. Thus it is the structure of the organization rather than the employees, alone, which holds the key to improving the quality of output.

    The Knowledge of variation involves understanding that everything measured consists of both “normal” variation due to the flexibility of the system and of “special causes” that create defects. Quality involves recognizing the difference in order to eliminate “special causes” while controlling normal variation. Deming taught that making changes in response to “normal” variation would only make the system perform worse. Understanding variation includes the mathematical expectation that nearly all normal variation will fall within three standard deviations on either side of the mean (a six-sigma spread).

  9. Cognitive self-help group: In addition to serving as one venue or vehicle for many of the preceding methods and practices, the group setting promotes non-verbal communication skills, listening, assertiveness, boundaries, and numerous other social and cognitive skills.
  10. Wikipedia: Mental health self-help groups: In most cases, the group becomes a miniature society that can function like a buffer between the members and the rest of the world.[19] The most essential processes are those that meet personal and social needs in an environment of safety and simplicity. Elegant theoretical formulations, systematic behavioral techniques, and complicated cognitive-restructuring methods are not necessary.[11]

    Despite the differences, researchers have identified many psychosocial processes occurring in self-help groups related to their effectiveness. This list includes, but is not limited to: acceptance, behavioral rehearsal, changing members’ perspectives of themselves, changing members’ perspectives of the world, catharsis, extinction, role modeling, learning new coping strategies, mutual affirmation, personal goal setting, instilling hope, justification, normalization, positive reinforcement, reducing social isolation, reducing stigma, self-disclosure, sharing (or “opening up”), and showing empathy.[5][6][8][11][19][20][21]

    Five theoretical frameworks have been used in attempts to explain the effectiveness of self-help groups.[5]

    1. Social support: Having a community of people to give physical and emotional comfort, people who love and care, is a moderating factor in the development of psychological and physical disease.
    2. Experiential knowledge: Members obtain specialized information and perspectives that other members have obtained through living with severe mental illness. Validation of their approaches to problems increases their confidence.
    3. Social learning theory: Members with experience become credible role models.
    4. Social comparison theory: Individuals with similar mental illnesses are attracted to each other in order to establish a sense of normalcy for themselves. Comparing themselves to one another is thought to give peers an incentive to change for the better, whether through upward comparison (looking up to someone as a role model) or downward comparison (seeing an example of how debilitating mental illness can be).
    5. Helper theory: Those helping each other feel greater interpersonal competence from changing others’ lives for the better. The helpers feel they have gained as much as they have given to others. The helpers receive “personalized learning” from working with helpees. The helpers’ self-esteem improves with the social approval received from those they have helped, putting them in a more advantageous position to help others.

    A framework derived from common themes in empirical data describes recovery as a contextual nonlinear process, a trend of general improvement with unavoidable paroxysms while negotiating environmental, socioeconomic, and internal forces, motivated by a drive to move forward in one’s life. The framework identified several negotiation strategies, some designed to accommodate illnesses and others designed to change thinking and behavior. The former category includes strategies such as acceptance and balancing activities. The latter includes positive thinking, increasing one’s own personal agency/control, and activism within the mental health system.[22]

  11. Community. This could range from a community of affiliated cognitive self-help groups to one or more complex, self-reliant, and sustainable communities or “micro-cultures” serving a broad variety of social, educational, and economic functions with cognitive modification at the core of each one. Such a micro-culture could provide a full spectrum of venues, each having appropriate cognitive hygiene processes and objectives at its core in addition to its other activity:
    • cognitive self-help groups
    • skilled trades and professional work groups
    • green agriculture, cottage industries, and commercial enterprises
    • medical, professional, and scientific facilities
    • formal educational venues
    • and many others

    Such a complex, cognitively optimized community might offer the most effective possible matrix for rapid human cognitive development.
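Looping back to item 8: Deming’s distinction between “normal” variation and “special causes” can be made concrete with a small sketch. The data, the function name, and the three-sigma limits here are my own illustration, not an example from Deming:

```python
# A minimal control-chart sketch (illustrative data, not Deming's own).
# Limits are computed from a baseline period assumed to be "in control";
# later points falling outside the limits are candidate special causes.

def control_limits(baseline):
    """Return (mean, lower, upper) three-sigma control limits."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return mean, mean - 3 * sigma, mean + 3 * sigma

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9, 10.2, 10.0]
mean, lo, hi = control_limits(baseline)

new_points = [10.1, 9.8, 14.5, 10.0]
special = [x for x in new_points if x < lo or x > hi]
print(special)  # only the 14.5 reading falls outside the limits
```

Reacting to the three in-limit readings would be “tampering” in Deming’s terms; only the out-of-limit reading warrants a search for a special cause.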

Poor Richard
7/29/2010

Externalizing Reality

In economic theory, an externality is any cost or benefit not accounted for in a calculation of profit or loss. Classic examples are the cost of pollution not included in the price of a manufactured product, the death of coal miners not included in the price of electricity, and the cost of mass murder or the little matter of global warming not included in the price of oil and gasoline.
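To make the definition concrete, here is a toy calculation with invented numbers showing how a cost left out of the profit-and-loss arithmetic changes the picture:

```python
# Illustrative numbers only: a producer's ledger vs. the full social ledger.
revenue        = 1_000_000  # sales of the product
private_cost   =   800_000  # labor, materials: costs the firm actually pays
pollution_cost =   300_000  # cleanup and health costs borne by others (the externality)

private_profit = revenue - private_cost                   # what the firm's books show
social_result  = revenue - private_cost - pollution_cost  # the full accounting

print(private_profit)  # 200000  -> looks profitable
print(social_result)   # -100000 -> a net loss once the externality is counted
```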

Economic externalities are only a small subset of a more general category I call cognitive externalities–anything that is filtered out of our mental picture of the world around us.

We all externalize parts of reality, not because they are unknowable, but because they are unpleasant or inconvenient. That is the principal basis of all our corruption, all our dis-enlightenment. We all do it. It’s in our DNA. But the costs or consequences of externalities, in economic models or in any other domain of reality, are disproportionately borne by the poor and powerless. One of the worst examples of externalized reality is this: despite some remnants of local color from country to country, the new world order is a global East India Company with helicopter gunships. A Martian anthropologist studying the last five thousand years or so of human history would have to conclude that the primary industry of our species is conducting mass murder for profit and that the masses, even in the dominant cultures, have all devolved into cargo cults.

If cargo cults are mentioned in anyone’s personal library of mental narratives, they probably take the form of a story about the peculiar behavior of small numbers of islanders somewhere in Melanesia in some prior century. Am I the only person with a story in her head about how that same behavior shows through in all of us under the euphemistic label of “consumerism”?

People live by stories. Each person’s head holds a library of short and long narratives and we pull one off the shelf that fits something about any particular situation or circumstance we meet from moment to moment. Too often these stories are on the level of children’s picture books, suggesting simple but wrong solutions to complex problems or situations. Most of us have stories about history that are wrong, stories about our families that are wrong, stories about nature that are wrong, and stories about ourselves that are wrong. And anything that doesn’t exist in the current active mental story, right or wrong, is externalized from a person’s reality in that moment.

Sometimes, reality is externalized on purpose. The principal weapon of special interests today is information asymmetry, a simple idea (better known to most of us as fraud, deception, marketing, public relations, spin, infotainment, etc.) that won a Nobel Prize for economics. This has resulted in a vast and thriving industry of disinformation and information pollution that corrupts and perverts every institution of society. But by far the most destructive lies are the ones we tell ourselves.

Our addiction to self-delusion is encouraged and enabled by a liar’s code: if you don’t unmask me, I won’t defrock you. Popes, presidents, senators, CEOs, teachers, and parents set the example for one and all.

Of course there is such a thing as an ethical (justified) lie, a lesser evil than some dire alternative, but self deception dissolves sanity itself. Identity itself becomes externalized. Self awareness fails and then, as Yeats said, “Things fall apart; the center cannot hold; Mere anarchy is loosed upon the world.” This is the truly unpardonable sin. But it won’t be avoided by force of will, strength of character, or high moral ideals. Our cognitive deformity, self-delusion, settled upon us by evolution, will be undone not by willpower, for which humanity is not noted, but mostly by wit, art and innovation–things we are good at.

The opposite of the unpardonable sin of self deception is liberation from self-imposed delusion–especially delusions about ourselves. The ability to tolerate cognitive dissonance and look clearly at uncomfortable facts is the essence of authentic enlightenment. It was inscribed on the entrance of the ancient Greek Temple of the Oracle at Delphi: “Know Thyself.”

Externalizing inconvenient reality (sometimes called denial, self deception, willful ignorance, or preserving cognitive consonance) is a coping mechanism. I would never suggest that we discard a coping mechanism without replacing the truly protective parts of it with something new. In fact with many, many new things.

The Greeks knew what they didn’t know (self-knowledge) but their philosophical methods were empirically weak. Today we know how to come by that knowledge–by the scientific method. We must discover and invent new cognitive prophylactics and prosthetics not as Sir Thomas More invented Utopia or as Reagan-era bean counters invented “Trickle-Down Economics”, but as Eli Whitney invented the cotton gin: with all the real working parts. We need a science and technology of cognitive hygiene and end-to-end information quality control. Despite living in an “age of science,” we still mostly resort to authority and reputation to judge the quality of information. I guess there are many reasons that “fact checking” remains in the dark ages. Information Quality Management is fine for database administrators, but we human beings reserve the right to our own facts, just as we reserve the right to mate with the worst possible partner. Still, without surrendering such rights, it might be nice if the scientific/academic community devoted more effort to producing a science and technology of information quality assurance that we could consult or ignore at our own risk.

In addition to empirical knowledge, like that which we might gain from brain signals, functional MRI pictures, or implicit association tests, enlightenment grows from coaching and practice with the object of re-engineering faulty parts of the operating system of the brain. Unlike genetic engineering, it requires exercise and training much as any physical, athletic ability.

I’m not drumming up a utopia built on some cult of cognitive science. But we MUST discover alternative practical means to protect ourselves from that suffering which we seek to evade by externalizing reality. As we do, we may find that workable solutions to nearly every other problem and crisis are already on the table.

Poor Richard

“The Beginning of Wisdom 3.0”

“The Enlightenment 2.0”

“The Inner Hunchback”

“Is Spiritual the New Supernatural?”

The beginning of wisdom 3.0

Temple of Apollo at Delphi, Greece (Image via Wikipedia)

According to the Bible’s Psalms and Proverbs, the fear of the Lord is the beginning of wisdom. Solomon expresses a similar sentiment in the book of Ecclesiastes.

But long before the Bible was written, the greatest men and women in ancient times (times in which travel could be difficult and dangerous) journeyed from all over the world to the Temple of Apollo at Delphi seeking answers to their most burning questions.

Over the door of the temple they found the inscription “Know thyself”.

“Know Thyself”

The phrase “Know thyself” has been attributed to Pythagoras, Socrates, and a number of other Greek sages, but it is thought to have originated in pre-history, perhaps from the time of the Mother Goddess and the Gaia religion. It has been found in many other places, including ancient Icelandic runes.

In fact, it has been suggested that this phrase sums up the whole of ancient philosophy.

What does it mean?

The implication of the inscription’s exact placement above the entrance to the Temple at Delphi is perhaps that self-knowledge is a pre-condition to all further knowledge. In other words, “Know thyself before thou entereth in here and bother the Oracle.”  Without meeting that prerequisite, further inquiry may be pointless. You just might be wasting the Oracle’s precious time and your own.

The seat of consciousness?

But what is self-knowledge and how is it obtained? What is the self? Is it the body, the mind, the soul, or is it all of these? At least in the case of the ancient philosophers it was probably a combination of all three. The distinctions were not as clearly drawn then as they can be today. However, in the context of our modern perspective, it may be safe to say that the ancients were not really talking about knowing human anatomy. It is more likely that they were thinking about consciousness.  People still differ about the “seat of consciousness”, whether it be the soul, the brain, the universe, or any number of other things.

According to modern opinion, human beings (Homo sapiens, from the Latin for “wise man” or “knowing man”) are thought to be self-aware by nature. Is this natural self-awareness the same as self-knowledge? Surely the whole of ancient philosophy would not be dedicated to exhorting the need for something that all human beings already possessed!

Socrates said the unexamined life is not worth living, but how many of us really know how to examine our lives? I think we assume we’d know how to do that if we chose to, but do we? The brain evolved in some very idiosyncratic ways, and self-examination was apparently not high on natural selection’s to-do list.

Nevertheless, we modern humans, especially the best and brightest of us, tend to assume that our own personal self-knowledge is something we come by automatically in the course of all our experience and all our spontaneous and natural thoughts and feelings about ourselves. 

The small percentage of us who have studied psychology or participated in some kind of psychological counseling or therapy (such as Freudian analysis, aroma therapy, or the currently popular cognitive behavioral therapy) tend to assume that we are especially knowledgeable about ourselves. We may even have become self-conscious.

“Self Observation”

On the other hand, the really sophisticated philosophers among us (we know who we are) may believe that self-examination, self-observation, introspection, mindfulness, meditation, and other forms of psychological mindedness prescribe specific kinds of education, work, or practice that one must pursue in order to acquire greater insight into and mastery of the workings of one’s own mind.

Those willing to explore the outer limits of knowledge may also believe that a guru, an extraterrestrial intelligence, or an altered state of consciousness has conferred special self-knowledge upon them mystically.

Some may have come by their heightened self-insight chemically.

Though having belonged at some time or another to all of the above groups, I have recently settled down to the more pedantic pursuit of following the research on cognitive neuroscience.

“fMRI Brain”

Be all that as it may, however, what all of the above paths to self-knowledge tend to have in common is the problem of motivation and discipline, or the lack of it. So I was delighted with myself when I hit upon the following idea: what if the video game industry could be induced to produce exciting, psychologically addictive video games based on some or all of the above methods for increasing self-knowledge?

Lo and behold a few days later I accidentally found this:

The “Know Thyself” game

A Lost Soul. An Unruly Subconscious. A Second Chance. A Role-Playing Game.

What if you were suddenly without any memories, held in a dream prison by your own subconscious, and the only hint you have of who you might be is a single statement repeated over and over in your head?

Know Thyself is a game for three to five players for an evening’s entertainment. One person plays an amnesiac in a fever dream hell and the others play that person’s subconscious & people from their past. The game features bizarre, unreal play due to a special deck of playing cards.

This is not actually a video game, and there are no photo-realistic, kick-ass action avatars, but it seems like a small step in that direction. For more information (but not much) see Tomorrow the World Games.

Could this at last be the true philosopher’s stone, the long sought-after secret to transforming unemployed couch potatoes into enlightened beings, the key to awakening the dormant wisdom we need to save the world?

First there was the beginning of innate, natural wisdom in human pre-history, the first dawning of wisdom in the world (beginning of wisdom 1.0). Then there was the beginning of conscious, formal wisdom in individual cognitive development and human culture (beginning of wisdom 2.0). Now begins the promulgation of that most radical and fundamental form of wisdom, self-knowledge, by the new and improved process of electronic video game addiction (beginning of wisdom 3.0).

Video games that promote self-examination and good mental hygiene? Gee whizz, Batman! That could be the beginning of a whole new age of wisdom and enlightenment for humanity.

The current “Known Universe” of video games is relatively flat…

Go now and carry this eureka-quality epiphany to the four corners of the video game world!

Poor Richard

ADDENDUM 9/15/2010

In reply to a post called

______________________________

Mind change – a moral choice?

______________________________

at the Open Parachute blog, I posted the following comment:

In “The Beginning of Wisdom 3.0” I argue that brain changes or cognitive influences caused by video gaming could, if the games were appropriately designed, be very constructive. In fact, I suggest video games as a delivery system for a whole spectrum of positive cognitive re-engineering efforts addressing such issues as “predictable irrationality”, “cognitive self-defense”, cognitive self-assessment, cognitive therapy, etc.

“A spoonful of sugar helps the medicine go down.”

As we all know, video games can be extremely compelling (if not addictive), and users can obsess over them for hours and days at a time. If a game meets enough of the criteria needed to make it compelling to a target audience, users can be expected to gladly consume any educational content embedded in the game. This is well-established and has already been extensively exploited in a broad range of educational software and interactive video products.

I advocate robust research and development efforts aimed at producing state-of-the-art video games designed to teach actual cognitive skills and abilities, with or without explicit, factual educational content.

At the simplest level, games might be designed to train users in critical reasoning skills such as the use of sound logic and argument or the recognition of logical fallacies.

On a deeper level, games might be designed to reveal a user’s implicit associations and unconscious cognitive biases and even to assist the user in altering such biases.

On a deeper level yet, information gleaned from cognitive neuroscience might be applied to correct pathologies, compensate for deficits, or improve a wide variety of targeted cognitive or neural processes.

The psychological and neural consequences of using video games may very well be undesirable or even harmful if some or all of the impacts are arbitrary, unintended, and unexamined. On the other hand, if the impacts are intentional and constructive, video games might help us fix a whole panoply of thorny problems. They could become a virtual panacea for any and all correctable neuro-cognitive disorders of thinking, reasoning, and behavior.

Video Game themes that could be adapted for cognitive skills/hygiene

These projects have a potential to be made into video games or other spin-offs that could be designed not simply as entertainment products but also as educational tools–both pedagogical and dialectical–perhaps the first of their kind.

An Economical Bestiary (PRA 2010)

PRA 2010’s “Economical Bestiary” is a work of hypertext literature–a blog-based book–about economic myths and facts. The work analyzes economic myths and political misconceptions and in many cases relates the misconceptions to irrational cognitive biases. A video game based on the Economical Bestiary could be designed to teach critical reasoning skills, propaganda self-defense, logical fallacy detection, discovering and altering implicit associations, etc.

One object of the game would be to take over the status quo government/economy, based mainly on accumulating economic and political points–but some violence is inevitable… and good for suspense.

I suppose it would have to go all “Global”, with economical and political beasties from multiple nations slugging it out.

The ultimate ULTIMATE objective might be an egalitarian, steady-state civilization that would solve global warming, etc. At the very least, the players would have to prevent and/or survive any number of possible catastrophes, regardless of who was in power.

If it were done right it might be a fun game for business- and politically-minded people of just about any age, and it might get some people to think harder and smarter about how to save the world at the same time.

The game could continue to evolve, becoming more realistic, until it actually started spilling out into reality with people creating real alliances and institutions.

The Inner Hunchback (PRA 2010)

Synopsis: In Victor Hugo’s novel, The Hunchback of Notre-Dame, each character has his or her own individual point of view, drawn from trusted sources such as religion, academic or political authority, kinship, popular culture, traditions, etc. Hugo leads the reader through each character’s reality, giving us privileged vantage points from which to glimpse the insights, errors, and cognitive biases of each and providing us an opportunity to assemble a “bigger picture” of our own.

Animal Farm 2.0 (A nail-biting modern sequel to George Orwell’s original novel) (PRA 2010)

Synopsis: Over a course of years, an average family farm is gradually transformed into a corporate animal death camp, complete with an ersatz animist-fundamentalist theocracy that secretly serves the human corporate overlords. There will also be sinister mad scientists doing gene-splicing experiments on plants, animals, and humans alike…. Too scary for young readers? Don’t worry–it all comes out right in the end!

The Illustrated Treasury of Cognitive-Bias Fairy Tales and Folk Stories (This project will be posted shortly on PRA 2010)

Poor Richard

Related Information:

Virtual Reality Won’t Just Amuse—It Will Heal Millions (wired.com)

The Quantified Self: Self Knowledge Through Numbers–a catalog-in-progress of all the self-tracking tools out there

Dozens of tools are listed in 14 categories. Some tools gather and analyze data collected by mobile devices and sensors. A sampling:

Mood

ButterBeeHappy
CureTogether Anxiety, Depression, Mood Tracking
Facing Us
Gotta Feeling
Gratitude & Happiness
GratitudeLog
Happiness for iPhone
Happy Factor

Productivity

1DayLater
BaseCamp
Blueprint HQ
BubbleTimer
EtherPad

Stanford Encyclopedia of Philosophy

Introspection , as the term is used in contemporary philosophy of mind, is a means of learning about one’s own currently ongoing, or perhaps very recently past, mental states or processes.

Self-Knowledge In philosophy, ‘self-knowledge’ commonly refers to knowledge of one’s particular mental states, including one’s beliefs, desires, and sensations.

Introspective People Have Larger Prefrontal Cortex

Lumosity “Brain Games –Scientifically designed brain fitness program. Lumosity is designed by some of the leading experts in neuroscience and cognitive psychology from Stanford and UCSF.”

NASA-funded game aims to make science more appealing

Last week a curious, free release popped up on Steam: Moonbase Alpha, a NASA-funded game where up to six players can team up in order to save a near-future Lunar base crippled by a meteor strike. The game is just the first release from NASA’s Learning Technologies program, which aims to help raise interest in the space program through gaming.

“The US is facing a crisis in technical fields,” explained Laughlin. “There are not enough students studying science, technology, engineering and mathematics to fill our national needs in those areas. NASA literally cannot function without STEM graduates. The big goals for NASA Education are to get more students into STEM fields of study and graduating into STEM careers. It’s also the president’s goal with the Educate to Innovate initiative. Moonbase Alpha was developed in support of those goals.”

Gamers beat algorithms at finding protein structures (ArsTechnica.com)

Today’s issue of Nature contains a paper with a rather unusual author list. Read past the standard collection of academics, and the final author credited is… an online gaming community.

Scientists have turned to games for a variety of reasons, having studied virtual epidemics and tracked online communities and behavior, or simply used games to drum up excitement for the science. But this may be the first time that the gamers played an active role in producing the results, having solved problems in protein structure through the Foldit game.

Starting with algorithms, ending with brains

Foldit uses some of the same conventions typical of other computer games, like a few simple structural problems to give new users a smooth learning curve. It also borrows from other online gaming communities; there are leaderboards, team and individual challenges, user forums, and so on.

Though very few of those who played Foldit had any significant background in biochemistry, the gamers tended to beat Rosetta when it came to solving structures. In a series of ten challenges, they outperformed the algorithms on five and drew even on another three.

By tracing the actions of the best players, the authors were able to figure out how the humans’ excellent pattern recognition abilities gave them an edge over the computer. For example, people were very good about detecting a hydrophobic amino acid when it stuck out from the protein’s surface, instead of being buried internally, and they were willing to rearrange the structure’s internals in order to tuck the offending amino acid back inside. Those sorts of extensive rearrangements were beyond Rosetta’s abilities, since the energy changes involved in the transitions are so large.

The authors also note that different players tended to have different strengths. Some were better at making the big adjustments needed to get near an energy minimum, while others enjoyed the fine-scale tweaking needed to fully optimize the structure. That’s where Foldit’s ability to enable team competitions, where different team members could handle the parts of the task most suited to their interests and abilities, really paid off.

The Nature article makes it clear that researchers in other fields, including astronomy, are starting to try similar approaches to getting the public to contribute something other than spare processor time to scientific research. As long as the human brain continues to outperform computers on some tasks, researchers who can harness these differences should get a big jump in performance.

Science gleans 60TB of behavior data from Everquest 2 logs (ArsTechnica.com)

Researchers ranging from psychologists to epidemiologists have wondered for some time whether online, multiplayer games might provide some ways to test concepts that are otherwise difficult to track in the real world.

Jaideep Srivastava is a computer scientist doing work on machine learning and data mining—in the past, he has studied shopping cart abandonment at Amazon.com, a virtual event without a real-world parallel. He spent a little time talking about the challenges of working with the Everquest II dataset, which on its own doesn’t lend itself to processing by common algorithms. For some studies, he has imported the data into a specialized database, one with a large and complex structure. Regardless of format, many one-pass, exhaustive algorithms simply choke on a dataset this large, which is forcing his group to use some incremental analysis methods or to work with subsets of the data.
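The incremental methods Srivastava alludes to can be as simple as streaming summary statistics. As one hypothetical illustration (my sketch, not his actual pipeline), Welford’s online algorithm updates a mean and variance one record at a time, so the full 60 TB of logs never has to fit in memory:

```python
class RunningStats:
    """Welford's online algorithm: update mean and variance one record
    at a time, keeping only three numbers in memory."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / self.n if self.n else 0.0

# e.g. session lengths (minutes) streamed from a log, record by record
stats = RunningStats()
for minutes in [30, 45, 120, 60, 15]:
    stats.update(minutes)
print(stats.mean)  # 54.0
```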

Srivastava then gave a short tour of the sorts of items the team is trying to extract from the raw logs. He apparently has graduate students working on non-traditional figures like the “monster composite difficulty index” and an “experience rate measure.”

Noshir Contractor described how the data was allowing him to explore social network dynamics within the game. He described a variety of factors that are thought to influence the growth and extent of social networks, such as collective action, social exchange, the search for similar people, physical proximity, friend-of-a-friend (FoaF) interactions, and so on. Because these are well-developed concepts, statistical tools exist that can extract their signature from the raw data by looking at interactions like instant messaging, partnerships, and trade.
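As a toy illustration of the friend-of-a-friend idea (entirely my own sketch, not the project’s actual tooling), interaction logs can be folded into a graph and mined for pairs who share a mutual contact but are not yet linked themselves:

```python
from collections import defaultdict
from itertools import combinations

# Toy interaction log: pairs of characters who messaged or traded.
log = [("ana", "bo"), ("bo", "cy"), ("ana", "dee"), ("cy", "dee")]

neighbors = defaultdict(set)
for a, b in log:
    neighbors[a].add(b)
    neighbors[b].add(a)

def foaf_candidates(graph):
    """Pairs two hops apart but not directly linked -- the ties a
    friend-of-a-friend mechanism predicts are likely to form next."""
    pairs = set()
    for person, friends in graph.items():
        for f1, f2 in combinations(sorted(friends), 2):
            if f2 not in graph[f1]:
                pairs.add((f1, f2))
    return pairs

print(foaf_candidates(neighbors))
```

Run over successive time slices of a real log, the same count would show whether predicted ties actually form, which is the kind of signature extraction described above.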

Williams pointed out one case where having access to the server logs allowed the researchers to identify some serious skewing in the responses to the demographic surveys. Older women turned out to be some of the most committed players but significantly under-reported the amount of time they spent in the game by three hours per week (men under-reported as well, but only by one hour). The example highlights the risk of using self-reporting for behavioral studies and the potential of the virtual world data.

Blizzard [World of Warcraft] negotiating with researchers for virtual epidemic study (ArsTechnica.com)

A strange phenomenon struck the virtual inhabitants of World of Warcraft. A disease designed to be limited to areas accessed by high-level characters managed to make it back to the cities of that virtual world, where it devastated their populations. At the time, Ars’ Jeremy Reimer noted, “It would be even more interesting if epidemiologists in the real world found that this event was worthy of studying as a kind of controlled experiment in disease propagation.” The epidemiologists have noticed, and there may be more of these events on the way for WoW players. There were a number of features in the virtual outbreak that actually mimicked the spread of and response to real-world epidemics.

Modeling Infectious Diseases Dissemination Through Online Role-Playing Games, by Ran D. Balicer (Epidemiology, March 2007)

As mathematical modeling of infectious diseases becomes increasingly important for developing public health policies, a novel platform for such studies might be considered. Millions of people worldwide play interactive online role-playing games, forming complex and rich networks among their virtual characters. An unexpected outbreak of an infective communicable disease (unplanned by the game creators) recently occurred in this virtual world. This outbreak holds surprising similarities to real-world epidemics. It is possible that these virtual environments could serve as a platform for studying the dissemination of infectious diseases, and as a testing ground for novel interventions to control emerging communicable diseases.
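The kind of compartmental model epidemiologists fit to outbreak data can be sketched in a few lines. This toy discrete-time SIR simulation uses purely illustrative parameters, not estimates from the in-game event.

```python
# Toy discrete-time SIR (Susceptible-Infected-Recovered) model of the
# kind used to study disease dissemination. Parameters are illustrative.

def sir_step(s, i, r, beta, gamma, n):
    """One time step of the classic SIR difference equations."""
    new_infections = beta * s * i / n   # contacts between S and I
    new_recoveries = gamma * i          # infected recover at rate gamma
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(n=10_000, i0=10, beta=0.4, gamma=0.1, steps=200):
    """Run the outbreak and track the peak number of infected."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, n)
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate()
```

An interesting twist for the virtual-world case is that a logged game gives the modeler something real epidemics never do: a complete record of every contact, so the fitted parameters can be checked against ground truth.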

Neurobiology of Meditation

How Meditation Reshapes Your Brain, by Max Miller, October 6, 2010 (BigThink.com)

—”Mental Training Enhances Attentional Stability: Neural and Behavioral Evidence” (2009), by Antoine Lutz, in The Journal of Neuroscience [PDF]

—”Short-term meditation training improves attention and self-regulation” (2007), by Michael Posner, in the journal PNAS

Know Then Thyself

by Alexander Pope

Know then thyself, presume not God to scan;
The proper study of mankind is man.
Placed on this isthmus of a middle state,
A being darkly wise, and rudely great:
With too much knowledge for the sceptic side,
With too much weakness for the stoic’s pride,
He hangs between; in doubt to act, or rest;
In doubt to deem himself a God, or beast;
In doubt his mind and body to prefer;
Born but to die, and reas’ning but to err;
Alike in ignorance, his reason such,
Whether he thinks too little, or too much;
Chaos of thought and passion, all confus’d;
Still by himself, abus’d or disabus’d;
Created half to rise and half to fall;
Great lord of all things, yet a prey to all,
Sole judge of truth, in endless error hurl’d;
The glory, jest and riddle of the world.
