Saturday, August 29, 2009

The ancestor’s tale

A voice of reason in irrational times, Richard Dawkins is both theorist and explainer of one of the greatest discoveries of the human mind

Thomas Henry Huxley, the great contemporary populariser of Charles Darwin’s ideas, declared it his aim to “smite all humbugs, however big; to give a nobler tone to science; to set an example of abstinence from petty personal controversies, and of toleration for everything but lying”.

That is a fair summary of Richard Dawkins’s achievement in four decades of public advocacy of science and its methods. Professor Dawkins does not altogether avoid the use of invective in the controversies that he joins. But he would reasonably point out that the promotion of critical thinking and an appreciation of scientific discovery are far from merely personal obsessions. In his latest book, The Greatest Show on Earth, from which The Times is publishing extracts next week, he lucidly expounds evolution and its mechanism of natural selection.

The book’s most obvious accomplishment is to rout “scientific” creationism and its equivalent (and equally absurdly named) doctrine of Intelligent Design. But while that is an exhausting and necessary task, it is only indirectly Professor Dawkins’s message. He is rightly impatient that scientists need devote any time to dogmas that are not testable and yield no predictions, and that are thus exactly unlike science.

In his book, Professor Dawkins compares the relation of Intelligent Design to science with that of Holocaust denial to history. The analogy is deliberately inflammatory and entirely correct. The objection to Holocaust denial is not, as many believe, that it is offensive and xenophobic (though it is, of course, both). It is, rather, that Holocaust denial is false. It is impossible to argue consistently that the Holocaust never happened except by ignoring or faking the historical evidence. Creationism and Intelligent Design are like that. They are not even wrong; they are just bad ideas.

Pseudoscientific beliefs can never be refuted, because their proponents do not recognise the concept of evidence. Evolutionary theory, by contrast, accurately predicted that there must be a mechanism by which traits would pass from one generation to the next. The discovery, a century after Darwin, of the structure of DNA confirmed the explanatory power of his theories. How all species descend from a common ancestor is an extraordinary narrative. Through the power of reason, observation and experiment, evolutionary biology is able to grasp it. And Professor Dawkins has made it his particular skill to tell it.

He has advanced explanations of patterns of kinship and hence of altruism among species, by positing the essential unit of evolution as the gene. His book The Blind Watchmaker demonstrates how natural selection, unplanned but not random, explains complexity without the need to invoke design or purpose. And there is his dismissal of all religious explanations.

Religious faith can be entirely compatible with science and reason. Professor Dawkins’s belief that “moderate religion makes the world safe for extremists” is mistaken and tactically disastrous. There is an intense common interest among all those who believe in science, liberal rights and sexual equality that moderate religion, which recognises the value of free inquiry and political pluralism, should prevail over theocratic extremists.

But the frequent criticism of Professor Dawkins that he is a scientific fundamentalist is wrong. (And the claim of one writer that by his militancy Professor Dawkins has become the “top pin-up” of the Intelligent Design lobby is patently absurd.) Evolution through natural selection is among the greatest discoveries of civilisation. As Theodosius Dobzhansky, the geneticist, wrote, nothing in biology makes sense except in the light of evolution. And much in other fields is illuminated by it. Professor Dawkins combines the role of theorist, synthesist and explainer in that cause as no one else does.

Thursday, August 20, 2009

MIRROR NEURONS and imitation learning as the driving force behind "the great leap forward" in human evolution

[V.S. RAMACHANDRAN:] The discovery of mirror neurons in the frontal lobes of monkeys, and their potential relevance to human brain evolution — which I speculate on in this essay — is the single most important "unreported" (or at least, unpublicized) story of the decade. I predict that mirror neurons will do for psychology what DNA did for biology: they will provide a unifying framework and help explain a host of mental abilities that have hitherto remained mysterious and inaccessible to experiments.

There are many puzzling questions about the evolution of the human mind and brain:

1) The hominid brain reached almost its present size — and perhaps even its present intellectual capacity — about 250,000 years ago. Yet many of the attributes we regard as uniquely human appeared only much later. Why? What was the brain doing during the long "incubation" period? Why did it have all this latent potential for tool use, fire, art, music and perhaps even language, which blossomed only considerably later? How did these latent abilities emerge, given that natural selection can only select expressed abilities, not latent ones? I shall call this "Wallace's problem", after the Victorian naturalist Alfred Russel Wallace, who first proposed it.

2) Crude "Oldowan" tools — made by just a few blows to a core stone to create an irregular edge — emerged 2.4 million years ago and were probably made by Homo habilis, whose brain size (700 cc) was halfway between that of modern humans (1,300 cc) and chimps (400 cc). After another million years of evolutionary stasis, aesthetically pleasing "symmetrical" tools began to appear, associated with a standardization of production technique and artifact form. These required switching from a hard hammer to a soft (wooden?) hammer while the tool was being made, in order to ensure a smooth rather than a jagged, irregular edge. And lastly, the invention of stereotyped "assembly line" tools (sophisticated symmetrical bifacial tools) that were hafted to a handle took place only 200,000 years ago. Why was the evolution of the human mind "punctuated" by these relatively sudden upheavals of technological change?

3) Why the sudden explosion (often called the "great leap") in technological sophistication, widespread cave art, clothes, stereotyped dwellings, etc. around 40,000 years ago, even though the brain had achieved its present "modern" size almost a million years earlier?

4) Did language appear completely out of the blue as suggested by Chomsky? Or did it evolve from a more primitive gestural language that was already in place?

5) Humans are often called the "Machiavellian primate", referring to our ability to "read minds" in order to predict other people's behavior and outsmart them. Why are apes and humans so good at reading other individuals' intentions? Do higher primates have a specialized brain center or module for generating a "theory of other minds", as proposed by Nick Humphrey and Simon Baron-Cohen? If so, where is this circuit, and how and when did it evolve?

The solution to many of these riddles comes from an unlikely source: the study of single neurons in the brains of monkeys. I suggest that the questions become less puzzling when you consider Giacomo Rizzolatti's recent discovery of "mirror neurons" in the ventral premotor area of monkeys. This cluster of neurons, I argue, holds the key to understanding many enigmatic aspects of human evolution. Rizzolatti and Arbib have already pointed out the relevance of their discovery to language evolution. But I believe the significance of their findings for understanding other equally important aspects of human evolution has been largely overlooked. This, in my view, is the most important unreported "story" of the last decade.

THE EMERGENCE OF LANGUAGE

Unlike many other human traits such as humor, art, dancing or music, the survival value of language is obvious — it helps us communicate our thoughts and intentions. But the question of how such an extraordinary ability actually evolved has puzzled biologists, psychologists and philosophers at least since the time of Charles Darwin. The problem is that the human vocal apparatus is vastly more sophisticated than that of any ape, but without the correspondingly sophisticated language areas in the brain the vocal equipment alone would be useless. So how did these two mechanisms, with so many sophisticated interlocking parts, evolve in tandem? Following Darwin's lead, I suggest that our vocal equipment and our remarkable ability to modulate voice evolved mainly for producing emotional calls and musical sounds during courtship ("croonin' a toon"). Once that had evolved, the brain — especially the left hemisphere — could then evolve language.

But a bigger puzzle remains. Is language mediated by a sophisticated and highly specialized "language organ" that is unique to humans and emerged completely out of the blue as suggested by Chomsky? Or was there a more primitive gestural communication system already in place that provided a scaffolding for the emergence of vocal language?

Rizzolatti's discovery can help us solve this age-old puzzle. He recorded from the ventral premotor area of the frontal lobes of monkeys and found that certain cells will fire when a monkey performs a single, highly specific action with its hand: pulling, pushing, tugging, grasping, picking up and putting a peanut in the mouth, etc. Different neurons fire in response to different actions. One might be tempted to think that these are motor "command" neurons, making muscles do certain things; however, the astonishing truth is that any given mirror neuron will also fire when the monkey in question observes another monkey (or even the experimenter) performing the same action, e.g. tasting a peanut! With knowledge of these neurons, you have the basis for understanding a host of very enigmatic aspects of the human mind: "mind reading", empathy, imitation learning, and even the evolution of language. Anytime you watch someone else doing something (or even starting to do something), the corresponding mirror neuron might fire in your brain, thereby allowing you to "read" and understand another's intentions, and thus to develop a sophisticated "theory of other minds." (I suggest, also, that a loss of these mirror neurons may explain autism — a cruel disease that afflicts children. Without these neurons the child can no longer understand or empathize with other people emotionally and therefore completely withdraws from the world socially.)
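The firing property described above is simple enough to caricature in a few lines of code. The sketch below is purely my illustration — the class, names and example actions are invented for exposition, not taken from the essay or from Rizzolatti's data: a "mirror" unit tuned to one action fires whether the action is self-performed or merely observed, while an ordinary motor command unit fires only for the owner's own acts.

```python
# Toy caricature of the mirror-neuron firing property (illustrative only).
from dataclasses import dataclass

@dataclass
class Unit:
    preferred_action: str
    mirror: bool  # True -> also responds to actions observed in others

    def fires(self, action: str, performer: str) -> bool:
        if action != self.preferred_action:
            return False  # tuned to one highly specific action
        return performer == "self" or self.mirror

grasp_mirror = Unit("grasp peanut", mirror=True)
grasp_motor = Unit("grasp peanut", mirror=False)

# The monkey grasps a peanut itself: both units fire.
print(grasp_mirror.fires("grasp peanut", performer="self"))          # True
print(grasp_motor.fires("grasp peanut", performer="self"))           # True

# The monkey watches the experimenter grasp a peanut: only the mirror unit fires.
print(grasp_mirror.fires("grasp peanut", performer="experimenter"))  # True
print(grasp_motor.fires("grasp peanut", performer="experimenter"))   # False
```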

Mirror neurons can also enable you to imitate the movements of others, thereby setting the stage for the complex Lamarckian or cultural inheritance that characterizes our species and liberates us from the constraints of a purely gene-based evolution. Moreover, as Rizzolatti has noted, these neurons may also enable you to mime — and possibly understand — the lip and tongue movements of others, which, in turn, could provide the opportunity for language to evolve. (This is why, when you stick your tongue out at a newborn baby, it will reciprocate! How ironic and poignant that this little gesture encapsulates half a million years of primate brain evolution.) Once you have these two abilities in place — the ability to read someone's intentions and the ability to mime their vocalizations — then you have set in motion the evolution of language. You need no longer speak of a unique language organ, and the problem doesn't seem quite so mysterious any more.

(Another important piece of the puzzle is Rizzolatti's observation that the ventral premotor area may be a homologue of Broca's area — a brain center associated with the expressive and syntactic aspects of language in humans.)

These arguments do not in any way negate the idea that there are specialized brain areas for language in humans. We are dealing here with the question of how such areas may have evolved, not with whether they exist.

Mirror neurons were discovered in monkeys, but how do we know they exist in the human brain? To find out, we studied patients with a strange disorder called anosognosia. Most patients with a right-hemisphere stroke have complete paralysis of the left side of their body and will complain about it, as expected. But about 5% of them will vehemently deny their paralysis, even though they are otherwise mentally lucid and intelligent. This is the so-called "denial" syndrome, or anosognosia. To our amazement, we found that some of these patients not only denied their own paralysis but also denied the paralysis of another patient whose inability to move his arm was clearly visible to them and to others. Denying one's own paralysis is odd enough, but why would a patient deny another patient's paralysis? We suggest that this bizarre observation is best understood in terms of damage to Rizzolatti's mirror neurons. It's as if anytime you want to make a judgement about someone else's movements, you have to run a VR (virtual reality) simulation of the corresponding movements in your own brain, and without mirror neurons you cannot do this.

The second piece of evidence comes from studying brain waves (EEG) in humans. When people move their hands, a brain wave called the mu wave gets blocked and disappears completely. Eric Altschuller, Jaime Pineda, and I suggested at the Society for Neuroscience meeting in 1998 that this suppression was caused by Rizzolatti's mirror neuron system. Consistent with this theory, we found that such a suppression also occurs when a person watches someone else moving his hand, but not if he watches a similar movement by an inanimate object. We predict that children with autism should show suppression if they move their own hands but not if they watch someone else. Our lab now has preliminary hints from one high-functioning autistic child that this might be true (Social Neuroscience Abstracts, 2000).
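For readers curious what such a measurement involves, here is a minimal sketch of a mu-suppression analysis on synthetic data. Every detail below (the 256 Hz sampling rate, the 8–13 Hz band, Welch's method, the log-ratio index) is my assumption for illustration, not a description of the 1998 study's actual pipeline; real experiments use many trials and artifact rejection.

```python
# Minimal sketch of a mu-suppression analysis on synthetic EEG (illustrative).
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed for this sketch)

def mu_power(signal, fs=FS):
    """Mean spectral power in the 8-13 Hz mu band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    band = (freqs >= 8) & (freqs <= 13)
    return psd[band].mean()

def suppression_index(active, baseline):
    """Log power ratio; negative values indicate mu suppression."""
    return np.log(mu_power(active) / mu_power(baseline))

# Synthetic demo: a strong 10 Hz mu rhythm at rest that is attenuated
# while watching a hand move, mimicking the suppression described above.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
rest = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
watching_hand = 0.3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)

print(suppression_index(watching_hand, rest))  # negative: mu suppressed
```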

THE BIG BANG OF HUMAN EVOLUTION

The hominid brain grew at an accelerating pace until it reached its present size of about 1,500 cc roughly 200,000 years ago. Yet uniquely human abilities such as the invention of highly sophisticated "standardized" multi-part tools, tailored clothes, art, religious belief and perhaps even language are thought to have emerged quite rapidly around 40,000 years ago — a sudden explosion of human mental abilities and culture that is sometimes called the "big bang." If the brain reached its full human potential — or at least size — 200,000 years ago, why did it remain idle for 150,000 years? Most scholars are convinced that the big bang occurred because of some unknown genetic change in brain structure. For instance, the archeologist Steven Mithen has just written a book in which he claims that before the big bang there were three different modules in the human brain that were specialized for "social or Machiavellian intelligence", for "mechanical intelligence" or tool use, and for "natural history" (a propensity to classify). These three modules remained isolated from each other, but around 50,000 years ago some genetic change in the brain suddenly allowed them to communicate with each other, resulting in the enormous flexibility and versatility of human consciousness.

I disagree with Mithen's ingenious suggestion and offer a very different solution to the problem. (It is not incompatible with Mithen's view, but it is a different idea.) I suggest that the so-called big bang occurred because certain critical environmental triggers acted on a brain that had already become big for some other reason and was therefore "pre-adapted" for those cultural innovations that make us uniquely human. (One of the key pre-adaptations being mirror neurons.) Inventions like tool use, art, math and even aspects of language may have been invented "accidentally" in one place and then spread very quickly, given the human brain's amazing capacity for imitation learning and mind reading using mirror neurons. Perhaps ANY major innovation happens because of a fortuitous coincidence of environmental circumstances — usually at a single place and time. But given our species' remarkable propensity for miming, such an invention would tend to spread very quickly through the population once it emerged.

Mirror neurons obviously cannot be the only answer to all these riddles of evolution. After all, rhesus monkeys and apes have them, yet they lack the cultural sophistication of humans (although it has recently been shown that chimps at least DO have the rudiments of culture, even in the wild). I would argue, though, that mirror neurons are necessary but not sufficient: their emergence and further development in hominids was a decisive step. The reason is that once you have a certain minimum amount of "imitation learning" and "culture" in place, this culture can, in turn, exert the selection pressure for developing those additional mental traits that make us human. And once this starts happening, you have set in motion the autocatalytic process that culminated in modern human consciousness.

A second problem with my suggestion is that it doesn't explain why the many human innovations that constitute the big bang occurred during a relatively short period. If it's simply a matter of chance discoveries spreading rapidly, why would all of them have occurred at the same time? There are three answers to this objection. First, the evidence that it all took place at the same time is tenuous. The invention of music, shelters, hafted tools, tailored clothing, writing, speech, etc. may have been spread out between 100,000 and 5,000 years ago, and the so-called great leap may be a sampling artifact of archeological excavation. Second, any given innovation (e.g. speech or writing or tools) may have served as a catalyst for the others and may therefore have accelerated the pace of culture as a whole. And third, there may indeed have been a genetic change, but it may not have been an increase in the ability to innovate (nor a breakdown of barriers between modules, as suggested by Mithen) but an increase in the sophistication of the mirror neuron system and therefore in "learnability." The resulting increase in the ability to imitate and learn (and teach) would then explain the explosion of cultural change that we call the "great leap forward" or the "big bang" in human evolution. This argument implies that the whole "nature-nurture debate" is largely meaningless as far as humans are concerned. Without the genetically specified learnability that characterizes the human brain, Homo sapiens wouldn't deserve the title "sapiens" (wise); but without being immersed in a culture that can take advantage of this learnability, the title would be equally inappropriate. In this sense human culture and the human brain have co-evolved into obligatory mutual parasites — without either, the result would not be a human being. (No more than you can have a cell without its parasitic mitochondria.)

THE SECOND BIG BANG

My suggestion that these neurons provided the initial impetus for "runaway" brain/culture co-evolution in humans isn't quite as bizarre as it sounds. Imagine a Martian anthropologist studying human evolution a million years from now. He would be puzzled (as Wallace was) by the relatively sudden emergence of certain mental traits like sophisticated tool use, use of fire, art and "culture", and would try to correlate them (as many anthropologists now do) with purported changes in brain size and anatomy caused by mutations. But unlike them, he would also be puzzled by the enormous upheavals and changes that occurred after (say) the 19th century — what we call the scientific/industrial revolution. This revolution is, in many ways, much more dramatic (e.g. the sudden emergence of nuclear power, automobiles, air travel and space travel) than the "great leap forward" that happened 40,000 years ago!

He might be tempted to argue that there must have been a genetic change and a corresponding change in brain anatomy and behavior to account for this second leap forward. (Just as many anthropologists today seek a genetic explanation for the first one.) Yet we know that the present one occurred exclusively because of fortuitous environmental circumstances: Galileo invented the "experimental method," which, together with royal patronage and the invention of the printing press, kicked off the scientific revolution. His experiments, and the earlier invention of a sophisticated new language called mathematics in India in the first millennium AD (based on place-value notation, zero and the decimal system), set the stage for Newtonian mechanics and the calculus, and "the rest is history," as we say.

Now the thing to bear in mind is that none of this need have happened. It certainly did not happen because of a genetic change in the human brain during the Renaissance. It happened at least partly because of imitation learning and the rapid "cultural" transmission of knowledge. (Indeed, one could almost argue that there was a greater behavioral/cognitive difference between pre-18th-century and post-20th-century humans than between Homo erectus and archaic Homo sapiens. Unless he knew better, our Martian ethologist might conclude that there was a bigger genetic difference between the first two groups than between the latter two species!)

Based on this analogy, I suggest, further, that even the first great leap forward was made possible largely by imitation and emulation. Wallace's question was perfectly sensible; it is very puzzling how a set of extraordinary abilities seemed to emerge "out of the blue". But his solution was wrong: the apparently sudden emergence of things like art or sophisticated tools was not because of God or "divine intervention". I would argue instead that, just as a single invention (or two) by Galileo and Gutenberg quickly spread and transformed the surface of the globe (although there was no preceding genetic change), inventions like fire, tailored clothes, "symmetrical tools" and art may have fortuitously emerged in a single place and then spread very quickly. Such inventions may have been made by earlier hominids too (even chimps and orangs are remarkably inventive; who knows how inventive Homo erectus or the Neandertals were), but early hominids simply may not have had an advanced enough mirror neuron system to allow a rapid transmission and dissemination of ideas, so the ideas quickly dropped out of the "meme pool". This system of cells, once it became sophisticated enough to be harnessed for "training" in tool use and for reading other hominids' minds, may have played the same pivotal role in the emergence of human consciousness (and the replacement of Neandertals by Homo sapiens) as the asteroid impact did in the triumph of mammals over reptiles.
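The fidelity argument in the preceding paragraph can be made concrete with a toy simulation. Everything in it — population size, contact rate, loss rate — is my invention for illustration; the essay offers no model. An invention arises in a single individual and persists only through imitation: below some fidelity threshold it drops out of the "meme pool", above it it sweeps the population.

```python
# Toy model (my illustration, not Ramachandran's): an invention arises in
# one individual and survives only if imitation is faithful enough. Each
# generation, every knower demonstrates the skill to CONTACTS random
# others; each observer acquires it with probability `fidelity`; knowers
# are lost (forget or die) with probability LOSS.
import random

POPULATION = 1000
CONTACTS = 3   # demonstrations per knower per generation (assumed)
LOSS = 0.2     # chance a knower is lost each generation (assumed)

def spread(fidelity, generations=60, seed=1):
    random.seed(seed)
    knowers = {0}  # the lone inventor
    for _ in range(generations):
        learners = set()
        for _ in knowers:
            for _ in range(CONTACTS):
                pupil = random.randrange(POPULATION)
                if random.random() < fidelity:
                    learners.add(pupil)
        knowers = {k for k in knowers if random.random() > LOSS} | learners
        if not knowers:
            return 0  # the idea dropped out of the "meme pool"
    return len(knowers)

for f in (0.05, 0.1, 0.3, 0.6):
    print(f"fidelity={f:.2f}: {spread(f)} knowers after 60 generations")
```

Low fidelity yields extinction; past the threshold the skill saturates the population within a few dozen generations, which is the qualitative point being made about mirror-neuron-mediated transmission.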

So it makes no more sense to ask "Why did sophisticated tool use and art emerge only 40,000 years ago, even though the brain had all the required latent ability 100,000 years earlier?" than to ask "Why did space travel occur only a few decades ago, even though our brains were pre-adapted for space travel at least as far back as the Cro-Magnons?" The question ignores the important role of contingency, or plain old luck, in human evolutionary history.

Thus I regard Rizzolatti's discovery — and my purely speculative conjectures about the key role of mirror neurons in our evolution — as the most important unreported story of the last decade.

Are You Thinking What I'm Thinking?

When children turn four, they start to wonder what other people are thinking. For instance, if you show a four-year-old a packet of gum and ask what's inside, she'll say, "Gum." You open the packet and show her that inside there's a pencil instead of gum. If you ask her what her mother, who's waiting outside, will think is in the packet once it's been reclosed, she'll say, "Gum," because she knows her mother hasn't seen the pencil. But children under the age of four will generally say their mother will think there's a pencil inside — because children this young cannot yet escape the pull of the real world. They think everyone knows what they know, because they cannot model someone else's mind and in this case realize that someone must see something in order to know it. This ability to think about what others are thinking about is called having a theory of mind.
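The gum/pencil task reduces to a single question: do you predict the other person's belief from what *they* have observed, or from what *you* know? Here is a minimal sketch of that distinction; the Agent class, its default belief and the function names are my own illustrative constructions, not from the source.

```python
# Minimal sketch of the false-belief ("gum/pencil") task (illustrative).
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    seen: list = field(default_factory=list)  # what this agent observed

    def belief_about_packet(self):
        # An agent believes its most recent observation; an unopened gum
        # packet is assumed to contain gum (the default everyone shares).
        return self.seen[-1] if self.seen else "gum"

def childs_prediction(child, mother, has_theory_of_mind):
    if has_theory_of_mind:
        # Age four and up: model Mom's belief from Mom's own observations.
        return mother.belief_about_packet()
    # Younger: project one's own knowledge onto Mom.
    return child.belief_about_packet()

child = Agent("child", seen=["pencil"])  # saw the packet opened
mother = Agent("mother")                 # waited outside, saw nothing

print(childs_prediction(child, mother, has_theory_of_mind=True))   # gum
print(childs_prediction(child, mother, has_theory_of_mind=False))  # pencil
```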

Wednesday, August 5, 2009

Intelligent design

Intelligent design is the assertion that "certain features of the universe and of living things are best explained by an intelligent cause, not an undirected process such as natural selection."[1][2] It is a modern form of the traditional teleological argument for the existence of God, but one which avoids specifying the nature or identity of the designer. The idea was developed by a group of American creationists who reformulated their argument in the creation-evolution controversy to circumvent court rulings that prohibit the teaching of creationism as science.[4][5][6] Intelligent design's leading proponents – all of whom are associated with the Discovery Institute, a politically conservative think tank – believe the designer to be the God of Christianity.

Advocates of intelligent design argue that it is a scientific theory, and seek to fundamentally redefine science to accept supernatural explanations. The unequivocal consensus in the scientific community is that intelligent design is not science. The U.S. National Academy of Sciences has stated that "creationism, intelligent design, and other claims of supernatural intervention in the origin of life or of species are not science because they are not testable by the methods of science." The U.S. National Science Teachers Association and the American Association for the Advancement of Science have termed it pseudoscience. Others in the scientific community have concurred, and some have called it junk science.

The Galileo Gambit

"They laughed at Copernicus. They laughed at the Wright Brothers. Yes, well, they also laughed at the Marx Brothers. Being laughed at does not mean you are right."

Michael Shermer

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

In what I call the Galileo Gambit (I have not found any other name for it), the writer compares herself (or whoever she is supporting) to some famous person, noting some alleged similarity (an argument by analogy). Examples:

"They persecuted Galileo, and now they are persecuting me. One day my theory will be accepted."

"Don't lecture me about my report card. Eisenstein was a poor student and look how he turned out."

"So what if none of my paintings have sold? Van Gough never sold any of his paintings while he was alive. Now they sell for millions of dollars."

A popular variation is to note that in the past scientists were often wrong, so they may be wrong now. Example:

"Before Columbus sailed scientists thought that the earth was flat. They were wrong then, and they are wrong about my free energy machine now. Just wait and see."

There are several problems with the Galileo Gambit. Some of the examples are simply wrong: Galileo was persecuted by the Church, not by other scientists, and he was truly persecuted, placed under house arrest and threatened with torture. And scientists realized that the earth was round centuries before Columbus sailed. The main problem, though, is that for every Galileo there are hundreds of pseudo-scientists whose theories have long been forgotten.

Here are two more examples of the Galileo Gambit, both from Steven Milloy.

Mars and the Eco-Inquisition

Science Made Easy: The Madeleine Jacobs Principle

Tuesday, August 4, 2009

Science Made Easy: The Madeleine Jacobs Principle

When scientists hold opposing views (as in the global warming debate), who and what do you believe?

Until today, I would have said that, to resolve such a dilemma, you need to do the relevant background research, listen to the scientists' arguments and then formulate your own opinion.

But that was until today. From now on, I'm adhering to the Madeleine Jacobs Principle.

Who's Madeleine Jacobs and what's her principle?

Madeleine is the editor of the American Chemical Society's Chemical & Engineering News. In the June 30, 1997, issue, Ms. Jacobs editorialized on global warming. Here's Madeleine's (precious) thought process on who's right in the debate over the science of global warming:

After noting that:

[President Clinton's failure to speak out forcefully on global warming] gives ammunition to an extremely small but vocal and influential minority of global warming skeptics. These individuals have successfully created a general perception that scientists are sharply divided over whether global warming is taking place at all. They continue to work vigorously to discredit the scientific findings of the UN Intergovernmental Panel on Climate Change (IPCC), which represents the consensus of 2,500 scientists. [Emphasis added]

Madeleine stated:

President Clinton still has some time between now and December to take a leadership role on global warming. No one disputes that the impact of human activities on climate is complex or that research on the mechanisms of global warming must continue. But it is important to remember that the two sides in a debate aren't necessarily equal, and that in the debate over global warming, the skeptics are a very small minority.

So according to Madeleine, the global warming advocates are right because... there are more of them? It's called science by consensus. And that's the Madeleine Jacobs Principle.

But I wonder if Madeleine realizes that 500 years ago, the consensus science was that the Earth was flat. And during the 17th century, the consensus science was that the Sun revolved around the Earth. In short, the consensus was wrong.

You see (Madeleine, are you paying attention?), science is about observations and facts, not consensus.