The mysterious dancing mania and mass psychogenic illness

Try to imagine yourself walking along the streets of a city (maybe the one you live in, or one you’ve visited, or one you simply make up in your head—as long as you can picture it clearly it doesn’t matter much). Think of the shops and businesses you might pass as you stroll down the sidewalk, the smells of food emanating from nearby restaurants, and the noises you’d hear—intermittent car horns, snippets of conversation, the discordant sounds of construction equipment. Now, imagine you approach a street corner, and as you do you begin to hear some rhythmic music playing from just out of view—on what sounds like bagpipes. As you turn the corner, curious to find the source of the music, you see a large city park. It charmingly interrupts the asphalt and concrete of the city with expansive green grasses, dense leafy trees, and a bubbling decorative fountain. But despite its beauty, the park is also the backdrop to one of the strangest spectacles you’ve ever witnessed.

The park is filled with people—perhaps a hundred, maybe more. Many of them are naked. Others are wearing clothes that are dirty, ripped, and often hanging loosely from their undernourished bodies. A large group of them have formed a circle by holding hands, and many others are contained within the circle. Someone you can’t see is playing the aforementioned upbeat (almost eerily so, now that you can see the whole picture) tune on the bagpipes, and nearly everyone is dancing—but not in a choreographed manner you might see from a flash mob today. Instead, this dancing is convulsive and jerky, and almost out of control—like there is a maniacal puppet master manipulating their movements from above.

As you cautiously take a few steps closer to this bizarre scene, you see that many of the dancers are staring blankly up at the sky, as if in a trance. Occasionally, they yell—shriek might be the more appropriate word—unintelligibly into the air. Some of these shrieks become agonized screams, and you can clearly make out the word “help!” shouted at least once or twice. You notice that, in the middle of the circle, several couples are on the ground having sex with one another. The whole thing looks like a drug-fueled ritual/orgy, but it’s taking place right out in the open, for everyone to see.

One of the dancers suddenly falls to the ground and starts convulsing. He’s clearly having some sort of seizure, his body thrashing about wildly and uncontrollably—but everyone just ignores him. After what must be about 30 seconds, he recovers, slowly gets up, and begins dancing again.

Think of the shock and horror you would feel when you encountered this scene. Now consider that if you lived in certain parts of Europe between the fourteenth and seventeenth centuries, this spectacle may not even have been cause for alarm. These types of dancing displays were not unheard of, and it’s very possible you would have seen one before.

In those days, the people who participated in the dancing rituals were thought to be afflicted by some malady (often assumed to be demonic possession) that led to compulsive dancing. The ailment was deemed contagious, and it was believed onlookers could be overcome and compelled to join the dancing at any moment. The condition was often called the dancing mania or St. Vitus’ dance, the latter name coming into use because the afflicted would often dance near the churches or shrines of St. Vitus, the patron saint of dancers. Priests from these churches frequently tried to intercede, frantically attempting to exorcise the demons from those who were affected before they were able to pass the sickness on to members of the clergy.

A depiction of dancing mania by Pieter Brueghel the Younger.

One such event occurred in 1374 and spread across a large area of Europe that included western Germany, Belgium, the Netherlands, Luxembourg, and northeastern France. Dozens of independent chroniclers of the events agree that thousands of people were affected, and the dancing went on for weeks. Another incident in Strasbourg in 1518 involved around 400 people, a number of whom were reported to have died while dancing in oppressively high summer temperatures. There were many other smaller occurrences of dancing mania, and sporadic reports of it persisted up until the mid-1600s.

While it’s possible some of the details of these events have been embellished, the number of independent accounts of them suggests they did occur in some form. So what could have caused this strange behavior? To this day, scientists are stumped. Some have suggested the culprit might have been widespread ergot poisoning. Ergot is a fungus that grows on rye; it has strong psychoactive effects when it’s ingested, and it can cause hallucinations, tremors, and convulsions (a constituent of ergot, lysergic acid, can be used to synthesize LSD). Is it possible, then, that widespread consumption of tainted rye could have led to these “epidemics”?

It doesn’t seem very likely. Ergot poisoning is characterized by spasms and convulsions, but also by symptoms like nausea and diarrhea, making it improbable sufferers could have danced for days on end. Additionally, ergot poisoning often involves the appearance of gangrene (i.e. tissue dying due to a lack of blood flow—it causes gruesome blackened skin that’s difficult to overlook) on the toes and fingers, but reports of dancing manias don’t include such descriptions. Finally, outbreaks of dancing mania also sometimes occurred in regions where rye wasn’t a common crop.

Of course it’s possible there was some other environmental exposure we haven’t identified that had a widespread influence on behavior, but such things are difficult to ascertain so long after the fact. And due to the lack of viable alternative explanations, many scientists have begun to believe the dancing mania was a manifestation of something called mass psychogenic illness, or MPI.

MPI involves the appearance of symptoms that spread throughout a population, but don’t have a clear physical origin. In other words, in MPI the brain is causing the patient to think they are afflicted by some ailment—even though the brain itself is the creator and orchestrator of the illness. This doesn’t mean that the symptoms aren’t real; there can be legitimate physical manifestations of MPI. But there’s no evidence the symptoms are produced by something (like a poison or a germ) other than the nervous system.

MPI is surprisingly common throughout history. Before dancing mania, there was a condition known as tarantism that occurred during the Middle Ages in Southern Italy. Victims of tarantism suffered from a number of symptoms ranging from headache to difficulty breathing, which, according to the victims, began immediately after the bite of a tarantula. (In those days, tarantula referred to a wolf spider, not the spiders we typically think of as tarantulas. Regardless, whether a spider bite was really involved was usually difficult to verify; it’s suspected that in many cases, the spider—like the resultant condition—was a phantom of the mind.) Once the malady took hold, however, the victims didn’t seek out antidotes to spider venom. Instead, they immediately began to take part in the only recognized cure: dancing. Patients would dance on and off for hours, days, or even weeks to upbeat melodies now known as tarantellas.

Since these dancing disorders of the Middle Ages and early modern times, there have been hundreds of other potential instances of MPI as well. But, you might be thinking, perhaps MPI occurred in the distant past because people were more superstitious and easily duped than they are today. Surely, we must have advanced past this era of gullibility, right?

Wrong. There is a long list of examples of possible MPI in modern times. For instance, in 2011, twenty classmates at a high school outside Buffalo, NY suddenly began to experience tics, verbal outbursts, and other symptoms that resembled those of Tourette syndrome. Despite investigations by doctors and state health department officials, no environmental cause of the condition was identified, and most doctors eventually agreed that the students’ conditions were brought on by psychological factors. Some doctors even suggested that social and mainstream media contributed to the “spread” of the affliction. Those who posted frequently about their ailment on sites like Facebook and those who gave frequent interviews to the press were thought to have the most aggravated conditions. The students who avoided these practices tended to improve more quickly.

Havana syndrome is potentially an even more recent example. It began in late 2016 in Cuba, when American and Canadian diplomatic personnel started reporting a number of symptoms—like headaches, nausea, dizziness, memory problems, hearing loss, and even “mild brain trauma”—which typically appeared after hearing a prolonged harsh, high-pitched noise. Strangely, other people nearby usually didn’t report hearing anything. By 2018, up to 40 cases of Havana syndrome had been documented among American and Canadian diplomatic personnel in Cuba. And in early 2018, similar claims began to be made by U.S. diplomats in China.

At first, many thought this was a case of international espionage at its finest—perhaps Moscow testing a secret acoustical weapon. But evidence to support that theory is lacking, and a number of scientists have now decided it’s more likely the diplomats were experiencing MPI. (Some have even suggested the high-pitched noise the diplomats heard was actually the sound of a particularly noisy type of cricket.)

There are many more examples of MPI in both modern times and the distant past. So, what is actually going on here? Well, first it’s important to point out that it’s almost impossible to completely eliminate other potential causes in these cases. There’s always the chance the unexplained symptoms linked to occurrences of putative MPI could be better explained by a toxin in the environment, a pathogen, or something else altogether that we just haven’t been able to identify. Perhaps, for example, Havana syndrome really was caused by some new weapon being surreptitiously tested by the Russians. We don’t know for sure.

But it’s also likely that at least some of these cases of potential MPI are due mainly to psychological factors. And if so, we’re at a loss to explain how, exactly, that might occur.

Some have suggested that extreme stress, pushing the brain to its cognitive breaking point, might be a risk factor. Dancing mania, for instance, often affected areas that had recently been ravaged by harsh societal blights like food shortages and devastating disease. Others have argued that MPI preys primarily on the most suggestible people in the population. According to this hypothesis, there are some who are simply more inclined to believe a mysterious illness is taking hold of them, especially after they’ve heard about or seen someone else affected by that “illness.” (These might also be the same people who are most susceptible to the influence of something like hypnosis.) And still others are unconvinced that MPI is a viable diagnosis in many cases: it implies a certainty we can’t possess (that there is no other cause of the condition), and it assumes we can explain behavior that might have been prompted by any number of factors, ranging from actual physical illness to cultural elements we may not completely understand.

Thus, at this point, MPI is controversial. We can’t explain why it might happen, and we also can’t say for sure how often it really does. But, there are many scientists who believe this type of mass hysteria is a legitimate phenomenon that has the potential to affect anyone, given the right circumstances. That’s a sobering thought, although it’s still unclear if it’s grounded in reality or if it, like the condition in question, is merely an example of the inherent fallibility of the brain.

References (in addition to linked text above):

Bartholomew RE. Tarantism, dancing mania and demonopathy: the anthro-political aspects of 'mass psychogenic illness'. Psychol Med. 1994 May;24(2):281-306.

Waller J. A forgotten plague: making sense of dancing mania. Lancet. 2009 Feb 21;373(9664):624-5.

Optograms: images from the eyes of the dead

On a cloudy fall morning in 1880, Willy Kuhne, a distinguished professor of physiology at the University of Heidelberg, waited impatiently for 31-year-old Erhard Reif to die. Reif had been found guilty of the reprehensible act of drowning his own children in the Rhine, and condemned to die by guillotine. Kuhne’s eagerness for Reif’s death, however, had nothing to do with his desire to see justice served. Instead, his impatience was mostly selfish—he had been promised the dead man’s eyes, and he planned to use them to quell a bit of scientific curiosity that had been needling him for years.

For several years prior, Kuhne had been obsessed with eyes, and especially with the mechanism underlying the eye’s ability to create an image of the outside world. As part of this obsession, Kuhne wanted to determine once and for all the veracity of a popular belief that the human eye stores away an image of the last scene it observed before death—and that this image could then be retrieved from the retina of the deceased. Kuhne had given these images a name: optograms. He had seen evidence of them in frogs and rabbits, but had yet to verify their existence in people.

Optograms had become something of an urban legend by the time Kuhne started experimenting with them. Like most urban legends, it’s difficult to determine where this one began, but one of the earliest accounts of it can be found in an anonymous article published in London in 1857. The article claimed that an oculist in Chicago had successfully retrieved an image from the eye of a murdered man. According to the story, although the image had deteriorated in the process of separating the eye from the brain, one could still make out in it the figure of a man wearing a light coat. The reader was left to wonder whether or not the man depicted was, in fact, the murderer—and whether further refinements to the procedure could lead to a foolproof method of identifying killers by examining the eyes of their victims.

Optograms remained a source of intrigue in the latter half of the 19th century, but they became especially interesting to Kuhne when physiologist Franz Boll discovered a biochemical mechanism that made them plausible. Boll identified a pigmented molecule (later named rhodopsin by Kuhne) in the rod cells of the retina that transformed from reddish-purple to pale and colorless upon exposure to light. At the time, much of the biology underlying visual perception was still a mystery, but we now know that the absorption of light by rhodopsin is the first step in the visual process in rod cells. It also results in something known as “bleaching,” where a change in the configuration of rhodopsin causes it to stop absorbing light until more of the original rhodopsin molecule can be produced.

In studying this effect, Boll found that the bleaching of rhodopsin could produce crude images of the environment on the retina itself. He demonstrated as much with a frog. He put the animal into a dark room, cracked the windows’ shutters just enough to allow a sliver of light in, and let the frog’s eyes focus on this thin stream of light for about ten minutes. Afterwards, Boll found an analogous streak of bleached rhodopsin running along the frog’s retina.
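To build intuition for why such an image forms (and why, as Kuhne would later discover, it fades), here is a toy model of the dynamics described above. The rate constants are invented for illustration; the only assumptions are that light bleaches rhodopsin in proportion to its intensity and that the retina slowly regenerates the pigment.

```python
import math

# Toy model (made-up rate constants) of rhodopsin bleaching and regeneration.
# An optogram is a spatial pattern of bleaching: brightly lit regions of the
# retina lose pigment, dark regions keep it, and regeneration slowly erases
# the contrast.

K_BLEACH = 0.5   # bleaching rate per unit light intensity (assumed)
K_REGEN = 0.05   # regeneration rate (assumed)

def unbleached_fraction(intensity: float, t: float) -> float:
    """Fraction of rhodopsin still unbleached after t minutes of steady light."""
    k = K_BLEACH * intensity + K_REGEN
    steady = K_REGEN / k                      # steady-state unbleached fraction
    return steady + (1.0 - steady) * math.exp(-k * t)

# After ~10 minutes (as in Boll's frog experiment), a lit region is heavily
# bleached while a dark region is untouched; that difference is the image.
bright = unbleached_fraction(intensity=1.0, t=10)
dark = unbleached_fraction(intensity=0.0, t=10)
print(f"unbleached rhodopsin after 10 min: bright region {bright:.2f}, dark region {dark:.2f}")
```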

An optogram Kuhne retrieved from the retina of a rabbit, showing light entering the room through a seven-paned window.

Kuhne was intrigued by Boll’s research, and soon after reading about it he started his own studies on the retina. He too was able to observe optograms in the eyes of frogs, and he saw an even more detailed optogram in the eye of a rabbit. It preserved an image of light coming into the room from a seven-paned window (see the image above).

Kuhne worked diligently to refine his technique for obtaining optograms, but eventually decided that—despite the folklore—the procedure didn’t have any forensic potential (or even much practical use) at all. He found that the preservation of an optogram required intensive work and a great deal of luck. First, the eye had to be fixated on something and prevented from looking away from it (even after death), or else the original image would rapidly be intermingled with others and become indecipherable. Then, after death the eye had to be quickly removed from the skull and the retina chemically treated with hardening and fixing agents. This all had to be done in a race against the clock, for if the rhodopsin was able to regenerate (which could happen even soon after death), the image would be erased and the whole effort would be for naught. Even if everything went exactly as planned and an optogram was successfully retrieved, it’s unclear if the level of detail within it could be enhanced enough to make the resultant image anything more than a coarse outline—and only a very rough approximation of the outside world.

Regardless, Kuhne couldn’t overlook the opportunity to examine Reif’s eyes. After all, he had never had the chance to see whether optograms persisted in a human eye after death, and—who knew—perhaps optograms in the human eye would be qualitatively different from those made in the eyes of frogs and rabbits. Maybe human optograms would be more accessible and finely detailed than he expected. Perhaps they might even be scientifically valuable.

Reif was beheaded in the town of Bruchsal, a few towns over from Kuhne’s laboratory. After Reif’s death, Kuhne quickly took the decapitated head into a dimly-lit room and extracted the left eye. He prepared it using the process he had refined himself, and within 10 minutes he was looking at what he had set out to see: a human optogram.

Kuhne’s drawing of the image he saw when he examined Erhard Reif’s retina.

So was this the revolutionary discovery that would change ophthalmic and forensic science forever? Clearly not, or murder investigations would look much different today. Kuhne made a simple sketch of what he saw on Reif’s retina (reproduced above from one of Kuhne’s papers). As you can see, it’s a bit underwhelming—certainly not the type of image that would solve any murder mysteries. It confirmed that the level of detail in a human optogram didn’t make it worth the trouble of retrieval. Kuhne didn’t provide any explanation as to what the image might be. Of course, any attempt to characterize it would amount to pure speculation, and perhaps the esteemed Heidelberg physiologist was not comfortable adding this sort of conjecture to a scientific paper.

This experience was enough to deter Kuhne from continuing to pursue the recovery of human optograms, and it seems like it would be a logical end to the fascination with optograms in general. The idea of using them to solve murders, however, reappeared periodically well into the 1900s. In the 1920s, for instance, an editorial in the New York Times critiqued a medical examiner who had neglected to take photographs of a high-profile murder victim’s eyes, suggesting that an important opportunity to retrieve an image of the murderer had been lost.

But as the 20th century wore on and our understanding of the biochemistry of visual perception became clearer, interest in optograms finally dwindled. Those who studied the eye were not convinced of their utility, and that opinion eventually persuaded the public of the same. It’s intriguing to think, though, how different our world would have been if optograms really had lived up to the hype. It certainly would have simplified some episodes of CSI.


Lanska DJ. Optograms and criminology: science, news reporting, and fanciful novels. Prog Brain Res. 2013;205:55-84. doi: 10.1016/B978-0-444-63273-9.00004-6.

Know your brain: Posterior parietal cortex

Where is the posterior parietal cortex?

The posterior parietal cortex in blue.

The posterior parietal cortex comprises the region of the parietal cortex that is posterior to the primary somatosensory cortex and its adjacent sulcus, the postcentral sulcus. The posterior parietal cortex itself is divided into an upper and lower portion: the superior parietal lobule and inferior parietal lobule, respectively. These two lobules are separated from one another by a sulcus called the intraparietal sulcus.

What is the posterior parietal cortex and what does it do?

The posterior parietal cortex receives input from a collection of sensory areas as well as a variety of other regions of the brain, and is thought to integrate that input to facilitate the execution of functions that require diverse information. It has been associated with a number of these functions, which are sometimes called "higher-order" functions; it is probably best known, however, for its role in attention.

Through attempts to find the brain regions that facilitate attention, researchers have identified two attention-related networks that involve the posterior parietal cortex; these are termed the dorsal and ventral fronto-parietal systems. The dorsal system is found in both cerebral hemispheres and includes areas of the superior parietal lobule and intraparietal sulcus, as well as a region of the frontal cortex involved in eye movements and visual perception known as the frontal eye field. The dorsal system is thought to underlie what is known as "endogenous attention": attention that is directed based on individual goals or desires. For example, if you are attempting to focus your attention to read this article, you are utilizing endogenous attention.

The ventral system is found primarily in the right cerebral hemisphere and includes the area where the temporal and parietal lobes meet (the temporo-parietal junction), the intraparietal sulcus, and areas of the frontal cortex. The ventral system seems to be involved more in what is termed "exogenous attention," or attention that is directed towards external stimuli that are not being attended to by endogenous attentional processes. For example, if you were reading this article in a library and someone a few tables over shouted, breaking the complete silence of the room, you would suddenly and reflexively direct your attention to the person who shouted. This type of attention is not associated with your own goals or desires, and falls under the rubric of exogenous attention.

An example of what a clock drawn by a patient with hemispatial neglect might look like.

The importance of the posterior parietal cortex to attention is perhaps best exemplified by a condition that can occur after damage to it, known as hemispatial or contralateral neglect. Hemispatial neglect is most frequently associated with damage to the posterior parietal cortex in the right cerebral hemisphere (due to stroke, head trauma, etc.), after which the patient ceases to devote attention to the left side of their body and visual field. These patients can act as if they don't perceive anything in a certain part of their visual field: if asked to draw a picture, they will often omit a significant portion (up to half) of the item drawn; they may eat only about half of the food off of a plate; and they may shave or put makeup on only half of their face. Some patients may even deny that part of their body on the neglected side is theirs in an attempt to reject the idea that they are suffering from a neurological condition.

The posterior parietal cortex is also believed to be involved in some aspects of motor function, such as planning movements and integrating visual information with movement to facilitate actions like reaching and grasping. Additionally, regions of the posterior parietal cortex are thought to contain neurons called mirror neurons, which are activated not only when a particular action is performed but also when someone else is observed performing the same action. The true function of mirror neurons is yet to be determined. Some hypothesize that they are important to allowing us to learn by imitation, or even for understanding the actions of others; but there are also many who are critical of these hypotheses, arguing they are too speculative and lacking evidential support.

Additionally, the posterior parietal cortex is thought to be involved in language as well as the ability to understand numbers and arithmetic. Thus, its functions span a large spectrum ranging from attention to movement to number processing. Research is still being done to better understand the role of the posterior parietal cortex in these actions and others. What is known about the posterior parietal cortex already, however, makes it one of the more intriguing areas in the brain.

Reference (in addition to linked text above):

Caspers S, Amunts K, Zilles K. Posterior Parietal Cortex: Multimodal Association Cortex. In: Mai JK and Paxinos G, eds. The Human Nervous System. 3rd ed. New York: Elsevier; 2012.

Capgras delusion

Think for a moment about the people in your life whom you are closest to and most familiar with—those whom you see, talk to, and maybe share intimate moments with on a regular basis. Perhaps this would be your spouse, partner, parents, siblings, or friends. Now, try to imagine waking up tomorrow and, upon seeing one of these people, being overcome with an unshakable feeling that it is not really them you are seeing. Even though you know it sounds crazy, you can't stop yourself from thinking that this person you have known for so long has been surreptitiously replaced with an impostor—someone else who looks just like them but is a different person altogether. You know this is irrational and even absurd, but it feels so true to you that you have to believe it's what really is going on.

The sense that people we are familiar with have been replaced with look-alike impostors is the defining symptom of a rare condition known as Capgras delusion. First described in 1923 by psychiatrist Joseph Capgras and his assistant Jean Reboul-Lachaux, Capgras delusion is one of a group of disorders known as delusional misidentification syndromes that involve persistent problems in accurately identifying oneself or others. The original description of Capgras delusion involved a 53-year-old woman who had experienced the death of four of her five children, leaving her with only a daughter. Several years after the death of her children she began to believe that her daughter and husband had been replaced by identical look-alikes. She eventually felt this was true for everyone she was close to, and she devised elaborate explanations for the duplicity that involved the existence of multiple look-alikes for each person. She believed, for example, that on any given day she might see (and communicate with) several different impostors who looked just like her daughter—without ever actually speaking to her "real" daughter.

Patients with Capgras delusion often don't display other major cognitive deficits, and they can usually appreciate how ludicrous their beliefs may seem to others. They may admit, for instance, that they would find it hard to believe if someone else described a similar experience with look-alike impostors. For example, this interaction (from a 1979 paper on the subject) occurred between an experimenter and a Capgras patient who—after a head injury—believed his wife and five children had been replaced with look-alikes:

E. [Experimenter] Isn't that [two families] unusual?
S. [Patient] It was unbelievable!
E. How do you account for it?
S. I don't know. I try to understand it myself, and it was virtually impossible.
E. What if I told you I don't believe it?
S. That's perfectly understandable. In fact, when I tell the story, I feel that I'm concocting a story...It's not quite right. Something is wrong.
E. If someone told you the story, what would you think?
S. I would find it extremely hard to believe...

Even though a Capgras patient may recognize the irrationality of the belief, the delusion persists. Even time spent with the "impostor" doesn't dissuade the patient; in fact, it only tends to strengthen the conviction that the "look-alike" is not who he or she claims to be.

Explaining the Capgras delusion

Although it is believed to stem from some neurological dysfunction, the Capgras delusion is not fully understood; several hypotheses have been proposed over the years to explain the phenomenon. Most recent hypotheses involve a deficit in the neurobiological mechanisms responsible for the recognition of familiar faces. To understand how this may lead to the development of the Capgras delusion, it can be useful to make a comparison to a disorder called prosopagnosia.

In prosopagnosia, patients have an impaired ability to recognize faces despite otherwise normal visual processing. This impairment often involves a general "face-blindness" that leads to a failure to recognize even the most familiar faces. Even though prosopagnosics are unable to overtly identify faces, however, past experiments have suggested they may experience a type of unconscious recognition when they see a familiar face. One way this has been tested is by measuring the skin conductance response (SCR) of prosopagnosic patients as they look at pictures of recognizable faces. SCR, which can detect slight changes in perspiration levels, is often used as an indication of autonomic nervous system arousal, and is thus considered by some to be representative of a type of emotional response. An increased SCR has been observed in prosopagnosics when they look at images of people they are familiar with—even when they aren't able to identify the faces; this SCR has been interpreted as a physiological expression of unconscious recognition.
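As an aside, a minimal sketch may help make the logic of these SCR experiments concrete. Everything below is assumed for illustration (the sampling rate, scoring windows, and simulated traces); it is not the actual protocol of the studies described above.

```python
import numpy as np

FS = 10  # sampling rate in Hz (assumed)

def scr_amplitude(trace: np.ndarray, onset_s: float, fs: int = FS) -> float:
    """Score one trial: post-stimulus peak minus pre-stimulus baseline."""
    onset = int(onset_s * fs)
    baseline = trace[onset - fs:onset].mean()       # mean over 1 s before onset
    peak = trace[onset + fs:onset + 5 * fs].max()   # peak within 1-5 s after onset
    return peak - baseline

# Simulated skin conductance traces (microsiemens), for illustration only:
# a tonic level plus noise, with a phasic "response" about 1 s after onset.
rng = np.random.default_rng(0)

def fake_trial(response_size: float, onset_s: float = 2.0) -> np.ndarray:
    t = np.arange(0.0, 10.0, 1.0 / FS)
    trace = 5.0 + 0.01 * rng.standard_normal(t.size)
    trace += response_size * np.exp(-(t - (onset_s + 1.0)) ** 2)
    return trace

familiar = [scr_amplitude(fake_trial(0.5), onset_s=2.0) for _ in range(20)]
unfamiliar = [scr_amplitude(fake_trial(0.05), onset_s=2.0) for _ in range(20)]

print(f"mean SCR to familiar faces:   {np.mean(familiar):.3f} uS")
print(f"mean SCR to unfamiliar faces: {np.mean(unfamiliar):.3f} uS")
```

In a prosopagnosic patient, the first number stays elevated even without conscious recognition; as described next, in a Capgras patient the pattern runs in reverse, and the two numbers look alike despite intact recognition.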

Capgras delusion is sometimes described as the "mirror-image" of prosopagnosia because Capgras patients recognize the faces of those closest to them, but their SCR is not increased upon seeing those familiar faces. Thus, it has been hypothesized that their conscious recognition is intact but their unconscious emotional response—that visceral familiarity we are used to sensing when we see those we are close to—is lacking. So, when Capgras patients are in the presence of someone they know they should have an emotional connection with, they are understandably disturbed when they don't feel any familiarity with the person. Instead they experience the same degree of autonomic arousal they would when seeing a stranger on the street.

The neurobiology underlying these unusual disruptions of familiarity is not very clear, and explanations of the mechanism responsible remain somewhat speculative. Because Capgras patients are able to recognize faces but do not display a typical emotional response to familiar faces, it has been hypothesized that there is some interruption in the pathways that connect facial recognition areas in the temporal lobe with areas of the limbic system—like the amygdala—that are involved with generating emotional responses. Although facial recognition is still functional, without the ability to activate the limbic system during facial recognition, the patient experiences a lack of emotion and familiarity.

It is thought that this dearth of familiarity is just one component of the Capgras delusion, however. Another aspect involves the pathological logic that leads to the belief that the suddenly unfamiliar person is actually an impostor. Why Capgras patients come to this specific conclusion instead of deciding that they are experiencing an abnormal neurobiological event is not very clear. It may involve an attempt to deal with the cognitive dissonance Capgras patients experience when they have a complete absence of feeling for someone they know they "should" have some emotional link with. In other words, a man would be perturbed to find he feels devoid of any familiarity towards his wife of 30 years; deciding that she must be an impostor allows him to explain his lack of emotion and perhaps reduce some of the mental strain caused by the alarming situation. The development of such an extreme and persistent delusion, however, also likely involves some neurological disruption of executive functions. For example, damage to the frontal areas of the brain, which are thought to be important in the management of rational thought, is often seen in Capgras patients and may contribute to the delusions that characterize the disorder.

There is a paucity of hard evidence to support the current hypotheses about the neurobiological bases of Capgras delusion, however. Likely due to the rarity of the disorder, many studies of Capgras patients (including the relatively few neuroimaging studies that have been published) have been case studies of just one patient. This approach, although informative, does not provide us with the type of evidence that can be used to make strong conclusions about the underlying neurobiology of the Capgras delusion. It is not surprising that research in this area has progressed relatively slowly, for Capgras delusion is far from a public health crisis; thus, answers are not pursued with the same fervor as they are in a much more prevalent disorder like Alzheimer's disease. To the neuroscientist, however, the Capgras delusion represents a fascinating opportunity to explore functions of the brain that we normally take for granted. The recognition of a spouse, for example, as someone who has been part of your life for years seems so natural and ingrained that it is difficult to believe it is dependent upon the proper functioning of neurobiological mechanisms in the same way that sight or movement might be. Capgras delusion, however, demonstrates that even our most fundamental beliefs can crumble with the dysfunction of certain brain regions.

Young G. Capgras delusion: an interactionist model. Conscious Cogn. 2008;17(3):863-876. doi:10.1016/j.concog.2008.01.006

Savant syndrome

Savant syndrome is one of the true mysteries of neuroscience. Many people were first exposed to this curious phenomenon when they watched the movie Rain Man. In it, Dustin Hoffman plays a character named Raymond Babbitt, who is loosely based on Kim Peek. Peek, who died in 2009, was a savant with a stunningly prodigious memory and the ability to read a book in an hour, retaining virtually all of the information he took in during that short time. Peek, like most other savants, also suffered from a neurological disorder that caused him to have difficulty with more mundane aspects of daily life like getting dressed or brushing his teeth. Intellectually, he had trouble understanding abstract concepts and engaging in normal reasoning. Yet, by some accounts he read and remembered the information from over 12,000 books.

Like Peek, most savants are exceptional in one or a few closely related areas, yet they suffer from a cognitive disorder that severely disrupts functioning in other areas. In most cases, a savant’s skill will involve art, music, mathematics, mechanics, spatial estimation, or calendar calculating (e.g., rapidly determining the day of the week on which a given date falls). The mystery, however, is what creates the exceptionality. The area of expertise is usually not something the individual has had special training in, and thus it often seems as if an innate ability has somehow been brought from the depths of the mind up to the surface. Strangely enough, this emergence usually coincides with some sort of developmental disorder or with another insult to the integrity of the brain.
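To give a flavor of what calendar calculation involves, here is a short sketch using Zeller’s congruence, a well-known formula for computing the day of the week of any Gregorian date. There is no evidence that savants use this (or any) explicit formula; the example only shows that the feat is, in principle, a mechanical computation.

```python
def day_of_week(year: int, month: int, day: int) -> str:
    """Day of the week for a Gregorian date, via Zeller's congruence."""
    if month < 3:            # Zeller's congruence treats January and February
        month += 12          # as months 13 and 14 of the previous year
        year -= 1
    k = year % 100           # year within the century
    j = year // 100          # zero-based century
    h = (day + 13 * (month + 1) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    return ["Saturday", "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday"][h]

print(day_of_week(2009, 12, 19))  # the day Kim Peek died: "Saturday"
```

How calendrical savants actually arrive at their answers, whether through memorized anchor dates, pattern recognition, or something else entirely, is still debated.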

In fact, there are multiple instances of a savant-like skill emerging in a person with no developmental disability after he or she experienced some sort of traumatic brain event. This is known as acquired savant syndrome. For example, in 2006 a 39-year-old man named Derek Amato experienced head trauma after hitting his head on the floor of a swimming pool. Despite having no training in piano playing (and little musical training in general), Amato discovered a few weeks after the accident that he was suddenly able to play the piano as if he had been taking lessons for years. Now, Amato is a professional piano player who has released multiple albums.

Savantism, and especially acquired savant syndrome, raises the question: do we all have these abilities locked up deep within our brains, but just don’t know how to free them? Some researchers, like Allan Snyder, think so. Snyder suggests that savants are able to access information in a raw form, before a “normal” brain would begin to categorize it, apply labels, and incorporate it into a larger picture. Although it is helpful for our brains to create this holistic view of something, it may also cause us to ignore details our brain decides are unimportant and/or distracting. Savants, then, might possess an attention to detail that many of us are incapable of displaying, and perhaps it is this attention to detail that allows for things like the creation of meticulous sculptures, memorization of large pieces of information, and the ability to do lengthy mathematical calculations in one’s head.

Snyder tried to test this hypothesis using transcranial magnetic stimulation (TMS). TMS involves the use of magnetic fields to transiently disrupt electrical activity in the brain. Although this sounds somewhat perilous, the side effects are minimal, and when the stimulation stops, brain activity returns to normal relatively quickly. TMS gives scientists a valuable tool, as they can disturb function in an awake patient, then see how that perturbation affects behavior. In the past, this was something that could only be done with surgery.

Snyder et al. asked participants to look at an image consisting of anywhere from 50 to 150 dots for only 1.5 seconds, and then to estimate the number of dots that were there. Obviously, 1.5 seconds is too fast for someone to count the dots, but the rapid (and sometimes precise) estimation of large numbers of objects is something that has been documented in savants. Snyder et al. had the participants make estimates after applying TMS to their left anterior temporal cortex (a region hypothesized to be involved in savant syndrome), as well as after applying sham TMS (i.e., the machine was used in such a way that the participant might think TMS was being used, but no stimulation was applied) as a control. The researchers found that, in 10 out of 12 participants, the ability to estimate the number of dots improved after TMS. Snyder et al. suggested that, by inhibiting activity in the anterior temporal cortex, they had suppressed brain activity involved in holistic processing, allowing for a focus on detail that improved numerical estimation.
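For illustration, here is a toy simulation of how improvement on the dot-estimation task might be quantified. The trial counts and noise levels below are invented; this is not Snyder et al.’s data or analysis, just a sketch of the paradigm’s logic (TMS is modeled as nothing more than a reduction in estimation noise).

```python
import random

def mean_abs_error(n_trials: int, noise_sd: float) -> float:
    """Simulate dot-count estimates; smaller noise_sd means better estimates."""
    total = 0
    for _ in range(n_trials):
        true_dots = random.randint(50, 150)   # stimulus range used in the study
        estimate = max(0, round(random.gauss(true_dots, noise_sd)))
        total += abs(estimate - true_dots)
    return total / n_trials

random.seed(1)
sham_error = mean_abs_error(20, noise_sd=25.0)  # assumed baseline estimation noise
tms_error = mean_abs_error(20, noise_sd=15.0)   # assumed reduced noise under TMS

print(f"mean absolute error, sham TMS: {sham_error:.1f} dots")
print(f"mean absolute error, real TMS: {tms_error:.1f} dots")
```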

While the idea that savant-like skills lie latent within each of us is intriguing, the explanation provided by Snyder et al. for why we can’t normally access them will require much more work to be convincing. The skills displayed by some savants seem almost superhuman, and it’s difficult to understand how holistic processing could dampen them to such a degree that they nearly disappear in most of us. Snyder’s hypothesis also doesn’t explain how someone with no training in a specific skill like piano playing can suddenly acquire that skill. Although attention to detail may be important for the development of musical talent, such a talent at least has the appearance of requiring some deeper training as well. How someone can gain proficiency in an instrument without that training, regardless of how their brain processes details, is a puzzle.

Savant syndrome remains an intriguing area for future research, for it hints at some deep power hidden within the human brain and suggests there may be a key to unlock that power in all of us. If that key is found, it would revolutionize our conceptualization of human potential.

Snyder A, Bahramali H, Hawker T, Mitchell DJ. Savant-like numerosity skills revealed in normal people by magnetic pulses. Perception. 2006;35(6):837-45. PMID: 16836048