Thursday, January 31, 2008

I Have the Strangest Feeling I’ve Written this Post Before

We’ve all experienced it, some of us many times in many different places: déjà vu, that nebulous feeling you’ve been somewhere before even though you can’t pinpoint exactly when or under what circumstances. A number of explanations have been offered over the years for why déjà vu occurs, ranging from the mystical (remnants of memories from a past life) to the scientific. Even within these disparate categories the explanations are numerous. Some scientists consider it simply a case of erroneous memory: perhaps some features of the current environment are similar enough to a past one to create a sense of familiarity, however inaccurate. Others postulate that déjà vu results from a miscalibration between short- and long-term memory. This might occur when the details of a scene are sent to long-term storage before they have been consciously processed; in the few seconds before short-term memory catches up, a seemingly distant memory of the present environment exists even though it was formed only moments before. Most of the scientific explanations, though they vary in their details, share the implication that déjà vu involves some sort of miscalculation or misrepresentation by the brain.

Current research involving deep brain stimulation (DBS) may eventually provide some real insight into déjà vu, although déjà vu had nothing to do with the original goals of the research. DBS is a relatively new technique in which a device that emits electrical pulses is surgically implanted in the brain. Although the reasons for its effectiveness are not fully understood, it has been shown to be beneficial in the treatment of Parkinson’s disease, tremors, and chronic pain. Andres Lozano, a Professor of Neurosurgery at Toronto Western Hospital in Toronto, Ontario, and his team of researchers were attempting to use DBS to treat a 50-year-old male patient with chronic obesity when they induced an unexpected result. While electrically stimulating areas of the hypothalamus in the hope of identifying a site with appetite-suppressant qualities, they discovered an area near the fornix that elicited feelings of déjà vu. With further stimulation the patient was able to recall vivid memories of being in a park with his friends when he was around twenty years old. The patient returned for further tests, and it was found that stimulation of the same area improved performance on memory tasks.

The hypothalamus is primarily involved in the regulation of metabolic and other autonomic processes through its connection to the pituitary gland. The fornix, however, has a more prominent role in memory. It is made up of a bundle of axons (nerve fibers) that connect to the hippocampus, an area of the brain thought to be extremely important in memory formation and recall.

Probably the most interesting aspect of these findings has little to do with déjà vu and more to do with DBS. DBS has only begun to be implemented within the past ten years or so, and it continues to surprise scientists and doctors with its multifarious uses. In addition to its efficacy in alleviating the chronic conditions listed above, it has also been found to have some success (just within the past year) in rousing patients from a comatose state. Since we still don’t know why this procedure is effective, imagine its potential benefit when we figure out what the mechanism of action is.

Wednesday, January 30, 2008

Of Mice and Men and Empathy and Schadenfreude

Scientific American Mind has an article in its most recent issue about our increasing recognition of empathy in non-human animals. It summarizes the history of attributing moralistic emotions to non-humans, with the implication that now more than ever scientists are recognizing homologues of empathy in animals like mice and primates. Primatologist and psychologist Frans B.M. de Waal plays an important role in that history, as he has long argued that other animals are capable of morally driven actions. He suggests the capacity for such actions played an important role in human evolution, as these actions promote concern for the lives of others in one’s social group and also help resolve conflicts within that group.

The Scientific American article discusses several studies from the last couple of years that seem to support the empathetic-animal view. For example, a group of researchers at McGill University took pairs of mice separated by Plexiglas and injected either one or both of them with acetic acid, which causes a stomachache. The stomachache results in a restless discomfort commonly referred to as “writhing”. They found that a mouse injected with the acid demonstrated more writhing when its partner was also showing discomfort. Even more importantly, this effect only occurred between mice that had been cage mates before the experiment, suggesting an empathetic concern between mice that were closely connected. Interestingly, it also only happened in male-female or female-female pairs. When a male mouse saw another male in discomfort, the observing mouse’s pain tolerance actually rose and he exhibited less distress. Since male mice compete intensely with other males for mates, it could be that they were suppressing their pain in order to demonstrate strength. It may also indicate they felt less empathy for a rival.

A group of researchers at the University of Zurich studied this same empathetic effect in humans. They used neuroimaging to observe participants’ brain activity as they watched another person in pain. Prior to this, the participants had played a game with the subject experiencing the pain; in some situations the person in pain had worked cooperatively with the participant, in others they had treated the participant unfairly. The study found that viewing the cooperative subject in pain activated areas in the brain related to pain (an empathetic reaction). In female participants this area was also activated when viewing the subject who had treated them unfairly. When men watched the person who had treated them unfairly experience pain, however, the pleasure centers in their brains were activated.

The similarities between the human study and the mouse study could be interpreted to suggest that rivalry and competition among males are powerful, and possibly universal, drives. Evolutionarily, men have faced much more vicious competition among themselves to attain a mate. This is one suggested reason why males of a species often grow to be larger than females: the larger male has had an evolutionary advantage in being able to physically defeat competitors for available mates. Thus, it makes sense that males would have less empathy for a known competitor. Those who were too trusting and friendly (less inclined to compete or fight) probably would have been the first pushed aside when a contest for an available mate began.

Sunday, January 27, 2008

Equal Time for ESP Enthusiasts

Last week I put up a post about a neuroimaging experiment that studied brain activity associated with extrasensory perception (ESP). I must admit I am biased on this topic and tend to be pretty dismissive toward belief in the paranormal. I’m sure this had something to do with my deciding to post on that particular study, and I’ll bet a hint of gloating could be detected in my review of an experiment that discredited the existence of ESP. So when I came across this study today I felt obligated to discuss it, in an attempt at providing equal time.

Researchers working jointly at Imperial College London (ICL) and the National Institutes of Health (NIH) in the U.S. have been studying how homologous recombination occurs between two strands of DNA. Homologous recombination happens when two strands of DNA that are complementary come together, break apart, trade genetic information, then close back up again. It is commonly seen in meiosis (cell division to form sex cells), where it is often referred to as “crossing over”. Homologous recombination contributes to genetic diversity, and thus is integral to evolution. It is also used by the body to repair damaged sections of DNA.

For two strands of DNA to come together and recombine, one strand must first be able to identify another that has complementary base pairs (e.g. A-T, C-G). In the past, scientists thought this process was facilitated by proteins or other organic molecules. But the groups at ICL and NIH found that long strands of DNA appear to be able to recognize one another without any direct contact or assistance from other molecules. The researchers suggest the recognition is due to patterns of electrical charges between the two molecules, but more work must be done to fully understand that interaction. The findings could be important, as understanding this mechanism may shed light on how to stop errors in recombination that lead to diseases like cancer or Alzheimer’s.
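The pairing rule itself (A with T, C with G) is simple enough to sketch in a few lines of code. This is purely my own illustration of base complementarity, not anything from the ICL/NIH study:

```python
# Watson-Crick base-pairing rule: A pairs with T, C pairs with G.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Return the base-by-base complement of a DNA sequence."""
    return "".join(PAIRS[base] for base in strand)

def is_complementary(strand_a, strand_b):
    """True if the two strands would pair up, base for base.

    DNA strands anneal antiparallel (one runs 5'->3', the other 3'->5'),
    so strand_b must match the reversed complement of strand_a.
    """
    return strand_b == complement(strand_a)[::-1]
```

Recognizing a match like this is trivial for a computer with both sequences in hand; how whole strands apparently accomplish it at a distance, via patterns of electrical charge, is exactly what the researchers are still working out.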

But the reason I juxtapose this study with the neuroimaging post from last week is that these DNA strands seem to behave in a “telepathic” manner. Their method of communicating without contact probably would have been dismissed ten years ago as improbable or impossible. And it seems subtle paradigm shifting (note the word subtle, as I’m not promoting a Kuhnian philosophy here) like this happens all the more frequently in science as our technology becomes more impressive. Something we once disregarded we can suddenly explain, and then it becomes accepted not only as true but as if it could never have been any other way. So…maybe I shouldn’t be quite so dismissive about the paranormal. Perhaps that dismissiveness isn’t very scientific.

Saturday, January 26, 2008

Warning to Homophobes: Don't Drink With Fruit Flies

It’s common knowledge that drinking alcohol can lower our inhibitions, causing some of us to occasionally do things we regret the next sobering (in more ways than one) day. One common cause of alcohol-induced remorse is the weakening of sexual restraint. It can lead to a sexual liaison with someone you normally wouldn’t consider sharing your bed with, whether it be a co-worker, friend, or someone you’re just plain not attracted to (hence the scientific term “beer goggles”). While all of this is common knowledge, scientists don’t really understand why it happens. So a group of researchers at Penn State is attempting to make sense of it by studying Drosophila melanogaster, more commonly known as the fruit fly.

It may seem strange to study a fruit fly to gain a better understanding of human beings. Fruit flies, however, have been an integral part of science since the beginning of the twentieth century. At that time Thomas Hunt Morgan was trying to comprehend Gregor Mendel’s pea plant experiments; he wanted to figure out what molecular mechanisms could be responsible for inheritance. One reason he chose to study fruit flies is that they produce a new generation about every two weeks. While Mendel had to wait a year for traits to be passed down from his pea plants, Morgan could go through over twenty generations in that time. Morgan had set out to prove Mendel wrong, but ended up winning the Nobel Prize in 1933 for demonstrating that inherited information is passed down on chromosomes (confirming Mendel’s hypothesis of inheritance).
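Morgan’s advantage is easy to quantify. A back-of-the-envelope calculation (my own, using the rough generation times given above) shows how many fly generations fit into one pea-plant generation:

```python
# Rough generation times from the text above.
PEA_GENERATION_DAYS = 365   # Mendel's peas: one generation per year
FLY_GENERATION_DAYS = 14    # Drosophila: a new generation about every two weeks

# Whole fly generations Morgan could observe while Mendel waited for one harvest.
fly_generations_per_year = PEA_GENERATION_DAYS // FLY_GENERATION_DAYS
print(fly_generations_per_year)  # 26 -- "over twenty generations", as the text says
```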

Drosophila has been widely used in science ever since. The flies’ quick rate of reproduction allows researchers to make genetic manipulations and study the effects shortly afterward. In addition, the genome of the fruit fly has been sequenced, and, surprisingly, flies share homologues of up to 77% of human disease-causing genes, including genes implicated in Parkinson’s, Huntington’s, and Alzheimer’s disease. This similarity to humans makes fruit flies popular for genetics research, and our good understanding of the Drosophila genome has also made them common subjects for studying behavior. Fruit flies have been used to study (among other things) memory performance, longevity, sexual orientation, and alcohol and drug abuse.

The team at Penn State gave fruit flies a daily dosage of ethanol (the intoxicating agent in alcohol) and observed the results. They saw that ethanol increased male fruit fly courtship of females, as would be expected. But to their surprise, they found it also resulted in increased instances of inter-male courtship. Inter-male sexual relations rarely happen in Drosophila without the influence of ethanol, but with ethanol the tendency for them to occur rose steadily after the first few encounters. The researchers studied the molecular mechanisms behind this behavior and found a couple of factors, one being dopamine transmission, that were necessary for the decreased sexual inhibition. The study wasn’t undertaken to examine homosexuality in fruit flies (although researchers in the past have identified a gene, called fruitless, that can be manipulated to cause homosexual behavior), but to find a physiological basis for sexual disinhibition. The results are something future researchers can build upon to understand not only why we sometimes wake up next to a face we are surprised (or mortified) to see, but--more importantly--why more unprotected sex and sexual assault occurs when alcohol is involved.

Thursday, January 24, 2008

In the Eye of the Beholder

Imagine you are at work one morning, sitting at your desk (or wherever you may sit at work), when someone begins walking toward you. You look up and their face is a blur, a completely featureless void that gives you no indication of who they might be. You examine their gait, their clothing, and their body shape. These tell you the person is a man, but everything else is so nondescript you don’t know if he’s your boss, a visitor, or a co-worker from down the hall. You anxiously look down, hoping the person won’t notice your confusion, which would seem quite strange since you’ve worked there for years. Only when you hear his voice do you realize he is a friend, simply interested in what you are doing for lunch.

Welcome to the world of a prosopagnosiac. People who suffer from prosopagnosia have an inability to recognize faces. The cause of the disorder, also known as face blindness, is most often some sort of brain damage, such as a tumor or lesion, but there are also congenital cases. A 2006 survey suggested that up to 2% of the population may suffer from prosopagnosia. The severity of the disorder can vary from a subtle blurring of features to the complete inability to recognize faces (even one’s own) as described in the paragraph above. Prosopagnosiacs often learn to cope with their affliction by focusing on other features that make a person recognizable, such as body size, voice, or style of dress. Thus, the account above is slightly dramatized, as a practiced prosopagnosiac probably would have recognized a friend from some of these other qualities before hearing his voice.

Prosopagnosia is a fascinating disorder for many reasons, but perhaps what makes it most amazing to neuroscientists is its specificity. For a long time scientists didn’t know if there were areas of the brain—or individual neurons—so specialized that, when damaged, they could impair only one distinct skill. As prosopagnosia is an example of such a case, it has contributed to our modern understanding of the brain.

Prosopagnosia appears to result from damage to an area spanning the temporal and occipital lobes called the fusiform gyrus. Neuroimaging studies have identified a specific region of the fusiform gyrus that is activated when a subject views a person’s face; it consequently came to be named the fusiform face area (FFA). There has been some dispute over whether the FFA is activated only during facial recognition, but its role in seeing faces is well documented. A recent study conducted with prosopagnosiacs demonstrated that the FFA is also involved in recognizing beauty in a face. While the brain is amazing for the complex interactions between its various parts, it is also interesting (and sometimes frightening) to realize just how functionally specific some of those parts are.

Wednesday, January 23, 2008

Thinking Thin Not So Easy

Obesity is arguably the most dangerous health crisis the United States faces right now, and much of the rest of the developed world is heading down the same path. About 65% of the U.S. population is overweight, and over 30% is obese. Public awareness of this is rising slowly, resulting in half-hearted attempts by fast-food restaurants to add healthy items to their menus and in the proliferation of a diet industry that in many cases probably does as much harm as good. Needless to say, the trend seems to be continuing in the wrong direction. As we grow fatter as a nation, we also find diabetes, heart disease, and some types of cancer rising at alarming rates.

Many of the proposed solutions to this dilemma focus on public awareness and corporate responsibility, both of which are good things. Many scientists, however, are interested in finding the roots of the problem. There is a reason why human beings are inclined to eat fatty foods, and why digesting excess amounts of such foods results in deposits of adipose tissue throughout the body. Think about this from an evolutionary standpoint. In an environment like the one our hunting and gathering ancestors lived in, periods of food availability were followed by days (or longer) when food was scarce. In that ancient world, the ability to store fat as adipose tissue would have been adaptive, and the desire for fatty foods would have been beneficial, since those foods provided stored energy that could sustain one through periods of scarcity. Today’s environment differs, however, in that food is available all the time, and the fattiest foods are often those that require the least effort and money to obtain. Perhaps these behavioral remnants of our evolutionary past combine with the modern ubiquity of food to create the obesity epidemic we are witnessing today.

But this obviously isn’t the whole story, for it doesn’t explain the difference between the 35% of the population who aren’t overweight and the 65% who are. Scientists hope that finding the reason for this disparity may lead to better methods of curbing obesity and help us avoid the national health crisis we seem to be headed toward. A great deal of research supports a strong genetic influence in obesity. The number of genes involved, their interdependence, and the molecular mechanisms of their influence, however, have yet to be determined.

Naturally, some of the research in this area is focused on the neural mechanisms that contribute to overeating. As eating is a rewarding process, much attention has been paid to dopamine abnormalities leading to obesity (for more discussion of dopamine and rewarding processes see last week’s post “Drugs, Love, & War: All the Same to the Brain?”). A recent discovery by William Bendena and Ian Chin-Sang of Queen’s University, however, has shown perhaps the most direct connection between neurotransmitter activity and overeating to date. Experimenting with worms that have distinct neurotransmitter similarities to humans, Bendena and Chin-Sang found a nervous system receptor that, when damaged, caused no change in the worms—until they were placed on food. Then they suddenly became lethargic, would not move away from the food, and gained fat at a much quicker rate than controls. When the researchers added extra copies of the receptor to other worms, those worms became much more active, traveling great distances from their food supply. Of course much work must be done to apply these findings to humans, but they do suggest that perhaps there is a neurobiological mechanism leading directly to lethargy and overeating. If so, it may be amenable to correction through pharmacological methods, which might be more successful than simply adding more salads to a fast-food menu.

Tuesday, January 22, 2008

Stem Cells and the Brain

Stem cells are probably among the least understood (by the public), yet most fascinating, biological entities we have identified. Which of us hasn’t marveled at the ability of a newt to grow back its limbs after they are cut off, or of a starfish to be cut in half and regenerate into two new starfish? Both organisms are able to do these seemingly miraculous things because of stem cells. So you can understand why some scientists are consumed with understanding and utilizing stem cells, in the hope of slowing disease and even aging. Stem cells are special because they are versatile cells that can be prodded to develop into any type of adult cell, be it muscle, liver, nerve, etc. This makes them valuable not only for possible cell replacement therapies (for degenerative diseases like Parkinson’s), but also for the study of cell growth to learn more about the etiology of disease. If you are unfamiliar with stem cells and have a few hours to learn about them, there is a fantastic series of lectures available for free on the Howard Hughes Medical Institute’s interactive site, http://www.hhmi.org/biointeractive/lectures/.

Manipulating stem cells, however, is not easy and involves several dilemmas. A major one is: once you have one of these versatile cells, how do you get it to become what you want it to be? This is an area of continuing research, and in most cases involves finding a gene or set of genes responsible for directing the stem cell’s growth. This, once again, is not an easy task, as there are somewhere around 30,000 genes in a human cell (estimates vary).

Occasionally, however, there are successes. Dr. Edwin Monuki and colleagues at the University of California, Irvine, have identified a gene called Lhx2 that directs stem cells in early development to form the cerebral cortex. The cortex is responsible for higher-order functions in humans, such as reasoning, language, and vision. Degradation of or damage to the cortex can be very debilitating, as is seen in cases of Alzheimer’s disease or stroke. Thus, the discovery of a mechanism to turn stem cells into cortical cells has great potential to slow neurodegenerative disease or help patients recover from cerebrovascular accidents. Despite the political controversies, stem cells are one of the most promising tools we have for fighting disease and aging, although much more must be learned about them before they can fulfill our expectations. Discoveries like Dr. Monuki’s are edifying steps toward that goal.

Friday, January 18, 2008

The Eyes Are the Windows to the...Internet?

This may seem a little off the topic of neuroscience, but ultimately neuroscience is needed to explain perception, and anything related to vision is related to the brain (plus it was just too cool for me to ignore). Remember this scene?

It is from Terminator 2: Judgment Day, and is a snapshot of the Terminator robot’s (Arnold Schwarzenegger's) point of view. His/its visual display was complete with a targeting mechanism as well as an extensive database that could identify people and objects without any movement or effort. Well, as is sometimes the case with science fiction, the movie may be a harbinger of technology to come, possibly within this generation. Engineers at the University of Washington are attempting to develop a contact lens with an embedded electronic circuit that would superimpose a virtual display over the wearer’s normal field of vision. They have gotten as far as creating a flexible, safe lens with a circuit and light-emitting diodes incorporated into it, although the diodes do not yet light up.

The ability to put a circuit on a contact lens, however, is incredible in and of itself. Not only does it necessitate the creation of nanoscale circuits, but it also involves embedding them in biologically safe materials that are flexible enough to fit on the eye. The circuits for the contact lenses created at UW are about one thousandth the width of a human hair. After being constructed they are spread on a sheet of flexible plastic. The pieces of the circuit can only fit together in one configuration, and natural capillary action (the same force that draws water up plant tubules) brings the pieces together. This is a remarkable manufacturing technique in nanotechnology known as self-assembly.

The display Arnold Schwarzenegger had in Terminator 2 is still quite a ways off. But according to a UW assistant professor of engineering, Babak Parviz, a basic display with a few pixels could be functional “fairly quickly”. Eventually, the team at UW hopes to create a contact lens that could easily be popped in or out, with wireless technology that would allow one to surf the internet without a screen in front of them. It sounds very cool, but also I imagine it would be a little eerie to walk into Starbucks and see a bunch of people staring off into space as they sip their mochas…

Thursday, January 17, 2008

Using Neuroscience to Debunk the Paranormal

Extrasensory perception, or ESP, is one of the most widely accepted paranormal phenomena, with almost half of adults in the United States affirming its existence. Under the rubric of ESP fall mental processes that are considered outside the normal range of thought, such as predicting the future, reading other people’s minds (telepathy), and knowing of distant events as they occur (clairvoyance). Detractors claim no reliable evidence of ESP has ever been presented, while supporters assert they have experienced these extraordinary thoughts, such as knowing someone was going to call right before the phone rang.

Such claims are difficult for scientists to address because, although no scientific evidence exists to support them, they are also very difficult to disprove. It is entirely possible someone believes they had an eerie feeling at just the moment a friend was involved in a car crash. A scientist might argue, however, that the person remembers a chill that would normally have been ignored, but is now remembered in conjunction with a disturbing event. The association is made after the fact, but remembered as if it were made beforehand. The same applies to the case of thinking about a friend, then hearing the phone ring and being surprised to hear her voice on the other end of the line. You may have thought about her a hundred times in the week before this call, but it is only considered a memorable event when the phone coincidentally rings during one of those times. Those who believe in ESP might ascribe this to telepathy, but a scientist might suggest it is due to coincidence and the human tendency to remember things through association. Still, it is hard to prove it’s not ESP.

A group of psychologists at Harvard University is using neuroimaging to try to get to the bottom of this issue. Samuel Moulton and Stephen Kosslyn used fMRI to study participants as they viewed ESP and non-ESP stimuli. The non-ESP stimuli were pictures simply presented visually. The ESP stimuli were the same kinds of pictures, but presented in three additional ways. First, to measure telepathy, they were shown to the participant’s identical twin, relative, romantic partner, or friend, who was seated in another room. Then, to measure clairvoyance, they were displayed on a computer screen located out of sight of the participant. Finally, to measure precognition, the pictures were shown to the participant later (in the future).

They found no difference in the way the brain reacted to the ESP and the non-ESP stimuli, although there was a difference in the emotional importance the participants ascribed to the ESP stimuli. This finding supports the idea mentioned above, whereby we assign significance to an event and only later correlate it with a paranormal explanation. The researchers are the first to point out that this doesn’t prove ESP is not real. Once again, it’s hard to prove something like ESP doesn’t exist, as proponents can simply claim it is not measurable in this way (with neuroimaging). But Moulton feels this is the best evidence against ESP thus far.

Wednesday, January 16, 2008

Nature vs. Nurture in Depression

It seems the nature vs. nurture debate has cooled from a fiery argument to some mild bickering over the details (although make no mistake, they are important details). Most scientists today will accept the statement that neither nature nor nurture can be considered solely responsible for one’s behavior; rather, it is some combination of both. Just how much each factor contributes to the end product, however, is the detail that continues to be debated. Personally, I have always appreciated the analogy of a virtuoso musician playing an instrument. When you hear the music she creates, you cannot say that either the musician or the instrument is solely responsible for it. Take away either one and you are left with silence. In a similar (but much more complex) way, our genes and environment are both responsible for who we are.

A study conducted by Gerald Haeffel and colleagues at the University of Notre Dame attempted to investigate depression while taking both genes and environment into consideration. It has long been thought that dopamine may play a role in depression. In fact, a specific gene has been identified that encodes a dopamine transporter (a protein that removes excess dopamine from the space between communicating neurons, the synaptic cleft), and a certain variety (or allele) of this gene has been correlated with depression. Haeffel studied 177 male adolescents from a Russian juvenile detention facility. They were given a depression assessment and a questionnaire designed to determine their mothers’ parenting style, and were tested for the specific dopamine transporter allele previously implicated in depression. The results showed that neither a cruel mothering pattern nor the dopamine transporter allele alone predicted depression. A combination of both, however, resulted in a higher risk for depression and suicidal tendencies.
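The pattern Haeffel's group reports (neither factor alone predicts depression, but the combination does) is what statisticians call an interaction effect. Here is a toy sketch of the idea; the function and every coefficient in it are invented for illustration and have nothing to do with the study's actual data:

```python
# Toy gene-by-environment interaction model (illustrative only;
# all coefficients are invented, not taken from Haeffel's study).
def depression_risk(has_risk_allele, harsh_parenting):
    """Hypothetical risk score from two binary (0/1) factors."""
    baseline = 0.10
    gene_effect = 0.02 * has_risk_allele          # small main effect of the allele
    environment_effect = 0.03 * harsh_parenting   # small main effect of parenting
    # The interaction term: risk jumps only when BOTH factors are present.
    interaction = 0.25 * has_risk_allele * harsh_parenting
    return baseline + gene_effect + environment_effect + interaction

# Neither factor alone moves risk much; together they do.
for gene in (0, 1):
    for env in (0, 1):
        print(gene, env, round(depression_risk(gene, env), 2))
```

The point of the sketch is the last term: a study that fits only the two main effects would find little, which is why designs that test genes and environment jointly matter.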

This study is groundbreaking not only because it is the first to support a role for the dopamine transporter gene in depression, but also because it represents a modern understanding of the interaction of nature and nurture. As scientists like Haeffel more frequently use experimental designs that combine genetic and environmental factors, we will inevitably gain a much deeper, and more accurate, understanding of human behavior.

Tuesday, January 15, 2008

Drugs, Love, & War: All the Same to the Brain?

In many ways, of course, the brain handles drugs, love, and violence drastically differently. Researchers have been aware for some time, however, that love and drugs have many similarities in how they are processed by the brain. A neurotransmitter called dopamine has been found to be necessary for participation in drug-seeking or love-seeking behavior. In fact, it has been implicated in nearly every experience we consider rewarding, such as love, drugs, eating, and sex. This has caused one of the primary dopaminergic systems of the brain, the mesolimbic dopamine pathway (along with other accompanying structures), to be referred to as the reward system of the brain. Originally it was thought that this system must be responsible for the euphoric effects one feels when using drugs or while experiencing romantic love. Later experiments showed, however, that it is more likely dopamine is necessary for reinforcement--for helping the brain remember which experiences were rewarding, and which cues in the environment to look for in order to facilitate the recurrence of those experiences. For example, an ex-smoker might get a whiff of cigarette smoke from a passerby and feel a craving for a cigarette. This associative memory experience is probably due to dopamine.

Now for the first time researchers have drawn a direct connection between dopamine and aggression. Many of us have felt the surge of energy that accompanies watching a boxing match, or a fight during a hockey game, and some may even have been embarrassed when overcome with such a "primal" emotion. Maria Couppis and other researchers at Vanderbilt University conducted an experiment with mice to further explore this violence-associated euphoria. In the experiment a male and a female mouse were kept together in a cage, and several other mice were kept in a separate cage. The female mouse was removed and replaced with one of the intruder mice, provoking an aggressive response from the "home" male mouse. The female mouse was then returned and the intruder removed. The home mouse was next trained to push a target with its nose in order to have the intruder mouse put back into the cage, whereupon the home mouse would again behave aggressively toward the intruder. The fact that the mouse continually pushed the target indicates the opportunity to engage in this aggressive defense may have been rewarding to it. The mouse was then treated with a dopamine antagonist, which blocks the activity of dopamine. This significantly reduced the mouse's target-pushing behavior. The experiment was repeated with a number of different mice, and with changes in environment, with similar results.

This experiment is the first to demonstrate a distinct similarity between violence and other reward-seeking behavior. Why would our brains put violence in the same category as sex? The consensus opinion on rewarding behavior of any kind is that there must have been an evolutionary advantage in pursuing that type of behavior in order for it to become part of our reward system. The evolutionary advantage of eating, for example, is obvious (it's necessary for survival), so it makes sense that we evolved to find eating enjoyable. The same is true for sex: without it we can't achieve our evolutionary goal of procreation, so it should be something we want to pursue. Aggression had its own evolutionary advantage; it was necessary for our ancestors to protect offspring, mates, territory, and food. So perhaps the fascination many of us have with violence comes from a brain system that evolved in a time when aggression was a necessary part of survival. It is important to mention, however, that just because violence may be a natural part of our evolutionary past does not mean it still holds a place in today's environment or behavior.

Friday, January 11, 2008

The Neuroimaging Revolution

One of the most exciting scientific advances of the past fifty years has been the development of complex neuroimaging techniques. Since computerized axial tomography (CAT or CT) was introduced in the 1970s we have seen the development of PET scanning, magnetic resonance imaging (MRI), and functional MRI (fMRI), each more powerful than the last, and each allowing for a drastically improved understanding of the brain and behavior. The dominant imaging method over the past twenty years of brain research has been fMRI. MRI creates an image of the brain from radio waves emitted by hydrogen atoms in the body as they are manipulated by a magnetic field. fMRI goes a step further, allowing for a measurement of actual brain activity. When brain areas are active they have an increased need for oxygen, so an increased amount of oxygenated blood is directed to them. fMRI takes advantage of the fact that oxygenated and deoxygenated blood have different magnetic resonance signals, and creates an image of brain activity based on blood oxygenation levels. Here are pictures of an MRI (top left) and fMRI (bottom right):

[Images not shown: MRI and fMRI scans]
fMRI is effective and now integral to brain science, but don't think for a second that the desire to create more precise brain imaging techniques ends there. There are a number of other techniques still being perfected that you may not hear about until they become more prevalent. One that is already starting to give us a more complete picture of the brain is diffusion tensor imaging (DTI). While MRI shows the major structures of the brain, it has not been able to recreate the connections between those structures, such as the white matter tracts that connect the two hemispheres of the brain. DTI uses measurements of water diffusion along nerve fibers to image this subarchitecture. Compare this colorful DTI picture to the MRI and fMRI pictures above.

[Image not shown: DTI tractography]
DTI has already begun to be utilized in research. Randy Buckner and colleagues used DTI to measure white matter integrity in older adults. Combining DTI with fMRI, they found that in older individuals who had experienced a loss of cognitive ability, the integrity of white matter connections was decreased along with that of functional connections. It is hoped that DTI will provide insight into the cognitive loss associated with aging, as well as into dementias like Alzheimer's disease.
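As a side note, white matter "integrity" in DTI studies is commonly quantified with fractional anisotropy (FA): a number between 0 and 1 computed from the three eigenvalues of the diffusion tensor at each point in the brain, describing how strongly water diffusion favors one direction. The study above may have used a different measure, but FA is the standard one, and a minimal sketch looks like this:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three diffusion-tensor eigenvalues.

    0 means diffusion is equal in all directions (isotropic);
    values near 1 mean diffusion strongly favors one axis,
    as in a healthy, tightly bundled white matter tract.
    """
    mean = (l1 + l2 + l3) / 3
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

# Free water (e.g. in the ventricles): same diffusion in every direction.
fa_water = fractional_anisotropy(3.0e-3, 3.0e-3, 3.0e-3)

# A white matter tract: diffusion much faster along the fiber axis
# (illustrative eigenvalues in mm^2/s, not from any particular study).
fa_tract = fractional_anisotropy(1.7e-3, 0.2e-3, 0.2e-3)

print(f"FA in free water: {fa_water:.3f}")   # essentially 0
print(f"FA along a tract: {fa_tract:.3f}")
```

Reduced FA in a tract that should be highly directional is what "decreased white matter integrity" usually refers to.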

Spiders, Snakes, and Evolved Fears

Have you ever taken a moment to think about some of the most common phobias and why they are so widespread? Scientists have wondered about this for some time, and a great deal of research has focused on two widely held fears: spiders and snakes. Why do so many of us fear spiders and snakes when 1) the vast majority of them are not dangerous to us, and 2) most of us have never had a dangerous encounter with one?

A knee-jerk answer to those questions might be that we have a culturally induced fear of snakes and spiders, one reinforced by myths, legends, and the rare true story of venom-induced death. But research like that conducted by David Rakison of Carnegie Mellon University and Jaime Derringer of the University of Minnesota seems to contradict the idea that fears of snakes and spiders are (at least predominantly) culturally developed. They showed pictures of spiders to five-month-old infants and measured the amount of time the infants spent fixated on them vs. pictures of other objects. The infants spent an average of 7-8 seconds longer looking at the spider pictures, an indication that the spiders were more interesting to them, at an age when they seemingly could not have learned the significance of a spider through cultural or environmental processes. Similar research has been conducted in the past with spiders and snakes, with similar results, although not with humans of such a young age.

So what does this mean? Do we have inborn fears of snakes and spiders? Well, perhaps we are not born with a fear of spiders, but maybe with a predisposition toward viewing snakes and spiders differently than we would a cat or a flower. The evolutionary viewpoint is that a predisposition toward certain fears that increased the chances of survival would be passed on. Sure, most snakes and spiders we encounter today aren't harmful, but in the environment where our evolutionary ancestors dwelled (Africa) there were more harmful varieties, as well as little medical ability to manage a venomous bite. Thus, an encounter with a snake or spider was something to be feared, and those who had a healthy fear would be more likely to survive, reproduce, and propagate their genetic information. Rakison and Derringer suggest that a genotype that increases the chances of survival by only 5% would become widespread throughout a population in 20-30 generations.
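To get a feel for how a modest survival advantage compounds across generations, here is a deliberately simplified selection model (my own illustration, not Rakison and Derringer's actual model, which may include factors this toy version ignores). Carriers of the trait are given a relative fitness of 1.05, and the trait's frequency is updated one generation at a time; the starting frequency is an arbitrary assumption:

```python
def next_freq(p, s):
    """One generation of selection in a simple one-locus model:
    carriers of the trait have relative fitness 1 + s."""
    return p * (1 + s) / (p * (1 + s) + (1 - p))

p, s = 0.25, 0.05   # assumed starting frequency; 5% fitness advantage
history = [p]
for _ in range(30):
    p = next_freq(p, s)
    history.append(p)

print(f"frequency after 30 generations: {p:.3f}")
```

Under these assumptions a trait already carried by a quarter of the population reaches a majority within about 30 generations; a rarer trait takes considerably longer, so the pace depends heavily on the starting frequency and the model's simplifications.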

Behavioral predispositions can be selected for evolutionarily because they are grounded in the physiology of the brain. In this case, that physiology probably involves the amygdala. The amygdala is an almond-shaped (amygdala means almond in Greek) group of neurons in the temporal lobe, and part of the limbic system. The limbic system is heavily involved in emotional reactions, and the amygdala has been implicated to a large extent in reactions of fear. Thus, an evolved fear module has been proposed that centers on the amygdala. This fear module is thought to consist of a network of brain regions that evolved to recognize particular threats, like spiders and snakes, with little cognitive processing. This enables protective behavior without having to sit around and think about whether the spider is a black widow, the snake is a copperhead, etc. (by the time you decide, you may already have been bitten). This fear module may predispose babies to recognize spiders and snakes, and might be responsible for our unreasonable fears of them. If correct, this theory could also help explain why we are afraid of heights, the dark, water, and even public speaking.

Welcome to Neuroscientifically Challenged!

Hi, welcome to Neuroscientifically Challenged! N.C. is a neuroscience blog that dissects current developments in neuroscience to make them approachable for the beginning neuroscientist, or even those with just a casual interest. It seems advances in neuroscience get more popular media attention every month. Not coincidentally, we see more advances each year that have a direct relevance to our daily lives. Thus, neuroscience is not only a great field to start a career in; a basic understanding of it is also becoming necessary in order to grasp major scientific developments and analyze their resulting media coverage. Hopefully, this blog will help you do that.

The discussion here will inevitably stray into related areas of science like genetics and proteomics (the study of proteins), as well as a little neuroscience-related philosophy. This is my first blog, so I would appreciate any comments and suggestions as to how you think I could improve it. Also, if you have any topics you would like to see discussed, please let me know and I will do my best to comply. Thanks and enjoy!