Wednesday, August 27, 2008
Have a Face Only a Mother Could Love? Without Serotonin She Thinks You're Just as Ugly as Everyone Else Does
As the popularity of antidepressant medication has burgeoned over the past few decades, serotonin has become one of the more publicly recognized neurotransmitters. Along with that popularity has come a trend of attributing a wide variety of behaviors and conditions (especially depression) to “serotonin imbalances”. While this is a gross simplification in most cases, it does seem clear that serotonin transmission correlates with behavior.
A group of researchers at Case Western Reserve University has recently shown that the disruption of serotonergic function in mice is powerful enough to inhibit one of their strongest instincts: caring for their young. They used female mice with a mutation that causes a reduction in the expression of serotonergic genes and in the synthesis of the neurotransmitter, and monitored the survival of their young after they gave birth.
Ninety-nine percent of the pups of the wild-type (normal) mice lived past the nurturing period of youth, but none of the pups of the serotonin-inhibited mothers survived; in fact, most were dead within 3-4 days. When the researchers took pups born to the serotonin-deficient mothers and gave them to wild-type mothers to raise, the pups survived. The serotonin-deficient mothers failed to nurse their pups, didn't build nests for them, and didn't gather them into the huddle necessary for their warmth and survival.
The serotonin-inhibited mothers did not seem to exhibit deficiencies in any other behavioral assay, such as maze-running or olfaction. They were not deemed overly anxious as measured by locomotor tasks, but instead of mothering they often simply paced the cage and engaged in repetitive digging. The authors suggest that anxiety behaviors might have been more prominent if not for the relaxing effect lactation has on rodents.
How applicable these findings are to humans is, of course, completely unclear. Postpartum depression is often treated with selective serotonin reuptake inhibitors, but even when untreated it doesn’t generally lead to abandonment of one’s children. Regardless, finding a neurochemical substrate for an instinct like caring for one's young is notable, as that behavior is essential to what is widely considered the goal of existence: high reproductive fitness.
Lerch-Haner, J.K., Frierson, D., Crawford, L.K., Beck, S.G., Deneris, E.S. (2008). Serotonergic transcriptional programming determines maternal behavior and offspring survival. Nature Neuroscience, 11(9), 1001-1003. DOI: 10.1038/nn.2176
Monday, August 18, 2008
Ten years ago, if you had asked a neuroscientist what neurotransmitter is most important to the development of an addiction, nine out of ten times they would have said “dopamine”. Ask the same question today, however, and you’ll probably be told that it is impossible to pin such a complex process on one neurotransmitter, as clearly (at least) both dopamine and glutamate are integral to the addiction process.
In hindsight, it is not surprising that glutamate is involved in addiction. Glutamate is the most abundant excitatory neurotransmitter in the brain. It is utilized in a number of cognitive processes and is essential to synaptic plasticity, and thus to learning and memory. And addiction is really just a type of learning—perhaps learning gone haywire, but learning nonetheless. It involves the association of a positive experience with the drug that induced it, resulting in a seeking of the drug to reproduce the experience. In addiction, however, unlike other learning processes, this seeking becomes obsessive and compulsive.
It is now thought that cocaine use causes glutamatergic synapses on dopamine neurons in the ventral tegmental area (VTA), a midbrain region of the reward system, to become stronger—even after just a single use. This makes the dopamine neurons there more sensitive to glutamate, causing a hyper-sensitivity to cocaine that results in addiction. It is believed the strengthening of these glutamatergic synapses involves changes in the composition of subunits of glutamate receptors.
In order to shed more light on the specifics of this subunit restructuring, a study published last week in the journal Neuron investigates the behavioral results of changes in glutamate receptor structure. The authors created genetically engineered mice that lacked one of three types of glutamate receptor subunits: GluR1, GluR2, or NR1.
As expected, they found that cocaine-induced strengthening of synapses on dopamine neurons depended on functional glutamate receptor subunits, specifically the GluR1 and NR1 subunits. They also, however, made two major discoveries. First, deletion of the GluR1 subunit slowed the extinction of cocaine-seeking behavior: these mice continued to seek cocaine long after it had been withheld from them, when normal mice had already “forgotten” about the drug. By extension, this might mean that pharmacological stimulation of this subunit could have potential as a treatment for addiction.
Additionally, they found that the NR1 receptor subunit was necessary for the reinstatement of drug-seeking behavior after extinction. This is analogous to relapse behavior in humans. Once again, this could have pharmacological potential in addiction treatment.
Of course, these pharmacological applications, if viable, will take some time to work out. As you can imagine, it will not be easy to create a treatment that can selectively inhibit specific subunits on glutamate receptors in a particular brain region (although this can be, and has been, done with other receptor subunits). And, given how important glutamate is to learning in general, there is potential that a treatment aimed at glutamate receptors could disrupt other cognitive processes. So, if you’re waiting for a pill to solve your cocaine problem, you may have to wait a while longer. A cocaine vaccine may be available first.
Engblom, D., Bilbao, A., Sanchis-Segura, C., Dahan, L., Perreau-Lenz, S., Balland, B., Parkitna, J., Lujan, R., Halbout, B., Mameli, M. (2008). Glutamate Receptors on Dopamine Neurons Control the Persistence of Cocaine Seeking. Neuron, 59(3), 497-508. DOI: 10.1016/j.neuron.2008.07.010
Thursday, August 14, 2008
Why do we remember? To some this might seem like a ridiculous question. Memory is so intricately intertwined with our conception of existence that it is difficult to objectively ask questions about why we developed the capacity for it, or to imagine the possibility of a life without it. If one is to assume, however, that like every other facet of the human condition, memory evolved from rudimentary beginnings, then “why do we remember?” becomes not only a reasonable question, but an important scientific inquiry.
Looking at memory from an evolutionary standpoint, one must assume that it developed to serve an adaptive purpose. Of course, when one begins to cogitate on what that purpose might be, it is easy to stumble into purely speculative territory, and evolutionary psychologists have received much criticism for this. Examples of hypotheses about evolutionary origins gone wrong shouldn’t negate the efforts of the entire field, however; they should simply encourage a more cautious approach.
Two psychologists from Purdue University, James Nairne and Josefa Pandeirada, published an article in this month’s Current Directions in Psychological Science that describes their lab’s approach to the evolution of memory. It attempts to avoid blatant speculation by beginning with simple hypotheses about the purposes of memory, and testing their validity before moving on to more complex explanations.
They start with three basic assumptions that an evolutionarily adaptive perspective on memory would require. The first is that memory probably didn’t evolve just to recall the past; rather, it must serve a prospective purpose, allowing us to predict the probability of future events given certain circumstances. Second, memory should be governed by priorities. If we remembered all environmental stimuli with equal clarity, we would struggle to recall the most salient ones, and our memories would be cluttered with unimportant details about our environment. Third, memory should assign the highest salience to environmental stimuli that bear on reproductive fitness. So, those things in our environment that have historically proven most important for survival should garner the most mnemonic attention.
Based on these assumptions, Nairne and Pandeirada conducted behavioral experiments to determine if survival-related processing enhances retention. After an initial study indicated that participants were able to remember survival-related words better than other words that required a similar level of processing, the researchers designed a large study that compared survival processing to some of the most renowned mnemonic techniques, like imagery and the use of autobiographical cues. They found that survival processing resulted in higher average recall rates than any of the other techniques tested.
So perhaps we remember because it allows us to predict where danger might lie, who we can trust, successful ways to court a mate, how to obtain food, etc. Maybe this seems obvious, but it only becomes so with a little thought. Our inherent predisposition toward an anthropocentric view of the world often causes us to unconsciously regard our mnemonic abilities as above the laws of science and the progression of evolution. We don’t usually think of our memory as evolving in the same way that our bodies have, but the idea that we have developed context-specific cognitive modules through evolution is becoming hard to ignore. Then again, perhaps there is a reason we have a tendency to ignore such explanations for our cognitive abilities. Anthropocentrism may be adaptive in its own right.
Nairne, J.S., Pandeirada, J.N. (2008). Adaptive Memory: Remembering With a Stone-Age Brain. Current Directions in Psychological Science, 17(4), 239-243. DOI: 10.1111/j.1467-8721.2008.00582.x
Monday, August 11, 2008
It has long been known in the addiction field that exposure to drug-associated stimuli, commonly referred to as relapse triggers, is one of the primary causes of relapse in abstinent addicts. Neuroscience studies have added evidential support for this perspective by providing a molecular explanation for it. The mechanism is thought to principally involve two neurotransmitters—dopamine and glutamate—and a region of the reward system called the ventral tegmental area (VTA).
The VTA is part of the midbrain, and two major dopamine pathways—the mesolimbic and mesocortical—originate there. It is chock-full of dopamine, glutamate, and GABA neurons. When a subject that has acquired self-administration of a drug like cocaine is exposed to environmental stimuli it has associated with the drug, glutamate and dopamine are released in the VTA. This rush of neurotransmitters activates another area of the reward system, the nucleus accumbens, and usually leads to an attempt to reinstate drug-taking behavior.
As might be expected, cocaine use itself also results in increased dopamine and glutamate transmission in the VTA. Interestingly, however, this increased neurotransmitter activity begins before the pharmacological effects of cocaine can occur. While it takes about 10 seconds for cocaine to cross the blood-brain barrier and exert its psychotropic influence, dopamine levels rise almost immediately. Thus, it would seem that the reinforcing qualities of the drug may not be solely attributable to the euphoria it produces.
Roy Wise, Bin Wang, and Zhi-Bing You published an article last week in PLoS ONE that investigates this phenomenon. They injected cocaine methiodide (MI)—a cocaine analogue that does not cross the blood-brain barrier and thus has no psychotropic effect—into rats and measured the resultant changes in neurotransmitter levels.
In rats that had never been exposed to cocaine, the MI had no effect. But in those that had previously acquired cocaine self-administration, the MI caused VTA glutamate release. It was also enough to cause these rats to reinstate cocaine-seeking behavior that had previously been extinguished.
This study speaks to the complexity and potency of the inclination toward relapse. While it has been known that external cues can cause changes in brain chemistry that predispose one toward relapse, this is the first evidence that internal cues (besides the actual rewarding mental influences of the drug) may also play a role in reinstating drug use. Fortunately, these added influences can be avoided by continued abstinence from the drug. But once a drug is used, how pleasurable the resultant experience is may have little to do with the re-emergence of drug cravings.
Wise, R.A., Wang, B., You, Z.-B. (2008). Cocaine Serves as a Peripheral Interoceptive Conditioned Stimulus for Central Glutamate and Dopamine Release. PLoS ONE, 3(8), e2846. DOI: 10.1371/journal.pone.0002846
Thursday, August 7, 2008
Schizophrenia is one of the more frightening and debilitating mental disorders. It can cause hallucinations, delusions, and social withdrawal, as well as a variety of other cognitive afflictions. While scientists have yet to decipher the etiology of the disease, its high heritability (60-85%) has led many to look for answers in genetics. Since schizophrenia affects cognitive functions that are distinctly human (like language-related abilities), some have begun to consider ways in which the human brain has evolved, and how this could shed light on the causes of schizophrenia.
A group of researchers published a study this week in Genome Biology that examines the relationship between schizophrenia and the evolution of higher order processes in humans. They first investigated the evolution of molecular mechanisms involved in human cognition. Then they examined the changes that occur in schizophrenic patients, and looked for an overlap between the two data sets.
They found that, of 22 biological processes showing strong indications of recent positive selection, 6 involve disproportionate numbers of genes implicated in schizophrenia. All 6 of these processes are involved in energy metabolism—the regulation of energy flow through the body and brain.
The group then performed comparative analyses between schizophrenic patients, healthy controls, chimpanzees, and rhesus macaques. The reason other primates are included in such a study is to delineate the evolutionary picture more precisely. If a change in the brain is found in humans but not in chimpanzees or macaques, for example, one can infer that it arose in the human lineage after the divergence of chimps and humans.
The researchers saw distinct differences between the four groups, indicating recent evolutionary changes. They again found that metabolites that play key roles in energy metabolism (e.g. lactate, glycine, choline) were affected.
These results caused the scientists to suggest that recent evolutionary changes in our brain’s energy metabolism may have been integral in the development of the higher order processes we associate with the human brain. These changes would have been necessary to meet increased energy demands as the brain went through increases in size, number of synaptic connections, extent of neurotransmitter turnover, etc.
It seems that brain energy metabolism is negatively affected in disorders like schizophrenia. For example, decreases in blood flow to the prefrontal cortex have been reported when schizophrenics attempt cognitive tasks. The researchers in this study suggest that, after the last 2 million years of rapid evolution, the human brain is basically pushing the limits of its metabolic abilities. Thus, any aberrations in the brain’s energy metabolism capabilities could have drastic results, schizophrenia being one example.
According to this perspective, schizophrenia is a by-product of our rapidly evolving brains. Because we are operating at near-capacity levels, any reduction in our ability to produce and process brain energy can be debilitating. In order to verify this hypothesis, however, much more work examining the correlation between evolution, energy metabolism, and brain disorders will need to be done.
Khaitovich, P., Lockstone, H.E., Wayland, M.T., Tsang, T.M., Jayatilaka, S.D., Guo, A.J., Zhou, J., Somel, M., Harris, L.W., Holmes, E., Paabo, S., Bahn, S. (2008). Metabolic changes in schizophrenia and human brain evolution. Genome Biology, 9(8), R124. DOI: 10.1186/gb-2008-9-8-r124
Monday, August 4, 2008
Science has arrived at credible hypotheses to explain a number of complex waking behaviors. Yet an ostensibly simpler behavior—one that doesn’t vary much from situation to situation or person to person, and involves a minimal amount of physical and mental activity—baffles us, leaving a surfeit of hypotheses that each seem to explain some aspect of it, but none sufficient to explain it as a whole.
That perplexing behavior is sleep. It occupies about one-third of our lives, yet we don’t really know why. It seems to play a number of roles. It acts as a restorative influence on the body, bolstering the immune system and leaving us feeling rested. It also seems to be very important during development, occupying most of an infant’s time as its brain rapidly matures. And indications are that it's an important part of memory consolidation.
But none of these purported reasons for sleep can explain on its own why it may have evolved. For example, it seems that the restorative functions of sleep could be achieved without putting ourselves in a state where we are oblivious to our external environment—something that is very dangerous evolutionarily. The necessity of sleep during development doesn’t explain why adults need to continue doing it, and, while it may be less efficient, memory storage is still possible after sleep deprivation.
The unsatisfying nature of each of these hypotheses on their own has caused some to support an explanation of sleep that stresses its adaptive importance in helping our ancestors remain safe from predators. Sleep incapacitates us at a time (in the dark) when we are most vulnerable, keeping ancient humans out of the paths of nocturnal carnivores. While this might be evolutionarily adaptive, however, it doesn’t explain why we experience minimal conscious ability to monitor our environment during sleep (why not just a restful but conscious state?), or why animals that are predators and not generally hunted sometimes sleep a great deal (e.g. lions).
In addition to lacking a clear purpose for sleep, we have yet to understand the physiological mechanisms behind it. This has caused some scientists to turn to the model organism Drosophila for answers. The sleeping state of Drosophila has much in common with that of mammals. It involves homeostatic and circadian regulation, consists of long periods of immobility, becomes more fragmented with age, etc.
While scientists still haven’t come to a consensus on the reason for sleep, Drosophila research has led to several findings that have aided in the elucidation of its physiology. For example, it has helped to explain the role of neurotransmitters, like serotonin, that play a key role in sleep regulation. Recently, it has led to an amazing discovery: a way to reverse the effects of mental fatigue due to sleep deprivation by manipulating gene expression.
The cognitive deficits that Drosophila develop as a result of sleep deprivation are similar to those exhibited by humans, and the extent of the impairment is correlated with the amount of time spent awake. Learning in Drosophila has been found to depend on a structure known as the mushroom bodies (MBs)—thought to be roughly analogous to our hippocampus—and a dopamine receptor called the dopamine D1-like receptor (dDA1).
Scientists at the Washington University School of Medicine recently investigated whether sleep-loss-induced learning impairments could be reversed in Drosophila. They used a learning task that takes advantage of the flies’ predisposition to fly towards a light. The flies were placed in a T-maze with a lighted tunnel and a dark tunnel. The lighted tunnel also contained a piece of filter paper soaked in quinine, which has a bitter taste and repels flies. Over repeated trials, the flies had to learn to resist their urge to fly down the lighted tunnel by associating the light with the aversive quinine.
Sleep deprivation decreased performance on the learning assay. Additionally, the researchers found that learning the task at all was heavily dependent on a functional dDA1: in mutant flies deficient in this receptor, learning was significantly reduced. They then over-expressed dDA1 in the MBs and, surprisingly, found that this restored post-sleep-deprivation learning to baseline levels.
The authors of the study use these findings to make a couple of postulations about sleep and sleep deprivation. First, they suggest that, although sleep deprivation probably affects several pathways, it may target specific brain areas that are essential for functioning (in this case, the MBs). Also, they hypothesize that one of the functions of sleep may be to restore levels of neurotransmitters essential to proper functioning, like dopamine.
While this finding has already led to speculation on popular science sites about a pharmacological method of negating sleep-deprived cognitive impairments, it’s important to remember that this was a study done in fruit flies, and much work would have to be done to find if it is potentially applicable to humans. Regardless, while the overall purpose of sleep continues to be a mystery, this study does add one more piece to the puzzle in understanding its physiological mechanisms.
Seugnet, L., Suzuki, Y., Vine, L., Gottschalk, L., Shaw, P. (2008). D1 Receptor Activation in the Mushroom Bodies Rescues Sleep-Loss-Induced Learning Impairments in Drosophila. Current Biology. DOI: 10.1016/j.cub.2008.07.028