Monday, June 30, 2008

The Commonalities of Buffalo Wings, Szechuan Peppers, and Ritalin Snorting

Spicy food—you either love it or hate it. Whichever group you fall into, though, there’s a good chance you’ve never thought about how intriguing a natural deception it really is. When we eat spicy food we may experience a variety of sensations (depending on the specific cuisine) ranging from tingling to numbness to painful burning. Yet, a short time later the feeling disappears, leaving no redness, scarring, or irritation behind, indicating that the previous unpleasantness we experienced was—literally—all in our heads.

The substance responsible for the burning sensation one may experience when eating chili or buffalo wings is known as capsaicin. It was identified in the 1800s, and a whole family of similar molecules, called capsaicinoids, was discovered in chili peppers in the 1960s. While capsaicin is an irritant to mammals, birds are largely insensitive to it. Chili pepper seeds are broken down in the digestive tracts of mammals; birds, however, pass the seeds intact. Thus, the capsaicin deters mammalian feeders while leaving the peppers palatable to birds, allowing the seeds to be dispersed efficiently as the birds travel. Hence, the burning feeling caused by capsaicin is probably a mechanism that evolved to promote seed dispersal.

It wasn’t until the late 1990s, however, that scientists began to unravel the mystery behind the phantom sensation caused by capsaicin. To understand it necessitates a little knowledge about neurophysiology. So, I’ll try to summarize half a semester of neurophys in a few short paragraphs.

Neurons (and some other types of cells) communicate with one another through pulses of voltage called action potentials. A neuron maintains a steady baseline voltage, known as its resting potential. The neuron’s membrane is studded with pore-forming proteins called ion channels. When they are open, certain charged particles (ions) can pass in or out through them (which ions, and to what extent, depends on the type of channel and a number of other factors).

Neurons are influenced primarily by four types of ions: K+ and organic anions (A-), which are concentrated inside the cell, and Na+ and Cl-, which are for the most part outside of the cell. The resting potential across a neuron’s membrane is usually about –70mV. The underlying concentration gradients are maintained by the Na+/K+ pump, which constantly pulls K+ in while sending Na+ out.
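
As an aside for the quantitatively inclined, the resting potential can be estimated from these concentration gradients with the Goldman-Hodgkin-Katz equation. Here is a minimal sketch in Python; the concentrations and relative permeabilities are generic textbook-style values assumed for illustration, not measurements from any particular neuron.

```python
import math

# Textbook-style ion concentrations (mM) and relative permeabilities;
# these are generic illustrative values, not data from any one neuron.
K_in, K_out = 140.0, 5.0
Na_in, Na_out = 15.0, 145.0
Cl_in, Cl_out = 10.0, 110.0
pK, pNa, pCl = 1.0, 0.04, 0.45   # K+ dominates the resting membrane

RT_F = 26.7  # RT/F in millivolts at body temperature

# Goldman-Hodgkin-Katz voltage equation. Note the Cl- terms are swapped
# (inside concentration on top) because chloride carries a negative charge.
v_rest = RT_F * math.log(
    (pK * K_out + pNa * Na_out + pCl * Cl_in) /
    (pK * K_in + pNa * Na_in + pCl * Cl_out)
)

print(f"estimated resting potential: {v_rest:.1f} mV")   # roughly -70 mV
```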

When a neuron is excited, voltage-dependent ion channels quickly open and allow a flood of Na+ into the cell. This causes a change in the voltage of the neuron, referred to as depolarization. The rapid depolarization is the trigger that sends a wave of voltage, the action potential, down the axon of the neuron. If it is strong enough, it will reach the end of the neuron and cause the release of neurotransmitter, which binds to receptors on neighboring neurons, opening their ion channels, producing depolarization, and so on.
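
To make the threshold idea concrete, here is a toy leaky integrate-and-fire model: input current depolarizes the membrane, the leak pulls it back toward rest, and a spike is registered whenever a threshold is crossed. This is a deliberately crude stand-in for the real voltage-gated Na+ machinery, and every parameter is an assumption chosen just to show the behavior.

```python
# Leaky integrate-and-fire neuron: a crude illustration of depolarization
# reaching threshold and triggering a spike. All parameters are made up.
V_REST, V_THRESH, V_RESET = -70.0, -55.0, -70.0   # millivolts
TAU, DT = 10.0, 0.1                               # milliseconds

def count_spikes(drive_mv_per_ms, duration_ms=100.0):
    """Integrate the membrane voltage and count threshold crossings."""
    v, spikes = V_REST, 0
    for _ in range(int(duration_ms / DT)):
        # The leak pulls v back toward rest; the input drive depolarizes it.
        v += (-(v - V_REST) / TAU + drive_mv_per_ms) * DT
        if v >= V_THRESH:   # threshold crossed: an action potential fires
            spikes += 1
            v = V_RESET     # and the membrane resets
    return spikes

print("weak input:  ", count_spikes(1.0), "spikes")   # stays subthreshold
print("strong input:", count_spikes(3.0), "spikes")   # fires repeatedly
```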

So, back to buffalo wings, chili, and capsaicin. Capsaicin is a ligand that binds to a specific receptor, the TRP vanilloid receptor subtype 1 (TRPV1). This receptor can also be stimulated with actual heat and physical injury. When it is activated, it opens ion channels that depolarize nerve cells by allowing an influx of Na+. This produces action potentials that travel to the brain and produce what is, in this case, a false sense of pain.

If you’ve ever eaten Szechuan peppers, you’ll know that the feeling they evoke is different from that of chili peppers. Szechuan peppers cause a tingling, sometimes numbing, feeling. Instead of capsaicin, their active ingredient is hydroxy-alpha-sanshool (sanshool). How sanshool acts to produce its numbing effect was something of an enigma until a study published last week in Nature Neuroscience offered an explanation.

According to the authors of the study, sanshool acts on a different group of neurons than capsaicin. Capsaicin affects small-diameter sensory neurons that express proinflammatory peptides (which are responsible for the pain), but sanshool acts on large diameter neurons usually associated with proprioception and detection of touch or vibration.

Sanshool was thought to act by opening Na+ channels, in a manner similar to capsaicin. The Nature Neuroscience study, however, found that sanshool actually inhibits K+ channels, specifically two-pore K+ channels that normally let K+ leak out of the cell and hold it near its resting potential. With that leak blocked, positive charge builds up inside and the neuron depolarizes. The result is still an action potential, but through a different mechanism.
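
A back-of-the-envelope way to see why blocking a K+ leak ends up looking like opening Na+ channels is to treat the membrane potential as a conductance-weighted average of the Na+ and K+ reversal potentials. The sketch below is a two-conductance simplification with invented conductance values; it is not a model of TRPV1 or of the specific two-pore channels from the study.

```python
# Steady-state membrane potential as a conductance-weighted average of the
# Na+ and K+ reversal potentials (a two-conductance simplification with
# invented conductances, in arbitrary units).
E_NA, E_K = 60.0, -90.0   # typical textbook reversal potentials (mV)

def membrane_potential(g_na, g_k):
    return (g_na * E_NA + g_k * E_K) / (g_na + g_k)

rest      = membrane_potential(g_na=0.05, g_k=1.00)   # K+ leak dominates
capsaicin = membrane_potential(g_na=0.50, g_k=1.00)   # TRPV1 opens: more Na+ in
sanshool  = membrane_potential(g_na=0.05, g_k=0.15)   # K+ leak channels blocked

print(f"rest:      {rest:6.1f} mV")
print(f"capsaicin: {capsaicin:6.1f} mV  (added Na+ conductance)")
print(f"sanshool:  {sanshool:6.1f} mV  (reduced K+ leak)")
# Both manipulations push the membrane toward threshold, so both can
# trigger action potentials, just through different channels.
```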

You may be thinking this is a lot of research money being wasted to figure out why food is spicy. But understanding these subtleties of the sensory system is important in that it brings us closer to an overall comprehension of how our senses work. Also, both capsaicin and sanshool have applications as analgesics (ironically, capsaicin can reduce pain when applied topically, possibly because it overstimulates the sensory neurons until they become desensitized and stop signaling).

A side note: A couple of years ago a Harvard researcher, Clifford Woolf, made a novel suggestion. Since the most highly abused prescription drugs, like OxyContin and Ritalin, generally lead to addiction when users begin snorting them, why not mix capsaicin in with them? This, Dr. Woolf asserted, would not affect the pills when swallowed as directed, but would make snorting them like “snorting an extract of 50 jalapeno peppers”.

One thing that has always amazed me about pills like these is how amenable they are to being crushed up and snorted. Elizabeth Wurtzel, in her book about Ritalin addiction, More, Now Again: A Memoir of Addiction, implies that pharmaceutical companies purposely make their drugs like this in order to increase demand and black-market consumption. I don’t know yet whether I agree with her, but when there seem to be options for changing the consistency of the pill, or when deterrents like adding capsaicin are available, and they are ignored, it does start to look suspicious.

Reference:

Bautista, D.M., Sigal, Y.M., Milstein, A.D., Garrison, J.L., Zorn, J.A., Tsuruda, P.R., Nicoll, R.A., Julius, D. (2008). Pungent agents from Szechuan peppers excite sensory neurons by inhibiting two-pore potassium channels. Nature Neuroscience, 11(7), 772-779. DOI: 10.1038/nn.2143

Friday, June 27, 2008

Encephalon #49 Will Be Right Here--Send in Your Submissions

Neuroscientifically Challenged will host its first Encephalon Blog Carnival on July 7th and needs submissions! Please send your post on brain science or related topics to encephalon {dot} host {at} gmail {dot}com by 6pm on July 6th.

Thursday, June 26, 2008

It's All About Timing: Circadian Rhythms and Behavior

Anyone who has ever tried to drastically alter his or her sleep schedule (e.g. going from working days to working nights) knows that it is one of the more difficult biological tasks we can take on. Even altering one’s sleep patterns by a couple of hours (such as the shift experienced by cross-country travelers) can be disruptive enough to make us feel tired, mentally foggy, and grumpy. But why are we so inflexible when it comes to our daily routine? Why are our otherwise adaptable bodies so sensitive to an adjustment of our biological clocks by just a few hours? Perhaps it is because millions of years of evolution have produced a daily body clock so fine-tuned that this sensitivity is adaptive.

Circadian (from the Latin for “around” and “day”) rhythms are endogenous biological patterns that revolve around a daily cycle. They are found in nearly all organisms whose lifespans last more than a day. They are adaptive in the sense that they allow an organism to anticipate changes in its environment based on the time of day, instead of just being a passive victim of them. Thus, to foster that readiness, they usually involve the coordination of a number of physiological activities, such as eating and drinking behavior, hormonal secretion, locomotor activity, and temperature regulation.

A major nucleus of the mammalian brain, located in the hypothalamus and called the suprachiasmatic nucleus (SCN), acts as the master time-keeper in mammals. When the SCN is lesioned (e.g. in rodents), circadian rhythms are completely disrupted. The animals show no adherence to a daily schedule, sleeping and waking at random times (although still sleeping the same total amount each day).

The SCN receives information from ganglion cells in the retina, which keep it apprised of whether it is light or dark out and maintain its synchrony with a diurnal schedule. It is not, however, completely dependent on visual input for keeping time. A number of other environmental cues, such as food availability, social interaction, and information about the physical environment (other than light), are thought to play an important role in the SCN’s ability to maintain regular daily rhythms.

Although the SCN is the center for circadian rhythms, it seems that many individual cells are not directly controlled by the SCN. Instead, they are thought to maintain their own time-keeping mechanisms. Known as peripheral oscillators, these cells are present in a number of organs throughout the body, and can be sensitive to environmental cues as well as the signals of the SCN.

So, how do the neurons of the SCN actually “keep time”? They appear to be controlled by a cycle of gene expression that forms a natural negative feedback loop. During the day, the protein products of the gene CLOCK (circadian locomotor output cycles kaput) and its partner BMAL1 act together as a transcription factor, driving the transcription of the genes encoding the proteins period (PER) and cryptochrome (CRY). When large amounts of PER and CRY have accumulated, they form a complex that acts on CLOCK and BMAL1 to shut down their own transcription. This occurs during the night, and as a result PER and CRY levels fall, freeing CLOCK and BMAL1 to begin transcribing them again around the morning of the next day. Thus, the feedback loop is synchronized with a 24-hour cycle, allowing the clock in the SCN to oscillate at a regular rate.
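
The logic of that loop (activation, slow accumulation of a repressor, then self-shutdown) can be captured in a classic Goodwin-style negative feedback model. The sketch below uses arbitrary units and made-up rate constants, so the period it produces is not 24 hours; it is meant only to show that a delayed negative feedback loop of this shape oscillates on its own.

```python
# Goodwin-style negative feedback loop: a gene product drives production of
# a repressor through two stages, and the repressor shuts the gene off.
# Units and rate constants are arbitrary, chosen only so the loop oscillates.
def simulate(t_max=400.0, dt=0.01):
    x, y, z = 0.1, 0.1, 0.1     # mRNA, cytoplasmic protein, nuclear repressor
    peaks, prev, rising = [], z, False
    for step in range(int(t_max / dt)):
        dx = 1.0 / (1.0 + z**12) - 0.1 * x   # transcription, repressed by z
        dy = 0.2 * x - 0.1 * y               # translation / accumulation
        dz = 0.2 * y - 0.1 * z               # nuclear entry of the repressor
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        if rising and z < prev:              # the repressor just peaked
            peaks.append(round(step * dt, 1))
        rising, prev = z > prev, z
    return peaks

peaks = simulate()
print("repressor peaks at t =", peaks)
print("intervals:", [round(b - a, 1) for a, b in zip(peaks, peaks[1:])])
# After an initial transient the intervals settle to a steady period: the
# kind of self-sustaining rhythm the real loop tunes to roughly 24 hours.
```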

Disorders of the SCN can result in disruptive sleep problems, such as advanced sleep phase syndrome (early sleep and wake times) or delayed sleep phase syndrome (a preference for evenings and difficulty falling asleep until late). More attention is now being focused on the role a dysfunctional circadian system may play in already identified behavioral problems. A recent review in PLoS Genetics examines the potential influence circadian rhythm disturbances may have in disorders like depression, schizophrenia, and even autism.

Circadian disruptions are seen across the major psychiatric disorders, including depression, bipolar disorder, and schizophrenia. Although the exact role circadian rhythms play in these disorders is not yet known, it may be substantial. This is supported by the influence changes in sleep patterns can have on the alleviation of primary symptoms of these disorders. For example, sleep deprivation has been demonstrated to have an antidepressant effect (albeit a short-lived one) in patients. And some affective disorders, such as seasonal affective disorder, seem to be tied directly to day length and its influence on emotional state.

Autism spectrum disorders (ASD) are correlated with low melatonin levels, and a gene responsible for the synthesis of melatonin is considered a susceptibility gene for autism. Mice with a mutant form of this gene demonstrate deficits in social interaction, increased anxiety, and a higher occurrence of seizures. It is postulated that behavioral problems in ASD may be influenced by the failure of an individual’s circadian clock to effectively register social and environmental cues.

Variants of a number of time-keeping genes, such as PER1, CLOCK, and CRY have been found to be associated with behavioral disorders. It has yet to be determined if these variations are causative, contributive, or unrelated to the disorders. Keeping in mind how influential a disturbance of circadian rhythms can be in our daily lives, however, it seems logical to investigate the possibility of their contribution to pathologies.

Reference:

Barnard, A.R., Nolan, P.M., Fisher, E.M. (2008). When Clocks Go Bad: Neurobehavioural Consequences of Disrupted Circadian Timing. PLoS Genetics, 4(5), e1000040. DOI: 10.1371/journal.pgen.1000040

Wednesday, June 25, 2008

Changes in Gene Expression and Addiction

As I discussed in a post last week, addiction seems to correspond to abnormalities in dopamine (DA) transmission throughout the reward areas of the brain. Specifically, initial uses of a drug tend to correlate with low levels of dopamine receptor availability in the nucleus accumbens (NAc), while long-term use affects DA transmission throughout the entire striatum (the NAc sits in the ventral, or lower, portion of the striatum).

The striatum is a subcortical region of the brain, and part of the mesocorticolimbic DA pathway, which is integral to the evaluation and appreciation of rewards (like drugs). Striatum is from Latin, and means striped. It is so named because the entire region has a striped appearance, due to the alternating bands of gray and white matter that make it up.

The changes that occur in the striatum are postulated to be responsible for the long-lasting behavioral changes that drug addicts can experience, such as cravings for drug use, an inability to enjoy previously rewarding experiences, and proneness to relapse. It has been suggested that these changes must be preceded by some sort of synaptic remodeling in order to have such a long-lasting effect, and those synaptic changes could be a result of fluctuations in DA transmission. How exactly they occur, however, has yet to be elucidated.

A study to be published in an upcoming issue of Nature may shed some light on the mechanism behind these changes. It involves gene expression, and a phosphoprotein known as DARPP32 (dopamine- and cyclic AMP-regulated phosphoprotein with molecular weight 32 kDa).

A phosphoprotein is a protein that has had a phosphate group attached to it, through a process known as phosphorylation. Phosphorylation is an important event in cells, as it often acts as the switch that turns enzymes and receptors on; it is carried out by enzymes called kinases. Dephosphorylation, carried out by proteins called phosphatases, can “turn off” those same enzymes.

When dopamine 1 receptors (D1R) are stimulated, they in turn activate DARPP32, which inhibits a phosphatase known as protein phosphatase 1 (PP1). This signaling cascade affects the phosphorylation of numerous proteins in the cytoplasm and nucleus of a cell.

In the Nature study, the researchers found that the administration of amphetamine, cocaine, or morphine to mice caused DARPP32 to accumulate in the nuclei of striatal neurons. Further studies of neural cultures indicated that dopamine prevents a specific DARPP32 site, Ser97, from being phosphorylated. Phosphorylation at Ser97 appears to be what drives the export of DARPP32 from the nucleus; without it, DARPP32 builds up inside the nucleus.
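
A simple way to picture why losing phosphorylation at an export site causes nuclear build-up is a two-compartment shuttling model in which the export rate scales with the fraction of DARPP32 phosphorylated at Ser97. The rates below are invented for illustration and are not measurements from the study.

```python
# Toy two-compartment model of DARPP32 shuttling between cytoplasm and
# nucleus. Export from the nucleus requires phosphorylation at Ser97, and
# dopamine (here, the drugs above) lowers that phosphorylation.
# All rate constants are invented for illustration.
def nuclear_fraction(ser97_phospho_fraction, import_rate=1.0,
                     max_export_rate=3.0, t_max=50.0, dt=0.01):
    cytoplasm, nucleus = 1.0, 0.0
    export_rate = max_export_rate * ser97_phospho_fraction
    for _ in range(int(t_max / dt)):
        flux_in = import_rate * cytoplasm      # entry into the nucleus
        flux_out = export_rate * nucleus       # phospho-Ser97-dependent exit
        cytoplasm += (flux_out - flux_in) * dt
        nucleus += (flux_in - flux_out) * dt
    return nucleus / (nucleus + cytoplasm)

print("Ser97 mostly phosphorylated (baseline):", round(nuclear_fraction(0.9), 2))
print("Ser97 dephosphorylated (after drug):   ", round(nuclear_fraction(0.1), 2))
# Less phospho-Ser97 means slower export, so DARPP32 piles up in the nucleus.
```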

When DARPP32 accumulates in the nucleus, it causes the phosphorylation of a histone, H3. Histones are proteins that DNA winds around to make chromatin, the protein and DNA complex that makes up chromosomes. Phosphorylation of histones often affects chromatin structure, and gene expression as a result.

Mice with mutations in the Ser97 site demonstrated long-lasting aberrations in their behavioral responses to drugs and other rewards. They showed decreased acute locomotor responses to morphine administration, along with a reduced locomotor sensitization to cocaine. Their motivation to obtain a food reward was also diminished.

Thus, this signaling pathway may be responsible for one of the most potent behavioral changes in addiction, when euphoria achieved from the drug diminishes along with the pleasure once obtained from other rewards. This change can contribute to compulsive drug seeking, as an addict obsessively continues to seek the pleasure once associated with their drug of choice. If altered gene expression is responsible for these changes, it would help to explain why they can persist for such a long period of time after the cessation of drug use—sometimes continuing to affect the behavior of an addict for years, and often making their efforts to stay sober much more difficult.

My Evolutionarily Adaptive Response to Dog Poop

Any dog owners out there who (like me) don’t have their own yard in which to let their dog run wild will probably agree that picking up after your dog is the most unpleasant daily aspect of having one. Every time I lean down to scoop up a pile of my dog Zooey’s regular gift to me, my nose wrinkles up, my eyes squint—and occasionally I may gag a little bit.

This expression of disgust is a common one. Charles Darwin, in The Expression of the Emotions in Man and Animals, noticed that some expressions like this occur throughout the world in many different cultures, and even in some animals. Thus, he hypothesized, they may have a biological rather than environmental origin. If so, Darwin suggested, they probably also have an adaptive purpose.

Recently a group of researchers from the University of Toronto investigated this 130-year-old hypothesis. They took two expressions that are widely considered to be universal: the wrinkled nose, raised lip, and narrowed eyes of disgust, and the wide eyes and flared nostrils of fear. They developed computer-generated images of faces displaying a typical rendition of each of these visages, then asked volunteers to recreate them while they underwent breathing and vision tests.

They found that each expression had specific effects on breathing and vision that could be considered adaptive. The look of disgust limited air flow and vision, a reaction which could be beneficial in keeping potentially noxious substances out of the eyes and mouth. The fearful expression improved peripheral vision, made eye movement quicker, and increased air flow—all responses that could theoretically make someone more prepared to face danger.

While these results may seem obvious in hindsight, I must admit it’s not something I ever thought about before when I picked up after Zooey.

Wednesday, June 18, 2008

The Darwinian Paradox of Homosexuality

Homosexuality has been an acknowledged aspect of human society since our earliest recorded history. In some civilizations, such as ancient Greece, it was relatively common and accepted. It is also a behavior that is not specific to humans: it has been documented in over 500 non-human animal species, including penguins, bonobos, and grizzly bears.

Over the past few decades, evidence has begun to accumulate that homosexuality is a behavior that appears primarily due to biological or genetic influences. While environmental factors may play a part in the expression of a homosexual phenotype, most scientists would suggest their influence is not powerful enough to cause an otherwise heterosexual organism to become homosexual (although there are environmental conditions that may encourage transient homosexual behavior, e.g. captivity).

This dependence on biological factors, however, creates a paradox for evolutionary theorists. The assumed goal of all organisms, and thus of evolution, is reproduction—the passing on of one’s genes to a new generation. Since homosexuals reproduce at a much lower rate than the heterosexual population, one might think a genetic basis for homosexuality—even one that involved several different genes—would by now have disappeared from the gene pool.

A number of hypotheses have been proposed to explain this paradox, although none of them has gained the full support of the scientific community. One early explanation, which has for the most part fallen out of favor, is kin selection. Kin selection occurs when an organism acts in a way that fosters the reproductive success of its relatives, even at a cost to its own reproductive success. In this scenario, childless homosexuals might put more effort into helping raise nieces or nephews. These relatives might carry some of the same genes as the homosexual, and thus if they eventually reproduce they may pass on genes essential for homosexuality. Evidence in support of this theory is limited, however, and most feel it doesn’t tell the complete story.
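
The kin selection idea is usually summarized by Hamilton's rule: a gene for helping can spread when relatedness times the benefit to the relative exceeds the cost to the helper (r × b > c). A quick check with invented numbers shows how steep that requirement is for aunts and uncles, where relatedness is roughly 0.25.

```python
# Hamilton's rule: an allele for helping relatives can spread when
# r * b > c, where r is relatedness, b the extra offspring the relative
# gains, and c the offspring the helper forgoes. Numbers are invented.
def helping_favored(relatedness, benefit, cost):
    return relatedness * benefit > cost

# Helping nieces and nephews (relatedness of roughly 0.25):
print(helping_favored(0.25, benefit=2.0, cost=1.0))   # False: 0.5 < 1.0
print(helping_favored(0.25, benefit=6.0, cost=1.0))   # True:  1.5 > 1.0
# At r = 0.25 the relative must gain more than four extra offspring for
# every one the helper gives up, which is part of why kin selection alone
# is a strained explanation here.
```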

More accepted explanations today include: 1) overdominance, 2) maternal effects, and 3) sexually antagonistic selection. Overdominance occurs when a heterozygous version of a gene provides an organism with some type of reproductive advantage. For example, a straight man might carry one copy of an allele that, if he had two copies, would increase his chances of being gay; in the heterozygous state, however, it could result in increased sperm motility. That advantage means the allele keeps being passed on, and the predisposition to homosexuality it carries is passed on with it.
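
The balancing act behind overdominance is easy to demonstrate with a standard one-locus selection model: if the heterozygote is the fittest genotype, the allele settles at a stable intermediate frequency rather than vanishing. The fitness values below are arbitrary illustrations, not estimates for any real gene.

```python
# One-locus, two-allele selection with heterozygote advantage
# (overdominance). Fitness values are arbitrary and purely illustrative.
w_AA, w_Aa, w_aa = 1.00, 1.05, 0.85    # the heterozygote is the fittest

def next_generation(q):
    """q is the frequency of allele 'a'; return its frequency after selection."""
    p = 1.0 - q
    mean_fitness = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa
    marginal_fitness_a = p*w_Aa + q*w_aa    # average fitness of an 'a' copy
    return q * marginal_fitness_a / mean_fitness

q = 0.01                                   # start the allele out rare
for _ in range(500):
    q = next_generation(q)

print(f"simulated equilibrium frequency of 'a': {q:.3f}")
expected = (w_Aa - w_AA) / ((w_Aa - w_AA) + (w_Aa - w_aa))
print(f"analytic prediction:                    {expected:.3f}")
# The allele is neither lost nor fixed; selection holds it at a stable
# intermediate frequency despite the cost to 'aa' homozygotes.
```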

The maternal effects hypothesis suggests a fetus is influenced by the environment of the mother’s womb, resulting in changes that predispose one toward homosexuality. This hypothesis was proposed after evidence began to appear that homosexuality in males is predicted by high numbers of older brothers. In trying to make sense of this statistic, scientists postulated that a mother might build immunity to male-specific antigens with each birth of a male child. The progressive immunization to these male antigens may eventually affect the brain of a male fetus. This could happen, for example, if antibodies crossed the placenta and attacked male-specific regions of the brain necessary for sexual differentiation.

Sexually antagonistic selection, which appears to have the most evidential support at this point, is a mechanism whereby genes spread through a population by giving a reproductive advantage to one sex while disadvantaging the other. In the case of homosexuality, the mother may have increased fertility at the expense of her child’s propensity to procreate. The concept can be extended to female homosexuality as well: a trait that reduces reproduction in one sex can persist if it sufficiently boosts fertility in the other.

Although the evolutionary explanation of homosexuality is still elusive, evidence for a biological basis for homosexuality continues to accrue. An article published this week in the Proceedings of the National Academy of Sciences reports that the brains of homosexuals have similarities to the brains of heterosexuals of the opposite sex. Using magnetic resonance imaging (MRI), researchers compared heterosexual and homosexual brains. They found that the brains of heterosexual men and homosexual women have a slightly larger right hemisphere than the brains of gay men and straight women. Sex differences in the relative size and activity of the two hemispheres have been reported before.

The researchers in this study also found, using positron emission tomography (PET), that the connectivity of the amygdala, an area of the brain important in emotion, was more similar in lesbians and straight men, and gay men and straight women, respectively.

These findings provide further evidence for homosexuality having a biological origin. Of course they don’t preclude the possibility that there are environmental influences on the expression of homosexuality. They do, however, make the oft-repeated argument that homosexuality is a “choice” much less plausible. As the correlation between brain structure and a particular behavior becomes stronger, the room left for choice usually shrinks proportionally.

Tuesday, June 17, 2008

Hox Genes and Neurodevelopment

In the 1980s, scientists knew surprisingly little about the role genes play in the development of an embryo. The discovery of a particular group of genes, however, known as Hox genes, drastically improved our understanding of embryology. At the same time it revolutionized genetics and developmental biology.

In the 1890s, an English biologist named William Bateson was repeatedly amazed when he came across “freaks” of nature in his studies: a moth born with wings where its legs should be, for example, or an insect born with legs for antennae. Bateson gave these aberrations the name homeosis (meaning the transformation of one body part into another). In 1915, another biologist, Calvin Bridges, noticed homeosis in fruit flies that were born with an extra pair of wings. Intrigued, he kept the strain alive through selective mating.

In the 1980s, scientists were finally able to isolate the gene that was causing the extra wing mutation in the fruit fly. They traced it back to a small group of genes, which they called Hox genes. They found that, by manipulating these genes, they could create virtual monsters, such as flies with legs that came bursting out of the middle of their heads.

The creation of these monsters, however, helped to elucidate the function of Hox genes. Hox is short for homeobox, the name for the DNA sequence these genes have in common. Hox genes become active in early embryonic development. Their job is to designate which parts of the embryo will turn into which body parts (legs, wings, head, etc.). Hox genes are so specific that, if one that controls limb development is activated in the head region of the embryo, a limb will grow out of the head.

Scientists began to find these types of master control genes in every embryo they examined, regardless of the organism. Even more surprisingly, the genes are considerably similar across species. Scientists found they could replace a defective Hox gene in a fly with one from a mouse without any ill effects. Hox genes and other master control genes are present in humans as well, and play the same role in embryonic development. This congruity across species indicates that Hox and other master control genes are probably an ancient evolutionary mechanism, one that arose before much speciation had taken place and that remains present and active today.

While understanding Hox and master control genes has led to great advancements in the comprehension of embryonic development, the development of the brain has remained murkier. Specifically, scientists have had trouble figuring out how specialized neurons in our brain are formed in one region, then migrate to the areas where they must eventually settle in order to function properly.

A study published online this week in PLoS Biology may shed some light on the issue, however, and Hox genes are an important part of the explanation. The authors of the study investigated pontine (from the pons) neurons in mice. Pontine neurons are formed in the rear of the brain and then must migrate through the brainstem to eventually become part of the precerebellar system. This is an area that is necessary for coordinated motor movement, and provides the cerebellum with its principal input. So the question is, once these pontine neurons are formed, how do they “know” they have to travel to the precerebellar region?

The researchers who conducted this study found Hox genes to be the guide that leads the neurons to their appropriate resting place. A specific Hox gene, Hoxa2, was found to influence neuronal migration, keeping the neurons from going astray through a pathway of molecular signaling. The Hoxa2 gene regulates the expression of a particular receptor, known as Robo. This receptor binds to a chemical called Slit, and the resulting signal prevents the neurons from being drawn toward other chemoattractants. This allows the neurons to ignore outside influences and travel directly to the precerebellar region, where they belong. When the scientists knocked out the Hoxa2 gene, the pontine neurons were unable to resist being drawn to chemoattractants and often didn’t reach their final destination.
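
As a cartoon of that logic, consider a one-dimensional migration model in which a neuron heads toward its target but a chemoattractant sitting off-route can capture it; Robo/Slit signaling is represented simply as silencing that distracting pull. Everything here (positions, strengths, the way Robo is modeled) is an assumption for illustration, not a description of the actual biology in the paper.

```python
# Cartoon of guided migration in one dimension. A neuron starts at x = 0
# and should reach the precerebellar target at x = 10, but a distracting
# chemoattractant sits off-route at x = 4. Robo/Slit signaling (downstream
# of Hoxa2 in this cartoon) is modeled simply as silencing that distraction.
def migrate(robo_active, steps=600, dt=0.05):
    x, target, distractor = 0.0, 10.0, 4.0
    for _ in range(steps):
        drive = 1.0 if x < target else 0.0            # pull toward the target
        if not robo_active:
            # Without Robo/Slit, a nearby attractant pulls the neuron toward
            # itself (strength fades with distance), trapping it off-route.
            proximity = max(0.0, 1.0 - abs(distractor - x) / 3.0)
            drive += 4.0 * (distractor - x) * proximity
        x += drive * dt
    return round(x, 1)

print("Hoxa2/Robo intact:", migrate(robo_active=True))    # reaches the target
print("Hoxa2 knocked out:", migrate(robo_active=False))   # stalls near x = 4
```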

This adds some insight into the process of neuronal migration, something that has puzzled neuroscientists for years. It is just the beginning of the story, however. Not all of the neurons responded to Hoxa2, suggesting there may be other Hox genes involved in brain development; scientists will continue to search for the others that are part of the process. The success of this study, however, at least provides an indication that Hox genes, some of the most highly conserved genes in our bodies, may also be responsible for some of the most important aspects of brain development.

Reference:

Geisen, M.J., Meglio, T.D., Pasqualetti, M., Ducret, S., Brunet, J., Chedotal, A., Rijli, F.M., Zoghbi, H.Y. (2008). Hox Paralog Group 2 Genes Control the Migration of Mouse Pontine Neurons through Slit-Robo Signaling. PLoS Biology, 6(6), e142. DOI: 10.1371/journal.pbio.0060142

Saturday, June 14, 2008

Why Pretzels and Gunshot Wounds Make Us Thirsty

I re-watched one of my all-time favorite movies the other night: Unforgiven. After William Munny (Clint Eastwood) shoots his first victim, the camera zooms in on the fallen cowboy as he begins complaining about how thirsty he is, begging his companions for water. In a moment of compassion, Munny agrees to put down his gun and let the cowboy’s friends bring him a canteen.

You’ve probably all seen a similar scene before in another movie, if not this one (hopefully you’ve never seen it in person). Victims of gunshot wounds, or other wounds that involve a drastic loss of blood, are often portrayed as being very thirsty. I’m not sure if the reason why this occurs is common knowledge, but in case it’s not, I thought I would write a quick explanation.

First, a little about water in the body. The cells in our body not only contain water, but also are surrounded by what is called interstitial fluid. This fluid bathes the cells in a “seawater” type solution that contains water, sodium (Na), amino acids, sugars, neurotransmitters, hormones, etc. The cell is normally in an isotonic, or balanced, state in relation to its extracellular environment, meaning water doesn’t generally leave or enter the cell at large rates.

Water is also an important constituent of blood. It is essential for keeping blood volume at a level that allows for proper functioning of the heart. If volume gets too low, the atria of the heart don’t fill completely, and the heart cannot pump properly.

The need to keep the body’s fluid balance steady gives rise to two types of thirst, each triggered when that equilibrium is disturbed: osmometric thirst and volumetric thirst. Osmometric thirst occurs when the osmotic balance between the amount of water in the cells and the water outside the cells is disturbed. This is what happens when we eat salty pretzels. The Na is absorbed into the blood plasma, which disrupts the osmotic balance between the blood plasma and the interstitial fluid. This draws water out of the interstitial fluid and into the plasma, which in turn upsets the balance between the cells and the interstitial fluid. The result is water leaving the cells to restore the balance.
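
The direction of these water shifts follows from a simple rule: water redistributes between compartments until the solute concentrations on both sides match. A back-of-the-envelope sketch, with made-up solute amounts and volumes in arbitrary units:

```python
# Water redistributes between two compartments until solute concentrations
# match. Solute amounts and volumes are invented, in arbitrary units.
def equilibrate(solute_a, volume_a, solute_b, volume_b):
    """Split the fixed total water so both sides end at equal concentration."""
    total_volume = volume_a + volume_b
    total_solute = solute_a + solute_b
    new_volume_a = total_volume * solute_a / total_solute
    return round(new_volume_a, 2), round(total_volume - new_volume_a, 2)

cells = (300, 1.0)            # (solute, water volume) inside the cells
outside = (300, 1.0)          # interstitial fluid / plasma
print("balanced:   ", equilibrate(*cells, *outside))

salty_outside = (360, 1.0)    # pretzel Na+ absorbed into the extracellular side
print("after salt: ", equilibrate(*cells, *salty_outside))
# The cell compartment gives up water to the saltier extracellular fluid;
# that cellular shrinkage is what osmoreceptors translate into thirst.
```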

The disruption in the interstitial solution is recognized by neurons called osmoreceptors, located in the region of the anterior hypothalamus. They send signals that cause us to drink more water, in order to restore the osmotic balance between the cells and the surrounding fluid. In the case of pretzel eating, if we don’t drink more water, eventually the excess Na is simply excreted by the kidneys.

Now, to the graver situation of a gunshot wound, and the other type of thirst: volumetric. Volumetric refers to the volume of the blood plasma, which is highly dependent upon water content of the body. As mentioned above, maintaining an adequate blood plasma volume is essential to proper functioning of the heart. If it gets too low, the heart can’t pump effectively.

When someone is injured and loses a lot of blood volume (a state known as hypovolaemia), less blood reaches the kidneys. This causes the kidneys to secrete an enzyme called renin, which enters the blood and cleaves a circulating protein called angiotensinogen, ultimately yielding the hormone angiotensin. One form of angiotensin (angiotensin II) causes the pituitary gland and adrenal cortex to secrete hormones that prompt the kidneys to conserve water as a protective measure. Angiotensin II also acts on the subfornical organ (SFO), a structure that lies just outside the blood-brain barrier; through the SFO, angiotensin II stimulates thirst.

There are also receptors in the heart that recognize decreases in blood plasma volume. Known as atrial baroreceptors, they detect these reductions and subsequently stimulate thirst by signaling neurons in the medulla. So, when someone is shot and losing a lot of blood, it is the drop in blood plasma volume that, through both of the pathways above, drives the brain to generate thirst.

The processes that stimulate thirst are really much more complicated than this brief explanation suggests. But I thought this was enough to give a general idea of why salty foods and gunshot wounds have similar effects on our desire to drink water.

Friday, June 13, 2008

Impulsivity and a Predisposition to Addiction

The improved understanding of addiction that has emerged over the past few decades has transformed the question of whether or not addiction is a choice into a search for predisposing factors that make the risk of addiction much higher for some people than for others. This is a drastic improvement over the days when addiction was regarded as a purely voluntary process—a decision made by degenerates who had no motivation to live a successful life. It has also led to a great deal of research aimed at isolating specific genotypes and phenotypes that confer susceptibility to drug abuse.

Much of that research has focused on two such phenotypes: the sensation or novelty-seeker, and the impulsive personality. Novelty-seeking has been considered a predisposing factor to drug abuse for some time. This was reinforced by studies in the 1980s that found it to be correlated with the acquisition of cocaine self-administration (SA) in rats. Last year, however, a group of researchers at Cambridge found that an impulsive phenotype in rats was more of a predisposing factor to the compulsive use of cocaine that is traditionally associated with addiction.

Novelty-seeking and impulsivity may sound very similar, but there is a distinction between the two types of behavior. In rats, novelty-seekers have a greater tendency to explore a new environment than to stay in one place. Impulsive rats perform poorly on trained responses that will reward them with food, acting too quickly and making premature responses. When it comes to cocaine use, novelty-seeking rats (known as high responders, or HR) have shown a tendency to acquire the SA of cocaine more rapidly. Highly impulsive (HI) rats, on the other hand, don’t acquire SA more quickly, but they are more prone to allow their cocaine use to progress from occasional to compulsive. Thus, HR rats are more likely to try cocaine, but HI rats are more liable to develop an addiction to it.

Another group of Cambridge researchers (including a few who were also involved in the study mentioned above) published a study in this week’s Science that further explores the difference between HR and HI rats. In order to gain greater relevance to human addiction, they focused on the observation in rats of actual Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV) criteria for addiction, specifically 1) an increase in motivation to obtain the drug, 2) an inability to refrain from drug taking, and 3) maintained drug use despite aversive consequences (punishment in this case). Each of the rats was given an overall addiction score based on these criteria (0-3), which corresponded to a score on the Addiction Severity Index (ASI)—a valid, reliable, and widely used addiction assessment tool.
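
The scoring itself is just bookkeeping: count how many of the three criteria each animal meets. The sketch below uses invented rats, invented measurements, and an assumed top-third cut-off on each criterion, so it mirrors only the general idea of the 0-3 score, not the study's actual procedure.

```python
# Sketch of the 0-3 addiction score: count how many of the three
# addiction-like criteria each rat meets. Rats, measurements, and the
# top-third cut-off are all invented for illustration.
measures = {
    # rat: (motivation, persistence without drug, use despite punishment)
    "rat_A": (0.9, 0.8, 0.7),
    "rat_B": (0.4, 0.9, 0.2),
    "rat_C": (0.2, 0.1, 0.3),
    "rat_D": (0.8, 0.7, 0.9),
    "rat_E": (0.3, 0.2, 0.1),
    "rat_F": (0.5, 0.4, 0.6),
}

def top_third_cutoffs(data):
    """For each criterion, the value marking the top third of the group."""
    cutoffs = []
    for i in range(3):
        ranked = sorted((vals[i] for vals in data.values()), reverse=True)
        cutoffs.append(ranked[len(ranked) // 3 - 1])
    return cutoffs

cutoffs = top_third_cutoffs(measures)
for rat, vals in measures.items():
    score = sum(v >= c for v, c in zip(vals, cutoffs))
    print(f"{rat}: addiction score {score}")
```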

The HI rats were found to have the highest addiction scores, being represented largely in the 2 to 3 point range, whereas low-responder (LR), HR, and low-impulsivity (LI) rats usually had a score of 0 or 1. These results reinforce previous findings that the propensity to use cocaine and the tendency to progress to addiction are influenced by separate (although perhaps overlapping) behavioral traits.

These two patterns of drug use are very distinct in humans as well. The percentage of people who try or experiment with drugs and alcohol is much higher than the percentage who actually become addicted, mirroring the separation in rats between acquiring SA and progressing to compulsive use. It also makes sense that impulsivity would be the greater liability when trying to stave off the urges of an addiction, while novelty-seeking would be a primary factor in spurring an initial interest in drug experimentation.

As might be expected, it also seems that different brain regions in the rat control these different types of drug use. Previous studies have found that addiction (in rats and primates) is correlated with lower levels of the availability of a specific dopamine (DA) receptor—the D2/D3 receptor. Cambridge researchers last year found that limited D2/D3 receptor availability in the ventral striatum/nucleus accumbens (NAc) of rats exists before drug use. As other studies have found limited D2 availability throughout the striatum during and after drug addiction, this group suggested that, while NAc D2 deficiencies may predate drug use, the abuse itself (and the excessive DA transmission associated with it) causes downregulation of DA receptors throughout the striatum. Therefore, the reduction in D2 availability in the dorsal striatum may represent more closely the switch to compulsive drug use seen in HI rats. A deficiency in NAc D2 receptors, however, could be a risk factor for the initial use of the drug.

These developments in our understanding of addiction serve several important purposes. They solidify the association between a predisposition to drug abuse and addiction itself. As more data indicate that specific phenotypes and genotypes predispose organisms to drug use and abuse, the question of the involvement of choice in addiction will slowly fade into the background as attempts to treat its neurobiological underpinnings become of primary importance. And, as scientists come to understand the neurobiological influences on drug abuse more explicitly, social attitudes toward addiction may need to be revised. Many questions, both philosophical and legal in nature, will have to be examined. For example, should a drug addict be imprisoned for repeated drug possession arrests when no other crime is involved? How responsible should addicts be held for neurophysiological differences that make the decision to abstain from drugs much more difficult for them than it is for someone without such deficits?

While the social reverberations may not be felt for some time, research will continue into both the genotypes and phenotypes that correlate with addiction, in the hopes of finding improved treatments for the affliction in the near future. Identifying the genes associated with the pre-drug decrease in D2 receptor density, for example, could lead to the development of a treatment to correct the deficiency (for information on a promising vaccination treatment for addiction, see this post). Already a staggering 1,500 genes associated with addiction have been identified, along with eighteen molecular pathways for the disorder, five of which are shared by common drugs of abuse. So, although there is much work to be done, a solid foundation has been laid. And, with each new finding, we come a bit closer to comprehending what was previously one of the most misunderstood brain disorders.


Reference:

Belin, D., Mar, A.C., Dalley, J.W., Robbins, T.W., Everitt, B.J. (2008). High Impulsivity Predicts the Switch to Compulsive Cocaine-Taking. Science, 320(5881), 1352-1355. DOI: 10.1126/science.1158136

Tuesday, June 10, 2008

Still Experiencing Technical Difficulties...

AT&T has found a way to turn a simple transfer of high speed internet service to a new address into a very complex, convoluted process. I will be back by Friday with some new postings, and a new internet provider. Thanks to anyone who is still hanging in there with me!

Friday, June 6, 2008

Be Back Soon...

To anyone who has checked in with this blog over the past week, I apologize for the lack of postings. I am in the process of moving to a new apartment and currently without internet access. I hope to be back up and running by the end of this weekend.