In this video I discuss the neuromuscular junction. The term neuromuscular junction refers to the synapse between a motor neuron and a muscle fiber; activity here is essential for muscle contraction and movement. At the neuromuscular junction, the synaptic boutons of a motor neuron are situated over a specialized region of muscle called the end plate. The synaptic boutons release acetylcholine, which travels across the synaptic cleft and activates acetylcholine receptors on the muscle fiber. This excites the muscle cell, leading to muscle contraction. Excess acetylcholine is removed from the synaptic cleft by the enzyme acetylcholinesterase.
Welcome to Neuroscientifically Challenged! All of the content for the site is collected on the home page, but if you're looking for specific types of content you can use the menu bar above. By clicking on Articles, you'll find links to blog articles on a variety of different neuroscience topics. The Know Your Brain link will take you to a listing of reference articles, each of which deals with a different part of the nervous system. Clicking on the 2-Minute Neuroscience Videos link will take you to an assortment of 2-minute videos that each teach you about a different aspect of neuroscience. And the Glossary contains a large selection of definitions for common neuroscience terms.
While it may be difficult to imagine in a day and age when psychiatric medicines are advertised as a way to treat nearly every mental disorder, only 65 years ago targeted and effective psychiatric medicines were still just an unrealized aspiration. In fact, until the middle of the 20th century, the efficacy and safety of many common approaches to treating mental illness were highly questionable. For example, one method of treating schizophrenia that was common in the 1940s and 1950s, known as insulin coma therapy, involved repeatedly administering insulin to induce a coma---and then rousing the patient from the coma with a sugar solution. Although in hindsight this was an ineffective and dangerous way of treating the disorder, that realization didn't spread throughout the medical community until the late 1950s (about 30 years after the introduction of the procedure). Other methods used at the time (e.g. lobotomy, electroconvulsive therapy) were applied in a similarly precarious fashion while providing little to no real improvement in symptomatology. Pharmacological approaches weren't much better, as the drugs used to treat psychiatric disorders tended to be very non-specific, and often dangerous. For example, agents like chloral hydrate or barbiturates might be used to calm a schizophrenic patient whose erratic symptoms made them difficult to pacify. These drugs, however, didn't target any pathology specific to schizophrenia; they simply caused massive sedation and in the process posed a variety of risks ranging from dependence to overdose.
In the 1950s, however, treatment of psychiatric disorders began to change. The decade saw the identification of the first true antipsychotic drugs to treat schizophrenia, the first antidepressants, and the first benzodiazepines to treat insomnia and anxiety. Indeed, the 1950s ushered in what many refer to as the "psychopharmacological revolution," an appellation that reflects the unprecedented number of pharmacological treatments for psychiatric illnesses developed during the second half of the 20th century. Over this time, pharmacological treatments would surpass all other approaches as the most common ways to address psychiatric disorders. And many point to the first antipsychotic drug, chlorpromazine, as the drug that started it all. Given the major impact it had on psychiatry, medicine, and society, the humble origins of chlorpromazine are perhaps surprising: the discovery of the drug can be traced back to a black sludge formed in the process of converting coal to fuel.
From coal tar to dyes
When coal is transformed into fuel, one of the byproducts left behind is a thick brown or black liquid known as coal tar. Coal tar smells strongly of naphthalene---one of its chemical constituents and the main ingredient in mothballs---and its appearance and odor probably wouldn't give anyone the impression that there was anything extraordinary about it. Coal tar, however, is made up of a very rich mixture of organic chemicals. Over the years, many of these chemicals---like naphthalene and benzene---were isolated from coal tar and found domestic or industrial uses either in their unaltered form or as starting points for other derivative chemicals. One such use was as synthetic dyes for clothing or other fabrics.
In the 1800s, dyes had to be obtained from natural sources; for example, blue or indigo dye was extracted from tropical plants of the Indigofera genus, while yellow was obtained from the flowers of Crocus sativus, or the saffron plant. Relying solely on natural sources for dyes was expensive and resource-dependent; the process of extracting dyes from natural sources also tended to be somewhat complicated. Thus, when chemists began to discover ways of creating synthetic dyes from cheap and readily available substrates, synthetic dyes quickly supplanted natural dyes as the most common method for staining a variety of products ranging from clothing to upholstery. In the process, dye synthesis became the foundation on which a new industry that dealt in the production and use of chemicals was built. The growth of the chemical industry would not just change manufacturing and commerce, but also science---as for many it provided a necessary justification for the existence of chemistry as a scientific field in its own right.
Pioneering chemists working on dye synthesis had unintentionally found that aniline, one of the organic constituents of coal tar, could---with the appropriate reaction---produce dyes that were brilliantly purple, magenta, red, or almost any color imaginable. It was then discovered that many other components of coal tar---like benzene, toluene, naphthalene, phenol, and anthracene---could be used to synthesize dyes as well. These discoveries made dyestuffs an extraordinarily lucrative industry in the second half of the nineteenth century. The money made selling synthetic dyes helped several major firms like BASF, Bayer, and Sandoz become global powers; the fact that these companies (in some form---Sandoz is now Novartis) are still extremely influential in the chemical and/or pharmaceutical markets indicates that the impact of synthetic dye production can still be felt today.
From dyes to pharmaceuticals
The use of coal tar wasn't limited to dyes, however. Scientists found the rich organic makeup of coal tar could be exploited to produce a variety of substances ranging from paints to cosmetics. As they experimented with this bountiful substrate, researchers also began to find that some of the products they derived had potential as medicines. The first of these substances to be marketed for medicinal purposes were antipyretic, or fever-reducing, drugs. The commercial success of some of these drugs made it commonplace to test dyes and related compounds for potential therapeutic effects.
At around the same time these antipyretic substances were discovered, chemists were working with another coal tar-derived dye called methylene blue. While examining the structure of methylene blue, the German chemist August Bernthsen discovered that it was a derivative of a previously unknown compound that would come to be called phenothiazine. Phenothiazine derivatives were subsequently found to have antiparasitic properties, and many were synthesized in the hopes of finding treatments for malaria. At the time quinine was the only available antimalarial treatment, and the need for an alternative was acutely felt when events like the World Wars limited access to the natural source of quinine, the cinchona tree.
Investigation of the phenothiazines led to some successes in malaria treatment. One in particular was the drug quinacrine, a methylene blue derivative that ended up being used to treat malaria as widely as quinine itself. Many of the drugs that came out of these investigations, however, did not prove to be effective antimalarials. Instead of abandoning them altogether, though, researchers investigated their potential uses in treating other ailments. In the process, it was noted that some of the drugs had sedative properties, and one line of research explored their potential use in preventing surgical shock, a condition that can cause extremely low blood pressure during surgery and carries a significant risk of death. One hypothesis at the time, proposed by French surgeon Henri-Marie Laborit, was that surgical shock was precipitated by an excessive defensive reaction to stress, and that this exaggerated reaction might be inhibited through the use of sedatives. Laborit found that one of the phenothiazine derivatives, promethazine, was useful for this purpose when mixed with an opioid drug. Under the influence of this drug combination, patients were much calmer going into surgical procedures, and the occurrence of shock was significantly reduced.
The success of promethazine in lowering the risk of surgical shock led to the investigation of other phenothiazine derivatives for their sedative effects. One of the resultant substances, a chlorinated derivative of promazine called chlorpromazine, seemed not only to be an ideal candidate for use in preventing surgical shock, but also to possess some other unique pharmacological characteristics. For example, while other drugs of a sedative nature (like barbiturates) inhibited all behavioral responses in experimental animals, chlorpromazine only inhibited certain learned responses. This suggested the drug was having a more targeted effect on the brain and thus that it might have a more specific mechanism than something like a barbiturate, which caused widespread central nervous system sedation.
As Laborit began to use chlorpromazine to prevent the occurrence of surgical shock, he was amazed at how calm and relaxed patients treated with it were before, during, and after surgery. These observations led Laborit to suggest that the use of chlorpromazine be explored for the treatment of psychiatric conditions that required sedation. It didn't take long before chlorpromazine was investigated as a potential treatment for psychosis.
Chlorpromazine as an antipsychotic
The symptoms schizophrenic patients present with are very diverse and vary from patient to patient. They can involve the loss of a normal function like speech, emotion, motivation, or the desire to interact with others; such symptoms that involve the deficit of a normal function are often referred to as negative symptoms. On the other hand, schizophrenic symptoms may involve the development of new thought patterns or behaviors. These symptoms, often called positive symptoms, can include delusions, hallucinations, and erratic behavior, and generally involve some loss of touch with reality---a phenomenon known as psychosis.
Positive symptoms can sometimes be difficult for caretakers to manage, as misguided efforts to calm agitated patients may actually cause patients to become more agitated. Thus, a tranquilizing medication that could help to calm agitated patients---but without some of the potentially severe side effects seen with the use of drugs like barbiturates or chloral hydrate---was welcomed by many practitioners of psychiatric medicine in the 1950s. And so the use of chlorpromazine to treat psychotic patients caught on quickly. Chlorpromazine had first been synthesized in 1950, but within five years its use had already spread through Europe and into the United States and Canada. It was introduced to the US market by Smith Kline & French Laboratories (which would later become GlaxoSmithKline) and sold under the trade name Thorazine. It would soon become a highly profitable drug for Smith Kline & French Laboratories, causing other pharmaceutical companies to rush to discover their own lucrative psychotherapeutic drugs.
From such humble beginnings
It wasn't all smooth sailing for chlorpromazine and the other antipsychotic drugs that soon emerged in an attempt to replicate its success. It was quickly recognized that these first-generation antipsychotics caused movement-related side effects that could be severe---and in some cases irreversible. Critics also argued that antipsychotic drugs still weren't targeting a mechanism specific to schizophrenia, and instead were just a safer way to sedate patients to make their symptoms more manageable. The success of chlorpromazine and other early antipsychotics, however, ushered in a new era of drug discovery that would change psychiatry and the way we think about mental disorders. The idea that medication could be targeted to relieve the symptoms of a mental illness supported the perspective that disorders were caused by disruptions in neurobiology, and were better treated medically than through approaches like Freudian psychoanalysis (which had been the preferred method of treatment until this time). A valid pharmacological approach to treating psychiatric disorders like schizophrenia also provided options other than institutionalization, leading to improved treatment for patients with schizophrenia and other severe psychiatric disturbances.
Coal tar's major influence on psychiatry didn't end with chlorpromazine, as chlorpromazine would be modified to create imipramine, the first tricyclic antidepressant. Additionally, the success of chlorpromazine would spur increased interest in the therapeutic potential of dyestuffs, which would lead to the discovery of the first benzodiazepine (chlordiazepoxide) in the late 1950s. The discoveries of imipramine and chlordiazepoxide would also be significant moments in the early days of the psychopharmacological revolution. Due in part to the influence of the new drugs that began appearing in the 1950s, psychiatric treatment today looks nothing like it did 65 years ago. It is still imperfect, but significantly less barbaric and crude than it was in the middle of the 20th century. And, although such paradigm shifts inevitably involve the contribution of many factors, the role of a smelly black sludge in this massive change in psychiatric therapy is impossible to deny.
In this video, I discuss the knee-jerk reflex. The knee-jerk reflex, also known as the patellar reflex, is a simple reflex that causes the contraction of the quadriceps muscle when the patellar tendon is stretched. I describe the course of the reflex arc from muscle spindles in the quadriceps muscle to motor neurons that cause movement of the leg. I also discuss the role of inhibitory interneurons in inhibiting the movement of the hamstring muscle, which allows the quadriceps contraction to be unopposed.
In this video I discuss the substantia nigra. I describe the two regions that make up the substantia nigra: the substantia nigra pars compacta and the substantia nigra pars reticulata. I discuss the role of the substantia nigra pars compacta in dopamine production and its contribution to the nigrostriatal pathway, one of the major dopamine pathways in the brain. I also briefly cover the role of the substantia nigra in movement and how this is disrupted by the neurodegeneration that occurs in Parkinson’s disease.
Where is the corticospinal tract?
The corticospinal tract is a collection of axons that carry movement-related information from the cerebral cortex to the spinal cord. About half of these axons extend from neurons in the primary motor cortex, but others originate in the nonprimary motor areas of the brain as well as in regions of the parietal lobe like the somatosensory cortex. Corticospinal tract neurons project from these cortical areas down through the brainstem and into the spinal cord, where they synapse on neurons that directly control the contraction of skeletal muscle.
What is the corticospinal tract and what does it do?
The corticospinal tract is one of the major pathways for carrying movement-related information from the brain to the spinal cord. Signaling along the corticospinal tract seems to be involved in a variety of movements, including behaviors like walking and reaching, but it is especially important for fine finger movements like those that might be involved in writing, typing, or buttoning clothes. After selective damage to the corticospinal tract, patients are usually able to regain the ability to make crude movements (e.g. reaching) after a period of time, but they may be unable to fully recover the ability to make individual finger movements. This suggests other tracts are involved in most aspects of voluntary movement, and that they can generally compensate for the loss of corticospinal tract innervation; individual finger movements, however, may be a function the corticospinal tract is solely or primarily responsible for.
As mentioned above, the corticospinal tract originates in several cortical areas, with about half of the neurons that make up the tract coming from the primary motor cortex. The neurons that travel in the corticospinal tract are referred to as upper motor neurons; they synapse on neurons in the spinal cord called lower motor neurons, which make contact with skeletal muscle to cause muscle contraction.
The axons that travel in the corticospinal tract descend into the brainstem as part of large fiber bundles called the cerebral peduncles. The tract continues down into the medulla where it forms two large collections of axons known as the pyramids; the pyramids create visible ridges on the exterior surface of the brainstem. At the base of the pyramids, approximately 90% of the fibers in the corticospinal tract decussate, or cross over to the other side of the brainstem, in a bundle of axons called the pyramidal decussation. The fibers that have decussated form the lateral corticospinal tract; they will enter the spinal cord---and thus cause movement---on the side of the body that is contralateral to the hemisphere of the brain in which they originated. The other 10% of the corticospinal tract fibers will not decussate; they will continue down into the ipsilateral spinal cord; this branch of the corticospinal tract is known as the anterior (or ventral) corticospinal tract. Most of the axons of the anterior corticospinal tract will decussate in the spinal cord just before they synapse with lower motor neurons. The fibers of these two different branches of the corticospinal tract preferentially stimulate activity in different types of muscles. The lateral corticospinal tract primarily controls the movement of muscles in the limbs, while the anterior corticospinal tract is involved with movement of the muscles of the trunk, neck, and shoulders.
As they travel down to the spinal cord, corticospinal tract neurons send off many collateral fibers that make connections in a number of areas, including the basal ganglia, the thalamus, and various sensory nuclei. Additionally, corticospinal tract fibers terminate in various places in the spinal cord, including the posterior horn (which is normally involved in processing sensory information). These diverse connections suggest that the functions of the corticospinal tract are likely diverse as well, and that defining it as having movement as its sole function is an oversimplification.
When the upper motor neurons of the corticospinal tract are damaged, it can lead to a collection of deficits sometimes called upper motor neuron syndrome. When such an injury occurs, it often results in a state of paralysis or severe weakness immediately following the event, usually on the side of the body opposite to the location of the injury. After several days, function begins to return, but some abnormalities persist. The patient often displays spasticity, which involves increased muscle tone and hyperactive reflexes; motor control may also be decreased. As mentioned above, after damage to the corticospinal tract the ability to make crude movements generally returns, but some deficit in fine finger movements may remain. Also, patients may display other abnormal reflexes; the best known of these is the Babinski sign. When the sole of the foot is stroked, the toes of a healthy adult generally curl downward; in someone with damage to the corticospinal tract, the big toe extends upward while the other toes fan outward, an abnormal response referred to as the Babinski sign after neurologist Joseph Babinski. In infants, it is normal to observe the Babinski sign because the corticospinal tract is not yet fully myelinated. Thus, the lack of a Babinski sign in infants is considered abnormal and potentially problematic, while the presence of a Babinski sign in adults is pathological and indicates possible corticospinal tract damage.
Headaches are one of the most common neurological complaints; most people will experience headaches at some point in their lives, and close to 50% of the world's population is estimated to be suffering from a headache disorder at any given time. The World Health Organization considers headaches to be among the most disabling conditions people experience, based on the impact chronic headaches can have on quality of life.
There are more than 200 different types of headaches, which are broadly classified as either primary or secondary headaches. Primary headaches are headaches that are not clearly associated with any disease or structural disturbance; they are often benign and make up the majority of headache complaints. Secondary headaches are less common, but also can indicate a much more dangerous situation as they may be a symptom of some underlying problem like an infection, injury, stroke, or tumor. While secondary headaches are not always serious, they are more likely to be than primary headaches.
Causes of primary headaches
There are several categories used to classify primary headaches: migraine, tension-type headache (TTH), trigeminal autonomic cephalalgias (TACs), and other primary headaches. Each of these categories also contains a number of headache subtypes. Migraine and TTH are the two most common types of primary headaches, while TACs are rare, estimated to affect only about 1 in every 1,000 people. Thus, this section of the article will focus on the more common primary headache types: TTH and migraine.
Tension-type headache (TTH)
TTH is the most common type of primary headache, and chronic TTH is the most prevalent headache disorder; up to 80% of people are likely to experience at least one TTH in a given year, and around 40% of the world's population is estimated to be suffering from TTH disorder. The symptoms of TTH involve a headache of mild to moderate intensity that affects both sides of the head and usually feels like pressure or tightness around the head. Some patients describe TTH as feeling like their head is "in a vice."
Although the pathophysiology of TTH is not fully understood, it is thought to involve aspects of both the peripheral and central nervous systems. Peripherally, patients who suffer from TTH display abnormalities in the sensitivity of the muscles associated with the head, which tend to be more tender and sensitive to pain. For reasons that are unclear, these muscles are often harder in TTH patients as well; this hardness is also associated with increased tenderness and pain, and might indicate excessive contraction of the head muscles.
Although numerous studies have documented the increased tenderness and hardness of the muscles of the head in TTH patients, it is still uncertain what sort of stimulus might act as the "trigger" to cause the generation of excessive pain signaling from these muscles. A number of potential causes have been identified, such as sustained muscle contraction like that seen with teeth clenching, abnormalities in blood flow, and signaling initiated by inflammatory chemical messengers like serotonin. It is unclear at this point, however, if any of these mechanisms is a common denominator in the onset of all TTH.
It is also thought there may be some disruption in central pain pathways in TTH, and that this type of disruption may be especially important for the transition of infrequent TTH to chronic TTH. One hypothesis is that the increased muscular pain sensations mentioned above cause excessive activation of spinal cord and brainstem neurons involved in pain signaling as well as activation of neurons in areas of the brain that are involved with processing pain, like the thalamus and somatosensory cortex. This exaggerated activity may also cause pain pathways to become highly sensitized, leading them to be activated in response to smaller and smaller degrees of stimulation. Additionally, it is thought that mechanisms involved with pain inhibition may begin to fail to suppress pain signaling, compounding the effect.
In addition to these proposed mechanisms, it is thought there are a number of other potential influences on the occurrence of TTH. For example, studies suggest TTH has a genetic basis, and there also seems to be a relationship between psychological stress, anxiety, and depression and the occurrence of TTH. It is not clear, however, exactly how these factors may influence the pathophysiology of TTH. In other words, although there are commonalities in the genetics and/or mood state of people who are likely to experience TTH, the underlying neurobiology that causes those commonalities to lead to increased frequency of TTH is unknown. Thus, TTH is a relatively poorly understood disorder, and many questions remain about the precise mechanisms underlying TTH as well as the risk factors that make someone more likely to experience TTH.
Migraine

Migraine is the second-most prevalent type of primary headache disorder, affecting more than 10% of the world's population. Migraines can sometimes be hard to differentiate from TTH, but there are some distinct symptomatic differences between the two complaints. For example, migraines tend to affect only one side of the head and are usually characterized by a throbbing or pulsating pain (as opposed to the pressure-like pain felt in TTH). Also, unlike TTH, migraines are generally made worse by physical activity and often are associated with nausea and/or vomiting. Migraine headaches can last anywhere from several hours to several days and involve a variety of symptoms ranging from autonomic problems like nausea and vomiting to sensory disturbances like extreme sensitivity to light and sounds.
There are often a variety of symptoms that occur in the hours before the onset of a migraine headache; these are referred to as prodromal symptoms. The presentation of symptoms varies significantly depending on the individual, and may involve diverse afflictions ranging from fatigue and mood changes to a stiff neck and frequent yawning. In many migraine sufferers, there is also a distinct period of neurological disturbance that occurs just before the onset of a headache. This is known as an aura, and it can include an assortment of symptoms such as: visual disturbances like flashing lights or loss of vision, numbness or tingling of the face or extremities, tremor, weakness, auditory hallucinations, or difficulty speaking.
It is thought that multiple brain regions may be responsible for the diverse group of prodromal symptoms. For example, aberrant activity in the hypothalamus may generate symptoms like food cravings and fatigue, brainstem neurons may cause neck stiffness and muscle tenderness, activity in the cortex may lead to unusual sensitivity to sensory stimulation, and other areas of the limbic system may be responsible for symptoms like depression and anhedonia. How this activity then triggers the development of a headache, however, is unclear. One hypothesis is that the activation of hypothalamic neurons can cause stimulation of neurons in the brainstem, leading to the release of substances that promote vasodilation and inflammation. These processes are associated with the stimulation of nociceptors in the meningeal layers surrounding the brain as well as with the activation of neurons associated with the trigeminal nerve, the main pain pathway for the head and face. Although it is still somewhat unclear why these relatively minor effects would result in pain signaling as intense as what is seen in migraines, it may be that migraine sufferers also have increased sensitivity in these pain pathways, causing what should be mild pain to be excruciating.
The auras that many migraine sufferers experience are also associated with abnormal waves of electrical activity in the brain, which are referred to as cortical spreading depression, or CSD. In CSD, a wave of excitation or depolarization spreads across neurons in the cortex; it is followed by a wave of inhibition or hyperpolarization. This abnormal brain activity is associated with the release of pro-inflammatory substances, which may cause nociceptors in the meninges to be activated and lead to the onset of the migraine itself.
Again, it is not clear why these patterns of activity should cause such excruciating pain; thus, it is hypothesized that patients who experience migraines may also have overly sensitive pain pathways that carry signals from the meninges and their associated arteries to the brain. The main pathway that carries such signals, called the trigeminovascular pathway, can be activated by events like CSD. Also, it appears that repeated activation can cause the pathway to begin to respond to stimuli that might not have been strong enough to elicit a response in the past. This hypersensitivity may be responsible for the chronic nature of migraine experienced by many sufferers of migraine disorders.
Causes of secondary headaches
While primary headaches are not clearly caused by some other disease, disorder, or otherwise underlying problem, secondary headaches can generally be associated with a distinct causal factor; this can be anything ranging from the overuse of certain medications to the presence of a brain tumor. Secondary headaches are less common---but potentially more serious---than primary headaches. They can be due to a number of different mechanisms depending on the primary cause of the headache. If due to a brain tumor, for example, the cause of the headache might involve increases in intracranial pressure driven by the presence of an abnormal growth; this increased pressure can activate the trigeminovascular pathway. Medication-overuse headache, another form of secondary headache that usually stems from the regular use of medications like opioid drugs to treat headaches, can involve alterations in neurotransmitter systems among other potentially pathological neurobiological changes. Thus, the mechanisms underlying secondary headaches are as diverse as the conditions they are associated with, and for the most part our understanding of their pathogenesis is not complete.
Poorly understood afflictions
Despite the fact that headaches are an affliction that people have written about since the beginnings of recorded history, there is still much to be understood about how they occur. The mechanistic explanations for our most common headache disorders remain slightly vague and, in some cases, relatively speculative. It is hoped, however, that with continued study and the application of newer methods of investigation in neuroscience (e.g. neuroimaging), we will be able to understand this diverse collection of disorders more thoroughly, leading to better treatments for headaches and headache disorders.
In this video I discuss the corticospinal tract, a major tract that carries movement-related information from the motor cortex to the spinal cord. I discuss upper and lower motor neurons and trace the pathway the corticospinal tract takes from the cortex to the spinal cord, mentioning the major fiber bundles it is found in along the way like the cerebral peduncles and medullary pyramids. I describe the two branches of the corticospinal tract: the lateral and anterior corticospinal tracts, and I discuss their respective specializations. Finally, I cover the types of deficits that can appear when damage to the corticospinal tract occurs.