Brain development and decay: More cannibal neurons

The brain is a plastic organ, constantly changing and adapting: it forges new connections as we take in novel thoughts and memories, and it loses others as we age and decay. Two recent discoveries have further illuminated how and when these changes occur, challenging current theories of neurogenesis, myelination, and synaptic pruning.

The majority of these neurodevelopmental processes take place when we are young, with the brain undergoing immense cortical growth in different regions as we mature, learn, and acquire new skills and knowledge. This growth occurs in waves across the brain, starting in the occipital and parietal lobes during childhood as we perfect sensory abilities, fine motor movement, coordination, and spatial awareness. Next the temporal lobe matures, improving our memory and language abilities. Finally, the frontal cortex, responsible for our abilities to inhibit, control, plan, pay attention, and perform demanding cognitive tasks, becomes fully functional during adolescence. However, more cells and connections are created than are needed, and closely following this neurogenesis come periods of pruning, when unused synapses and connections are weeded out and destroyed. This makes the brain more efficient, conserving space and cellular energy and streamlining neural processing so that only the essential, most important connections are used.

Until recently, the process by which this pruning occurred was a relative mystery. However, a study published last month in Science revealed that microglia, immune cells in the brain that identify and destroy invading microbes, may be involved. Microglia travel through the brain and converge on areas of damage to clean up and dispose of dead neurons and cell debris. They accomplish this via phagocytosis, the cellular process of engulfing solid particles, which is similar to the process of autophagy I wrote about last week.

Scientists at the European Molecular Biology Laboratory in Monterotondo, Italy, discovered that microglia also monitor synapses in a similar manner, consuming unused and unneeded connections during periods of brain maturation. The researchers used an electron microscope to examine microglia located near synapses in the brains of mice. These cells contained traces of both SNAP25, a presynaptic protein, and PSD95, a marker of postsynaptic excitatory activity, suggesting that the microglia had engulfed both the presynaptic and postsynaptic sides of these connections. It is still unknown how microglia identify which synapses to engulf, though scientists believe it has to do with fractalkine, a large protein involved in communication between neurons and microglia.

It was initially thought that this process of cortical maturation ended after adolescence; however, new evidence suggests that neural development, particularly in the prefrontal cortex, continues well into one’s 30s. Studies published this year in PNAS and the Journal of Neuroscience indicate that both the addition of new white matter (myelinated axonal connections) and the reduction of gray matter (synaptic connections eliminated through pruning) continue into early adulthood. The implications of this discovery concern the timing of the onset of several psychiatric disorders, including schizophrenia and drug abuse, which commonly arise in the early to mid-20s. If the brain is still developing during this period, we may be vulnerable to environmental influences for longer than suspected, or problems during this final stage of development could be at play in these disorders.

Cannibal neurons

A study published this month in Cell Metabolism indicates for the first time that the brain, in specific situations, may eat itself. Researchers from the Albert Einstein College of Medicine discovered that a starvation state causes cells in the hypothalamus to undergo autophagy: the breakdown, consumption, and recycling of a cell’s own components by its lysosomes. This process is common in cells throughout the body and is an efficient method of energy conservation. The cell partitions off extraneous parts of itself from the core of the cell body, and these bits are then broken down by the lysosomes. The resulting nutrients are consumed by the rest of the cell to maintain survival. Autophagy is a natural part of the cycle of cell growth and death, and it increases during starvation to provide an alternative source of nutrients and energy when external resources are not readily available. However, it was previously thought that the brain was exempt from this cannibalistic process during nutrient deprivation.

To examine this process in neurons, the researchers first simulated starvation in hypothalamic cells by withholding a nutrient serum. After just one hour, these cells began to produce LC3-II, an indicator that autophagy was taking place. Levels of LC3-II progressively increased the longer the cells went without serum; however, the process was reversed once the cells were again provided with nutrients. The researchers then replicated these results in starved rats. Rats fasted for 12 hours showed increased autophagy in the hypothalamus, indicated by elevated LC3-II levels in these neurons, compared with both satiated rats and previously starved but recently re-fed rats, which showed little to no change in the autophagic marker. After repeating the experiment in neurons from other areas of the brain and finding no comparable response, and given the lack of evidence for starvation-induced autophagy in previous studies of neurons, the scientists tentatively concluded that the hypothalamus is the only region in which this process occurs.

The wider implications of these results have to do with the specific region in which they were seen. The hypothalamus is a major regulatory center of the brain involved in hormone and neuropeptide release, which in turn controls the homeostatic regulation of metabolism, hunger, and satiety, among other processes. When these hypothalamic cells are broken down by starvation-induced autophagy, however, the delicate balance of these regulatory hormones is upended. Nutrient deprivation and autophagy can increase free fatty acid production in the liver, which in turn elevates hypothalamic levels of AgRP, a peptide involved in increasing food intake. Thus, when an individual is able to eat again, they are more likely to consume greater quantities of food than normal. These findings were further supported by studying mice genetically modified so that their AgRP-producing neurons could not undergo autophagy. Compared with normal mice, these genetically modified mice did not have elevated AgRP levels when starved, nor did they demonstrate the binge eating that is common after long fasts. Consequently, they also had lower body weight and body fat than normal control mice.

The implications of these results for weight loss and dieting further support the idea that fasting and extreme low-calorie diets may yield immediate weight loss but can be substantially counterproductive in the long term. It is widely accepted that food deprivation makes one hungry and irritable, but the notion that dieting can cause the brain to break down its own cells, leaving one less able to manage what one eats, is a novel theory. This can lead to the loss of control, binges, regret, and further self-imposed starvation often seen in the vicious cycle of extreme dieting. Additionally, the increase in fatty acids seen during autophagy can also occur after an increase in fat consumption; therefore, not only starvation but also the subsequent binges on fatty foods can increase the desire for and consumption of food. To avoid this counterproductive weight-loss tactic, it is recommended to instead change the composition of your diet (e.g., from high-fat foods to fruits and vegetables) and to decrease calorie intake only in moderation, while still ensuring your body receives the nutrients and energy it needs.

Psychiatry and psychedelia

Timothy Leary, the influential psychologist who popularized the use of LSD in the 1960s, is a polarizing figure in the debate over psychiatry and psychedelic drugs. While revered by some as the father of psychedelic research, he is reviled by others for his unconventional research methods, which culminated in Leary’s lab, the Harvard Psilocybin Project, being shut down by the university in 1963. Following this setback, Leary established a private laboratory in a mansion in upstate New York, where he continued conducting studies on his friends and followers. Leary’s previously unseen recordings and notes from this period have recently been purchased and made public by the New York Public Library, giving us insight for the first time into the ground-breaking work done with these then-novel substances.

The backlash against Leary and his colleagues’ extreme views on drug use, religion, and politics (Richard Nixon at one point called Leary “the most dangerous man in America”) arguably discredited, for over 40 years, any real benefits psychedelics may have in therapy. Experimentation and research with LSD, psilocybin (the compound in hallucinogenic or “magic” mushrooms), and other psychedelic substances have been highly stigmatized and virtually impossible to conduct in a responsible laboratory setting. However, the restrictions on such research have gradually been lifted, and new studies are cautiously emerging that herald the clinical benefits of drugs like psilocybin and MDMA (better known as ecstasy). These drugs are thought to help individuals who suffer from severe anxiety, post-traumatic stress disorder (PTSD), obsessive-compulsive disorder (OCD), and depression when paired with talk therapy. Their effectiveness is thought to lie in their ability to engender feelings of empathy, love, and connectedness, fostering a sense of unity and compassion for oneself and one’s fellow man. These feelings may create an easier, more open environment for patients to discuss their concerns while being safely guided by clinicians on a therapeutic trip.

Some of the most notable research coming out of this renaissance is being conducted at Johns Hopkins University, led by Dr. Roland Griffiths. Griffiths has shown that psilocybin can effectively reduce levels of depression and anxiety in terminally ill cancer patients, anecdotally helping some of them accept and come to terms with their approaching mortality.

Another promising use for this type of research is in patients with PTSD. With the recent announcement that antipsychotic drugs, frequently prescribed to help treat PTSD in combat veterans, are no more successful than placebos at improving symptoms, it seems that a new method is needed to help those suffering from severe trauma. Antidepressants are already known to be ineffective in treating PTSD, and doctors had hoped that antipsychotic medication, which acts more strongly on mood, would be more successful at treating the associated symptoms, such as flashbacks, memory suppression, outbursts of anger, anxiety, anhedonia, and depression. However, after six months of treatment, only 5% of patients who received the drug had recovered, a statistically negligible rate that was not significantly different from that of patients who received a placebo.

Some researchers are now looking at more unconventional methods, such as psychedelics, as a potential treatment for PTSD. Clinical researchers in both the US and the Netherlands have shown that MDMA can be effective at reducing PTSD symptoms in survivors of rape and other traumatic events. Neurologically, MDMA stimulates the release of serotonin in the brain, a neurochemical linked to feelings of happiness whose depletion is commonly associated with depression. This activation takes place throughout the brain, but much of it is focused in the dorsolateral prefrontal cortex (dlPFC), a region involved in higher-order cognition, memory, and associative learning. Simultaneously, there appears to be a decrease in activity in the amygdala, an area involved in fear and emotion. Taken together, these two changes in neural activity are thought to enhance memory and support rational, cognitive processing of the traumatic event while downplaying the aversive negative emotions connected to it. An individual would therefore be able to replay the painful details of a memory and rationally analyze and come to terms with them, aided by a boost in mood from serotonin and disconnected from the typical painful affective responses. This could potentially help a patient “relearn” their associations with the memory, allowing them to shed the negative affective associations of these recalled experiences and build positive ones.

However, just as with any drug, MDMA has side effects. The most notable and potentially harmful one is a decrease in serotonin after the high has worn off. When the brain is flooded with a neurochemical, it regulates itself to become less receptive to that neurotransmitter, adapting to restore homeostasis. The brain is therefore relatively depleted of serotonin in the days after taking MDMA, and after multiple uses (or abuses) of the drug this effect can become pervasive and long-lasting. While the depletion is not particularly severe after minimal use, even this temporary loss could be devastating in a patient who is already struggling with depression or anxiety.

Banning potentially valuable clinical research because of social concerns and constraints only hurts scientific progress and the community at large. However, it is important to keep in mind that these psychedelic substances are powerful drugs with potentially severe consequences. They should be investigated, as their benefits to clinical populations could be immense, but they should still be used carefully, as much is still unknown (just as much is unknown about most drugs, prescription or otherwise) about their mechanisms and effects. Responsible research is the best way to investigate the therapeutic possibilities of these drugs, and the existence of methodical record keeping like Leary’s can only help us in our quest to understand these substances and their effects on the mind.

A neural workout

Last week I wrote about some of the emotional benefits of regular moderate exercise. This week, several timely new articles have come out touting the cognitive advantages of even minimal daily activity.

Numerous studies have shown evidence of the neurological benefits of exercise, which can foster cell growth and the generation of new cells (known as neurogenesis). One region that seems to be particularly affected is the hippocampus, an area known to be involved in memory consolidation. Prior studies in mice have shown that exercise can trigger neurogenesis in the hippocampus, and in humans exercise has been linked to better performance on tests of memory and spatial learning, as well as a decreased risk of dementia. While some of these benefits are believed to be due to the proliferation of new cells in the hippocampus and other associated regions, several recently published studies suggest that exercise may serve more as a protective factor against neurological decay than as a booster of existing memory performance.

A study presented last week at the Alzheimer’s Association International Conference by doctors at the University of California, San Francisco used mathematical modeling to estimate risk factors for the development of Alzheimer’s disease, identifying seven critical variables: diabetes, hypertension, obesity, smoking, depression, low education or cognitive inactivity, and physical inactivity. The researchers estimated that these seven variables were to blame in nearly 50% of current Alzheimer’s diagnoses, with lack of exercise alone attributed to over 20% of cases. In addition, they predicted that reducing these risk factors could potentially stave off over one million future diagnoses. However, these numbers are estimates, and first author Dr. Deborah Barnes, like other lead researchers in the field, cautions against using these statistics as hard goals and guidelines. Barnes notes that while these seven factors do increase the risk of Alzheimer’s, a causal relationship has not yet been established, and therefore simply changing one’s behavior with regard to one or all of the variables may not be enough to prevent the onset of the disease.
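The paper’s exact model isn’t reproduced here, but estimates of this kind typically rest on the population attributable risk (PAR), which combines how common a risk factor is with how much it raises the risk of disease. As a rough sketch, with the prevalence and relative risk below assumed purely for illustration rather than taken from the study, the standard formula is

\[ \mathrm{PAR} = \frac{p\,(\mathrm{RR}-1)}{1 + p\,(\mathrm{RR}-1)} \]

where p is the proportion of the population exposed to the risk factor and RR is the relative risk of disease among the exposed. Assuming, say, an inactivity prevalence of p = 0.33 and RR = 1.8, PAR = (0.33 × 0.8) / (1 + 0.33 × 0.8) ≈ 0.21, or roughly 21% of cases attributable to physical inactivity in that hypothetical population, which is in the same ballpark as the figure reported above.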

While the association between Alzheimer’s and exercise is still tentative, there is little doubt about the mental and physical benefits of daily activity. However, previous studies have largely focused on the advantages of moderate-to-high levels of exercise in humans and animals, such as the widely recommended guideline of 30 minutes of exertion five days a week. But what about people who are unable to work out that much or that often? Fortunately, there is new evidence that even minimal movement can offer cognitive benefits. In a longitudinal study of elderly adults aged 70 and up, those with the lowest average daily energy expenditure showed the greatest cognitive decline over a period of three years, whereas those who were the most active had significantly less cognitive impairment both during the three-year study period and at a five-year follow-up. It seems that even small exertions like walking around the block, doing household chores, or even fidgeting–behaviors that often go unreported in other studies of physical activity–can help stave off the neural deterioration that commonly occurs as we progress into old age.

However, if you still can’t be bothered to get up and start moving, you can always resort to surgical implants and get one of these to improve your memory.