Hallucinogens, starvation, and magnets: A new cure for depression?

What do hallucinogens, starvation and magnets all have in common? No, they’re not the key ingredients for a wild and crazy weekend; they are all potential alternative treatments for depression that are being explored by researchers and clinicians alike.

Scientists have long known that the serotonin theory of depression is imperfect, yet few treatment options are available beyond the standard course of cognitive-behavioral therapy and selective serotonin reuptake inhibitors (SSRIs). In my new piece for Pacific Standard, I explore recent research that has emerged looking at some potential alternatives for depression that are rather… unconventional.

This includes giving people psilocybin, the active ingredient in so-called “magic” mushrooms, which boosts serotonin signaling and, crucially, taps into the amygdala, the brain’s major emotional center. Another possible avenue involves boosting brain levels of ghrelin, a hunger hormone that may also play a role in protecting neurons from the destructive effects of stress, particularly in the hippocampus. Alternatively, using high-powered magnets, researchers and clinicians can activate certain key parts of the brain, potentially suppressing other over-active emotional regions and turning down our feelings of anxiety or depression.

While none of these options is perfect, they do provide an encouraging new perspective, thinking outside the box to treat this condition that will afflict at least one in ten of us at some point in our lives.

You can check out the full story in Pacific Standard here.

Instant gratification as a way out of addiction?

Impulsivity is often seen as a hallmark of addiction — acting without thinking about the consequences of your actions and valuing the immediate reward of a drug-induced high over the long-term payout of a healthy lifestyle. This preference for immediate over delayed rewards, known as delay discounting, has been linked to a greater risk for drug addiction, but new research suggests that this kind of “myopia for the future” may also improve someone’s chances of staying sober when they’re trying to get clean.

My latest piece for The Fix investigates the research behind this paradox, which suggests that those who are the most impulsive have the most to gain from effective treatment, with cognitive training successfully improving their self-control. But is this effect a result of the treatment program itself or just a regression to the mean? Check out the article here to find out.

Resisting temptation in the brain

Having spent the last three years studying how difficult it is to say no to our vices, and being intimately acquainted with all that can go wrong in fMRI research, I’m always a bit skeptical of studies that claim to be able to predict our capacity for self-control based on a brain scan. But a new paper out this week in Psychological Science seems to have done a pretty admirable job, tying our real-life ability to resist temptation with activity in two specific areas of the brain.

Researchers from Dartmouth College first tested 31 women on two different tasks: an assessment of self-control and a measurement of temptation. Using an fMRI scanner, they compared where the women’s brains lit up when they were stopping themselves from performing a certain action (pressing a button, to be exact), and when they were looking at images of ice cream, hamburgers, and other tasty treats. As expected, better performance on the response inhibition task was linked to activation in a part of the brain called the inferior frontal gyrus (IFG), a region in the frontal cortex known to be involved in inhibiting a response. Conversely, looking at pictures of chocolate and chicken sandwiches activated the nucleus accumbens (NAcc), a deeply rooted part of the brain that’s essential in feelings of reward.

So far, this is all pretty par for the course; you exert self-control, you activate your control center. Looking at something enticing? Your reward region is going to light up. Nothing new or ground-breaking (or even that useful, to be honest). But the researchers didn’t stop there. Instead, they took the study out of the lab to see what happened when the participants were faced with real-life temptations. The researchers equipped the women with BlackBerry smartphones and prompted them throughout the week with questions about how badly they desired junk food, how much they resisted these cravings, whether they gave in to their urges, and how much they ate if they did cave to temptation.

Comparing these responses to brain activity in the two target areas, the researchers discovered that the women who had the most activity in the NAcc while viewing images of food were also the ones who had the most intense cravings for these treats in real life. Additionally, these women were more likely to give in to their temptations when they had a hankering for some chocolate. On the other hand, those who had greater activity in the IFG during the inhibition task were also more successful at withstanding their desires — in fact, they were over 8 times more likely to resist the urge to indulge than those with less activity in the region. And if they did give in, they didn’t eat as much as those with a smaller IFG response.

Having confirmed the link between activity in these areas and real-life behaviors, the next step is to figure out how to ramp up or tamp down activity in the IFG and NAcc, respectively. One technique that psychologists are exploring is transcranial magnetic stimulation, or TMS. This involves targeting a certain part of the brain with magnetic pulses that induce a weak electrical current, trying to either stimulate or depress activity in that region. So far, use of TMS in studies of addiction and eating disorders — attempting to enhance self-control and decrease feelings of craving — has met with limited success. Pinpointing exactly the right area through the skull and figuring out the correct stimulation frequency can be difficult, and in fact, a few studies have accidentally increased desire for the substance! Additionally, the effects are often temporary, wearing off a few days after the stimulation is over. Other studies have looked at cognitive training to try to enhance self-control abilities, particularly in children with ADHD, although these attempts have also varied in their success.

Beyond targeting certain psychiatric disorders or trying to get us to say no to that second (or third or fourth) cookie for reasons of vanity, there’s a push to enhance self-control from a public health standpoint. The authors of the current study cite the staggering statistic that 40% of deaths in the U.S. are caused by failures in self-control. That’s right: according to research, 40% of all fatalities stem from our inability to say no to some sort of unhealthy behavior, the biggest culprits being smoking and over-eating or inactivity leading to obesity. Clearly, then, improving self-control is not only needed to help individuals on the outer edges of the spectrum resist temptation; it would benefit those of us smack dab in the middle as well.

Happy Friday!

Cannabis and memory loss: dude, where’s my CBD?

I’ve got a new piece in The Guardian today on memory deficits in heavy cannabis users, and how the type of weed you’re smoking can actually impact your risk for impairment. Dedicated Brain Study readers might recognize it as a revamped, beefed-up version of the infamous “Weed be better off smoking our parents’ pot” post from last year. Now, I’ve incorporated some new research into the piece on cognitive problems in heavy smokers, as well as the relevant policy news from Colorado, Washington and Uruguay regarding legalization. I also talk about how these developments could result in more than one type of harm reduction, which is an exciting prospect for improving the safety of the drug with government regulation.

Check out the full piece here, and as always, let me know what you think.

Beating the poppy seed defense

During my PhD, one of the research projects I was involved in was a relapse prevention study testing individuals who had previously been addicted to alcohol, cocaine or heroin, but were no longer using any drugs.

One participant who took part in the study — I’ll call him Dave — was a young guy who had been dependent on alcohol but swore up and down that he had never abused any other drugs. Dave was three weeks into the study and doing well, staying abstinent and remaining cheerful and cooperative throughout the sessions. However, one morning when Dave came in and went through his usual drug screen, he tested positive for heroin, something he claimed (and I believed) he had never taken.

Instead, Dave maintained he had eaten a poppy seed bagel for lunch the day before, which would explain the positive test.

Opiates — like heroin, morphine and opium — are all derived from the opium poppy, and it’s not uncommon for poppy seeds to trigger a false-positive result for opiates on a drug screen. However, it’s also not uncommon for people to falsely plead the poppy seed defense, and there has been no way of confirming which source of morphine (heroin or poppy seeds) is actually causing the positive screen. Until now.

Researchers from King’s College London have discovered a metabolite that comes only from synthetic heroin, not from poppy seeds, and can be reliably detected in a urine screen. This means that instead of screening for all types of opiates, doctors and researchers can now test specifically for the presence of heroin in the body.

Notably, the test would also not come back positive for any prescription painkillers, which is simultaneously an advantage and a disadvantage of the new screen. For those who are legitimately prescribed the medications, there would be no more concerns over a suspicious positive result. However, the test would also fail to identify the more than 12 million Americans who are using these drugs without a prescription. This is especially problematic as prescription painkillers have quickly surpassed all other types of drugs as the most common cause of overdose deaths, totaling more fatalities in 2010 than cocaine and heroin combined; prescription painkiller overdose has now become the leading cause of death by injury in the U.S.

The new test is still under investigation and isn’t perfectly refined (only 16 of the 22 known current heroin users tested positive for the metabolite in the study, meaning it has a detection rate of only about 73%), but it is a promising new avenue for researchers and medical screeners to more accurately identify the presence of heroin.
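For the numerically inclined, the detection rate follows directly from the raw counts reported in the study; here’s a quick back-of-the-envelope sketch in Python (the variable names are mine, not the researchers’):

```python
# Detection rate (sensitivity) of the new heroin-metabolite urine test,
# using the counts reported in the study: 16 of 22 known current
# heroin users tested positive for the metabolite.
true_positives = 16
known_users = 22

sensitivity = true_positives / known_users
print(f"Detection rate: {sensitivity:.1%}")  # → Detection rate: 72.7%
```

In other words, the screen currently misses roughly one in four known users, which is why it remains a complement to, rather than a replacement for, standard opiate screening.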

As for Dave, he successfully completed the study without any other events, and he never ate another poppy seed before a session again.

Can synesthesia in autism lead to savantism?

I’ve got a new piece out on the Scientific American MIND blog network today on the fascinating link that’s been discovered between synesthesia – a “crossing of the senses” where one perceptual experience is tied to another, like experiencing sound and color together – and autism spectrum disorder.

Individuals with autism have significantly higher rates of synesthesia than the rest of the population, and the two conditions are potentially linked by a unique way in which the brain is wired. White matter tracts that traverse our brains, connecting one area to another, are thought to be increased in both conditions. This abnormal wiring may result in more close-range connections but fewer long-distance ones. And it’s possible that these extra connections also contribute to some of the extraordinary cognitive abilities seen in some autistic individuals with savant syndrome.

For more on the story, check out the full piece here.

Sweet dreams are made of cheese

You’re running down a hallway; running away from someone? Running towards something? Your feet start to lift off the ground and the ceiling opens up. You float higher and higher, and you get the feeling you’re not alone. You turn to your left and it’s Bob Dylan, laughing and calling you “Mr. Tambourine Man”. Suddenly the balloon you were holding onto, carrying you up into the sky, turns into a tangerine and you start to plummet back to earth. Just before you slam into the ground, you awaken: sweaty, sheets twisted, wondering what the hell that was all about.

Dreams are weird. Especially if you’ve eaten a lot of cheese the night before.

Or so says the common myth. From Charles Dickens to Arab Strap, cheese dreams have been a part of our popular culture for over 150 years. But is there actually any truth to this old wives’ tale?

A study conducted in 2005 by the British Cheese Board attempted to debunk this claim by giving 200 participants 20 grams (roughly 0.7 ounces) of cheese 30 minutes before they went to bed and asking them to record their dreams and quality of sleep. In the study, 67% of participants recalled their dreams, and none reported any nightmares, something the Cheese Board called a win.

Instead of night terrors, the researchers report that the cheese resulted in pleasant nighttime fantasies in most individuals. They even went so far as to test the varying effects different types of fromage had on an individual’s dream-state. From their conclusions, blue Stilton resulted in the most bizarre trips, affecting about 80% of participants and resulting in visions of talking animals, vegetarian crocodiles and warrior kittens. On the other end of the spectrum, Cheshire cheese produced the least memorable nights, with less than half of the participants being able to recall their dreams.

The study (again, initiated by the cheese industry) also claimed that eating cheese before bed actually helped people fall asleep. This is supposedly due to cheese’s relatively high content of tryptophan, an amino acid involved in the production of melatonin (and serotonin), which plays an important role in our sleep-wake cycle.

However, it should be noted that there was no report of a control or placebo group in this experiment, such as participants who ate nothing or consumed a soy cheese sample (yum!) before bed. Thus, there’s no empirical evidence that it was actually the cheese causing these effects, rather than the participants’ natural sleep patterns.

As for the dream link, there is only one academic paper that mentions the cheese-dream phenomenon, and only anecdotally. However, one Internet theory I found (I know, I’m reaching here) proposed that the bacterial and fungal content in cheese, and in potent blue cheeses in particular, might be at the root of the increase in dream vividness. This is due to the potential psychoactive effects of compounds found in fungi, such as tryptamine and tyramine, which could influence our brains’ chemical systems and thus our state of mind.

Tryptamine is a common chemical precursor for serotonin and other related alkaloids, some of which are involved in the hallucinogenic effects of psilocybin (“magic” mushrooms) and DMT. However, there’s no hard evidence that tryptamine is actually present in the Stiltons and Gorgonzolas of the world, and even if it were, it would be in extremely low doses. After all, when was the last time you felt high after eating cheese?

Conversely, tyramine is a monoamine that works by triggering the release of other neurotransmitters like adrenaline, noradrenaline and dopamine into the body. Another theory is that tyramine’s effect on noradrenaline release in an area of the brain called the locus coeruleus, a region important in our sleep-wake cycle, alters our dream patterns.

Some antidepressants work by inhibiting the breakdown of monoamines (monoamine oxidase inhibitors, or MAOIs), and it can be dangerous to eat foods high in tyramine while on this medication, as the combination can result in an excess of these chemicals in your brain and body. The medication mentioned in the one academic paper on cheese dreams, pargyline hydrochloride, actually works as an MAOI, potentially explaining the bizarre effect eating cheese had on the patient. There are also reports of foods high in tyramine causing migraines in some individuals, particularly those on MAOIs; however, another study found no evidence of this link.

Finally, there are numerous other types of food that contain compounds like tyramine and tryptophan that affect our neurotransmitter systems, including cured meats, egg whites and soybeans, none of which have the dream-producing reputation of cheese. So for now, the link between cheese specifically and these nighttime apparitions appears untenable.

Then again, I did eat some cheddar last night, which might just explain Bob Dylan’s appearance in my nocturnal activities. According to the Cheese Board, cheddar was linked to visions of celebrities dancing in your head.

(Thanks to Sam Greenbury for the inspiration for this post.)