Hallucinogens, starvation, and magnets: A new cure for depression?

What do hallucinogens, starvation and magnets all have in common? No, they’re not the key ingredients for a wild and crazy weekend; they are all potential alternative treatments for depression that are being explored by researchers and clinicians alike.

Scientists have long known that the serotonin theory of depression is imperfect, yet few treatment options are available beyond the standard course of cognitive-behavioral therapy and selective serotonin reuptake inhibitors (SSRIs). In my new piece for Pacific Standard, I explore recent research that has emerged looking at some potential alternatives for depression that are rather… unconventional.

This includes giving people psilocybin, the active ingredient in so-called “magic” mushrooms, which acts on the brain’s serotonin system and, crucially, taps into the amygdala, the brain’s major emotional center. Another possible avenue involves boosting levels of ghrelin, a hunger hormone that may also play a role in protecting neurons from the destructive effects of stress, particularly in the hippocampus. Alternatively, using high-powered magnets, researchers and clinicians can activate certain key parts of the brain, potentially suppressing other over-active emotional regions and turning down our feelings of anxiety or depression.

While none of these options is perfect, they do offer an encouraging new perspective, a willingness to think outside the box about a condition that will afflict at least one in ten of us at some point in our lives.

You can check out the full story in Pacific Standard here.

Instant gratification as a way out of addiction?

Impulsivity is often seen as a hallmark of addiction — acting without thinking about the consequences of your actions and valuing the immediate reward of a drug-induced high over the long-term payoff of a healthy lifestyle. This kind of delay discounting has been linked to a greater risk for drug addiction, but new research suggests that the same “myopia for the future” may also improve someone’s chances of staying sober when they’re trying to get clean.
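
To make the idea of delay discounting concrete, here is a minimal sketch of the hyperbolic discounting model commonly used to quantify it. The parameter values are purely illustrative and not taken from any study mentioned here; a higher fitted k is typically read as steeper discounting, i.e. greater impulsivity.

```python
# Hyperbolic delay discounting: the subjective value of a delayed reward
# falls off as V = A / (1 + k * D), where k indexes how steeply a person
# discounts the future. The numbers below are illustrative only.
def discounted_value(amount, delay_days, k):
    return amount / (1 + k * delay_days)

# A steep discounter (high k) barely values a reward a month away,
# which is the "myopia for the future" described above.
for k in (0.01, 0.1):
    print(f"k = {k}: $100 in 30 days feels like ${discounted_value(100, 30, k):.2f} now")
```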

My latest piece for The Fix investigates the research behind this paradox, which suggests that those who are the most impulsive have the most to gain from effective treatment, with cognitive training successfully improving their self-control. But is this effect a result of the treatment program itself, or just regression to the mean? Check out the article here to find out.

Resisting temptation in the brain

Having spent the last three years studying how difficult it is to say no to our vices, and being intimately acquainted with all that can go wrong in fMRI research, I’m always a bit skeptical of studies that claim to predict our capacity for self-control from a brain scan. But a new paper out this week in Psychological Science seems to have done a pretty admirable job, tying our real-life ability to resist temptation to activity in two specific areas of the brain.

Researchers from Dartmouth College first tested 31 women on two different tasks: an assessment of self-control and a measurement of temptation. Using an fMRI scanner, they compared where the women’s brains lit up when they were stopping themselves from performing a certain action (pressing a button, to be exact) and when they were viewing images of ice cream, hamburgers and other tasty treats. As expected, better performance on the response inhibition task was linked to activation in a part of the brain called the inferior frontal gyrus (IFG), a region of the frontal cortex known to be involved in inhibiting a response. Conversely, looking at pictures of chocolate and chicken sandwiches activated the nucleus accumbens (NAcc), a deep-seated part of the brain that’s essential to feelings of reward.

So far, this is all pretty par for the course; you exert self-control, you activate your control center. Looking at something enticing? Your reward region is going to light up. Nothing new or ground-breaking (or even that useful, to be honest). But the researchers didn’t stop there. Instead, they took the study out of the lab to see what happened when the participants were faced with real-life temptations. They equipped the women with BlackBerry smartphones and prompted them throughout the week with questions about how badly they desired junk food, how much they resisted these cravings, whether they gave in to their urges, and how much they ate if they did cave to temptation.

Comparing these responses to brain activity in the two target areas, the researchers discovered that the women who had the most activity in the NAcc while viewing images of food were also the ones who had the most intense cravings for these treats in real life. Additionally, these women were more likely to give in to their temptations when they had a hankering for some chocolate. On the other hand, those who had greater activity in the IFG during the inhibition task were also more successful at withstanding their desires — in fact, they were over 8 times more likely to resist the urge to indulge than those with less activity in the region. And if they did give in, they didn’t eat as much as those with a smaller IFG response.
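
For readers wondering what a figure like “over 8 times more likely” refers to: one common way such a number is expressed is as an odds ratio. Here is a toy calculation with entirely made-up counts; the actual analysis used the participants’ fMRI data and their week of smartphone reports.

```python
# Toy illustration of an odds ratio; the counts below are invented and are
# not from the Psychological Science study described above.
high_ifg = {"resisted": 40, "gave_in": 10}  # strong IFG responders
low_ifg = {"resisted": 10, "gave_in": 20}   # weak IFG responders

odds_high = high_ifg["resisted"] / high_ifg["gave_in"]
odds_low = low_ifg["resisted"] / low_ifg["gave_in"]

# With these numbers the odds of resisting are 8 times higher in the
# strong-IFG group, which is the kind of ratio the study reports.
print(f"Odds ratio: {odds_high / odds_low:.1f}")
```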

Having confirmed the link between activity in these areas and real-life behaviors, the next step is to figure out how to ramp up activity in the IFG and tamp down activity in the NAcc. One technique that psychologists are exploring is transcranial magnetic stimulation, or TMS. This involves targeting a particular part of the brain with magnetic pulses, which induce a weak electrical current in the underlying tissue, to either stimulate or suppress activity in that region. So far, the use of TMS in studies of addiction and eating disorders — attempting to enhance self-control and decrease feelings of craving — has met with limited success. Pinpointing exactly the right area through the skull and figuring out the correct stimulation frequency can be difficult, and a few studies have actually, accidentally, increased desire for the substance! Additionally, the effects are often temporary, wearing off a few days after the stimulation is over. Other studies have looked at cognitive training to try to enhance self-control, particularly in children with ADHD, although these attempts have also varied in their success.

Beyond targeting certain psychiatric disorders, or trying to get us to say no to that second (or third or fourth) cookie for reasons of vanity, there’s a push to enhance self-control from a public health standpoint. The authors of the current study cite the staggering statistic that 40% of deaths in the U.S. are caused by failures of self-control. That’s right: according to this research, 40% of all fatalities stem from our inability to say no to some sort of unhealthy behavior, the biggest culprits being smoking and over-eating or inactivity leading to obesity. Clearly, then, improving self-control wouldn’t only help individuals on the outer edges of the spectrum resist temptation; it would benefit those of us smack dab in the middle as well.

Happy Friday!

Cannabis and memory loss: dude, where’s my CBD?

I’ve got a new piece in The Guardian today on memory deficits in heavy cannabis users, and how the type of weed you’re smoking can actually affect your risk of impairment. Dedicated Brain Study readers might recognize it as a revamped, beefed-up version of the infamous “Weed be better off smoking our parents’ pot” post from last year. I’ve now incorporated some new research on cognitive problems in heavy smokers into the piece, as well as the relevant policy news on legalization from Colorado, Washington and Uruguay. I also talk about how these developments could result in more than one type of harm reduction, an exciting prospect for improving the safety of the drug through government regulation.

Check out the full piece here, and as always, let me know what you think.

Beating the poppy seed defense

During my PhD, one of the research projects I was involved in was a relapse prevention study testing individuals who had previously been addicted to alcohol, cocaine or heroin, but were no longer using any drugs.

One participant who took part in the study — I’ll call him Dave — was a young guy who had been dependent on alcohol but swore up and down that he had never abused any other drugs. Dave was three weeks into the study and doing well, staying abstinent and remaining cheerful and cooperative throughout the sessions. However, one morning when Dave came in and went through his usual drug screen, he tested positive for heroin, something he claimed (and I believed) he had never taken.

Instead, Dave maintained he had eaten a poppy seed bagel for lunch the day before, which would explain the positive test.

Opiates — like heroin, morphine and opium — are all derived from the opium poppy, and it’s not uncommon for poppy seeds to produce a false-positive result for opiates on a drug screen. However, it’s also not uncommon for people to falsely plead the poppy seed defense, and there has been no way of confirming whether the morphine causing a positive screen came from heroin or from poppy seeds. Until now.

Researchers from King’s College London have discovered a metabolite that appears only when heroin itself has been taken and that can be reliably detected in a urine screen. This means that instead of screening for all types of opiates, doctors and researchers can now test specifically for the presence of heroin in the body.

Notably, the test would also not come back positive for any prescription painkillers, which is both an advantage and a disadvantage of the new screen. For those who are legitimately prescribed these medications, there would be no more worrying about a suspicious positive result. However, the test would also fail to identify the more than 12 million Americans who are using these drugs without a prescription. This is especially problematic because prescription painkillers have quickly surpassed all other types of drugs as the most common cause of overdose, accounting for more deaths in 2010 than cocaine and heroin combined and helping to make drug overdose the leading cause of injury death in the U.S.

The new test is still under investigation and isn’t perfectly refined: only 16 of the 22 known current heroin users in the study tested positive for the metabolite, a detection rate of roughly 73%. Still, it is a promising new avenue for researchers and medical screeners looking to identify the presence of heroin more accurately.
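
For the record, that detection rate is just a sensitivity calculation; a quick sketch, using only the two numbers reported above, makes the arithmetic explicit:

```python
# Sensitivity of the new metabolite screen, using the figures quoted above.
known_heroin_users = 22
flagged_by_test = 16

sensitivity = flagged_by_test / known_heroin_users
false_negative_rate = 1 - sensitivity
print(f"Sensitivity: {sensitivity:.0%}")             # ~73% of known users caught
print(f"False negatives: {false_negative_rate:.0%}") # ~27% would slip through
```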

As for Dave, he successfully completed the study without any other events, and he never ate another poppy seed before a session again.

Can synesthesia in autism lead to savantism?

I’ve got a new piece out on the Scientific American MIND blog network today on the fascinating link that’s been discovered between synesthesia – a “crossing of the senses” where one perceptual experience is tied to another, like experiencing sound and color together – and autism spectrum disorder.

Individuals with autism have significantly higher rates of synesthesia than the rest of the population, and the two conditions are potentially linked by a unique way in which the brain is wired. White matter tracts that traverse our brains, connecting one area to another, are thought to be increased in both conditions. This atypical wiring may result in more short-range connections but fewer long-distance ones, and it’s possible that these extra connections also contribute to some of the extraordinary cognitive abilities seen in autistic individuals with savant syndrome.

For more on the story, check out the full piece here.

Sweet dreams are made of cheese

You’re running down a hallway; running away from someone? Running towards something? Your feet start to lift off the ground and the ceiling opens up. You float higher and higher, and you get the feeling you’re not alone. You turn to your left and it’s Bob Dylan, laughing and calling you “Mr. Tambourine Man”. Suddenly the balloon you were holding onto, carrying you up into the sky, turns into a tangerine and you start to plummet back to earth. Just before you slam into the ground you awaken: sweaty, sheets twisted, wondering what the hell that was all about.

Dreams are weird. Especially if you’ve eaten a lot of cheese the night before.

Or so says the common myth. From Charles Dickens to Arab Strap, cheese dreams have been a part of popular culture for over 150 years. But is there actually any truth to this old wives’ tale?

A study conducted in 2005 by the British Cheese Board attempted to debunk the claim by giving 200 participants 20 grams (roughly 0.7 ounces) of cheese 30 minutes before bed and asking them to record their dreams and quality of sleep. In the study, 67% of participants recalled their dreams, and none reported any nightmares, something the Cheese Board called a win.

Instead of night terrors, the researchers report that the cheese resulted in pleasant nighttime fantasies for most individuals. They even went so far as to test the varying effects different types of fromage had on people’s dream states. According to their findings, blue Stilton produced the most bizarre trips, affecting about 80% of the participants who ate it and resulting in visions of talking animals, vegetarian crocodiles and warrior kittens. On the other end of the spectrum, Cheshire cheese produced the least memorable nights, with less than half of participants able to recall their dreams.

The study (again, initiated by the cheese industry) also claimed that eating cheese before bed actually helped people fall asleep. This is supposedly due to cheese’s relatively high content of tryptophan, an amino acid involved in the production of serotonin and melatonin, the latter of which plays an important role in our sleep-wake cycle.

However, it should be noted that there was no report of a control or placebo group in this experiment, such as participants who ate nothing, or who consumed a soy cheese sample (yum!), before bed. Thus, there’s no empirical evidence that it was actually the cheese causing these effects, rather than just these individuals’ natural sleep patterns.

As for the dream link, there is only one academic paper that mentions the cheese-dream phenomenon, and only anecdotally at that. However, one Internet theory I found (I know, I’m reaching here) proposes that the bacterial and fungal content of cheese, and of potent blue cheeses in particular, might be at the root of the increase in dream vividness. The idea is that compounds found in these microbes, like tryptamine or tyramine, might have psychoactive effects, influencing the brain’s chemical systems and thus our state of mind.

Tryptamine is a common chemical precursor for serotonin and other related alkaloids, some of which are involved in the hallucinogenic effects of psilocybin (“magic” mushrooms) and DMT. However, there’s no hard evidence that tryptamine is actually present in the Stiltons and Gorgonzolas of the world, and even if it were, it would be in extremely low doses. After all, when was the last time you felt high after eating cheese?

Tyramine, on the other hand, is a monoamine that works by triggering the release of other neurotransmitters, like adrenaline, noradrenaline and dopamine, into the body. Another theory is that tyramine’s effect on noradrenaline release in an area of the brain called the locus coeruleus, a region important in our sleep-wake cycle, alters our dream patterns.

Some antidepressants work by inhibiting the breakdown of monoamines (monoamine oxidase inhibitors, or MAOIs), and it can be dangerous to eat foods high in tyramine while on this medication, as it can result in an excess of these chemicals in your brain and body. The medication mentioned in that lone academic paper, pargyline hydrochloride, actually works as an MAOI, potentially explaining the bizarre effect eating cheese had on the patient. There are also reports of foods high in tyramine causing migraines in some individuals, particularly those on MAOIs; however, another study found no evidence of this link.

Finally, numerous other foods contain compounds like tyramine and tryptophan that affect our neurotransmitter systems, including cured meats, egg whites and soybeans, none of which have the dream-producing reputation of cheese. So for now, the link between cheese specifically and these nighttime apparitions appears untenable.

Then again, I did eat some cheddar last night, which might just explain Bob Dylan’s appearance in my nocturnal activities. According to the Cheese Board, cheddar was linked to visions of celebrities dancing in your head.

(Thanks to Sam Greenbury for the inspiration for this post.)

Is Oreo addiction a thing?

No. No it’s not.

I wrote last week on the idea of having an “addictive personality“. This was meant in the context of common drugs of abuse, like alcohol, cocaine or heroin, but what about addictions to things other than drugs? Like your iPhone. Or the internet. Or Oreos.

The idea of food addiction is not a new one, and I’ve written on this trend before, both on Brain Study and in real science journals. But a new study released last week takes this claim to a whole new (and unsubstantiated) level, claiming that Oreos – and especially that all-enticing creamy center – are as addictive as cocaine.

I’ve written a rant, published in The Guardian today, critiquing the study and explaining exactly what is so wrong with the research. I’ve also provided links to some much better articles on the topic.

Enjoy it with a cookie or two, but probably not a rice cake.

Thanks to Torey Van Oot for the tip on this study.

Do you have an addictive personality?

You’ll have to bear with me if this is a bit of a self-indulgent post, but I have some exciting news, Brain Study-ers: I’ve officially submitted my dissertation for a PhD in psychology!

In light of this – the culmination of three years of blood, sweat, tears and an exorbitant amount of caffeine – I thought I’d write this week about part of my thesis work. (I promise to do my best to keep the jargon out of it!)

One of the biggest questions in addiction research is why some people become dependent on drugs while others are able to use them in moderation. Certainly some of the risk lies in the addictive potential of the substances themselves, but the vast majority of individuals who have used drugs never become dependent on them. This leads to the question: is there really such a thing as an “addictive personality”, and what puts someone at greater risk for addiction if they do choose to try drugs?

We believe that there are three crucial traits that comprise much of the risk of developing a dependency on drugs: sensation-seeking, impulsivity and compulsivity.

Sensation-seeking is the tendency to seek out new experiences, be it traveling to exotic countries, trying new foods or chasing an adrenaline junkie’s thrill in extreme sports. People high in this trait are more likely to try psychoactive drugs in the first place, experimenting with different sensations and experiences.

Impulsivity, by contrast, is acting without considering the consequences of your actions. This is often equated with having poor self-control: eating that slice of chocolate cake in the fridge even though you’re on a diet, or staying out late drinking when you have to be at work the next day.

While impulsivity and sensation-seeking can be similar, and not infrequently overlap, they are not synonymous, and it is possible to have one without the other. For example, in research we conducted on the biological siblings of dependent drug users, the siblings showed elevated levels of impulsivity and poor self-control similar to that of their dependent brothers and sisters, but normal levels of sensation-seeking that were on par with unrelated healthy control individuals. This led us to hypothesize that the siblings shared a similar heightened risk for dependence, and might have succumbed to addiction had they started taking drugs, but that they were crucially protected against ever initiating substance use, perhaps due to their less risk-seeking nature.

The final component in the risk for addiction is compulsivity. This is the tendency to continue performing a behavior even in the face of negative consequences. The most classic example of this is someone with OCD, or obsessive-compulsive disorder, who feels compelled to check that the door is locked over and over again every time they leave the house, even though it makes them late for work. These compulsions can loosely be thought of as bad habits, and some people form these habits more easily than others. In drug users, this compulsive nature is expressed in their continued use of the substance, even though it may have cost them their job, family, friends and health.

People who are high in sensation-seeking may be more likely to try drugs, searching for that new exciting experience, but if they are low in impulsivity they may only use a couple of times, or only when they are fairly certain there is a small risk for negative consequences. Similarly, if you have a low tendency for forming habits then you most likely have a more limited risk for developing compulsive behaviors and continuing an action even if it is no longer pleasurable, or you’ve experienced negative outcomes as a result of it.

Exemplifying this, another participant group we studied was recreational users of cocaine: individuals who are able to take drugs occasionally without becoming dependent on them. These recreational users had similarly high levels of sensation-seeking as the dependent users, but did not show any increase in impulsivity, nor did they differ from controls in their self-control abilities. They also had low levels of compulsivity, consistent with their ability to use drugs occasionally without it spiraling out of control or becoming a habit.

We can test for these traits using standard questionnaires, or with cognitive-behavioral tasks, which can also be administered in an fMRI scanner to get an idea of what is going on in the brain during these processes. Behaviorally, sensation-seeking roughly equates to a heightened interest in reward, while impulsivity shows up as problems with self-control. As mentioned above, compulsivity is a greater susceptibility to the development of habits.
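
As an illustration of what a laboratory measure of self-control can look like, here is a minimal sketch of one widely used index of response inhibition, the stop-signal reaction time (SSRT), estimated with the standard integration method on made-up data. The post doesn’t specify which tasks we used, so treat this purely as an example of the genre:

```python
# Estimating stop-signal reaction time (SSRT) with the integration method.
# All numbers below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
go_rts = np.sort(rng.normal(450, 80, size=200))  # reaction times on "go" trials (ms)
mean_ssd = 250         # average delay before the stop signal appeared (ms)
p_fail_stop = 0.5      # proportion of stop trials where the response wasn't inhibited

# Take the go RT at the percentile matching the stop-failure rate, then subtract
# the average stop-signal delay; a shorter SSRT means faster, better inhibition.
nth_rt = np.percentile(go_rts, p_fail_stop * 100)
ssrt = nth_rt - mean_ssd
print(f"Estimated SSRT: {ssrt:.0f} ms")
```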

In the brain, poor self-control is most commonly associated with decreased control by the prefrontal cortex (PFC), the “executive” center of the brain. Reflecting this, stimulant-dependent individuals and their non-dependent siblings both showed decreases in prefrontal cortex volume, as well as impairments on a cognitive control task. Conversely, recreational cocaine users actually had an increase in PFC volume and performed no differently from controls on a similar task. Thus, it appears that some of these personality traits have underlying neural correlates.

It is important to remember that we all show flashes of these behaviors in differing amounts, and it is only at extremely high levels that these characteristics put you at greater risk for dependence. Also, crucially, it is not just one trait that does it, but having all three together. Most notably, though, neuroscience is not fatalistic: just because certain personality traits put you at an increased risk for a condition does not mean your behavior is out of your control.

Oh, and I’ll be going by Dr. D from now on.

Ersche KE, et al. Abnormal brain structure implicated in stimulant drug addiction. Science 335(6068): 601-604 (2012).

Ersche KE, et al. Distinctive personality traits and neural correlates associated with stimulant drug use versus familial risk of stimulant dependence. Biological Psychiatry 74(2): 137-144 (2013).

Smith DG, et al. Cognitive control dysfunction and abnormal frontal cortex activation in stimulant drug users and their biological siblings. Translational Psychiatry 3(5): e257 (2013).

Smith DG, et al. Enhanced orbitofrontal cortex function and lack of attentional bias to cocaine cues in recreational stimulant users. Biological Psychiatry, Epub ahead of print (2013).

What do your hands say about you?

When I tell people that I ‘do psychology’ I typically get one of three reactions. 1) People ask if I can read their thoughts. No, unless you’re a drunken guy in a bar, in which case, gross. 2) They begin to tell me about their current psychological troubles and parental issues, to which I listen sympathetically and then make it clear that I got into experimental psychology because I didn’t want to have to listen to people’s problems (sorry). Or 3) they ask me a very astute question about the brain that 9 times out of 10 I can’t answer. This last option is by far the most preferable and I’ve had several very interesting conversations come out of these interactions.

One such question I received recently was where handedness comes from in the brain. While it initially seems like a basic question, I quickly realized that I had no idea how to answer it without dipping into pop-psychology tropes about right- and left-brained people, tropes I definitely wanted to validate before trotting them out.

So what exactly is handedness? Does it really reflect differences in dominant brain hemispheres? Is it innate or learned, and what happens if you switch back and forth? Can a person truly be ambidextrous?

Handedness may indeed relate to your so-called ‘dominant’ hemisphere, with the majority of the population being right-handed and thus ‘left-brained’ (each hemisphere controls the opposite side of the body in terms of motor and sensory abilities). The dominant side of the body is, by definition, quicker and more precise in its movements. This preference is present from birth and is then ingrained by your actions, such as the practice of fine motor skills like handwriting.

Going beyond basic motor differences, handedness has been loosely related to the more general functions of each brain hemisphere as well. The left hemisphere is typically associated with more focused, detailed and analytical processing, and this type of thinking may be reflected in the precision movements of the typically dominant right hand. Conversely, greater spatial awareness and an emphasis on systems- or pattern-based observations are thought to reside primarily in the right hemisphere. (I highly recommend Iain McGilchrist’s RSA animation on the divided brain for a great overview.) However, it is important to note that these types of thought and behavior are by no means exclusive to one hemisphere or the other, and the different areas of the brain are in constant communication with each other via signals sent through white matter tracts that traverse the brain, like the corpus callosum connecting the two hemispheres.

Contributing to the right-hand/left-brain theory, the left hemisphere is largely responsible for language ability, which has traditionally been used as another indicator of hemispheric dominance. It was initially thought that this control was reversed in left-handed people, with the right hemisphere in charge of verbal communication; however, it has since been shown that linguistic laterality doesn’t match up that neatly.

In the 1960s, a simple test was devised to empirically determine a person’s dominant hemisphere for language. This, the Wada test, involves injecting sodium amobarbital into an artery supplying one side of an awake patient’s brain, temporarily shutting down that hemisphere’s functioning. This allows neurologists to see which abilities remain intact, meaning they must be controlled by the opposite side. The test is especially important for patients undergoing neurosurgery, as ideally you would operate on the non-dominant hemisphere to reduce possible complications with movement, language and memory. The Wada test revealed that many left-handers are actually also left-hemisphere dominant for language, and that only in a small proportion does language reside in the right hemisphere. Still other left-handers share language abilities across the two hemispheres.

So where does this appendage preference come from? Handedness is thought to be at least partially genetic in origin, and several genes have been identified that are associated with being left-handed. However, there is evidence that it is possible to switch a child’s natural preference early in life. This often happened in cultures where left-handers were perceived as ‘evil’ or ‘twisted’, and attempts were made in schools to reform them, forcing them to act as right-handers. As mentioned above, when motor movements (and their underlying synaptic connections) are practiced, they become stronger and more efficient. Thus, individuals who were originally left-handed may come to favor their right hand, particularly for tasks like writing, because they were forced to develop these pathways in school. These same individuals may still act as left-handers for other motor tasks, simultaneously supporting both the nature and nurture aspects of handedness. Notably, this mixed-handedness is different from ambidexterity, as both hands cannot be used equally well for all actions. True ambidexterity is extremely rare and has been largely under-studied to date. However, it has been theorized that in ambidextrous individuals neither hemisphere is dominant, and in some cases this has been associated with mental health or developmental problems in children.

So the next time you meet a psychologist in a bar, instead of challenging them to guess what you’re thinking, ask them the most basic brain-related question you have. It will undoubtedly lead to a much better conversation!

(Originally posted on Mind Read)