Resisting temptation in the brain

Having spent the last three years studying how difficult it is to say no to our vices, and being intimately acquainted with all that can go wrong in fMRI research, I’m always a bit skeptical of studies that claim to be able to predict our capacity for self-control based on a brain scan. But a new paper out this week in Psychological Science seems to have done a pretty admirable job, tying our real-life ability to resist temptation with activity in two specific areas of the brain.

Researchers from Dartmouth College first tested 31 women on two different tasks: an assessment of self-control and a measurement of temptation. Using an fMRI scanner, they compared where the women’s brains lit up when they were stopping themselves from performing a certain action (pressing a button, to be exact), and when they were viewing images of ice cream, hamburgers, and other tasty treats. As expected, better performance on the response inhibition task was linked to activation in a part of the brain called the inferior frontal gyrus (IFG), a region in the frontal cortex known to be involved in inhibiting a response. Conversely, looking at pictures of chocolate and chicken sandwiches activated the nucleus accumbens (NAcc), a deeply rooted part of the brain that’s essential to feelings of reward.

So far, this is all pretty par for the course; you exert self-control, you activate your control center. Looking at something enticing? Your reward region is going to light up. Nothing new or ground-breaking (or even that useful, to be honest). But the researchers didn’t stop there. Instead, they took the study out of the lab to see what happened when the participants were faced with real-life temptations. Equipped with BlackBerry smartphones, the participants were prompted throughout the week with questions about how badly they desired junk food, how much they resisted these cravings, whether they gave in to their urges, and how much they ate if they did cave to temptation.

Comparing these responses to brain activity in the two target areas, the researchers discovered that the women who had the most activity in the NAcc while viewing images of food were also the ones who had the most intense cravings for these treats in real life. Additionally, these women were more likely to give in to their temptations when they had a hankering for some chocolate. On the other hand, those who had greater activity in the IFG during the inhibition task were also more successful at withstanding their desires — in fact, they were over 8 times more likely to resist the urge to indulge than those with less activity in the region. And if they did give in, they didn’t eat as much as those with a smaller IFG response.
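A figure like “8 times more likely to resist” typically comes from an odds ratio. As a toy sketch only — the counts below are hypothetical, not the study’s actual data — here is how such a number arises from tallies of resisted versus indulged cravings:

```python
# Toy illustration (hypothetical counts, NOT the study's data) of how an
# "8 times more likely to resist" figure arises as an odds ratio.

def odds(resisted, gave_in):
    """Odds of resisting a craving = times resisted / times gave in."""
    return resisted / gave_in

# Hypothetical real-world craving tallies for two groups:
high_ifg = {"resisted": 40, "gave_in": 10}   # strong IFG activation
low_ifg  = {"resisted": 10, "gave_in": 20}   # weak IFG activation

odds_high = odds(high_ifg["resisted"], high_ifg["gave_in"])  # 4.0
odds_low  = odds(low_ifg["resisted"], low_ifg["gave_in"])    # 0.5
odds_ratio = odds_high / odds_low

print(f"Odds ratio: {odds_ratio:.1f}")  # Odds ratio: 8.0
```

The point of the sketch is just that an odds ratio compares the *odds* of resisting between groups, which is not the same as saying one group resisted eight times as often.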

Having confirmed the link between activity in these areas and real-life behaviors, the next step is to figure out how to ramp up or tamp down activity in the IFG and NAcc, respectively. One technique that psychologists are exploring is transcranial magnetic stimulation, or TMS. This involves zapping a certain part of the brain with an electromagnetic pulse, trying to either stimulate or depress activity in that region. So far, the use of TMS in studies of addiction and eating disorders — attempting to enhance self-control and decrease feelings of craving — has met with limited success. Pinpointing exactly the right area through the skull and figuring out the correct stimulation frequency can be difficult, and a few studies have accidentally increased desire for the substance instead! Additionally, the effects are often temporary, wearing off a few days after the stimulation ends. Other studies have looked at cognitive training to try to enhance self-control abilities, particularly in children with ADHD, although these attempts have also varied in their success.

Beyond targeting certain psychiatric disorders or trying to get us to say no to that second (or third or fourth) cookie for reasons of vanity, there’s a push to enhance self-control from a public health standpoint. The authors of the current study cite the staggering statistic that 40% of deaths in the U.S. are caused by failures in self-control. That’s right, according to research, 40% of all fatalities are caused by us not being able to say no and partaking in some sort of unhealthy behavior, the biggest culprits being smoking and over-eating or inactivity leading to obesity. Clearly then, improving self-control is not only needed to help individuals on the outer edges of the spectrum resist temptation, it would benefit those of us smack dab in the middle as well.

Happy Friday!


Keeping hope alive: Brain activity in vegetative state patients

Thirteen-year-old Jahi McMath went into Oakland Children’s Hospital on December 9 for a tonsillectomy. Three days later she was declared brain-dead; severe complications from the surgery resulted in cardiac arrest and her tragic demise. While neurologists and pediatricians at the hospital have declared Jahi brain-dead, her family refuses to accept the doctors’ diagnosis, fighting to keep her on life support.

This heartrending battle between hospital and family is sadly not a new one, and there is often little that can be done to reconcile the two sides. However, neuroscientific research in recent years has made substantial progress in empirically determining whether there are still signs of consciousness in vegetative state patients. These findings can either bring hope to a desperate family or provide stronger footing for doctors trying to do the more difficult but often more humane thing.

In 2010, researchers at the University of Cambridge published a groundbreaking study in the New England Journal of Medicine that looked at brain activity in minimally conscious or vegetative state patients using fMRI. These patients were placed in the scanner and asked to imagine themselves in two different scenarios. In the first, they were instructed to envision themselves playing tennis and swinging a racket, which activates a motor region of the brain called the supplementary motor area. In the second, they were told to think of a familiar place and mentally walk around the room. This mental map lights up the parahippocampal gyrus, an area of the brain involved in spatial organization and navigation.

Five of the patients (out of 54) were able to consistently respond to the researchers’ requests, reliably activating either the supplementary motor area or the parahippocampal gyrus upon each instruction. Even more amazing, one of the patients was able to turn this brain activation into answers to yes-or-no questions. The patient was asked a series of autobiographical questions like “Do you have any siblings?” If the answer to the question was yes, she was instructed to “play tennis,” while if the answer was no, she was to take a mental stroll around the room. Remarkably, this individual was able to accurately answer the researchers’ questions using just these two symbolic thought patterns.

Building on this research, a new study by the same scientists published in November of this year in NeuroImage used EEG to measure electrical activity in the brain in an attempt to better assess consciousness in the same group of vegetative state patients.

A certain type of EEG brain wave, the P300, is generated when we are paying attention; and just as there are different kinds of attention (i.e., concentration, alertness, surprise), there are different P300 responses associated with each type. An “early” burst of P300 activity over frontocentral regions (the P3a) is externally triggered, such as when something surprising or unexpected grabs our attention. Conversely, delayed P300 waves over parietal regions (the P3b) are more internally generated, appearing when we are deliberately paying attention to something.

To test this, the Cambridge researchers hooked up the same group of minimally conscious patients to an EEG machine and had them listen to a string of random words (gown, mop, pear, ox). Sprinkled throughout these distractor stimuli were the words “yes” and “no,” and patients were instructed to pay attention only to the word “yes.” Typically, when someone performing this task hears the target word (yes), they show a burst in delayed P300 activity, signifying that they were concentrating on that word. Upon hearing the word “no,” however, participants often show early P300 activity instead: its association with the target word attracts their attention even though they were not explicitly listening for it.
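The logic of this kind of oddball analysis can be sketched with simulated data (this is an illustration only; the epoch length, sampling rate and amplitudes below are all made up): single trials are buried in noise, but averaging many epochs time-locked to each word reveals whether a late P300-like positivity follows the targets.

```python
# Minimal sketch (simulated data, not real EEG) of an oddball analysis:
# average many noisy epochs per condition, then compare the late window.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 0.8, 0.004)            # one 800 ms epoch at 250 Hz

def epoch(p300_amplitude):
    """One simulated trial: noise plus a late positive deflection ~450 ms."""
    wave = p300_amplitude * np.exp(-((t - 0.45) ** 2) / (2 * 0.05 ** 2))
    return wave + rng.normal(0, 2.0, t.size)   # noise swamps single trials

targets     = np.mean([epoch(5.0) for _ in range(100)], axis=0)  # "yes"
distractors = np.mean([epoch(0.0) for _ in range(100)], axis=0)  # filler words

# After averaging, the target waveform shows a clear late positivity
# that the distractor average lacks.
window = (t > 0.35) & (t < 0.55)
print(targets[window].mean() > distractors[window].mean() + 1.0)  # True
```

Real analyses add baseline correction, artifact rejection and statistics across electrodes, but the core move — averaging to pull a small evoked response out of large background noise — is the same.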

Similar to the first study, four of the participants exhibited brain activity indicating that they were able to successfully distinguish the target from the distractor words. This result suggests that these patients are aware and able to process instructions. Three of the four individuals also demonstrated the appropriate activation during the tennis test described above. However, it’s important to remember that in both of these studies only a very small minority of the patients were able to respond; the vast majority showed no evidence of consciousness during either task.

For the McMath family, studies such as these provide hope that their daughter is still somewhere inside herself, still able to interact with the outside world. But doctors fear this research may be misleading, as these results are by far the exception. Additionally, there is no evidence that this type of activity will result in any change in a patient’s prognosis. Finally, and most relevant to the current controversy, complete brain death, as in the case of young Jahi, is very different from a vegetative state or minimal consciousness; there is no recovery from brain death. Advancements in neuroscience have grown more and more incredible in the last decade, and our knowledge of the brain has increased exponentially, but there is still more that we do not know than what we do, and we are a long way off from being able to bring back the dead.

Also posted on Scitable: Mind Read

Do you have an addictive personality?

You’ll have to bear with me if this is a bit of a self-indulgent post, but I have some exciting news, Brain Study-ers: I’ve officially submitted my dissertation for a PhD in psychology!

In light of this – the culmination of three years of blood, sweat, tears and an exorbitant amount of caffeine – I thought I’d write this week on part of my thesis work (I promise to do my best to keep the jargon out of it!)

One of the biggest questions in addiction research is why some people become dependent on drugs while others are able to use them in moderation. Certainly some of the risk lies in the addictive potential of the substances themselves, but the vast majority of individuals who have used drugs never become dependent on them. This leads to the question: is there really such a thing as an “addictive personality”, and what puts someone at greater risk for addiction if they do choose to try drugs?

We believe that there are three crucial traits that comprise much of the risk of developing a dependency on drugs: sensation-seeking, impulsivity and compulsivity.

Sensation-seeking is the tendency to seek out new experiences, be it traveling to exotic countries, trying new foods or taking an adrenaline junkie’s interest in extreme sports. High sensation-seekers are more likely to try psychoactive drugs in the first place, experimenting with different sensations and experiences.

Impulsivity, on the other hand, is acting without considering the consequences of your actions. This is often equated with poor self-control – eating that slice of chocolate cake in the fridge even though you’re on a diet, or staying out late drinking when you have to be at work the next day.

While impulsivity and sensation-seeking can be similar, and not infrequently overlap, they are not synonymous, and it is possible to have one without the other. For example, in research we conducted on the biological siblings of dependent drug users, the siblings showed elevated levels of impulsivity and poor self-control similar to that of their dependent brothers and sisters, but normal levels of sensation-seeking that were on par with unrelated healthy control individuals. This led us to hypothesize that the siblings shared a similar heightened risk for dependence, and might have succumbed to addiction had they started taking drugs, but that they were crucially protected against ever initiating substance use, perhaps due to their less risk-seeking nature.

The final component in the risk for addiction is compulsivity. This is the tendency to continue performing a behavior even in the face of negative consequences. The most classic example of this is someone with OCD, or obsessive-compulsive disorder, who feels compelled to check that the door is locked over and over again every time they leave the house, even though it makes them late for work. These compulsions can loosely be thought of as bad habits, and some people form these habits more easily than others. In drug users, this compulsive nature is expressed in their continued use of the substance, even though it may have cost them their job, family, friends and health.

People who are high in sensation-seeking may be more likely to try drugs, searching for that new exciting experience, but if they are low in impulsivity they may only use a couple of times, or only when they are fairly certain there is a small risk for negative consequences. Similarly, if you have a low tendency for forming habits then you most likely have a more limited risk for developing compulsive behaviors and continuing an action even if it is no longer pleasurable, or you’ve experienced negative outcomes as a result of it.

Exemplifying this, another participant group we studied were recreational users of cocaine – individuals who are able to take drugs occasionally without becoming dependent on them. These recreational users had similarly high levels of sensation-seeking as the dependent users, but did not show any increase in impulsivity, nor did they differ from controls in their self-control abilities. They also had low levels of compulsivity, consistent with their ability to use drugs occasionally without having it spiral out of control or become a habit.

We can test for these traits using standard questionnaires, or with cognitive-behavioral tests, which can also be administered in an fMRI scanner to get an idea of what is going on in the brain during these processes. Behaviorally, sensation-seeking roughly equates to a heightened interest in reward, while impulsivity can be seen as having problems with self-control. As mentioned above, compulsivity is a greater susceptibility to the development of habits.

In the brain, poor self-control is most commonly associated with a decrease in prefrontal cortex control – the “executive” center of the brain. Reflecting this, stimulant-dependent individuals and their non-dependent siblings both showed decreases in prefrontal cortex volume, as well as impairments on a cognitive control task. Conversely, recreational cocaine users actually had an increase in PFC volume and behaved no differently from controls on a similar task. Thus, it appears that there are underlying neural correlates to some of these personality traits.

It is important to remember that we all have flashes of these behaviors in differing amounts, and it is only in extremely high levels that these characteristics put you at a greater risk for dependence. Also, crucially it is not just one trait that does it, but having all three together. Most notably though, neuroscience is not fatalistic, and just because you might have an increased risk for a condition through various personality traits, it does not mean your behavior is out of your control.

Oh, and I’ll be going by Dr. D from now on.

Ersche, KE et al., Abnormal brain structure implicated in stimulant drug addiction. Science 335(6068): 601-604 (2012).

Ersche, KE et al., Distinctive personality traits and neural correlates associated with stimulant drug use versus familial risk of stimulant dependence. Biological Psychiatry 74(2): 137-144 (2013).

Smith, DG et al., Cognitive control dysfunction and abnormal frontal cortex activation in stimulant drug users and their biological siblings. Translational Psychiatry 3(5): e257 (2013).

Smith, DG et al., Enhanced orbitofrontal cortex function and lack of attentional bias to cocaine cues in recreational stimulant users. Biological Psychiatry, Epub ahead of print (2013).

You are what you eat

Anyone who’s ever tried to cure the blues with Ben and Jerry’s knows that there is a link between our stomachs and our moods. Foods high in fat and sugar release pleasure chemicals like dopamine and opioids into our brains in much the same way that drugs do, and I’d certainly argue that french fries and a chocolate milkshake can perk up even the lousiest of days.

This brain-belly connection works in the opposite direction, too. Ever felt nauseous before giving a big presentation? Or had butterflies in your stomach on a first date? It’s this system relating feedback from your brain to your gut causing those sensations and giving you physical signals that something big is about to happen.

However, instead of trying to suppress those feelings (or running to the bathroom every five minutes), it now appears that we can use this brain-body loop to our advantage. In what is formally referred to as the microbiome-gut-brain axis, bacteria that live in our stomach and intestines can affect our responses to stress and anxiety, and research in recent years has shown that probiotic bacteria – like those found in many strains of yogurt – can help to reduce anxiety and elevate mood in addition to helping us “stay regular”.

Previous research has shown reduced fear and stress responses during anxiety-inducing tests in mice that were fed broth with an added probiotic. This included less freezing in the face of fear, greater exploration of new environments, and fewer indicators of depression during a behavioral despair test (cheerful, huh?). These chilled-out mice also had lower levels of corticosterone – a major stress hormone – after being tested, corroborating the behavioral findings.

Now, recent research from a team of doctors at UCLA’s School of Medicine – and *CONFLICT OF INTEREST ALERT* funded by Danone, the yogurt company – has for the first time provided support for this brain-stomach connection in humans. These researchers looked at the effect that eating yogurt (or, as they like to call it, a “fermented milk product with probiotic”) every day for four weeks had on neural responses to pictures of negative emotional faces. This type of task usually causes an increase in activity in emotion and somatosensory regions of the brain, like the amygdala and the insula, indicating an unpleasant or stressful reaction to the images. Compared to control individuals who had eaten a normal, non-probiotic fermented milk product, those who had eaten the probiotics had decreased activity in these brain areas, suggesting they were not as affected by the pictures.

Curiously though, there was no difference between the groups in the probiotic levels found in stool samples (yes, they tested their poop), and none of the participants reported feeling any changes in their levels of stress, anxiety or depression during the study. However, there were significant differences in brain activity between the groups while they were resting, including in the areas identified during the task. Altogether, it looks like even small amounts of probiotics (i.e., not enough to change your gut levels) can still have a significant effect on our brain activity, even without noticeably changing our moods.

This interaction between our guts and our gray matter is thought to be facilitated by the vagus nerve, which travels down from the base of the brain into the stomach, transmitting sensory information and chemical signals from the internal organs back up to the brain. Supporting this theory, when this nerve was cut in the earlier mouse study the positive effects of the probiotics disappeared, and the test mice were back to their normally anxious selves.

It doesn’t appear that non-fermented milk products have the same positive effects on the brain, so it looks like I’ll be switching my usual Ben and Jerry’s to frozen yogurt for the next few weeks while I finish writing up my PhD thesis. Maybe it’ll help with my growing “thes-ass” too!

(Originally posted on Mind Read)

(“Thes-ass” coinage credit to Anna Bachmann)

Beating the odds of addiction

An article I wrote for The Psychologist magazine based on my thesis research investigating risk and protective factors in drug dependence was published online this week.

This work all stems from a question that I (and countless others in the field) have: why are some people able to use illicit drugs without becoming dependent, while others seem to quickly succumb to addiction?

While we’re still far from answering this question definitively, my lab at Cambridge, headed by Dr. Karen Ersche, has some theories on why this might be the case.

For example, it appears that there are underlying traits, like impulsivity, compulsivity and sensation-seeking, that can put someone at a greater risk for developing drug dependence. Some of these traits also correspond to differences in brain structure and function, such as smaller frontal cortex volume potentially making it harder for people to stop or inhibit a behavior.

If you’re interested in reading more, a full link to the article is here (the magazine kindly made it available open access). So please check it out, and as always I welcome any questions or feedback!

Inside the mind of a criminal

On Law and Order: SVU, the story stops when the bad guy is caught. The chase is over, justice is served, the credits roll and we can all sleep easier at night knowing that Detectives Benson and Stabler have successfully put another criminal behind bars.

Of course in the real world, things are never that simple.

Our criminal justice system operates on the tenets of punishment and reform. You do the crime, you do the time — and ideally you are appropriately rehabilitated after paying penance for your sins. But unfortunately it doesn’t always work that way. Recidivism rates in the U.S. have been estimated at 40-70%, with most former inmates ending up back behind bars within three years of being released.

Parole boards make their decisions carefully, trying to weed out those whom they think are most likely to re-offend, and basing their decisions on the severity of the initial crime and the individual’s behavior while in jail. But clearly there is room for improvement.

A recent study by Dr. Eyal Aharoni and colleagues attempted to tackle this problem by using neuroimaging techniques to look inside the brains of convicted felons and using these scans to predict who is most at risk for re-offense. Their widely discussed findings show that a relative decrease in activation in the anterior cingulate cortex (ACC) during performance of a motor control task is related to a two-fold higher recidivism rate in the four years following release from jail.

However, this result should be taken with more than one grain of salt, as activation in the ACC has been linked to, well, pretty much everything.

In fact, a quick look at PubMed shows that there have been nearly 150 neuroimaging publications listing the ACC as a region of interest in the last six months alone! This includes papers on topics ranging from phobias to self-representation to physical pain. This implies that the ACC is involved in self-perception, fear, pain, cognition, decision-making, error monitoring, emotional processing and a host of other behaviors — not exactly a precise region, is it? (To be fair, damage to the ACC has previously been linked to increases in aggression, apathy and disinhibition.)

Additionally, while in the current study decreased activity in the area during response inhibition was related to a greater predicted risk of future re-offending, a large portion of the sample did not meet these predictions. In fact, 40% of participants with low ACC activity did not re-offend during the course of the study, and 45% of those with high activity did. Thus, while the difference in activation was a statistically significant predictor of the risk of re-offending, it certainly was not deterministic.
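A little arithmetic on the percentages quoted above makes the point concrete (the group sizes here are hypothetical, chosen only for illustration; the paper’s actual Ns differ):

```python
# Quick arithmetic on the quoted figures: 40% of the low-ACC group did
# NOT re-offend, and 45% of the high-ACC group DID. Group sizes below
# are hypothetical (assumed equal), purely for illustration.
low_acc_n, high_acc_n = 50, 50
low_acc_reoffend  = 0.60 * low_acc_n    # 60% of low-ACC re-offended
high_acc_reoffend = 0.45 * high_acc_n   # 45% of high-ACC re-offended

# Suppose we "predict re-offense" for everyone with low ACC activity:
true_pos  = low_acc_reoffend                   # correctly flagged
false_pos = low_acc_n - low_acc_reoffend       # wrongly flagged
precision = true_pos / (true_pos + false_pos)

print(f"Precision of a low-ACC 'will re-offend' rule: {precision:.0%}")  # 60%
```

In other words, a scan-only rule would be wrong about 4 in 10 of the people it flags, which is exactly why the authors insist the measure be combined with other factors.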

Fortunately, the authors acknowledge many of the study’s shortcomings and note that the results should be interpreted carefully. Most notably, they state that the findings should only be taken into consideration alongside a variety of other personal and environmental factors, most of which are already used in sentencing and parole decisions. For example, other significant predictors of re-offense include the individual’s age and their score on a widely administered test of psychopathy.

There are also two different ways to look at and interpret these results. On the one hand, they could be used in an attempt to exonerate or reduce sentences for men who supposedly can’t control their actions due to low brain activity. Alternatively, these scans could be used to potentially block the granting of parole to inmates who show particularly suspicious brain activation. If criminals with low ACC activity are more likely to commit future crimes, then the logic goes that they should be locked up longer — even indefinitely — to prevent them from offending again. But then where does this line of thinking end?

Do we really want to let people off because their brains “made them do it”? And conversely, just because a couple of blobs on a very commonly activated part of the brain are lighting up differently, is this a good reason to keep someone locked up longer? What about redemption? What about a second chance? What about free will?

As the fields of neuroscience and off-shoots like neuro-law progress, these questions will become more and more important; and the potential for a police state more reminiscent of Minority Report than Law and Order becomes frighteningly real. Therefore, it is all of our responsibility to think critically about results such as these and not be swayed by the bright lights and colored blobs.

(Originally posted on Mind Read)

Billions of dollars to map billions of neurons

A lot of money is being spent right now to ‘map the human brain’. In the last month, both the European Commission and U.S. president Barack Obama have pledged to give billions of dollars to fund two separate projects geared towards creating a working model of the human brain, all 100 billion neurons and 100,000 billion synapses.

The first, the Human Brain Project, is being spearheaded by Prof. Henry Markram of the École Polytechnique Fédérale de Lausanne. Together with collaborators from 86 other European institutions, his team aims to simulate the workings of the human brain using a giant supercomputer.

To achieve this, they will compile information about the activity of vast numbers of individual neurons and neuronal circuits throughout the brain into a massive database. They then hope to integrate the biological actions of these neurons to create theoretical maps of different subsystems, and eventually, through the magic of computer simulation, a working model of the entire brain.

Similarly, the Brain Activity Map Project, or BAM! (exclamation added because it’s exciting), is a proposed initiative that would be organized through the United States’ National Institutes of Health and carried out in a number of universities and research institutes throughout the U.S. BAM will attempt to create a functional model of the brain – a ‘connectome’ – mapping its billions of neuronal connections and firing patterns. This would enable scientists to create both a ‘static’ and ‘active’ model of the brain, mapping the physical location and connections of these neurons, as well as how they work and fire together between and within different regions. At the moment, we have small snap-shots into some of these circuits but on only a fraction of the scale of the entire brain. This process would first be done on much smaller models, such as a fruit fly and a mouse, before working up to the complexities of a human brain version.

BAM proposes to create this model by measuring the activity of every single neuron in a circuit. At the moment, this is done using deep brain techniques, a highly invasive process that involves opening up the skull to implant electrodes onto individual cells to read and record their outputs. Understandably, this is only done in patients already undergoing brain surgery, and is a slow and expensive process. Thus, the first task of BAM would be to develop better techniques to acquire this information. Research into this field is already underway, and exciting proposals have included nanoparticles and lasers that could measure electrical outputs from these cells less invasively, or even using DNA to map neural connections.

Neither project has directly acknowledged the other, but it is thought that the recent announcement of the U.S. proposal is a response to the initial European scheme launched earlier this year. And while there are distinct differences between the two initiatives in how they will acquire and store the raw information, as well as how they plan to build their subsequent models, the two projects overlap significantly. Both have the potential to better illuminate how exactly the brain works, and each ultimately hopes to provide us with a clearer picture of not only normal brain functioning, but also what happens when these processes are disrupted. Scientists and doctors could then use computer models to simulate dysfunction involved in neurological or psychiatric disorders, such as Alzheimer’s or schizophrenia. This would also open up possibilities for investigating better treatment options, as well as drastically cutting down on the expense and risk currently involved in clinical drug trials for psychiatric and neurological disorders.

However, there is a long list of obstacles these projects must overcome before we get too excited, not the least of which are the 100,000,000,000,000 connections that need to be measured and modeled. That’s over one million times as many neurons as there were genes to map in the Human Genome Project, the closest approximation to the current endeavors. Additionally, while there was a clear end to the human genome, the ambition of making a human connectome is both much larger and much less well-defined. Indeed, neither proposal yet has a definitive end-goal, and no one is clear on what the final product will look like.
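The “one million times” comparison checks out on the back of an envelope (the gene count is an assumption here: the Human Genome Project ultimately identified roughly 20,000–25,000 protein-coding genes):

```python
# Sanity check on the scale comparison above. The gene count is an
# assumed round figure (~20,000 protein-coding genes from the HGP).
neurons = 100e9          # ~100 billion neurons in a human brain
genes = 20_000           # assumed HGP protein-coding gene count

ratio = neurons / genes
print(f"{ratio:,.0f}x as many neurons as genes")  # 5,000,000x as many neurons as genes
```

And that is just the neurons; the 10^14 synaptic connections to be mapped are another three orders of magnitude beyond that.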

For the Human Brain Project, the collaboration of over 80 different labs across Europe will also be a significant challenge. By collaborating rather than competing, the capacity for productivity and innovation in this and future projects is far higher. However, it will be extremely difficult to manage differences in laboratory methods and communication, not to mention egos, between these institutions.

Another major concern for the American proposal is funding. With the financial crisis, fiscal cliff and federal sequestration of recent months, the U.S. economy (and Congress) does not have a very good track record at the moment. And it is hard to believe that Congress is going to approve a multi-billion dollar project when it cannot even agree to continue funding for health care, education and military spending. Private companies including Google and Microsoft, as well as charities such as the Howard Hughes Medical Institute and the Allen Institute for Brain Science, have signed on to the project, but the bulk of funding will still have to be provided by government institutions.

In his State of the Union address, President Obama alluded to the Brain Activity Map Project, and tried to head-off the inevitable financial protests to it by invoking the Human Genome Project, which cost $2.7 billion to complete but has reportedly produced a return of $140 to every dollar spent. This was manifested through pharmaceutical and biotechnology developments, as well as subsequent start-up companies. This turnover has the potential to grow even further through future reductions in health care spending from medical developments, and the hope is that BAM will produce similar high returns. However, the question remains as to whether this investment could be better spent elsewhere, such as improving the medical system, research for drug treatment developments, or health education and prevention programs. Some in the scientific community are also worried that already limited funding to other fields of research will be slashed in order to subsidize the project.
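Taking the reported figures at face value, the implied return is easy to compute:

```python
# Back-of-envelope check on the quoted Human Genome Project figures:
# a $2.7 billion project returning $140 per dollar spent implies a
# total economic impact in the hundreds of billions.
cost_usd = 2.7e9            # reported HGP cost
return_per_dollar = 140     # reported return per dollar spent

total_return = cost_usd * return_per_dollar
print(f"Implied total return: ${total_return / 1e9:.0f} billion")  # Implied total return: $378 billion
```

Whether a connectome would pay off at anything like that rate is, of course, exactly the open question.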

Despite these concerns, it is undeniable that if these programs were to succeed they would be spectacular achievements in scientific research, not unlike the discovery of the Higgs boson or the first space expeditions of the 1960s. Many believe that the human brain is the final frontier of medical research, and it remains to be seen whether these brain-mapping projects will enable us to finally understand the wild and intricate workings of our own minds.

(Originally posted on King’s Review)

(And an updated version has been published on The Atlantic)