External (and internal) influences on decisions

We like to think that we are in control of our decisions. Yet evidence from neuroeconomics and marketing studies shows that many of the decisions we make in our day-to-day lives have less to do with deliberate personal choice than we would like to think; instead, we are easily swayed by internal visceral states and by external suggestions and primes.

According to Martin Lindstrom, author of Brandwashed, many of the decisions we make, particularly in supermarkets and other shopping situations, are shaped by manipulations devised by marketing executives. Whole Foods and other supermarkets prime us to shop by arranging their stores, displays, and prices in ways that make us perceive their products in a particular manner. They fill their stores with flowers, particularly right at the entrances, connoting freshness and evoking thoughts of newly picked produce straight from the fields, when in fact many of these products have been sitting in warehouses for weeks. They also display items packed unnecessarily in ice or sprayed with water, again assuring us of their freshness and vitality. These manipulations do nothing for the products themselves, but they change our perception of them and, therefore, our willingness to pay.

Bodily states can also alter our decision-making processes and preferences. Studies investigating the effect of visceral states on decision-making have shown that hungry people have a greater desire not only for food but also for money. Fasted individuals also make riskier bets on financial decision-making tasks involving lottery choices, opting for the riskier option significantly more often when fasted and choosing the safer bet when full. This finding is echoed in the animal literature, where animals are risk-averse when sated but risk-seeking when hungry, presumably an evolutionarily selected trait: exploration and risk-taking during hunger can lead to the acquisition of new food sources.
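To make the risk contrast concrete, here is a minimal sketch of the kind of lottery choice such tasks use; the dollar amounts and the simple power-utility model are illustrative assumptions on my part, not details from the studies.

```python
# Hypothetical lottery trial: a sure $5 versus a 50% chance of $12.
# The gamble has the higher expected value ($6 vs. $5), yet a
# risk-averse chooser can still prefer the sure thing.
SAFE_AMOUNT = 5.0
RISKY_AMOUNT = 12.0
RISKY_PROB = 0.5

def utility(x: float, alpha: float) -> float:
    """Power utility: concave (risk-averse) for alpha < 1, linear at 1."""
    return x ** alpha

for alpha in (1.0, 0.5):  # risk-neutral vs. risk-averse chooser
    eu_safe = utility(SAFE_AMOUNT, alpha)
    eu_risky = RISKY_PROB * utility(RISKY_AMOUNT, alpha)
    choice = "risky" if eu_risky > eu_safe else "safe"
    print(f"alpha={alpha}: EU(safe)={eu_safe:.2f}, EU(risky)={eu_risky:.2f} -> {choice}")
```

On this toy account, fasting corresponds to a utility curve closer to linear, tipping choices toward the gamble.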

A similar “state of urgency” might be expected in situations where people need to use the restroom, pushing them toward immediate satisfaction over long-term outcomes. However, in a clever study published last year in Psychological Science (and a recent recipient of an Ig Nobel Prize), individuals with a full bladder actually chose the delayed reward more often than the instant gratification.

Led by Mirjam Tuk, researchers in the Netherlands had participants consume either 700 ml or 50 ml of water and then complete a delay discounting task. The task involved binary decisions between two set options: a smaller reward that participants could receive immediately, and a reward of greater magnitude they would receive after a certain period of time. Participants also indicated how badly they needed to urinate, on a scale from “very urgently” to “not urgently at all.” Individuals who had consumed the larger amount of water and reported a greater urgency to urinate chose the delayed option more often than those who had received the smaller serving.
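The paper itself doesn't specify a valuation model, but behavior on tasks like this is conventionally described with hyperbolic discounting, in which a reward of amount A delayed by D is worth V = A / (1 + kD) right now. A sketch with hypothetical amounts:

```python
def discounted_value(amount: float, delay_days: float, k: float) -> float:
    """Hyperbolic discounting: present subjective value of a delayed reward."""
    return amount / (1.0 + k * delay_days)

# Hypothetical trial: $16 now versus $30 in 35 days.
IMMEDIATE, DELAYED, DELAY_DAYS = 16.0, 30.0, 35

for k in (0.05, 0.01):  # higher k = steeper discounting = more impulsive
    now_worth = discounted_value(DELAYED, DELAY_DAYS, k)
    choice = "delayed" if now_worth > IMMEDIATE else "immediate"
    print(f"k={k}: $30 in 35 days is worth ${now_worth:.2f} now -> choose {choice}")
```

In these terms, Tuk's full-bladder group behaved as if it had a lower k.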

The researchers hypothesized that this was because bladder control requires deliberate inhibitory effort, which then promotes inhibition and self-control in other aspects of life. The authors call this idea “inhibitory spillover”: control exerted in one domain, prompted by a visceral state, leaks over into others. This observation contradicts the alternative theory of self-control known as ego depletion, which posits that restraint is a limited personal resource, depleted by exercising restriction in one area and thereby allowing lapses of control elsewhere.

These studies provide evidence that we should be aware of our surroundings and current physical and mental states when making important decisions, particularly concerning money. Clearly we do not live or function in a vacuum, nor do we make our decisions in one. Being mindful of the subconscious influences we are exposed to, both internally and externally, can help us make better decisions with a clearer mind and less biased approach.


Did I do that? Reality monitoring in the brain

Most of us have no problem telling the real from the imagined. Or so we think.

Reality monitoring, the ability to distinguish internal thoughts and imaginings from external experiences and memories, typically happens seamlessly. Still, there are times when we cannot recall whether someone told us about that interesting article or we read it ourselves, or whether we locked the door before leaving the house. Did we actually do or hear these things, or did we only imagine them? This is a common problem for patients with schizophrenia, who at times cannot distinguish between what they think they remember or believe to be true and what actually occurred.

A new study on reality monitoring published last week in the Journal of Neuroscience reveals that many of us are not as good at making this distinction as we might think. Moreover, the ability to discern between perceived and imagined events may be rooted in one very specific brain structure, which nearly 30% of the population lacks. Led by Marie Buda and Dr. Jon Simons, researchers at the University of Cambridge* administered a very particular type of memory test to healthy participants who had been pre-selected based on the prominence of the paracingulate sulcus (PCS) in their brains. Running rostrocaudally (front to back) through the anteromedial prefrontal cortex (toward the front and middle of the brain), this region is involved in higher-level cognitive functioning and is one of the last parts of the brain to mature. Consequently, it can be relatively underdeveloped or even seemingly absent in many people. This is particularly the case in individuals with schizophrenia, as many as 44% of whom lack the sulcus.

Participants for the current study were chosen from a database of individuals who had previously undergone an MRI scan and clearly showed a presence or absence of the PCS in one or both hemispheres. The memory task involved a list of common word pairs such as “yin and yang” or “bacon and eggs”. The words were either presented together (perceive condition), or only one word was presented and the participant had to fill in the complementing word (imagine condition). The second dimension of the experiment was the source of this information, i.e. whether the subject or the experimenter was the one to read off or verbally complete the pair. After the task, the subject was asked to report whether each pair was fully perceived or imagined, and whether this information came from themselves or the experimenter. They were also asked to rate their confidence in both of these responses.
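As a sketch of the resulting design (the stimuli and counts here are illustrative, not the study's actual materials), each trial falls into one cell of a 2 x 2 layout crossing how the pair was completed with who completed it:

```python
import itertools
import random

# Illustrative word pairs; the study used common phrases like these.
PAIRS = [("yin", "yang"), ("bacon", "eggs"), ("salt", "pepper")]

# 2 x 2 design: perceive/imagine crossed with self/experimenter.
CONDITIONS = list(itertools.product(["perceive", "imagine"], ["self", "experimenter"]))

trials = []
for first, second in PAIRS:
    mode, source = random.choice(CONDITIONS)
    shown = f"{first} and {second}" if mode == "perceive" else f"{first} and ___"
    trials.append({"shown": shown, "mode": mode, "source": source})

# At test, the participant reports both judgments, each with a confidence
# rating: was the pair perceived or imagined, and who completed it?
for trial in trials:
    print(trial)
```

Scoring the two judgments separately is what allowed the authors to show that only source attribution, not the perceive/imagine distinction, differed between groups.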

Participants with a complete absence of the PCS in both hemispheres performed significantly worse on the reality monitoring task than individuals who exhibited a definite presence of the sulcus. The difference lay in their source attribution memory (themselves vs. the experimenter); performance on the perceive/imagine judgment did not differ between the groups. Interestingly, the two groups also did not differ in confidence. Thus, even though the PCS-absent group performed significantly worse at attributing the source of the information, they were just as confident in their answers as individuals who responded correctly, indicating a lack of introspective awareness of their own memory abilities in the PCS-absent group.

It should be noted that there was also a correlation between overall gray matter volume in the prefrontal and motor cortices and scores on the reality monitoring task. This matters because it may indicate that regions beyond the PCS are involved in the process, and the authors caution that the enhanced ability may stem from an increase in gray matter and connectivity in the medial prefrontal cortex rather than from the PCS itself.

These findings could have useful applications in clinical psychiatry. As stated above, impaired reality monitoring is often associated with schizophrenia, and the absence of the PCS could serve as a potential biomarker for the disorder. Additionally, although not commonly discussed in terms of reality monitoring, another psychiatric diagnosis that could benefit from this type of research is obsessive-compulsive disorder (OCD). OCD often involves obsessions and the urge to compulsively check things, such as whether one remembered to turn off the stove. This ruminating and checking behavior could indicate a breakdown in reality monitoring, in which patients cannot determine whether a target action actually occurred. While this problem does not affect all OCD patients, impaired reality monitoring could be a worthwhile area to investigate in those for whom checking is a significant problem.

*Disclaimer: Marie Buda and Jon Simons are fellow members of the Department of Experimental Psychology at the University of Cambridge with me.

The cookie monster in all of us: Sugar addiction

Having quite a large sweet tooth myself, I feel a bit hypocritical writing on the subject of excessive sugar consumption. However, the importance of this message, not to mention the sheer fascinating nature of the topic, has prompted me to press on.

There are reasons to believe that certain hedonic foods, namely those high in sugar, can qualify as substances at risk for addictive abuse, not just colloquially but in the clinical sense of the term. Psychologists at Princeton University, led by Dr. Nicole Avena, study sugar addiction using a sucrose solution in rats, and they have discovered several similarities between the neurochemical effects of sugar and those of addictive drugs on the brain.

In this research, the scientists kept rats on an intermittent feeding schedule, depriving them of food for 12 hours before allowing them access to a sucrose solution in addition to their regular chow. After a month on this schedule, the rats began to show binge-, craving-, and withdrawal-like behaviors in response to the sucrose, whereas rats that had received only the chow on the intermittent schedule, or that had unrestricted access to food and sucrose, did not. The scheduling of sucrose availability is crucial, as it prompts the rats to develop binge-like behaviors and resembles the schedule used to produce cocaine dependence in animals: after being deprived of the food or drug, the animal will self-administer extremely large quantities of the substance once it becomes available again. These binges taper off once the animals are sated but consistently recur after each period of deprivation.
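Schematically, the design contrasts three feeding regimes; this listing is only a sketch of the groups as described above, not the paper's full protocol.

```python
# Daily regimes in the intermittent-access design: 12 hours of food
# deprivation, then 12 hours of access to whatever is listed.
GROUPS = {
    "intermittent sucrose + chow": {"deprivation_h": 12, "access": ["sucrose", "chow"]},
    "intermittent chow only":      {"deprivation_h": 12, "access": ["chow"]},
    "ad libitum sucrose + chow":   {"deprivation_h": 0,  "access": ["sucrose", "chow"]},
}

# After roughly a month, only the first group shows binge-, craving-,
# and withdrawal-like behavior toward the sucrose.
for name, regime in GROUPS.items():
    print(f"{name}: {regime}")
```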

As a result of these sucrose binges, adaptations similar to those seen in cocaine-addicted animals appeared in the rats’ brains. The binges triggered surges of dopamine in the nucleus accumbens, a key feature of drug addiction and a region linked to feelings of reward and novelty. Over time, changes occur in these dopamine receptors, increasing the sensitivity of some types (D1 and D3) while decreasing the sensitivity and overall levels of others (D2). As a consequence, larger doses of the substance are required to achieve the same level of arousal, and the region becomes less sensitive to other kinds of natural rewards. This occurs after abuse of both sugar and drugs and appears to be linked to the bingeing nature of compulsive consumption.

An increase in craving was also seen in the test animals, demonstrated by increased sucrose-seeking in rats deprived of the solution. This behavior was resistant to extinction, persisting more than a month after the last exposure to sucrose, and was maintained in the face of adverse consequences, a principal criterion for compulsive behaviors.

In addition to craving, the researchers found that the rats seemed to go through withdrawal-like symptoms when deprived of sugar, including tremors, head shakes, and signs of anxiety and aggression, all symptoms seen in animals undergoing opiate withdrawal. These effects stem from a crash in striatal dopamine accompanied by decreased opioid receptor activation, as well as an increase in acetylcholine, a neurochemical that has been linked to depression. One explanation for the similarity between sugar and opiate withdrawal is that both substances activate the opioid system. High-fat, high-sugar foods stimulate the release of endogenous opioids in the brain, presumably because of their rewarding and pleasurable characteristics, producing effects similar to the administration of synthetic opiates, though on a much smaller scale.

This overlap between opiates and sugary foods has also been seen in abstinent heroin users, who often quickly develop a strong affinity for sweets while in recovery. Candy and junk food come to replace heroin as recovering users’ preferred drug of abuse, indirectly activating the opioid system, and ex-heroin users are known to hoard, binge, and go to extremes to seek out sweets after going off drugs. This phenomenon is referred to as “consummatory cross-sensitization,” and it occurs when dependence upon one substance leads to the rapid acquisition of increased intake or dependence on another. It suggests that the brain and these neurotransmitter receptors can become primed after frequent activation, and will, therefore, become increasingly responsive to other sorts of excitatory stimuli.

After the New York Times’ recent embarrassing op-ed on “iPhone addiction” and the resulting backlash, I’m reluctant to use the term in reference to anything other than drugs. However, given the evidence above, I believe there is a convincing argument that our brains and bodies process large quantities of sugar in much the same way as drugs of abuse. This does not mean that anyone who eats a slice of cake will develop sugar cravings, just as not everyone who takes drugs becomes addicted to them. But for individuals who suffer from binge eating disorder or other eating problems, their brains and bodies may have adapted in ways that put them at a disadvantage when trying to cut back on unhealthy foods or lose weight.