BALTIMORE, July 10 (UPI) -- A U.S. study suggests that many of the genetic bits and pieces driving evolution confer no advantage or disadvantage on humans or other animals.
"For a long time, the basic belief of evolution was that all random genetic changes that manage to stick around have some selective advantage," Johns Hopkins University Associate Professor Nicholas Katsanis said. "But our work adds to the case that frequently, we are what we are largely due to random changes that are completely neutral."
Katsanis, however, doesn't discount the role of natural selection -- the persistence of genetic changes that confer some advantage.
"What this study does is to reinforce and highlight the equal, and in some cases greater, importance of neutral genetic drift."
Katsanis and colleagues demonstrated DNA repeat elements -- making up more than 40 percent of the human genome -- do not offer any benefits. Repeat elements consist of the same sequence of chemical base pairs repeated several hundred times.
The study that included Johns Hopkins co-authors Adrian Gherman, Peter Chen, Tanya Teslovich, Carl Kashuk, Aravinda Chakravarti and David Cutler, along with Pawel Stankiewicz, Marjorie Withers and James Lupski of the Baylor College of Medicine, appears in the online journal PLoS Genetics.
13 July 2007 Evolution And The Hive Mind By Rusty Rockets
Chat to any god-botherer about evolution and they'll tell you that it doesn't exist, that there's no such thing. They'll pay lip-service to the scientific elbow-grease that has gone into evolutionary research, but they'll argue that the end product is only ever a "theory." Overlooking the obvious confusion over the term "theory," what these folks are really saying is that there is no physical evidence for natural selection. "Where are the transitional fossils?" they ask. "Where are the intermediate creatures?"
It's no wonder that so many (mostly American) people still doubt evolution; they just don't know what it means. Their eyes start to glaze over when biologists use terms like alleles, macro/microevolution, genetic drift, and speciation. What the evolutionary naysayers want is some definitive bit of evidence that would prove evolution once and for all – a smoking gun.
Unfortunately, evolutionary timescales don't easily work in science's favor; we can't see a limb turn into a wing overnight. But thanks to new analysis techniques, we can actually document natural selection in its slow-and-steady progress.
Biologists at Cornell University, who have been studying genetic changes occurring in the human genome over the last 15,000 to 100,000 years, have found that over this relatively short period of time as much as a staggering 10 percent of the human genome shows signs of recent selection. "We undertook a very careful study of genetic differences within and among major human groups, and aimed to explain why certain parts of the genome differed," says lead author Scott Williamson, an assistant professor of computational biology. "We aimed to eliminate as many possible confounding variables as possible, and when all is said and done, we find that as much as 10 percent of the genome may have been affected by one of these bouts of recent selection." These changes amount to natural selection at work: the adaptations needed for survival.
The Cornell study goes a long way toward identifying the small, gradual changes (microevolution) that underlie species divergence from a common ancestor millions of years ago (macroevolution). It makes many human-to-human comparisons throughout the complete human genome, rather than comparing a human to mice or chimpanzees. In this way, it can show how we humans have been changing over time, due to our ancestors being exposed to – among other selective pressures – different climates as they spread across the globe.
One example of the type of change humans have undergone is our tolerance to lactose, or lack thereof in the case of our dairy-challenged ancestors. Lactose is a sugar found in milk, and prior to the domestication of animals humans did not have the capacity to digest it after infancy. Some time after humans began migrating and domesticating animals, we developed a genetic variant that allowed us to tolerate consuming milk into adulthood. "As humans have populated the world, there has been strong selective pressure at the genetic level for fortuitous mutations that allow digestion of a new food source or tolerate infection by a pathogen that the population may not have faced in a previous environment," explains Williamson.
Now that scientists are readily identifying genomic changes due to selective pressures, what's next? Would it be too far-fetched to suggest that social pressures could affect brain function at a genetic level? At least one study has identified collective behavioral differences between Western cultures like the United States and collectivist ones like China, possibly suggesting the beginning of brain divergence among humans.
The study, from the University of Chicago, makes the claim that people living in the United States have difficulty taking another person's point of view, which the researchers put down to US culture prizing individualism. They say that in China, where a collectivist attitude is encouraged, quite the opposite is true, with Chinese citizens being much more in tune with how others are thinking. As a result, the researchers argue that there may be more scope for communication confusion among Western citizens relative to citizens of China. "Members of these two cultures seem to have a fundamentally different focus in social situations. Members of collectivist cultures tend to be interdependent and to have self-concepts defined in terms of relationships and social obligations," says Boaz Keysar, a Professor in Psychology at the University of Chicago. "In contrast, members of individualist cultures tend to strive for independence and have self-concepts defined in terms of their own aspirations and achievements."
The team's conclusions may seem a little wayward – perhaps verging on cultural propaganda – but they are based on a very straightforward group cooperation experiment. Teams of Americans and Chinese were pitted against each other whilst manipulating objects on a grid, with one person from each team acting as the "director" and another as the worker, or "subject." Time and again the Chinese showed a clear advantage, the subject grasping the director's perspective, or vision, as though it were second nature. "Despite the obvious simplicity of the task, the majority of American subjects (65 percent) failed to consider the director's perspective at least once during the experiment, by asking the director which object he or she meant or by moving an object the director could not see," explained Keysar. By comparison, only one of the Chinese subjects appeared to flounder during the course of the experiment.
"Apparently, the interdependence that pervades Chinese culture has its effect on members of the culture over time, taking advantage of the human ability to distinguish between the mind of the self and that of the other, and developing this ability to allow Chinese to unreflectively interpret the actions of another person from his or her perspective," the study's authors concluded.
While studies with children have shown that the ability to appreciate another's point of view is universal, could cultural pressures evidenced by the University of Chicago study actually manifest as selective pressures? It was the brain that was to eventually separate us from our chimp cousins, so could yet more divergence emerge from the mysterious grey goop between our ears?
13 July 2007 Tourette’s Sufferers Enjoy Superior Grammar Skills
Children with Tourette's syndrome are much quicker at certain mental grammar skills than are children without the disorder, suggests a new study by researchers from the Georgetown University Medical Center.
"These children were particularly fast, as well as largely accurate, in certain language tasks. This tells us that their cognitive processing may be altered in ways we have only begun to explore, and moreover in a manner that may provide them with performance that is actually enhanced compared that of typically-developing children," reported the study's senior investigator, Michael Ullman, in the journal Neuropsychologia.
About 200,000 Americans have the most severe form of Tourette's syndrome, but as many as 10 percent of all Americans have a milder form. The most common initial symptom is a facial tic, and other tics - sudden, rapid, repeated movement or vocalization - may follow. Tics can include eye blinking, repeated throat clearing or sniffing, arm thrusting, kicking, shoulder shrugging or jumping. Coprolalia, the involuntary use of obscene words, is only rarely associated with Tourette's syndrome.
The disorder is linked to structural and functional abnormalities in the basal ganglia and frontal cortex area of the brain, which result in decreased inhibition of frontal activity, leading to hyperkinetic behaviors. The disorder is also associated with abnormalities in the way that chemical substances, such as hormones and neurotransmitters, mediate communication between cells.
Ullman explained that the two basic aspects of language - "rule governed" and "idiosyncratic" knowledge - depend on distinct neurobiological processes. Rule-governed knowledge involves the procedural memory system that depends on frontal/basal-ganglia area circuits in the brain; in language, it is used to combine parts of words together according to the grammatical rules of the language. In contrast, idiosyncratic knowledge depends on declarative memory, and is learned and processed in the hippocampus and other temporal lobe areas in the brain. This kind of memory allows us to learn that a word is linked to an object.
In this new study, eight children, aged 8-17, with Tourette's syndrome and eight typically developing children of the same ages without the disorder were given tasks that included producing past-tense forms. All of the children had a normal IQ. The investigators found that the children with Tourette's syndrome were significantly faster than the control group in producing rule-governed past tenses (like "slip" and "slipped") that depend on grammar and procedural memory but not in producing irregular or other unpredictable past tenses (such as "bring" and "brought") that are stored in declarative memory.
"This may mean that the brain abnormalities we see in Tourette's syndrome may lead not only to tics but also to a much wider range of unsuppressed and rapid behaviors," Ullman speculated. The researchers are now developing new language and memory tests for patients with Tourette's syndrome.
Source: University of Chicago Press Journals Date: July 15, 2007
Threats To Hope: Desperation Affects Reasoning About Product Information
Science Daily — When our hopes are threatened, we often turn to the marketplace for help. Can't fit into the gorgeous outfit you bought for your high school reunion? Trying to get pregnant? Want a bigger house but afraid you can't afford it? A new study by researchers from the University of Southern California argues that in situations like these, consumers are susceptible to "motivated reasoning." We believe what we want to believe about products that promise to help -- even if the arguments don't come from credible sources.
Hope is threatened when people lose confidence that what they yearn for is possible, and this loss may result in a range of seemingly irrational behavior. Specifically, consumers interested in products that purport to enable goal-attainment will:
- Search for information from product-favorable information sources (including advertisements)
- Regard favorable information as more credible
- Be less discriminating about low-credibility message arguments
- Be more likely to judge the product as effective
In the August issue of the Journal of Consumer Research, USC researchers Deborah J. MacInnis (Professor of Business Administration and Vice Dean of Research), David W. Stewart (Professor of Marketing and Chair, Dept. of Marketing), and their late colleague Gustavo De Mello outline three studies that demonstrate these phenomena.
For example, the first experiment asked ninety-nine undergraduates in the midst of mid-term exams to participate in a purportedly unrelated study, conducted by the Office of Student Affairs, in which they reported on a variety of things, including how confident they felt about getting good grades.
The students were then asked to participate in an "unrelated" study that asked them to evaluate a new product -- a memory booster. They were given background information from a manufacturer's brochure (a favorable source) or from a newspaper article (an objective source), which they could access by clicking on a computer screen.
While both confident students and insecure students accessed the same number of pieces of information before making a judgment on the product, insecure students searched for more information from the favorable source than confident students and ended up rating the memory booster as more effective.
Similarly, the researchers found in subsequent experiments that less confident students had a more difficult time than more confident students differentiating between credible claims and weak claims for products that claimed to help meet a goal.
"The role of confidence in directing behavior, though widely recognized in psychology, has received scant attention in the consumer behavior literature," write the authors. "Weight loss products, alternative medicines, and dietary supplements, are examples of product categories for which reduced confidence may be relevant and for which this illusion of control may be highly prevalent."
Reference: Gustavo De Mello, Deborah J. MacInnis, and David W. Stewart, "Threats to Hope: Effects on Reasoning about Product Information." Journal of Consumer Research: August 2007.
Child TV addicts 'are greedy and unhappy' By SEAN POULTER
Last updated at 22:00 on 15th July 2007
Television and the Internet are making children disruptive, disrespectful and greedy, government-backed research has found.
A study by the National Consumer Council will warn today: 'Those who spend lots of time in front of the TV and computer screen are more materialistic.
'These children argue more with their family, have a lower opinion of their parents and lower self-esteem.'
The research will make worrying reading for Gordon Brown, who has said one of his priorities is to challenge the 'erosion' of childhood.
He complained earlier this year that the Internet and TV had 'exposed children increasingly to the pressures of very aggressive advertising'.
The NCC report claims to have uncovered a divided society in which the influence of adverts is exerted unevenly across social groups.
The authors found that deprived children are more likely to watch commercial television - as well as programmes made for an older audience.
This means they are exposed to more adverts - and the ones they do watch may not be appropriate for their age group.
The report found: 'Just over half of children - 51 per cent - from disadvantaged areas think that "when you grow up, the more money you have, the happier you are". Similarly, almost half of children - 47 per cent - in deprived areas would "rather spend my time buying things than doing almost anything else".'
By contrast only 23 per cent of youngsters from affluent families believed that money was the key to happiness and that shopping was a good way to spend time.
The NCC report, called Watching, Wanting and Wellbeing: Exploring the Links, added: 'These stark variations show that in some households the screen appears to be ever-present, particularly during mealtimes.
'In disadvantaged areas, for example, children are six times more likely to watch TV during the weekday evening meal.
'Furthermore, around one in four in disadvantaged areas say they have the TV on at lunchtime on Sunday, compared to only one in 30 from the better-off neighbourhoods.'
The report could lead to Government action to extend controls on the advertising of junk food and other products - perhaps including a ban through to the 9pm watershed.
18 July 2007 Mirror Neurons Show Their Xenophobic Side
The infamous middle finger salute is recognized by nearly everyone in our society, but to someone from a foreign country, it may be incomprehensible - or even worse, interpreted as some kind of precopulatory come-on. Likewise, Americans are likely to be clueless about the common gestures of a different culture. Which raises an intriguing question — particularly in the context of recent findings that linked cultural differences between China and America to varying degrees of cooperative behavior — about how culture influences the human brain.
Istvan Molnar-Szakacs, a researcher at UCLA's groovily-named Center for the Biology of Creativity, and Marco Iacoboni, director of the scarily-named Transcranial Magnetic Stimulation Lab at UCLA, contend that culture does indeed influence how the human brain develops. Their research, appearing in the current issue of PLoS ONE, looked specifically at the imprinting effects of culture on the brain's mirror neuron network.
Mirror neurons "fire" when an individual performs an action, but they also fire when someone watches another individual perform that same action. Neuroscientists believe this mirroring is the neural mechanism by which we can "read" the minds of other people and empathize with them. When it comes to the influence of culture, they found that the mirror neuron network responds differently depending on whether we are looking at someone who shares our culture, or someone who doesn't.
The researchers used two actors, one American and the other Nicaraguan, to perform a series of gestures - American, Nicaraguan, and meaningless hand gestures - for a group of American subjects. Transcranial magnetic stimulation was then used to gauge levels of "corticospinal excitability" — a measure of mirror neuron activity. They found that the American participants demonstrated higher mirror neuron activity while observing the American actor making gestures than while observing the Nicaraguan. And when the Nicaraguan actor performed American gestures, the mirror neuron activation of the observers dropped.
"We believe these are some of the first data to show neurobiological responses to culture-specific stimuli," said Molnar-Szakacs. "Our data show that both ethnicity and culture interact to influence activity in the brain, specifically within the mirror neuron network involved in social communication and interaction."
"We are the heirs of communal but local traditions," added Iacoboni. "Mirror neurons are the brain cells that help us in shaping our own culture. However, the neural mechanisms of mirroring that shape our assimilation of local traditions could also reveal other cultures, as long as such cross-cultural encounters are truly possible. All in all, our research suggests that with mirror neurons our brain mirrors people, not simply actions."
According to the study, the neural systems supporting memory, empathy and general cognition encode information differently depending on who's giving the information — a member of one's own tribe, or an outsider. "An important conclusion from these results is that culture has a measurable influence on our brain and, as a result, our behavior. Researchers need to take this into consideration when drawing conclusions about brain function and human behavior," noted Molnar-Szakacs. The findings, the researchers note, may also have implications for motor skill and language learning, intergroup communication, as well as the study of intergroup attitudes toward other cultures.
Source: University of Michigan Health System Date: July 18, 2007
Great Expectations: Why The Placebo Effect Varies From Person To Person
Science Daily — Why do some people experience a "placebo effect" that makes them feel better when they receive a sham treatment they believe to be real -- while other people don't respond at all to the same thing, or even feel worse?
A new study from the University of Michigan Health System may help explain why.
Using two different types of brain scans, U-M researchers have found that the extent to which a person responds to a placebo treatment is closely linked to how active a certain area of their brain becomes when they're anticipating something beneficial.
Specifically, the research finds strong links between an individual's response to a placebo "painkiller", and the activity of the neurotransmitter known as dopamine in the area of the brain known as the nucleus accumbens. That's a small region at the center of the brain that's involved in our ability to experience pleasure and reward, and even to become addicted to the "high" caused by illicit drugs.
The new research, published in the July 19 issue of the journal Neuron, builds on research previously published by the same U-M team in 2005. That study was the first to show that just thinking a placebo "medicine" will relieve pain is enough to prompt the brain to release its own natural painkillers, called endorphins, and that this corresponds with a reduction in how much pain a person feels.
"Receptors for both endorphins and dopamine are clustered heavily in the nucleus accumbens. So, taken together, our studies delve directly into the mechanisms that underlie the placebo effect," says senior author and U-M neuroscientist, psychiatrist and brain-imaging specialist Jon-Kar Zubieta, M.D., Ph.D. "This is a phenomenon that has great importance for how new therapies are studied, because many patients respond just as well to placebo as they do to an active treatment. Our results also suggest that placebo response may be part of a larger brain-resiliency mechanism."
For the current study, Zubieta and his colleagues -- led by neuroscience graduate student David J. Scott -- combined information from two types of brain scan to come to their conclusions. They performed PET (positron emission tomography) scans on the brains of 14 healthy volunteers, and fMRI (functional magnetic resonance imaging) scans on those 14, and on 16 other healthy volunteers.
The PET scans focused on brain dopamine, looking at its activity as volunteers were told to expect, and then received, a painful injection of saline solution in their jaw muscle. They were then told to expect, and then received, an injection that they were told could either be a painkiller or a placebo. (Both were in fact placebos.) The fMRI scans looked at volunteers' brains while they played a game. Before each round, they learned that a correct answer would win or lose an amount of money, up to $5.
The PET scans were made using 11C-raclopride, which combines a drug that binds preferentially to dopamine receptors with a short-lived radioactive form of carbon that can be "seen" on PET scans. Throughout the PET scanning session, volunteers were asked to rate their level of pain on a numerical scale, and to describe any emotions they were experiencing.
Before the painful injection began, but after the volunteers had been told it was coming, they were also asked to guess how much pain relief they'd get from the "painkiller" if they received it. Half the volunteers were women, all in the same stage of their monthly cycle to avoid differences in hormonal state that might affect tolerance of pain -- another topic that Zubieta's team has studied.
The PET scans and pain ratings revealed that as a group, the volunteers experienced significant pain relief from the placebo. But when researchers looked at each individual's results, they found that only half of the volunteers reported less pain when they received the "painkiller" placebo.
These placebo responders, as they were dubbed, had significantly more dopamine activity in their left nucleus accumbens than the other volunteers, beginning when they were told the painkiller medicine was about to begin flowing into their jaws. It also turned out that these individuals had all anticipated the "painkiller" would give good pain relief before they even received it.
Meanwhile, of the seven individuals who didn't experience the placebo effect, four actually reported feeling more pain when the "painkiller" was delivered -- a phenomenon that has been dubbed the "nocebo" effect and has been observed in other situations.
Just to make sure that the volunteers' pain ratings weren't affected by the fact that they always received painful injections followed by placebo "painkiller", the researchers put a separate group of 18 male volunteers through the same experience twice, but no placebo was actually given, and actual PET scans were not done. Their pain and emotion ratings were significantly different from those of volunteers who received placebo.
"The results of these functional molecular imaging studies indicate that dopamine activity is activated in response to a placebo in a manner that's proportional to the amount of benefit that the individual anticipates," says Zubieta, who is the Phil F. Jenkins Professor of Depression in the U-M Medical School's Department of Psychiatry and a member of U-M's Molecular and Behavioral Neuroscience Institute, Depression Center and Department of Radiology.
The fMRI scans, which were performed on different days from the PET scans, revealed additional information about how individual expectations correlated with their placebo response. Each volunteer had an fMRI scan that looked at blood oxygenation throughout their brain, which allows researchers to spot areas where neurons (brain cells) are especially active as the individual performs a task or plays a game. In this case, the task was a very simple gambling game, in which subjects were scanned while expecting varying levels of a monetary reward or no reward.
As in the PET scans, the nucleus accumbens was a hotbed of activity as the volunteers were told how much money they could win or lose in the next round; as they waited for the round; and as they pressed the button and learned if they had succeeded in winning or avoiding losing money.
Then, the researchers compared the PET and fMRI scans for the volunteers who had had both types of scan. They also compared the ratings of anticipated placebo effect, the analgesia induced by the placebo during the pain studies, and the emotional changes associated with it. They found that those who expected a placebo to help them and got greater benefit from it (more analgesia, better emotional state) were also those who had the most activity in their nucleus accumbens during the anticipation of receiving a reward in the fMRI money game.
In addition to Zubieta and Scott, the study's authors are Christian Stohler, DMD, formerly of the U-M School of Dentistry and now dean of the University of Maryland School of Dentistry; Christine Egnatuk and Heng Wang of the U-M MBNI; and Robert Koeppe, Ph.D., director of the PET Physics Section in the Nuclear Medicine division of the U-M Department of Radiology. The study was funded by the National Institutes of Health.
Source: Weill Cornell Medical College Date: July 20, 2007
Pediatric Ritalin Use May Affect Developing Brain, New Study Suggests
Science Daily — Use of the attention deficit/hyperactivity disorder (ADHD) drug Ritalin by young children may cause long-term changes in the developing brain, suggests a new study of very young rats by a research team at Weill Cornell Medical College in New York City.
The study is among the first to probe the effects of Ritalin (methylphenidate) on the neurochemistry of the developing brain. Between 2 and 18 percent of American children are thought to be affected by ADHD, and Ritalin, a stimulant similar to amphetamine and cocaine, remains one of the most prescribed drugs for the behavioral disorder.
"The changes we saw in the brains of treated rats occurred in areas strongly linked to higher executive functioning, addiction and appetite, social relationships and stress. These alterations gradually disappeared over time once the rats no longer received the drug," notes the study's senior author Dr. Teresa Milner, professor of neuroscience at Weill Cornell Medical College.
The findings, specially highlighted in the Journal of Neuroscience, suggest that doctors must be very careful in their diagnosis of ADHD before prescribing Ritalin. That's because the brain changes noted in the study might be helpful in battling the disorder but harmful if given to youngsters with healthy brain chemistry, Dr. Milner says.
In the study, week-old male rat pups were given injections of Ritalin twice a day during their more physically active nighttime phase. The rats continued receiving the injections up until they were 35 days old.
"Relative to human lifespan, this would correspond to very early stages of brain development," explains Jason Gray, a graduate student in the Program of Neuroscience and lead author of the study. "That's earlier than the age at which most children now receive Ritalin, although there are clinical studies underway that are testing the drug in 2- and 3-year olds."
The relative doses used were at the very high end of what a human child might be prescribed, Dr. Milner notes. Also, the rats were injected with the drug, rather than fed Ritalin orally, because this method allowed the dose to be metabolized in a way that more closely mimicked its metabolism in humans.
The researchers first looked at behavioral changes in the treated rats. They discovered that — just as happens in humans — Ritalin use was linked to a decline in weight. "That correlates with the weight loss sometimes seen in patients," Dr. Milner notes.
And in the "elevated-plus maze" and "open field" tests, rats examined in adulthood three months after discontinuing the drug displayed fewer signs of anxiety compared to untreated rodents. "That was a bit of a surprise because we thought a stimulant might cause the rats to behave in a more anxious manner," Dr. Milner says.
The researchers also used high-tech methods to track changes in both the chemical neuroanatomy and structure of the treated rats' brains at postnatal day 35, which is roughly equivalent to the adolescent period.
"These brain tissue findings revealed Ritalin-associated changes in four main areas," Dr. Milner says. "First, we noticed alterations in brain chemicals such as catecholamines and norepinephrine in the rats' prefrontal cortex — a part of the mammalian brain responsible for higher executive thinking and decision-making. There were also significant changes in catecholamine function in the hippocampus, a center for memory and learning."
Treatment-linked alterations were also noted in the striatum — a brain region known to be key to motor function — and in the hypothalamus, a center for appetite, arousal and addictive behaviors.
Dr. Milner stressed that, at this point in their research, it's just too early to say whether the changes noted in the Ritalin-exposed brain would be of either benefit or harm to humans.
"One thing to remember is that these young animals had normal, healthy brains," she says. "In ADHD-affected brains — where the neurochemistry is already somewhat awry or the brain might be developing too fast — these changes might help 'reset' that balance in a healthy way. On the other hand, in brains without ADHD, Ritalin might have a more negative effect. We just don't know yet."
One thing was clear: three months after the rats stopped receiving Ritalin, the animals' neurochemistry had largely resolved back to the pre-treatment state.
"That's encouraging, and supports the notion that this drug therapy may be best used over a relatively short period of time, to be replaced or supplemented with behavioral therapy," Dr. Milner says. "We're concerned about longer-term use. It's unclear from this study whether Ritalin might leave more lasting changes, especially if treatment were to continue for years. In that case, it is possible that chronic use of the drug would alter brain chemistry and behavior well into adulthood."
This work was funded by the U.S. National Institutes of Health.
Source: Massachusetts Institute of Technology Date: July 22, 2007
Mechanism Behind Fear Discovered
Researchers from MIT's Picower Institute for Learning and Memory have uncovered a molecular mechanism that governs the formation of fears stemming from traumatic events. The work could lead to the first drug to treat the millions of adults who suffer each year from persistent, debilitating fears - including hundreds of soldiers returning from conflict in Iraq and Afghanistan.
A study conducted by the Army in 2004 found that one in eight soldiers returning from Iraq reported symptoms of post-traumatic stress disorder (PTSD). According to the National Center for PTSD in the United States, around eight percent of the population will have PTSD symptoms at some point in their lives. Some 5.2 million adults have PTSD during a given year, the center reports.
Li-Huei Tsai, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences, and colleagues show that inhibiting a kinase (kinases are enzymes that change proteins) called Cdk5 facilitates the extinction of fear learned in a particular context. Conversely, the learned fear persisted when the kinase's activity was increased in the hippocampus, the brain's center for storing memories.
Cdk5, paired with the protein p35, helps new brain cells, or neurons, form and migrate to their correct positions during early brain development. In the current work, the MIT researchers looked at how Cdk5 affects the ability to form and eliminate fear-related memories.
"Remarkably, inhibiting Cdk5 facilitated extinction of learned fear in mice. This data points to a promising therapeutic avenue to treat emotional disorders and raises hope for patients suffering from post-traumatic stress disorder or phobia," Tsai said.
Emotional disorders such as post-traumatic stress and panic attacks stem from the inability of the brain to stop experiencing the fear associated with a specific incident or series of incidents. For some people, upsetting memories of traumatic events do not go away on their own, or may even get worse over time, severely affecting their lives.
Treating these disorders involves methods geared toward making the behavior go away, or become extinct, but the molecular mechanisms underlying the extinction process are not well understood. However, Tsai said, studies have shown that some of the molecular machinery that initially encodes the troubling memories also regulates their extinction.
In the current work, genetically engineered mice received mild foot shocks in a certain environment and were re-exposed to the same environment without the foot shock. Mice with increased levels of Cdk5 activity had more trouble letting go--or extinguishing--the memory of the foot shock and continued to freeze in fear. Conversely, in mice whose Cdk5 activity was inhibited, the bad memory of the shocks disappeared when the mice learned that they no longer needed to fear the environment where the foot shocks had once occurred.
"In our study, we employ mice to show that extinction of learned fear depends on counteracting components of a molecular pathway involving the protein kinase Cdk5," said Tsai, a Howard Hughes Medical Institute investigator. "We found that Cdk5 activity prevents extinction, at least in part by negatively affecting the activity of another key kinase.
The team will report their results in the July 15 advance online publication of Nature Neuroscience.
In addition to Tsai, authors include MIT affiliate Farahnaz Sananbenesi; Picower Institute research affiliates Andre Fischer and Xinyu Wang; Christina Schrick and Jelena Radulovic of Northwestern University; and Rachel Neve of McLean Hospital in Belmont, Mass.
This work was supported by the National Institutes of Health and the National Institute of Mental Health.
Source: Carnegie Mellon University Date: July 27, 2007
Humanitarians, You're Not As Generous As You Think
A new study out of Carnegie Mellon University reveals that people who regard themselves as humanitarians are even more likely than others to base donations to the poor on whether they believe poverty is a result of bad luck or bad choices.
The study by Christina Fong, a research scientist in the Department of Social and Decision Sciences at Carnegie Mellon, supports previous findings that people are more likely to give money to the poor when they believe that poverty is a result of misfortune rather than laziness. What's surprising is that this effect is largest among people who claim to have more humanitarian or egalitarian beliefs. In fact, humanitarians give no more than others when recipients are deemed to be poor because of laziness.
Fong's results, published in the July issue of the Economic Journal, are significant because altruistic behavior is not well explained by traditional economics, which assumes that self-interest is the prime motivator for human behavior.
"These findings, along with prior findings from social survey data and experiments, help economists develop more realistic models of human behavior so that they can better explain how societies deal with poverty and inequality. They imply that people may be more likely to support policies and charities that help insure people against bad luck rather than their own choices," Fong said.
"For instance, transfer payments to people with prior work histories tend to be relatively popular as do expenditures on health and education for poor children, who are too young to be held personally accountable for their poverty," she said.
Fong conducted an experiment in which subjects were given $10 and asked to decide how much, if any, to give to a real-life welfare recipient. A few days prior to the experiment, participants completed surveys about their values and beliefs, including beliefs about whether lack of effort or bad luck cause poverty. The survey also included questions designed to measure whether participants considered themselves to be humanitarians.
During the experiment, donors were randomly matched with three different welfare recipients with varying work histories and desires for full-time work. This information, combined with the participants' individual beliefs about the causes of poverty, had a major impact on giving. People who believed that their recipient was poor because of bad luck gave six and a half times as much as people who believed that their recipient was poor because of laziness.
Those who scored high on the humanitarian measure gave more money to recipients judged to be victims of bad luck than those who scored low - but the two groups made the same offers to welfare recipients judged to be lazy. Fong terms this desire to help people on the condition that they appear to deserve it "empathetic responsiveness."
"This concept blends two well-known concepts of empathy. The first is the idea that empathy is an emotion that can evoke altruistic behavior. The second is the idea that empathy is the ability to attend and respond to another being," Fong said. "Empathy in this sense may result not only in positive responses to another person, such as sympathy followed by helping behavior, but also in negative responses such as anger followed by revenge."
Iron is the workhorse of trace minerals. An essential component of red blood cells, iron is so critical that disruption of its levels in the body results in a myriad of serious conditions, and life cannot be sustained without it.
In novel research, investigators at the University at Buffalo's School of Public Health and Health Professions have learned that iron is only one half of an all-important duo of trace minerals -- the other being copper -- that work in tandem to maintain proper iron balance, or homeostasis.
It appears the workhorse has a helper.
James F. Collins, Ph.D., UB assistant professor of exercise and nutrition sciences and biochemistry, discovered that when iron-absorption by cells lining the small intestine decreases during iron-deficient states, copper absorption increases.
Collins now is exploring the relationship between these two trace minerals through a $1.38 million grant from the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK).
The work will be carried out using established models of human intestinal iron absorption, including iron- and iron/copper-deficient rodents and cultured intestinal epithelial cells.
"This project is intended to test the overall hypothesis that increased copper transport during iron-deficiency is critical to enhance certain aspects of intestinal iron absorption," said Collins.
"Iron or copper deficiency causes anemia, and abnormal intestinal iron transport is associated with several common human pathologies, including anemia of chronic disease (ACD) and hereditary hemochromatosis (HH), different forms of which result from several common genetic defects."
HH is an inherited metabolic disorder characterized by abnormally high absorption of dietary iron, which is deposited in body tissues and organs, where it may become toxic. ACD is a blood disorder caused by low body iron levels resulting from any medical condition that affects the production and lifespan of red blood cells, such as chronic infection, chronic immune activation resulting in inflammation, or malignancy.
"In collaboration with Dr. Zihua Hu, Ph.D., a computational scientist at UB's New York State Center of Excellence in Bioinformatics and Life Sciences, we determined that several genes related to iron and copper homeostasis were strongly induced by iron deprivation across different developmental stages in the rat small intestine," said Collins. "We will concentrate on understanding the role of two key proteins encoded by these genes: an intestinal iron transporter called divalent metal transporter 1 (Dmt1) and an intestinal copper transporter, the Menkes copper ATPase (Atp7a)."
The overall goal of the project is to answer three specific questions regarding the role of copper in intestinal iron transport, Collins noted:
1) Are Atp7a and Dmt1 solely responsible for enhancing dietary copper absorption during iron-deficiency?
2) What are the molecular mechanisms leading to induction of the Atp7a and Dmt1 genes? and
3) Which physiological processes related to intestinal iron ion homeostasis are enhanced by increased copper levels in enterocytes (cells of the superficial layer of the intestines) and in the liver?
"We also expect to learn more about the mechanisms of dietary copper absorption, which currently are not well defined," Collins said. "Furthermore, studies addressing the impact of increased enterocyte and liver copper levels during iron-deficiency have not been reported in the scientific literature to date, so this investigation is novel. "
Key collaborators at UB are Hu, Michael D. Garrick, Ph.D., professor of biochemistry, and Laura M. Garrick, Ph.D., research associate professor of biochemistry.
Why Do People Love Horror Movies? They Enjoy Being Scared
A bedrock assumption in theories that explain and predict human behavior is people's motivation to pursue pleasure and avoid pain. How can this be reconciled with the decision to engage in experiences known to elicit negative feelings, such as horror movies? It certainly seems counterintuitive that so many people would voluntarily immerse themselves in almost two hours of fear, disgust and terror. "Why do people pay for this?" "How is this enjoyable?"
Investigators generally use one of two theories to explain why people like horror movies. The first is that the person is not actually afraid, but excited by the movie. The second explanation is that they are willing to endure the terror in order to enjoy a euphoric sense of relief at the end. But, a new study by Eduardo Andrade (University of California, Berkeley) and Joel B. Cohen (University of Florida) appearing in the August issue of the Journal of Consumer Research argues that neither of these theories is correct.
"We believe that a reevaluation of the two dominant explanations for people's willingness to consume "negative" experiences (both of which assume that people can not experience negative and positive emotions simultaneously) is in order," explain Andrade and Cohen in their study.
They continue: "The assumption of people's inability to experience positive and negative affect at the same time is incorrect."
In other words, the authors argue that horror movie viewers are happy to be unhappy. This novel approach to emotion reveals that people experience both negative and positive emotions simultaneously -- people may actually enjoy being scared, not just relief when the threat is removed. As the authors put it, "the most pleasant moments of a particular event may also be the most fearful."
Andrade and Cohen developed and utilized a new methodology to track negative and positive feelings at the same time. Their method could apply to other experiences that seem to elicit terror, risk, or disgust, such as extreme sports.
"When individuals who typically choose to avoid the stimuli were embedded in a protective frame of mind, such that there was sufficient psychological disengagement or detachment, they experienced positive feelings while still experiencing fearfulness," the authors explain.
Reference: Eduardo B. Andrade and Joel B. Cohen, "On the Consumption of Negative Feelings." Journal of Consumer Research: August 2007.
Women’s brains are different from men’s. That’s not news. What is news is that the differences are smaller than most people believe. They are not big enough to say that one sex is smarter or better at math than the other.
What is also news is that the small differences can be significant when it comes to memory, arousal, reasoning, and risk of some diseases. The latter include depression, anxiety, schizophrenia, drug abuse, Alzheimer’s, diabetes, and heart disease.
“Brain differences, though small, help us to understand the nature of sex differences in disease, and thus will hopefully aid in devising sex-specific treatments and prevention strategies,” notes Jill Goldstein, a professor of psychiatry and medicine at Harvard Medical School (HMS).
Here are some examples. Medical experts advise men to take a baby aspirin a day to help protect them against heart attacks. But Julie Buring, a Harvard professor of medicine who works at the Brigham and Women’s Hospital in Boston, found that aspirin does not work the same in women.
More women suffer from Alzheimer's disease than men, and it's not just because they live longer. Women also have a much greater risk than men of contracting type 1 diabetes, rheumatoid arthritis, lupus, and other diseases rooted in metabolic and immune-system defects. Premenopausal women recover from stroke sooner and with less disability than men of the same age or postmenopausal women.
Why? Goldstein and her many colleagues at the Connors Center for Women’s Health and Gender Biology at Brigham and Women’s are working on answers. They have found a number of brain regions that are significantly different in size in men and women. This has led many people to think that such variations result in sex differences in function and behavior, such as memory and emotionality.
For example, part of the hippocampus at the center of the brain and other areas at the front of the brain contribute to short-term memory, and are larger in women. Does this mean they have better working memories? Other areas in the brain, believed to be seats of mating and arousal, grow larger in males, leading to conclusions that men are more aggressive. “This is not always the case,” Goldstein points out. “Size alone does not drive function.”
In tests of short-term memory, Goldstein and her colleagues showed that although brain activity differs between sexes, their short-term memory performance is the same. “Such results show that male and female brains can take different actions to arrive at the same behavioral response,” she notes. However, these variations in activity may account for a small advantage that women have over men in verbal fluency and speed of perception.
Research to date leads to the conclusion that more variability exists in the size of brain regions and behavior within each sex than between the sexes. “The great value in exploring this variability is to understand the role that these differences play in certain diseases,” Goldstein explains.
Another conclusion is that specific behaviors are not driven solely by one region of the brain, notwithstanding those neat diagrams of how the brain works in popular magazines and Sunday supplements. “No single brain region controls a particular behavior,” is the way Goldstein puts it. “The hippocampus is important in memory but other brain regions are involved. Most functions are regulated by a network of brain regions.”
Both hormones and genes drive these subtle but significant sex differences in human brains from the womb to puberty and beyond. Goldstein’s team studies normal sex differences in order to better understand what goes wrong in mental disorders. The tools they use include scanning techniques that produce images of activity in various brain regions as men and women engage in various tasks.
In a series of experiments, Goldstein’s team investigated sex differences in response to stress. While in a scanner, women watched a series of pictures showing both neutral scenes, like cows grazing in a pasture, and those that stimulate high arousal, like horrible car crashes. The women were tested at the beginning of their menstrual cycles and again at ovulation, after eggs leave the ovaries.
Stress regions showed greater activity during the beginning of the cycle than during ovulation, but, surprisingly, the women felt no change in mood. When the same brain networks were examined in men, their brains looked similar to those in women at the cycle start. Goldstein and her team are now mapping out which hormones, like estrogen and progesterone, are associated with the differences.
“This is only one of many studies to identify the role of hormones and genes in regulating sex differences in response to stress,” Goldstein notes. “We hope that these findings help us understand the higher rates of depression and anxiety in women than in men.” Such findings could also shed light on the diminished sensitivity to trauma in men with depression and anxiety disorders. This new understanding of normal reactions to stress is the first step in putting together sex-specific treatments for women and men who must deal with these mental disorders.
As an aside, the natural lessening of anxiety during ovulation may increase a woman’s availability, receptivity, and desire to mate during this crucial time. “That idea goes beyond what our data shows us,” admits Goldstein, “but it’s reasonable to think it may be adaptive for survival of the species.”
What’s more, the brain’s stress response network has been implicated in other mental disorders, such as psychosis in which a person loses contact with reality, and in medical problems that include heart disease and diabetes. Stress and anxiety disorders are co-companions with a host of physical problems that show critical sex differences. Studies are under way by Goldstein and others to understand the connections and, perhaps, find effective ways to treat them.
Source: JAMA and Archives Journals Date: August 1, 2007
Group Psychotherapy Effective For Treating Depression Of Teen Girls Affected By War In Africa
Group psychotherapy was effective in reducing depression among displaced adolescent girls who are survivors of war in northern Uganda, though the intervention was not effective for adolescent boys, according to a study in the August 1 issue of JAMA, a theme issue on violence and human rights.
"Over 1.8 million individuals, mainly ethnic Acholi, have been internally displaced during 20 years of conflict between the government of Uganda and the Lord's Resistance Army. The Lord's Resistance Army has been accused of human rights abuses including mass violence, rape, and the abduction of more than 25,000 children.
Local populations have crowded into internally displaced persons camps where they face threats to their health and well-being," the authors write. Prior research on children affected by armed conflicts documents increased risk of mental health problems, yet few interventions have been evaluated rigorously in randomized trials, and those that have been evaluated have generated mixed results.
Paul Bolton, M.B.B.S., of the Johns Hopkins Bloomberg School of Public Health, Baltimore, and colleagues investigated whether a therapy-based intervention (interpersonal psychotherapy for groups [IPT-G]) and an activity-based intervention (creative play [CP]) were effective for relieving mental health and psychosocial problems resulting from war and displacement among 314 adolescents (aged 14-17 years) living in two camps in northern Uganda. The randomized controlled trial was conducted from May 2005 through December 2005.
Both interventions comprised 16 weekly group meetings, lasting 1.5 to 2 hours each. Locally developed screening tools assessed the effectiveness of the interventions in reducing symptoms of depression and anxiety and in improving conduct problems and functioning. Participants who met study criteria were randomly allocated to one of three study groups: 105 to interpersonal psychotherapy for groups; 105 to creative play; and 104 to a wait-control group (individuals wait-listed to receive treatment at the study's end).
The researchers found that interpersonal psychotherapy was superior to the wait-list control condition in reducing depression symptoms, but statistically significant improvement was limited to the girl participants in the study. Creative play was not superior to the wait-list control condition. Neither interpersonal psychotherapy for groups nor creative play was effective in improving anxiety, conduct problems or functioning among boys or girls.
The failure of both interpersonal psychotherapy for groups and creative play "to significantly assist boys in this study raises the question of whether other interventions may be needed to assist war-affected boys with depression symptoms. Since both group psychotherapy and activity-based interventions were not effective, some form of individual psychotherapy or an entirely different type of intervention may be indicated as the basis for a future trial," the researchers write.
This study suggests that group-based psychological interventions can help adolescents who have been traumatized by war and displacement and who live in poor, rural, and illiterate communities.
Reference: JAMA. 2007;298(5):519-527
Editorial: Children of War and Opportunities for Peace
In an accompanying editorial, Robert J. Ursano, M.D., of Uniformed Services University, Bethesda, Md., and Jon A. Shaw, M.D., of the University of Miami, comment on the issue of children and war.
"No one endures war-related traumatic events unchanged. Little is known about the changes in values and hopes and views of the future that exposure to such trauma engenders. Children who are still learning to regulate mood and aggression are certainly even more vulnerable to these life-changing forces. The researchers reporting the results of their studies in this issue bring much-needed attention to the violence of war and the resulting mental health problems. Deeper appreciation of the effects of exposure to war-related trauma as well as improved understanding of individuals' attitudes toward reconciliation and the means to achieve peace may contribute to development of interventions to address the barriers to recovery not only from disease and illness but from lost futures and visions of life."
Reference for editorial: JAMA. 2007;298(5):567-568.
Heat-related deaths, unfortunately, are common enough. But the suicide rate begins to rise in the UK when the thermometer hits 18C. In my part of the world, 18C is winter. I suppose the key to this research is that heatwave conditions can cause neurochemical reactions that lead to an increase in the rate of suicide: Suicide rate rises in hot weather
The damp summer may have made us all miserable, but research suggests it is hot weather that poses a far more serious problem for vulnerable people.
A team from London's Institute of Psychiatry found that suicide rates go up during hot weather.
Analysis of more than 50,000 suicides in England and Wales between 1993 and 2003 showed the suicide rate rose when average daily temperatures topped 18C.
The study appears in the British Journal of Psychiatry.
The researchers found that once the daily average temperature rose above 18C each further degree increase was associated with a rise in suicides of almost 4%.
The rise was even steeper for violent suicides, at 5% per degree increase in temperature.
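To put those percentages in perspective, here is a minimal sketch in Python. It assumes, purely for illustration, that the roughly 4% rise compounds multiplicatively per degree above 18C and applies to the study period's average of 13.3 suicides per day (a figure cited later in this piece); the researchers report a statistical association, not this formula.

```python
# Minimal illustrative sketch, NOT the study's model: assumes the reported
# ~4% rise per degree above 18C compounds multiplicatively, applied to the
# 13.3-per-day average reported for the 1993-2003 study period.

BASELINE_PER_DAY = 13.3   # average daily suicides over the study period
THRESHOLD_C = 18.0        # temperature above which the rate begins to rise

def expected_suicides(avg_temp_c, rise_per_degree=0.04):
    """Expected daily count if each degree above the threshold multiplies
    the baseline by (1 + rise_per_degree)."""
    excess_degrees = max(0.0, avg_temp_c - THRESHOLD_C)
    return BASELINE_PER_DAY * (1 + rise_per_degree) ** excess_degrees

for temp in (18, 20, 22, 24):
    print(f"{temp}C: about {expected_suicides(temp):.1f} suicides per day")

# A 22C day comes out roughly 17% above the 18C baseline (1.04**4 = 1.17);
# substituting rise_per_degree=0.05 for violent suicides gives about 22%.
```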
Aggression and irritability
Researcher Dr Lisa Page said there were a number of possible reasons for the link between hot weather and suicide.
She said: "We felt overall that the most likely explanation was probably a psychological one where for some people you have an unusually high degree of irritability, aggression and impulsivity."
She said it was possible that the effect was linked to levels of the mood controlling chemical serotonin in the brain, which have been shown to dip in the summer months.
Alternatively, the suicide rate may be linked to the tendency to consume higher levels of alcohol in hot weather.
However, she said the finding was unlikely to be down to people being made miserable by seeing others enjoying the good weather, as the effect was specific to unusually hot days, rather than summer days in general.
The researchers found that the suicide rate rose by 46.9% during the 1995 heat wave.
A similar run of hot weather in 2003 appeared to have little impact, possibly because the high temperatures came in two distinct spells, giving people a chance to adapt.
During the 11-year period covered by the research, the average temperature in England topped 18C on 222 days.
There were 53,623 suicides - an average of 13.3 per day.
Three-quarters of all suicides were by men and this proportion remained constant over the study period.
The highest daily suicide count was recorded for 1 January.
The largest number of suicides took place on Mondays, with numbers declining as the week wore on.