Why is it harder to learn languages when you’re older? Why do star athletes and master musicians start training young? Why are certain medical conditions reversible in kids, but not in adults?
The answer to all of these questions and many more lies in the existence of ‘critical periods’—windows of time in human and animal development when environmental experience molds neural circuits. While experience can influence the brain long before and after critical periods, the sculpting power of its hand is never quite as strong as during these windows of heightened plasticity.
Years of research from neuroscience laboratories worldwide, including the laboratory of Takao Hensch, director of the Conte Center at Harvard, which focuses on the developmental origins of mental illness, have revealed that inhibitory neural activity—in which neurons release chemical signals that dampen or silence the electrical activity of their partners—plays a crucial role in orchestrating critical periods. If inhibitory neurons in the cerebral cortex don’t mature, critical periods don’t open. Conversely, if levels of inhibition are amped up early in development, critical periods can open prematurely.
But why? How exactly does inhibition allow a neuron to “open” a window of plasticity?
Music therapy has been around for a while: it had its unofficial beginnings with Pythagoras, who prescribed music treatments for mental disorders6, and it took root in the United States around 1950, with the return of World War II veterans in need of rehabilitation2. Still, for many, “music therapy” might evoke nothing more than psychiatry sessions with the occasional soothing Mozart sonata, and we have yet to see music therapists become an integral and habitual part of the healthcare system. The relative obscurity of music therapy flies in the face of its rather staggering breadth of applications, as well as some stunning success stories (for example, speech and gait recovery in Parkinson’s patients). Why this gap?
Years ago, I was captivated by an adorable baby on the front cover of a book, The Scientist in the Crib: What Early Learning Tells Us About the Mind, written by a trio of research scientists: Alison Gopnik, PhD, Andrew Meltzoff, PhD, and Patricia Kuhl, PhD.
At the time, I was simply interested in how babies learn about their worlds, how they conduct experiments, and how this learning could impact early brain development. I did not realize the extent to which interactions with family, caretakers, society, and culture could shape the direction of a young child’s future.
Now, as a speech-language pathologist in Early Intervention in Massachusetts, more cognizant of the myriad factors that shape a child’s cognitive, social-emotional, language, and literacy development, I have been absolutely delighted to discover more of the work of Dr. Kuhl, a distinguished speech-and-language pathologist at the University of Washington. So, last spring, when I read that Dr. Kuhl was going to present “Babies’ Language Skills” as one part of a two-part seminar series sponsored by the Mind, Brain, and Behavior Annual Distinguished Lecture Series at Harvard University1, I was thrilled to have the opportunity to attend. Below are some highlights from that experience and the questions it has since sparked for me:
Depictions of impossibly slim bodies in the media are often seen as a cause of modern eating disorders. While attending Elizabeth Lawson’s Conte Center lecture on the role of hormones in anorexia earlier this month, I began to wonder: what happened before mass media? Have eating disorders always existed? And if so, did they appear in the past in the same forms as today?
Ancient Times: Meal Purging
Ancient Egyptian hieroglyphs hold the first known record of abnormal eating behavior: purging after meals, then considered a health ritual4. Purging was also common in Ancient Rome, where the rich would sometimes vomit during banquets in order to make room to continue a lavish feast. Even in the Middle Ages, the wealthy classes purged so they could eat more, seeing excessive consumption as a mark of prestige. These records reveal a surprising prevalence of behavior that today would be seen as pathological. Clearly, norms of eating behavior can shift radically under cultural influence.
At the same time, records from early dynastic China and Persia describe bingeing and purging behaviors4. African tribal lore also tells of adults fasting during times of famine in order to feed their children. Sometimes these individuals continued fasting after the famine had passed, even to the point of death by starvation, suggesting that prolonged fasting lastingly transformed their relationship to food.
The Middle Ages: Fasting Saints
The Middle Ages saw the rise and fall of a very particular form of anorexia sometimes called “anorexia mirabilis,” tied to the contemporary cultural ideal of spiritual asceticism. Throughout the 13th century, women partook in extreme fasting as part of religious practice, some even dying of starvation. St. Catherine of Siena, one famous anorexic figure of the time, refused all food but the Eucharist, cold water, and bitter herbs that she would chew and spit back out1. When forced to consume food, she experienced pain and swelling in her stomach.
In a 1373 letter, Catherine attributed her extreme fasting to “God who by a most singular mercy allowed me to correct the vice of gluttony”1. Although the saint’s symptoms resemble those of modern anorexia nervosa—loss of appetite, inability or unwillingness to eat, and cessation of menstruation—they were couched in a context of extreme religious practice, alongside self-flagellation, scalding, and sleeping on a thorny bed1.
Last month, on Blue Sky Girls Day, about thirty youth affected by Rett syndrome gathered with their families, friends and community members for a symbolic climb up the stairs of Gordon Hall, a stately building at the center of the Harvard Medical School quadrangle in Boston. It was an inspiring scene, demonstrating the strength that comes from working together in the face of adversity—and also one that makes it crystal clear why biomedical research matters.
Rett syndrome is a rare neurogenetic disorder characterized by developmental regression, often in the ability to talk, walk, and make purposeful hand movements—plus a host of other serious medical problems including seizures, breathing difficulties, scoliosis, and gastrointestinal issues. Rett usually affects girls and is caused by a sporadic mutation in a gene on the X chromosome called MeCP2.
Many adult psychiatric illnesses originate in childhood or adolescence. Researchers have known about this for some time, and over the years they have conducted several regional, national, and cross-national surveys on youth mental health. Yet, it was only two years ago that data from a large-scale U.S. survey of adolescents—assessing a broad range of mental disorders with in-depth diagnostic interviews—became available. This 2010 data revealed that more than one in five U.S. youth aged 13 to 18 is likely to have experienced a mental disorder with severe impairment at some point.
This year, two studies published in the Archives of General Psychiatry have expanded upon the analysis of that survey, known as the National Comorbidity Survey Replication Adolescent Supplement (NCS-A). A major finding, adding to the evidence that mental health problems often start in youth, was that within a 12-month time period, 8% of U.S. teens experienced serious emotional disturbances (SEDs), and just over 40% experienced some sort of mental disorder. Anxiety disorders were most common, followed by behavior, mood, and substance disorders.
Mental disorders were defined as disorders appearing in the current Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), developed by the American Psychiatric Association. SEDs were defined as mental disorders producing significant impairment in family, school, or community activities—as described by the federal government.
Overall, the number of mental health diagnoses a teen had was more important than which diagnoses he or she had, in terms of risk of falling into the SED category.
I interviewed Ronald Kessler, PhD, McNeil Family Professor of Health Care Policy at Harvard Medical School, lead author on both recent studies, to find out what the goals of surveys like the NCS-A are, and how the latest statistics might guide the design of interventions. Here’s what I learned:
Last week at the 2012 BIO International Convention, Massachusetts Governor Deval Patrick and seven well-known biotech companies – Abbott, Biogen Idec, EMD Serono, Janssen Research & Development, LLC, Merck, Pfizer, and Sunovion – announced the launch of the Massachusetts Neuroscience Consortium.
As reported in the Boston Globe and on the Governor’s website, this new consortium aims to foster collaborations between industry and academia by funding pre-clinical neuroscience research in the state’s colleges and universities. The total initial funding is $1.75 million. In addition to leaders from the state government and pharmaceutical industry, the announcement at BIO was attended by academic leaders such as the Dean of Harvard Medical School, individuals suffering from illnesses such as multiple sclerosis and Alzheimer’s disease, and heads of patient advocacy groups such as the Alzheimer’s Association.
Michael Ehlers, Chief Scientific Officer for Pfizer Neuroscience, commented on the importance of this collaboration for those suffering from mental illness as well as other nervous system disorders, saying in a prepared statement, “This collaboration is a step forward in our effort to address the urgent need for therapies in neurologic and psychiatric disease.”