Years ago, I was captivated by an adorable baby on the front cover of a book, The Scientist in the Crib: What Early Learning Tells Us About the Mind, written by a trio of research scientists: Alison Gopnik, PhD, Andrew Meltzoff, PhD, and Patricia Kuhl, PhD.
At the time, I was simply interested in how babies learn about their worlds, how they conduct experiments, and how this learning could impact early brain development. I did not realize the extent to which interactions with family, caretakers, society, and culture could shape the direction of a young child’s future.
Now, as a speech-language pathologist in Early Intervention in Massachusetts, more cognizant of the myriad factors that shape a child’s cognitive, social-emotional, language, and literacy development, I have been absolutely delighted to discover more of the work of Dr. Kuhl, a distinguished speech-language pathologist at the University of Washington. So, last spring, when I read that Dr. Kuhl was going to present “Babies’ Language Skills” as one part of a two-part seminar series sponsored by the Mind, Brain, and Behavior Annual Distinguished Lecture Series at Harvard University [1], I was thrilled to have the opportunity to attend. Below are some highlights from that experience and the questions it has since sparked for me.
Depictions of impossibly slim bodies in the media are often seen as a cause of modern eating disorders. While attending Elizabeth Lawson’s Conte Center lecture on the role of hormones in anorexia earlier this month, I began to wonder: what happened before mass media? Have eating disorders always existed? And if so, did they appear in the past in the same forms as today?
Ancient Times: Meal Purging
Ancient Egyptian hieroglyphs hold the first known record of abnormal eating behavior: purging after meals, then considered a health ritual [4]. Purging was also common in Ancient Rome, where the rich would sometimes vomit during banquets in order to make room to continue a lavish feast. Even in the Middle Ages, the wealthy classes purged so they could eat more, seeing excessive consumption as a mark of prestige. These records reveal a surprising prevalence of behavior that today would be seen as pathological. Clearly, norms of eating behavior can shift radically under cultural influence.
At the same time, records from early-dynasty China and Persia describe bingeing and purging behaviors [4]. African tribal lore also tells of adults fasting during times of famine in order to feed their children. Some of these individuals continued fasting after the famine had passed, even to the point of death by starvation, suggesting that prolonged fasting lastingly transformed their relationship to food.
The Middle Ages: Fasting Saints
The Middle Ages saw the rise and fall of a very particular form of anorexia, sometimes called “anorexia mirabilis,” tied to the contemporary cultural ideal of spiritual asceticism. Throughout the 13th century, women engaged in extreme fasting as part of religious practice, some even dying of starvation. St. Catherine of Siena, one famous anorexic figure of the time, refused all food but the Eucharist, cold water, and bitter herbs that she would chew and spit back out [1]. When forced to consume food, she experienced pain and swelling in her stomach.
In a 1373 letter, Catherine attributed her extreme fasting to “God who by a most singular mercy allowed me to correct the vice of gluttony” [1]. Although the saint’s symptoms resemble those of modern anorexia nervosa (loss of appetite, inability or unwillingness to eat, and cessation of menstruation), they were couched in a context of extreme religious practice, alongside self-flagellation, scalding, and sleeping on a thorny bed [1].
Last month, on Blue Sky Girls Day, about thirty youth affected by Rett syndrome gathered with their families, friends, and community members for a symbolic climb up the stairs of Gordon Hall, a stately building at the center of the Harvard Medical School quadrangle in Boston. It was an inspiring scene, one that demonstrated the strength that comes from working together in the face of adversity, and that made crystal clear why biomedical research matters.
Rett syndrome is a rare neurogenetic disorder characterized by regression during development, often in the ability to talk, walk, and make purposeful hand movements, plus a host of other serious medical problems including seizures, breathing difficulties, scoliosis, and gastrointestinal issues. Rett usually affects girls and is caused by a sporadic mutation in MECP2, a gene on the X chromosome.
Many adult psychiatric illnesses originate in childhood or adolescence. Researchers have known this for some time, and over the years they have conducted several regional, national, and cross-national surveys on youth mental health. Yet it was only two years ago that data from a large-scale U.S. survey of adolescents, one assessing a broad range of mental disorders with in-depth diagnostic interviews, became available. These 2010 data revealed that more than one in five U.S. youth aged 13 to 18 is likely to have experienced a mental disorder with severe impairment at some point.
This year, two studies published in the Archives of General Psychiatry have expanded upon the analysis of that survey, known as the National Comorbidity Survey Replication Adolescent Supplement (NCS-A). A major finding, adding to the evidence that mental health problems often start in youth, was that within a 12-month time period, 8% of U.S. teens experienced serious emotional disturbances (SEDs), and just over 40% experienced some sort of mental disorder. Anxiety disorders were most common, followed by behavior, mood, and substance disorders.
Mental disorders were defined as disorders appearing in the current Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), developed by the American Psychiatric Association. SEDs were defined as mental disorders producing significant impairment in family, school, or community activities—as described by the federal government.
Overall, the number of mental health diagnoses a teen had was more important than which diagnoses he or she had, in terms of risk of falling into the SED category.
I interviewed Ronald Kessler, PhD, McNeil Family Professor of Health Care Policy at Harvard Medical School, lead author on both recent studies, to find out what the goals of surveys like the NCS-A are, and how the latest statistics might guide the design of interventions. Here’s what I learned:
Last week at the 2012 BIO International Convention, Massachusetts Governor Deval Patrick and seven well-known biotech companies – Abbott, Biogen Idec, EMD Serono, Janssen Research & Development, LLC, Merck, Pfizer, and Sunovion – announced the launch of the Massachusetts Neuroscience Consortium.
As reported in the Boston Globe and on the Governor’s website, this new consortium aims to foster collaborations between industry and academia by funding pre-clinical neuroscience research in the state’s colleges and universities. The total initial funding is $1.75 million. In addition to leaders from the state government and pharmaceutical industry, the announcement at BIO was attended by academic leaders such as the Dean of Harvard Medical School, individuals suffering from illnesses such as multiple sclerosis and Alzheimer’s, and heads of patient advocacy groups such as the Alzheimer’s Association.
Michael Ehlers, Chief Scientific Officer for Pfizer Neuroscience, commented on the importance of this collaboration for those suffering from mental illness as well as other nervous system disorders, saying in a prepared statement, “This collaboration is a step forward in our effort to address the urgent need for therapies in neurologic and psychiatric disease.”
If you’re interested in the links between childhood adversity and brain science, but didn’t have a chance to attend the Spring 2012 Picower Symposium, “New Insights on Early Life Stress & Mental Health,” at MIT last month, you’re in luck.
All 12 scientists’ talks are now online. And it’s the perfect time to watch them, as May 6-12 is National Children’s Mental Health Awareness Week.
Here are the direct links:
Matt Wilson: Opening Remarks
Jane Isaacs-Lowe: Rewiring the Trajectory for Vulnerable Children: A Foundation Perspective
Bruce McEwen: The Brain on Stress: Adaptive Plasticity in Response to the Social Environment
John Eckenrode: Preventing early adversity and improving the life chances of socially disadvantaged children & families
Michael Meaney: Effects of maternal care on gene regulation and behavior
Robert Anda: The Lifelong Impact of Adverse Childhood Experiences on Health and Society. Neurobiology & Epidemiology Converge
Andrew Garner: Translating Developmental Science into Healthy Lives
Kay Tye: Activating dopamine neurons acutely rescues a stress-induced depression phenotype
Moshe Szyf: The DNA methylation landscape of early life adversity
Li-Huei Tsai: The convergence of epigenetics and stress in cognitive impairment and repair
Jack P. Shonkoff: Leveraging the Biology of Adversity to Shape the Future of Early Childhood Policy
Steve Hyman: Closing Remarks
In the heated climate of today’s discussions of attention deficit hyperactivity disorder (ADHD), we sometimes lose sight of the disorder’s history. Although frequently discussed in the context of our fast-paced, high-tech modern lives, the symptoms of ADHD are by no means unique to our time.
An ADHD-like disorder was actually described as early as 1798 by the Scottish physician Sir Alexander Crichton. In a chapter “On Attention and its Diseases,” part of his three-volume work “An Inquiry into the Nature and Origin of Mental Derangement,” Crichton described a disease characterized by difficulty sustaining focus, a predisposition to distraction, restlessness, and possibly some type of impulsivity, all highly reminiscent of the current DSM definition of ADHD (although lacking the hyperactivity component). Crichton even recognized the developmental nature of the disorder and understood that it might be due to neurological dysfunction.