February 22–28 is National Eating Disorder Awareness Week. Learn more at http://nedawareness.org
Eating disorders are a major public health issue in the US. They carry the highest mortality rate of any mental illness, yet their causes are hard to pin down.
Eating disorders are complex biopsychosocial illnesses that involve a continuous interplay of nature and nurture. While neuroscientists are interested in uncovering the neurobiological predispositions that lead people to develop eating disorders, psychologists are more interested in the sufferer's personality traits, past experiences, traumas, co-morbid conditions, and interpersonal relationships.
Starvation as a Religious Act
Self-inflicted starvation is not new; people have been starving themselves for centuries. However, it wasn't until the late 1800s that the term anorexia nervosa entered the medical literature, and not until around the 1970s that the illness became part of our social consciousness.
Based on this, it is reasonable to conclude that the eating disorders of the Biblical era and beyond were probably motivated by factors other than an obsession with thinness. The Old Testament mentions fasting as a common practice for cleansing oneself of past sins, a form of self-castigation.
People also fasted to purify themselves, to prove their devotion to God, and/or to enter a trance-like state that enabled them to receive prophetic visions from God. Moses, for example, fasted for 40 days before ascending Mount Sinai, where he received the Ten Commandments.
During the first to fourth centuries, men belonging to Gnostic cults were known to fast in an attempt to deny themselves the physical and material comforts of life. The first known death from self-inflicted starvation occurred in late fourth-century Rome, when a 20-year-old woman died after fasting under the direction of her priest, St. Jerome.
People also fasted for non-religious reasons: Greco-Roman physicians recommended fasting and purging, using herbs that acted as laxatives and diuretics, as a remedy for various physical ailments (Stryer SB. Anorexia. Santa Barbara, Calif: Greenwood Press; 2009).
Interestingly, the incidence of self-starvation decreased during the early Middle Ages, when society was in economic decay and famine was widespread. But by the 10th and 11th centuries, people began to prosper again, and accounts of self-starvation resurfaced.
This time, it was linked directly to the Catholic Church, which promoted fasting as a means of attaining spiritual perfection and holiness. Young women who were able to survive on so little food were labeled saints; historians refer to them as “Holy Anorexics.”
The Protestant Reformation of the 1500s challenged the idea of ascetic fasting, asserting that one need not starve oneself to show devotion to God. In fact, some who continued to fast were charged with witchcraft.
By the 17th century, the bewitchment theory had died down, and those who continued to starve themselves were referred to as “Miraculous Maidens.” Unlike the saints, they apparently starved themselves for non-religious reasons that remain unknown.
Medical Understanding of Self-Starvation During the Industrial Revolution
Dr. Richard Morton (1637–1698) was the first physician to provide a medical description of what we classify today as anorexia nervosa, in his Treatise of Consumptions (1694). He spoke of anorexia as “a wasting disease of the muscular parts of the body” due to one of two types of “consumption”: original (caused by a medical illness) or symptomatical (a nervous condition with no known cause). The cases he discussed in his Treatise describe the same symptoms we see in modern-day anorexics: loss of menses, pale skin, low body temperature, fainting spells, etc. (Morton R. Treatise of consumptions: wherein the difference, nature, causes, signs, and cure of all sorts of consumptions are explained. London; 1694).
Dr. Robert Whytt (1714–1766) was the first to record the biological changes that occur with severe fasting, including a body temperature below 95°F (hypothermia) and a resting heart rate under 60 beats per minute (bradycardia). He called anorexia “nervous atrophy,” which he described as an “unnatural or morbid state of the nerves, of the stomach, and intestines.” He also noted psychological changes that occur during severe fasting, such as melancholy (Silverman JA, Int J Eat Disord 1987;6(1):143–146).
Physicians began noting these psychological motivations for self-starvation and attributed them to mental illnesses commonly diagnosed at the time, such as hysteria, nervous atrophy, and melancholy.
In the 1870s, Dr. William Gull and Dr. Charles Lasègue were the first to document patients who displayed what matches our modern definition of anorexia nervosa: preoccupation with one’s body size, fear of gaining weight, self-starvation, and significant weight loss not explained by any medical condition. Prior to this, there is no documented account of people starving themselves in an effort to lose weight. Historians are unsure whether this is because the behavior did not exist or because its signs were attributed to some other illness, such as hysteria or nervousness (Stryer 2009).
Anorexia in Post-Industrial Society
During the Industrial Revolution, society changed drastically: economically, politically, and socially. A prosperous middle class emerged, and family dynamics shifted; men began working in factories to support the family financially, while women stayed at home caring for the children, cooking, and doing household chores. Families had more disposable income, which fueled women’s interest in fashionable clothes and accessories, and with it, a growing concern with outer beauty.
The waves of the feminist movement brought liberal change: women’s right to vote, access to advanced education, and increased participation in the workforce. Fashion shifted along with society.
During the Victorian era, women wore multiple layers of clothing (petticoats, corsets, long skirts, frills, and lace) designed to emphasize the hourglass figure by accentuating the waist and hips. By the 1920s, the flapper-style dress was introduced, which looked best on small-busted, narrow-waisted, thin women. To fit into the dress, some women had to wear a flattening brassiere underneath.
As fashion changed over the next century, women’s clothing became more revealing and necessitated a slim figure (Rollin L. Twentieth-century teen culture by the decades: a reference guide. Westport, Conn.: Greenwood Press;1999).
In 1965, America was introduced to the stick-thin model Lesley Lawson, better known as Twiggy. Since then, the modeling industry has recruited mostly thin models, and beauty has become equated with thinness.
This, together with a newfound understanding of the science of food, led to a society obsessed with dieting. Mass media also played a huge role in spreading the new stick-thin ideal, and as a result, the weight-loss industry became a multibillion-dollar enterprise.