This section contains abstracts and PDFs of a selection of Andrew’s publications, with the most recent at the top. For a complete list of publications available for download, click here.
“‘The Penny Lunch Has Spread Faster Than the Measles’: Children’s Health and the Debate over School Lunches in New York City, 1908-1930”
A. R. Ruis
History of Education Quarterly 55, no. 2 (2015).
A few days before Thanksgiving in 1908, the home economist Mabel Hyde Kittredge initiated a school lunch program at an elementary school in Hell’s Kitchen, serving soup and bread to hungry children in the infamous Manhattan neighborhood. The following year, she founded the School Lunch Committee (SLC), a voluntary organization composed of home economists, educators, physicians, and philanthropists dedicated to improving the nutritional health and educational prospects of schoolchildren. By 1915, the SLC was serving 80,000 free or low-price lunches a year to children at nearly a quarter of the elementary schools in Manhattan and the Bronx. Sparse but compelling evidence indicated that the program had reduced malnourishment among the children who partook, and teachers and principals at participating schools reported reductions in behavioral problems, dyspepsia, inattentiveness, and lethargy. With the hope of expanding the service and making it a permanent function of New York City’s public schools, the SLC transferred control to the Board of Education in 1919. Despite the success of the pilot program and the availability of public funding earmarked to maintain and even expand school lunch provision, the Board drastically reduced meal service. What had been a carefully planned and executed school health initiative was mostly replaced by a for-profit concessionaire system with no public health or educational mandate, no nutritional requirements, no food safety inspections, no reduced-price or free meals for poor children, and virtually no oversight of any kind. It is overly simplistic to regard the Board’s abdication of a popular health, education, and social welfare program as a government agency’s callous indifference to the needs of the poor. 
Because school meals were a matter of public policy in numerous domains, including health, education, labor, law, and social welfare, what the SLC regarded as a simple transfer from private charity to public entitlement was in fact a socially and politically charged negotiation of responsibility for children’s nutritional health and the proper role of the public school.
“Pomegranate and the Mediation of Balance in Early Medicine”
A. R. Ruis
Gastronomica: The Journal of Critical Food Studies 15, no. 1 (2015): 22-33.
Different elements of the pomegranate, both tree and fruit, had a wide range of uses in pre-modern therapeutics. Pomegranate also had a rich symbolic role in the art, literature, and religion of numerous cultures. In nearly every part of the globe where the pomegranate grew, it came to represent fundamental dualities: life and death, inside and out, many and one. The medicinal purposes for which healers recommended pomegranate at times reflected broader symbolic associations, and those associations are an important part of the therapeutic tradition. The dualistic symbolism that attended the pomegranate in various cultural traditions synergized with dualistic medical concepts, reinforcing the therapeutic power of pomegranate in otherwise diverse contexts. Reflecting this duality, pomegranate was both an astringent and a laxative, an emmenagogue and an antimenorrhagic, an expectorant and an antiemetic, a pyrogen and a febrifuge, a restorative and a soporific. In both literary and medical traditions, the pomegranate mediated transitions—or maintained balance—between opposing states. This essay provides an overview of the rich and sundry uses of pomegranate in pre-modern therapeutics, revealing how cultural associations both reflected and informed medical practices.
“‘Children with Half-Starved Bodies’ and the Assessment of Malnutrition in the United States, 1890-1950”
A. R. Ruis
Bulletin of the History of Medicine 87, no. 3 (2013): 380-408.
Malnutrition was one of the most significant children’s health issues of the early twentieth century, but it also engendered considerable controversy. Just how many children were truly malnourished, and how could they be reliably identified? Despite the failures of numerous diagnostic methods—even the definition of malnutrition defied consensus—health authorities remained convinced that malnutrition was a serious and widespread problem. Indeed, the imprecision that surrounded the condition allowed it to be used metaphorically to advance a broad range of professional, social, and public health agendas. By the 1940s, due in part to the lack of reliable diagnostic methods, public health nutrition policy shifted abruptly from one of assessment, based on mass surveillance and individualized care, to one of management, based on a universal program of nutrition education, fortification of foods, and food security that treated all children as in need of nutritional assistance.
“Nutrition Classes and Clinics”
A. R. Ruis
The Oxford Encyclopedia of Food and Drink in America, 2nd Edition, ed. Andrew F. Smith (Oxford: Oxford University Press, 2012), 723-25.
The nutrition class, also known as the nutrition clinic, helped undernourished children to achieve and to maintain good health through a combination of routine medical examination and care, supplemental feeding, instruction in foods and nutrition, and social work. Along with other public health nutrition initiatives developed during the Progressive Era, such as school meal programs, anthropometric assessment of nutritional health, and extension work in foods and nutrition, nutrition classes were a response to public and professional concern about malnutrition in the first decades of the twentieth century.
“The Schism Between Medical and Public Health Education: A Historical Perspective”
A. R. Ruis & Robert N. Golden
Academic Medicine 83, no. 12 (2008): 1153-57.
The separation of “medicine” and “public health” in academic institutions limits the potential synergies that an integrated educational model could offer. The roots of this separation are deeply embedded in history. During the past two centuries, there have been repeated efforts to integrate public health education into the core training of physicians, usually in response to a perceived short-term crisis, and without widespread, lasting success. The cost of additional public health instruction and the “overcrowding” of the medical curriculum have been cited as obstacles to creating an integrated medical/public health curriculum for more than a century. Several thoughtful and prescient proposals for integration were developed at a conference convened by the Rockefeller Foundation in the early 20th century, but not all were implemented. Today, there is growing recognition of the considerable value afforded by the integration of medicine and public health education. Many schools have responded to a national call for a renewed relationship between medicine and public health by increasing the availability of MD/MPH programs and/or by incorporating one or more public health courses into the basic medical curriculum. A few schools have created more substantial and innovative changes. Review and consideration of the history and politics of past efforts may serve as a guide for the development of successful new approaches to creating a clinical workforce that incorporates the principles of both clinical medicine and public health.