Eating to Learn, Learning to Eat: School Lunches and Nutrition Policy in the United States, 1890-1946
This project explores the early history of school meals and other nutrition programs for school children from their beginnings in the late 19th century until the passage of the National School Lunch Act in 1946. It examines how the grass-roots labor of mothers, teachers, school administrators, nutritionists, and physicians led to the creation of the National School Lunch Program, the federal government’s longest-running health and social welfare initiative for children. The history of school meals is one of continuous conflict: between ideology and implementation, medical needs and political parsimony, nutritional health and agricultural protection. These conflicts were central to negotiations of responsibility for children and their well-being between home and state, private charity and public welfare, and national and local government. Although conceived in an era marked by eugenic policies, the resurgence of nativism, hemispheric isolationism, big-stick foreign policy, and imperial paternalism, American school meal programs were promoted as a universal and egalitarian approach to health, education, and social welfare. School meals not only marked the beginning of public responsibility for the nutritional health of children; they soon became a backdrop against which a rapidly diversifying population addressed issues of race and citizenship, as the lunchroom (and the foods that filled it) represented both assimilation to American culture and the heterogeneity that was beginning to define it in the early 20th century.
“Children with Half-Starved Bodies” and the Origins of Public Health Nutrition, 1880-1960
This project examines the factors that led health authorities in the early 20th century to develop comprehensive public health nutrition programs. Although medical professionals had agreed since the turn of the 20th century that malnutrition was a severe public health problem, particularly for infants and children, as late as the 1940s they still disagreed considerably about its clinical presentation, incidence, diagnosis, surveillance, and prevention. Malnutrition was characterized not by the presence of something foreign, as with infectious diseases, but by the absence of something essential. Much like obesity today, malnutrition in the first half of the 20th century provoked an eristic dialogue on the limits of state intervention, the role of health professionals in prevention, and the nature of illness itself.
Health authorities thought early detection critical for prevention, but this raised significant questions about how to define “normal” health and how to measure deviation from it. In the search for standardization, a fundamental contradiction emerged: as research into human growth and nutrition increasingly quantified both child development and the relationship between food and health, the diagnosis of malnutrition seemed to defy quantification. Numerous anthropometric and biochemical methods were proposed for use in both private practice and population-based programs, but all failed to identify malnourished children reliably, foreshadowing the failure of the body-mass index (BMI) to identify obese persons a half century later. By the late 1940s, physicians turned increasingly to sophisticated (and expensive) diagnostic techniques, and health departments abandoned the search for simple but effective surveillance methods in favor of a universal, uncontroversial approach to malnutrition based on education, enrichment of foods with vitamins and minerals, and dietary improvement.