When I was in graduate school, I was looking for a scenic, quiet place to live. My university was on Long Island, 50 miles east of New York City, and I decided to explore communities that were farther out on the eastern tip. It wasn’t the fashionable part, with the Hamptons and Montauk Point, but the North Fork, made up of mostly agricultural communities and small towns. By early in the winter I’d found a wonderful old farmhouse tucked away along a bluff. From there, I could make my way through blackberry thickets and down the 70 steps leading to the beach.
The realtor had been hesitant to even show me the property. Though structurally sound, it obviously hadn’t been used in ages. The bathroom still worked, but there was no shower, and during the entire year I was there, the water ran brown from decades of rusting pipes. Exiting the bathtub was like emerging from a tanning salon. It was beyond rustic; I might have called it uninhabitable if it weren’t for my imagination and daily access to the breathtaking, endless beach. I wasn’t bothered by the fact that it had no refrigerator because I figured it would surely be cold enough most of the year to store things on the windowsill. Sure enough, when I aired out the place, I actually found a stick of margarine on the windowsill. At first glance, I thought it might have dropped off the assembly line recently, but after reading the label, I realized, to my amazement, that it dated back to the beginning of World War II—the last time the house had been occupied.
I had more surprises in the spring. My house was sitting on acres of deep purple—purple cauliflower, which I had never seen before. Long Island was rich in food production in those days, and this land was leased to a local farmer. Small farmers thrived by growing a diversity of foods that have since virtually disappeared from our mainstream food system. Purple cauliflower, a centuries-old heirloom from South Africa, flourished in the nutrient-rich soil, bathed in moist sea air. To my delight, this hardy companion was a healthy food. Whenever I prepared a meal—from farm to table before it came into fashion—I would cut it fresh and sauté it in perhaps half an hour. It had a delicious, delicate flavor—and it didn’t require margarine.
So, I had the two extremes at my home—the dream of purple cauliflower and the nightmare of 30-year-old, still “perfect” margarine. As someone who loves to cook and entertain, I enjoyed watching friends’ reactions to these two marvels. The margarine inevitably produced shock and dismay, taking away people’s appetites. The cauliflower, on the other hand, had a way of perking up a dish and lifting moods, owing greatly to its vibrant color. The full purple effect started in the fields. Seeing my home tucked into the rolling violet blankets, friends would wonder about the cauliflower’s taste and texture as they approached, wanting to know the story behind it, where it came from, how it got here.
Food is an everyday expression of our culture, and every cultural identity is partly tied to a unique way of preparing it. Since our survival requires food, our culinary traditions have reflected our history, both the land where our ancestors lived and what that land produced. Human ingenuity and community traditions built on these basic materials, still reflecting a local ecosystem and its resources. As food cultures have been passed down through the generations, families and communities have inherited knowledge and rituals that carry an enduring personal and shared significance.
So how, then, did the modern U.S. food system move toward the bionic margarine? From cultivation and processing to marketing and transportation, modern-day food consumption patterns have been reshaped by corporate food production. In a relatively short time span, developments in technology, aided by agricultural policies, shifted industry’s emphasis from the marketing of whole foods to the manufacturing of highly processed food products. Initial innovations, like frozen fruits and vegetables or tomato sauce, had significant benefits, but the drive for profit has led to much greater and more extensive changes, including reduced standards for what constitutes “food.”
Consumers welcomed the food industry’s innovations in part because they lowered prices, reduced risks from spoilage, and extended availability beyond the growing seasons. Meals could be prepared more quickly, so women (in traditional roles) were able to spend less time in the kitchen. And for women who also worked outside the home, the expectation that they prepare meals remained, so convenience was perhaps even more important. Snappy marketing campaigns capitalized on consumer interest in ease and thrift.
Over time, industry and retailers altered the types of available foods and shifted national eating habits, concocting handy foods that were fun, fast, cheap, and tasty. But behind the alluring novelties and happy marketing is a reductionist science that breaks down food into nutrients infused with addictive additives and treats our food like a jigsaw puzzle—with the hubris to assume industry can out-do nature and put all the pieces back together again. Subsidized additives—by-products of corn and soy—have allowed manufacturers to modify “recipes” for greater profit. Instead of nature, health, and culture guiding food decisions, business interests became the driver. These changes have caused a radical shift in norms and a host of food-related health impacts.
Nutrient-dense, whole foods—like fruits, vegetables, grains, nuts, seeds, and legumes—that have sustained people for generations have been replaced by three additives selected to make us buy and consume more: sugar, salt, and fat. Michael Jacobson, head of the Center for Science in the Public Interest and coiner of terms like liquid candy, also gave us the term junk food. Dr. Wendel Brunner, the public health director I worked for in Contra Costa County, goes even further, calling this food toxic waste.
I wasn’t surprised when the National Cancer Institute announced recently that nearly 40 percent of the calories U.S. youth consume are empty ones, coming from things like soda, pizza, and desserts. In fact, few U.S. youth eat the recommended amounts of fruits and vegetables. The predictable results of these trends have included epidemics of chronic diseases such as diabetes, heart disease, and stroke, all of which have roots in unhealthy eating and inactivity—and whose onset is occurring increasingly in children and adolescents.
In a U.S. health care model in which disease is treated rather than prevented, the costs associated with these trends have, of course, been astronomical. As the former chief CDC medical coroner Dr. Beverly Coleman Miller once told me, even the organs of children and adolescents are changing: in examining the bodies of inner-city youth who died early—too often from violence—she was shocked to find that their internal organs were damaged to an extent she had previously only seen with older adults.
Reprinted with the permission of Oxford University Press, Prevention Diaries: The Practice and Pursuit of Health for All (copyright 2017).
Larry Cohen is founder and Executive Director of the Prevention Institute, a leading authority in developing practical prevention strategies for communities. Cohen’s accomplishments include catalyzing the nation’s first multi-city no-smoking laws; helping define violence as a preventable health issue; advancing chronic disease prevention through physical activity and healthy eating; and promoting better understanding of the underlying causes of illness, injury, and health inequities. He lives and works in the San Francisco Bay Area.