Friday, December 9, 2011

Do you eat like a lab rat?

Like you, Rattus norvegicus enjoys snack cakes

Are you living the rat race? If you labor alone yet have little privacy, find your work unrewarding and your lifestyle stressful, get little exercise, fresh air, or fresh food, and notice your weight slowly climbing, you might be living like a lab rat.

Rattus norvegicus 101

Rats are not mice, and mice are not men. Lab rats are larger than their wild relations, and much larger than house mice. Wistar rats, one of the most common lab rats, weigh 300-500 grams on average as adults, depending on their sex. By comparison, a typical house mouse weighs only 10-25 grams.

Lab rats live in boring, confined, sedentary, lonely conditions, where food is the only constantly available reward. In controlled studies on diet, standard lab rats are typically fed ad libitum, or free-fed, on a standard rat chow. Under these conditions, rats steadily gain weight.

In a fifteen-week study on the so-called “cafeteria diet,”[1] rats fed ad libitum on standard chow gained almost their entire starting weight again, rising from an average of 300 grams to over 550 grams (see Fig 1e).

Rats living in the wild are rarely found weighing over 500 grams. The weight gain of lab rats on fairly boring, but unlimited diets is the baseline effect of living like a lab rat. So what happens if you switch to a low-fat formula?

Messing with the formula

Rats fed either a high-fat chow (HFC) or a low-fat, added-sugar chow (LFC) gained weight more quickly than the rats fed a standard formula (SC). The graph shows that, while both the high-fat and low-fat fed rats gained more weight on average, the tight clusters in the chart also spread out noticeably on both of these modified feeds: there was more variation among individual rats in their response to being switched from standard chow to a new formula. Both new formulas were higher in either fat or sugar, suggesting higher palatability.

Those eating the LFC ate the most sugar, even more than rats with unlimited access to snack foods. The researchers also noted that the rats initially ate more of the new food, but by week two, their consumption declined to a level of mild overeating that remained steady through the remainder of the study.

What is the “cafeteria diet”?

But this study wasn’t just about what happens when you change the standard chow. The real test group in this study was the one receiving what the researchers call a “cafeteria diet” (CAF). This group of rats got standard rat chow, ad libitum. They also had snack foods, more than they could possibly consume. Three different kinds of snacks, rotated for variety (a full list of the snacks, which included items such as wedding cake and Doritos, is here), were available each day in large quantities. As the researchers write, “this diet engages hedonic feeding.”

The rats in the CAF group naturally disdained most of their standard chow in favor of the tastier snack foods. They ate markedly more food than any of the other rats, and continued doing so even after the two modified-feed groups (HFC and LFC) adjusted their food intake after the second week. Those eating the CAF diet of standard chow plus snack foods continued to eat about half again as much food by weight as those on standard chow alone, and gained weight at about twice the rate. That’s some energy-dense food. These were the heaviest rats at the end of the fifteen weeks, averaging around 750 grams: two and a half times their starting weight, and half again as big as the largest wild rats.

What happens when you put humans on a cafeteria diet?

Human volunteers (lean, adult men) allowed to eat whatever they wanted from refrigerated vending machines overate by about 50-60%, just like the rats. The resulting weight gain was about five pounds for each volunteer over the one-week study.[2]

As in the rat study, the humans were confined, had few opportunities for recreation, and were constantly observed. The rats, like the humans, were confined in pairs. The humans typically ate alone; no mention is made of the social dining habits of lab rats.

The food that they ate was not exactly the same. While the so-called cafeteria diet that the rats were placed on in the Obesity study was made up of standard rat chow plus an abundance of snack foods, the humans were not offered any kind of standard human chow: no rice, kale, and beans or poached chicken breasts over green salad; no meal replacement bars. Instead, they got all of their food from a set of refrigerated vending machines, stocked with commercially prepared foods. A full list is included in Table 1 of the 1992 American Journal of Clinical Nutrition study: note that the only fresh fruit or vegetable the human volunteers got all week was apples. The foods available were items such as white bread and chicken pie, as well as rat-approved snack foods like cake and Doritos. There were no green vegetables in any of the meals, and very few vegetables or fruits in any form.

Of rats and men

Among the differences between the human and rat studies above, the first, of course, is that humans are not rats. The second is that there was no standard chow available to the humans, only hyperpalatable food, while the rats had access to both. The third is the duration of the study.

Presumably, humans will also adjust to their diets, even if they are eating from vending machines. No one could maintain a lifestyle of gaining five pounds a week for very long: that’s two hundred and sixty pounds a year. Since it’s so difficult to get human volunteers to agree to longer studies, it hasn’t been well documented whether laboratory humans make the second-week adjustment the same way that laboratory rats do.

What’s the difference between your rat race and a rat’s?

Studies on the so-called cafeteria diet are really on the effects of food deserts and the stress of confinement. When all we have is food to reward us in our dingy, artificially-lit cubicles, we gain weight and overeat. Switching from standard commercial fare to a low-fat version, or even to a high-fat version, doesn’t help matters. And when we have really sweet and fatty food as our only reward in life, we eat even more of it.

Humans can insert consciousness, as my therapist likes to say. We can examine our motivations for eating out of pleasure or emotion rather than hunger, and decide whether to proceed or not. Whether to pull the emotional lever. Rats can’t do that. They will go for it, every time.

You can get out of your maze. Your food doesn’t have to come out of vending machines. You are not limited to chain restaurants and commercial packaged foods, with the odd highly shippable iceberg lettuce wedge or conventionally grown apple thrown in as a concession to health. You can cook your own food from fresh ingredients. You can exercise and seek out other stimulating activities that engage your hedonic circuits.

You are not a rat, and you are not trapped.

Monday, October 24, 2011

Pick a new plate: how MyPlate fails

Choose for healthy commodities
At no time in history have we had more access to information, and yet many of us today know little more about how to feed ourselves than how to bring food from plate to mouth. Every day, millions of people eat food that they don’t know how to make. They don’t know what the ingredients are, where those products come from, or how they’re manufactured. We have lost the traditional knowledge of how to make food, and we’ve also lost the larger knowledge of what our diets should look like. Foods that used to be regional, seasonal, and prepared and eaten in particular ways have been replaced with mass-produced products, even fresh produce, that don’t vary throughout the year or from place to place. For guidance on eating these foods, instead of family and community, we have central authorities and credentialed experts. Foremost among these authoritative institutions is the United States Department of Agriculture.

To begin to understand why the USDA is a suspect place to get nutrition information, begin with the USDA mission statement. The USDA’s mission is to promote commodity crops: grains, meat, and dairy; and, to a far lesser extent, garden crops. A pie chart of subsidies from the USDA to the various food industries looks like this:
Subsidies to commodities

Compare the subsidies the USDA gives to the largest producers of these cash crops with the USDA's official dietary recommendations to individuals. Then look at other sources of dietary advice that do not share the USDA’s conflict of interest: being in the business of supporting business while also being tasked with promoting the health of Americans. The bias becomes clear. MyPlate is skewed in favor of keeping legacy food groups that are commodities, like dairy and grains, on the plates of Americans, and toward creating new groups, like “Protein,” to encourage us to eat the new industrial food commodities, including refined soy and whey proteins.

Someone at the USDA is probably very proud of having thought up the “Protein” food group. It seems so inclusive, like a nondenominational blessing. It caters to several belief systems about food at once: industry-friendly nutritionism, the “plant-based” diet continuum from flexitarian omnivores to completely vegetarian vegans, and fat phobia. Unlike “meat,” “nuts,” or any other real food containing protein, “protein” itself is refined and treated as a commodity. MyPlate reflects the belief in a plant-based diet as health-promoting, while doing nothing to promote fresh produce. Highly processed foods like breakfast cereal, skim milk, and orange juice are included in its food groups.

Yet the USDA finds fault with fatty meat from a pastured animal, and with whole, raw milk: neither is part of an industrial diet, and their industrial counterparts are both correlated with disease. Because the USDA does not recognize the differences between industrial and natural foods, like the more balanced omega fatty acid profiles and the other vitamins and enzymes present in pastured animal products, these healthy foods are not part of MyPlate, and their consumption is actively discouraged: the meat because it is fatty, and the milk because it is unpasteurized.

Dairy is still promoted, of course. A round plate should be a pie chart, but there’s an extra plate, or MyGlass, off to one side, full of "Dairy." And although the proportions are clearly intended as a guide, the shape of this pie chart makes it difficult to read. How much of each of these food groups is in a healthy diet? Should we “eat 120%”? Why isn’t MyGlass instead a salad, a cup of bone broth, a pickle plate, a glass of wine, or plain water? Can MyPlate even be modified into a proper model, or should it be scrapped in favor of a whole new shape?

While bone stocks and fermented beverages are also important traditional foods, they are not commodities, and their protein-sparing and enzymatic benefits are simply not recognized as significant by the nutrition scientists or public health professionals who created MyPlate. Collard greens have almost exactly as much calcium as milk, cup for cup, but few people know it, because there’s no “Got Collards?” campaign educating the public, and there’s no “Dark, leafy green vegetables” group on MyPlate, either. This promotion of commodities is far-reaching and very effective. Children learning about the food groups for the first time are introduced to milk, but not to collards. Which is enshrined in the school lunch program? Which deserves to be?

Milk isn’t even a necessary food. While Americans certainly drink lots of milk, traditionally, the people of the Americas didn’t drink cow’s milk. Throughout Asia and much of Africa, people don’t drink milk. Milk doesn’t need promoting as a food.

There should be more food groups on MyPlate dedicated to vegetables: no other group contains as much diversity as this one quarter of the plate.

The “Vegetables” group, like the “Protein” group, lumps together foods with very different properties. Orange or green, starchy, fibrous, sweet, bitter, and sour, they all have different traits. Most traditional diets include some raw vegetables, and a fermented vegetable dish or two, like kimchee or sauerkraut. Raw and fermented foods provide many benefits worth promoting on MyPlate. Dark, leafy greens, and pod and flower vegetables, are important enough to merit their own, separate food group, and should be consumed often. Pretending that all vegetables are equivalent, from acorn squash to zucchini, raw kale salad to French fried potato, is just as gross an oversimplification as putting all “Protein” foods in one group.

Seven food groups promoted by the USDA in 1943
One of the original seven food groups was butter. Fats and oils, particularly the fats found in meat and dairy from animals raised on pasture and forage, are critical to human health. Once the pinnacle of the food pyramid, butter, lard, and other healthy fats are nowhere to be found on MyPlate. The Weston A. Price alternative guidelines, based on foods included in all traditional diets, have just four food groups, one of which is Fats and Oils. Fat is a macronutrient, like protein, and just as essential to life. But MyPlate, with its disdain for traditional knowledge and its exaltation of nutritionist dogma, fatphobia, and magical thinking about dietary fat, doesn’t include a “Fats” group at all.

The Food Pyramid of 1992
Then there’s everything else about food that nutritionism, and MyPlate, completely ignore, because it’s too hard to produce on an industrial scale, like the enzymes in raw foods. Nutritional guidelines from the USDA do not assign any value to a food for its environmental sustainability. They avoid any suggestion that fresh, organic, locally grown food in its season is preferable in any way to commercially prepared, conventionally grown alternatives. MyPlate does not suggest any of the synergies of traditional, complementary food pairings with its groupings. It doesn’t provide guidance on making meals suitable to different bodies, stages of life, or cultural norms. MyPlate doesn’t suggest a social setting, or the value of sharing a meal with family or friends. And even as MyPlate promotes a plant-based diet, it provides insufficient guidance to vegetarians and those with common food allergies to wheat, corn, soy, and dairy.

The most gaping flaw in the MyPlate model is that it does not provide enough guidance to eat a properly nourishing diet. If its only objective is to guide Americans in eating a healthy diet, it appears to be a failure. If, however, its real purpose is to uphold the USDA mission to promote commodity crops, MyPlate is a success.

Tuesday, October 4, 2011

Doubly SADD: The Standard American Dieter’s Diet

What’s wrong with the way Americans eat?

The way most Americans eat, and increasingly, the way more people in the world eat, is SAD. SAD is an acronym for the “Standard American Diet,” a generalization of how people in the most highly industrialized nations eat. Depending on who you ask, the problems with the SAD may include:
  • too much meat 
  • too much fat 
  • vegetable oil 
  • refined flour 
  • refined sweeteners 
  • chemicals 
  • foods denatured by excessive heat 
  • not enough plant-based foods 
  • not enough fiber
However, not every one of these factors has a clear relationship to how much stored energy we carry in our bodies, or to our health. The changes to our diet that have correlated with the obesity epidemic have not been in how much meat we eat, but in how cheap and widely available vegetable oil and high-fructose corn syrup have become since the early 1980s. The Standard American Diet is undoubtedly a highly processed diet, relying on conventionally raised plants (grown in depleted soil, genetically modified, treated with pesticides), animals raised in abusive conditions on unnatural diets, and industrial chemicals. The SAD is not natural: it’s manufactured using modern technology, and the resulting diet is unlike anything we ate in any other period of human history.

The Standard American Diet is made of cheaply produced foods that have been modified to appeal to our senses so that we will buy more of them. They are a product first, not an essential of life. Purveyors of SAD food want you to keep buying their products. Your wellness is not their primary concern.

For most of human history, the ability to store fat in times of trouble was an advantageous feature that helped us survive, not a defect that threatened our well-being. The Standard American Diet takes advantage of loopholes in our feedback systems of satiety and satiation, rather than being optimized to most healthfully feed our human bodies and psyches. The SAD provides much more calorie-dense food than any we naturally had access to through most of history, but is otherwise nutritionally inadequate. People living on food like this will tend to eat too many calories, because no matter how much bad food a person eats, the body never gets its other nutritive needs properly satisfied.

What’s wrong with Standard American Diet food?

A huge percentage of people in America, and increasingly the rest of the world, are on weight-loss diets, or have been at some point in their lives. Even those of us who aren’t dieting right now often have some idea of what a healthy diet would be, and how close our own diet comes to that goal. “Diet food” is very often as bad as or worse than what it replaces, because its qualities are based on faulty assumptions about what causes people to gain or lose weight, and on a highly simplistic view of the relationship between weight and health. Our concepts of dieting for weight loss are punitive, classist, and feed a cycle of shame that precludes healthy attitudes about our bodies and what we eat.

Willpower is not infinite. Many diet foods suggest that they take the willpower out of the equation by providing a technological marvel such as “fat free cream cheese” or “sugar free, calorie free sweetened beverages.” Of all of the tricks that industrial food plays on our senses, diet food plays the worst tricks of all, by pretending to be more food than it is, with thickeners, added flavors, and artificial fats and sweeteners simulating calories. The reason these tricks can be played at all is because we have two feedback systems in our bodies for regulating food intake: one for satiety and one for satiation. Satiation has a quicker response time, quick enough to tell us to stop eating. It’s a heuristic for satiety, which takes longer to report to the body, because food has to be at least partially digested and absorbed for the body to know for sure what it contains and what the body may still need.

Diet science is junk science. The reason all of these tricks are played is a belief in the so-called “calories in, calories out” model of body weight that, like the Body Mass Index, is made up of some numbers that make bad superstition look like good science.

The BMI is your weight divided by the square of your height. The CDC claims that it is a reliable indicator for most people, yet this ratio doesn’t tell you anything about an individual’s actual body composition. It assumes that not just most people, but you, specifically, will have the same ratio of body fat to lean mass as other people of your height and weight. It also assumes that, even if we knew your lean-to-fat ratio, it would tell you something important about your health. Clearly, this is all bullshit. Even people who’ve never heard of Health at Every Size can see that a highly active athlete and a couch potato can have the same BMI, and enjoy vastly different states of health.
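The formula itself makes the point: it takes only two numbers, weight in kilograms and height in meters, so it has no way to distinguish muscle from fat. A minimal sketch of the calculation (the 70 kg and 1.75 m figures are purely illustrative):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by the square of height in meters."""
    return weight_kg / height_m ** 2

# A 70 kg person who stands 1.75 m tall, athlete and couch potato alike,
# gets exactly the same score.
print(round(bmi(70, 1.75), 1))  # 22.9
```

Nothing in those two inputs captures body composition, activity level, or health, which is exactly the objection.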

The “calories in, calories out” model presumes that we can accurately know both how much nourishment you get from your food and how much energy you expend. Even under laboratory conditions, we have isolated a number of factors that change how much of different nutrients people absorb from their food, from how palatable the food is to whether the person has an innate inability to properly digest the food. In the real world, other factors affect how much we eat, and these methods are studiously applied to us every time we eat at the home of the endless soup bowl. We’ve only just begun to understand the role of our gut flora in nutrient absorption, and each of us has a different makeup of the many species of bacteria that live in our bodies and without which we would not exist. We know for sure that not everyone gets the same benefit from each food, each time they eat it. Calories are measured by burning a food to ash, not by observing their effect in human bodies, and not in yours, particularly. The measurement on the “Nutrition Facts” label doesn’t guarantee that the food in your package precisely matches the sample that was measured, or that you will absorb the nutrients listed. Accurately measuring not just the potential nutrition you ingest, but what you actually absorb and how your body uses it, is beyond our scientific abilities.

All of which is to say: there are healthy ways of eating that are older than our concept of the kilocalorie. Luckily, we still have our bodies to tell us what to eat, how much, and when, as long as we don’t game the system with industrial foods, including diet foods.

How weight-loss dieting damages health

Industrial food can never be the basis of a healthy diet.

It isn’t just that manufactured food is not fresh, or that it’s unbalanced compared with what we evolved to eat. It costs society more to allow industry to “add value” to food, including the value of “diet versions” of the same foods, than it would cost to ban all of it and revert to earlier ways of feeding people: fresh food, grown locally and prepared to meet personal tastes and requirements. The costs aren’t all in one place, and many of them are borne by our bodies, which labor under the sufferings caused by bad food.

Chronic stress keeps cortisol levels high, which drives us to eat, especially starchy foods. Dieting is a willpower-enforced famine. It’s profoundly unnatural and very difficult to sustain: a chronic condition of not getting enough calories, of baiting-and-switching the satiety and satiation feedback systems in the body. The stress of chronic malnourishment and the exercise of rigid willpower drive cortisol levels even higher, which only increases the drive to eat carbohydrate-rich foods. Eventually, the body asserts its need for food, and the dieter falls off the diet.

Instead of addressing the actual factors causing the epidemic of obesity (an increasingly industrialized food chain, stagnating quality of life, increased stress), we persist in the notion that systemic problems should be fixed with individual willpower. Willpower is like spoons, or socks: we don’t all have a lot of them, some of us have to make careful arrangements not to run out, and the people hurt most are the ones who need the most and have the least. Asking people already laboring under the burdens of an unhealthy society to increase their personal expense to fix the same health problems that chronic stress and injustice cause is victim-blaming, judgmental, and unfair. It also doesn’t work.

What is really making us fat

So we know that calorie-counting doesn’t work. “Calories out” is just as fuzzy a number as “calories in,” and for the same reasons: it’s practically impossible to know exactly how much energy a person expends in an activity, even one that has been measured in other people, because we differ metabolically. For example, some of us who might be described as metabolically tending toward obesity burn glycogen, the body's quick go-to energy source, at much higher rates than people whose metabolisms do not incline them toward obesity, both at rest and when exercising. Additionally, we may be less “metabolically flexible,” meaning our bodies are less able to shift to burning energy from fat stores than those of the “metabolically lean.” This makes exercise perceptibly more difficult for some people.

Another factor that tends to keep people fat is insulin resistance. Eating a diet that delivers huge jolts of sugar to the bloodstream, several times a day, makes the body’s cells resistant to accepting insulin. Since insulin is what brings energy to our cells, we’re starved and exhausted at the cellular level. The energy goes into fat stores, not into the cells that need it.

Historically, when humans were chronically stressed, it was in conditions of famine and other physical hardship, and the ability to use less energy and store more of it in our bodies was an advantage. The world we live in has so changed that this stress response is no longer a helpful adaptation. This is most evident among the people with the least power to make non-mainstream choices. People who are poor, abused, lack transportation, live in food deserts, can’t cook for themselves, are institutionalized, have no safe places to exercise, live with trauma and other chronic, painful, and debilitating conditions, work in unsafe conditions, and live with other modern stresses are the hardest hit by the effects of industrial food. Our modern, efficient food delivery system cures energy starvation, but in its place causes diseases of overconsumption and malnourishment.

The effects of metabolic inflexibility and insulin resistance make it increasingly difficult to exit a spiral of sedentary lifestyle and obesity. Add chronic stress, limited food choices, and few socially acceptable ways for fat people to exercise, and it’s easy to see how modern life makes people fat: not because we lack willpower, but because it’s unreasonable to expect people to expend energy as if it’s unlimited, to overcome the obstacles placed there by the same institutions that are supposed to make health easier to achieve.

Eating well is for every person

Health isn’t measurable through a BMI. Health is a subjective measure of one’s well-being and includes physical, social, mental, emotional, and spiritual factors. It’s not just your risk of developing a serious illness, but how well you feel each day.

The dietary model of willpower and calories doesn’t work for weight loss. More industrial food, harmful misinformation about eating and health, and victim-blaming exacerbate the health problems that industrial food has wrought on society: malnutrition, overeating, guilt, poor self-esteem, and poor health. The Standard American Dieter’s Diet relies on the false science of calories and nutritionism, and on the false assumption that willpower is an infinite resource. It assumes that our systemic problem is actually just a lot of individuals with willpower problems.

It suits the diet industry for its diets to convince you of their efficacy yet continually fail you: a beguiling promise that makes you its convert, and its consumer, is a cash cow. No one stands to make much money encouraging people to eat home-cooked meals together with their families. But everyone in society stands to benefit from increased well-being, which is why public health is properly publicly funded work, not contracted out to McDonald’s. It’s time to remember what the covenant of civilization is supposed to be for: not to make a few rich, but to give everyone a level of freedom and security in pursuit of their natural rights.

The cycle of bingeing, starving, and failed attempts to gain control over one’s life through willpower, is bad for every aspect of one’s health: self-image, nutrition, and energy levels. What is needed is not more willpower for fat people, but a unified model, universally embraced, of competent eating that includes a healthier attitude about food, treating food as a right and a pleasure instead of drudgery, more complete knowledge of how our bodies work, and a better food system that actually meets our needs for health first, rather than the corporation’s bottomless pit of desire for profit.

Tuesday, September 13, 2011

How to eat: hunger and its satisfaction

Hedonism Bot knows what hedonic circuits are
The concept of eating according to my desires and trusting my body to choose what I need, and only as much as I could use, was first introduced to me as a new parent. Folk wisdom derived from Ellyn Satter’s Division of Responsibility in Feeding came down to me as a method to feed small children. Based on this trust in the wisdom of the body, Satter encourages caregivers to provide regular meals including a variety of healthy foods, but not to worry about which foods the child eats or rejects.

Years later, on the Beyond Vegetarianism website, I read about Anapsology, or the Natural Hygiene movement. Like others in the ancestral health movement, these people attempt to mimic the prelapsarian conditions in which we thrived. “Instinctos,” as Anapsologists are also called, eat only one food at a time, in its unseasoned, uncooked state. They choose the food by following their desires, and eating attentively until they no longer want any more of the food. They describe the sensation of a food suddenly becoming less palatable as a “stop.”

The “stop” is also known as “satiation.” J. Stanton’s educational series on hunger and its satisfaction was a revelation for me, beginning with its vocabulary lesson in Part I. Not only were the subjective experiences of my body important and discoverable, but now I could talk about them with specificity. My hungers were not just my problem, or shameful. They were clues that I could use to satisfy my needs. Free-feeding from a healthy selection wasn’t just for small children: adults are also designed to eat according to their body’s cues. “Hunger is not a singular motivation: it is the interaction of several different clinically measurable, provably distinct mental and physical processes,” Stanton wrote, and I realized that I was learning the very basics of how I could feed myself competently.

Not only is there more than one hunger, the body has two separate systems, satiety and satiation, that tell you when to stop eating. One system, satiation, is a heuristic that estimates the nutrition you’re getting from the foods you’re eating, based on everything from what the food tastes like, what else you’ve been eating lately, and how much other people seem to be eating, to the comfort of your surroundings at mealtime. Satiation is meant to be a good predictor of satiety, which is the satisfaction of our nutritive needs.  Satiety is based on having digested the food, and knowing for certain what it contains. When we are sated, we have gotten enough energy, protein, fat, and other nutrients. We feel satisfied: replete.

The satiation system is accurate in a healthy person, offered a natural selection of foods, closely predicting the nutrition we will be able to get from the food we eat. Having a system that provides instant feedback at the moment we can use it is more useful than waiting the several hours it would take for satiety to occur. Satiation prevents us from chronically overeating until we feel sated. If we can't feel the "stop," we have to rely on other clues to tell us to stop eating, or eat until we are uncomfortably full.

The unhealthy effects of food deserts that limit our real or presumed choices, industrial foods that are designed to be highly palatable, large serving sizes that fool us into thinking this is what people typically eat at a sitting, and unexamined attitudes and beliefs about the relative value of different foods, all sabotage the hedonic circuits that translate food reward into future satiety. Our satiation systems, guessing at what will ultimately sate us, are led astray by hyper-rewarding food. When we eat quickly, eat foods that are bland, or drink high-calorie beverages, we take in the food too quickly even for our satiation signals to tell us to stop before we’ve had enough to sate. On the other hand, when we eat foods that simulate starch, fat, sugar, or salt, the satiation messages we receive do not match the foods’ abilities to sate. In either case, we may eat to satiation but not be sated, if our food doesn’t contain what we need.

Creating a lifestyle that supports healthy eating means correcting the environment, as much as possible, so we receive realistic messages about what we’re eating and can “hear” what our bodies want. The Instinctos do this by limiting their definition of food to what could be found and eaten in the wild, where we presumably thrived before civilization. Other traditional diets manage the environment by using the tools of civilization to regulate food supply and demand. Customs limiting the foods one may choose from, or the ways those foods are combined, or when they are eaten, are used to impose healthy limits. Satter’s program, as it is applied to adults, employs traditional methods like having regular meals and sharing them with others, as well as listening to the body’s cues for hungers and their satiation.

Being able to eat competently requires confidence that we know what is good to eat, that there is enough food, that it will be offered at frequent and predictable intervals, and that we can eat according to our pleasure and senses. An important step toward achieving this level of confidence is developing trust in our hunger drives, and being trustworthy in satisfying them.

Tuesday, August 30, 2011

What is ancestral health?

The vegetarian movement is gaining speed. Walking into the local Whole Foods Market this summer, I noticed a large banner hanging inside the entrance that recommended a “plant-based diet.” A New Yorker interview with co-CEO John Mackey suggests that the vegan at the head of the company would like to lead Whole Foods shoppers away from the meat counter. One of my food heroes, Hugh Fearnley-Whittingstall, who once killed a chicken on live television, has recently declared himself mostly vegetarian.

My own dietary practice is guided by a belief that people have always eaten meat. While I share a concern for animal welfare, I believe the ethical answer must include satisfaction of our animal needs. A fear of meat suits the industrial food complex, frankly: it’s cheaper for them, and has a higher profit margin, to sell you an overprocessed “burger” out of the freezer case than to make truly healthy meat—properly raised, humanely slaughtered, and untainted—available to everyone. The industry moves slowly, so while they’re turning their barge around, they have some cash crops to unload on the consuming public. I’m even cynical about the health news cycle. They all chant in chorus, from fad to fad: first fat was bad for us, then carbs, and now protein. Eventually, they’ll have to cycle back around again, or there will be nothing left to eat.

The plant-based approach has a challenger in the ancestral health movement. “Ancestral health” is an umbrella for practices that attempt to closely emulate the way people ate and moved a long time ago, with the aim of curing or preventing diseases of civilization. Although there are varying interpretations, the general assumption within the ancestral health movement is that for most of our history as a species, people ate a diet that met our nutritional needs quite well, and that diet has never come in a box or a can, had a nutrition facts label, been manufactured, or been entirely devoid of animal products. Where those in the ancestral health movement begin to disagree with one another is on the matter of how long it’s been since we ate the right diet for human health.

According to those who advocate the “Paleo diet,” our ideal diet is the one we evolved to eat fifty thousand years ago. People who follow it vary in the strictness of their interpretation, generally deferring to Dr. Loren Cordain’s book (and blog), "The Paleo Diet," as the first and most orthodox guide. Internet discussion tends to concentrate on certain pet topics: what kind of exercise people regularly got in the Paleolithic, and whether one agricultural food or another is permissible on the diet, are common subjects. Paleo is popular with the CrossFit crowd. My other friends, the ones who shop at Walmart and BJ’s, and even the ones who shop at Whole Foods, have mostly never heard of the Paleo diet. When it comes up, I start by explaining the basics: you can eat meat, fruit, vegetables (but no nightshades), and nuts; no dairy, grains, beans, or lentils, on the assumption that before agriculture was invented, ten thousand years ago, no one ate dairy and wild grains were too small and sparse to be a significant food source.

After the Paleolithic era came the Neolithic, with its revolution of human innovation: farming was the first of these technologies, and its benefits gave us more leisure, food security, and hence more people to invent even more things, including ways to feed even more people.

Haven’t we managed to find any healthy ways to eat in the last ten thousand years? Dr. Weston A. Price was a dentist who traveled around the world looking at people’s teeth. He found some commonalities among the many different indigenous peoples who still ate pre-industrial diets in the early twentieth century. They had terrific teeth, for starters. They also lacked certain malformations that we think of as perfectly normal today: narrow faces, overbites, crowded teeth, wisdom teeth that have no room in which to erupt. Some of these cultures traditionally ate only dairy, or had very little meat, or ate a lot of seafood, but people in every culture Dr. Price studied ate animal products, some nearly exclusively. They all had cultured foods in their diets, and ate at least some of their foods raw (including raw animal products). Those who ate grains and legumes soaked, sprouted, and fermented these foods before eating them.

It would appear that, for at least some people, such as the gluten and lactose intolerant, some staples of the agricultural diet are indigestible. Even greater numbers of people suffer ill health from eating sugar, factory-farmed meat and dairy, and refined flour. Both the Weston A. Price people and the Paleo people believe that industrial foods are bad for your health. They disagree on whether to accept the compromise we made ten thousand years ago.

Tuesday, August 16, 2011

Nutritionism: a fabrication of the Industrial Age

When Spaniards brought maize to the Old World, they failed to import the traditional wisdom for processing the corn to make its niacin bioavailable. Wherever corn consumption spread in Europe, a disease called pellagra became endemic. Hundreds of years later, when the cause of the condition was finally proven by Western science, instead of adopting the ancient method of nixtamalization to process corn meal, manufacturers simply added niacin.

The way nutrients were first discovered was through industrial food. White rice, flour, and sugar started off as luxury goods. They were “refined” in both senses of the word: undesirable matter was discarded, and the resulting product was smoother and more appealing. At first only the rich could afford them, but as demand grew, the scale of production increased and prices dropped, until these refined foods were so cheap that they became staples of the very poor. And that is when we started to see how poor these new foods really were, because people who lived on them developed diseases of malnutrition.

Diseases of malnutrition don't always have obvious signs, so it took a while for scientists to piece together the clues. When people living on white rice developed a range of symptoms, including difficulty walking, sufferers called the condition “beriberi.” It was cured with thiamin, which is found in rice bran but removed in making white rice. White rice had been adopted by industry because it stores longer than brown rice, so instead of switching people back to brown rice, with its shorter shelf life and lower profit margin (corporations make money by "adding value," i.e. processing, to raw ingredients), industry solved the problem more cheaply and profitably ("Now with thiamin!") by fortifying white rice with thiamin.

And so we learned, from one mistake after another and finally through concentrated study, what parts make up food, their functions in the body, what in food can be dangerous, and how much and what kinds are needed for health. "Nutritionism" is the belief that our need for food can be deconstructed into a finite number of irreducible and replicable substances called “nutrients,” which may be assembled in any order convenient for manufacture. For many years now, ready-to-eat breakfast cereals have been supplemented with vitamins and minerals, justified by the tenets of nutritionism. The additives in Cap'n Crunch are typical: niacinamide, reduced iron, zinc oxide, thiamin mononitrate, pyridoxine hydrochloride, riboflavin, and folic acid. Many are in forms chemically distinct from those found in food, and in some cases we even know that the body doesn’t use the synthetic, supplementary nutrient the same way it uses the nutrient found in our food.

Under nutritionism, any compound in a food can be isolated and either concentrated as a nutrient (e.g., soy protein isolate) or discarded as undesirable (e.g., the fat in fat-free milk). Which nutrients we value, and which we abhor, change like trends in politics and skirt lengths. Those who refined rice didn't value fiber, considering it an inefficiency in getting enough energy from one's diet; today it's added to foods as a supplement. In the 1980s, everyone feared fat and dieters ate big plates of pasta and baked potatoes. Since then, the tide has completely turned on carbohydrates, while we retain our fear of both fat and calories. Being afraid of most macronutrients creates anxiety around food, a fear that industry nurtures and is glad to assuage with its highly processed offerings.

The ultimate act of faith in nutritionism is the attempt to live on nothing but industrial meal replacements. But to do so would be folly: the evidence suggests that industrial food is causing a whole new set of diseases today, from diabetes to cancer. The pace of change in our diets has grown so quickly that we can’t even reliably look to our oldest living relatives for a model of a sustainable diet. Nutritionism sells peace of mind, with its certainty that science has just unlocked the nutrient you need to add to your shopping cart.

We won’t find the answers to our health care crisis in the industrial model, because the machines of capitalism don't back up: they can only plow forward, doing what machines do, which is make more products to sell. Just as industry's answer to beriberi was fortified, sugar-sweetened, chemically preserved grain flour, they have answers equally appetizing and easy to manufacture for heart disease, breast cancer, and diabetes. They'll even sell you food for obesity. As Michael Pollan points out in his famous essay, “Unhappy Meals,” the fact that an industrial food product has a label and makes health claims should make it suspect. It's a product first, not a food.

What we know about nutrition may be wrong, and is certainly incomplete. However, we know a great deal about food.

So: Eat food.
The rest is commentary.

Tuesday, August 9, 2011

You can't manufacture food security

This has been a good time to talk shit about the food industry. In California recently, raw milk sellers were arrested; it reminded me of when the medical marijuana dispensaries there were raided by the feds, as a battleground in the war over states’ rights. At the same time that fresh milk and dried herbs were being seized and destroyed, food-manufacturing giant Cargill was regularly selling meat that the USDA knew contained salmonella. No one died or even allegedly got sick from drinking raw milk from the California suppliers, but as one blogger astutely noted, if they had, it would have been easier to trace the bad milk back to its source. Cargill, on the other hand, is so big that they deal with other giant suppliers. Back in 2007, one of their beef suppliers sold them beef adulterated with E. coli strain O157:H7. Right after the 2011 turkey recall, it was announced that Cargill was suing that supplier.

The USDA doesn’t consider salmonella a contaminant, and considers only one strain of E. coli, O157:H7, to be an “adulterant,” even though any pathogenic strain of E. coli in your food can make you wicked sick. The difference with the worrying strain with all the extra letters and numbers is that we don’t have a good treatment for it yet. This is the danger of the arms race against bugs: we come up with a spray or a pill that kills a bug that plagues us. We start using it on everything, like people washing their hands with antibiotic goo that they keep in their handbags and on wall dispensers. Since there’s nowhere else for the germs to go, because some of them need us as hosts, or to breed, the bugs evolve to resist our efforts, and the game steps up a notch. We see it in resistance to common sprays like Roundup, and in germs that sicken humans and livestock, like E. coli and salmonella.

We can’t actually sterilize the planet against bugs we don’t like. Like many of my peers, I have the normal revulsion against icky things, bugs included, and I grew up with privileges that protected us from a certain amount of suffering: we drank and bathed in clean water, and if we picked up parasites, our parents had us quickly deloused, dewormed, or otherwise dosed. But the world contains robust, complicated systems that include these bugs. It’s part of the big plan of life that we will get diseases and sometimes die of them. It’s good that we want to reduce suffering, but we can’t imagine that there will be a world without disease or death. Our lives depend on the deaths of others, and even on our own deaths, and on the lives of those that depend on our deaths. It isn’t just bacteria that need us dead: our children need us to clear out to make room for them and our descendants.

I think my society’s attitudes toward life and death are at the root of our unhealthy food systems. They’re why we have no problem seeing everything we do as progress, and believing that more of the same will solve the problems that remain in the world. Our society doesn’t see how we prioritize our resources, or how we conceive of our problems and set about solving them, as being directed by any values. The only thing we seem to value is money, because it’s the one thing we consistently measure, and we translate every other cost, from health care to environmental cleanup, into dollars in order to compare them.

I think people are very uncomfortable choosing answers outside the mainstream. Usually, it’s a winning strategy to go with the herd. But sometimes the herd is wrong: sometimes the leaders have other agendas and priorities. Local councils representing the interests of a group of food producers are made up of people; they have families who eat their products, too. And the business interests of a farmer are much like yours: money’s tight, and you want a sure thing if you can get it. You don’t take risks you don’t understand. You grow what you have a buyer for. And so your interests become aligned with your employer’s, or with whoever’s buying your product and therefore calls the shots. A farmer who wants to be big and successful will embrace the model that the infrastructure already supports: cash crops, grown by conventional methods, because that’s what the big buyers buy and what the loan officers at the bank want to hear, too. It makes cash sense to do this, and it makes the same kind of sense on up the line.

Put yourself in the mindset of a corporate CEO, beholden to stockholders to make them money every quarter. The USDA thinks it’s taking care of eaters because it’s taking care of farmers, and the farmers are convinced that their needs are those of capitalists, not of people who eat food. What’s for sale has nothing at all to do with what’s good to eat. It has to do with supply chains and the security of huge scale and inertia.

But you, cooking for yourself and your family and friends, do not make food decisions like Cargill. You can change how you eat today. Can you imagine how long it would take for all of those supply chains to change? We see how things change when something ripples down from Washington, DC. It can happen from a more local change, too, as when California or Massachusetts or New York leads the way with some local law: no trans fats, say, or San Francisco causing the Happy Meal to sprout half a conventional apple. McDonald’s wants to be the same everywhere it goes, so even if you’re in Dubuque, you’ll get half an apple, too. Whole Foods Market is a national chain, and if you go in there looking for local flavor you’re going to be disappointed. You can start shopping at your farmers’ market, or the co-op, or an independent grocer. Making this change will have rippling effects on your life, too. You’ll cook more. You’ll learn about foods you hadn’t considered eating before, and ways to buy them you hadn’t imagined, either. You can learn how to find your own, personal food security.