Friday, April 6, 2012

Meat and potatoes

Meat and potatoes have gotten a bad rap over the years, and they don’t deserve it. I’ve changed how I source and prepare my food so that I can be sure it’s good, healthy food, all the way to its source. Your meat and potatoes are probably going to kill you, but it’s not for the reasons you might think.



Frying potatoes in freshly rendered suet
Let’s start with the potatoes. How do you take your spuds: pre-fried and then individually quick frozen? Pre-fried, frozen, and then refried? Mash from a box? Unless you’re buying organic, chances are your convenience potatoes are grown with an eye for convenience to the supply chain rather than nutrition and flavor. Conventional agricultural practice requires so many toxic pesticides that potatoes are on the EWG’s “dirty dozen” list of produce you should never buy from conventional growers. Who does buy from conventional growers? The commercial producers of potato products: the companies that sell you frozen fast food French fries, hash browns, potato chips, and other industrial potato products. And it’s not just cheap meals that are made with conventional potatoes: just about every restaurant uses conventional produce and frozen convenience foods. Whether you’re washing down drive-thru hash browns with coffee, having a bag of chips with your sandwich, or ordering a baked potato with your steak dinner, the potato is likely to have been conventionally grown in a monoculture of Russet Burbanks that are sprayed with poisons, as well as genetically modified to make their own pest-killing chemicals.

Doesn’t that sound delicious?



Then there’s the frying oil. If you’re old enough and were raised within a few miles of a McDonald’s, you can remember what potatoes fried in beef fat smell like. According to “Fast Food Nation” author Eric Schlosser, for decades McDonald’s used a mixture of mostly beef fat with some cottonseed oil to fry their famous shoestring potatoes. In 1990, they switched to pure vegetable oil, although they still add “natural flavor” from “an animal source” to their French fries to evoke that beefy flavor and odor. Vegetable oil might be a step up from shortening, which really will kill you, but it’s still several steps back from the healthiest frying choice: their original, beef fat.

People have only recently begun using vegetable oils in large amounts. In antiquity, we ate oils from fruits like the olive, which are evidently oily to the casual observer and can be pressed by hand. The kinds of vegetable oils used today in great quantities, like canola and cottonseed, are extracted from tiny, hard seeds using pressure and heat, changing the chemical structure of the oil and generally making it rancid. Rancid oils are carcinogens. Wellness Mama’s blog post on why you should never eat vegetable oil or margarine includes a flow chart that describes the process of manufacturing vegetable oil. Margarine is typically made from hydrogenated vegetable oil (like Crisco) and has had its fats warped in such a way that they are now recognized as harmful: New York City banned trans fats in city restaurants in 2008.

Dr. Mercola points out that Crisco was originally developed for candlemaking, but when that industry declined in favor of electric light, they began marketing it as a healthy alternative to saturated animal fats. Suet has also been used in candlemaking. It is very firm and waxy, and once rendered, is snowy white in color. Old-school cooks in the South and elsewhere swear by rendered animal fats for the flakiest pie crusts and light, crispy fried fish.



Two hundred years ago in Ireland, a family could support itself in good health on an acre of potatoes and a milk cow. Potatoes are a nutritious food, not only energy dense but full of potassium, vitamin C, protein, and fiber. Raw dairy supplied more protein and other vitamins, as well as the saturated fat essential to life.


Traditionally raised and prepared pork ribs, collards, and fried potatoes
A hundred years ago, people ate virtually no vegetable oils, and the average American ate about 18 pounds of butter a year. Today, people eat half that amount of butter, but have added 70 pounds of vegetable fat a year to their diets.

The idealized diet in the West now looks like a meal you can buy at McDonald’s: a small amount of lean flesh from grain fed animals (like a chicken breast or beef patty), and the rest of the diet being made up of high-density vegetable fat, like cooking oil, and carbohydrates from vegetable sources, either naturally dense or made so with refining: the white flour bun, soda, and fries. This is a long way from a buttered roasted potato, as it was commonly eaten and enjoyed two hundred years ago, in every regard: ratios, sources, methods, the length and complexity of the chain from soil and sun to dinner plate.

This week, I made a dinner of French fries fried in freshly rendered beef fat, and slow-cooked pork ribs, with a side of slaw. The next night, I ate the leftovers with some garlicky collard greens. The fried potatoes were surprisingly light, not greasy, and the scent reminded me of how fries used to taste when I was a kid. The ribs, from a locally raised pig that foraged, were strong in pig flavor and muscle fiber, not mushy like grain fed pork can be after slow cooking. The meat was delicious and tender and pulled away from the bone cleanly.

This weekend, having ground beef, bacon, mushrooms, Cheddar, suet, and potatoes all on hand, I will probably make a gluten-free equivalent of the fast food classic: a beef burger with all the fixings, and a large serving of French fries. The meat isn’t going to kill us, any more than the fat will, because we’re supposed to eat animals that lived naturally. The ground beef and tallow both came from a grass-fed bull. I might eat some kale with it, but I don’t need to add greens to redeem this meal. Food made the right way, all the way back to its origins in the Earth, will nourish, and I can trust my senses to tell me when I’ve had enough of any part of it. A restaurant meal can end in bloat and regret, but this never seems to happen when I make it myself from scratch.

A meal of conventionally raised meat and potatoes can look identical to one prepared with the right ingredients. Thanks to the flavor industry, you may even be fooled into thinking they taste very similar. Yet the two meals can differ in every important and measurable way, from their genetics, to the health of the organisms when they were alive, to their current chemical composition and nutritional profile. These differences extend to whether the two meals in question---one produced traditionally, the other industrially---will feed you or poison you, and in what measure they will satisfy and nourish, or discomfort and deplete.

Wednesday, March 7, 2012

Industrial Imperialism: Exporting the Flavor of America



In Japan, children are taught about healthy eating from early school age, in a unit called shokuiku, or “food education.” Last year it was reported that McDonald’s, home of the Happy Meal, now produces food education materials used in schools in Japan. According to Dr. Yoni Freedhoff, family physician and health blogger, the American fast food giant is hoping that Japanese school children “will associate McDonald's branding of their healthy eating classes, and hence McDonald's as a whole, with healthy eating.”

McDonald’s is selling the values and flavor of the West: America’s most important export. In China, McDonald’s presents its food as being of the highest quality. Given the recent history of food recalls from China due to tainting and corruption (would you buy organic from China?), it would be ironic, if not for all of the victims who are taken in by the commercial food producers’ claims.

There have always been dangers inherent in food. We must eat or we die, but eating the wrong thing, once, can be a fatal mistake. Worse is when we eat something continuously, sensing no harm, only to find out that the dangers accumulate over time with few or no symptoms. Smoking a cigarette won’t kill you. Smoking a pack a day for ten years won’t even kill you. But we know that cigarettes contain dangerous toxins, and that people who smoke for decades are much more likely to develop certain diseases. Saccharin, Red No. 2, and other famously toxic food additives didn’t taste like poison to fans of Tab and the old red M&Ms.

Products with known health dangers, like tobacco cigarettes and infant formula, were once far more popular in the US than they are today. In the 1960s and ’70s, more than half of American men smoked cigarettes. Breastfeeding, once the only choice for feeding infants, came to be regarded as practically unnatural once infant formula became widely available. A new cultural wave of breastfeeding awareness and celebration has been breaking against that corporate front for more than a generation.


Since then, both of these products have been more aggressively marketed in other countries, where the common wisdom has not yet developed against these new dangers. Rising middle classes, like those in India and China, want what was so successfully sold to the middle classes that came before them. A Big Mac and a Marlboro is the flavor of America. Even infants can suckle at rubber teats full of chemicals, their mothers convinced that this is better.

Don’t look down your nose at them: we were convinced to do exactly the same things, here. Look at pictures from the 1970s and be reminded of how, in the course of an American century, cigarette smoking went from uncommon to ubiquitous and back to uncommon again.


Today, a similar battle is being waged against fast food. New York City bans trans fats, and San Francisco bans Happy Meals. Yet most people in this country still eat fast food, still wash it down with carcinogenic beverages. (Again, are you surprised?)


They’re clamoring for them, say the multinationals who sell poison, cheap and tasty. The people want caramel color that is made with ammonia and linked to leukemia. If they didn’t want it, they’d buy something else.


Are you buying it?


Photo credit: nelo_hotsuma

Wednesday, February 1, 2012

Paula Deen Is Not Your Mom



Paula Deen, food entertainer, has recently revealed her relationship with Victoza, a diabetes drug



“Honey, I’m your cook, not your doctor. You have to be responsible for yourself.”---Paula Deen

“Let your medicine be your food and your food be your medicine.” ---Hippocrates

Caveat emptor---let the buyer beware---is Paula Deen’s warning regarding her portrayal of Southern cuisine.


Deen’s food entertainment empire began with her restaurant in the city of Savannah, Georgia, where she lives, and has grown to include TV shows, cookbooks, endorsements, and product lines. Fans of celebrity chefs like Deen are encouraged to consider the similarities between cooking at home and cooking for a living. But home cooks have different concerns from restaurant chefs. Making dinner may not be a home cook’s main or only job, but home cooks feel responsible for the health of the people they cook for, not just their enjoyment of the meal.

If you are lucky, there is at least one person in the world who cares about your health enough to cook for you accordingly. Make no mistake: Paula Deen is not that person. She is not your cook, not your doctor, and unless your name is Jamie or Bobby Deen, she’s not your mom. She is an entertainer, and her media empire now extends to speaking on behalf of Victoza, a drug she takes to manage her diabetic condition.

Paula Deen has recently admitted that she has insulin resistance, or type 2 diabetes. In The Washington Post, Jane Black describes Deen’s condition as “eminently avoidable” and “usually brought on by a combination of unhealthy eating, excess weight, high blood pressure and a couch-potato lifestyle.” A more nuanced explanation is available courtesy of Dr. Stephan Guyenet, a science blogger who breaks out the following factors causing insulin resistance in a series on his website: excess energy (more calories consumed than expended); inflammation; the brain; micronutrient status; and the macronutrient composition of the diet.

Paula Deen’s image as a Southern cook is distinct from how she conducts her private life. As she has said repeatedly, the food she cooks as part of her public persona is for entertainment purposes. We can judge whether the food she prepares on TV is the basis of a healthy diet---and many have criticized her offerings---but we can’t say what part bacon and butter had in causing Deen’s medical condition.

If you cook from a Southern tradition, you probably don’t cook exactly like Paula Deen, either. There are regional and class differences among home cooks, as well as a range of personal tastes. Compare Deen’s food to the French-inspired Southern menu of Frank Stitt’s restaurant in Birmingham, or for a different kind of French influence on Southern food, the spice and humor of the late, great Cajun Cook, Justin Wilson. There are clear African influences in Southern food, from collards and sweet potatoes, so like African yams, to the pit barbecue. While the exported images of Southern food may more often fit the mold of Roscoe's House of Chicken and Waffles than of The Pink Teacup, there are cuisines that are recognizably Southern that are less energy dense, have less vegetable fat, fewer processed ingredients, and more vegetables, than is imagined by people whose image of Southern cuisine is based entirely on media, rather than on personal experience in the homes of Southern cooks.

Tamar Adler, in her essay on Paula Deen and balance in the diet, celebrates another home cook whose approach to “normal” Southern home cooking is of a sort I recognize from my years in rural Florida, with green beans and plenty of pork. She writes about Edna Lewis who, in her country cookbook, described hearty breakfasts followed by lunches of greens.

Rural Southerners learned to cook what flourished in the South, to feed people who did hard physical labor under grueling conditions. These cuisines survived for generations because they met the needs of the people eating them. They taste good, can be prepared with the ingredients and time available, and sustain health. Some of this food was dense fare, but part of a home cook’s responsibility is to serve an array of good, fresh foods in proportions that can be eaten in good health by all members of the family: sedentary, active, young, and old. While our idea of what is “real” may translate to what has survived most unchanged by time, all of these cuisines, including Deen’s, are authentic Southern cuisine. The challenge is in discerning what of the new should be embraced, and what of the old discarded, if anything.

We all used to eat traditional diets, but today our culture changes quickly, and people move away from where they grew up, so the foods we used to eat are no longer available. Our needs have changed, too: most of us do less physical work, and contend with more environmental stress and pollution, than our forebears. We are always having to find new ways to feed ourselves, based on what is available in our changed environments. One way we do that is by looking around at what other people are eating.

Cooking programs like Deen’s are a part of that landscape, but like other marketing images, they mislead us into consuming much more food than is healthy, and into eating foods that are highly rewarding--hyperpalatable foods full of sugar, salt, and fat--rather than the natural range of flavors we were designed to discern and enjoy.

Deen is not a good example of Southern cooking, or of comfort food, or of home cooking, because she doesn’t demonstrate how to make healthy meals that are inexpensive and simple to prepare. She doesn’t use local ingredients except incidentally, or traditional methods or equipment. The food she prepares and sells is all-American, based on nationally available commercial products, and soothing (read: numbing) in its hyperpalatable way, but not healthy.

While we may hope to learn something from them, the expectations we bring to cooking shows may poorly match what is provided. Entertainment cooking programs don’t teach you how to choose or prepare ingredients for cooking, and the cost of ingredients, waste, seasonal availability, and geography are not considerations. The show’s producers compress the perceived prep and cooking time to fit the show. They may take advantage of your ignorance as a viewer, and misrepresent a regional cuisine with which you are unfamiliar.

Here in western Massachusetts in January, the “Winter Warmers” episode of Paula’s Home Cooking sounded most appealing as an introduction to her program. Paula cooks from a home that, in exterior shots on the show, has a palm tree waving from the front yard and green grass growing; in other words, nothing like the view from my window, of bare tree branches and snow on the ground: land that could use a “warmer.” In this episode, she makes a seafood gratin, a fried veal cutlet over spinach, and an apple cookie bar dessert. She never mentions where any of her ingredients come from: the shrimp and scallops, the spinach, the apples, and everything else, must all just come from a Savannah supermarket, indistinguishable from a supermarket in Akron or Phoenix, and where these ingredients are always available, year-round.

The meal, while fairly rich, is also very expensive to produce, yet not inventive in its use of unusual or costly ingredients. Her ingredients are alternately inordinately expensive, or cheap and nutritionally weak. In the gratin, she covers flavorful, expensive scallops and shrimp with a cheese sauce. Even conventionally raised veal is very expensive and can be hard to find, but she fries that up in butter cracker crumbs. These might taste good, but Deen’s dishes are examples of gilding the lily, not hearty country fare. This isn’t what people in the South eat in their kitchens after climbing off a tractor: this is what Brooklyn hipsters wearing ironic John Deere caps eat in comfort food diners.

The ingredients are all prepped in advance: the shrimp are peeled and deveined, the apples peeled, cored, and sliced. This isn’t unusual, but it is a missed opportunity, if the cooking show star has some basic skills to demonstrate. Martin Yan has made a long career of cutting up chickens at lightning speed and mincing onions faster than the eye can follow, with his signature cleaver. When Deen finally handles a raw ingredient, her onion chopping is crude and she reams a lemon, seeds and all, right into her finished dish.

Deen’s meal ends with the production of a dessert bar made of cheap, energy dense, commercial products, including at least one--Philadelphia Cream Cheese--that Deen is paid to promote. The dishes in this episode appear to be easy to produce, and look like they would taste good, but not amazing, and not good enough to compensate for their poor nutrient value, the cost of the ingredients, or the time it would take to prepare. While the end of the “Cajun Cook” always left me feeling a little hungry for Wilson’s dinner---and he ended every show seated at dinner with a glass of wine---imagining eating Deen’s “Winter Warmers” makes me feel queasy. My body tells me that eating whole meals like this every day would make me very sick, indeed.

Even if a strong causal link could be made between Gooey Butter Cake Bars and type 2 diabetes, Deen’s body is not a matter of public health, and her choices regarding diet, exercise, and medication are none of our business. Dr. Hyman warns that Victoza could be the next Avandia: a popular diabetes drug that caused heart attacks. Victoza is not only potentially dangerous but also expensive, as Nestle points out. Yet while Deen promotes her medication of choice, she also tries to play the class card by saying that we can’t all afford to eat prime rib. Of course, Deen can afford to eat whatever she wants, but that was never the point.

Carol Plotkin writes in "Paula Deen and the Fallacy of Moderation", “To say ‘all things in moderation’ to me seems like an excuse to maintain the status quo, which arguably is average.” If the average diet is an unhealthy one, leading to an epidemic of lifestyle-related diseases, there’s no reason any of us should emulate what we see on TV. Deen and similar food entertainers are not, or should not be, a goal toward which home cooks strive.

There are class reasons why the average experience for most people in the South is to be overweight and to develop type 2 diabetes. Not Southern cooking, but stress, food deserts, and poverty, are to blame for the poor health of people living in the American Southeast. Tamar Adler says we risk demonizing butter, bacon, comfort food, or Southern cuisine when we rush to demonize Paula Deen. Deen’s cooking “is not a good representation of comfort food,” Adler writes. “It confuses too much for a good, delicious, soulful amount.” The line of Paula Deen desserts available at Walmart is not an oasis of Southern hospitality; it’s one of the mirages of the food desert.

Friday, December 9, 2011

Do you eat like a lab rat?


Like you, Rattus norvegicus enjoys snack cakes


Are you living the rat race? If you labor alone, yet have little privacy, find your work unrewarding, and your lifestyle stressful, get little exercise, fresh air, or fresh food, and find your weight is also slowly climbing, you might be living like a lab rat.

Rattus norvegicus 101

Rats are not mice, and mice are not men. Lab rats are larger than their wild relations, and much larger than house mice. Wistar rats, one of the most common lab rats, weigh 300-500 grams on average as adults, depending on their sex. By comparison, a typical house mouse weighs only 10-25 grams.

Lab rats live in boring, confined, sedentary, lonely conditions, where food is the only constantly available reward. In controlled studies on diet, standard lab rats are typically fed ad libitum, or free-fed, on a standard rat chow. Under these conditions, rats steadily gain weight.

In a fifteen-week study on the so-called “cafeteria diet,”[1] rats fed ad libitum on standard chow gained almost their entire starting weight, from an average starting weight of 300 grams to over 550 grams (see Fig 1e).  

Rats living in the wild are rarely found weighing over 500 grams. The weight gain of lab rats on fairly boring but unlimited diets is the baseline effect of living like a lab rat. So what happens if you switch to a low-fat formula?

Messing with the formula

Rats fed either a high-fat chow (HFC) or a low-fat, added-sugar chow (LFC) gained weight more quickly than the rats fed the standard chow (SC). You can see by the graph that, while both the high-fat and low-fat fed rats gained more weight on average, the tight clusters in the chart spread out noticeably on both of these modified feeds. This means that there was more variation among individual rats in their response to being switched from standard chow to a new formula. The new formula was either higher in fat or in sugar, suggesting higher palatability.

Those eating the LFC ate the most sugar, even more than rats with unlimited access to snack foods. The researchers also noted that the rats initially ate more of the new food, but by week two, their consumption declined to a level of mild overeating that remained steady through the remainder of the study.

What is the “cafeteria diet”?

But this study wasn’t just on what happens when you change the standard chow. The real test group in this study were those receiving what the researchers call a “cafeteria diet” (CAF). This group of rats got standard rat chow, ad libitum. They also had snack foods, more than they could possibly consume. Three different kinds of snacks, rotated for variety (a full list of the snacks, which included items such as wedding cake and Doritos, is here), were available each day in large quantities. As the researchers write, “this diet engages hedonic feeding.”

The rats in the CAF group naturally disdained most of their standard chow in favor of the tastier snack foods. They ate markedly more food than any of the other rats, and continued doing so even after the two modified-feed groups (HFC and LFC) adjusted their food intake after the second week. Those eating the CAF diet of standard chow plus snack foods continued to eat about half again as much food by weight as those on standard chow alone, and gained weight at about twice the rate. That’s some energy-dense food. These were the heaviest rats at the end of the fifteen weeks, averaging around 750 grams: two and a half times their starting weight, and half again as big as the largest wild rats.
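The ratios quoted above are easy to verify. Here is a quick back-of-the-envelope check, using the round figures cited in this post rather than the study’s raw data:

```python
# Back-of-the-envelope check of the weight ratios quoted above,
# using the round numbers from this post, not the study's raw data.
start_weight_g = 300      # average starting weight of the rats
caf_final_weight_g = 750  # approximate final weight of the cafeteria-diet group
wild_rat_max_g = 500      # weight wild rats are rarely found to exceed

print(caf_final_weight_g / start_weight_g)   # 2.5: two and a half times starting weight
print(caf_final_weight_g / wild_rat_max_g)   # 1.5: half again as big as the largest wild rats
```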

What happens when you put humans on a cafeteria diet?

Human volunteers—lean, adult men—allowed to eat whatever they wanted from refrigerated vending machines overate by about 50-60%, just like the rats. The resulting weight gain was about five pounds for each of the volunteers in a one-week study. [2]

As in the rat study, the humans were confined, had few opportunities for recreation, and were constantly observed. The rats, like the humans, were confined in pairs. The humans typically ate alone; no mention is made of the social dining habits of lab rats.

The food that they ate was not exactly the same. While the so-called cafeteria diet that the rats were placed on in the Obesity study was made up of standard rat chow plus an abundance of snack foods, the humans were not offered any kind of standard human chow: no rice, kale, and beans or poached chicken breasts over green salad; no meal replacement bars. Instead, they got all of their food from a set of refrigerated vending machines, stocked with commercially prepared foods. A full list is included in Table 1 of the 1992 American Journal of Clinical Nutrition study: note that the only fresh fruit or vegetable the human volunteers got all week was apples. The foods available were items such as white bread and chicken pie, as well as rat-approved snack foods like cake and Doritos. There were no green vegetables in any of the meals, and very few vegetables or fruits in any form.

Of rats and men

Among the differences between the human and rat studies above, the first, of course, is that humans are not rats. The second is that there was no standard chow available to the humans, only hyperpalatable food, while the rats had access to both. The third is the duration of the study.

Presumably, humans will also adjust to their diets, even if they are eating from vending machines. No one could maintain a lifestyle in which they gained five pounds a week for very long: that’s two hundred and sixty pounds a year. Since it’s so difficult to get human volunteers to agree to longer studies, it hasn’t been well documented whether laboratory humans will make the second-week adjustment the same way that laboratory rats do.
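That annual figure is just the one-week result scaled linearly. A sketch of the arithmetic, assuming (unrealistically, by design) that the five-pound weekly gain continues unchanged with no week-two adjustment:

```python
# Linear extrapolation of the one-week vending machine result.
# Unrealistic on purpose: it assumes the week-two adjustment never happens.
pounds_gained_per_week = 5
weeks_per_year = 52
print(pounds_gained_per_week * weeks_per_year)  # 260 pounds a year
```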

What’s the difference between your rat race and a rat’s?

Studies on the so-called cafeteria diet are really on the effects of food deserts and the stress of confinement. When all we have is food to reward us in our dingy, artificially-lit cubicles, we gain weight and overeat. Switching from standard commercial fare to a low-fat version, or even to a high-fat version, doesn’t help matters. And when we have really sweet and fatty food as our only reward in life, we eat even more of it.

Humans can insert consciousness, as my therapist likes to say. We can examine our motivations for eating out of pleasure or emotion rather than hunger, and decide whether to proceed or not. Whether to pull the emotional lever. Rats can’t do that. They will go for it, every time.

You can get out of your maze. Your food doesn’t have to come out of vending machines. Your only food options are not chain restaurants and commercial packaged foods, with the odd highly-shippable iceberg lettuce wedge or conventionally grown apple thrown in as a concession to health.  You can cook your own food from fresh ingredients. You can exercise and seek out other stimulating activities that engage your hedonic circuits.

You are not a rat, and you are not trapped.

Monday, October 24, 2011

Pick a new plate: how MyPlate fails

Choose MyPlate.gov for healthy commodities
At no time in history have we had more access to information, and yet many of us today know nothing more about how to feed ourselves than how to bring food from plate to mouth. Every day, millions of people eat food that they don’t know how to make. They don’t know what the ingredients are, where those products come from, or how they’re manufactured. We have lost the traditional knowledge of how to make food, and also the larger knowledge of what our diets should look like. Food that used to be regional, seasonal, and prepared and eaten in particular ways has been replaced with mass-produced products, even fresh produce, that don’t vary throughout the year or from place to place. For guidance on eating these foods, instead of family and community, we have central authorities and credentialed experts. Foremost among these authoritative institutions is the United States Department of Agriculture.


To begin to understand why the USDA is a suspect place to get nutrition information, begin with the USDA mission statement. The USDA’s mission is to promote commodity crops: grains, meat, and dairy; and, to a far lesser extent, garden crops. A pie chart of subsidies from the USDA to the various food industries looks like this:
Subsidies to commodities


Compare the subsidies the USDA gives to the largest producers of these cash crops, with the USDA's official recommendations on diet to individuals. Then look at other sources of advice on diet that do not share the USDA’s ethical issue of being in the business of supporting business as well as being tasked with promoting the health of Americans. The bias becomes clear. MyPlate is skewed in favor of keeping legacy food groups that are commodities, like dairy and grains, on the plates of Americans, and creating new groups, like “Protein,” to encourage us to eat the new industrial food commodities, including refined soy and whey proteins.


Someone at the USDA is probably very proud for having thought up the “Protein” food group. It seems so inclusive, like a nondenominational blessing. It caters to several belief systems about food at once: industry-friendly nutritionism, the “plant-based” diet continuum from flexitarian omnivores to completely vegetarian vegans, and fat phobia. Unlike “meat,” “nuts,” or any other real food containing protein, “protein” itself is refined and treated as a commodity. MyPlate reflects the belief in a plant-based diet as health-promoting, while doing nothing to promote fresh produce. Highly processed foods like breakfast cereal, skim milk, and orange juice are included in its food groups.


Yet the USDA finds fault with fatty meat from a pastured animal, and with whole, raw milk: neither of these are part of an industrial diet, and their industrial counterparts are both correlated with disease. Because the USDA does not recognize the differences between industrial and natural foods, like the more balanced omega fatty acids profiles or other vitamins and enzymes present in pastured animal products, these healthy foods are not part of MyPlate, and their consumption is actively discouraged: the meat because it is fatty, and the milk because it is unpasteurized.

Dairy is still promoted, of course. A round plate should be a pie chart, but there’s an extra plate--or MyGlass--off to one side, full of "Dairy." And although the proportions are clearly intended as a guide, the shape of this pie chart makes it difficult to read. How much of each of these food groups is in a healthy diet? Should we “eat 120%”? Why isn’t MyGlass instead a salad, a cup of bone broth, a pickle plate, a glass of wine, or plain water? Can MyPlate even be modified into a proper model, or should it be scrapped in favor of a whole new shape?


While bone stocks and fermented beverages are also important traditional foods, they are not commodities, and their protein-sparing and enzymatic benefits are simply not recognized as significant by the nutrition scientists and public health professionals who created MyPlate. Collard greens have almost exactly as much calcium as milk, cup for cup, but few people know it: there's no "Got Collards?" campaign educating the public, and no "Dark, leafy green vegetables" group on MyPlate, either. This promotion of commodities is far-reaching and very effective. Children learning about the food groups for the first time are introduced to milk, but not to collards. Which is enshrined in the school lunch program? Which deserves to be?


Milk isn’t even a necessary food. Americans certainly drink lots of it, but the indigenous peoples of the Americas traditionally did not drink cow’s milk, and throughout Asia and much of Africa people still don’t. Milk doesn’t need promoting as a food.


MyPlate should dedicate more food groups to vegetables: no other group contains as much diversity as this one quarter of the plate.


The “Vegetables” group, like the “Protein” group, lumps together foods with very different properties: orange or green; starchy or fibrous; sweet, bitter, or sour. Most traditional diets include some raw vegetables, and a fermented vegetable dish or two, like kimchi or sauerkraut. Raw and fermented foods provide many benefits worth promoting on MyPlate. Dark, leafy greens, along with pod and flower vegetables, are important enough to merit their own separate food group, and should be eaten often. Pretending that all vegetables are equivalent, from acorn squash to zucchini, raw kale salad to French fried potato, is just as gross an oversimplification as putting all “Protein” foods in one group.

Seven food groups promoted by the USDA in 1943
One of the original seven food groups was butter. Fats and oils, particularly the fats found in meat and dairy from animals raised on pasture and forage, are critical to human health. Once the pinnacle of the food pyramid, butter, lard, and other healthy fats are nowhere to be found on MyPlate. The Weston A. Price Foundation’s alternative guidelines, based on foods common to all traditional diets, have just four food groups, one of which is Fats and Oils. Fat is a macronutrient, like protein, and just as essential to life. But MyPlate, with its disdain for traditional knowledge and its exaltation of nutritionist dogma, fat phobia, and magical thinking about dietary fat, doesn’t include a “Fats” group at all.



The Food Pyramid of 1992
Then there’s everything about food that nutritionism, and MyPlate, completely ignore because it’s too hard to produce on an industrial scale, like the enzymes in raw foods. The USDA’s nutritional guidelines assign no value to a food for its environmental sustainability, and avoid any suggestion that fresh, organic, locally grown food in its season is in any way preferable to commercially prepared, conventionally grown alternatives. MyPlate’s groupings suggest none of the synergies of traditional, complementary food pairings. It offers no guidance on making meals suitable to different bodies, stages of life, or cultural norms. It doesn’t suggest a social setting, or the value of sharing a meal with family or friends. And even as MyPlate promotes a plant-based diet, it provides insufficient guidance to vegetarians and to those with common food allergies to wheat, corn, soy, and dairy.


The most gaping flaw in the MyPlate model is that it does not provide enough guidance for anyone to be properly nourished by it. If its only objective is to guide Americans in eating a healthy diet, it is a failure. If, however, its real purpose is to uphold the USDA’s mission to promote commodity crops, MyPlate is a success.