All About Betty
When I was six years old, I entered a Betty Crocker recipe contest. My dish bore the perhaps ironic name “Arctic Mississippi Mud Pie,” a frozen concoction of chocolate, brownies, marshmallows, ice cream, caramel and nuts. The chocolatey, gooey sweetness still sticks in my mind as one of my early food-related memories. It seems that many of my early childhood memories involve food in one way or another. At the time, we were living in a tiny town in Kansas, and I recall at about the same age helping my father prepare to smoke turkeys, brisket and cheese, as well as my own clumsy attempts at backyard gardening … potatoes, corn, beans, tomatoes and cucumbers chaotically interspersed, ignoring any attempt at rows. Though I don’t remember it clearly, I also have a vague image-memory, recounted by my mother, of my first attempt at cooking when I was 3 years old. Apparently, I got up early one morning, proceeded downstairs to the kitchen, placed a cantaloupe in the oven and somehow managed to turn it on. Fortunately for us all, my parents awoke to the strange aromas wafting from the kitchen. I guess my knack for creating a mess in the kitchen has some very early roots. I’m not certain where I first heard of “Mississippi Mud Pie.” In fact, I read recently that the phrase first appeared in print in 1975, two years after I submitted it for the contest. So, in my conspiratorial mind, it could have been a moniker of my own creation, one greedily, clandestinely lifted from an innocent but talented six-year-old by some depraved General Mills employee selfishly looking to get ahead. In actuality, it’s a dish that has been around for six or seven decades, one now associated with its namesake state and believed to have origins in the post-WWII kitchens of the South.
At age six, what I knew about the Mississippi River, if anything, probably came from National Geographic and Huckleberry Finn, and though I was an expert at eating desserts, I knew little of creating them. Needless to say, I didn’t win the competition. Though my hopes of meeting Betty Crocker herself were dashed, I was actually delighted to receive the official letter of “Your recipe was seriously considered. Thank you for your participation,” a concrete sign that the letters I was placing in the mailbox were actually making it out of my finite existence into the larger world, with at least a possibility of receiving an answer. I don’t remember the recipe or the process of creating the dish, but I’m sure that my family dutifully tasted several versions of my concoction, feigning satisfaction.
Betty Crocker was created in 1921 by home economist Marjorie Husted for the Washburn Crosby Company, which merged with five other milling companies to form General Mills in 1928. Apparently, the name Betty was selected for its wholesome, “cheery” American appeal, while “Crocker” was chosen in honor of William Crocker, a director of the Washburn Crosby Company. The Betty Crocker name and persona were developed in order to personalize responses to consumer product questions. She received her visual identity in 1936, when a portrait was commissioned, and she was portrayed by several actresses in hundreds of advertising campaigns and television and radio broadcasts over the following decades. Though she was a fictional character, the public recognized her as a voice of authority from the grocery store to the kitchen, an authority stamped with the “Betty Seal of Approval” red-spoon logo introduced by General Mills in 1954. To this day, cultural references to her can be found in film, music and literature.
On The Train to Convenience
One of the first products introduced under the Betty Crocker brand was Bisquick, conceived in 1930 by Carl Smith, a General Mills sales executive who found himself famished on a train to San Francisco. With the dining car already closed, he asked the chef if he might make him something quick to tide him over. Minutes later the chef appeared with a plate of freshly baked biscuits, which Smith found utterly delicious. When Smith asked how he had managed to prepare them so quickly, the chef explained that he pre-mixed his batter using lard, salt, baking powder and flour, then stored it on ice, which translated to delicious piping-hot biscuits at a moment’s notice. Smith returned to General Mills and asked food scientists to develop a shelf-stable product that produced homemade-like results, and Bisquick was born in 1931. It was a major success for the company, and stars of the age from Shirley Temple to Clark Gable helped promote the product. Ads of the era boasted “90 seconds from package to oven.” Though trans fats, chemical preservatives and other nutritionally “enhanced” foods had begun appearing in the American marketplace a couple of decades prior, in products like Oreos, Crisco, Nathan’s hot dogs, Aunt Jemima’s syrup and Velveeta cheese, Bisquick was something entirely new. It was a product that allowed the home cook to turn out flawless biscuits, cakes, pies, cookies and other baked goods, and a powerful tool that allowed homemakers to maintain their pride as cooks, all the while saving time and providing their families with nutritious “homemade” food. Win win.
It could be said that processed foods have been around since the very first time our ancient ancestors discovered, accidentally, that heating meat or plants over a fire made them more palatable and easier to digest. That may have been close to 2 million years ago. Since then, humans have figured out a lot: unlocking the nutrients in cattails and ferns by pounding them into flour, then adding water and cooking (creating what may be the first forms of bread), and applying that same technique to other grains and grasses. Fermentation happens in nature without human intervention, and it was most certainly recognized by our earliest ancestors and eventually developed into techniques, traditions, a culinary art. We can imagine a scenario where a group of early humans had collected fruit, nuts, seeds or grown grains to be saved in an effort to stave off hunger during leaner times, but perhaps the grain became waterlogged and began to bubble and ferment, or the fruit began “rotting,” which is also fermentation. There was little that could or would have been wasted, so the only alternative was to consume it. Mash up and cook the fermented grain: bread? It’s hot, you’re thirsty, why not at least try to drink the fermented fruit juice: wine? All of this is “processed” food, and the fact that we as a species were able to unlock the mysteries of how to get nutrients from plants and animals which would not have been available to us in raw form is the only reason we as humans are still around. The story of humans is the story of food and community: how do we get it, how do we share it, how do we preserve it?
Flour, vegetable shortening, phosphate, sugar, dry skim milk, salt and soda: the ingredients in the original Bisquick. The modern version lists “Enriched Flour Bleached (wheat flour, niacin, iron, thiamin mononitrate, riboflavin, folic acid), Partially Hydrogenated Soybean and/or Cottonseed Oil, Leavening (baking soda, sodium aluminum phosphate, monocalcium phosphate), Dextrose, Salt.” Other than the dry skim milk that was present in the original, and the use of “enriched” flour in the contemporary version, the ingredients remain largely unchanged. So, what’s wrong with using Bisquick? There are the health risks associated with consuming partially hydrogenated GMO soybean and cottonseed oils, and the fact that the bleached white flours used in today’s commodity processed foods have little nutritional value beyond the vitamins and minerals which are added back in. So, no, it ain’t great, but today the majority of people consume far worse substances daily. Premixing some ingredients in order to save steps in the cooking process is a practice common in even the most righteous commercial kitchens. The problem comes when we step beyond that line, into a space bearing little resemblance to the traditional act of cooking at home (or for the public) using whole-food ingredients. For example, here’s the ingredient list for a simple yellow cake: flour, baking powder, salt, sugar, butter, vanilla, eggs, milk.
Fast forward: here’s the ingredient list for Betty Crocker’s Yellow Cake Mix: Enriched Flour Bleached (wheat flour, niacin, iron, thiamin mononitrate, riboflavin, folic acid), Sugar, Corn Syrup, Leavening (baking soda, sodium aluminum phosphate, monocalcium phosphate). Contains 2% or less of: Modified Corn Starch, Corn Starch, Partially Hydrogenated Soybean and/or Cottonseed Oil, Propylene Glycol Mono and Diesters of Fatty Acids, Salt, Distilled Monoglycerides, Dicalcium Phosphate, Natural and Artificial Flavor, Sodium Stearoyl Lactylate, Xanthan Gum, Cellulose Gum, Yellows 5 & 6, Nonfat Milk, Soy Lecithin.
In addition, one still needs to add water, eggs and milk in order to complete the recipe. Stabilizers, preservatives, dyes, flavorings, thickening agents and other additives make up the majority of the ingredients in that list, one that by comparison isn’t even particularly long in today’s highly-processed industrial food scene. In fact, it’s more than a little ironic that many of these “mixes” originally contained everything one needed: “just add water.” However, early consumer research found that homemakers wanted a little ownership in the cooking process, which is why the newer formulations required the addition of water, eggs, butter and milk, something for the home cook to feel “proud” of.
Again, is there anything particularly wrong with these “other” substances being added to our food? It may seem obvious that our bodies have evolved to be efficient in their use of nutrients, and to treat various types of naturally-occurring compounds in particular ways well-developed via evolution … proteins, amino acids, carbohydrates, different types of fat, vitamins, minerals; our bodies are well-equipped to extract, utilize and store energy as needed. Conversely, when our body doesn’t recognize a substance, toxic or not, it may employ mechanisms to isolate or expel that “foreign” substance, which could involve sending it in the direction of an orifice, or even surrounding it in fat cells. In the National Human Adipose Tissue Survey (conducted by the EPA in the ’80s), researchers found that “all human fat samples contained traces of four industrial solvents and one dioxin. Nine more chemicals, including three more dioxins and one furan were found in more than 90 percent of the fat samples. And on average, 83 percent of the samples had PCBs.” Is it any wonder that diet soda and low-fat cookies didn’t lead to a health revolution, quite the contrary?
Cornfed
Aside from the direct health risk of consuming the chemical and artificial substances in highly-processed foods, there’s another, perhaps even more confusing and dangerous side. That is, what’s contained in the “natural” ingredients, and what’s “natural,” for that matter? Corn is the obvious example of a commodity crop gone horribly wrong. The list of food additives which can be derived from corn is astounding, including: Ascorbic Acid, Calcium Citrate, Caramel, Cellulose, Citrate, Citric Acid, Corn Starch, Corn Syrup, Decyl Glucoside, Dextrin, Maltodextrin, Dextrose, Ferrous Gluconate, Flavoring, Hydrolyzed Vegetable Protein, Lactic Acid, Lauryl Glucoside, Magnesium Citrate, Magnesium Stearate, Malic Acid, Malt, Malt Flavoring, Maltitol, Maltose, Methyl Gluceth, Modified Food Starch, Monosodium Glutamate, Polydextrose, Polylactic Acid, Polysorbates, Potassium Citrate, Saccharin, Sodium Citrate, Sodium Erythorbate, Sodium Starch Glycolate, Sorbitan, Sorbitan Monostearate, Sorbitol, Starch, Sucralose, Tocopherol, Xanthan Gum, Xylitol.
“Cornfed” may be a designation we use for grain- or corn-fed cattle, but we are without a doubt the most cornfed species on Earth, and it’s nearly impossible to get away from it. It’s in our food, our medicine, our cosmetics and our fuel tanks, all made possible by government subsidies, which have arguably morphed into a form of corporate welfare that would have been difficult for past generations of farmers and politicians even to imagine. Of course, soybeans are another large monoculture crop which has taken over in the Midwest, turning field after field into higher profit margins for multi-national agrochemical-seed-biopharmaceutical giants like Pfizer and Monsanto. For farmers in top agricultural states, one of, if not the, main questions each year is whether to plant corn or soybeans. Which will be in higher demand, more profitable? It’s an obligatory decision which leaves many independent farmers feeling like they are on a perpetual industrial-production roller coaster, one with the obligatory circular track, except the ride never stops, and the controls rest in the hands of mega pharma-ag.
Corn and cattle: smart people have written expansive, well-considered essays and books about the monolithic (maybe polylithic?) agro-chemical complex which has defined and enabled our commodity food production system, so I’m not going to go down the path of describing how incredibly destructive that system has become. Rather, I’ll address what might be considered by some as “collateral” damage. Before I get to that, another early food memory comes to mind. As a pre-teen in small-town Kansas, I recall the periodic arrival of fresh Gulf shrimp on compact insulated trucks which maybe only a couple of days prior had been loaded, along with tons of ice, on the docks of Louisiana and Texas. My father’s love for fried shrimp meant that these occasions required the purchase of as many pounds of shrimp as our current financial situation and fridge/freezer space would allow. It also required the participation of the entire family, with an assembly line for peeling and deveining the shrimp, then butterflying them so that he could try out his latest recipe for batter or breading. Would it be beer batter, panko-style, or perhaps the simple buttermilk batter used by the first place Dad had tried shrimp decades before? All good. Then came the careful, almost artful preparation of the cocktail sauce. The end result was transformative: the briny sweetness of the Gulf shrimp, the perfectly seasoned coating, the sweet spicy umami of the cocktail … I can only remember never getting quite enough, though I know I ate far too many.
A Dead End
The “Dead Zone”: it sounds like an urban apocalyptic zombie horror-thriller. While dead zones are horrific, they’re very real and quite literally growing in the Gulf of Mexico, as well as in other large bodies of water around the world. According to NOAA, dead zones are hypoxic (low-oxygen) areas in the world’s oceans and large lakes, caused by “excessive nutrient pollution from human activities coupled with other factors that deplete the oxygen required to support most marine life in bottom and near-bottom water.” Gulf shrimpers first reported a dead zone back in the 1950s, but it wasn’t until the ’70s that the scientific community became more engaged in identifying the impacts on ecosystems and the fishing industry. To provide a brief and succinct explanation of why people should care about the dead zone, I’ve selected a couple of graphics. The first is from the EPA and outlines the process that takes place once agro-pollution travels down the Mississippi and is deposited in the Gulf:
So, to summarize: fertilizer, shit and other chemicals float downriver and hit the Gulf; plankton and other micro-organisms bloom and boom, die and fall to the bottom; decomposition depletes the water of oxygen. Fish and other sea creatures that are more mobile get outta Dodge. Crab, shrimp and other non-migratory species simply die. The next graphic, from NOAA, shows the extent of the 2017 Gulf of Mexico Dead Zone.
If that’s not horrific, I’m not sure what is. Now let’s back up a little to where the source of this horror begins, upstream. Not coincidentally or surprisingly, the development of the Nation’s industrialized food system (with massive growth starting in the ’50s) corresponds directly to the growth of the Gulf Dead Zone. From the conversion of our fields to massive monoculture corn and soybean operations and the widespread use of chemical fertilizers, to subsidized surplus corn helping create an entirely new scale of notoriously dirty and disgustingly inhumane cattle and pig lot “cities,” all of it accounts for billions of tons of wastewater horribly tainted with sewage, chemical fertilizers, antibiotics and worse running into our streams and rivers, with a final destination of the Gulf of Mexico.
Your Neighbor's Shit
Many farmers seem oblivious, unmoved or even combative about taking responsibility for the negative impacts of industrial agriculture. But even if people who live upstream from this environmental disaster give little consideration to where their shit ends up, there are several reasons they should care. Obviously, if they like to eat seafood, that’d be one. Just measuring the effect on Gulf brown shrimp is telling. It was once the most valuable fishery in the country. Now the dead zone keeps a large percentage of the shrimp from reaching their offshore spawning grounds. The end result is a scarcity of large shrimp and direct economic impacts to the Gulf shrimp industry, impacts which have now been clearly shown in a recent NOAA-funded Duke University study. That’s to say nothing of species like oysters, clams and snails, which can’t swim out of the dead zone and can only survive for a short period without standard oxygen levels. Then there’s the fact that our Gulf Dead Zone, albeit huge, is one of many throughout the world; this is a global crisis we face. Why? Because as our oceans die, we die. The oceans produce 50% of the oxygen we breathe and suck up about a third of the carbon produced by human activities. Major populations throughout the world rely on seafood as their primary source of protein, and as we produce less seafood, major global food shortages loom on the horizon. Conversely, healthy oceans offer potential solutions to climate change and food shortages, and studies in places like the Chesapeake Bay and the Black Sea have shown that dead zones can be reversed and that vital species will return. With over 400 dead zones around the world, the potential for positive ecological effects cannot be overstated.
OK, I hear you, so what’s my point? Is it that the need for convenience has created an industrial food system which now threatens to completely undo what nature took millions of years to create? Is it that cheap processed food has destroyed biodiversity, and the American farmer along with it? Or maybe it’s outrage that we’re biting our collective stiff upper lip while watching our shit float downriver to become our neighbor’s disaster. Yes, all of that, but it’s more. To quote the great American novelist, poet and farmer Wendell Berry: “...the care of the earth is our most ancient and most worthy and, after all, our most pleasing responsibility. To cherish what remains of it, and to foster its renewal, is our only legitimate hope.”