You don’t look, smell or act like a ruminant. Then why would you eat like one?
Those of you who have already forgone wheat do not, of course. But if you remain of the ‘healthy whole grain’-consuming persuasion, you have fallen victim to believing that grasses should be your primary source of calories. Just as the Kentucky bluegrass and ryegrass in your lawn are grasses from the biological family Poaceae, so are wheat, rye, barley, corn, rice, sorghum, triticale, millet, teff and oats. You grow teeth twice in your life, then stop, leaving you to make do for a lifetime with a prepubertal set that erupted around the age of 10; produce a meagre litre of saliva per day; have a single stomach compartment – three fewer than a ruminant – that is neither populated by the foreign organisms that digest grasses nor capable of any grinding action; don’t chew a cud; and have a relatively uninteresting, linear, nonspiral colon. These adaptations allow you to be omnivorous – but not to consume grasses.
Early members of our species found nourishment by scavenging, and then hunting, animals such as gazelles, turtles, birds and fish, and by consuming the edible parts of plants, including fruit and roots, as well as mushrooms, nuts and seeds. Hungry humans instinctively regarded all of these as food. About 10,000 years ago, during a period of increasing temperature and dryness in the Fertile Crescent, humans observed the ibex and aurochs grazing on einkorn, the ancient predecessor of modern wheat. Our hungry, omnivorous ancestors asked, ‘Can we eat that, too?’ They did, and surely got sick: vomiting, cramps and diarrhoea. At the very least, they passed the wheat plants through undigested, since humans lack the ruminant digestive apparatus. Grass plants in their intact form are unquestionably unappetizing. We somehow figured out that, for humans, the only edible part of the einkorn plant was the seed – not the roots, not the stem, not the leaves, not the entire seed head – just the seed, and even that was edible only after the outer husk was removed and the seed was chewed or crushed with rocks and then heated in crude pottery over fire. Only then could we consume the seeds of this grass as porridge, a practice that served us well in times of desperation when ibex meat, bird eggs and figs were in short supply.
Similar grass-consuming adventures occurred with teosinte and maize (the ancestors of modern corn) in the Americas; rice from the swamps of Asia; and sorghum and millet in sub-Saharan Africa, all requiring similar manipulations to allow the edible part – the seed – to be consumed by humans. Some grasses posed additional obstacles: sorghum, for instance, contains poisons (such as hydrocyanic acid, or cyanide) that can cause sudden death if the plant is consumed before maturity. Natural evolution led to wheat strains such as emmer, spelt and kamut as wheat exchanged genes with other wild grasses, while humans selected strains of corn with larger seeds and seed heads (cobs).
What happened to those first humans, hungry and desperate, who figured out how to make this one component of grasses – the seed – edible? Incredibly, anthropologists have known this for years. The first humans to consume the grassy food of the ibex and aurochs experienced explosive tooth decay; shrinkage of the maxillary bone and mandible, resulting in tooth crowding; iron deficiency; and scurvy. They also experienced a reduction in bone diameter and length, resulting in a loss of as much as 5 inches in height for men and 3 inches for women.1
The deterioration of dental health is especially interesting, as dental decay was uncommon prior to the consumption of the seeds of grasses, affecting less than 1 per cent of all teeth recovered, despite the lack of toothbrushes, toothpaste, fluoridated water, dental floss and dentists. Even though they lacked any notion of dental hygiene (aside from possibly using a twig to pick the fibres of wild boar from between their teeth), dental decay was simply not a problem that beset many members of our species prior to the consumption of grains. The notion of toothless savages is all wrong; they enjoyed sturdy, intact teeth for their entire lives. It was only after humans began to resort to the seeds of grasses for calories that mouths of rotten and crooked teeth began to appear in children and adults. From that point on, decay was evident in 16 to 49 per cent of all teeth recovered, along with tooth loss and abscesses, making tooth decay as commonplace as bad hair among humans of the agricultural Neolithic Age.2
In short, when we started consuming the seeds of grasses 10,000 years ago, this food source may have allowed us to survive another day, week or month during times when the foods we had instinctively consumed during the preceding 2.5 million years fell into short supply. But this expedient represents a dietary pattern that constitutes only 0.4 per cent – less than one-half of 1 per cent – of our time on earth. And this change in dietary fortunes came at a substantial price. From the standpoint of oral health, humans remained in the Dental Dark Ages from their first taste of porridge all the way up until recent times. History is rich with descriptions of toothaches, oral abscesses, and stumbling and painful efforts to extract decayed teeth. Remember the stories of George Washington and his mouthful of false teeth (carved from ivory and human teeth, in fact, not wood)? It wasn’t until the 20th century that modern dental hygiene was born and we finally managed to keep most of our teeth through adulthood.
Fast-forward to the 21st century: modern wheat now accounts for 20 per cent of all calories consumed by humans; the seeds of wheat, corn and rice combined make up 50 per cent.3 Yes, the seeds of grasses provide half of all human calories. We have become a grass seed-consuming species, a development enthusiastically applauded by agencies such as the USDA, which advises us that increasing our consumption to 60 per cent of calories or higher is a laudable dietary goal. It’s also a situation celebrated by those who trade grain on an international scale, since the seeds of grasses have a prolonged shelf life (months to years) that allows transoceanic shipment, are easy to store, don’t require refrigeration and are in demand worldwide – all the traits desirable in a commoditized version of food. The transformation of a foodstuff into a commodity tradeable on a global scale allows financial manipulations – buying and selling futures, hedges and complex derivative instruments, the tools of mega-commerce – to emerge. You can’t do that with organic blueberries or Atlantic salmon.
Examine the anatomy of a member of the species Homo sapiens and you cannot escape the conclusion that you are not a ruminant, have none of the adaptive digestive traits of such creatures and can only consume the seeds of grasses – the food of desperation – by accepting a decline in your health. But the seeds of grasses can be used to feed the masses cheaply, quickly and on a massive scale, all while generating huge profits for those who control the flow of these commoditized foods.
Mutant Ninja Grasses
The seeds of grasses, known to us more familiarly as ‘grains’ or ‘cereals’, have always been a problem for us nonruminant creatures. But then busy geneticists and agribusiness got into the act. That’s when grains went from bad to worse.
Readers of the original Wheat Belly know that modern wheat is no longer the 4½-foot-tall traditional plant we all remember; it is now an 18-inch-tall plant with a short, thick stalk; long seed head; and larger seeds. It has a much greater yield per acre than its traditional predecessors. This high-yield strain of wheat, now the darling of agribusiness, was not created through genetic modification but through repetitive hybridizations, mating wheat with nonwheat grasses to introduce new genes (wheat is a grass, after all), and through mutagenesis, the use of high-dose x-rays, gamma rays and chemicals to induce mutations. Yes: modern wheat is, to a considerable degree, a grass that contains an array of mutations, some of which have been mapped and identified, many of which have not. Such uncertainties never faze agribusiness, however. Unique mutated proteins? No problem. The USDA and US Food and Drug Administration (FDA) say they’re okay, too – perfectly fine for public consumption.
Over the years, there have been many efforts to genetically modify wheat, such as by using gene-splicing technology to insert or delete a gene. However, public resistance has dampened efforts to bring genetically modified (GM) wheat to market, so no wheat currently sold is, in the terminology of genetics, ‘genetically modified’. (There have been recent industry rumblings, however, that make the prospect of true GM wheat a probable reality in the near future.) All of the changes introduced into modern wheat are the results of methods that pre-date the technology to create GM foods. This does not mean that the methods used to change wheat were benign; in fact, the crude and imprecise methods used to change wheat, such as chemical mutagenesis, have the potential to be worse than genetic modification, yielding a greater number of unanticipated changes in genetic code than the handful introduced through gene-splicing.4
Corn and rice, on the other hand, have been genetically modified, in addition to undergoing other changes. For instance, scientists introduced genes to make corn resistant to the herbicide glyphosate and to express Bacillus thuringiensis (Bt) toxin, which kills insects, while rice has been genetically modified to make it resistant to the herbicide glufosinate and to produce beta-carotene (a variety called Golden Rice). Problem: while, in theory, the notion of just inserting one silly gene seems simple and straightforward, it is anything but. The methods of gene insertion remain crude. The site of insertion – which chromosome, within or alongside other genes, inside or outside various control elements – not to mention disruption of the epigenetic effects that control gene expression, cannot be controlled with current technology. And it’s misleading to say that only one gene is inserted, as the methods used usually require several genes to be inserted. (We discuss the nature of specific changes in GM grains in Chapter 2.)
The wheat, corn and rice that make up 50 per cent of the human diet in the 21st century are not the wheat, corn and rice of the 20th century. They’re not the wheat, corn and rice of the Middle Ages, nor of the Bible, nor of the Egyptian empire. And they are definitely not the same wheat, corn and rice that were harvested by those early hungry humans. They are what I call ‘Frankengrains’: hybridized, mutated, genetically modified to suit the desires of agribusiness, and now available at a supermarket, convenience store or school near you.
Wheat: What Changed . . . and Why Are the Changes So Bad?
All strains of wheat, including traditional strains like spelt and emmer, pose problems for the nonruminant humans who consume them. But modern wheat is the worst.
Modern wheat looks different: a shorter, thicker stalk and larger seeds. The reduction in height is due to mutations in Rht (reduced height) genes, which alter the plant’s response to gibberellin, the growth hormone that controls stalk length. But these mutations do not occur in isolation: changes in Rht genes are accompanied by other changes in the genetic code of the wheat plant.5 There’s more here than meets the eye.
Gliadin
While gluten is often fingered as the source of wheat’s problems, it’s really gliadin, a protein within gluten, that is the culprit behind many destructive health effects of modern wheat. There are more than 200 forms of gliadin proteins, all incompletely digestible.6 One important change that has emerged over the past 50 years, for example, is increased expression of a gene called Glia-9, which yields a gliadin protein that is the most potent trigger for coeliac disease. While the Glia-9 gene was absent from most strains of wheat from the early 20th century, it is now present in nearly all modern varieties,7 probably accounting for the 400 per cent increase in coeliac disease witnessed since 1948.8
New gliadin variants are partially digested into small peptides that enter the bloodstream and then bind to opiate receptors in the human brain – the same receptors activated by heroin and morphine.9 Researchers call these peptides ‘exorphins’, or exogenous morphine-like compounds. Gliadin-derived peptides, however, generate no ‘high’, but they do trigger increased appetite and increased calorie consumption, with studies demonstrating consistent increases of 400 calories per day, mostly from carbohydrates.
Gluten
Gluten (gliadin + glutenins) is the stuff that confers the stretchiness unique to wheat dough. Gluten is also a popular additive in processed foods such as sauces, instant soups and frozen foods, which means the average person ingests between 15 and 20 grams (g) per day.10 Because glutenins determine baking characteristics, gluten has been a prime target of genetic manipulation: geneticists have crossbred wheat strains repeatedly, bred wheat with nonwheat grasses to introduce new genes, and used chemicals and radiation to induce mutations. These breeding methods do not result in predictable changes in gluten quality. Hybridizing two different wheat plants can yield as many as 14 unique glutenin proteins never before encountered by humans.11
Wheat Germ Agglutinin
The genetic changes inflicted on wheat have altered the structure of wheat germ agglutinin (WGA), a protein in wheat that provides protection against moulds and insects. The structure of WGA in modern wheat, for instance, differs from that of ancient wheat strains.12 WGA is indigestible and toxic, resistant to any breakdown in the human body, and unchanged by cooking, baking and sourdough fermentation. Unlike gluten and gliadin, which require genetic susceptibility to exert some of their negative effects, WGA does its damage directly. WGA alone is sufficient to generate coeliac disease-like intestinal damage by disrupting microvilli, the absorptive ‘hairs’ of intestinal cells.13
Phytates
Phytic acid (phytates) is a storage form of phosphorus in wheat and other grains. Because phytates also provide resistance to pests, grain-breeding efforts over the past 50 years have selected strains with increased phytate content. Modern wheat, maize and millet, for instance, each contain 800 milligrams (mg) of phytates per 100 g (3½ ounces) of flour. Phytate content increases with fibre content, so advice to increase fibre in your diet by consuming more ‘healthy whole grains’ also increases the phytate content of your diet. As little as 50 mg of phytates can turn off absorption of minerals, especially iron and zinc.14 Children who consume grains ingest 600 to 1,900 mg of phytates per day, while enthusiastic grain-consuming cultures, such as modern Mexicans, ingest 4,000 to 5,000 mg of phytates per day. These levels are associated with nutrient deficiencies.15
Alpha-Amylase Inhibitors and Other Allergens
Wheat allergies are becoming more prevalent. Numerous allergens have been identified in modern wheat that are not present in ancient or traditional forms of the plant.16 The most common are alpha-amylase inhibitors, which are responsible for causing hives, asthma, cramps, diarrhoea and eczema. Compared with older strains, the structure of modern alpha-amylase inhibitors differs by 10 per cent, meaning it may have as many as several dozen amino acid differences. As any allergist will tell you, just a few amino acids can spell the difference between no allergic reaction and a severe allergic reaction, or even anaphylactic shock. People in the baking industry frequently develop a condition called baker’s asthma. There is also a peculiar condition called wheat-derived exercise-induced anaphylaxis (WDEIA), a severe and life-threatening allergy induced by exercising after eating wheat. Both conditions are caused by an allergy to gliadin proteins.17 Many other proteins have undergone changes over the last 40 years: lipid transfer proteins, omega-gliadins, gamma-gliadins, trypsin inhibitors, serpins and glutenins. All trigger allergic reactions.
Life Outside the Grain Mooovement
The start of grain consumption by humans coincided with the dawn of livestock domestication. We learned that some herbivorous species, such as the aurochs and ibex, when confined and allowed to reproduce in captivity, could be put into the service of the human diet. While we were domesticating these creatures into cows and goats, they showed us that their diet of grasses was also something we could try to mimic. They also contributed to human disease, giving us smallpox, measles, tuberculosis and the rhinoviruses that cause the common cold.
While much of the world followed the lead of grazing ruminants and adopted a diet increasingly reliant on the seeds of grasses, not all cultures took this 10,000-year dietary detour. A number of hunter-gatherer societies throughout the world never embraced grains, relying instead on traditional omnivorous menus. The diets followed by such societies therefore largely reflect the diets of pre-Neolithic humans, i.e., diets that pre-date the development of agriculture. The modern world has, over the past few hundred years, encroached on these primitive societies, particularly when their land or other resources were prized. (Think of the native peoples of the American and Canadian Pacific Northwest or the Aboriginal populations of Australia.) Each instance provides a virtual laboratory in which to observe what happens to health when there is a shift from a traditional grain-free diet to a modern grain-filled one.
We have cultural anthropologists and field-working doctors to thank for such insights. Scientists have studied, for instance, the San of southern Africa, the Kitavan Islanders of Papua New Guinea and the Xingu peoples of the Brazilian rainforest, all of whom consume foods obtained from their unique habitats. None consume modern processed foods, of course, meaning no grains, no added sugars, no hydrogenated oils, no preservatives and no artificial food colouring. People following their ancestral diets consistently demonstrate low body weight and body mass index (BMI); freedom from obesity; normal blood pressure; normal blood sugar and insulin responses; lower leptin levels (the hormone of satiety); and better bone health.18 Body mass index, calculated as weight in kilograms divided by the square of height in metres, is typically 22 or less (a 70-kilogram person standing 1.75 metres tall, for example, has a BMI of 70 ÷ 1.75², or about 22.9), compared with our growing ranks of people with BMIs of 30 or more, 30 being the widely accepted cut-off for obesity. The average blood pressure of a Xingu woman is 102/66 mmHg, compared with our typical blood pressures of 130/80 mmHg or higher. The Xingu also experience less osteoporosis and fewer fractures.
The Hadza of northern Tanzania are a good example of a hunter-gatherer society that, despite contact with Westerners, has clung to traditional methods of procuring food.19 The women dig for roots and gather edible parts of plants, while the men hunt with bows and poison-tipped arrows and gather honey from bees. The average BMI of this population? Around 20, with vigour maintained into later life, as grandparents help rear grandchildren while mothers gather and prepare food. Despite a lifestyle that appears physically demanding on the surface, the total energy expenditure of the Hadza is no different from that of modern people – no greater or less than that of, say, an average accountant or schoolteacher.20 Activity is parcelled out a bit differently, of course, with hunter-gatherers tending to experience bursts of intense activity followed by prolonged rest, and modern cultures spreading activity more evenly throughout the day, but detailed analyses of energy expenditure among primitive peoples show virtually no difference. This challenges the notion that modern excess weight gain can be blamed on increasingly sedentary lifestyles.21 (Note that this is not true of all traditional cultures; the Luo and Kamba of rural Kenya, for instance, exhibit high levels of energy expenditure. The point is that differences in weight are not solely explained by differences in energy expenditure.)
Humans are adaptable creatures, as the wide variety of diets consumed worldwide attests. Some rely almost exclusively on the flesh, organs and fat of animals, as with the traditional Inuit of the Arctic reaches of North America. Other diets are high in starches from roots (such as yams, sweet potatoes, taro and tapioca) and fruit, as with the Kitavans of Papua New Guinea or the Yanomami of the Brazilian rainforest.
The incorporation of foods from the mammary glands of bovines has favoured the spread of a lactase-persistence gene that allows some adults to consume milk, cheese and other products containing the sugar lactose beyond the first few years of life – an advantage for survival. The seminomadic Maasai people of East Africa are a notable example. Largely herders of goats, sheep and cattle, they traditionally consume plentiful raw meat and the blood of cows mixed with milk, and they’ve done so for thousands of years. This lifestyle allows them to enjoy freedom from cardiovascular disease, hypertension, diabetes and excess weight.22
This is the recurring theme throughout primitive societies: A traditional diet, varied in composition and high in nutrient content but containing no grains or added sugars, allows people to enjoy freedom from all the chronic ‘diseases of affluence’. Even cancer is rare.23 This is not to say that people following traditional lifestyles don’t succumb to disease; of course they do. But the range of ailments is entirely different. They suffer infections such as malaria, dengue fever and nematode infestations of the gastrointestinal tract, as well as traumatic injuries from falls, battles with humans and animals, and lacerations, reflecting the hazards of living without modern tools, conveniences, central governments or modern health care.
What happens when a culture that has avoided the adoption of agriculture and grain consumption is confronted with modern breads, biscuits and crisps? This invasion by modern foods has played out countless times on a worldwide stage, with the same results each and every time: weight gain and obesity to an astounding degree, tooth decay, gingivitis and periodontitis, tooth loss, arthritis, hypertension, diabetes, and depression and other psychiatric conditions – all the modern diseases of affluence. Like a broken record, this same refrain has played over and over again in varied populations, on every continent.
It has been observed among the Pima Indians of the American Southwest, 40 to 50 per cent of whom are obese and diabetic, many of them toothless.24 It has been observed in native tribes of Arizona, Oklahoma and the Dakotas, where 54 to 67 per cent of the population is overweight or obese.25 Peoples inhabiting circumpolar regions of Canada and Greenland have all experienced dramatic increases in obesity and diabetes.26 Among Pacific Islanders, such as the Micronesians of Nauru, 40 per cent of adults are obese and diabetic.27 Modernized diets have put Australian Aboriginal populations in especially desperate health straits, with 22 times the risk of complications of diabetes, 8 times higher cardiovascular mortality and 6 times greater mortality from stroke compared with non-Aboriginal Australians.28
Until recently, the Maasai of East Africa, the Samburu of Kenya and the Fulani of Nigeria showed virtually no overweight or obesity, no hypertension and low total cholesterol values (125 mg/dl). When members of these groups relocate to urban settings, however, hypertension and obesity explode, with 55 per cent becoming overweight or obese.29 Former hunter-gatherers develop iron-deficiency anaemia and folate deficiency as they transition away from hunting game and gathering wild vegetation and come to rely on purchased foods, especially corn.30 Dr Roberto Baruzzi, a Brazilian physician, studied hunter-gatherers of the Xingu region of Brazil in the 1960s and 1970s and found slender people with no discernible excess body fat, no diabetes, no cardiovascular disease, no ulcers and no appendicitis. A repeat survey in 2009, following 30 years of contact with modern food, found 46 per cent of the people overweight or obese, 25 per cent of the men hypertensive, and most with abnormalities of their cholesterol panels (such as low HDL cholesterol or high triglycerides) and rampant dental decay.31 Another recent assessment, of the Aruák natives of the Xingu region, documented 66.8 per cent of men and women as overweight or obese, 52.1 per cent of women with abdominal obesity and 37.7 per cent of men with hypertension.32