Addicted to “Food Addiction”

I recently wrote a short article on “food addiction” for the Risk Innovation Lab’s CrisBits blog (collaboratively published by Arizona State and Michigan University!). This piece mainly focuses on the scientific side of the issue: I really wanted to cover research on the topic broadly, since so many popular articles on food addiction focus on single studies (and end up being extremely misleading). Yet I also really wanted to address the topic from an anthropological perspective.

… the notion of addictive foods attracts us on a much deeper level as well

So why are we almost addicted to the belief that “food addiction” is a thing? If you read my CrisBits article, you’ll see that there is (as of now) no actual evidence for any food ingredient causing addictive-like responses in humans. The field is hotly debated, though: there are plenty of scholars arguing both for and against. On top of that, the media often does a horrible job sensationalizing food addiction research (well, I suppose it does a great job sensationalizing, but a horrible job communicating the results correctly). All of that can surely create the illusion that science actually supports the food addiction theory. However, the notion of addictive foods attracts us on a much deeper level as well…



The allure of addictive foods

There is a strong cultural appeal in the idea that certain “bad” foods or their components can cause dependence and are thus dangerous (e.g. MSG, casein, gluten). This view of overeating as addiction includes the need to “detox” and instead eat a “clean” diet (e.g. this: The Diary of a Sugar Addict in Detox).

These are not just modern health trends, but a manifestation of a need to understand our world by imposing structure, and thus meaning, on the untidy experience that is reality. Structure is created by categorizing things into clean/unclean, healthy/unhealthy, pure/dirty- and things that don’t clearly fit into such categories are considered unclean and dangerous. Anthropologist Mary Douglas makes this point in her seminal book, Purity and Danger, as she examines food taboos (cultural rules about what not to eat). Douglas points out that prohibited foods are considered “polluting” because they defy easy classification into culturally important categories. The current unease with genetically engineered foods is a fantastic modern example: as a technology that blurs the lines between natural and unnatural domains, it is indeed often termed by opponents as “genetic pollution” or “contamination”.

…prohibited foods are considered “polluting” because they defy easy classification into culturally important categories.

The categories we create to make sense of the world have strong moral overtones, as they allow us to essentially define right and wrong. Indeed, the word “addiction” itself is connected to the moral disapproval of socially undesirable behaviors (e.g. drug abuse). Psychologist Paul Rozin points out how the fear of sugars in American diets, for example, reflects the Puritan belief that things that are very pleasurable must also be bad.

Religious scholar Alan Levinovitz also emphasizes that people frame eating in terms of morality and religion. He discusses how concepts of healthiness reflect the “myth of paradise past”- the idealistic belief that things were better, healthier, and even morally superior before. From such a perspective, novel changes to foods represent our fall from grace- whether via agriculture (e.g. as in paleo diet ideology) or industrialization and technology (as with processed and genetically modified foods).

So, that’s my little anthropological view of food addiction beliefs as a cultural phenomenon. Hope you enjoyed it!


P.S. You might see news reports on studies about food addiction... but keep in mind that no clinical diagnosis for “food addiction” exists, and most such research uses a self-report questionnaire: the Yale Food Addiction Scale (YFAS). This tool uses DSM-IV’s generic criteria for substance dependence to measure addictive-like eating.

Most importantly, it does not validate the existence of “food addiction” as a true disorder (DSM diagnostic criteria are intended for trained clinicians, not as a checklist for self-diagnosis via a simple questionnaire). This is a critical issue to consider, as most food addiction research with humans is based on diagnosing food addiction this way.


Time of Eating & Health: Video

I made my first science communication video! It took me only ~ 15 hours, no big deal 🙂

It won’t be as time-consuming from now on... but there is a lot of work involved nevertheless: writing a good concise script, sketching all the images that could go along with it, setting up the recording (can be so tricky!), recording yourself drawing (and redrawing... and redrawing) every frame... then editing all those videos, recording the audio (and re-recording... and re-recording again), and finally matching video to audio (as well as finding some free background tunes to go along!). Check it out:

 

I chose this topic because I’ve been craving to cover it for some time now. Since my dissertation work focused on lay models of healthy eating across cultures (so: people’s beliefs about what it means to eat well), I did not address the scientific accuracy of any perceptions. But oh, I wanted to! And that is because one of the most fascinating findings from my interviews was that eastern European (EE) participants considered “how you eat” (I call these “eating styles”) to be more important for health than American respondents did.

Eastern Europeans (EE) judged statements about EATING STYLES (such as time of eating) as more important for health…

Specifically, EE participants rated the statement “it is important to avoid eating late in the day” significantly higher than Americans (and this was consistent with my past survey-based studies!).

This is what the image below shows, but let me explain the method behind it: I conducted >70 interviews in the U.S., Romania, and Ukraine, where I asked people to look at 42 different statements about “healthy eating”. Among other activities, they had to indicate how much they personally agreed with each statement (from +4, agree completely, to −4, disagree completely; I used Q Methodology for this, by the way).

So, between Americans and eastern Europeans, one statement about eating styles (or “context”, as I refer to it in this chart) was more important for the latter: not eating late.

So, out of ALL 42 cards, only “avoid eating late in the day” got a statistically significantly higher agreement score from eastern Europeans. AND when prompted to explain their views, my respondents gave an explanation that was amazingly close to the actual science of circadian rhythms!!
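To make that comparison concrete, here’s a minimal sketch of how agreement scores for a single Q-sort statement can be compared across groups. The ratings below are made up for illustration- the real data live in my dissertation:

```python
from statistics import mean, stdev

# Hypothetical Q-sort ratings (scale: -4 = disagree completely, +4 = agree
# completely) for "it is important to avoid eating late in the day".
ratings = {
    "US":             [-1, 0, 1, -2, 0, 1, -1, 0],
    "eastern Europe": [3, 4, 2, 3, 4, 2, 3, 3],
}

for group, scores in ratings.items():
    print(f"{group}: mean agreement {mean(scores):+.2f} (sd {stdev(scores):.2f})")
```

In practice, a statistical test (or the group-comparison output built into Q-analysis software) tells you whether such a difference is significant- the means alone are just descriptive.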

Why did EE folks seem to know about circadian rhythms way more than Americans?

I didn’t analyze why EE folks seem to know about circadian rhythms way more than Americans do, but this is something they knew from childhood... it was part of the general recommendations and “common sense” of growing up in the USSR. In fact, the importance of eating styles is prominent in traditional beliefs about health (as in Japan and China)... perhaps all the focus on nutrients that’s possible with modern science is taking our attention away from this old wisdom?

perhaps all the focus on nutrients that’s possible with modern science is taking our attention away from this old wisdom

Now that nutrition science is paying increasingly more attention to eating styles as well (CHRONO-NUTRITION!), I assume American folks will begin incorporating beliefs about the importance of food timing too!

Are we all just food-selecting zombies?

It was a Saturday afternoon- I had spent all morning writing one of my dissertation articles. It was unfortunate that I had to be on campus instead of enjoying my morning coffee at home, but some syncing error got me panicking as I couldn’t find my latest saved draft... So here I was with only 25 minutes before my aerial fitness class: I made a quick stop at an empty campus store, grabbed something to eat and rushed out to my car, still deeply pensive over some changes I should make (I am writing about lay interpretations of healthy eating context).

I looked down to see what it was I was holding in my hand, because it seemed like I had made my snack choice in some auto-pilot mode:

OK, it made sense. Considering the context. And then I thought- I just spent 4 hours writing up my article on healthful eating beliefs.. how would I go about finding out what rationale was behind my food selection right now?

I pretended to ask myself in an interview format- “why did you choose these items?”- and immediately imagined a word cloud of my transcribed answer: there was a whole bunch of stuff there, but several of the most salient words stood out: calories, protein, satiety, light. The transcribed text would read:

…so I was not starving yet, but it was almost 1pm and I had an intensive aerial class that I anticipated I’d want energy for… I needed to feel full but not physically full (so, light)- can’t eat anything big before hanging upside down on the aerial hoop! I know protein is satiating, and I like this bar because it’s damn delicious (I’m aware of the halo effect that “protein” has in this situation- extrapolating the “goodness” of protein to unrelated product characteristics, such as its overall healthiness… it’s really just a candy bar! but the health claims on the package do pacify the guilt splendidly). I also know that sweetness may provoke hunger on its own, so I have to balance the taste with the umami-ness of string cheese. This combo is also just about 300 calories, which is my upper limit for a snack (I gauge it, though I know I’m exactly on point with the number despite not checking the nutrition label)… I don’t really count calories- I think it’s not a helpful behavior and one can become fixated on it, which might get detrimental for your dietary quality. Yet I also can’t help being somewhat vigilant- I know eating gets more “fun” later in the day, so I want to leave enough of an energetic allowance to indulge in my evening Netflix/PlayStation time. Calories definitely matter- I’m so tired of people’s hopeful attempts to fight this truth and discover a loophole in the first law of thermodynamics. Sure, there are nuances- cooking and processing can change the availability of calories to your body, but those are just nuances to me- at least that’s my current stance based on the literature.

Wow, that’s a whole lot of rationale for an “auto-pilot” choice that took 20 seconds without conscious effort. Of course, eating perceptions and choices are my research topic, so I am quick to self-reflect in detail. Yet for many respondents, who hold their own complex mental models of healthy eating, this can be like pulling teeth- it’s not easy to explain things that seem obvious or natural to us (unless maybe you’re writing a dissertation on it). My reasons are good examples of cognitive heuristics- “rules of thumb” used to make choices in complex situations, such as eating (we make about 200 eating decisions daily, according to Dr. Wansink- too lazy to give you the specific study name... just google it 🙂 ). The “protein- satiety- good” connection is a simple heuristic; the “power” and “energy” words on the bar signaled the appropriateness of this snack before a workout; and the familiarity of the products (I know this bar and its taste; I’ve bought it before) also played a role.

But anyway: I’m almost done writing my first chapter now. I’m in the process of shortening it, actually……… by about 10,000 words :S It’s such a painful process to let go of your findings- perhaps I’ll post a bunch of interesting results here in the coming months! I could be sporadically posting cool quotes on Twitter or Instagram too, but honestly- that’d get the attention of maybe 10 people. Meanwhile my latest quick sketch of a friend pulling off an aerial trick just got more than 1000 likes… So forcing myself to tweet the dissertation is lacking in motivation at the moment. In the meantime- enjoy whatever it is you might be eating right now! Don’t overanalyze it, I suppose?

UPDATE:

I stopped by the campus store on this fine “dissertating” morning, and got the protein bar again + another item to illustrate my previous point. This probably won’t shock anyone, but I’d say I was quite correct in stating 2 days earlier that “it’s really just a candy bar!”

The protein bar’s serving size says “1 COOKIE”. Cookie! Kit Kat has the decency to refer to itself as 1 package 😀 Surely, both are just candies.

At least if you consider the energy content and, really, the majority of the ingredients (I will admit “monk fruit” sounds mysteriously awesome, though it is the last ingredient, so there’s only a trace amount of it).

Now, obviously there’s a difference- and that’s the difference that drives the high price point of the protein bar (as well as its healthiness message): the power bar has more protein (13 g vs. 3 g) and less sugar (5 g vs. 21 g). On the other hand, the power bar has a bit more saturated fat and cholesterol. That last point is most likely less relevant to an average reader- so far, my interviews and surveys show people vilify sugar much more than fat (again, you’re probably not shocked, and I’m definitely not the first one to notice- the low-fat fad is over; it’s been all about the horror of carbs for a while).
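Just to put those label numbers side by side- the protein and sugar grams are the ones quoted above, and the 4 kcal/g conversion is the standard factor for both protein and carbohydrate:

```python
# Grams per serving, as quoted above; both bars are roughly similar in
# total calories, which is what makes the comparison interesting.
bars = {
    "Power Crunch": {"protein_g": 13, "sugar_g": 5},
    "Kit Kat":      {"protein_g": 3,  "sugar_g": 21},
}

for name, n in bars.items():
    kcal_protein = n["protein_g"] * 4  # ~4 kcal per gram of protein
    kcal_sugar = n["sugar_g"] * 4      # ~4 kcal per gram of sugar
    print(f"{name}: ~{kcal_protein} kcal from protein, ~{kcal_sugar} kcal from sugar")
```

Either way, much of each bar’s energy comes from the remaining ingredients (fats and other carbs)- which is part of why “it’s really just a candy bar”.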

Now, protein appears to be more satiating than sugar, according to a bunch of studies (go check out Google Scholar), so perhaps you indeed might eat more later after the Kit Kat, despite eating the same amount of calories as from the Power Crunch bar. And something like that can be tested in a nicely designed experimental study (it probably has been). Despite all of this, next time I make a quick stop at the store, I’ll probably still reach for the Power Crunch bar. Buying a Kit Kat is too bizarre- I don’t eat candy! And though I know the bar is really just another candy- well, it just leads to less cognitive dissonance 😛

 

 

Red meat, human vulnerability, and.. mammal pets?

Exciting day! Another diet-related talk at ASU’s Center for Evolution & Medicine. This was a nice break from the horror that is the last 2 weeks of the semester..

It’s taking me a while to “digest” all the information (hehe), but I found the seminar fascinating and wanted to summarize some main points. Lots of open questions remain, but John Pepper of the National Cancer Institute really shows how examination of any health problem needs to focus not only on proximate causes, but also on ultimate, or evolutionary, causes.

So.. Pepper asks- why is mammal meat bad for humans, specifically?

Meet Dr. Pepper!
In humans, red meat (he refers to it simply as mammal meat) is linked to inflammatory diseases (cardiovascular disease, Alzheimer’s, arthritis). What’s the mechanism behind this?
The inflammation from mammal meat has to do with our antibodies attacking something coming from other species.. When we eat mammal meat, we in fact incorporate something non-human from the diet- sialic acid.
Both humans and other mammals actually have sialic acid in their tissues, but humans have a unique mutation that replaced the ancestral form found in other mammals (Neu5Gc) with a different, uniquely human one (Neu5Ac).
So.. if we eat mammal meat, we acquire the ancestral sialic acid; it becomes part of our cells, and the small structural differences between the two forms get recognized by the immune system.. which responds with a defense: inflammation!
Chimpanzees are humans’ closest evolutionary relatives, sharing a common ancestor 6–7 million years ago..
WHY does human sialic acid differ uniquely? The “Malaria hypothesis” (see Martin & Rayner, 2005) proposes that in Africa, early humans escaped from the ancestral pathogen they shared with chimpanzees. They managed to do so by replacing the pathogen’s binding target (the ancestral sialic acid, Neu5Gc) with the novel Neu5Ac. With time, a population of that old evaded pathogen evolved to infect humans again by recognizing the new Neu5Ac.. leading to the origin of malaria.
The longer an animal has been domesticated, the more humans share parasites and diseases with them

If the Malaria Hypothesis explains why the initial change in humans happened.. why has it remained the same to this day? I mean, it’s been several million years now- has this mutation been advantageous this whole time? It’s an important question because this sialic acid mutation imposes a COST on our health: the trait causes chronic inflammation in people who eat mammal-derived foods, and it also now causes vulnerability to malaria.

The hypothesis for why the human sialic acid modification is still around is that it provides benefits- specifically, protection from parasites and pathogens via increased inflammation. This is relevant because of what humans have been doing for the last ~15,000 years: animal domestication!
Humans are more vulnerable to shared pathogens from other mammals (than from non-mammals). So being around cattle, for example, carries a risk of catching pathogens from which the cattle suffer. Such animal pathogens impose strong selective pressures on humans.. Pepper suggests that the uniquely human sialic acid (Neu5Ac) allows our diet to adapt us to the issue of animal pathogens by adjusting our inflammatory tone (how much inflammation we are experiencing): “those human populations that are exposed to domesticated food-mammals and their pathogens are also eating mammal-derived foods that are pro-inflammatory (both meat and dairy).”
Inflammation is a great example of a trade-off. It both has benefits (protection from parasites & infections) and costs (chronic disease, metabolic expense of mounting an immune response). The optimal balance for this trade-off would depend on how strong of a pathogen pressure you’re experiencing.
This increases inflammatory PROTECTION only where it’s most needed (like around animals). So this auto-immune inflammation from mammal foods in the diet not only increases likelihood of chronic disease, but protects against shared mammalian pathogens.
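The trade-off logic above can be sketched as a toy cost-benefit model. This is entirely illustrative- the functional forms and numbers are my own assumptions, not anything from the talk- but it captures the idea that the benefit of inflammation grows with pathogen pressure while its cost stays fixed, so the optimal inflammatory tone shifts up when pathogen pressure is high:

```python
import math

def optimal_inflammation(pathogen_pressure, cost_per_unit=1.0):
    """Toy model: benefit(i) = pressure * (1 - e^-i), cost(i) = c * i.
    Setting the derivative of (benefit - cost) to zero gives
    i* = ln(pressure / c) whenever that is positive."""
    if pathogen_pressure <= cost_per_unit:
        return 0.0  # protection never pays for itself
    return math.log(pathogen_pressure / cost_per_unit)

# Higher pathogen pressure (e.g. living among livestock) favors a higher
# inflammatory tone in this model:
print(optimal_inflammation(2.0))
print(optimal_inflammation(10.0))
```

The point isn’t the particular curve- it’s that any model with rising benefits and linear costs will place the optimum higher where pathogen exposure is greater, which is exactly the dietary “tuning” Pepper described.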
…..    ……    ……
It got me thinking about human culture and our ability to modify our environment in all sorts of ways- an example of “maladaptation” to modern times! Living in cities, not being exposed to the higher pathogen load that comes with living around domesticated animals.. yet having access to all the mammal meat we can buy- all of this puts you in a situation where the good old sialic acid mutation might do more harm than good. Should people go vegan? Should they simply cut down on red meat? There was no discussion of the effect size linking mammal meat eating and chronic disease, so I wouldn’t necessarily jump onto any lifestyle changes based on this talk. Yet the process of understanding this health concern through the lens of evolutionary medicine is quite fascinating!
 P.S. I’m not an expert on this topic. If you have something to correct or add, please comment 🙂
Very cool use of evolutionary medicine principles in this case & a glimpse into why it’s important to use them if we want to understand disease.

 

Eat Less- Live Long? Not so FAST..

Got it- to FAST? 😀
The past week has been a treat in terms of great talks on campus. At ASU we are super-lucky to have the Center for Evolution & Medicine, which holds weekly talks by amazing speakers.

February 18- Arizona State University

When I saw that the upcoming seminar was related to diet and eating... or more specifically NOT eating, or “dietary restriction”, I of course RSVP’d in a heartbeat.

“Eat breakfast yourself, share dinner with a friend, give the supper to your enemy”- Russian Proverb

I’ve in fact been fascinated with caloric restriction for years now (I wrote a whole research paper on it in the first year of my master’s degree). You might have heard of intermittent fasting (e.g. popular in the CrossFit world), or the CR Society (http://www.crsociety.org/)- all are related to the concept that restricting food intake results in health benefits (from extending life to preventing and reversing disease).

I’m sure you can Google caloric restriction and find a bunch of information on its reported benefits.. you would see this chart at the CR Society website- the lifespan of calorie-restricted (CR) mice vs. non-CR mice. You can see that mice whose food intake was restricted by a greater and greater % lived longer. Why do many animals (and perhaps humans) appear to be so well-adapted to eating less? The traditional interpretation of this CR phenomenon is that the dietary restriction effect “has evolved as a way to enhance survival & preserve reproduction during periods of naturally occurring food shortage”. In other words- being adapted to do well on restricted food intake during rough times would have helped our ancestors survive them & stay healthy enough to have kids later, when the food situation improved.

The traditional interpretation of this CR phenomenon is that the dietary restriction effect “has evolved as a way to enhance survival & preserve reproduction during periods of naturally occurring food shortage”.

Experimental evidence with animals, however…supports a different hypothesis- the one Dr. Austad (Professor & Chair of the Department of Biology at the University of Alabama) presented to us last week. Again, I wouldn’t be able to cover everything he discussed during the seminar, but I do want to highlight a couple of main points!

I. First, even though the first book on dietary restriction (DR) dates back to the late 16th century, we still do not know the mechanism behind why DR seems to extend life and vigor in animals and delay diseases such as cancer. METABOLISM was the original suspect, as metabolic rate goes down with fasting.. however, metabolic rate drops only initially and gradually goes back UP (this takes 6-8 weeks). Since DR changes an unbelievable number of physiological parameters (see screenshot), it is very hard to determine its mechanism.

II. Second, while many sources cite mice experiments showing life extension with caloric restriction.. those experiments are done with lab mice. When DR studies are done with wild mice, DR has no effect on longevity. WHAAAT!! I’d never heard this before- in fact I was under the impression that CR/DR extends life in animals, period. Well, NO STUDY has ever found that DR extends life or improves health in nature (or even “nature-like” conditions). Mice in the wild actually do not have enough fat stores to reduce feeding except very briefly (wild mice have about 4% fat while regular lab mice have 15%; lab mice also do not reproduce). In fact, mice in nature simply do not live long enough for the survival benefits of DR to be important. Another challenge to the original hypothesis that adaptation to dietary restriction enhances survival is that DR increases mortality from some infections. Lastly, DR increases cold sensitivity (and cold is a major source of death in wild mice) and slows down wound healing.

Sounds like animals in the wild would not benefit from adaptation to dietary restriction… so why is the positive DR effect observed in so many studies?

III. Well, even though wild mice do not live longer on restricted diets, DR still protects them against cancer. But even more importantly, DR has been found to protect against the acute effects of many, many toxins! Dr. Austad described this discovery in the following way:

 .. if animals cannot afford to wait to reproduce.. and they have to do it even when food conditions are poor, what they will do is broaden their diet. This means they might be ingesting a lot of toxins they are not normally exposed to (foods infected with fungi, new seed types that are well defended by chemicals they wouldn’t normally encounter). So the hypothesis is that DR acutely induces broad defense mechanisms against a broad range of toxins

Toxicology studies have shown that calorically restricted mice survive exposure to a wide range of toxins. DR also acts as an acute (vs. chronic) protectant against other problems (see slide below). Renal ischaemia-reperfusion injury (IRI) is a common cause of acute kidney injury, and we can see that while ad libitum mice (ad libitum means eating as much as one wants) are dying steeply by day 7, those on DR of various proportions survive (30% DR means only 70% of normal food intake). This is quite impressive!!!


These acute benefits of DR have very important implications. We can think about these effects actually protecting the body against the toxins it itself produces (like free radicals).. it also has clinically relevant advantages- e.g. patients on very strong drug cocktails fasting to avoid harsh side-effects. This suggests that the protective effects of DR could have clinical relevance unrelated to chronic benefits like life extension.

The new hypothesis explaining the evolutionary advantage of this paradoxical effect is that dietary restriction arose as a defense against novel exposure to toxins during food shortage.

So in conclusion.. we saw evidence suggesting that dietary restriction would NOT enhance survival in nature. Yet research has shown that DR increases health and life in a diversity of species. The new hypothesis explaining the evolutionary advantage of this paradoxical effect is that dietary restriction arose as a defense against novel exposure to toxins during food shortage.


My conclusion? I’m still excited about this topic- more than ever before!!! There is a lot of work being done now on the timing of food intake as well (not just restricting the amount, but restricting the timing of eating) and its relationship to human health- I can’t wait to post more about this (after I collect some necessary data, though :).  Watch out for early May, as I’ll be sharing some more info!

The Sci Files #1: Importance of Carbs in Human Evolution

Note: This Fall I decided to attempt even more science communication! The Sci Files (imagine the X-Files theme playing) will be a collection of health & food-related research articles that I summarize in plain(er) language. I’ve become quite passionate about breaking down hard-to-understand research for a public audience, and I’ll try to do my best, considering I’m no expert! Yet 5 years of graduate courses- statistics, research methods, nutrition psychology, evolution & medicine- at least give me the skills to understand a lot of material that might be overwhelming to a lay reader. I will try to keep each summary to one page (~500 words), possibly followed by extra material that could be interesting 😉

For the first Sci File, I’m looking at a paper discussed yesterday during a lecture on the paleolithic diet. It was published in 2015 in The Quarterly Review of Biology, and the title intrigued me: “The Importance of Dietary Carbohydrate in Human Evolution”. I’ve heard multiple talks on how the various “paleolithic” diets could have included starchy foods, but I didn’t think they were a substantial part of such diets.
Original paper: Hardy, K., Brand-Miller, J., Brown, K. D., Thomas, M. G., & Copeland, L. (2015). The importance of dietary carbohydrate in human evolution. The Quarterly Review of Biology, 90(3), 251-268.

Short summary:  

Apparently, you can delete the “NO” and still keep calm 😉

The authors propose that carbohydrates- particularly cooked high-starch plant foods like tubers & roots- were essential in the evolution of our species, especially for the quick expansion of the human brain. They support this by showing that critical development of this large glucose-hungry organ required digestible carbohydrates, and that eating cooked starch would greatly increase this energy availability to the brain (plus other glucose-hungry tissues such as red blood cells and the developing fetus).

They also show that the mutation in the enzyme for digesting carbs (salivary amylase, AMY1) co-evolved with both cooking and eating starchy carbs, giving an advantage to early humans. To put it in simpler terms: carbs were quite important, as shown by our increased ability to digest cooked starch (otherwise, why retain this mutation if we did not rely on cooked starches for a substantial amount of time?). A meat-heavy diet wouldn’t have provided sufficient glucose or energy to the growing brain; besides, (1) large amounts of protein are in fact toxic, and (2) providing a sufficient amount of animal-based food would require too much effort:

“the energy expenditure required to obtain it may have been far greater than that used for collecting tubers from a reliable source”

Some Context: 

There is no clear agreement on what constituted a “Paleolithic diet”, but it makes sense to assume that our current physiology should be optimized for the kind of diet we had during our evolutionary past. Some important features of our evolution are considered linked particularly to key changes in diet: smaller teeth, a smaller digestive tract (1.8 million years ago), larger brain size (began ~2 million years ago; accelerated around 800,000 years ago), and better aerobic capacity (the ability of the heart and lungs to get oxygen to the muscles) about 2 million years ago.

Early hominins include modern humans, extinct human species, and all our immediate ancestors

Some have argued that these changes happened because humans transitioned from a diet based on fibrous plants to a mostly meat-based diet.. But this paper offers evidence that both plant carbohydrates (carbs) and meat were crucial in human evolution. In their words:

“We contend that in terms of energy supplied to an increasingly large brain, as well as to other glucose-dependent tissues, consumption of increased amounts of starch may have provided a substantial evolutionary advantage to Mid-to-Late Pleistocene omnivorous hominins”.

This photo is missing some starches!

Actual physical remains of early hominins are quite rare, so there is a lot of uncertainty about their lives. As already mentioned, there were several important changes in hominin morphology (the size, shape, and structure of an organism) related to the appearance of Homo erectus (teeth, digestive tract length, brain). Anthropologists propose that these occurred with a change from a “high-volume, low-energy diet” (lots of fibrous plant material that’s not very calorie-rich) to a low-volume, high-energy diet (foods more packed with energy, like meats and starchy roots & tubers).

It looks like the climate fluctuated between moist and dry periods, which required flexibility in diet (omnivory).. Increased meat consumption has been suggested as an important buffer against such environmental change (it also helped hominins expand into new, unfamiliar environments), but high-starch plant foods might also have been a very common and important part of the diet- especially when cooked. The timing of widespread cooking is not known, but it is argued that it was long enough ago to allow for biological adaptations to take place.

Note: Secure evidence of the use of fire to cook dates to about 400,000 years ago, though some suggestive evidence for a relationship between humans and fire dates to at least 1.6 mln years ago.

The fact that early hominins ate starchy foods is supported by various lines of evidence (the paper goes through rather wordy technical anthropological examples that I won’t try to summarize). But while meat-eating evidence usually survives (e.g. animal remains with cut marks suggesting butchering), evidence for plant foods doesn’t, which makes it hard to reconstruct ancestral diets based on physical remains alone (and biases them towards exaggerating meat eating).

Co-evolution of cooking & carb-digesting genes

Humans can digest starches with the help of an enzyme in saliva- salivary amylase! AND humans are quite unusual in having high levels of this enzyme, suggesting an adaptation to diets rich in cooked carbohydrates. Also, people from populations with traditionally high-starch diets generally have more AMY1 copies than people from populations with traditionally low-starch diets (hey! adaptation!).

Amylase (salivary amylase or AMY1)- an enzyme, present in saliva, that begins digesting starches in the mouth. The authors hypothesize that cooking and variation in the salivary amylase gene copy number are correlated.

The variation in copy number of the salivary amylase gene is an important point of the paper – these enzymes are pretty much ineffective on raw starch, but cooking substantially increases their potential to provide energy/calories. So multiplication of the salivary amylase gene (AMY1) would have become selectively advantageous only once cooking became widespread. (It's been estimated that the three human AMY1 genes have been evolving separately for less than 1 million years.) The authors theorize a gene-culture co-adaptation scenario here: cooking starch-rich plant foods (cultural evolution) coevolved with increased salivary amylase activity in the human lineage (genetic evolution). Without cooking, eating starch-rich plant foods probably couldn't meet the high demand for preformed glucose seen in modern humans.

Note: A mutation that is selectively advantageous means a change in DNA that gives a survival advantage to a particular genotype under certain environmental conditions. So in an environment where starches are available (e.g. you can find a lot of roots and tubers) and humans have learned to cook, having more copies of the AMY1 gene that aids in digesting cooked starch would help those folks survive better (e.g. in times of food crisis, when they can't hunt or gather other sources of food) than folks without that mutation.
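To make "selectively advantageous" concrete, here's a toy simulation (my own illustration, not from the paper- the 1% starting frequency and 5% fitness advantage are made-up numbers) of how even a modest, consistent survival edge lets a rare variant spread through a population:

```python
# Toy model: a beneficial variant (think extra AMY1 copies once cooking
# is widespread) starts rare, and carriers survive slightly better.

def next_freq(p, s):
    """One generation of selection: carriers have fitness 1 + s, non-carriers 1."""
    return p * (1 + s) / (p * (1 + s) + (1 - p))

p = 0.01  # variant starts in 1% of the population (assumed)
s = 0.05  # assumed 5% survival advantage in a cooked-starch environment
for generation in range(500):
    p = next_freq(p, s)

print(f"frequency after 500 generations: {p:.3f}")  # nearly 1.0 (close to fixed)
```

The point isn't the specific numbers- it's that small advantages compound over generations, taking the variant from rare to nearly universal. That's why widespread cooking could plausibly drive AMY1 copy-number evolution.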

To further test the paper’s hypothesis, we need “a convergence of information from archeology, genetics, and human physiology”. So let’s stay tuned 🙂


Well, i’m at around 900 words, which is more than the summaries i hope to do in the future! In my defense, this paper was FULL of fantastic information, often rather technical and challenging to explain in less words. I do have some extra content below i found fascinating if you found this summary interesting!


Healthy Eating- Real or Imagined??

[Image: The lobby of one of the conference hotels during a non-busy time. The hallways were overflowing with anthropologists from all over the world just hours later 😀]

This December I presented my research at the American Anthropological Association meeting in D.C. (woohoo!) What a blast! The conference was bursting with anthropologists from all over the globe; the 5-day event was so packed with presentations that the program, which included just the names of talks & authors, ran about 500 pages.

Anyway, one of the interesting moments from the trip involved a scholar (I believe she did some work in Latin America, but I don't know what kind of anthropologist she was) who was seemingly bothered by our session on food and nutrition. Our talks focused on "healthy eating" as a social construct [a social phenomenon created and developed by society; a perception or idea that is 'constructed' through cultural or social practice]. My talk was on how perceptions of what healthy eating means differ among and within cultures (Ukrainians & Americans in my study), while other presenters talked about how food is discussed in the Canadian Arctic and among those following a "paleo" diet plan.

[Image: Presenting on my Ukrainian study!]

The question this lady asked was why we spoke of healthy eating as something created and perceived by humans, as if there were no objective healthy diet supported by science.

It’s a bit funny to hear someone being surprised that concepts are discussed as a social creation vs. an objective reality at an anthropology meeting.. but that shows how food and healthy eating can be quite emotional when one is health conscious! I would bet this scholar was someone who personally cares about eating well for her own health. Understandable. Food is a very emotional topic- it is not only good/bad for health and looks, it also represents our identity, our culture, our experiences, etc.

Part of my answer to her was that science might not be able to give her what she is looking for- the objective healthy diet. Not because science sucks, but because nutrition studies are lengthy, complicated, and costly (see my post on why nutrition science doesn’t suck HERE). My favorite example of why nutrition science is hard to rely on is SUGAR. Look at this World Health Organization 2003 report (see full report).

Common sense might tell you that added sugar can't be good- it adds calories, maybe it makes you hungrier or disrupts bodily processes, maybe it's just unnatural. People I interview often mention sugar as one of the main causes of weight gain. Common sense, right? Well, look at the WHO report and check out Free Sugars (= all monosaccharides and disaccharides added to foods by the manufacturer, cook or consumer, plus sugars naturally present in honey, syrups and fruit juices). The only convincing evidence from scientific studies is that free sugars increase the risk of dental caries. Not weight gain, not diabetes, not heart disease. Does this mean sugar is only bad for teeth? No, it means there isn't evidence from existing studies that it causes other diseases. So if you want to state with complete confidence that added sugars lead to chronic disease and obesity, you might have a hard time backing it up.


Thinking that there is no such thing as a healthy diet is unsettling. We want clarity. :S Saying that "healthy eating" is a socially constructed idea, however, doesn't mean that there is no such thing as healthy eating. It does mean that there are multiple ways one can eat well to avoid disease- vegetarian, vegan, paleo, a regular calorie-restricted diet, the Mediterranean diet, etc. etc. etc.

Historical perspective on what good/healthy eating is.

The official stance on a healthy diet is not purely unbiased either- the political and historical context shapes what is officially recognized. I heard a very interesting talk on the differences in nutrition perceptions between Denmark and Germany during 1940-1945 by Dr. Jensen (University of Copenhagen). She talked about how, in the early 20th century, macronutrients, salts, water and ash were believed to be the sole constituents of food. Then vitamins were discovered, resulting in growing scientific interest in identifying new "micronutrients"- a development that diminished the perceived importance of the macronutrients (protein, fat, carbs). As micronutrients became the focus in Denmark, good nutrition there became about vegetables- the source of many micronutrients. In Germany, however, a country experiencing hunger during WWII, macronutrients remained the most important consideration in nutrition textbooks (with protein considered the primary element of food- for the satiety and strength it provides, especially for a country at war!). The point is- the scientific (and thus public) perception of what good eating means is shaped by societal circumstances.

It all just depends…

[Image: Baklava- a Middle Eastern dessert I am absolutely insane about. My friends sometimes wonder how I can study health yet eat something so "unhealthy" as a high-sugar dessert. IS it unhealthy? Turkish people love their sweets, yet traveling around Turkey will show you that the population is not plagued by obesity and chronic disease.]

Back to whether an objective healthy diet exists. If we ignore for a second that people disagree on the details of what one should eat to stay healthy (is carb or fat evil? is animal protein toxic? should you go vegan? avoid gluten like the plague?), most folks at minimum agree that eating "real" or whole foods is important (in other words- avoiding or limiting modern processed foods and focusing on less modified foods). I suppose we could say that this definition of a healthy diet is generally accepted. If we move on from processed vs. whole, though, here are a couple of examples of when something generally healthy might not be good for you, or vice versa:

Cabbage! A wonderful plant full of micronutrients (vitamin K! vitamin C!) that protect against various diseases; it is often said to have anti-inflammatory and anti-cancer properties. Awesome. Unless you have hypothyroidism, since cabbage is one of the foods that can interfere with thyroid function.

Dairy! Gets a bad rap from the paleo community and others. While until recently thought of as very important for bone health and whatnot, there is a lot of talk that we have not evolved to tolerate it well and that it is thus unhealthy to consume. Our genes are still adapted to the pre-agricultural diet (before ~10,000 years ago), as many paleo proponents argue. Yet there is evidence challenging the assumption that humans are essentially unchanged since the Paleolithic era- e.g. the "recent" evolution of lactase persistence, and variation in the number of genes coding for amylase production tied to starch consumption. In other words, mutations have occurred that allow many folks to digest and thrive on dairy and grains just fine.

Phytates! Plants have a lot of great components that generally affect us positively (e.g. vitamins protecting from disease), but it depends... For example, phytates in grains and nuts are usually viewed as bad for us because they can bind to certain dietary minerals, leading to deficiencies (iron, zinc, etc.). In West Africa, many Hausa plant foods (especially cereals and legumes) contain substantial amounts of phytates, but these botanical chelators have a potential malaria-suppressive effect (awesome!!). However, this anti-malarial effect may be antagonized by antioxidants in other foods (e.g. such free-radical traps as vitamins C and E, beta carotene, and selenium). Antioxidants are something many of us try to increase in our diets... yet if you live in a malaria-prone region of Africa, you might want to pursue the opposite dietary strategy- phytate-rich and antioxidant-poor foods.

So is there an objectively healthy diet? Generally- all eating is healthy, since it is required for survival... undereating and overeating are not good... lacking a variety of nutrients is not good... and that's mostly it. Of course, different things work for different people- someone might not tolerate dairy, others might feel miserable on a vegan diet; some thrive on salads, others can't digest raw plants well. If only we could all grasp the wonderful concept of moderation and apply it in our lives without struggle. In fact, it is because self-control is so hard to maintain that we want simplified solutions- a diet plan, a list of "bad" foods to simply avoid, etc.

Happy Holidays– don’t overeat on most days, yet don’t let yourself stress so much about what you’re eating that you are unable to enjoy life! 😉 *grabs a big fat piece of dark chocolate and kicks back*.