The practice of calorie restriction involves reducing calorie intake by up to 40% while maintaining an optimal intake of micronutrients. It meaningfully extends life span in short-lived species such as mice, but adds no more than a few years in humans. The effect on life span of this and other interventions known to slow aging via upregulation of stress response mechanisms scales down as species life span increases – though, interestingly, the short-term benefits to health look quite similar across mammalian species.

The most important mechanism of action in the calorie restriction response, as well as in responses to heat and other stresses, appears to be increased activity of autophagy. Autophagy is the name given to a collection of cellular maintenance processes that break down unwanted or damaged proteins and cell structures by conveying them to a lysosome, a membrane-bound organelle packed full of enzymes capable of breaking down most of the molecules a cell will encounter. It is noteworthy that disabling autophagy, or important related processes such as the formation of stress granules to protect vital proteins from increased autophagy, blocks the benefits of the calorie restriction response. It is similarly noteworthy that the efficiency of autophagy becomes impaired with age, and this is thought to contribute to many manifestations of aging.

The present consensus on why calorie restriction extends life notably in mice but not in humans is that the calorie restriction response evolved to enhance reproductive fitness in the face of seasonal famine, extending life to allow individuals to survive and reproduce once food was again plentiful. A season is a large fraction of a mouse life span, but not a large fraction of a human life span, and therefore only the mouse evolves to experience sizable increases in life span when calorie intake is low. This is far from the only evolutionary explanation for the calorie restriction response, however. Today’s open access paper is a review of the topic, providing an overview of present viewpoints.

Lifespan Extension Via Dietary Restriction: Time to Reconsider the Evolutionary Mechanisms?

Dietary restriction (DR), a moderate reduction in food intake whilst avoiding malnutrition, is the most consistent environmental manipulation known to extend lifespan and delay ageing. First described in rats, DR has since been shown to extend lifespan in a wide range of taxa: from model lab species such as Drosophila melanogaster and mice, to non-model species such as sticklebacks, crickets, and non-human primates. Owing to this taxonomic diversity, it is presumed that the underlying physiological mechanisms of DR are evolutionarily conserved, and thus DR has been widely used to study the causes and consequences of variation in lifespan and ageing. Despite this attention, both the evolutionary and physiological mechanisms underpinning DR responses remain poorly understood.

Since its inception, DR has become an all-encompassing description for multiple forms of dietary interventions. The most widely studied form of DR is calorie restriction (CR), a reduction in overall calorie intake whilst avoiding malnutrition. Common forms of CR include providing a restricted food portion, dilution of the diet, or restricting food availability temporally. Positive effects of CR on lifespan are well supported. Initial explorations of the role of specific dietary components, such as protein content, found that the effects were largely driven by caloric intake. Consequently, until recently DR and CR were largely interpreted as synonymous terms. Owing to this focus on CR, the predominant evolutionary explanations of the DR effect were developed to explain responses to CR rather than to macronutrient availability.

The Resource Reallocation Hypothesis

The most widely accepted evolutionary explanation of DR is a trade-off model based around the disposable soma theory of ageing. This theory suggests that a trade-off exists between reproduction and somatic maintenance (lifespan). The Resource Reallocation Hypothesis (RRH) proposes that during periods of famine (e.g., CR), natural selection should favor a switch in allocation, in which organisms reallocate energy almost exclusively to somatic maintenance rather than to reproduction. By investing heavily in somatic maintenance, organisms will improve their chances of surviving the period of famine, when it is likely that the cost of reproduction is high and offspring survival low, resulting in lower fitness returns. Once conditions improve, investment in reproduction can resume, and that should result in higher fitness. Critically, the reinvestment strategy described in the RRH will only lead to higher fitness if conditions improve. Owing to the trade-off, the RRH predicts that under DR conditions in the lab, there should be an increase in lifespan accompanied by a corresponding decrease in reproduction.

The Nutrient Recycling Hypothesis

Recently, the RRH has been critiqued on the grounds that adopting a pro-longevity investment strategy is unlikely to increase survival in the wild, where the main sources of mortality are extrinsic (e.g., predation, wounding, or infection). An alternative evolutionary explanation was proposed that we will term here the nutrient recycling hypothesis (NRH). As with the RRH, the NRH was proposed to explain an effect of CR, not the more recent suggestion of specific macronutrient effects. The NRH proposes that rather than sacrificing reproduction to increase longevity, organisms under DR attempt to maintain reproduction as much as possible in the face of reduced energy resources.

To achieve this, organisms upregulate the activity of cell recycling mechanisms such as autophagy and apoptosis. This allows better use, and even recycling, of the available energy, which can then be used to maintain reproductive function. The argument here is not that the level of reproduction achieved under DR is greater than or even matched to that of a fully fed individual, but rather that the loss of reproduction is minimized. An interesting suggestion of the NRH is that the pro-longevity effect of DR is an artefact of benign lab environments. The main sources of mortality in the laboratory are old-age pathologies such as cancer, which are ameliorated by upregulation of autophagy and apoptosis. However, in the wild, cancer and other old-age pathologies are a relatively minor source of mortality, so the protective effect of the DR response may not be observed.

The Toxic Protein Hypothesis

A more recent hypothesis to be put forward is the toxic protein hypothesis (TPH), which is a constraint-based model rather than an evolutionary theory. Unlike the theories already discussed, the TPH was put forward in light of renewed focus on the role of macronutrients in DR responses. The TPH argues that protein is essential for reproductive function, with increasing protein intake leading to higher reproductive rates. However, it is proposed that high consumption of protein has direct negative effects on late-life health and lifespan, through increased production of both toxic nitrogenous compounds from protein metabolism and mitochondrial reactive oxygen species.

Therefore, organisms face a constraint in the amount of protein they can consume, balancing high protein intake to maximize early life reproductive output whilst avoiding overconsumption, which may reduce lifespan and ultimately result in lower fitness. As with the other hypotheses, under the TPH there would be an optimal protein intake that maximizes lifetime reproductive success or fitness. However, the TPH argues that the DR response of increased lifespan is the result of protein restriction reducing the direct physiological costs of protein ingestion.