Over the last century, food fortification has been one of the great public health successes in nutrition, dramatically reducing the risk of diseases like pellagra, rickets, and goiter. But as Dr. Christine Taylor discussed, there has been a gradual paradigm shift in how we think about fortification. These changes were discussed in the session “Fortification and Health: Opportunities and Challenges,” sponsored by ILSI North America on Saturday, April 26 at Experimental Biology in San Diego. The session was chaired by Dr. Johanna Dwyer and Kathryn Wiemer.
EDIT: See videos of this session here.
The session began with Dr. David Allison giving a touching tribute to the late Dr. John Milner, who was originally a chair of the session. It was noted that Milner was a promoter of public-private partnerships, and “in the spirit of John Milner” was a frequent phrase. His impact on nutrition science and on many of the speakers was clearly boundless. Read more about his prolific career here.
The first speaker was Dr. Janet King, who discussed where fortification has been successful and where it has limitations. She first emphasized the distinction between mandatory fortification/enrichment, where specific nutrients are added as part of a specific health program (by the US definition), and discretionary fortification, which is done at the choice of a food manufacturer. Iodine in salt; vitamin D in dairy products; thiamin, niacin, riboflavin, and iron in flour; fluoride in water; and folic acid in grains are a few examples that have reduced the incidence of disease and the proportion of the population consuming less than the Estimated Average Requirement (EAR), as Fulgoni and colleagues (2011) have shown. Dr. King also highlighted some limitations of fortification: programs have historically been based on insufficient dietary intakes rather than health or clinical outcomes; staple foods vary across the US; many people are increasingly avoiding milk and dairy; and the bioavailability of fortified nutrients in tissues is often unknown. For example, according to one study, increasing zinc by supplementation increases plasma zinc concentration, but increasing it through food does not.
In the 1970s and ’80s, there was a shift toward thinking of fortification as more of a “balancing act,” according to Dr. Christine Taylor. The National Academy of Sciences recommended iron fortification, but there were strong objections over toxicity concerns, and the FDA thereafter concluded that fortification must be based on clinical measures rather than dietary intake. The decision on whether to fortify grains with folate in the 1990s exemplifies the complex issues that must be considered: animal data suggested that folate can exacerbate vitamin B12 deficiency, and a simulation study showed that fortification would raise folate intake most in those who already had high intakes. A key theme throughout the program was that reliance on dietary intake data can be problematic. This message was really driven home by Dr. Taylor, who emphasized that intake data overestimate inadequacy and that we need more valid biochemical measures. Fortification, she argued, should be a two-pronged approach: modest fortification, followed by targeted fortification where needed. Taylor speculated on the next US fortification experience, perhaps vitamin D. She highlighted a paper just published by her group suggesting that we are underestimating vitamin D intake by about 15-30% because food composition tables currently omit the 25(OH)D found in animal products, a perfect example of why biomarkers are needed.
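The vitamin D point can be made concrete with a back-of-envelope sketch. The potency factor for 25(OH)D relative to native vitamin D, and the intake numbers used here, are assumptions for illustration only (published potency estimates vary), not figures from Taylor's paper:

```python
# Illustrative only: the potency factor and intake values are assumptions,
# not numbers from Taylor's group.

def total_vitamin_d(native_d_mcg, hydroxy_d_mcg, potency_factor=5.0):
    """Vitamin D activity (mcg/day) counting both native vitamin D and 25(OH)D.

    potency_factor is an assumed bioefficacy multiplier for 25(OH)D;
    the literature debates its exact value.
    """
    return native_d_mcg + potency_factor * hydroxy_d_mcg

# A hypothetical day's diet: 4 mcg native vitamin D, plus 0.2 mcg of 25(OH)D
# from animal products that standard food composition tables would miss.
tabulated = 4.0                     # what the tables would report
actual = total_vitamin_d(4.0, 0.2)  # 4.0 + 5.0 * 0.2 = 5.0

underestimate = 1 - tabulated / actual
print(f"Intake underestimated by {underestimate:.0%}")  # 20%
```

With these made-up numbers the tables undercount intake by 20%, squarely within the 15-30% range the paper reports for the population.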
Dr. Omar Dary was next with a more global perspective on fortification. He stressed that the reach and coverage of fortification depend on the consumption pattern of the fortification vehicle. For example, iron added to wheat flour in Jordan improves status in children but not women; vitamin A is also added, but with the opposite pattern. In Chile, folic acid added to wheat flour reduces neural tube defects, but also increases the risk of excessive intake for much of the population. Multiple food vehicles would therefore be necessary in these countries to target different segments of the population. In Guatemala, by contrast, sugar is fortified with vitamin A, and because all segments of the population consume sugar, the vitamin does not need to be added to anything else. Where detailed national dietary intake data are not available, Dary and others propose a model based on “fortifiable food energy”: an optimal combination of food vehicles that increases nutrient intakes without unnecessary risk of excess.
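Dary's core point, that the nutrient a person actually gains depends on how much of each fortification vehicle they eat, can be sketched with a toy calculation. All vehicles, consumption amounts, and fortification levels below are invented for illustration:

```python
# Toy illustration of vehicle-dependent reach: added nutrient intake depends
# on how much of each fortified food a person consumes. All numbers invented.

def added_intake(consumption_g, levels_mg_per_kg):
    """Total added nutrient (mg/day) across all fortified vehicles consumed."""
    return sum(
        consumption_g.get(vehicle, 0) / 1000 * level
        for vehicle, level in levels_mg_per_kg.items()
    )

# Hypothetical program: iron added to two vehicles at different levels.
levels = {"wheat flour": 30.0, "sugar": 10.0}   # mg iron per kg of vehicle

child = {"wheat flour": 100, "sugar": 50}        # g/day consumed
adult_woman = {"wheat flour": 250, "sugar": 80}  # eats more of both vehicles

print(round(added_intake(child, levels), 2))        # 3.5 mg/day
print(round(added_intake(adult_woman, levels), 2))  # 8.3 mg/day
```

The same program delivers very different amounts to different people, which is why a single vehicle can close the gap for one subgroup while leaving another short, or pushing it into excess.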
Dr. Regan Bailey discussed some of her NHANES research on children, looking at how fortification alters the proportion below micronutrient EARs and above ULs. She noted that not all nutrients have a large gap between the RDA and the UL, so fortification can push some above the UL. For example, 0% of children were above the ULs for folate and niacin without enrichment/fortification, but including these raised both to 4%. Dietary supplements further increased these to 15% and 16% of the children, respectively. Bailey emphasized that we need better analytically derived data for foods and supplements, and ideally we need to incorporate diet, supplements, and biomarkers to gain an accurate picture of the contributions of fortification and supplementation to nutrient status in the population.
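The kind of analysis Bailey described, counting what fraction of a population falls below the EAR or above the UL as fortification and supplements shift the intake distribution, can be sketched as follows. The intake values and cutoffs are hypothetical, not NHANES data or official reference values:

```python
# Sketch of the cut-point approach behind analyses like Bailey's: count the
# fraction of a population below the EAR or above the UL as fortification
# and supplements shift intakes. All numbers are hypothetical.

def proportion_below(intakes, cutoff):
    """Fraction of individuals with usual intake below a cutoff (e.g., the EAR)."""
    return sum(1 for x in intakes if x < cutoff) / len(intakes)

def proportion_above(intakes, cutoff):
    """Fraction of individuals with usual intake above a cutoff (e.g., the UL)."""
    return sum(1 for x in intakes if x > cutoff) / len(intakes)

# Hypothetical usual folate intakes (mcg/day) for eight children.
base = [180, 220, 250, 300, 340, 400, 460, 520]
fortified = [x + 150 for x in base]  # fortification shifts everyone up
with_supps = fortified[:-2] + [x + 400 for x in fortified[-2:]]  # top consumers add supplements

EAR, UL = 250, 800  # illustrative cutoffs only

print(proportion_below(base, EAR))       # 0.25: a quarter inadequate pre-fortification
print(proportion_below(fortified, EAR))  # 0.0: fortification closes the gap
print(proportion_above(with_supps, UL))  # 0.25: supplements push some past the UL
```

The toy numbers reproduce the pattern Bailey reported: fortification narrows inadequacy, but because supplement users tend to already be high consumers, adding supplements on top pushes part of the distribution over the UL.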
Dr. Valerie Tarasuk covered discretionary fortified foods, the definition of which varies somewhat between countries: in Canada it means any fortification that is not mandatory, while in the US it is fortification outside a planned health program. She showed how discretionary fortification is being used in marketing (certain energy drinks, for instance), with some products containing up to “many hundreds of times” the DRI. On some cereals, the % Daily Value is used to market the product on the front of the box. Tarasuk’s lab has found that discretionary fortified foods do push some nutrient intakes above the UL in some subpopulations. She emphasized that there is no evidence that exceeding usual intakes confers health benefits, and cautioned that research on using nutrient supplements to prevent chronic disease keeps coming back negative. We need to monitor discretionary fortification by including it in food composition tables and by designing surveys that enable subgroup analyses, she said. Alongside these, tracking systems for adverse effects are needed, and we need to study the potential health consequences of chronically high intakes of fortificants and supplements.
Dr. Martin Philbert gave a talk that departed from the other topics, exploring fortification by nano-delivery. Nano-packages can encapsulate nutrients, and altering their properties can, if desired, target delivery to specific tissues. However, absorption, excretion, metabolism, and toxicity can all change with particle size, so much research remains to be done in these areas. For a technical overview of nanotechnology applications in nutrition, see this review.
Finally, Dr. Carl Keen took the audience through a hypothetical scenario in considering a new avenue of fortification: flavanols. There is a large body of evidence suggesting that flavonoid intake is associated with reduced coronary heart disease mortality. Intervention trials are relatively consistent, but epidemiological studies are not as clean, said Dr. Keen. This may be because interventions are traditionally done at the upper 10% of intake, so fortification may be necessary to see effects at the population level. Reliable physiological markers must be established: status biomarkers are impractical because metabolites of flavanols consumed in the morning can be cleared by evening. Flavanols show antioxidant activity in vitro but possibly not in vivo, and their effect on blood pressure is inconsistent. Platelet markers and vascular function as measured by flow-mediated dilation may work, as these show consistent short-term changes. Potential negative effects would also need to be considered: anti-nutritional effects, thyroid toxicity, genotoxicity, and so on. Research would be needed to define the amount of flavanols required to achieve “optimal” status, examine whether acute effects persist, explore mechanisms of action, and determine safety thresholds.
How we think about fortification has changed since nearly a century ago, when the US first fortified salt with iodine. Many challenges remain, especially outside the US, in targeting specific populations with low nutrient intakes. Accurate monitoring of nutrient intakes and health outcomes is needed now more than ever as discretionary fortification is used for food marketing. The broad perspectives covered in this session left a lot to think about, and no doubt fostered conversation as John Milner would have intended.