1. Are neural tube defects and anaemia widespread enough in India to warrant fortification?
India has one of the highest burdens of neural tube defects (NTDs) and anaemia in the world.
According to the 2016-2018 Comprehensive National Nutrition Survey (CNNS), 23% of pre-school-age children, 28% of school-age children, and 37% of adolescents in India are folate deficient. Folate deficiency among women of childbearing age increases the risk of their children being born with debilitating and often fatal neural tube defects (NTDs). The NTD prevalence in India is estimated at 30 per 10,000 live births (Blencowe et al 2018); anything over 9 per 10,000 is considered to warrant an intervention. To put this into context, the post-fortification birth prevalence of spina bifida and anencephaly (two of the most common forms of NTD) has been empirically established at about 0.5 per 1,000 live births, based on published reports from several countries with successful fortification programs (Castillo‐Lancellotti, Tur, & Uauy, 2013; Cortes, Mellado, Pardo, Villarroel, & Hertrampf, 2012; De Wals et al., 2008; Sayed, Bourne, Pattinson, Nixon, & Henderson, 2008; Williams et al., 2015). In other words, with adequate folate intake a country’s NTD prevalence should be around 0.5-0.6 per 1,000 live births. Additionally, using modelled data from a population-based community trial, Crider et al. (2014) reported that optimal achievable prevention of spina bifida and anencephaly through folic acid interventions should yield a lowest prevalence of approximately 0.5-0.6 per 1,000 births. This underscores the high rate found in India today and the need for a population-wide intervention that delivers folic acid.
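Because the figures above mix units (per 10,000 vs. per 1,000 live births), a quick conversion makes the comparison explicit. The sketch below uses only numbers already cited in the text; it introduces no new estimates.

```python
# Convert the NTD prevalence figures cited above to a common unit
# (per 1,000 live births) so they can be compared directly.

def per_1k(per_10k: float) -> float:
    """Convert a rate per 10,000 live births to a rate per 1,000."""
    return per_10k / 10

india_ntd = per_1k(30)   # Blencowe et al 2018: 30 per 10,000 live births
threshold = per_1k(9)    # rates above 9 per 10,000 warrant intervention
achievable_floor = 0.5   # ~0.5-0.6 per 1,000 with adequate folate (Crider et al 2014)

print(f"India: {india_ntd} per 1,000 live births")       # 3.0
print(f"Intervention threshold: {threshold} per 1,000")  # 0.9
# India's rate is over 3x the intervention threshold and roughly
# 5-6x the floor achievable with adequate folic acid intake.
```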
According to India’s 2015-2016 National Family Health Survey (NFHS-4), more than half of women and children in India are anaemic. The survey reports that 58.6% of children 6-59 months, 53.2% of non-pregnant women, 50.4% of pregnant women, and 22.7% of men 15-49 years of age are anaemic. The 2016-2018 CNNS states that 41% of pre-school-age children, 24% of school-age children (5-9 years), and 28% of adolescents (10-19 years) are anaemic. Note that NFHS-4 and the CNNS use different measurement methods, so their anaemia estimates are not directly comparable.
In terms of iron deficiency anaemia specifically (determined by measuring serum ferritin; this is the form of anaemia caused by a lack of iron in the diet, as opposed to other causes such as infection, hookworm, malaria, or haemoglobinopathies), modelling results have shown that the median risk of dietary iron deficiency in India is 65% (Ghosh et al 2019). It should be noted that there may be inter-state variation in these figures: in Punjab, for example, iron deficiency among adolescents is 45.3%, versus 9.4% in Mizoram. India is unusual in that iron deficiency is measured separately from anaemia. This is a benefit, as it provides evidence that a lack of iron in the diet, specifically, is a widespread concern in the country and that a population-wide intervention delivering dietary iron is warranted. The CNNS also reports rates of iron deficiency, indicating that 32% of pre-school-age children, 17% of school-age children, and 22% of adolescents are iron deficient (adolescent females had a higher rate (31%) than adolescent males (12%)).
2. Is there any evidence that fortification of wheat flour leads to health improvements?
There is ample evidence, both globally and within India, that fortified wheat flour leads to health improvements. In India, a randomised controlled trial in Karnataka and Maharashtra (Muthayya et al 2012) found that fortifying wheat flour with iron from NaFeEDTA for seven months reduced iron deficiency anaemia (IDA) from 62% to 21% in school-age children.
The most recent systematic review (Imhoff-Kunsch et al 2019) found that iron fortification can, on average, reduce anaemia by 34% if the anaemia is due to a lack of iron in the diet.
Fortification with folic acid has repeatedly been shown to lead to significant reductions in NTDs. This is illustrated in the graph, which shows that at least 12 countries have documented a lower number of neural tube defects after initiation of mandatory flour fortification with folic acid, using pre-post cross-sectional surveys (Food Fortification Initiative 2018). Pre-fortification values are illustrated in orange and post-fortification values in green.
Across the horizontal axis are the 36 studies completed in 13 countries: Argentina, Australia, Brazil, Canada, Chile, Costa Rica, Iran, Jordan, Oman, Peru, Saudi Arabia, South Africa, and USA. With the exception of two studies (Ricks in Peru and Alasfoor in Oman), there was a reduction in neural tube defects observed after fortification with folic acid.
Further summarised evidence on the impact of fortification for both anaemia and NTDs can be found here:
There are also economic implications to fortification:
Fortification has been found to yield US$84 for every US$1 spent on reducing iron deficiency anaemia prevalence (Hunt 2002).
Specifically in India, the Food Fortification Resource Center (FFRC) has found that it costs 10 paisa to fortify 1 kg of atta flour.
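To illustrate what the FFRC figure implies at the individual level, here is a back-of-the-envelope calculation. The cost per kg is from the FFRC figure above; the daily consumption figure is a hypothetical assumption for illustration, not a figure from the text.

```python
# Rough annual fortification cost per person, based on the FFRC figure
# of 10 paisa (Rs 0.10) per kg of atta.

COST_PER_KG_RS = 0.10   # FFRC: 10 paisa to fortify 1 kg of atta
daily_flour_kg = 0.2    # assumed ~200 g of atta per person per day (hypothetical)

annual_cost_rs = COST_PER_KG_RS * daily_flour_kg * 365
print(f"~Rs {annual_cost_rs:.2f} per person per year")  # ~Rs 7.30
```

Even doubling the assumed consumption keeps the cost in the range of a few rupees per person per year, which is consistent with the favourable benefit:cost ratios cited above.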
The following countries have compared the cost of fortifying with folic acid and the healthcare savings from preventing brain and spine birth defects (NTDs):
3. If people are already consuming other fortified staple foods (salt, rice and edible oil), why do we need to fortify wheat flour?
There are two reasons why adding wheat flour to the list of foods fortified in India is important: 1) the array of micronutrients included, and 2) reach.
In India, fortified salt has iodine added to it; edible oil has vitamin A and D added to it; fortified milk has vitamin A and D added to it; and fortified rice (similar to fortified wheat flour in India) has vitamin B12, iron, and folic acid added to it.
One might be concerned about multiple vehicles being fortified with the same nutrient(s). However, by targeting one or two main staple foods with similar micronutrients, the program increases the number of individuals reached with those micronutrients, because purchasing and consumption patterns differ (e.g. those consuming more rice than wheat flour, and vice versa, will still receive the key nutrients). When national standards are created, this potential overlap is taken into consideration so that the correct percentage of the Estimated Average Requirement (EAR) is provided to individuals who may consume multiple foods fortified with the same nutrient(s). For example, the level of vitamin A added to edible oil in a national standard will be set differently depending on whether edible oil is the only food vehicle fortified with vitamin A or one of several. The advantage of targeting multiple food vehicles with the same nutrients is that greater numbers of people can be reached, and those consuming multiple fortified vehicles come closer to reaching the Recommended Daily Allowance (RDA) levels, which cannot be provided through one fortified food alone.
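The overlap accounting described above can be sketched as a simple sum: for a consumer of every fortified vehicle, the nutrient delivered by each is added up, and each vehicle's level is set so the combined total stays in the intended range. All values below are hypothetical illustrations, not actual Indian standards.

```python
# Sketch: account for overlap when two vehicles carry the same nutrient.
# The EAR and per-serving contributions are hypothetical examples.

EAR_FOLATE_UG = 250.0  # example EAR for folate, micrograms/day

# Assumed folic acid delivered per typical daily intake of each vehicle:
vehicles = {"wheat flour": 120.0, "rice": 100.0}

total = sum(vehicles.values())
print(f"Combined: {total:.0f} ug/day = {total / EAR_FOLATE_UG:.0%} of EAR")
# If only one vehicle were fortified, its level could be set higher;
# with two, each level is set lower so the combined total stays appropriate.
```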
Finally, it is important to understand that all micronutrients are connected, meaning that a deficiency in one nutrient may lead to a deficiency in another nutrient. For example, vitamin A is needed to allow stored iron to be released from the liver. It is for this reason that a variety of fortified foods with different nutrients added to them is important for an effective national fortification program.
4. Why not focus on iron supplementation and dietary diversification campaigns instead of fortification?
A myriad of interventions may be required to tackle high rates of iron deficiency anaemia in a country, including supplementation, fortification, and dietary diversification. There is generally no single solution that will effectively address this public health issue alone; a combined and complementary approach is therefore often required. Each approach comes with its own strengths and weaknesses, in some cases targeting different segments of the population. By employing a multi-pronged approach, greater reach amongst the most vulnerable can be achieved. That said, a country that employs multiple strategies to address micronutrient malnutrition should carefully consider who is being provided with what kinds and amounts of nutrients, to ensure that all interventions are really needed. In some cases, it may be determined that only one or two are required to meet the nutritional needs of a population, and national strategies should be adjusted accordingly.
Iron supplementation programs, which are considered short-term, targeted approaches to addressing iron deficiency, ensure that the most vulnerable and hard-to-reach are provided with necessary nutritional supplementation, and can be directed at specific recipients. The two biggest challenges with iron supplementation programs, however, are irregular supply of good-quality tablets and compliance by recipients (Malhotra et al 2015; Mora 2002): iron tablets are not always taken as directed by those receiving them, owing to side effects that are sometimes experienced (e.g. gastrointestinal problems). These programs may also be costly to implement and require sustained outside funding.
Dietary diversification, a longer-term approach, may be the ideal way to improve a population’s diet. However, the time required to make such a change, and the contextual factors that affect it (sociopolitical, economic, cultural, and behavioural, to name a few), mean that dietary diversification likely needs to take place alongside approaches that can more quickly ensure improved nutrient intake. To date, there is limited data on the long-term impact of such interventions. Short- or medium-term approaches should therefore be started alongside longer-term dietary diversification efforts.
India’s 2016-2018 CNNS reports findings on minimum dietary diversity, meal frequency, and acceptable diet among children 6-23 months, 2-4 years, 5-9 years, and 10-19 years. Findings indicate that 42% of children 6-23 months of age were fed the minimum number of times per day for their age, 21% received adequate dietary diversity, 6.4% received a minimum acceptable diet, and 8.6% consumed iron-rich foods. Among school-age children and adolescents, more than 85% consumed dark green leafy vegetables and pulses or beans at least once per week, one-third consumed eggs, fish, or meat at least once per week, and 60% consumed milk or curd at least once per week.
Food fortification, considered a medium-term approach, has been declared the most cost-effective, safest (Hurrell 2010), and most practical approach that exists today for increasing iron intake on a widespread and sustained basis (Gera 2012). Its benefits include low cost and a favourable cost:benefit ratio, implementation through the private sector, limited behaviour change required, and the fact that fortified foods, consumed on a regular and frequent basis, maintain body stores of nutrients more efficiently and effectively than occasional, high-dose supplementation (Rowe et al 2014). It is this particular gap in the delivery of key micronutrients in India that Fortify Health is aiming to address.
However, depending on the type of food being fortified, reach may not be universal (e.g. those who do not consume centrally processed flour may not be reached); implementation in conjunction with other interventions may therefore be warranted.
5. Does fortification cause any adverse health effects, or effects on malaria, malaria treatments, and thalassemia?
There have been no adverse health effects reported from a fortification program globally. This is largely due to the level of nutrients provided through fortification programs: the levels are intended to be close to the natural levels of nutrients found in foods, and to allow the body to maintain nutrient stores more efficiently and effectively than may be the case with high-dose supplementation. The levels of vitamins and minerals added to fortified foods do not come close to the designated Tolerable Upper Intake Levels (UL) established for specific vitamins and minerals. Even when multiple food vehicles are fortified with the same nutrient(s), the amount provided through the multiple vehicles is taken into consideration in the drafting of national fortification standards (see question 8 on how fortification standards are created).
Iron and malaria
In terms of malaria, the connection between iron deficiency and susceptibility to infection has been a controversial topic, and the subject has been studied extensively since the original 2003 Pemba study observations. In summary, the concluding evidence suggests a strong net health benefit from food fortification, and the WHO recommends that food fortified with iron in malaria-endemic regions be offered in concert with malaria control efforts (Kuehn 2013; Zlotkin et al 2013; WHO 2014), which include bed net distribution, prophylaxis, and education.
The connection between iron deficiency and susceptibility to infection exists because pathogens feed off iron in the body. The body’s response during an infection is to decrease serum iron levels (iron in the blood) while increasing levels of ferritin (the iron storage protein) in order to sequester iron from pathogens. Numerous studies have been conducted on iron supplementation (taking iron in the form of drops, syrups, or tablets) in malaria-endemic areas, with varying outcomes; many indicate that taking regular iron supplements enhances the survival of infectious agents by giving them more iron to feed off. In general, if a person is to take a daily supplement, lower doses are recommended in conjunction with malaria treatment. The thought is that large doses of iron in the form of drops, syrups, or tablets, such as those delivered through supplementation, produce freely circulating iron that parasites can use to increase infection.
There are significant differences between supplementation and food fortification in this context, largely around the quantity of nutrient provided to an individual and how it is absorbed in the body. Supplementation doses are significantly higher than fortification doses (Bhutta 2017). A supplement can provide the full recommended amount (or more) of a vitamin or mineral to a specific target population. For example, women of reproductive age (WRA) are provided with iron supplements containing 60-120 mg/day of iron, whereas the Recommended Nutrient Intake (RNI) for WRA is 15 mg/day. Fortification provides a fraction of the RNI; levels can range from 5% to 40% of the RNI.
Since food fortified with iron (and folic acid) contains significantly lower doses than supplementation, is consumed in smaller amounts throughout the day, and is absorbed more slowly, similar concerns about iron fortification in malaria-endemic areas have not been identified, according to the World Health Organization (WHO) and those who have studied the issue.
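The dose gap described above can be made concrete with a rough calculation. The RNI and supplement doses are from the text; the fortification level and daily flour intake are hypothetical assumptions for illustration.

```python
# Rough comparison of daily iron delivered by supplementation vs
# fortification, as a fraction of the RNI for women of reproductive age.

RNI_WRA_MG = 15.0            # RNI for WRA, mg/day (from the text)
supplement_dose_mg = 60.0    # lower end of the 60-120 mg/day supplement range

iron_per_kg_flour_mg = 20.0  # assumed fortification level (hypothetical)
daily_flour_kg = 0.2         # assumed daily atta intake (hypothetical)

fortification_dose_mg = iron_per_kg_flour_mg * daily_flour_kg
print(f"Supplement:    {supplement_dose_mg / RNI_WRA_MG:.0%} of RNI")     # 400%
print(f"Fortification: {fortification_dose_mg / RNI_WRA_MG:.0%} of RNI")  # 27%
# Under these assumptions fortification supplies ~27% of the RNI, within
# the 5-40% range cited above, versus 400%+ from the lowest supplement dose.
```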
Folic acid and malaria treatment
Concerns have also been raised over the interaction of folic acid and anti-malaria drugs. Evidence from randomised controlled trials has shown that high doses of folic acid (≥5 mg per day) interact with the antifolate antimalarial drug sulfadoxine-pyrimethamine (SP), resulting in increased treatment failure rates (Carter et al 2005; Mulenga et al 2006; Ouma et al 2006; Sazawal et al 2006; van Hensbroek et al 1995). In light of this, 2012 recommendations from the WHO’s Global Malaria Programme advise against simultaneous use of SP for Intermittent Preventive Treatment (IPT) and folic acid supplements at levels >5 mg per day in pregnant women and children (World Health Organization, Intermittent Preventive Treatment of malaria in pregnancy using Sulfadoxine-Pyrimethamine (IPTp-SP), 10-1-2012). Since 2012, WHO has therefore recommended that only low-dose folic acid (<5 mg/day) be administered in conjunction with IPTp-SP. Fortification programs typically provide only 0.1-0.4 mg of folic acid per day, and although we are not aware of any study that looks specifically at folic acid fortification programs and antimalarial drug interactions, these programs are still encouraged regardless of IPT use because of their low-dose provisions.
Iron and thalassemia
Thalassemias are inherited blood disorders characterised by defective production of haemoglobin. According to Harvard University’s Joint Center for Sickle Cell and Thalassemia Disorders, thalassemia minor features mild anaemia with a slight lowering of the haemoglobin level in the blood, a situation that can closely resemble mild iron-deficiency anaemia. No treatment is necessary for thalassemia minor; it does not raise the risk of iron overload, and patients are not at any greater risk of complications from iron in the diet.
Thalassemia major is a serious and life-threatening anaemia. Treatment includes blood transfusion every 4-6 weeks. Those suffering from thalassemia major are not thought to be at greater risk of iron overload from fortified flour, assuming treatment is sought, for a number of reasons:
India’s fortification logo is accompanied by the statement: “People with thalassemia may take under medical supervision” since those with thalassemia major should be under treatment for the condition and should inform their healthcare provider about any additional iron being consumed.
More on iron safety can be found in FFI’s Safety of Flour Fortification with Iron brief.
6. Since flour from centralised mills is typically purchased by people from higher socioeconomic backgrounds (with presumably adequate nutrition), how does fortified flour from these mills have an impact?
Centralised mills are larger, more mechanised mills, as opposed to smaller, unmechanised ones. Those who purchase and consume flour from centralised mills are not, as one might think, protected from vitamin and mineral deficiencies. Although these individuals may in some cases be higher income earners and/or come from urban areas (note that urban areas include all economic strata, from the poorest to the wealthiest, particularly when considering the existence of urban slums), they often face similar nutritional challenges.
According to the 2016-2018 CNNS, children and adolescents residing in urban areas actually had a higher (7%-12%) prevalence of iron deficiency compared to their rural counterparts. Findings also indicated a higher prevalence of iron deficiency among wealthier households across all three age groups. Iron deficiency prevalence in the lowest vs. highest wealth quintiles was as follows: 20% vs. 43% among pre-school children, 12% vs. 27% among school-age children, and 15% vs. 27% among adolescents. These findings point to the needs of those who, generally speaking, purchase from centralised mills.
Also worth considering is the number of people reached by centralised mills. Although only 27% of the population of Maharashtra consumes flour from centralised mills (unpublished FFI supply chain analysis, 2018), that equates to over 30 million people, potentially at risk of iron deficiency anaemia, who can be reached with fortified flour.
7. Why does Fortify Health use iron from NaFeEDTA and not other sources?
The flour being fortified in India through the efforts of Fortify Health is atta flour, a high-extraction flour. This means the flour is minimally processed, leaving in the end product a high level of bran and germ, which contain important minerals. But this minimally processed flour also contains a high level of phytates (phytic acid), which inhibit the absorption of nutrients when consumed. Since iron added through fortification would also be inhibited by these phytates, an enhanced form of iron called NaFeEDTA is used to address the problem of phytic acid limiting the bioavailability of added iron. The iron in NaFeEDTA is chelated, which prevents it from binding to phytic acid and thereby makes it more easily absorbed. Once in the human gut, the iron is released from the EDTA, allowing it to be absorbed by the body (The Micronutrient Initiative’s Food Fortification Handbook: Vitamin and Mineral Fortification of Wheat Flour and Maize Meal, 2014).
8. How are food fortification standards created?
Generally speaking, there are four things taken into consideration when fortification standards are created:
Fortification standards should only be drafted and adopted in a country if there is a demonstrated need for the nutrients being provided at a population level and there is a food vehicle(s) that can effectively be fortified (this includes foods that are consumed by a large segment of the population most of the year and foods that can be fortified without substantial cost increase or organoleptic changes).
Once this is determined, very careful consideration is given to how much of the food vehicle (or vehicles, if more than one) is consumed by the population on a daily basis, including by men, women, and children. The goal is to provide enough of the nutrient(s) to ensure a nutritional impact amongst those consuming the least, while preventing those who consume the most from receiving more than is required. This is often done through modelling exercises using the Food Fortification Formulator, or by referencing the WHO Wheat and Maize Flour Fortification Guidelines. The process should also take into account any other vehicles being fortified throughout the country and the quantity of nutrients they provide, in addition to other ongoing micronutrient interventions such as iron, folic acid, or vitamin A supplementation, or biofortification programs. If flour is being fortified, the extraction level of the flour determines the type of iron fortificant that should be used in order to ensure optimal bioavailability of the added iron. Finally, cost should be taken into consideration, ensuring that any cost increase in the end product is no more than usual market price fluctuations (generally no more than 2% of the current market price).
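The modelling logic described above can be sketched as follows: pick a fortification level such that low consumers of the vehicle still receive a meaningful share of the requirement while high consumers stay under the UL. All numbers below are hypothetical illustrations, not actual Indian standards.

```python
# Sketch of standard-setting logic: check a candidate fortification level
# against the lowest and highest consumers of the food vehicle.
# EAR, UL, consumption figures, and the candidate level are hypothetical.

EAR_MG = 8.0   # example Estimated Average Requirement for iron, mg/day
UL_MG = 45.0   # example Tolerable Upper Intake Level for iron, mg/day

def added_intake_mg(level_mg_per_kg: float, flour_kg_per_day: float) -> float:
    """Iron delivered by fortified flour at a given fortification level."""
    return level_mg_per_kg * flour_kg_per_day

# Assumed range of daily flour consumption across the population (kg/day):
low_consumer, high_consumer = 0.075, 0.400

level = 20.0  # candidate fortification level, mg iron per kg flour
low = added_intake_mg(level, low_consumer)    # 1.5 mg/day
high = added_intake_mg(level, high_consumer)  # 8.0 mg/day

print(f"Low consumer:  {low:.1f} mg/day ({low / EAR_MG:.0%} of EAR)")
print(f"High consumer: {high:.1f} mg/day (under UL: {high < UL_MG})")
```

In practice tools like the Food Fortification Formulator run this kind of check across full consumption distributions and all fortified vehicles at once, rather than two representative consumers.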