A landmark Lancet study (Friesen et al., 2026) has crystallised what the nutrition community has long argued: large-scale food fortification works, it is extraordinarily cost-effective, and the case for scaling it globally has never been stronger. The question that must now be asked is how to design fortification interventions that can actually scale.

The evidence is in. Current fortification efforts are already preventing the equivalent of 7 billion inadequate person-years in nutrient intake, yet 38.6 billion inadequate person-years persist, at a mean cost of just $0.18 per person per year across five major food staples. The opportunity to prevent a further 24.7 billion inadequacies through better standards, compliance, and coverage is within reach. Micronutrient inadequacies remain stubbornly high and are increasing across a wide range of nutrients. Salt iodization stands as proof that coordinated global action can work. But it also stands somewhat alone.

The gap between what fortification can achieve and what it currently achieves is not primarily an evidence problem; it is an implementation problem. Closing it requires designing interventions that are capable of scale, namely: good enough, big enough, simple enough, and cheap enough for the specific doers at scale and payers at scale in each context (leveraging the Mulago Foundation Framework for scale). The sections that follow unpack what this means in practice.

Fortify Health is an Indian non-profit currently reaching more than 20 million beneficiaries per month through a combination of direct engagement with private sector wheat flour millers and, increasingly, government welfare programs. We offer our experience here as a case study, not as a template, but we believe the lessons generalise.

Doer at Scale and Payer at Scale

Effective implementation requires working backwards from two foundational questions: who is actually going to do this at scale, and who is going to pay for it at scale?
Critically, the incentive mechanisms and stakeholder dynamics for implementation and financing are fundamentally different depending on the answer. Large-scale food fortification cannot be treated as a single type of intervention. It must be designed bespoke for the context in which it will operate, whether that is governments or businesses. The answer to these questions should drive every design decision that follows.

Government as doer at scale and payer at scale

In countries where a significant share of the micronutrient-deficient population receives food through existing welfare programs and ration schemes, the government is the natural doer at scale and payer at scale. India is the primary example.

Business as doer at scale and consumer as payer at scale

In many countries, the government does not play this role for a significant share of the population. In these contexts, businesses are the doers at scale, consumers are the payers at scale, and the incentive structure is entirely different. This requires designing programs that are genuinely implementable by disparate private-sector, ground-level actors: wheat flour and rice millers, milk cooperatives, salt producers, and oil producers. Many of these businesses are relatively unsophisticated, with frontline mill staff who may have limited formal education. The cost of fortification must ultimately be transferable to consumers through the price of the commodity.

Organisations such as Millers for Nutrition, through partners including TechnoServe, have been thinking innovatively about how to engage businesses in this track and design implementable interventions that could truly scale. Yet more work remains to be done in translating that thinking into programs that can operate without ongoing external support. This matters because the Lancet study's $0.18 per person per year figure reflects a modelled average.
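To make the modelled-versus-landed distinction concrete, here is a minimal back-of-envelope sketch. Every input except the $0.18 Lancet average is an illustrative assumption, not a Fortify Health or Lancet figure:

```python
# Back-of-envelope: why a modelled $0.18/person/year average can diverge
# from the cost actually landed on an individual private-sector mill.
# Every input below except the $0.18 Lancet average is an ILLUSTRATIVE assumption.

LANCET_MODELLED_AVG = 0.18           # USD per person per year (Lancet estimate)

premix_cost_per_tonne_flour = 2.50   # USD of premix dosed into one tonne of flour (assumed)
qa_overhead_per_tonne = 0.75         # USD for testing, training, equipment amortisation (assumed)
flour_kg_per_person_year = 60        # kg of fortifiable flour consumed per person (assumed)

landed_cost_per_tonne = premix_cost_per_tonne_flour + qa_overhead_per_tonne
landed_cost_per_person_year = landed_cost_per_tonne * flour_kg_per_person_year / 1000
print(f"Landed: ${landed_cost_per_person_year:.3f}/person/year vs modelled ${LANCET_MODELLED_AVG:.2f}")

# The same increment from the mill's side: a 'small' absolute cost can still
# be a material share of a thin gross margin if it cannot be passed on.
flour_price_per_kg = 0.40            # USD wholesale-equivalent price per kg (assumed)
gross_margin_fraction = 0.08         # 8% gross margin on a commodity staple (assumed)
increment_share_of_margin = (landed_cost_per_tonne / 1000) / (flour_price_per_kg * gross_margin_fraction)
print(f"Fortification consumes {increment_share_of_margin:.0%} of per-kg gross margin if absorbed")
```

Under these assumed inputs the landed cost already overshoots the modelled average, and absorbing it would consume roughly a tenth of the mill's per-kilogram margin, which is why the transferability of cost to consumers, noted above, matters so much.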
The landed cost of fortification, once distributed through large numbers of private sector actors, is often materially higher. Private sector commodity businesses operate with very low risk appetite. Meaningful behaviour change typically requires sufficient consumer awareness and demand to justify even modest price adjustments. And even where those price increases appear small as a percentage of gross margin, they can carry significant stickiness in price-sensitive markets.

Fortify Health: Operating Across Both Tracks

Fortify Health operates across both tracks in India. More than 400 million individuals receive a wheat-containing food product daily through government programs, making the Mid-Day Meal Scheme, the Public Distribution System, and Anganwadi systems critical channels. However, as an organisation, we made a deliberate strategic choice to prioritise private sector engagement first. We currently work with close to 300 large industrial wheat flour millers, reaching more than 20 million beneficiaries per month. The rationale was straightforward: private sector engagement builds the infrastructure, operational knowledge, and refined intervention design needed to support government programs at true scale.

Effective government support requires ground-level implementation experience. High-level technical assistance is often insufficient because the most consequential gaps only become visible when operating directly at mill level. India's recent temporary discontinuation of its rice fortification program illustrates precisely this point. The pause reflected significant challenges in thinking through logistics, supply chain management, and the development of real-time quality testing to ensure that micronutrients were reaching beneficiaries in appropriate quantities to actually drive nutritional impact. These are not abstract design questions. They are operational realities that only become fully visible through direct implementation experience.
Simple Enough

Large-scale food fortification is, on its surface, a relatively simple intervention. Adding micronutrients to a food staple is often as straightforward as mixing liquid into liquid, blending fortified rice kernels into existing stock, or homogenising a powder into flour. But as with many things, what appears simple on the surface is considerably more complex to master and sustain at scale. Getting fortification right requires attending carefully to four core elements.

The right premix

For fortification to be effective, the premix must contain appropriate levels of micronutrients within a chemical compound that does not alter the organoleptic characteristics of the final cooked product. The choice of compound matters enormously. In wheat flour fortification, iron sodium EDTA is the preferred iron source precisely because it is well absorbed and does not cause the discolouration or rancidity that other iron compounds can produce in the finished product. Folic acid, vitamin B12, and zinc each carry their own compound-specific considerations around stability, interaction effects, and detectability. Getting these choices right at the premix design stage is a precondition for everything that follows.

Yet in many countries, regulatory standards for premix do not exist, and the testing mechanisms to verify premix quality before it enters the production line remain underdeveloped. The challenge is compounded by the fact that the underlying micronutrient inputs are often proprietary, supplied by a small number of global manufacturers, and subject to supply chain disruptions that can introduce substitutions or quality variation without adequate visibility. A mill receiving a new batch of premix has limited means to verify that it meets specifications before it is blended into thousands of tonnes of flour or other food staples.
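As a thought experiment, the batch-acceptance step described above, verifying incoming premix against specification before blending, could look like the following sketch. The spec values, tolerance, and nutrient keys are invented for illustration and do not reflect any real standard:

```python
# Hypothetical incoming-premix acceptance check: compare an assayed batch
# against its specification before it is blended into production flour.
# Spec values, tolerance, and nutrient keys are invented for illustration.

SPEC_MG_PER_KG = {        # target mg of nutrient per kg of premix (assumed)
    "iron_naedta": 30_000,
    "folic_acid": 1_300,
    "vitamin_b12": 10,
}
TOLERANCE = 0.15          # accept assay results within +/-15% of spec (assumed)

def check_batch(assay_mg_per_kg: dict) -> list:
    """Return (nutrient, reason) failures; an empty list means accept the batch."""
    failures = []
    for nutrient, target in SPEC_MG_PER_KG.items():
        measured = assay_mg_per_kg.get(nutrient)
        if measured is None:
            failures.append((nutrient, "no assay result"))
        elif abs(measured - target) > TOLERANCE * target:
            failures.append((nutrient, f"{measured} vs spec {target}"))
    return failures

# A batch with under-dosed folic acid is flagged before blending begins.
batch = {"iron_naedta": 29_500, "folic_acid": 900, "vitamin_b12": 10.4}
print(check_batch(batch))
```

The hard part, as the text notes, is not the comparison logic but obtaining a trustworthy assay at the point of receipt in the first place.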
Significant opportunities remain around the formal regulation of premix, the standardisation of premix specifications across borders, and the development of real-time, low-cost quality assurance mechanisms at the point of receipt.

The right equipment

Appropriate addition and homogenisation of micronutrients depends on having the right equipment operating correctly and consistently. A dosifier that is miscalibrated by even a small margin can result in systematic under- or over-fortification across an entire production run. Equipment that is poorly maintained, or operated by staff who do not fully understand its function, produces results that diverge significantly from what the program design assumes. Organisations like Sanku have done important work in innovating and rolling out low-cost, high-efficacy dosing equipment specifically designed for flour fortification in lower-resource settings, bringing down the capital cost barrier that has historically prevented smaller mills from participating.

The next frontier is integration. Opportunities now exist to connect dosing equipment to digital monitoring systems, enabling real-time, remote calibration checks and flagging of anomalies by program administrators or government regulators without requiring a physical inspection. Embedding this kind of remote oversight into existing government quality assurance schemes would significantly strengthen the monitoring and accountability layer of large-scale fortification programs at relatively low incremental cost.

The right testing

Micronutrients exist at parts-per-million or parts-per-billion concentrations in finished food products, which means verifying in real time whether a fortified product actually meets its specification is a genuinely hard technical problem.
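To ground the equipment discussion above, here is a sketch of what a parts-per-million target implies at the dosifier, together with the kind of simple drift check a remote-monitoring layer could run. All figures (target ppm, premix iron fraction, throughput, tolerance) are illustrative assumptions:

```python
# What a parts-per-million target implies at the dosifier, plus a minimal
# drift check of the kind a remote-monitoring layer could run.
# All figures (target ppm, premix iron fraction, throughput, tolerance) are assumed.

TARGET_IRON_PPM = 40           # mg iron per kg flour (assumed standard)
PREMIX_IRON_FRACTION = 0.03    # premix is 3% elemental iron by weight (assumed)
MILL_THROUGHPUT_KG_HR = 5_000  # flour throughput (assumed)

# Required premix feed: ppm target converted to g iron per kg flour,
# scaled up by the premix's iron fraction and the mill's throughput.
premix_g_per_kg_flour = (TARGET_IRON_PPM / 1000) / PREMIX_IRON_FRACTION
setpoint_g_hr = premix_g_per_kg_flour * MILL_THROUGHPUT_KG_HR
print(f"Dosifier setpoint: {setpoint_g_hr:.0f} g premix per hour")

def drift_alert(logged_feed_g_hr: list, setpoint: float, tol: float = 0.10) -> bool:
    """Flag a sustained deviation of the mean logged feed rate from setpoint."""
    mean = sum(logged_feed_g_hr) / len(logged_feed_g_hr)
    return abs(mean - setpoint) > tol * setpoint

# Readings running ~20% below setpoint would trigger a remote alert,
# catching the systematic under-fortification described above.
print(drift_alert([5300, 5250, 5400, 5350], setpoint_g_hr))
```

The calculation is trivial; the value lies in running it continuously against logged feed rates rather than waiting for a periodic physical inspection.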
Traditional laboratory methods, such as atomic absorption spectroscopy or inductively coupled plasma analysis, are accurate but expensive, slow, and require equipment and trained personnel that most production facilities and regulatory bodies in lower-income settings cannot access routinely. The result is that quality verification is either done infrequently, done at centralised laboratories far removed from the production point, or not done at all. This gap is one of the most consequential in the fortification system.

At Fortify Health, we have worked with Hornbill to develop an AI-based tool called Iron Lens, which uses machine vision to determine the level of added iron in a wheat flour sample in real time and at essentially no marginal cost per test. A mill operator or quality assurance inspector can take a sample, photograph it, and receive a result within seconds. Similar technology is now being developed for other food staples and micronutrient combinations, and represents one of the most promising near-term opportunities to close the quality assurance gap that undermines confidence in fortification programs at scale.

The right training

Even a well-designed fortification system, with high-quality premix, correctly installed equipment, and accessible testing tools, will underperform if the people responsible for day-to-day implementation do not understand what they are doing or why it matters. Frontline mill operators are often unskilled or semi-skilled workers with high turnover. The knowledge required to manage premix correctly, maintain dosing equipment, and interpret a quality test result is not complex, but it does need to be reliably transmitted and refreshed. The key is institutionalisation. Training schemes that run as standalone, program-specific efforts tend to decay rapidly as staff change and program funding cycles end.
The more durable approach is to embed fortification training into existing government vocational training infrastructure and industry-level certification schemes, so that the knowledge becomes a standard part of what it means to operate a flour mill or rice processing facility. Fortify Health is currently working with government institutions in India to incorporate fortification-relevant modules into pre-existing online technical training programs, building on infrastructure that already exists and is already funded.

The four elements above did not emerge from a design exercise. They emerged from years of working directly at mill level and watching technically sound interventions fail in practice. Premix quality issues that were invisible at the program design stage became apparent when mills reported unexpected colour changes in finished flour. Equipment calibration drift went undetected for months before systematic under-fortification was identified through field testing. Iron Lens was developed because we needed an affordable testing tool that did not yet exist. And the training work began because staff turnover at mills was eroding implementation quality faster than program-level support could compensate. The lesson is not that fortification is uniquely difficult. It is that the gap between a well-designed intervention and an effectively implemented one is almost always found in these operational details.

Big Enough

The word "large-scale" is built into the name of this intervention. But scale is not simply a matter of reach. It is a matter of persuasion. Before governments or businesses will commit the financial investment and institutional effort that fortification requires, they must be convinced that the problem it addresses is serious enough to justify that commitment. This is one of the most underappreciated challenges in the field. Anaemia and micronutrient inadequacy are, in many ways, uniquely difficult conditions to build a case around.
The burden is non-visual, non-visceral, and non-tangible. Unlike a disease that produces visible symptoms or acute mortality, micronutrient deficiency tends to operate as a corrosive underlying condition, quietly undermining the foundations of health, amplifying other diseases, and reducing cognitive and physical capacity in ways that are real but rarely attributed to their root cause. Very few people die of anaemia directly. The harm accumulates gradually, diffusely, and largely invisibly. This makes it genuinely hard to communicate as a priority, both to policymakers and to the consumers who are ultimately the ones being harmed. The challenge is further complicated by the gendered distribution of the burden. Anaemia disproportionately affects women and adolescent girls, in part through menstruation-related iron loss, and the associated stigma and cultural sensitivity around this make straightforward public communication difficult in many contexts. A disease that is common, chronic, and predominantly female in its visible presentation faces a particularly steep path to becoming a public priority. Some argue, persuasively, that this is precisely what regulators are for: to translate high-quality evidence into nutritional standards and mandates for the good of society, without requiring each individual consumer to understand or demand the intervention themselves. That argument holds where governments are the doers at scale and payers at scale. Mandatory fortification through government welfare systems can reach hundreds of millions of people without depending on consumer awareness at all. But for the hundreds of millions, if not billions, of individuals with micronutrient inadequacies who would be served through private sector channels, the equation is different. 
In those markets, the intervention must either become so pervasive that it requires no active choice, or a sufficient level of consumer awareness must be built to generate demand for fortified products over unfortified alternatives. This is where the evidence base is genuinely weak. Highly effective, cost-effective, and scalable behaviour change communication for micronutrient health does not yet exist in any well-demonstrated form. This is both a critical barrier for the field and a significant opportunity for academic and implementation research. Until that gap is closed, the private sector track for fortification will remain dependent either on regulatory mandates or on building a business case through routes other than consumer demand.

At Fortify Health, we have experimented with a range of behaviour change mechanisms and have not yet identified an approach that is appropriate for scale. Our current hypothesis is that the most viable near-term path in private sector markets is to build the business case through large-scale industry partnerships, including initiatives like Millers for Nutrition, and through direct engagement with large-scale retailers. The logic is that product differentiation on the basis of fortification could generate first-mover advantage and justify improved margins on what is otherwise a low-profit commodity good. This remains a hypothesis. Significantly more work is needed, both in academic research and through implementation, to identify solutions that can genuinely move the needle on consumer awareness at scale.

Good Enough

When we ask whether a fortification intervention is "good enough," the answer cannot rest on the global clinical evidence base alone, even though that evidence base is dense and compelling. What governments, program implementers, and funders increasingly require is evidence that is localised, contextualised, and grounded in implementation realities rather than controlled trial conditions.
These are meaningfully different things, and the gap between them is one of the more persistent friction points in scaling fortification programs. The global evidence on the effectiveness of micronutrient fortification is substantial. But large-scale food fortification is inherently difficult to evaluate rigorously. Its rollout is imperfect by design, reaching populations through existing food supply chains rather than controlled distribution. Treatment and control groups are hard to isolate. The key health outcomes (reduced anaemia prevalence, improved micronutrient status, and downstream cognitive and physical effects) typically require biomarker data from blood collection to measure, and the sample sizes required to detect meaningful differences at a population level can be cost-prohibitive for most program budgets. The result is that the implementation-level evidence base, particularly for large-scale open-market interventions, remains thinner than the clinical evidence base would suggest is needed.

Where governments act as the doer at scale and payer at scale, this challenge is compounded by a legitimate and understandable preference for local evidence. Program decision-makers in a given country, or even a given state, are often resistant to extrapolating from evidence generated elsewhere. To some, this may appear to be an inefficient duplication of a well-established evidence base. But given the cost-effectiveness of fortification relative to the burden it addresses, the cost of generating local evidence is relatively small, and the political and institutional returns to having it can be significant. The question is not whether local evidence is worth generating. It is how to generate it more efficiently, more innovatively, and in ways that are embedded within implementation rather than running alongside it. Several opportunities are emerging here.
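Before turning to those opportunities, the sample-size barrier noted above can be made concrete with the standard two-proportion calculation (normal approximation, 80% power, two-sided alpha = 0.05). The prevalence figures are illustrative, not drawn from any specific program:

```python
# The sample-size barrier, made concrete: participants per arm needed to
# detect a drop in anaemia prevalence, via the standard two-proportion
# normal-approximation formula. Prevalence figures are illustrative.
from math import sqrt, ceil

def n_per_arm(p1: float, p2: float, z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Sample size per arm for 80% power at two-sided alpha = 0.05."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
    return ceil((numerator / (p1 - p2)) ** 2)

# Detecting a 5-point drop (50% -> 45%, assumed) needs well over 1,500 blood
# samples per arm, before allowing for clustering, attrition, or imperfect
# rollout, each of which inflates the requirement further.
print(n_per_arm(0.50, 0.45))
```

Shrink the detectable effect or add cluster-design effects and the number climbs quickly, which is exactly the cost pressure described above.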
Non-invasive mechanisms for measuring haemoglobin and anaemia status are reducing the logistical and ethical complexity of large-scale data collection. AI-assisted tools for estimating dietary intake are beginning to make it feasible to understand consumption patterns at scale without expensive survey infrastructure. As the cost of data collection and analysis continues to fall, the opportunity to run rigorous evaluations embedded within real-world fortification rollouts, rather than dependent on separately funded research programs, increases substantially. Fortification programs should be designed from the outset to generate evidence, not just deliver outcomes.

At Fortify Health, we hear consistently from government actors in India that they want local evidence, often at the state level, before they are willing to make significant program commitments. This is not resistance to the science of fortification. It is a reasonable institutional ask in a context where implementation conditions vary significantly across states and where political accountability for program outcomes is local. We are actively working on a number of evaluation opportunities in response, including a potential large-scale evaluation of an open-market wheat flour fortification intervention within India. The lesson we draw is that building the evidence base is not a precursor to implementation. It must be integrated into implementation from the start.

Cheap Enough

Large-scale food fortification is among the most cost-effective public health interventions in the world, and by some analyses, the most cost-effective. The Lancet estimate of $0.18 per person per year across five food staples builds on a substantial and consistent body of evidence. In the abstract, the cost-effectiveness case is essentially closed. But "cheap enough" cannot be evaluated in the abstract.
It must be evaluated from the perspective of the specific payers at scale, and the picture looks different from that vantage point. For governments, the per-capita cost of fortification may be small, but the absolute expenditure can be very large. India's investment in rice fortification has run to approximately 17,000 crore rupees (roughly ₹170 billion), a number that, whatever its per-capita character, competes directly for budget allocation against other public health priorities. For finance ministries and program budget holders, that is a real and significant number. The question is not whether fortification is cost-effective in a technical sense. The question is how it is positioned relative to other demands on public expenditure, and whether the political and institutional case for that investment has been made convincingly enough to survive budget cycles and competing priorities. That requires more than a cost-effectiveness ratio. It requires active demand creation, the elevation of anaemia and micronutrient inadequacy as a visible public health priority, and a clear analysis of where government expenditure is already flowing that could be leveraged more effectively through fortification.

For private sector businesses, the dynamic is different, but the friction is real. Gross profit margins on commodity food staples are thin. Any percentage change in production cost, even a small one, can represent a meaningful risk for a business with low tolerance for margin compression, particularly when the benefit to the business in terms of product differentiation or consumer demand is uncertain. The perceived risk may exceed the actual risk, but perceived risk drives business decisions. Designing the private sector case for fortification therefore requires more than demonstrating that the cost increment is small.
It requires building a credible commercial argument: that fortification creates product differentiation, that first-mover advantage is real in markets where consumer awareness is growing, and that the marginal cost can be passed on to consumers without meaningful volume loss. At Fortify Health, we have worked closely with more than 300 mill partners to support and persuade them to fortify their existing products. This has required building a wide range of arguments tailored to different business contexts, from the commercial case for product differentiation to the risk mitigation case around the regulatory direction of travel. That work has now achieved significant success. On the government side, we conduct detailed analyses of existing government expenditure on food programs to help program stakeholders understand that fortification is not a new cost but a more effective use of resources already being deployed. Both tracks require the same core discipline: understanding what "cheap enough" means to the specific payer at scale in front of you, and designing the case accordingly.

Conclusion

The Lancet study has done something important. It has taken a large and fragmented body of evidence and consolidated it into a clear, global picture: fortification works, it is cost-effective, and the scale of what remains unachieved is enormous. 7 billion inadequate person-years prevented. 24.7 billion more within reach. $0.18 per person per year. The evidence case is as strong as it has ever been.
But the evidence case is not the constraint. It has not been for some time. The constraint is implementation, and specifically the challenge of designing interventions that can be fully institutionalised, owned, and sustained at scale by the governments, businesses, and consumers who are the actual doers at scale and payers at scale in this system. Philanthropic funding has played a critical role in building the evidence base and proving what is possible. It cannot be the long-term solution to micronutrient inadequacies globally. The goal must be to design fortification programs so well adapted to their context, so good enough, simple enough, big enough, and cheap enough for the specific doers and payers involved, that they no longer require external subsidy to persist. The global community has demonstrated that it can do this. Iodized salt is proof. The challenge of iodine deficiency is, for most of the world, a problem of yesterday. There is no principled reason why the same cannot be true for iron, folate, zinc, and vitamin B12 across the staple foods that billions of people consume every day. Getting there requires the kind of clear-eyed, implementation-focused thinking that the Lancet study now makes newly urgent. The potential is not in question. The work is in the doing. Tony Senanayake, CEO – Fortify Health