A new study has highlighted the significant risk of iron deficiency among pregnant women, even in high-resource settings. Conducted in Ireland, the research followed 629 primiparous women with low-risk, singleton pregnancies, tracking their iron status throughout pregnancy. Iron biomarkers, including ferritin and soluble transferrin receptor (sTfR), were measured at 15, 20, and 33 weeks of gestation.
The results show a sharp increase in iron deficiency as pregnancy progresses. At 15 weeks, 4.5% of women had ferritin levels below 15μg/L, the conventional threshold for iron deficiency. By 33 weeks, this figure had risen to 51.2%. When the higher ferritin threshold of <30μg/L was applied, 83.8% of women were considered iron deficient by the third trimester. The sTfR marker yielded similar results.
The study identified a ferritin level of <60μg/L at 15 weeks as predictive of iron deficiency in the third trimester. Pre-pregnancy and early-pregnancy iron supplementation were associated with a reduced risk of developing deficiency later, lowering the odds by 43%.
These findings underscore the need for early iron screening and supplementation during pregnancy. The authors recommend a ferritin target of >60μg/L in early pregnancy to prevent later deficiency. Iron deficiency during pregnancy is linked to adverse outcomes for both mother and baby, making this issue a priority even in countries with well-resourced healthcare systems.
Helena Bradbury, EMJ
Reference
McCarthy EK et al. Longitudinal evaluation of iron status during pregnancy: a prospective cohort study in a high-resource setting. Am J Clin Nutr. 2024.