Using Deep Learning to Identify Fetal Head Position in Labor
November 1, 2024
By Ahizechukwu C. Eke, MD, PhD, MPH
SYNOPSIS: An artificial intelligence (deep learning) model accurately detected fetal head position during the second stage of labor using transperineal ultrasound, but the model must be validated on larger datasets and in real-time clinical settings before clinical use.
SOURCE: Ramirez Zegarra R, Conversano F, Dall’Asta A, et al. A deep learning approach to identify the fetal head position using transperineal ultrasound during labor. Eur J Obstet Gynecol Reprod Biol 2024;301:147-153.
Accurate determination of fetal head position during labor is critical to preventing complications such as prolonged/obstructed labor, instrumental vaginal delivery, and cesarean delivery.1,2 Malposition of the fetal head, such as asynclitism or occiput posterior (OP) or transverse (OT) positions, is associated with a significant increase in labor dystocia, prolonged second-stage labor, and adverse maternal and neonatal outcomes.3
These complications contribute to higher rates of operative deliveries, with cesarean delivery accounting for approximately one-third of all births in the United States every year.4 Although digital vaginal examination remains the primary method for assessing fetal head position during labor, it is highly operator-dependent, with error rates as high as 20% to 40%.5
Transperineal ultrasound has emerged as a noninvasive and more reliable alternative to digital evaluation, offering improved accuracy in determining fetal head position.6 However, its utility depends on the skill of the examiner, and there remains variability in interpretation.6 Artificial intelligence (AI)-based technologies, particularly deep learning models, have shown promise in interpreting ultrasound images with high precision.7,8
AI models have the potential to address the operator-dependent limitations by providing automated, consistent, and real-time assessment of fetal head position during labor, improving outcomes for both mothers and infants.8 Several studies have demonstrated that AI algorithms can achieve accuracy rates above 90% in detecting fetal head position when using transperineal ultrasound images.9,10
Despite these promising findings, current research has been limited to small-scale datasets. Validation of AI models using larger, more diverse populations, including real-time clinical settings (e.g., during labor), remains a critical research gap. Therefore, Ramirez Zegarra and colleagues addressed these gaps by applying AI-based transperineal ultrasound to differentiate between occiput anterior and non-occiput anterior positions during labor using a large dataset.11
This prospective, multicenter diagnostic study, part of the “AI OCCIPUT” initiative, was conducted between February 2018 and May 2023 at 16 maternity hospitals affiliated with the International Study Group on Labor and Delivery Sonography (ISLANDS).11 The study included patients with uncomplicated singleton term pregnancies (≥ 37 weeks of gestation) in the second stage of labor, with non-anomalous fetuses in cephalic presentation. Exclusion criteria included patients not in labor, fetuses in non-cephalic presentation, gestation < 37 weeks, or pregnancies with medical complications.
At each participating center, an obstetrician obtained fetal head position using transabdominal ultrasound, followed by transperineal ultrasound imaging. The ultrasound images were randomly allocated into three datasets with proportional representation of fetal head positions (occiput anterior [OA], OP, and OT): 70% for training, 15% for validation, and 15% for testing.
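The 70/15/15 allocation described above is a stratified split: each of the three datasets preserves the proportions of OA, OP, and OT images found in the full sample. The study does not publish its code, but the idea can be sketched in plain Python; the function name and interface here are illustrative, not the authors'.

```python
import random
from collections import defaultdict

def stratified_split(items, labels, fractions=(0.70, 0.15, 0.15), seed=0):
    """Split labeled items into train/validation/test subsets while
    preserving per-class proportions, mirroring a 70/15/15 allocation
    of ultrasound images by fetal head position (OA, OP, OT)."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for item, label in zip(items, labels):
        by_label[label].append(item)
    train, val, test = [], [], []
    for label, group in by_label.items():
        rng.shuffle(group)  # randomize within each class
        n_train = round(len(group) * fractions[0])
        n_val = round(len(group) * fractions[1])
        train += [(x, label) for x in group[:n_train]]
        val += [(x, label) for x in group[n_train:n_train + n_val]]
        test += [(x, label) for x in group[n_train + n_val:]]
    return train, val, test
```

Because the split is performed per class, a position that is rare overall (e.g., OT) is still represented in the validation and test sets in the same proportion as in the training set.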
The pre-trained ResNet18 model (a deep learning architecture designed for image classification that uses residual learning to improve accuracy and efficiency by allowing the network to learn from deeper layers without performance degradation) was used for feature extraction (identification of key attributes and patterns from data) and classification.11,12
Three convolutional neural networks (CNNs) were then developed: CNN1 classified OA vs. non-OA positions, CNN2 differentiated OP from OT positions, and CNN3 classified right vs. left OT positions. The AI model for the determination of fetal head position was built using these three CNNs working together. The model's performance was evaluated on the testing dataset using accuracy, sensitivity, specificity, F1-score (the balance between precision and recall), and Cohen's kappa (agreement between AI predictions and ultrasound-determined positions). Statistical significance was set at P < 0.05.
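The decision logic combining the three binary networks into one position label can be sketched as a simple cascade. This is a minimal illustration, assuming each CNN's output has already been thresholded to a boolean; the function and label names are hypothetical, not taken from the study.

```python
def classify_head_position(is_oa, is_op, is_right):
    """Combine three binary classifier outputs into a final fetal head
    position label. Each argument stands in for one network's
    thresholded prediction:
      is_oa    -- CNN1: occiput anterior vs. any malposition
      is_op    -- CNN2: occiput posterior vs. occiput transverse
      is_right -- CNN3: right vs. left occiput transverse
    """
    if is_oa:
        return "OA"   # CNN1 alone settles an anterior position
    if is_op:
        return "OP"   # CNN2 separates posterior from transverse
    return "ROT" if is_right else "LOT"  # CNN3 lateralizes transverse
```

Note that later networks in the cascade only matter for images the earlier networks route to them, which is one reason the reported accuracies differ across CNN1, CNN2, and CNN3.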
A total of 2,154 transperineal images were included from eligible participants. Using an axial plane transperineal ultrasound, the model performed exceptionally well overall for classifying the fetal head position. Its accuracy was 94.5% (95% CI, 92.0-97.0), sensitivity was 95.6% (95% CI, 91.2-100.0), specificity was 91.2% (95% CI, 87.3-95.1), F1-score was 0.92, and Cohen's kappa was 0.90.
CNN1, which classified the OA position vs. fetal head malpositions, performed best, with an accuracy of 98.3% (95% confidence interval [CI], 96.9-99.7). CNN2, which classified OP vs. OT positions, performed next best, with an accuracy of 93.9% (95% CI, 89.6-98.2), and CNN3, which classified right vs. left OT positions, was weakest, with an accuracy of 91.3% (95% CI, 83.5-99.1).
Commentary
The results of this study demonstrate the significant potential of deep learning models combined with transperineal ultrasound for detecting fetal head position during labor. With an overall accuracy of 94.5% and a high sensitivity of 95.6%, this model appears to be a reliable tool for identifying fetal head orientation in real time, which is critical for managing the second stage of labor. The model's strong performance, reflected in its high specificity of 91.2%, suggests that it can accurately distinguish between different fetal head positions, minimizing the risk of misclassification.
The F1-score of 0.92 further highlights the model’s balanced performance, effectively managing both precision and recall. This is essential for obstetricians and other delivery providers, since accurate and timely identification of fetal malpositions could lead to better decision-making regarding interventions such as instrumental delivery or cesarean delivery, thus reducing the risks of prolonged labor and related complications.
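All of the reported metrics derive from the counts of a binary confusion matrix, so it is worth seeing how they relate. The sketch below computes them in plain Python; the counts passed in the usage note are invented for illustration, not the study's data.

```python
def binary_metrics(tp, fp, tn, fn):
    """Compute accuracy, sensitivity, specificity, F1-score, and
    Cohen's kappa from binary confusion-matrix counts (true positives,
    false positives, true negatives, false negatives)."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    sensitivity = tp / (tp + fn)   # recall: malpositions correctly flagged
    specificity = tn / (tn + fp)   # normal positions correctly cleared
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_chance = (((tp + fp) / total) * ((tp + fn) / total)
                + ((fn + tn) / total) * ((fp + tn) / total))
    kappa = (accuracy - p_chance) / (1 - p_chance)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "f1": f1, "kappa": kappa}
```

For example, `binary_metrics(40, 5, 45, 10)` yields an accuracy of 0.85 but a kappa of only 0.70, illustrating why kappa is reported alongside accuracy: it discounts the agreement expected by chance alone.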
Among the CNNs developed in this study, CNN1, which differentiated OA from malpositions, achieved the highest accuracy at 98.3%, indicating that the model excels in identifying the optimal fetal position for vaginal delivery. This is particularly important for obstetricians and other healthcare delivery providers, since OA is associated with the most favorable delivery outcomes, while malpositions, such as OP or OT, are linked to prolonged labor, higher rates of operative delivery, and neonatal complications.13
The ability of the AI model to precisely detect OA vs. non-OA positions could enhance clinical decision-making, especially in settings where immediate manual assessment may be challenging or subjective. Moreover, CNN2’s performance, with an accuracy of 93.9%, demonstrated reliable classification between OP and OT positions, which are known to complicate delivery if not promptly recognized and managed.
CNN3, which classified right vs. left OT positions, also performed well, with an accuracy of 91.3%, although slightly lower than CNN1 and CNN2. The ability to accurately distinguish between right and left OT fetal positions is clinically significant, since this precise identification is critical for guiding appropriate management strategies during labor.
In cases where manual rotation of the fetal head from OT to OA is desirable, recognizing the specific lateral position becomes essential. This distinction is particularly important because the success of manual rotation techniques can be influenced by the fetal head’s orientation; for instance, the angle of the fetal head relative to the maternal pelvis can affect the feasibility of such maneuvers.14
Additionally, failure to accurately identify the lateral position may lead to inappropriate management decisions, such as unnecessary operative deliveries or prolonged labor, both of which can increase the risk of maternal and neonatal complications. Moreover, distinguishing between right and left OT positions can inform the use of specific maternal positions or maneuvers, such as hands-and-knees positioning or other gravity-assisted techniques, to facilitate effective fetal rotation.
Although manual palpation of fetal position often can be inaccurate and operator-dependent, the integration of deep learning AI models with ultrasound could standardize assessment of fetal head position, making them more objective and consistent across different clinical environments.
These findings suggest that AI-driven transperineal ultrasound has the potential to become an essential tool in obstetric practice, improving fetal position detection and potentially enhancing maternal and neonatal outcomes.
REFERENCES
- Nouri-Khasheh-Heiran E, Montazeri A, Conversano F, et al. The success of vaginal birth by use of trans-labial ultrasound plus vaginal examination and vaginal examination only in pregnant women with labor induction: A comparative study. BMC Pregnancy Childbirth 2023;23:3.
- Souka AP, Haritos T, Basayiannis K, et al. Intrapartum ultrasound for the examination of the fetal head position in normal and obstructed labor. J Matern Fetal Neonatal Med 2003;13:59-63.
- Malvasi A, Vinciguerra M, Lamanna B, et al. Asynclitism and its ultrasonographic rediscovery in labor room to date: A systematic review. Diagnostics (Basel) 2022;12:2998.
- Schulz KW, Gaither K, Zigler C, et al. Optimal mode of delivery in pregnancy: Individualized predictions using national vital statistics data. PLOS Digit Health 2022;1:e0000166.
- Popowski T, Porcher R, Fort J, et al. Influence of ultrasound determination of fetal head position on mode of delivery: A pragmatic randomized trial. Ultrasound Obstet Gynecol 2015;46:520-525.
- Ahn KH, Oh MJ. Intrapartum ultrasound: A useful method for evaluating labor progress and predicting operative vaginal delivery. Obstet Gynecol Sci 2014;57:427-435.
- Stringer JSA, Pokaprakarn T, Prieto JC, et al. Diagnostic accuracy of an integrated AI tool to estimate gestational age from blind ultrasound sweeps. JAMA 2024;332:649-657.
- Gimovsky AC, Eke AC, Tuuli MG. Enhancing obstetric ultrasonography with artificial intelligence in resource-limited settings. JAMA 2024;332:626-628.
- Ferreira I, Simões J, Pereira B, et al. Ensemble learning for fetal ultrasound and maternal-fetal data to predict mode of delivery after labor induction. Sci Rep 2024;14:15275.
- Ghi T, Conversano F, Ramirez Zegarra R, et al. Novel artificial intelligence approach for automatic differentiation of fetal occiput anterior and non-occiput anterior positions during labor. Ultrasound Obstet Gynecol 2022;59:93-99.
- Ramirez Zegarra R, Conversano F, Dall’Asta A, et al. A deep learning approach to identify the fetal head position using transperineal ultrasound during labor. Eur J Obstet Gynecol Reprod Biol 2024;301:147-153.
- Qayyum W, Ehtisham R, Bahrami A, et al. Assessment of convolutional neural network pre-trained models for detection and orientation of cracks. Materials (Basel) 2023;16:826.
- Yagel O, Cohen SM, Lipschuetz M, et al. Higher rates of operative delivery and maternal and neonatal complications in persistent occiput posterior position with a large head circumference: A retrospective cohort study. Fetal Diagn Ther 2018;44:51-58.
- Verhaeghe C, Parot-Schinkel E, Bouet PE, et al. The impact of manual rotation of the occiput posterior position on spontaneous vaginal delivery rate: Study protocol for a randomized clinical trial (RMOS). Trials 2018;19:109.
Ahizechukwu C. Eke, MD, PhD, MPH, is Associate Professor in Maternal Fetal Medicine, Division of Maternal Fetal Medicine, Department of Gynecology & Obstetrics, Johns Hopkins University School of Medicine, Baltimore.