FetalMovNet: A Novel Deep Learning Model Based on Attention Mechanism for Fetal Movement Classification in US


Turkan M., Dandil E., Erturk Urfali F., Korkmaz M.

IEEE Access, vol.13, pp.52508-52527, 2025 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 13
  • Publication Date: 2025
  • DOI: 10.1109/access.2025.3553548
  • Journal Name: IEEE Access
  • Indexed In: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, INSPEC, Directory of Open Access Journals
  • Pages: pp.52508-52527
  • Keywords: attention mechanism, CNN, deep learning, fetal movement detection, fetus, US video
  • Affiliated with Bilecik Şeyh Edebali University: Yes

Abstract

Automated classification of fetal movements in ultrasound (US) videos is critical for assessing fetal well-being and detecting potential complications during pregnancy. This study introduces FetalMovNet, a novel deep learning model that incorporates an attention mechanism to improve the classification of fetal movements in US video sequences. The model integrates convolutional neural networks (CNN) for feature extraction with an attention mechanism that captures spatio-temporal patterns, significantly improving fetal movement classification performance. To evaluate FetalMovNet, we construct a new dataset of fetal movements in US videos across seven anatomical structures: head, body, arm, hand, heart, leg, and foot. Experimental results show that FetalMovNet achieves an accuracy of 0.9887, a precision of 0.9871, a recall of 0.9910, and an F1-score of 0.9891, outperforming state-of-the-art CNN and CNN-LSTM architectures. Ablation studies confirm the effectiveness of the attention mechanism, with FetalMovNet achieving an area under the curve (AUC) score of 0.9957, compared to 0.9471 for CNN and 0.9543 for CNN-LSTM. The proposed FetalMovNet model provides a robust and clinically applicable tool for real-time fetal movement monitoring, reducing the need for manual assessment and improving prenatal care.
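The abstract describes the overall design at a high level: per-frame CNN feature extraction combined with attention over the video sequence, classifying clips into the seven movement classes. The sketch below is a minimal illustration of that general pattern, not the authors' implementation; the backbone depth, feature dimension, attention configuration (`feat_dim`, `num_heads`), pooling strategy, and input resolution are all assumptions made for the example.

```python
# Minimal sketch (assumed structure, not the published FetalMovNet): a per-frame
# CNN extracts spatial features, self-attention over the frame sequence models
# spatio-temporal patterns, and a linear head predicts one of 7 movement classes.
import torch
import torch.nn as nn


class FetalMovNetSketch(nn.Module):
    def __init__(self, num_classes: int = 7, feat_dim: int = 256, num_heads: int = 4):
        super().__init__()
        # Spatial feature extractor applied to every US frame independently.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, feat_dim)
        # Attention across the temporal axis to relate movement across frames.
        self.attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, channels, height, width) grayscale US frames
        b, t, c, h, w = clips.shape
        frames = clips.reshape(b * t, c, h, w)
        feats = self.cnn(frames).flatten(1)           # (b*t, 64)
        feats = self.proj(feats).reshape(b, t, -1)    # (b, t, feat_dim)
        attended, _ = self.attn(feats, feats, feats)  # temporal self-attention
        pooled = attended.mean(dim=1)                 # average over frames
        return self.classifier(pooled)                # (b, num_classes) logits


if __name__ == "__main__":
    model = FetalMovNetSketch()
    dummy = torch.randn(2, 16, 1, 128, 128)  # 2 clips of 16 frames each
    print(model(dummy).shape)                # torch.Size([2, 7])
```

As a consistency check on the reported metrics, the harmonic mean of the stated precision (0.9871) and recall (0.9910) is 2·(0.9871·0.9910)/(0.9871+0.9910) ≈ 0.9891, matching the reported F1-score.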