
RESEARCH PRODUCT

Wavelet analysis and neural network classifiers to detect mid-sagittal sections for nuchal translucency measurement

Domenico Tegolo, Giuseppa Sciortino, Cesare Valenti, Emanuela Orlandi

subject

Radiology, Nuclear Medicine and Imaging; Acoustics and Ultrasonics; Computer Science; General Mathematics; Materials Science (miscellaneous); Wavelet analysis; Nuchal translucency; Nuchal translucency measurement; Mid-sagittal section; Artificial neural network; Ultrasound; Pattern recognition; Symmetry transform; Sagittal plane; Neural network; Identification (information); True negative; Signal Processing; Computer Vision and Pattern Recognition; Artificial intelligence; Instrumentation; Biotechnology; Settore INF/01 - Informatica; lcsh:R5-920; lcsh:QA1-939; lcsh:Medicine (General); lcsh:Mathematics

description

We propose a methodology to support the physician in the automatic identification of mid-sagittal sections of the fetus in ultrasound videos acquired during the first trimester of pregnancy. A good mid-sagittal section is a key requirement for a correct measurement of the nuchal translucency (NT), one of the main markers in the screening of chromosomal defects such as trisomy 13, 18 and 21; the NT measurement itself is beyond the scope of this article. The proposed methodology is based mainly on wavelet analysis and neural network classifiers to detect the jawbone, and on radial symmetry analysis to detect the choroid plexus. These steps identify the frames that represent correct mid-sagittal sections to be processed. The performance of the proposed methodology was analyzed on 3000 frames uniformly sampled at random from 10 real clinical ultrasound videos. With respect to a ground truth provided by an expert physician, we obtained a true positive rate, a true negative rate and a balanced accuracy of 87.26%, 94.98% and 91.12%, respectively.
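The article does not publish its implementation, but the jawbone-detection step can be illustrated with a minimal, hypothetical Python sketch: wavelet sub-band energies used as features for a small neural network classifier. The wavelet family, decomposition level, feature choice and network layout below are assumptions for illustration only, not the authors' configuration.

    # Hypothetical sketch of wavelet features + neural network classifier,
    # in the spirit of the jawbone-detection step. Wavelet ('haar'), level
    # and MLP layout are assumptions, not the paper's published settings.
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPClassifier

    def wavelet_energy_features(patch, wavelet="haar", level=2):
        """Energy of each 2-D DWT sub-band of a grayscale patch."""
        coeffs = pywt.wavedec2(patch, wavelet, level=level)
        feats = [np.sum(coeffs[0] ** 2)]          # approximation band
        for (cH, cV, cD) in coeffs[1:]:           # detail bands per level
            feats += [np.sum(cH ** 2), np.sum(cV ** 2), np.sum(cD ** 2)]
        return np.asarray(feats)

    def train_jawbone_classifier(patches, labels):
        """patches: grayscale sub-images; labels: 1 = jawbone, 0 = other."""
        X = np.stack([wavelet_energy_features(p) for p in patches])
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0)
        clf.fit(X, labels)
        return clf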
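The choroid plexus step relies on radial symmetry analysis. The following is a simplified, hypothetical gradient-vote symmetry map in the spirit of Loy and Zelinsky's fast radial symmetry transform; the radius set, edge threshold and smoothing below are illustrative assumptions, not the paper's parameters.

    # Hypothetical radial-symmetry vote for locating a roughly circular
    # structure such as the choroid plexus. Each strong edge pixel votes
    # r pixels along its gradient direction; peaks mark candidate centres.
    import numpy as np
    from scipy import ndimage

    def radial_symmetry_map(image, radii=(8, 12, 16)):
        gy, gx = np.gradient(image.astype(float))
        mag = np.hypot(gx, gy)
        mag_max = mag.max() or 1.0
        h, w = image.shape
        ys, xs = np.nonzero(mag > 0.1 * mag_max)   # keep strong edges only
        out = np.zeros((h, w))
        for r in radii:
            votes = np.zeros((h, w))
            py = np.clip(np.round(ys + r * gy[ys, xs] / mag[ys, xs]).astype(int),
                         0, h - 1)
            px = np.clip(np.round(xs + r * gx[ys, xs] / mag[ys, xs]).astype(int),
                         0, w - 1)
            np.add.at(votes, (py, px), mag[ys, xs])  # weighted accumulation
            out += ndimage.gaussian_filter(votes, sigma=0.5 * r)
        return out  # peaks mark candidate centres of radial symmetry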
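The reported figures are internally consistent, since balanced accuracy is the mean of the true positive and true negative rates:

    # Consistency check of the reported metrics.
    tpr, tnr = 0.8726, 0.9498
    balanced_accuracy = (tpr + tnr) / 2
    print(f"{balanced_accuracy:.4f}")  # 0.9112, matching the reported 91.12%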

doi:10.5566/ias.1352
http://hdl.handle.net/10447/201821