By Rasmussen, Christina Kjaergaard; Van Den Bosch, Thierry; Exacoustos, Caterina; Manegold-Brauer, Gwendolin; Benacerraf, Beryl; Froyman, Wouter; Landolfo, Chiara; Condorelli, Margherita; Egekvist, Anne Gisselmann; Josefsson, Hampus; Leone, Francesco Paolo Giuseppe; Jokubkiene, Ligita; Zannoni, Letizia; Epstein, Elisabeth; Installé, Arnaud J F; Dueholm, Margit
Reference: 27th World Congress on Ultrasound in Obstetrics and Gynecology (Vienna, Austria)
Publication: Published, 2017-09-16
Published in conference proceedings
Abstract:

Objectives: To evaluate intra- and interobserver agreement in the reporting of ultrasonographic features associated with ill-defined lesions, using the Morphological Uterus Sonographic Assessment (MUSA) terms.

Methods: Multicentre clinical study using three-dimensional (3D) transvaginal ultrasound (TVS) clips of 30 premenopausal women suffering from abnormal uterine bleeding and/or menstrual pain. All women underwent transcervical deep resection of the endometrium and inner myometrium (n=25) or hysterectomy (n=5), with histopathological examination for adenomyosis (AM). Twelve women had a confident diagnosis of AM. Thirteen gynecologists with high (n=7) or medium (n=6) experience in TVS evaluated each 3D ultrasound clip in two rounds, two months apart, blinded to histopathology. The evaluation was managed online with Clinical Data Miner software, and the presence of ill-defined lesions and associated features, as defined in MUSA, was recorded. Results are presented as interobserver agreement during the first evaluation and intraobserver agreement between the first and second evaluations.

Results: Intraobserver agreement (average kappa) for ill-defined lesions was moderate (0.45) and ranged from fair to moderate (0.29-0.45) for associated features. Interobserver agreement (kappa) for ill-defined lesions was poor (0.18) between all observers and fair (0.24) between highly experienced observers. Interobserver agreement for associated features ranged from poor to fair (0.08-0.32). When medium-experienced observers and patients without a confident diagnosis of AM were excluded, interobserver agreement for associated features ranged from fair to moderate (0.20-0.40).

Conclusions: There was large observer variation between multiple observers for ill-defined lesions and associated features. The presence of well-defined lesions, image orientation, and the use of 3D video clips instead of 3D volumes may have influenced the findings. Future studies need to specify ill-defined lesions and the composition of associated features based on histopathology.
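As an illustrative aside: the agreement figures above are kappa coefficients, and the intraobserver values compare an observer's ratings of the same clips across the two rounds. The Python sketch below shows how such a pairwise kappa could be computed with scikit-learn's cohen_kappa_score; the ratings are hypothetical placeholders and the use of scikit-learn is an assumption, not the software reported in the abstract (Clinical Data Miner was used for data collection, not necessarily for analysis).

    # Minimal sketch: intraobserver agreement between two evaluation rounds
    # quantified with Cohen's kappa. Data are hypothetical, not from the study.
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical binary ratings by one observer for the same clips in two
    # rounds (1 = ill-defined lesion present, 0 = absent). The study used 30
    # clips; only 10 placeholder values are shown here.
    round_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    round_2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

    kappa = cohen_kappa_score(round_1, round_2)
    print(f"Intraobserver Cohen's kappa: {kappa:.2f}")

For interobserver agreement across the thirteen observers, a multi-rater statistic (e.g. an average of pairwise kappas or Fleiss' kappa) would be needed; the abstract reports averaged kappa values but does not specify the exact computation.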