The OSCAR-IB consensus criteria for retinal OCT quality assessment.

Background: Retinal optical coherence tomography (OCT) is an imaging biomarker for neurodegeneration in multiple sclerosis (MS). To be validated as an outcome measure in multicentre studies, reliable quality control (QC) criteria with high inter-rater agreement are required.

Methods/Principal Findings: A prospective multicentre study was conducted to develop consensus QC criteria for retinal OCT in MS, in seven steps: (1) a literature review of OCT QC criteria; (2) application of these QC criteria to a training set of 101 retinal OCT scans from patients with MS; (3) kappa statistics for inter-rater agreement; (4) identification of reasons for inter-rater disagreement; (5) development of new consensus QC criteria; (6) testing of the new QC criteria on the training set; and (7) prospective validation on a new set of 159 OCT scans from patients with MS. Inter-rater agreement for acceptable scans among OCT readers (n = 3) was moderate (kappa 0.45) using the non-validated QC criteria, which were drawn entirely from the ophthalmological literature. A new set of QC criteria was developed based on recognition of: (O) obvious problems, (S) poor signal strength, (C) centration of scan, (A) algorithm failure, (R) retinal pathology other than MS related, (I) illumination and (B) beam placement. Adhering to these OSCAR-IB QC criteria increased the inter-rater agreement from moderate to substantial (kappa 0.61 in the training set and 0.61 in the prospective validation).

Conclusions: This study presents the first validated consensus QC criteria for retinal OCT reading in MS. The high inter-rater agreement suggests that the OSCAR-IB QC criteria should be considered for multicentre studies and trials in MS.
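The abstract reports agreement as kappa values among three OCT readers but does not include analysis code. The snippet below is a minimal, hypothetical sketch of how chance-corrected agreement for three raters making accept/reject decisions could be computed with the Fleiss kappa implementation in statsmodels; the example ratings are invented, and the original study may have used a different kappa variant or software.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical accept/reject decisions (1 = accept, 0 = reject) by three
# readers for six OCT scans; real data would have one row per scan
# (101 in the training set, 159 in the validation set).
ratings = np.array([
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
])

# Convert the scan-by-reader matrix into a scan-by-category count table,
# then compute Fleiss' kappa across the three readers.
table, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(table, method="fleiss")
print(f"Fleiss' kappa: {kappa:.2f}")
```

For interpretation, the conventional Landis and Koch benchmarks label kappa of 0.41-0.60 as "moderate" and 0.61-0.80 as "substantial" agreement, which matches the wording used for the 0.45 and 0.61 values in the abstract.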


Bibliographic Details
Main Authors: Prejaas Tewarie, Lisanne Balk, Fiona Costello, Ari Green, Roland Martin, Sven Schippling, Axel Petzold
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2012
Subjects: Medicine (R); Science (Q)
Published in: PLoS ONE, Vol 7, Iss 4, p e34823 (2012)
DOI: 10.1371/journal.pone.0034823
ISSN: 1932-6203
Online Access: https://doaj.org/article/56167ab7b592477496c8ad3f57c4a9bf
https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/22536333/?tool=EBI