Identifying Sources of Difference in Reliability in Content Analysis
This paper reports on a case study that identifies and illustrates sources of difference in agreement, in relation to reliability, in the context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD). Transcripts of 10 students in a month-long online asynchronous discussion were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR). Sources of difference were identified in relation to coders, tasks, and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding, the indicator, findings revealed that the overall level of reliability between coders was .591 when measured with Cohen's kappa. The difference between tasks at the same level ranged from .349 to .664, and the difference between participants ranged from .390 to .907. Implications for training and research are discussed.
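The abstract's central statistic, Cohen's kappa, corrects raw inter-coder agreement for the agreement two coders would reach by chance given their marginal label frequencies. A minimal sketch of the calculation (the `cohens_kappa` helper and the example labels are illustrative, not data from the paper):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' label sequences."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of units both coders labelled the same
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement: sum over labels of the product of each
    # coder's marginal proportion for that label
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to ten message segments
a = ["PF", "PF", "PR", "PR", "PF", "PR", "PF", "PF", "PR", "PF"]
b = ["PF", "PR", "PR", "PR", "PF", "PF", "PF", "PF", "PR", "PR"]
print(round(cohens_kappa(a, b), 3))  # → 0.4
```

Here the coders agree on 7 of 10 segments (observed = .7), but their marginals already predict .5 agreement by chance, so kappa is only .4 — which is why the paper's kappa values sit well below the raw percent agreement.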
Main Authors: | Elizabeth Murphy | Justyna Ciszewska-Carr |
---|---|
Format: | article |
Language: | EN |
Published: | Athabasca University Press, 2005 |
Subjects: | Special aspects of education |
Online Access: | https://doaj.org/article/a97620a42dc840259ec8a511b758675a |
id |
oai:doaj.org-article:a97620a42dc840259ec8a511b758675a |
---|---|
record_format |
dspace |
spelling |
oai:doaj.org-article:a97620a42dc840259ec8a511b758675a (updated 2021-12-02T18:03:26Z). Identifying Sources of Difference in Reliability in Content Analysis. Elizabeth Murphy; Justyna Ciszewska-Carr. Athabasca University Press, published 2005-07-01. DOI: 10.19173/irrodl.v6i2.233. ISSN: 1492-3831. Article: http://www.irrodl.org/index.php/irrodl/article/view/233. DOAJ record: https://doaj.org/article/a97620a42dc840259ec8a511b758675a. Journal TOC: https://doaj.org/toc/1492-3831. Subject: Special aspects of education (LC8-6691). In: International Review of Research in Open and Distributed Learning, Vol 6, Iss 2 (2005). |
institution |
DOAJ |
collection |
DOAJ |
language |
EN |
topic |
Special aspects of education (LC8-6691) |
spellingShingle |
Special aspects of education (LC8-6691) | Elizabeth Murphy; Justyna Ciszewska-Carr | Identifying Sources of Difference in Reliability in Content Analysis |
description |
This paper reports on a case study which identifies and illustrates sources of difference in agreement in relation to reliability in a context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD). Transcripts of 10 students in a month-long online asynchronous discussion were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR). Sources of difference were identified in relation to: coders; tasks; and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding on the basis of the indicator, findings revealed that the overall level of reliability between coders was .591 when measured with Cohen’s kappa. The difference between tasks at the same level ranged from .349 to .664, and the difference between participants ranged from .390 to .907. Implications for training and research are discussed.
Keywords: content analysis; online discussions; reliability; Cohen's kappa; sources of difference; coding |
format |
article |
author |
Elizabeth Murphy; Justyna Ciszewska-Carr |
author_facet |
Elizabeth Murphy; Justyna Ciszewska-Carr |
author_sort |
Elizabeth Murphy |
title |
Identifying Sources of Difference in Reliability in Content Analysis |
title_short |
Identifying Sources of Difference in Reliability in Content Analysis |
title_full |
Identifying Sources of Difference in Reliability in Content Analysis |
title_fullStr |
Identifying Sources of Difference in Reliability in Content Analysis |
title_full_unstemmed |
Identifying Sources of Difference in Reliability in Content Analysis |
title_sort |
identifying sources of difference in reliability in content analysis |
publisher |
Athabasca University Press |
publishDate |
2005 |
url |
https://doaj.org/article/a97620a42dc840259ec8a511b758675a |
work_keys_str_mv |
AT elizabethmurphy identifyingsourcesofdifferenceinreliabilityincontentanalysis AT justynaciszewskacarr identifyingsourcesofdifferenceinreliabilityincontentanalysis |
_version_ |
1718378703898017792 |