The design and implementation of the Carolina Automated Reading Evaluation for reading deficit screening
This paper examines the design process and implementation of the Carolina Automated Reading Evaluation (CARE). Designed to automate the process of screening for reading deficits, CARE is an interactive computer-based tool that helps eliminate the need for one-on-one evaluations of pupils to detect dyslexia and other reading deficits. While other tests collect specific data points in order to determine whether a pupil has dyslexia, they typically focus on only a few metrics for diagnosis, such as handwriting analysis or eye tracking. The CARE collects data across up to 16 different subtests, each built to test proficiency in various reading skills. These skills include reading fluency, phoneme manipulation, sound blending, and many other essential skills for reading. This wide variety of measurements allows a more focused intervention to be created for the pupil. For this study, elementary school pupils were tested both with the CARE and with the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) test, a well-established screener for elementary school-level reading deficits. The data collected were used in a comparison of average scores, a gradient boosting tree classifier, and a convergence test to determine whether there was any correlation between the CARE and DIBELS scores. Based on these comparisons, a correspondence was found between the two tests, showing that the CARE can detect reading deficits comparably to manually administered testing instruments. Based on these findings, the CARE could be used as a replacement for current tests, giving users more detailed data at a faster rate. This paper reviews the technical development and preliminary analysis of the CARE. It also provides insights into key considerations when translating standard psychological screeners onto computerized platforms.
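The gradient-boosting comparison mentioned in the abstract can be pictured with a short sketch. This is not the authors' code: the file name, column names, and risk labels below are hypothetical stand-ins for the unpublished study data. Only the general approach follows the abstract, namely training a gradient boosting tree classifier on CARE subtest scores and checking how well it reproduces DIBELS screening outcomes for the same pupils.

```python
# Illustrative sketch only; data layout and column names are assumptions,
# not taken from the paper.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per pupil, up to 16 CARE subtest scores plus the
# DIBELS risk category recorded for the same pupil.
df = pd.read_csv("care_dibels_scores.csv")
care_features = [c for c in df.columns if c.startswith("care_subtest_")]
X = df[care_features]
y = df["dibels_risk_category"]  # e.g. "at_risk" vs. "on_track"

# Gradient boosting trees, as named in the abstract; default hyperparameters.
clf = GradientBoostingClassifier(random_state=0)

# Cross-validated accuracy gives a rough sense of how closely CARE scores
# correspond to the DIBELS screening outcome.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
```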
Main authors: William H. Hoskins, William I. Hobbs, Michael J. Eason, Scott Decker, Jijun Tang
Format: article
Language: EN
Published: Elsevier, 2021
Published in: Computers in Human Behavior Reports, Vol 4, 100123 (2021)
ISSN: 2451-9588
DOI: 10.1016/j.chbr.2021.100123
Subjects: Applied computing; Social and behavioral sciences; Psychology; Education/learning; Reading assessment; Dyslexia; Electronic computers. Computer science (QA75.5-76.95); Psychology (BF1-990)
Online access: https://doaj.org/article/719b547d0f8f4f38a0545740619d1d4e
Full text: http://www.sciencedirect.com/science/article/pii/S2451958821000713