Real-time breath recognition by movies from a small drone landing on victim’s bodies

Abstract In local and global disaster scenes, rapid recognition of victims’ breathing is vital. It is unclear whether the footage transmitted from small drones can enable medical providers to detect breathing. This study investigated the ability of small drones to evaluate breathing correctly after landing on victims’ bodies and hovering over them.


Bibliographic Details
Main Authors: Takeji Saitoh, Yoshiaki Takahashi, Hisae Minami, Yukako Nakashima, Shuhei Aramaki, Yuki Mihara, Takamasa Iwakura, Keiichiro Odagiri, Yuichiro Maekawa, Atsuto Yoshino
Format: article
Language: EN
Published: Nature Portfolio 2021
Subjects: Medicine (R); Science (Q)
Online Access: https://doaj.org/article/13120739c22b4ade86aab12c716df354
id oai:doaj.org-article:13120739c22b4ade86aab12c716df354
record_format dspace
spelling oai:doaj.org-article:13120739c22b4ade86aab12c716df354 2021-12-02T13:20:22Z
Title: Real-time breath recognition by movies from a small drone landing on victim’s bodies
DOI: 10.1038/s41598-021-84575-1
ISSN: 2045-2322
Published online: 2021-03-01
URL: https://doi.org/10.1038/s41598-021-84575-1
Journal: Scientific Reports, Vol 11, Iss 1, Pp 1-7 (2021)
Publisher: Nature Portfolio
Subjects: Medicine (R); Science (Q)
Language: EN
description Abstract In local and global disaster scenes, rapid recognition of victims’ breathing is vital. It is unclear whether the footage transmitted from small drones can enable medical providers to detect breathing. This study investigated the ability of small drones to evaluate breathing correctly after landing on victims’ bodies and hovering over them. We enrolled 46 medical workers in this prospective, randomized, crossover study. The participants were provided with envelopes, from which they were asked to pull four notes sequentially and follow the written instructions (“breathing” and “no breathing”). After they lay on the ground in the supine position, a drone was landed on their abdomen and then hovered over them. Two evaluators were asked to determine whether the participant had followed the “breathing” or “no breathing” instruction based on the real-time footage transmitted from the drone camera. The same experiment was performed while the participant was in the prone position. If both evaluators were able to determine the participant’s breathing status correctly, the results were tagged as “correct.” All experiments were successfully performed. Breathing was correctly determined in all 46 participants (100%) when the drone was landed on the abdomen and in 19 participants when the drone hovered over them while they were in the supine position (p < 0.01). In the prone position, breathing was correctly determined in 44 participants when the drone was landed on the abdomen and in 10 participants when it was kept hovering over them (p < 0.01). Notably, breathing status was misinterpreted as “no breathing” in 8 out of 27 (29.6%) participants lying in the supine position and in 13 out of 36 (36.1%) participants lying in the prone position when the drone was kept hovering over them. The feasible landing area appeared wider laterally when the participants were in the supine position than when they were in the prone position. Breathing status was more reliably determined when a small drone was landed on an individual’s body than when it hovered over them.