
A Deep Learning-based Smart Healthcare System for Patient’s Discomfort Detection at the Edge of Internet of Things / Ahmed, I.; Jeon, G.; Piccialli, F.. - In: IEEE INTERNET OF THINGS JOURNAL. - ISSN 2327-4662. - (2021). [10.1109/JIOT.2021.3052067]

A Deep Learning-based Smart Healthcare System for Patient’s Discomfort Detection at the Edge of Internet of Things

Piccialli F.
2021

Abstract

The Internet of Things (IoT) widely supports the smart healthcare field; combined with computer vision, machine learning, and deep learning techniques, it enables fast and accurate services for automated patient discomfort monitoring and detection. Traditional patient monitoring systems typically rely on wearable sensors or vision-based methods. In this paper, an IoT-based, non-invasive, automated patient discomfort monitoring and detection system is presented and implemented using a deep learning algorithm. The system relies on an IP camera, so the patient's body movement and posture are detected without any wearable devices. The Mask R-CNN method is employed to extract key points on the patient's body, which are then grouped into six major body parts using association rules from data mining. To analyze discomfort, the coordinates of the detected key points are tracked over time, and distance and temporal thresholds are applied to classify each movement as either normal or indicative of discomfort. The key point information is also used to determine the posture of the patient lying in bed. The patient's body position and posture are continuously monitored, and comfort and discomfort levels are discriminated on that basis. For experimental evaluation, video sequences covering two patient beds were recorded. The results demonstrate the effectiveness of the proposed system, which achieves a true-positive rate of 94% and a false-positive rate of 7%.
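The abstract outlines a pipeline in which per-frame key points are grouped into body regions and movements are classified by distance and temporal thresholds. The sketch below is a minimal illustration of that last step only, under assumptions not stated in the paper: the region names, the use of mean Euclidean displacement between consecutive frames, and the example threshold values are all hypothetical, and the actual Mask R-CNN key point extraction and association-rule grouping are omitted.

```python
import numpy as np

# Hypothetical grouping of key points into six body regions
# (the paper only states "six major body parts"; these names are assumed).
BODY_REGIONS = ["head", "torso", "left_arm", "right_arm", "left_leg", "right_leg"]


def region_displacement(prev_kpts, curr_kpts):
    """Mean Euclidean displacement of a region's key points between two frames.

    prev_kpts, curr_kpts: arrays of shape (K, 2) holding (x, y) coordinates.
    """
    return float(np.linalg.norm(curr_kpts - prev_kpts, axis=1).mean())


def classify_discomfort(frames, dist_thresh=15.0, time_thresh=30):
    """Label each frame transition as 'normal' or 'discomfort'.

    frames: list of dicts mapping region name -> (K, 2) key point array.
    dist_thresh: minimum displacement (pixels) for a region to count as
                 moving (illustrative value, not taken from the paper).
    time_thresh: number of consecutive moving frames required before the
                 movement is flagged as discomfort (illustrative value).
    """
    labels = []
    consecutive_moving = 0
    for prev, curr in zip(frames, frames[1:]):
        # A frame counts as "moving" if any region exceeds the distance threshold.
        moving = any(
            region_displacement(prev[r], curr[r]) > dist_thresh
            for r in BODY_REGIONS
        )
        consecutive_moving = consecutive_moving + 1 if moving else 0
        # Sustained movement beyond the temporal threshold is treated as discomfort.
        labels.append("discomfort" if consecutive_moving >= time_thresh else "normal")
    return labels
```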

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/834853
Citations
  • PubMed Central: ND
  • Scopus: 64
  • Web of Science (ISI): 36