| Name: | Description: | Size: | Format: |
|---|---|---|---|
| | | 749.74 KB | Adobe PDF |
Abstract
The identification of Activities of Daily Living (ADL) is closely tied to the recognition of the user’s
environment. This detection can be performed with the standard sensors present in everyday mobile
devices. On the one hand, the main proposal is to recognize the user’s environment and standing
activities. On the other hand, these features are included in a framework for ADL and environment
identification. The paper is therefore divided into two parts: first, acoustic sensors are used to
collect data for environment recognition; second, the recognized environment is fused with the
information gathered by motion and magnetic sensors. Environment and ADL recognition are performed
with pattern recognition techniques, with the aim of developing a system that covers data collection,
processing, fusion, and classification. The classification stage compares distinct types of Artificial
Neural Networks (ANN), analyzing various ANN implementations and selecting the most suitable one for
each stage of the developed system. The results show 85.89% accuracy using Deep Neural Networks (DNN)
with normalized data for ADL recognition and 86.50% accuracy using Feedforward Neural Networks (FNN)
with non-normalized data for environment recognition. Furthermore, the tests conducted show 100%
accuracy for standing-activity recognition using DNN with normalized data, which is the most suitable
approach for the intended purpose.
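The abstract's classification stage (normalization followed by a feedforward/deep neural network over sensor features) can be sketched as below. This is a minimal illustration with synthetic data, not the paper's implementation: the feature vectors, class structure, network sizes, and training settings are all assumptions made for the example.

```python
# Sketch of the pipeline described in the abstract: normalize sensor
# features, then classify with a small deep feedforward network.
# All data here is synthetic and illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for fused motion + magnetic + acoustic feature
# vectors, with three mock ADL classes.
n_per_class, n_features = 100, 8
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Normalization step (the paper compares normalized vs. non-normalized
# inputs; here we fit the scaler on the training split only).
scaler = StandardScaler().fit(X_tr)

# A small deep feedforward network (two hidden layers) as classifier.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                    random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)

accuracy = clf.score(scaler.transform(X_te), y_te)
print(f"accuracy: {accuracy:.2f}")
```

Dropping the `StandardScaler` step reproduces the "non-normalized" variant the abstract compares against; on real sensor features the choice can shift accuracy noticeably, as the reported ADL and environment results suggest.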
Description
This work is funded by FCT/MEC through national funds and, when applicable, co-funded by the
FEDER-PT2020 partnership agreement under project UID/EEA/50008/2019. This article is based upon work
from COST Action IC1303-AAPELE (Architectures, Algorithms and Protocols for Enhanced Living
Environments) and COST Action CA16226-SHELD-ON (Indoor living space improvement: Smart Habitat for
the Elderly), supported by COST (European Cooperation in Science and Technology). More information at
www.cost.eu.
Keywords
Activities of daily living (ADL); Data fusion; Environments; Feature extraction; Pattern recognition; Sensors
Citation
PIRES, Ivan M. [et al.] (2019) - Recognition of activities of daily living and environments using acoustic sensors embedded on mobile devices. Electronics. 8:12, 1499. DOI: https://doi.org/10.3390/electronics8121499
Publisher
MDPI