UJA Human activity recognition multi-occupancy dataset
Activity Recognition Systems (ARS) are proposed to improve the quality of human life. An ARS uses predictive models to identify the activities that individuals perform in different environments. Under data-driven approaches, these models are trained and tested in experimental environments from datasets that contain data collected from heterogeneous information sources.
- Authors:
De-La-Hoz-Franco, Emiro
Bernal Monroy, E.R
Ariza Colpas, P
Mendoza Palechor, F
Estevez, M.E.
- Resource type:
- Journal article
- Publication date:
- 2021
- Institution:
- Corporación Universidad de la Costa
- Repository:
- REDICUC - Repositorio CUC
- Language:
- eng
- OAI Identifier:
- oai:repositorio.cuc.edu.co:11323/8495
- Online access:
- https://hdl.handle.net/11323/8495
- https://repositorio.cuc.edu.co/
- Keywords:
- Acceleration of the inhabitant
- Intelligent floor for location
- Proximity and binary sensors
- Rights
- openAccess
- License
- CC0 1.0 Universal
- Abstract:
- Activity Recognition Systems (ARS) are proposed to improve the quality of human life. An ARS uses predictive models to identify the activities that individuals perform in different environments. Under data-driven approaches, these models are trained and tested in experimental environments from datasets that contain data collected from heterogeneous information sources. When several people interact in the environment from which data are collected (multi-occupancy), identifying the activities performed by each individual within a time window is not a trivial task. In addition, there is a lack of datasets generated from different data sources that allow systems to be evaluated from both an individual and a collective perspective. This paper presents the SaMO-UJA dataset, which contains single- and multi-occupancy activities collected in the UJAmI (University of Jaén Ambient Intelligence, Spain) Smart Lab. The main contribution of this work is a dataset that includes a new generation of sensors as information sources (acceleration of the inhabitant, an intelligent floor for location, and proximity and binary sensors), providing an excellent tool for addressing multi-occupancy in smart environments.
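The abstract identifies assigning activities to each individual within a time window as the central difficulty of multi-occupancy recognition. As a minimal, hypothetical sketch (the sensor names, inhabitant labels, and window size below are illustrative, not taken from the SaMO-UJA dataset), heterogeneous timestamped sensor events could be grouped into fixed-size time windows before any per-inhabitant labeling:

```python
from collections import defaultdict


def segment_windows(events, window_s=30):
    """Group (timestamp_s, sensor_id, inhabitant) events into
    fixed-size, non-overlapping time windows keyed by window index."""
    windows = defaultdict(list)
    for t, sensor, who in events:
        # Integer division maps each timestamp onto its window index.
        windows[int(t // window_s)].append((sensor, who))
    return dict(windows)


# Toy event stream: two inhabitants triggering heterogeneous sensors
# (binary door sensor, smart-floor tile, wearable accelerometer).
events = [
    (3.0, "binary:door", "A"),
    (12.5, "floor:tile_07", "B"),
    (41.0, "accel:wrist", "A"),
]

by_window = segment_windows(events)
# Window 0 holds the first two events; window 1 holds the third.
```

Within each window, a classifier would still need to disentangle which events belong to which inhabitant — the problem the dataset's multiple, complementary sensor sources are meant to support.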
- Date issued:
- 2021-06
- ISSN:
- 1530-1605
- License URI:
- http://creativecommons.org/publicdomain/zero/1.0/
- Format:
- application/pdf
- Published in:
- Proceedings of the 54th Hawaii International Conference on System Sciences