Affective recognition from EEG signals: an integrated data-mining approach
Emotions play an important role in human communication, interaction, and decision-making processes. Therefore, considerable efforts have been made towards the automatic identification of human emotions; in particular, electroencephalogram (EEG) signals and Data Mining (DM) techniques have been used to create models that recognize the affective states of users. However, most previous works have used clinical-grade EEG systems with at least 32 electrodes. These systems are expensive and cumbersome, and therefore unsuitable for use during normal daily activities. Smaller EEG headsets such as the Emotiv are now available and can be used during daily activities. This paper investigates the accuracy and applicability of previous affective recognition methods on data collected with an Emotiv headset while participants used a personal computer to fulfill several tasks. Several features were extracted from four channels only (AF3, AF4, F3 and F4, in accordance with the 10–20 system). Both Support Vector Machine and Naïve Bayes classifiers were used for emotion classification. Results demonstrate that such methods can be used to accurately detect emotions with a small EEG headset during a normal daily activity. (An illustrative feature-extraction and classification sketch based on this description follows this record summary.)
- Authors:
Mendoza Palechor, Fabio Enrique
Recena Menezes, Maria Luiza
Sant’anna, Anita
Ortiz Barrios, Miguel Angel
Samara, Anas
Galway, Leo
- Resource type:
- Journal article
- Publication date:
- 2018
- Institution:
- Corporación Universidad de la Costa
- Repository:
- REDICUC - Repositorio CUC
- Language:
- eng
- OAI Identifier:
- oai:repositorio.cuc.edu.co:11323/1747
- Online access:
- https://hdl.handle.net/11323/1747
https://doi.org/10.1007/s12652-018-1065-z
https://repositorio.cuc.edu.co/
- Keywords:
- Affective computing
Affective recognition
Data Mining (DM)
Electroencephalogram (EEG)
Statistical features
- Rights
- openAccess
- License
- Attribution – NonCommercial – ShareAlike (Atribución – No comercial – Compartir igual)
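The abstract describes a pipeline that extracts statistical features from four EEG channels (AF3, AF4, F3 and F4) and classifies affective states with Support Vector Machine and Naïve Bayes models. The sketch below illustrates that kind of pipeline in outline only: it is not the authors' code, and the specific feature set (mean, standard deviation, skewness, kurtosis), the window size, the synthetic data, and the scikit-learn setup are assumptions made for illustration.

```python
# Minimal sketch (not the authors' implementation): per-channel statistical
# features from 4-channel EEG windows, classified with SVM and Naive Bayes.
# Channel names follow the paper; everything else here is an assumption.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

CHANNELS = ["AF3", "AF4", "F3", "F4"]  # 10-20 positions used in the paper


def window_features(window: np.ndarray) -> np.ndarray:
    """Flatten one EEG window of shape (n_channels, n_samples) into a
    feature vector of per-channel mean, std, skewness and kurtosis.
    The exact feature set in the paper may differ."""
    feats = [
        window.mean(axis=1),
        window.std(axis=1),
        skew(window, axis=1),
        kurtosis(window, axis=1),
    ]
    return np.concatenate(feats)


# Hypothetical data: 200 windows of 4-channel EEG, 256 samples per window
# (roughly 2 s at an assumed 128 Hz rate), with binary affect labels.
rng = np.random.default_rng(0)
eeg_windows = rng.standard_normal((200, len(CHANNELS), 256))
labels = rng.integers(0, 2, size=200)

X = np.vstack([window_features(w) for w in eeg_windows])

for name, clf in [
    ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
    ("Naive Bayes", GaussianNB()),
]:
    scores = cross_val_score(clf, X, labels, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```

In practice, the synthetic arrays would be replaced by windowed recordings from the Emotiv headset and by labels derived from the experimental protocol described in the paper.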
id | RCUC2_b758a40c58264ae6d738c1d919f481c0
oai_identifier_str | oai:repositorio.cuc.edu.co:11323/1747
network_acronym_str | RCUC2
network_name_str | REDICUC - Repositorio CUC
dc.title.eng.fl_str_mv | Affective recognition from EEG signals: an integrated data-mining approach
dc.creator.fl_str_mv | Mendoza Palechor, Fabio Enrique; Recena Menezes, Maria Luiza; Sant’anna, Anita; Ortiz Barrios, Miguel Angel; Samara, Anas; Galway, Leo
dc.contributor.author.spa.fl_str_mv | Mendoza Palechor, Fabio Enrique; Recena Menezes, Maria Luiza; Sant’anna, Anita; Ortiz Barrios, Miguel Angel; Samara, Anas; Galway, Leo
dc.subject.eng.fl_str_mv | Affective computing; Affective recognition; Data Mining (DM); Electroencephalogram (EEG); Statistical features
description | Emotions play an important role in human communication, interaction, and decision-making processes. Therefore, considerable efforts have been made towards the automatic identification of human emotions; in particular, electroencephalogram (EEG) signals and Data Mining (DM) techniques have been used to create models that recognize the affective states of users. However, most previous works have used clinical-grade EEG systems with at least 32 electrodes. These systems are expensive and cumbersome, and therefore unsuitable for use during normal daily activities. Smaller EEG headsets such as the Emotiv are now available and can be used during daily activities. This paper investigates the accuracy and applicability of previous affective recognition methods on data collected with an Emotiv headset while participants used a personal computer to fulfill several tasks. Several features were extracted from four channels only (AF3, AF4, F3 and F4, in accordance with the 10–20 system). Both Support Vector Machine and Naïve Bayes classifiers were used for emotion classification. Results demonstrate that such methods can be used to accurately detect emotions with a small EEG headset during a normal daily activity.
publishDate | 2018
dc.date.accessioned.none.fl_str_mv | 2018-11-23T16:10:24Z
dc.date.available.none.fl_str_mv | 2018-11-23T16:10:24Z
dc.date.issued.none.fl_str_mv | 2018
dc.type.spa.fl_str_mv | Artículo de revista (journal article)
dc.type.coar.fl_str_mv | http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.coar.spa.fl_str_mv | http://purl.org/coar/resource_type/c_6501
dc.type.content.spa.fl_str_mv | Text
dc.type.driver.spa.fl_str_mv | info:eu-repo/semantics/article
dc.type.redcol.spa.fl_str_mv | http://purl.org/redcol/resource_type/ART
dc.type.version.spa.fl_str_mv | info:eu-repo/semantics/acceptedVersion
format | http://purl.org/coar/resource_type/c_6501
status_str | acceptedVersion
dc.identifier.issn.spa.fl_str_mv | 18685137
dc.identifier.uri.spa.fl_str_mv | https://hdl.handle.net/11323/1747
dc.identifier.doi.spa.fl_str_mv | https://doi.org/10.1007/s12652-018-1065-z
dc.identifier.instname.spa.fl_str_mv | Corporación Universidad de la Costa
dc.identifier.reponame.spa.fl_str_mv | REDICUC - Repositorio CUC
dc.identifier.repourl.spa.fl_str_mv | https://repositorio.cuc.edu.co/
identifier_str_mv | 18685137; Corporación Universidad de la Costa; REDICUC - Repositorio CUC
url | https://hdl.handle.net/11323/1747; https://doi.org/10.1007/s12652-018-1065-z; https://repositorio.cuc.edu.co/
dc.language.iso.none.fl_str_mv | eng
language | eng
dc.rights.spa.fl_str_mv | Atribución – No comercial – Compartir igual (Attribution – NonCommercial – ShareAlike)
dc.rights.accessrights.spa.fl_str_mv | info:eu-repo/semantics/openAccess
dc.rights.coar.spa.fl_str_mv | http://purl.org/coar/access_right/c_abf2
rights_invalid_str_mv | Atribución – No comercial – Compartir igual (Attribution – NonCommercial – ShareAlike); http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv | openAccess
dc.publisher.spa.fl_str_mv | Journal of Ambient Intelligence and Humanized Computing
institution | Corporación Universidad de la Costa
bitstream.url.fl_str_mv | https://repositorio.cuc.edu.co/bitstreams/b92da808-37d9-4a94-be34-88dd0365d728/download; https://repositorio.cuc.edu.co/bitstreams/dd9c088c-77d4-43ee-8d73-4b2f7d21eb2c/download; https://repositorio.cuc.edu.co/bitstreams/19af6a64-6b37-4b5c-a388-1a78e02461be/download; https://repositorio.cuc.edu.co/bitstreams/48c43be5-fe8d-4975-ae77-3fd4f767345f/download
bitstream.checksum.fl_str_mv | fd9b4a51899764fbeab44551c706f3db; 8a4605be74aa9ea9d79846c1fba20a33; cab28bac1837d2c3d114db1966de5461; 236996fd93b5f6553f167c905cac2906
bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5; MD5; MD5
repository.name.fl_str_mv | Repositorio de la Universidad de la Costa CUC
repository.mail.fl_str_mv | repdigital@cuc.edu.co
_version_ | 1811760759195041792