Modified Spatio-Temporal Matched Filtering for Brain Responses Classification

In this article, we apply the method of spatio-temporal filtering (STF) to electroencephalographic (EEG) data processing for brain response classification. The method operates similarly to linear discriminant analysis (LDA) but, contrary to most applied classifiers, it uses the whole recorded EEG signal as a source of information instead of only the precisely selected brain responses. In this way it avoids the limitations of LDA and improves classification accuracy. We emphasize the significance of the STF learning phase. To preclude the negative influence of super-Gaussian artifacts on this phase, we apply a discrete cosine transform (DCT) based method for their rejection. We then estimate the noise covariance matrix using all available data, and we improve the STF template construction. Further modifications concern the operation of the constructed filters and consist of changes to the STF interpretation rules. As a result, a new tool for evoked potential (EP) classification has been developed. Applied to the analysis of signals stored in a publicly available database prepared for the assessment of modern algorithms for EP detection (within the 2019 IFMBE Scientific Challenge), the method achieved the second-best result, very close to the best one and significantly better than the results of the other contestants of the challenge.
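The spatio-temporal matched filter at the heart of the method can be sketched in a few lines: a multichannel response template is whitened by an estimated noise covariance matrix, and each epoch is scored by correlation with the resulting filter. A minimal NumPy illustration with synthetic data (the template, covariance estimate, and all sizes are hypothetical, not the authors' actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ch, n_t = 4, 25                       # channels, samples per epoch (hypothetical)

# Hypothetical response template (in practice: built from training epochs)
template = np.outer(np.array([1.0, -1.0, 0.5, 0.25]), np.hanning(n_t))
s = template.ravel()

# Noise covariance estimated from background (non-response) EEG segments,
# lightly regularized to guarantee invertibility
noise = rng.standard_normal((300, n_ch * n_t))
R = np.cov(noise, rowvar=False) + 0.1 * np.eye(n_ch * n_t)

# Matched-filter weights: w proportional to R^{-1} s
w = np.linalg.solve(R, s)

def score(epoch):
    """Detection statistic for one epoch (channels x time)."""
    return float(w @ epoch.ravel())

target = template + 0.5 * rng.standard_normal((n_ch, n_t))   # response + noise
nontarget = 0.5 * rng.standard_normal((n_ch, n_t))           # noise only
print(score(target) > score(nontarget))
```

Epochs whose score exceeds a learned threshold would be classified as containing the evoked response; the paper's contribution concerns how the template, the covariance estimate, and the interpretation rules are constructed, which this sketch does not reproduce.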

Authors:
Kotas, Marian P.
Piela, Michal
Contreras-Ortiz, Sonia H.
Resource type:
Publication date:
2022
Institution:
Universidad Tecnológica de Bolívar
Repository:
Repositorio Institucional UTB
Language:
eng
OAI Identifier:
oai:repositorio.utb.edu.co:20.500.12585/11134
Online access:
https://hdl.handle.net/20.500.12585/11134
Keywords:
Brain–computer interfaces (BCI)
Discrete cosine transform (DCT)
Generalized matched filtering (GMF)
Spatio-temporal filtering (STF)
Visual evoked potentials (EPs)
LEMB
Rights
openAccess
License
http://creativecommons.org/licenses/by-nc-nd/4.0/
description In this article, we apply the method of spatio-temporal filtering (STF) to electroencephalographic (EEG) data processing for brain response classification. The method operates similarly to linear discriminant analysis (LDA) but, contrary to most applied classifiers, it uses the whole recorded EEG signal as a source of information instead of only the precisely selected brain responses. In this way it avoids the limitations of LDA and improves classification accuracy. We emphasize the significance of the STF learning phase. To preclude the negative influence of super-Gaussian artifacts on this phase, we apply a discrete cosine transform (DCT) based method for their rejection. We then estimate the noise covariance matrix using all available data, and we improve the STF template construction. Further modifications concern the operation of the constructed filters and consist of changes to the STF interpretation rules. As a result, a new tool for evoked potential (EP) classification has been developed. Applied to the analysis of signals stored in a publicly available database prepared for the assessment of modern algorithms for EP detection (within the 2019 IFMBE Scientific Challenge), the method achieved the second-best result, very close to the best one and significantly better than the results of the other contestants of the challenge.
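The abstract mentions a DCT-based rejection of super-Gaussian artifacts during the learning phase but does not spell out the rule. One plausible screen, assuming impulsive artifacts stand out in the residual around a DCT low-pass fit (an illustrative guess, not the procedure from the paper; function name and thresholds are hypothetical):

```python
import numpy as np
from scipy.fft import dct, idct

def is_artifact(epoch, keep=32, k=6.0):
    """Flag an epoch whose residual around a crude DCT low-pass fit
    contains an impulsive (super-Gaussian) outlier."""
    c = dct(epoch, norm="ortho")
    c[keep:] = 0.0                            # keep only low-order DCT terms
    residual = epoch - idct(c, norm="ortho")  # what the smooth fit misses
    # Robust scale estimate of the residual (MAD scaled to ~sigma)
    mad = np.median(np.abs(residual - np.median(residual)))
    return bool(np.max(np.abs(residual)) > k * 1.4826 * mad)

rng = np.random.default_rng(1)
clean = rng.standard_normal(256)    # Gaussian background activity
spiky = clean.copy()
spiky[100] += 25.0                  # transient, strongly super-Gaussian artifact
print(is_artifact(clean), is_artifact(spiky))
```

Epochs flagged this way would simply be excluded from template and covariance estimation, so that a few large transients cannot dominate the learning phase.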
publishDate 2022
dc.date.accessioned.none.fl_str_mv 2022-10-27T21:40:54Z
dc.date.available.none.fl_str_mv 2022-10-27T21:40:54Z
dc.date.issued.none.fl_str_mv 2022-08
dc.date.submitted.none.fl_str_mv 2022-10-26
dc.type.driver.es_CO.fl_str_mv info:eu-repo/semantics/article
dc.type.hasversion.es_CO.fl_str_mv info:eu-repo/semantics/restrictedAccess
dc.type.spa.es_CO.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.identifier.citation.es_CO.fl_str_mv M. P. Kotas, M. Piela and S. H. Contreras-Ortiz, "Modified Spatio-Temporal Matched Filtering for Brain Responses Classification," in IEEE Transactions on Human-Machine Systems, vol. 52, no. 4, pp. 677-686, Aug. 2022, doi: 10.1109/THMS.2022.3168421
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/20.500.12585/11134
dc.identifier.doi.none.fl_str_mv 10.1109/THMS.2022.3168421
dc.identifier.instname.es_CO.fl_str_mv Universidad Tecnológica de Bolívar
dc.identifier.reponame.es_CO.fl_str_mv Repositorio Universidad Tecnológica de Bolívar
url https://hdl.handle.net/20.500.12585/11134
dc.language.iso.es_CO.fl_str_mv eng
language eng
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.uri.*.fl_str_mv http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.es_CO.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.cc.*.fl_str_mv Attribution-NonCommercial-NoDerivatives 4.0 Internacional
dc.format.extent.none.fl_str_mv 10 pages
dc.format.mimetype.es_CO.fl_str_mv application/pdf
dc.publisher.place.es_CO.fl_str_mv Cartagena de Indias
dc.source.es_CO.fl_str_mv IEEE Transactions on Human-Machine Systems - Vol. 52 N° 4 (2022)
institution Universidad Tecnológica de Bolívar