Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection

illustrations, diagrams

Authors:
Gallego-Mejia, Joseph A.
Resource type:
Doctoral thesis
Publication date:
2023
Institution:
Universidad Nacional de Colombia
Repository:
Universidad Nacional de Colombia
Language:
eng
OAI Identifier:
oai:repositorio.unal.edu.co:unal/84781
Online access:
https://repositorio.unal.edu.co/handle/unal/84781
https://repositorio.unal.edu.co/
Keywords:
000 - Ciencias de la computación, información y obras generales
Aprendizaje profundo
Redes Neurales de la Computación
Deep Learning
Neural Networks, Computer
Kernel density estimation
Kernel methods
Deep Learning
Random Fourier Features
Machine Learning
Deep Kernel
Large-scale learning
Kernel Density Estimation Approximation
Neural Density Estimation
Density Matrix
Estimación de la densidad del núcleo
Métodos del núcleo
Aprendizaje profundo
Aprendizaje a gran escala
Aproximaciones de la estimación de la densidad del núcleo
Matriz de densidad
Estimación de la densidad neuronal
Rights
openAccess
License
Atribución-NoComercial-SinDerivadas 4.0 Internacional
id UNACIONAL2_fa40955fee40a551d9a658682b6ad07f
oai_identifier_str oai:repositorio.unal.edu.co:unal/84781
network_acronym_str UNACIONAL2
network_name_str Universidad Nacional de Colombia
repository_id_str
dc.title.eng.fl_str_mv Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
dc.title.translated.spa.fl_str_mv Estimación neuronal no paramétrica eficiente de la densidad y su aplicación a la detección de valores atípicos y anomalías
title Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
spellingShingle Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
000 - Ciencias de la computación, información y obras generales
Aprendizaje profundo
Redes Neurales de la Computación
Deep Learning
Neural Networks, Computer
Kernel density estimation
Kernel methods
Deep Learning
Random Fourier Features
Machine Learning
Deep Kernel
Large-scale learning
Kernel Density Estimation Approximation
Neural Density Estimation
Density Matrix
Estimación de la densidad del núcleo
Métodos del núcleo
Aprendizaje profundo
Aprendizaje a gran escala
Aproximaciones de la estimación de la densidad del núcleo
Matriz de densidad
Estimación de la densidad neuronal
title_short Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
title_full Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
title_fullStr Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
title_full_unstemmed Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
title_sort Efficient Non-Parametric Neural Density Estimation and Its Application to Outlier and Anomaly Detection
dc.creator.fl_str_mv Gallego-Mejia, Joseph A.
dc.contributor.advisor.none.fl_str_mv González, Fabio A.
dc.contributor.author.none.fl_str_mv Gallego-Mejia, Joseph A.
dc.contributor.researchgroup.spa.fl_str_mv Mindlab
dc.contributor.orcid.spa.fl_str_mv Gallego-Mejia, Joseph A. [0000-0001-8971-4998]
dc.contributor.researchgate.spa.fl_str_mv https://www.researchgate.net/profile/Joseph-Gallego-Mejia
dc.contributor.googlescholar.spa.fl_str_mv https://scholar.google.cl/citations?user=DS0IfX4AAAAJ&hl=es&oi=ao
dc.subject.ddc.spa.fl_str_mv 000 - Ciencias de la computación, información y obras generales
topic 000 - Ciencias de la computación, información y obras generales
Aprendizaje profundo
Redes Neurales de la Computación
Deep Learning
Neural Networks, Computer
Kernel density estimation
Kernel methods
Deep Learning
Random Fourier Features
Machine Learning
Deep Kernel
Large-scale learning
Kernel Density Estimation Approximation
Neural Density Estimation
Density Matrix
Estimación de la densidad del núcleo
Métodos del núcleo
Aprendizaje profundo
Aprendizaje a gran escala
Aproximaciones de la estimación de la densidad del núcleo
Matriz de densidad
Estimación de la densidad neuronal
dc.subject.decs.spa.fl_str_mv Aprendizaje profundo
Redes Neurales de la Computación
dc.subject.decs.eng.fl_str_mv Deep Learning
Neural Networks, Computer
dc.subject.proposal.eng.fl_str_mv Kernel density estimation
Kernel methods
Deep Learning
Random Fourier Features
Machine Learning
Deep Kernel
Large-scale learning
Kernel Density Estimation Approximation
Neural Density Estimation
dc.subject.proposal.none.fl_str_mv Density Matrix
dc.subject.proposal.spa.fl_str_mv Estimación de la densidad del núcleo
Métodos del núcleo
Aprendizaje profundo
Aprendizaje a gran escala
Aproximaciones de la estimación de la densidad del núcleo
Matriz de densidad
Estimación de la densidad neuronal
description illustrations, diagrams
publishDate 2023
dc.date.accessioned.none.fl_str_mv 2023-10-06T20:45:17Z
dc.date.available.none.fl_str_mv 2023-10-06T20:45:17Z
dc.date.issued.none.fl_str_mv 2023-10-05
dc.type.spa.fl_str_mv Trabajo de grado - Doctorado
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/doctoralThesis
dc.type.version.spa.fl_str_mv info:eu-repo/semantics/acceptedVersion
dc.type.coar.spa.fl_str_mv http://purl.org/coar/resource_type/c_db06
dc.type.content.spa.fl_str_mv Text
dc.type.redcol.spa.fl_str_mv http://purl.org/redcol/resource_type/TD
format http://purl.org/coar/resource_type/c_db06
status_str acceptedVersion
dc.identifier.uri.none.fl_str_mv https://repositorio.unal.edu.co/handle/unal/84781
dc.identifier.instname.spa.fl_str_mv Universidad Nacional de Colombia
dc.identifier.reponame.spa.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl.spa.fl_str_mv https://repositorio.unal.edu.co/
url https://repositorio.unal.edu.co/handle/unal/84781
https://repositorio.unal.edu.co/
identifier_str_mv Universidad Nacional de Colombia
Repositorio Institucional Universidad Nacional de Colombia
dc.language.iso.spa.fl_str_mv eng
language eng
dc.relation.references.spa.fl_str_mv Kdd cup dataset, 1999. http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.html.
Jerone T. A. Andrews, Edward J Morton, and Lewis D Griffin. Detecting anomalous data using auto-encoders. International Journal of Machine Learning and Computing, 2016.
Charu C Aggarwal. Outlier analysis second edition, 2016.
Mohiuddin Ahmed, Abdun Naser Mahmood, and Md Rafiqul Islam. A survey of anomaly detection techniques in financial domain. Future Generation Computer Systems, 55:278–288, 2016
Jinwon An and Sungzoon Cho. Variational autoencoder based anomaly detection using reconstruction probability. Special Lecture on IE, 2(1):1–18, 2015
Tessa K. Anderson. Kernel density estimation and K-means clustering to profile road accident hotspots. Accident Analysis and Prevention, 41(3):359–364, may 2009. ISSN 00014575. doi: 10.1016/j.aap.2008.12.014.
Jerone T A Andrews, Thomas Tanay, Edward J Morton, and Lewis D Griffin. Transfer representation-learning for anomaly detection, 2016. URL http://www.vlfeat.org/matconvnet.
Fabrizio Angiulli and Fabio Fassetti. Detecting distance-based outliers in streams of data. pages 811–820, 2007. ISBN 9781595938039. doi: 10.1145/1321440.1321552.
Haim Avron, Vikas Sindhwani, Jiyan Yang, and Michael W. Mahoney. Quasi-monte carlo feature maps for shift-invariant kernels. Journal of Machine Learning Research, 17(120):1–38, 2016. URL http://jmlr.org/papers/v17/14-538.html.
Francis R Bach and Michael I Jordan. Predictive low-rank decomposition for kernel methods. In ICML 2005 - Proceedings of the 22nd International Conference on Machine Learning, pages 33–40, 2005. ISBN 1595931805. doi: 10.1145/1102351.1102356
Arturs Backurs, Piotr Indyk, and Tal Wagner. Space and time efficient kernel density estimation in high dimensions. volume 32, 2019
Sivaraman Balakrishnan, Srivatsan Narayanan, Alessandro Rinaldo, Aarti Singh, and Larry Wasserman. Cluster trees on manifolds. arXiv preprint arXiv:1307.6515, 2013.
David M Bashtannyk and Rob J Hyndman. Bandwidth selection for kernel conditional density estimation, 2001. URL www.elsevier.com/locate/csda.
Yoshua Bengio and Samy Bengio. Modeling high-dimensional discrete data with multilayer neural networks. Advances in Neural Information Processing Systems, 12, 1999.
Siddharth Bhatia, Arjit Jain, Pan Li, Ritesh Kumar, and Bryan Hooi. Mstream: Fast anomaly detection in multi-aspect streams. pages 3371–3382. Association for Computing Machinery, Inc, 4 2021. ISBN 9781450383127. doi: 10.1145/3442381.3450023
Siddharth Bhatia, Arjit Jain, Shivin Srivastava, Kenji Kawaguchi, and Bryan Hooi. Memstream: Memory-based streaming anomaly detection. WWW 2022 - Proceedings of the ACM Web Conference 2022, pages 610–621, 2022. doi: 10.1145/3485447.3512221.
Peter J Bickel and Kjell A Doksum. Mathematical statistics: basic ideas and selected topics, volumes I-II package. CRC Press, 2015
Christopher M Bishop and Nasser M Nasrabadi. Pattern recognition and machine learning, volume 4. Springer, 2006
Giuseppe Borruso. Network Density Estimation: A GIS Approach for Analysing Point Patterns in a Network Space. Transactions in GIS, 12(3):377–402, 2008. ISSN 1361- 1682
Markus M Breunig, Hans-Peter Kriegel, Raymond T Ng, and Jörg Sander. Lof: identifying density-based local outliers. In Proceedings of the 2000 ACM SIGMOD international conference on Management of data, pages 93–104, 2000.
Brian Bullins, Cyril Zhang, and Yi Zhang. Not-so-random features. 10 2017. URL http://arxiv.org/abs/1710.10230
Oscar Bustos-Brinez, Joseph Gallego-Mejia, and Fabio A González. Ad-dmkde: Anomaly detection through density matrices and fourier features. arXiv preprint arXiv:2210.14796, 2022.
Raghavendra Chalapathy and Sanjay Chawla. Deep learning for anomaly detection: A survey. 1 2019. URL http://arxiv.org/abs/1901.03407.
Raghavendra Chalapathy, Aditya Krishna Menon, and Sanjay Chawla. Anomaly detection using one-class neural networks. arXiv preprint arXiv:1802.06360, 2018.
Varun Chandola. Anomaly detection : A survey, 2009
Moses Charikar and Paris Siminelakis. Hashing-based-estimators for kernel density in high dimensions. In 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS), pages 1032–1043. IEEE, 2017.
Sotirios P. Chatzis, Dimitrios Korkinof, and Yiannis Demiris. A quantum-statistical approach toward robot learning by demonstration. IEEE Transactions on Robotics, 28:1371–1381, 2012. ISSN 15523098. doi: 10.1109/TRO.2012.2203055.
Frédéric Chazal, Brittany Fasy, Fabrizio Lecci, Bertrand Michel, Alessandro Rinaldo, and Larry Wasserman. Robust topological inference: Distance to a measure and kernel distance. The Journal of Machine Learning Research, 18(1):5845–5884, 2017
Gal Chechik, Varun Sharma, Uri Shalit, and Samy Bengio. Large Scale Online Learning of Image Similarity Through Ranking. Technical report, 2010.
Yen Chi Chen. A tutorial on kernel density estimation and recent advances. Biostatistics and Epidemiology, 1:161–187, 1 2017. ISSN 24709379. doi: 10.1080/24709360.2017.1396742
Yen-Chi Chen, Christopher R Genovese, Larry Wasserman, et al. A comprehensive approach to mode clustering. Electronic Journal of Statistics, 10(1):210–241, 2016
Peitao Cheng, Yuanying Qiu, Xiumei Wang, and Ke Zhao. A New Single Image Super-Resolution Method Based on the Infinite Mixture Model. IEEE Access, 5:2228–2240, 2017. ISSN 21693536. doi: 10.1109/ACCESS.2017.2664103. URL http://arxiv.org/abs/2006.05218.
Victor Chernozhukov, Denis Chetverikov, Kengo Kato, et al. Gaussian approximation of suprema of empirical processes. Annals of Statistics, 42(4):1564–1597, 2014
Krzysztof Choromanski and Vikas Sindhwani. Recycling randomness with structure for sublinear time kernel expansions. 5 2016. URL http://arxiv.org/abs/1605.09049
N Cristianini. Kernel Methods for Pattern Analysis, volume 1. 2004. doi: 10.1198/tech.2005.s264.
Tri Dao, Christopher De Sa, and Christopher Ré. Gaussian quadrature for kernel features. Advances in neural information processing systems, 30:6109, 2017.
Philip I Davies and Nicholas J Higham. Numerically stable generation of correlation matrices and their factors. BIT Numerical Mathematics, 40(4):640–651, 2000.
Arthur P Dempster, Nan M Laird, and Donald B Rubin. Maximum likelihood from incomplete data via the em algorithm. Journal of the royal statistical society: series B (methodological), 39(1):1–22, 1977.
B Denkena, M-A Dittrich, H Noske, and M Witt. Statistical approaches for semisupervised anomaly detection in machining. Production Engineering, 14(3):385–393, 2020
Luc Devroye. Nonparametric density estimation: The L1 view, 1985.
Asimenia Dimokranitou. Adversarial autoencoders for anomalous event detection in images. PhD thesis, Purdue University, 2017
Zhiguo Ding and Minrui Fei. An anomaly detection approach based on isolation forest algorithm for streaming data using sliding window. volume 3, pages 12–17. IFAC Secretariat, 2013. ISBN 9783902823458. doi: 10.3182/20130902-3-CN-3020.00044.
Laurent Dinh, David Krueger, and Yoshua Bengio. Nice: Non-linear independent components estimation. arXiv preprint arXiv:1410.8516, 2014.
Laurent Dinh, Jascha Sohl-Dickstein, and Samy Bengio. Density estimation using real nvp. 5 2016. URL http://arxiv.org/abs/1605.08803.
Joni A Downs. Time-geographic density estimation for moving point objects, 2010.
Conor Durkan, Artur Bekasov, Iain Murray, and George Papamakarios. Neural spline flows. In Advances in Neural Information Processing Systems, volume 32, 2019
Vincent Dutordoir, Hugh Salimbeni, Marc Peter Deisenroth, and James Hensman. Gaussian process conditional density estimation. volume 2018-Decem, pages 2385– 2395, 2018.
Bradley Efron. Bootstrap methods: another look at the jackknife. In Breakthroughs in statistics, pages 569–593. Springer, 1992.
Gilberto Fernandes, Joel JPC Rodrigues, Luiz Fernando Carvalho, Jalal F Al-Muhtadi, and Mario Lemes Proença. A comprehensive survey on network anomaly detection. Telecommunication Systems, 70(3):447–489, 2019.
Chris Fraley and Adrian E. Raftery. Model-based clustering, discriminant analysis, and density estimation. Journal of the American Statistical Association, 97:611–631, 2002. ISSN 01621459. doi: 10.1198/016214502760047131.
Brendan J Frey, J Frey Brendan, and Brendan J Frey. Graphical models for machine learning and digital communication. MIT press, 1998
Joseph A. Gallego and Fabio A. González. Quantum adaptive fourier features for neural density estimation, 2022. URL https://arxiv.org/abs/2208.00564.
Joseph A Gallego, Fabio A González, and Olfa Nasraoui. Robust kernels for robust location estimation. Neurocomputing, 429:174–186, 2021
Joseph A Gallego, Juan F Osorio, and Fabio A González. Fast kernel density estimation with density matrices and random fourier features. In Advances in Artificial Intelligence–IBERAMIA 2022: 17th Ibero-American Conference on AI, Cartagena de Indias, Colombia, November 23–25, 2022, Proceedings, pages 160–172. Springer, 2023
Joseph Gallego-Mejia, Oscar Bustos-Brinez, and Fabio A González. Leandmkde: Quantum latent density estimation for anomaly detection. arXiv preprint arXiv:2211.08525, 2022
Joseph A. Gallego-Mejia and Fabio A González. Demande dataset, April 2023. URL https://doi.org/10.5281/zenodo.7822851
Joseph A Gallego-Mejia, Oscar A Bustos-Brinez, and Fabio A González. Inqmad: Incremental quantum measurement anomaly detection. In 2022 IEEE International Conference on Data Mining Workshops (ICDMW), pages 787–796. IEEE, 2022
Weihao Gao, Sreeram Kannan, Sewoong Oh, and Pramod Viswanath. Estimating mutual information for discrete-continuous mixtures. Advances in neural information processing systems, 30, 2017
Ioannis Gatopoulos, Maarten Stol, and Jakub M. Tomczak. Super-resolution Variational Auto-Encoders. jun 2020. URL http://arxiv.org/abs/2006.05218.
Christopher R Genovese, Marco Perone-Pacifico, Isabella Verdinelli, Larry Wasserman, et al. Nonparametric ridge estimation. Annals of Statistics, 42(4):1511–1545, 2014
Mathieu Germain, Karol Gregor, Iain Murray, and Hugo Larochelle. Made: Masked autoencoder for distribution estimation, 2015.
Kaan Gokcesu and Suleyman S. Kozat. Online density estimation of nonstationary sources using exponential family of distributions. IEEE Transactions on Neural Networks and Learning Systems, 29:4473–4478, 9 2018. ISSN 21622388. doi: 10.1109/TNNLS.2017.2740003.
Fabio A. González, Vladimir Vargas-Calderón, and Herbert Vinck-Posada. Supervised Learning with Quantum Measurements. 2020. URL http://arxiv.org/abs/2004.01227
Fabio A González, Vladimir Vargas-Calderón, and Herbert Vinck-Posada. Classification with quantum measurements. Journal of the Physical Society of Japan, 90(4):044002, 2021.
Fabio A. González, Alejandro Gallego, Santiago Toledo-Cortés, and Vladimir Vargas-Calderón. Learning with density matrices and random features, 2021.
Artur Gramacki. Nonparametric kernel density estimation and its computational aspects. Springer, 2018.
Alexander G Gray and Andrew W Moore. Nonparametric density estimation: Toward computational tractability. In Proceedings of the 2003 SIAM International Conference on Data Mining, pages 203–211. SIAM, 2003
Claudio Guardnaccia, Michele Grimaldi, Gabriella Graziuso, and Simona Mancini. CROWDSOURCING NOISE MAPS ANALYSIS BY MEANS OF KERNEL DENSITY ESTIMATION. pages 1691–1697, 2021. doi: 10.48465/fa.2020.0505. URL https://hal.archives-ouvertes.fr/hal-03233732.
Sudipto Guha, Nina Mishra, Gourav Roy, and Okke Schrijvers. Robust random cut forest based anomaly detection on streams, 2016.
Sahand Hariri, Matias Carrasco Kind, and Robert J. Brunner. Extended isolation forest. 11 2018. doi: 10.1109/TKDE.2019.2947676. URL http://arxiv.org/abs/1811.02141, http://dx.doi.org/10.1109/TKDE.2019.2947676.
Trevor Hastie, Robert Tibshirani, Jerome H Friedman, and Jerome H Friedman. The elements of statistical learning: data mining, inference, and prediction, volume 2. Springer, 2009
Marti A. Hearst, Susan T Dumais, Edgar Osuna, John Platt, and Bernhard Scholkopf. Support vector machines. IEEE Intelligent Systems and their applications, 13(4):18– 28, 1998.
Geoffrey E Hinton, Simon Osindero, and Yee-Whye Teh. A fast learning algorithm for deep belief nets. Neural computation, 18(7):1527–1554, 2006.
I. Sharafaldin, A. H. Lashkari, and A. A. Ghorbani. Toward generating a new intrusion detection dataset and intrusion traffic characterization, 2018.
Félix Iglesias and Tanja Zseby. Analysis of network traffic features for anomaly detection. Machine Learning, 101:59–84, 10 2015. ISSN 15730565. doi: 10.1007/s10994-014-5473-9.
Piotr Indyk and Rajeev Motwani. Approximate nearest neighbors: Towards removing the curse of dimensionality, 1998.
Marko V Jankovic. Probabilistic Approach to Neural Networks Computation Based on Quantum Probability Model Probabilistic Principal Subspace Analysis Example. 2010. URL http://arxiv.org/abs/1001.4301.
Marko V. Jankovic. Quantum Low Entropy based Associative Reasoning – QLEAR Learning, 2017. ISSN 23318422
Marko V. Jankovic and Masashi Sugiyama. Probabilistic principal component analysis based on joystick probability selector. In Proceedings of the International Joint Conference on Neural Networks, pages 1414–1421. IEEE, 2009. ISBN 9781424435531. doi: 10.1109/IJCNN.2009.5178696.
Marko V. Janković, Tomislav Gajić, and Branimir D. Reljin. Applications of probabilistic model based on main quantum mechanics concepts. In 12th Symposium on Neural Network Applications in Electrical Engineering, NEUREL 2014 - Proceedings, pages 33–36, 2014. ISBN 9781479958887. doi: 10.1109/NEUREL.2014.7011453.
J.H.M. Janssens, F. Huszar, E.O. Postma, and H.J. van den Herik. Stochastic outlier selection, 2012.
Ping Ji, Na Zhao, Shijie Hao, and Jianguo Jiang. Automatic image annotation by semisupervised manifold kernel density estimation. Information Sciences, 281:648–660, 10 2014. ISSN 00200255. doi: 10.1016/j.ins.2013.09.016
Joagg. Joaggi/demande: v1.0, March 2023. URL https://doi.org/10.5281/zenodo.7709634.
Michael I Jordan and Tom M Mitchell. Machine learning: Trends, perspectives, and prospects. Science, 349(6245):255–260, 2015.
Vilen Jumutc and Johan A.K. Suykens. Multi-class supervised novelty detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36:2510–2523, 12 2014. ISSN 01628828. doi: 10.1109/TPAMI.2014.2327984.
Firuz Kamalov. Kernel density estimation based sampling for imbalanced class distribution. Information Sciences, 512:1192–1201, feb 2020. ISSN 00200255. doi: 10.1016/j.ins.2019.10.017.
Sangwook Kim, Yonghwa Choi, and Minho Lee. Deep learning with support vector data description. Neurocomputing, 165:111–117, 10 2015. ISSN 18728286. doi: 10. 1016/j.neucom.2014.09.086
Diederik P Kingma and Max Welling. Auto-encoding variational bayes, 2014.
Diederik P Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, and Max Welling. Improved variational inference with inverse autoregressive flow. In Advances in Neural Information Processing Systems, pages 4743–4751, 2016.
B. Ravi Kiran, Dilip Mathew Thomas, and Ranjith Parakkal. An overview of deep learning based methods for unsupervised and semi-supervised anomaly detection in videos, 2018. ISSN 2313433X.
Matej Kristan, Aleš Leonardis, and Danijel Skočaj. Multivariate online kernel density estimation with gaussian kernels. Pattern Recognition, 44:2630–2642, 10 2011. ISSN 00313203. doi: 10.1016/j.patcog.2011.03.019. URL https://linkinghub.elsevier.com/retrieve/pii/S0031320311001233.
Donghwoon Kwon, Hyunjoo Kim, Jinoh Kim, Sang C Suh, Ikkyun Kim, and Kuinam J Kim. A survey of deep learning-based network anomaly detection. Cluster Computing, 22(1):949–961, 2019.
Hugo Larochelle and Iain Murray. The neural autoregressive distribution estimator. In Proceedings of the fourteenth international conference on artificial intelligence and statistics, pages 29–37. JMLR Workshop and Conference Proceedings, 2011.
Longin Jan Latecki, Aleksandar Lazarevic, and Dragoljub Pokrajac. Outlier detection with kernel density functions. In International Workshop on Machine Learning and Data Mining in Pattern Recognition, pages 61–75. Springer, 2007.
Quoc Le, Tamas Sarlos, and Alex Smola. Fastfood - approximating kernel expansions in loglinear time. In 30th International Conference on Machine Learning (ICML), 2013. URL http://jmlr.org/proceedings/papers/v28/le13.html.
Yann LeCun, Bernhard Boser, John S Denker, Donnie Henderson, Richard E Howard, Wayne Hubbard, and Lawrence D Jackel. Backpropagation applied to handwritten zip code recognition. Neural computation, 1(4):541–551, 1989
Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. nature, 521(7553): 436–444, 2015.
Jeisung Lee and Mignon Park. An adaptive background subtraction method based on kernel density estimation. Sensors (Switzerland), 12:12279–12300, 9 2012. ISSN 14248220. doi: 10.3390/s120912279
Jonathan Li and Andrew Barron. Mixture density estimation. Advances in neural information processing systems, 12, 1999
Kun-Lun Li, Hou-Kuan Huang, Sheng-Feng Tian, and Wei Xu. Improving one-class svm for anomaly detection. In Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No. 03EX693), volume 5, pages 3077– 3081. IEEE, 2003.
Yanjun Li, Kai Zhang, Jun Wang, and Sanjiv Kumar. Learning adaptive random features. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 4229–4236, 2019
Zheng Li, Yue Zhao, Nicola Botta, Cezar Ionescu, and Xiyang Hu. Copod: Copula-based outlier detection. pages 1118–1123, 11 2020.
Zhu Li, Jean-François Ton, Dino Oglic, and Dino Sejdinovic. Towards a unified analysis of random fourier features. In 36th International Conference on Machine Learning, ICML 2019, volume 2019-June, pages 6916–6936, 2019. ISBN 9781510886988
Zhu Li, Jean-Francois Ton, Dino Oglic, and Dino Sejdinovic. Towards a unified analysis of random fourier features. In International Conference on Machine Learning, pages 3905–3914. PMLR, 2019.
Zilong Lin, Yong Shi, and Zhi Xue. Idsgan: Generative adversarial networks for attack generation against intrusion detection. 9 2018. URL http://arxiv.org/abs/1809.02077
Fang Liu, Yanwei Yu, Peng Song, Yangyang Fan, and Xiangrong Tong. Scalable KDE-based top-n local outlier detection over large-scale data streams. Knowledge-Based Systems, 204:106186, 2020. ISSN 0950-7051. doi: https://doi.org/10.1016/j.knosys.2020.106186. URL https://www.sciencedirect.com/science/article/pii/S0950705120304159
Fanghui Liu, Xiaolin Huang, Yudong Chen, and Johan A. K. Suykens. Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond. apr 2020.
Fanghui Liu, Xiaolin Huang, Yudong Chen, Jie Yang, and Johan Suykens. Random fourier features via fast surrogate leverage weighted sampling. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 4844–4851, 2020.
Fei Tony Liu, Kai Ming Ting, and Zhi-Hua Zhou. Isolation forest. In 2008 eighth ieee international conference on data mining, pages 413–422. IEEE, 2008
Qiao Liu, Jiaze Xu, Rui Jiang, and Wing Hung Wong. Density estimation using deep generative neural networks. Proceedings of the National Academy of Sciences, 118(15), 2021.
Peng Lv, Yanwei Yu, Yangyang Fan, Xianfeng Tang, and Xiangrong Tong. Layer-constrained variational autoencoding kernel density estimation model for anomaly detection. Knowledge-Based Systems, 196, 5 2020. ISSN 09507051. doi: 10.1016/j.knosys.2020.105753.
Yueming Lyu. Spherical structured feature maps for kernel approximation, 2017
Larry M Manevitz and Malik Yousef. One-class svms for document classification. Journal of machine Learning research, 2(Dec):139–154, 2001.
Emaad Manzoor, Hemank Lamba, and Leman Akoglu. xStream: Outlier detection in feature-evolving data streams. pages 1963–1972. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220107.
William B. March, Bo Xiao, and George Biros. Askit: Approximate skeletonization kernel-independent treecode in high dimensions. 10 2014. URL http://arxiv.org/abs/1410.0260.
William B March, Bo Xiao, and George Biros. Askit: Approximate skeletonization kernel-independent treecode in high dimensions. SIAM Journal on Scientific Computing, 37(2):A1089–A1110, 2015.
Luis Martí, Nayat Sanchez-Pi, José Manuel Molina, and Ana Cristina Bicharra Garcia. Anomaly detection based on sensor data in petroleum industry applications. Sensors (Switzerland), 15:2774–2797, 1 2015. ISSN 14248220. doi: 10.3390/s150202774.
Barbara J McNeil and James A Hanley. Statistical approaches to the analysis of receiver operating characteristic (roc) curves. Medical decision making, 4(2):137–150, 1984.
Seonwoo Min, Byunghan Lee, and Sungroh Yoon. Deep learning in bioinformatics. Briefings in Bioinformatics, 18(5):851–869, 2017. ISSN 14774054. doi: 10.1093/bib/bbw068
Tom Minka et al. Divergence measures and message passing. Technical report, Citeseer, 2005
Yisroel Mirsky, Tomer Doitshman, Yuval Elovici, and Asaf Shabtai. Kitsune: An ensemble of autoencoders for online network intrusion detection, 2 2018. ISSN 23318422.
Nour Moustafa and Jill Slay. Unsw-nb15: a comprehensive data set for network intrusion detection systems (unsw-nb15 network data set), 2015.
Marina Munkhoeva, Yermek Kapushev, Evgeny Burnaev, and Ivan Oseledets. Quadrature-based features for kernel approximation. 2 2018. URL http://arxiv.org/abs/1802.03832.
Kevin P Murphy. Machine learning: a probabilistic perspective. MIT press, 2012.
Gyoung S. Na, Donghyun Kim, and Hwanjo Yu. Dilof: Effective and memory efficient local outlier detection in data streams. pages 1993–2002. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220022
Benjamin Nachman and David Shih. Anomaly detection with density estimation. Physical Review D, 101(7):075042, 2020.
Elizbar A Nadaraya. Some new estimates for distribution functions. Theory of Probability & Its Applications, 9(3):497–500, 1964.
Tomoki Nakaya and Keiji Yano. Visualising crime clusters in a space-time cube: An exploratory data-analysis approach using space-time kernel density estimation and scan statistics. Transactions in GIS, 14:223–239, 6 2010. ISSN 13611682. doi: 10.1111/j.1467-9671.2010.01194.x
HD Nguyen, Kim Phuc Tran, S´ebastien Thomassey, and Moez Hamad. Forecasting and anomaly detection approaches using lstm and lstm autoencoder techniques with the applications in supply chain management. International Journal of Information Management, 57:102282, 2021
Hien D. Nguyen, Dianhui Wang, and Geoffrey J. McLachlan. Randomized mixture models for probability density approximation and estimation. Information Sciences, 467:135–148, 10 2018. ISSN 00200255. doi: 10.1016/j.ins.2018.07.056.
Guansong Pang, Charu Aggarwal, Chunhua Shen, and Nicu Sebe. Editorial deep learning for anomaly detection. IEEE Transactions on Neural Networks and Learning Systems, 33:2282–2286, 6 2022. ISSN 2162-237X. doi: 10.1109/TNNLS.2022.3162123. URL https://ieeexplore.ieee.org/document/9786561/
George Papamakarios, Theo Pavlakou, and Iain Murray. Masked autoregressive flow for density estimation. 5 2017. URL http://arxiv.org/abs/1705.07057.
Emanuel Parzen. On estimation of a probability density function and mode. The annals of mathematical statistics, 33(3):1065–1076, 1962.
F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.
Kainan Peng, Wei Ping, Zhao Song, and Kexin Zhao. Non-autoregressive neural text-to-speech. 5 2019. URL http://arxiv.org/abs/1905.08459
Tomáš Pevný. Loda: Lightweight on-line detector of anomalies. Machine Learning, 102:275–304, 2 2016. ISSN 1573-0565. doi: 10.1007/s10994-015-5521-0
Marco A.F. Pimentel, David A. Clifton, Lei Clifton, and Lionel Tarassenko. A review of novelty detection. Signal Processing, 99:215–249, 6 2014. ISSN 01651684. doi: 10.1016/j.sigpro.2013.12.026
Adrian Alan Pol, Victor Berger, Cecile Germain, Gianluca Cerminara, and Maurizio Pierini. Anomaly detection with conditional variational autoencoders. In 2019 18th IEEE International Conference On Machine Learning And Applications (ICMLA), pages 1651–1657, 2019. doi: 10.1109/ICMLA.2019.00270.
Rimpal Popat and Jayesh Chaudhary. A survey on credit card fraud detection using machine learning. 2018
Ali Rahimi and Benjamin Recht. Random features for large-scale kernel machines. In Proceedings of the 20th International Conference on Neural Information Processing Systems, NIPS’07, page 1177–1184, Red Hook, NY, USA, 2007. Curran Associates Inc. ISBN 9781605603520.
Sridhar Ramaswamy, Rajeev Rastogi, and Kyuseok Shim. Efficient algorithms for mining outliers from large data sets. pages 427–438. Association for Computing Machinery, 2000. ISBN 1581132174.
Daniel Ramotsoela, Adnan Abu-Mahfouz, and Gerhard Hancke. A survey of anomaly detection in industrial wireless sensor networks with critical water system infrastructure as a case study. Sensors (Switzerland), 18, 8 2018. ISSN 14248220. doi: 10.3390/s18082491
Carl Edward Rasmussen. Gaussian processes in machine learning. In Summer school on machine learning, pages 63–71. Springer, 2003
Carl Edward Rasmussen. Gaussian Processes in Machine Learning. Springer-Verlag Berlin Heidelberg 2004, 19(1):63–71, 2004. ISSN 00219509.
Shebuti Rayana. ODDS library, 2016. URL http://odds.cs.stonybrook.edu
S. Reed, Y. Chen, T. Paine, A. van den Oord, S. M.A. Eslami, D. Rezende, O. Vinyals, and N. de Freitas. Few-shot autoregressive density estimation: Towards learning to learn distributions, 2017.
Danilo Jimenez Rezende and Shakir Mohamed. Variational inference with normalizing flows. volume 2, pages 1530–1538, 2015. ISBN 9781510810587.
Baqar Rizvi, Ammar Belatreche, Ahmed Bouridane, and Ian Watson. Detection of stock price manipulation using kernel based principal component analysis and multivariate density estimation. IEEE Access, 8:135989–136003, 2020.
Vijay K Rohatgi and AK Md Ehsanes Saleh. An introduction to probability and statistics. John Wiley & Sons, 2015.
Murray Rosenblatt. Remarks on some nonparametric estimates of a density function. Ann. Math. Statist., 27(3):832–837, 09 1956. doi: 10.1214/aoms/1177728190. URL https://doi.org/10.1214/aoms/1177728190
Peter J Rousseeuw and Katrien Van Driessen. A fast algorithm for the minimum covariance determinant estimator. Technometrics, 41(3):212–223, 1999.
Lukas Ruff, Robert Vandermeulen, Nico Goernitz, Lucas Deecke, Shoaib Ahmed Siddiqui, Alexander Binder, Emmanuel Müller, and Marius Kloft. Deep one-class classification. volume 80. PMLR, 2018
Lukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Grégoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, and Klaus-Robert Müller. A unifying review of deep and shallow anomaly detection. Proceedings of the IEEE, 109:756–795, 5 2021. ISSN 15582256. doi: 10.1109/JPROC.2021.3052449.
Rupert G. Miller Jr. Simultaneous statistical inference. Springer Series in Statistics, 2012.
Saket Sathe and Charu C Aggarwal. Subspace outlier detection in linear time with randomized hashing. In 2016 IEEE 16th International Conference on Data Mining (ICDM), pages 459–468. IEEE, 2016.
Issei Sato, Kenichi Kurihara, Shu Tanaka, Hiroshi Nakagawa, and Seiji Miyashita. Quantum annealing for variational Bayes inference. In Proceedings of the 25th Conference on Uncertainty in Artificial Intelligence, UAI 2009, pages 479–486, 2009
B. Schölkopf. Learning with kernels. Journal of the Electrochemical Society, 129 (November):2865, 2002. ISSN 0162-1459. doi: 10.1198/jasa.2003.s269. URL http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.167.5140&rep=rep1&type=pdf
Bernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. Kernel principal component analysis. In International conference on artificial neural networks, pages 583–588. Springer, 1997.
Bernhard Schölkopf, Robert C Williamson, and Peter L Bartlett. New Support Vector Algorithms. Neural Computation, 1245:1207–1245, 2000.
Bernhard Schölkopf, John C Platt, John Shawe-Taylor, Alex J Smola, and Robert C Williamson. Estimating the support of a high-dimensional distribution. Neural computation, 2001
David W Scott. Multivariate density estimation and visualization. In Handbook of computational statistics, pages 549–569. Springer, 2012
Younghyun Jo, Sejong Yang, and Seon Joo Kim. SRFlow-DA: Super-Resolution Using Normalizing Flow with Deep Convolutional Block. Technical report, 2021. URL https://github.com/yhjo09/SRFlow-DA.
Razieh Sheikhpour, Mehdi Agha Sarram, and Robab Sheikhpour. Particle swarm optimization for bandwidth determination and feature selection of kernel density estimation based classifiers in diagnosis of breast cancer. Applied Soft Computing Journal, 40:113–131, 3 2016. ISSN 15684946. doi: 10.1016/j.asoc.2015.10.005
Boxi Shen, Xiang Xu, Jun Li, Antonio Plaza, and Qunying Huang. Unfolding spatialtemporal patterns of taxi trip based on an improved network kernel density estimation. ISPRS International Journal of Geo-Information, 9:683, 11 2020. ISSN 2220-9964. doi: 10.3390/ijgi9110683.
Weiwei Shen, Zhihui Yang, and Jun Wang. Random features for shift-invariant kernels with moment matching. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 31, 2017.
Xun Shi. Selection of bandwidth type and adjustment side in kernel density estimation over inhomogeneous backgrounds. International Journal of Geographical Information Science, 24:643–660, 5 2010. ISSN 13658816. doi: 10.1080/13658810902950625.
Alistair Shilton, Sutharshan Rajasegarar, and Marimuthu Palaniswami. Combined multiclass classification and anomaly detection for large-scale wireless sensor networks. In 2013 IEEE eighth international conference on intelligent sensors, sensor networks and information processing, pages 491–496. IEEE, 2013
Paris Siminelakis, Kexin Rong, Peter Bailis, Moses Charikar, and Philip Levis. Rehashing kernel evaluation in high dimensions. volume 2019-June, pages 10153–10173, 2019. ISBN 9781510886988
Aman Sinha and John Duchi. Learning kernels with random features. In Advances in Neural Information Processing Systems, pages 1306–1314, 2016.
Daniel B. Smith. Kde-for-scipy (python script). https://github.com/Daniel-B-Smith/KDE-for-SciPy/blob/master/kde.py, 2021. Accessed on March 6, 2023.
Paul Smolensky. Information processing in dynamical systems: Foundations of harmony theory. Technical report, Colorado Univ at Boulder Dept of Computer Science, 1986.
Angela A Sodemann, Matthew P Ross, and Brett J Borghetti. A review of anomaly detection in automated surveillance. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42(6):1257–1272, 2012
R. F. Streater. Classical and quantum probability. Journal of Mathematical Physics, 41:3556–3603, 2000. ISSN 00222488. doi: 10.1063/1.533322.
Swee Chuan Tan, Kai Ming Ting, and Tony Fei Liu. Fast anomaly detection for streaming data. In Twenty-second international joint conference on artificial intelligence, 2011.
Xiaofeng Tang and Aiqiang Xu. Multi-class classification using kernel density estimation on K-nearest neighbours. Electronics Letters, 52(8):600–602, apr 2016. ISSN 00135194. doi: 10.1049/el.2015.4437.
Mahbod Tavallaee, Ebrahim Bagheri, Wei Lu, and Ali A Ghorbani. A detailed analysis of the kdd cup 99 data set, 2009. http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.html
P. Tiwari and M. Melucci. Towards a quantum-inspired binary classifier. IEEE Access, 7:42354–42372, 2019. doi: 10.1109/ACCESS.2019.2904624.
Maximilian E. Tschuchnig and Michael Gadermayr. Anomaly detection in medical imaging - a mini review. In Peter Haber, Thomas J. Lampoltshammer, Helmut Leopold, and Manfred Mayr, editors, Data Science – Analytics and Applications, pages 33–38, Wiesbaden, 2022. Springer Fachmedien Wiesbaden. ISBN 978-3-658-36295-9.
Berwin A Turlach. Bandwidth selection in kernel density estimation: A review. In CORE and Institut de Statistique. Citeseer, 1993
Andrea Vedaldi and Andrew Zisserman. Efficient additive kernels via explicit feature maps. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(3):480– 492, 2012. ISSN 01628828. doi: 10.1109/TPAMI.2011.153.
Pascal Vincent, Hugo Larochelle, Yoshua Bengio, and Pierre-Antoine Manzagol. Extracting and composing robust features with denoising autoencoders, 2009.
John Von Neumann. Wahrscheinlichkeitstheoretischer aufbau der quantenmechanik. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, Mathematisch-Physikalische Klasse, 1927:245–272, 1927.
Lin Wang, Fuqiang Zhou, Zuoxin Li, Wangxia Zuo, and Haishu Tan. Abnormal Event Detection in Videos Using Hybrid Spatio-Temporal Autoencoder. In Proceedings - International Conference on Image Processing, ICIP, pages 2276–2280, 2018. ISBN 9781479970612. doi: 10.1109/ICIP.2018.8451070.
Xuanzhao Wang, Zhengping Che, Bo Jiang, Ning Xiao, Ke Yang, Jian Tang, Jieping Ye, Jingyu Wang, and Qi Qi. Robust unsupervised video anomaly detection by multipath frame prediction. IEEE Transactions on Neural Networks and Learning Systems, 33(6):2301–2312, 2022. doi: 10.1109/TNNLS.2021.3083152.
Yong Wang, Xinbin Luo, Lu Ding, Shan Fu, and Xian Wei. Detection based visual tracking with convolutional neural network. Knowledge-Based Systems, 175:62–71, 2019.
Christopher K. I. Williams. Using the Nyström method to speed up kernel machines. 2001
Miao Xie, Song Han, Biming Tian, and Sazia Parvin. Anomaly detection in wireless sensor networks: A survey. Journal of Network and computer Applications, 34(4): 1302–1325, 2011
Tianbao Yang, Yu Feng Li, Mehrdad Mahdavi, Rong Jin, and Zhi Hua Zhou. Nyström method vs random Fourier features: A theoretical and empirical comparison. In Advances in Neural Information Processing Systems, volume 1, pages 476–484, 2012. ISBN 9781627480031
Felix Xinnan Yu, Ananda Theertha Suresh, Krzysztof Choromanski, Daniel Holtmann-Rice, and Sanjiv Kumar. Orthogonal random features. pages 1983–1991, 10 2016. URL http://arxiv.org/abs/1610.09072.
Felix Xinnan X Yu, Ananda Theertha Suresh, Krzysztof M Choromanski, Daniel N Holtmann-Rice, and Sanjiv Kumar. Orthogonal random features. In D. Lee, M. Sugiyama, U. Luxburg, I. Guyon, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 29, pages 1975–1983. Curran Associates, Inc., 2016. URL https://proceedings.neurips.cc/paper/2016/file/53adaf494dc89ef7196d73636eb2451b-Paper.pdf.
Houssam Zenati, Manon Romain, Chuan-Sheng Foo, Bruno Lecouat, and Vijay Chandrasekhar. Adversarially learned anomaly detection. In 2018 IEEE International conference on data mining (ICDM), pages 727–736. IEEE, 2018.
Liangwei Zhang, Jing Lin, and Ramin Karim. Adaptive kernel density-based anomaly detection for nonlinear systems. Knowledge-Based Systems, 139:50–63, 2018. ISSN 09507051. doi: 10.1016/j.knosys.2017.10.009.
Yue Zhao, Zain Nasrullah, and Zheng Li. Pyod: A python toolbox for scalable outlier detection. Journal of Machine Learning Research, 20, 2019. ISSN 15337928.
Xun Zhou, Sicong Cheng, Meng Zhu, Chengkun Guo, Sida Zhou, Peng Xu, Zhenghua Xue, and Weishi Zhang. A state of the art survey of data mining-based fraud detection and credit scoring. volume 189. EDP Sciences, 8 2018. doi: 10.1051/matecconf/201818903002.
Gallego-Mejia, J. A. (2023, June). Efficient non-parametric neural density estimation and its application to outlier and anomaly detection. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 37, No. 13, pp. 16117-16118).
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional
dc.rights.uri.spa.fl_str_mv http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
rights_invalid_str_mv Atribución-NoComercial-SinDerivadas 4.0 Internacional
http://creativecommons.org/licenses/by-nc-nd/4.0/
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.spa.fl_str_mv xv, 113 páginas
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.spa.fl_str_mv Universidad Nacional de Colombia
dc.publisher.program.spa.fl_str_mv Bogotá - Ingeniería - Doctorado en Ingeniería - Sistemas y Computación
dc.publisher.faculty.spa.fl_str_mv Facultad de Ingeniería
dc.publisher.place.spa.fl_str_mv Bogotá, Colombia
dc.publisher.branch.spa.fl_str_mv Universidad Nacional de Colombia - Sede Bogotá
institution Universidad Nacional de Colombia
bitstream.url.fl_str_mv https://repositorio.unal.edu.co/bitstream/unal/84781/3/license.txt
https://repositorio.unal.edu.co/bitstream/unal/84781/4/1022369610-2023.pdf
https://repositorio.unal.edu.co/bitstream/unal/84781/5/1022369610-2023.pdf.jpg
bitstream.checksum.fl_str_mv eb34b1cf90b7e1103fc9dfd26be24b4a
235373928c11e7514ce52eba236eb00d
c699c9de2d8b6dc864a1dfa5d1302760
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv repositorio_nal@unal.edu.co
_version_ 1814089362211602432
spelling The main goal of this thesis is to propose efficient non-parametric density estimation methods that can be integrated with deep learning architectures, for instance, convolutional neural networks and transformers. A recent approach to non-parametric density estimation is neural density estimation. One advantage of these methods is that they can be integrated with deep learning architectures and trained using gradient descent. Most of these methods are based on neural-network implementations of normalizing flows, which transform an original, simpler distribution into a more complex one. The approach of this thesis is based on a different idea that combines random Fourier features with density matrices to estimate the underlying distribution function. The method can be seen as an approximation of the popular kernel density estimation method, but without its inherent computational cost. Density estimation methods can be applied to different problems in statistics and machine learning; they may be used to solve tasks such as anomaly detection, generative modeling, semi-supervised learning, compression, and text-to-speech, among others. This thesis explores the application of the method to anomaly and outlier detection tasks such as medical anomaly detection, fraud detection, video surveillance, time-series anomaly detection, and industrial damage detection, among others. (Text taken from the source)
Doctorado
Doctor en Ingeniería
Machine Learning
In International Workshop on Machine Learning and Data Mining in Pattern Recognition, pages 61–75. Springer, 2007.Quoc Le, Tamas Sarlos, and Alex Smola. Fastfood - approximating kernel expansions in loglinear time. In 30th International Conference on Machine Learning (ICML), 2013. URL http://jmlr.org/proceedings/papers/v28/le13.html.Yann LeCun, Bernhard Boser, John S Denker, Donnie Henderson, Richard E Howard, Wayne Hubbard, and Lawrence D Jackel. Backpropagation applied to handwritten zip code recognition. Neural computation, 1(4):541–551, 1989Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. nature, 521(7553): 436–444, 2015.Jeisung Lee and Mignon Park. An adaptive background subtraction method based on kernel density estimation. Sensors (Switzerland), 12:12279–12300, 9 2012. ISSN 14248220. doi: 10.3390/s120912279Jonathan Li and Andrew Barron. Mixture density estimation. Advances in neural information processing systems, 12, 1999Kun-Lun Li, Hou-Kuan Huang, Sheng-Feng Tian, and Wei Xu. Improving one-class svm for anomaly detection. In Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No. 03EX693), volume 5, pages 3077– 3081. IEEE, 2003.Yanjun Li, Kai Zhang, Jun Wang, and Sanjiv Kumar. Learning adaptive random features. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 4229–4236, 2019Zheng Li, Yue Zhao, Nicola Botta, Cezar Ionescu, and Xiyang Hu. Copod: Copulabased outlier detection. pages 1118–1123, 11 2020.Zhu Li, Jean Fran¸cois Ton, Dino Oglic, and Dino Sejdinovic. Towards a unified analysis of random fourier features. In 36th International Conference on Machine Learning, ICML 2019, volume 2019-June, pages 6916–6936, 2019. ISBN 9781510886988Zhu Li, Jean-Francois Ton, Dino Oglic, and Dino Sejdinovic. Towards a unified analysis of random fourier features. In International Conference on Machine Learning, pages 3905–3914. PMLR, 2019.Zilong Lin, Yong Shi, and Zhi Xue. Idsgan: Generative adversarial networks for attack generation against intrusion detection. 9 2018. URL http://arxiv.org/abs/1809.02077Fang Liu, Yanwei Yu, Peng Song, Yangyang Fan, and Xiangrong Tong. Scalable KDEbased top-n local outlier detection over large-scale data streams. Knowledge-Based Systems, 204:106186, 2020. ISSN 0950-7051. doi: https://doi.org/10.1016/j.knosys.2020. 106186. URL https://www.sciencedirect.com/science/article/pii/S0950705120304159Fanghui Liu, Xiaolin Huang, Yudong Chen, and Johan A. K. Suykens. Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond. apr 2020.Fanghui Liu, Xiaolin Huang, Yudong Chen, Jie Yang, and Johan Suykens. Random fourier features via fast surrogate leverage weighted sampling. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 4844–4851, 2020.Fei Tony Liu, Kai Ming Ting, and Zhi-Hua Zhou. Isolation forest. In 2008 eighth ieee international conference on data mining, pages 413–422. IEEE, 2008Qiao Liu, Jiaze Xu, Rui Jiang, and Wing Hung Wong. Density estimation using deep generative neural networks. Proceedings of the National Academy of Sciences, 118(15), 2021.Peng Lv, Yanwei Yu, Yangyang Fan, Xianfeng Tang, and Xiangrong Tong. Layerconstrained variational autoencoding kernel density estimation model for anomaly detection. Knowledge-Based Systems, 196, 5 2020. ISSN 09507051. doi: 10.1016/j.knosys. 2020.105753.Yueming Lyu. Spherical structured feature maps for kernel approximation, 2017Larry M Manevitz and Malik Yousef. 
One-class svms for document classification. Journal of machine Learning research, 2(Dec):139–154, 2001.Emaad Manzoor, Hemank Lamba, and Leman Akoglu. Xstream: Outlier dete’x’ion in feature-evolving data streams. pages 1963–1972. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220107.William B. March, Bo Xiao, and George Biros. Askit: Approximate skeletonization kernel-independent treecode in high dimensions. 10 2014. URL http://arxiv.org/abs/1410.0260.William B March, Bo Xiao, and George Biros. Askit: Approximate skeletonization kernel-independent treecode in high dimensions. SIAM Journal on Scientific Computing, 37(2):A1089–A1110, 2015.Luis Mart´ı, Nayat Sanchez-Pi, Jos´e Manuel Molina, and Ana Cristina Bicharra Garcia. Anomaly detection based on sensor data in petroleum industry applications. Sensors (Switzerland), 15:2774–2797, 1 2015. ISSN 14248220. doi: 10.3390/s150202774.Luis Mart´ı, Nayat Sanchez-Pi, Jos´e Manuel Molina, and Ana Cristina Bicharra Garcia. Anomaly detection based on sensor data in petroleum industry applications. Sensors (Switzerland), 15:2774–2797, 1 2015. ISSN 14248220. doi: 10.3390/s150202774.Barbara J McNeil and James A Hanley. Statistical approaches to the analysis of receiver operating characteristic (roc) curves. Medical decision making, 4(2):137–150, 1984.Seonwoo Min, Byunghan Lee, and Sungroh Yoon. Deep learning in bioinformatics. 18 (5):851–869, 2017. ISSN 14774054. doi: 10.1093/bib/bbw068Tom Minka et al. Divergence measures and message passing. Technical report, Citeseer, 2005Yisroel Mirsky, Tomer Doitshman, Yuval Elovici, and Asaf Shabtai. Kitsune: An ensemble of autoencoders for online network intrusion detection, 2 2018. ISSN 23318422.Nour Moustafa and Jill Slay. Unsw-nb15: a comprehensive data set for network intrusion detection systems (unsw-nb15 network data set), 2015.Marina Munkhoeva, Yermek Kapushev, Evgeny Burnaev, and Ivan Oseledets. Quadrature-based features for kernel approximation. 2 2018. URL http://arxiv.org/abs/1802.03832.Kevin P Murphy. Machine learning: a probabilistic perspective. MIT press, 2012.Gyoung S. Na, Donghyun Kim, and Hwanjo Yu. Dilof: Effective and memory efficient local outlier detection in data streams. pages 1993–2002. Association for Computing Machinery, 7 2018. ISBN 9781450355520. doi: 10.1145/3219819.3220022Benjamin Nachman and David Shih. Anomaly detection with density estimation. Physical Review D, 101(7):075042, 2020.Elizbar A Nadaraya. Some new estimates for distribution functions. Theory of Probability & Its Applications, 9(3):497–500, 1964.Tomoki Nakaya and Keiji Yano. Visualising crime clusters in a space-time cube: An exploratory data-analysis approach using space-time kernel density estimation and scan statistics. Transactions in GIS, 14:223–239, 6 2010. ISSN 13611682. doi: 10. 1111/j.1467-9671.2010.01194.xHD Nguyen, Kim Phuc Tran, S´ebastien Thomassey, and Moez Hamad. Forecasting and anomaly detection approaches using lstm and lstm autoencoder techniques with the applications in supply chain management. International Journal of Information Management, 57:102282, 2021Hien D. Nguyen, Dianhui Wang, and Geoffrey J. McLachlan. Randomized mixture models for probability density approximation and estimation. Information Sciences, 467:135–148, 10 2018. ISSN 00200255. doi: 10.1016/j.ins.2018.07.056.Guansong Pang, Charu Aggarwal, Chunhua Shen, and Nicu Sebe. Editorial deep learning for anomaly detection. IEEE Transactions on Neural Networks and Learning Systems, 33:2282–2286, 6 2022. 
ISSN 2162-237X. doi: 10.1109/TNNLS.2022.3162123. URL https://ieeexplore.ieee.org/document/9786561/George Papamakarios, Theo Pavlakou, and Iain Murray. Masked autoregressive flow for density estimation. 5 2017. URL http://arxiv.org/abs/1705.07057.Emanuel Parzen. On estimation of a probability density function and mode. The annals of mathematical statistics, 33(3):1065–1076, 1962.F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.Kainan Peng, Wei Ping, Zhao Song, and Kexin Zhao. Non-autoregressive neural textto-speech. 5 2019. URL http://arxiv.org/abs/1905.08459Tom´aˇs Pevn´y. Loda: Lightweight on-line detector of anomalies. Machine Learning, 102:275–304, 2016. ISSN 1573-0565Tom´aˇs Pevn´y. Loda: Lightweight on-line detector of anomalies. Machine Learning, 102:275–304, 2 2016. ISSN 15730565. doi: 10.1007/s10994-015-5521-0Marco A.F. Pimentel, David A. Clifton, Lei Clifton, and Lionel Tarassenko. A review of novelty detection. Signal Processing, 99:215–249, 6 2014. ISSN 01651684. doi: 10.1016/j.sigpro.2013.12.026Adrian Alan Pol, Victor Berger, Cecile Germain, Gianluca Cerminara, and Maurizio Pierini. Anomaly detection with conditional variational autoencoders. In 2019 18th IEEE International Conference On Machine Learning And Applications (ICMLA), pages 1651–1657, 2019. doi: 10.1109/ICMLA.2019.00270.Rimpal Popat and Jayesh Chaudhary. A survey on credit card fraud detection using machine learning. 2018Ali Rahimi and Benjamin Recht. Random features for large-scale kernel machines. In Proceedings of the 20th International Conference on Neural Information Processing Systems, NIPS’07, page 1177–1184, Red Hook, NY, USA, 2007. Curran Associates Inc. ISBN 9781605603520.Ali Rahimi, Benjamin Recht, et al. Random features for large-scale kernel machines. In NIPS, volume 3, page 5. Citeseer, 2007Sridhar Ramaswamy, Rajeev Rastogi, and Kyuseok Shim. Efficient algorithms for mining outliers from large data sets. page 427438. Association for Computing Machinery, 2000. ISBN 1581132174.Daniel Ramotsoela, Adnan Abu-Mahfouz, and Gerhard Hancke. A survey of anomaly detection in industrial wireless sensor networks with critical water system infrastructure as a case study. Sensors (Switzerland), 18, 8 2018. ISSN 14248220. doi: 10.3390/s18082491Carl Edward Rasmussen. Gaussian processes in machine learning. In Summer school on machine learning, pages 63–71. Springer, 2003Carl Edward. Rasmussen. Gaussian Processes in Machine Learning. Springer-Verlag Berlin Heidelberg 2004, 19(1):63–71, 2004. ISSN 00219509.Shebuti Rayana. ODDS library, 2016. URL http://odds.cs.stonybrook.eduS. Reed, Y. Chen, T. Paine, A. van den Oord, S. M.A. Eslami, D. Rezende, O. Vinyals, and N. de Freitas. Few-shot autoregressive density estimation: Towards learning to learn distributions, 2017.Danilo Jimenez Rezende and Shakir Mohamed. Variational inference with normalizing flows. volume 2, pages 1530–1538, 2015. ISBN 9781510810587.Baqar Rizvi, Ammar Belatreche, Ahmed Bouridane, and Ian Watson. Detection of stock price manipulation using kernel based principal component analysis and multivariate density estimation. IEEE Access, 8:135989–136003, 2020.Vijay K Rohatgi and AK Md Ehsanes Saleh. An introduction to probability and statistics. John Wiley & Sons, 2015.Murray Rosenblatt. 
Remarks on some nonparametric estimates of a density function. Ann. Math. Statist., 27(3):832–837, 09 1956. doi: 10.1214/aoms/1177728190. URL https://doi.org/10.1214/aoms/1177728190Peter J Rousseeuw and Katrien Van Driessen. A fast algorithm for the minimum covariance determinant estimator. Technometrics, 41(3):212–223, 1999.Lukas Ruff, Robert Vandermeulen, Nico Goernitz, Lucas Deecke, Shoaib Ahmed Siddiqui, Alexander Binder, Emmanuel M¨uller, and Marius Kloft. Deep one-class classification. volume 80. PMLR, 2018Lukas Ruff, Jacob R. Kauffmann, Robert A. Vandermeulen, Gregoire Montavon, Wojciech Samek, Marius Kloft, Thomas G. Dietterich, and Klaus Robert Muller. A unifying review of deep and shallow anomaly detection. Proceedings of the IEEE, 109: 756–795, 5 2021. ISSN 15582256. doi: 10.1109/JPROC.2021.3052449.G Rupert Jr et al. Simultaneous statistical inference. Springer Series in Statistics, 2012.Saket Sathe and Charu C Aggarwal. Subspace outlier detection in linear time with randomized hashing. In 2016 IEEE 16th International Conference on Data Mining (ICDM), pages 459–468. IEEE, 2016.Issei Sato, Kenichi Kurihara, Shu Tanaka, Hiroshi Nakagawa, and Seiji Miyashita. Quantum annealing for variational Bayes inference. In Proceedings of the 25th Conference on Uncertainty in Artificial Intelligence, UAI 2009, pages 479–486, 2009B. Sch¨olkopf. Learning with kernels. Journal of the Electrochemical Society, 129 (November):2865, 2002. ISSN 0162-1459. doi: 10.1198/jasa.2003.s269. URL http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.167.5140&rep=rep1&type=pdfBernhard Sch¨olkopf, Alexander Smola, and Klaus-Robert M¨uller. Kernel principal component analysis. In International conference on artificial neural networks, pages 583–588. Springer, 1997.Bernhard Scholkopf, Robert C Williamson, and Peter L Bartlett. New Support Vector Algorithms ¤. Neural Computation, 1245:1207–1245, 2000.Bernhard Sch¨olkopf, John C Platt, John Shawe-Taylor, Alex J Smola, and Robert C Williamson. Estimating the support of a high-dimensional distribution. Neural computation, 2001David W Scott. Multivariate density estimation and visualization. In Handbook of computational statistics, pages 549–569. Springer, 2012Younghyun Jo Sejong, Yang Seon, and Joo Kim. SRFlow-DA: Super-Resolution Using Normalizing Flow with Deep Convolutional Block. Technical report, 2021. URL https://github.com/yhjo09/SRFlow-DA.Razieh Sheikhpour, Mehdi Agha Sarram, and Robab Sheikhpour. Particle swarm optimization for bandwidth determination and feature selection of kernel density estimation based classifiers in diagnosis of breast cancer. Applied Soft Computing Journal, 40:113–131, 3 2016. ISSN 15684946. doi: 10.1016/j.asoc.2015.10.005Boxi Shen, Xiang Xu, Jun Li, Antonio Plaza, and Qunying Huang. Unfolding spatialtemporal patterns of taxi trip based on an improved network kernel density estimation. ISPRS International Journal of Geo-Information, 9:683, 11 2020. ISSN 2220-9964. doi: 10.3390/ijgi9110683.Weiwei Shen, Zhihui Yang, and Jun Wang. Random features for shift-invariant kernels with moment matching. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 31, 2017.Xun Shi. Selection of bandwidth type and adjustment side in kernel density estimation over inhomogeneous backgrounds. International Journal of Geographical Information Science, 24:643–660, 5 2010. ISSN 13658816. doi: 10.1080/13658810902950625.Xun Shi. Selection of bandwidth type and adjustment side in kernel density estimation over inhomogeneous backgrounds. 
International Journal of Geographical Information Science, 24:643–660, 5 2010. ISSN 13658816. doi: 10.1080/13658810902950625.Alistair Shilton, Sutharshan Rajasegarar, and Marimuthu Palaniswami. Combined multiclass classification and anomaly detection for large-scale wireless sensor networks. In 2013 IEEE eighth international conference on intelligent sensors, sensor networks and information processing, pages 491–496. IEEE, 2013Paris Siminelakis, Kexin Rong, Peter Bailis, Moses Charikar, and Philip Levis. Rehashing kernel evaluation in high dimensions. volume 2019-June, pages 10153–10173, 2019. ISBN 9781510886988Aman Sinha and John Duchi. Learning kernels with random features. In Advances in Neural Information Processing Systems, pages 1306–1314, 2016.Daniel B. Smith. Kde-for-scipy (python script). https://github.com/Daniel-B-Smith/KDE-for-SciPy/blob/master/kde.py, 2021. Accessed on March 6, 2023.Paul Smolensky. Information processing in dynamical systems: Foundations of harmony theory. Technical report, Colorado Univ at Boulder Dept of Computer Science, 1986.Angela A Sodemann, Matthew P Ross, and Brett J Borghetti. A review of anomaly detection in automated surveillance. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 42(6):1257–1272, 2012R. F. Streater. Classical and quantum probability. Journal of Mathematical Physics, 41:3556–3603, 2000. ISSN 00222488. doi: 10.1063/1.533322.Swee Chuan Tan, Kai Ming Ting, and Tony Fei Liu. Fast anomaly detection for streaming data. In Twenty-second international joint conference on artificial intelligence, 2011.Xiaofeng Tang and Aiqiang Xu. Multi-class classification using kernel density estimation on K-nearest neighbours. Electronics Letters, 52(8):600–602, apr 2016. ISSN 00135194. doi: 10.1049/el.2015.4437.Mahbod Tavallaee, Ebrahim Bagheri, Wei Lu, and Ali A Ghorbani. A detailed analysis of the kdd cup 99 data set, 2009. http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.htmlP. Tiwari and M. Melucci. Towards a quantum-inspired binary classifier. IEEE Access, 7:42354–42372, 2019. doi: 10.1109/ACCESS.2019.2904624.Maximilian E. Tschuchnig and Michael Gadermayr. Anomaly detection in medical imaging - a mini review. In Peter Haber, Thomas J. Lampoltshammer, Helmut Leopold, and Manfred Mayr, editors, Data Science – Analytics and Applications, pages 33–38, Wiesbaden, 2022. Springer Fachmedien Wiesbaden. ISBN 978-3-658-36295-9.Berwin A Turlach. Bandwidth selection in kernel density estimation: A review. In CORE and Institut de Statistique. Citeseer, 1993Andrea Vedaldi and Andrew Zisserman. Efficient additive kernels via explicit feature maps. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(3):480– 492, 2012. ISSN 01628828. doi: 10.1109/TPAMI.2011.153.Pascal Vincent, Hugo Larochelle, Yoshua Bengio, and Pierre-Antoine Manzagol. Extracting and composing robust features with denoising autoencoders, 2009.John Von Neumann. Wahrscheinlichkeitstheoretischer aufbau der quantenmechanik. Nachrichten von der Gesellschaft der Wissenschaften zu G¨ottingen, MathematischPhysikalische Klasse, 1927:245–272, 1927.Lin Wang, Fuqiang Zhou, Zuoxin Li, Wangxia Zuo, and Haishu Tan. Abnormal Event Detection in Videos Using Hybrid Spatio-Temporal Autoencoder. In Proceedings - International Conference on Image Processing, ICIP, pages 2276–2280, 2018. ISBN 9781479970612. doi: 10.1109/ICIP.2018.8451070.Xuanzhao Wang, Zhengping Che, Bo Jiang, Ning Xiao, Ke Yang, Jian Tang, Jieping Ye, Jingyu Wang, and Qi Qi. 
Robust unsupervised video anomaly detection by multipath frame prediction. IEEE Transactions on Neural Networks and Learning Systems, 33(6):2301–2312, 2022. doi: 10.1109/TNNLS.2021.3083152.Yong Wang, Xinbin Luo, Lu Ding, Shan Fu, and Xian Wei. Detection based visual tracking with convolutional neural network. Knowledge-Based Systems, 175:62–71, 2019.Yong Wang, Xinbin Luo, Lu Ding, Shan Fu, and Xian Wei. Detection based visual tracking with convolutional neural network. Knowledge-Based Systems, 175:62–71, 2019.Christopher K. I. Williams. Using the nystrom method to speed up kernel machines. 2001Christopher K. I. Williams. Using the nystrom method to speed up kernel machines. 2001Miao Xie, Song Han, Biming Tian, and Sazia Parvin. Anomaly detection in wireless sensor networks: A survey. Journal of Network and computer Applications, 34(4): 1302–1325, 2011Tianbao Yang, Yu Feng Li, Mehrdad Mahdavi, Rong Jin, and Zhi Hua Zhou. Nystr¨om method vs random Fourier features: A theoretical and empirical comparison. In Advances in Neural Information Processing Systems, volume 1, pages 476–484, 2012. ISBN 9781627480031Felix Xinnan Yu, Ananda Theertha Suresh, Krzysztof Choromanski, Daniel HoltmannRice, and Sanjiv Kumar. Orthogonal random features. pages 1983–1991, 10 2016. URL http://arxiv.org/abs/1610.09072.Felix Xinnan X Yu, Ananda Theertha Suresh, Krzysztof M Choromanski, Daniel N Holtmann-Rice, and Sanjiv Kumar. Orthogonal random features. In D. Lee, M. Sugiyama, U. Luxburg, I. Guyon, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 29, pages 1975–1983. Curran Associates, Inc., 2016. URL https://proceedings.neurips.cc/paper/2016/file/53adaf494dc89ef7196d73636eb2451b-Paper.pdf.Houssam Zenati, Manon Romain, Chuan-Sheng Foo, Bruno Lecouat, and Vijay Chandrasekhar. Adversarially learned anomaly detection. In 2018 IEEE International conference on data mining (ICDM), pages 727–736. IEEE, 2018.Liangwei Zhang, Jing Lin, and Ramin Karim. Adaptive kernel density-based anomaly detection for nonlinear systems. Knowledge-Based Systems, 139:50–63, 2018. ISSN 09507051. doi: 10.1016/j.knosys.2017.10.009.Yue Zhao, Zain Nasrullah, and Zheng Li. Pyod: A python toolbox for scalable outlier detection. Journal of Machine Learning Research, 20, 2019. ISSN 15337928.Xun Zhou, Sicong Cheng, Meng Zhu, Chengkun Guo, Sida Zhou, Peng Xu, Zhenghua Xue, and Weishi Zhang. A state of the art survey of data mining-based fraud detection and credit scoring. volume 189. EDP Sciences, 8 2018. doi: 10.1051/matecconf/ 201818903002.Gallego-Mejia, J. A. (2023, June). Efficient non-parametric neural density estimation and its application to outlier and anomaly detection. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 37, No. 13, pp. 
16117-16118).InvestigadoresLICENSElicense.txtlicense.txttext/plain; charset=utf-85879https://repositorio.unal.edu.co/bitstream/unal/84781/3/license.txteb34b1cf90b7e1103fc9dfd26be24b4aMD53ORIGINAL1022369610-2023.pdf1022369610-2023.pdfTesis de Doctorado en Ingeniería - Sistemas y Computaciónapplication/pdf11263997https://repositorio.unal.edu.co/bitstream/unal/84781/4/1022369610-2023.pdf235373928c11e7514ce52eba236eb00dMD54THUMBNAIL1022369610-2023.pdf.jpg1022369610-2023.pdf.jpgGenerated Thumbnailimage/jpeg4606https://repositorio.unal.edu.co/bitstream/unal/84781/5/1022369610-2023.pdf.jpgc699c9de2d8b6dc864a1dfa5d1302760MD55unal/84781oai:repositorio.unal.edu.co:unal/847812023-10-06 23:03:55.963Repositorio Institucional Universidad Nacional de Colombiarepositorio_nal@unal.edu.coUEFSVEUgMS4gVMOJUk1JTk9TIERFIExBIExJQ0VOQ0lBIFBBUkEgUFVCTElDQUNJw5NOIERFIE9CUkFTIEVOIEVMIFJFUE9TSVRPUklPIElOU1RJVFVDSU9OQUwgVU5BTC4KCkxvcyBhdXRvcmVzIHkvbyB0aXR1bGFyZXMgZGUgbG9zIGRlcmVjaG9zIHBhdHJpbW9uaWFsZXMgZGUgYXV0b3IsIGNvbmZpZXJlbiBhIGxhIFVuaXZlcnNpZGFkIE5hY2lvbmFsIGRlIENvbG9tYmlhIHVuYSBsaWNlbmNpYSBubyBleGNsdXNpdmEsIGxpbWl0YWRhIHkgZ3JhdHVpdGEgc29icmUgbGEgb2JyYSBxdWUgc2UgaW50ZWdyYSBlbiBlbCBSZXBvc2l0b3JpbyBJbnN0aXR1Y2lvbmFsLCBiYWpvIGxvcyBzaWd1aWVudGVzIHTDqXJtaW5vczoKCgphKQlMb3MgYXV0b3JlcyB5L28gbG9zIHRpdHVsYXJlcyBkZSBsb3MgZGVyZWNob3MgcGF0cmltb25pYWxlcyBkZSBhdXRvciBzb2JyZSBsYSBvYnJhIGNvbmZpZXJlbiBhIGxhIFVuaXZlcnNpZGFkIE5hY2lvbmFsIGRlIENvbG9tYmlhIHVuYSBsaWNlbmNpYSBubyBleGNsdXNpdmEgcGFyYSByZWFsaXphciBsb3Mgc2lndWllbnRlcyBhY3RvcyBzb2JyZSBsYSBvYnJhOiBpKSByZXByb2R1Y2lyIGxhIG9icmEgZGUgbWFuZXJhIGRpZ2l0YWwsIHBlcm1hbmVudGUgbyB0ZW1wb3JhbCwgaW5jbHV5ZW5kbyBlbCBhbG1hY2VuYW1pZW50byBlbGVjdHLDs25pY28sIGFzw60gY29tbyBjb252ZXJ0aXIgZWwgZG9jdW1lbnRvIGVuIGVsIGN1YWwgc2UgZW5jdWVudHJhIGNvbnRlbmlkYSBsYSBvYnJhIGEgY3VhbHF1aWVyIG1lZGlvIG8gZm9ybWF0byBleGlzdGVudGUgYSBsYSBmZWNoYSBkZSBsYSBzdXNjcmlwY2nDs24gZGUgbGEgcHJlc2VudGUgbGljZW5jaWEsIHkgaWkpIGNvbXVuaWNhciBhbCBww7pibGljbyBsYSBvYnJhIHBvciBjdWFscXVpZXIgbWVkaW8gbyBwcm9jZWRpbWllbnRvLCBlbiBtZWRpb3MgYWzDoW1icmljb3MgbyBpbmFsw6FtYnJpY29zLCBpbmNsdXllbmRvIGxhIHB1ZXN0YSBhIGRpc3Bvc2ljacOzbiBlbiBhY2Nlc28gYWJpZXJ0by4gQWRpY2lvbmFsIGEgbG8gYW50ZXJpb3IsIGVsIGF1dG9yIHkvbyB0aXR1bGFyIGF1dG9yaXphIGEgbGEgVW5pdmVyc2lkYWQgTmFjaW9uYWwgZGUgQ29sb21iaWEgcGFyYSBxdWUsIGVuIGxhIHJlcHJvZHVjY2nDs24geSBjb211bmljYWNpw7NuIGFsIHDDumJsaWNvIHF1ZSBsYSBVbml2ZXJzaWRhZCByZWFsaWNlIHNvYnJlIGxhIG9icmEsIGhhZ2EgbWVuY2nDs24gZGUgbWFuZXJhIGV4cHJlc2EgYWwgdGlwbyBkZSBsaWNlbmNpYSBDcmVhdGl2ZSBDb21tb25zIGJham8gbGEgY3VhbCBlbCBhdXRvciB5L28gdGl0dWxhciBkZXNlYSBvZnJlY2VyIHN1IG9icmEgYSBsb3MgdGVyY2Vyb3MgcXVlIGFjY2VkYW4gYSBkaWNoYSBvYnJhIGEgdHJhdsOpcyBkZWwgUmVwb3NpdG9yaW8gSW5zdGl0dWNpb25hbCwgY3VhbmRvIHNlYSBlbCBjYXNvLiBFbCBhdXRvciB5L28gdGl0dWxhciBkZSBsb3MgZGVyZWNob3MgcGF0cmltb25pYWxlcyBkZSBhdXRvciBwb2Ryw6EgZGFyIHBvciB0ZXJtaW5hZGEgbGEgcHJlc2VudGUgbGljZW5jaWEgbWVkaWFudGUgc29saWNpdHVkIGVsZXZhZGEgYSBsYSBEaXJlY2Npw7NuIE5hY2lvbmFsIGRlIEJpYmxpb3RlY2FzIGRlIGxhIFVuaXZlcnNpZGFkIE5hY2lvbmFsIGRlIENvbG9tYmlhLiAKCmIpIAlMb3MgYXV0b3JlcyB5L28gdGl0dWxhcmVzIGRlIGxvcyBkZXJlY2hvcyBwYXRyaW1vbmlhbGVzIGRlIGF1dG9yIHNvYnJlIGxhIG9icmEgY29uZmllcmVuIGxhIGxpY2VuY2lhIHNlw7FhbGFkYSBlbiBlbCBsaXRlcmFsIGEpIGRlbCBwcmVzZW50ZSBkb2N1bWVudG8gcG9yIGVsIHRpZW1wbyBkZSBwcm90ZWNjacOzbiBkZSBsYSBvYnJhIGVuIHRvZG9zIGxvcyBwYcOtc2VzIGRlbCBtdW5kbywgZXN0byBlcywgc2luIGxpbWl0YWNpw7NuIHRlcnJpdG9yaWFsIGFsZ3VuYS4KCmMpCUxvcyBhdXRvcmVzIHkvbyB0aXR1bGFyZXMgZGUgZGVyZWNob3MgcGF0cmltb25pYWxlcyBkZSBhdXRvciBtYW5pZmllc3RhbiBlc3RhciBkZSBhY3VlcmRvIGNvbiBxdWUgbGEgcHJlc2VudGUgbGljZW5jaWEgc2Ugb3RvcmdhIGEgdMOtdHVsbyBncmF0dWl0bywgcG9yIGxvIHRhbnRvLC