Using machine learning algorithms in the search for a Z' vector boson at LHC

Authors:
Torroledo Peña, Iván Darío
Resource type:
Undergraduate thesis
Publication date:
2017
Institution:
Universidad de los Andes
Repository:
Séneca: Uniandes repository
Language:
eng
OAI Identifier:
oai:repositorio.uniandes.edu.co:1992/45681
Online access:
http://hdl.handle.net/1992/45681
Keywords:
Z bosons
Machine learning (Artificial intelligence)
Particles (Nuclear physics)
Particle accelerators
Physics
Rights
openAccess
License
http://creativecommons.org/licenses/by-nc-sa/4.0/
Description
Summary: Six hundred million events per second, each of roughly one million bytes, constitute the average rate of raw data production that experiments at the Large Hadron Collider (LHC) have to handle. Analyzing large amounts of data is an important task in high energy physics (HEP), the area of physical sciences that studies elementary particles and their interactions at the most fundamental level. Although this task was initially carried out through the study of astrophysical cosmic rays, later years led to the use of particle accelerators and detectors of progressively larger scale. At present, the main HEP project is the LHC, located at the European Organization for Nuclear Research (CERN). Several experiments at the LHC, such as ATLAS and CMS, analyze data from proton-proton and/or heavy-ion collisions. The large amount of data that HEP experiments have to process thus represents a computational challenge. To meet this challenge, data analysis techniques have progressed alongside the development of particle accelerators. In the early 1960s the main technique was multivariate analysis (MVA), which in later years would become known as machine learning (ML). Over time, ML algorithms such as Boosted Decision Trees or Neural Networks became commonly used in trigger systems and particle reconstruction in HEP experiments. In recent years, however, newer techniques have seen limited adoption in HEP, in contrast to the rapid uptake of novel methods in other areas such as technology, artificial intelligence, and business. As a result, recent papers have proposed the use of other machine learning algorithms, such as support vector machines (SVMs), deep learning (DL), convolutional neural networks (CNNs), region-based CNNs, generative adversarial networks (GANs), and deep Boltzmann machines, arguing for improvements in model performance and data fitting.
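
As a minimal illustration of the kind of classifier mentioned in the summary, the sketch below trains a boosted decision tree (here, scikit-learn's GradientBoostingClassifier) to separate a hypothetical Z' -> dilepton signal from background events using a few kinematic features. The feature choices, the assumed resonance mass of 3 TeV, and the synthetic data are illustrative assumptions for this sketch, not the analysis actually performed in the thesis.

    # Illustrative sketch: boosted decision tree for signal/background separation.
    # All data below are synthetic stand-ins, not real or simulated LHC events.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 5000

    # Hypothetical features: dilepton invariant mass (GeV), leading-lepton pT (GeV), |eta|.
    # Background: smoothly falling spectra; signal: resonance near an assumed m_Z' = 3000 GeV.
    bkg = np.column_stack([
        rng.exponential(500.0, n),      # invariant mass
        rng.exponential(80.0, n),       # leading-lepton pT
        rng.uniform(0.0, 2.5, n),       # |eta|
    ])
    sig = np.column_stack([
        rng.normal(3000.0, 100.0, n),   # resonance peak
        rng.exponential(300.0, n),
        rng.uniform(0.0, 2.5, n),
    ])
    X = np.vstack([bkg, sig])
    y = np.concatenate([np.zeros(n), np.ones(n)])   # 0 = background, 1 = signal

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0
    )

    # Boosted decision tree classifier, of the kind commonly used in HEP analyses.
    clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
    clf.fit(X_train, y_train)

    # Score events by signal probability and evaluate separation power.
    scores = clf.predict_proba(X_test)[:, 1]
    print("ROC AUC:", roc_auc_score(y_test, scores))

In a real search, the classifier output (or a cut on it) would define a signal-enriched region, and the invariant-mass spectrum in that region would then be scanned for a resonance over the expected background.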