A Snapshot of Parallelism in Distributed Deep Learning Training

The accelerated development of artificial-intelligence applications has driven the creation of increasingly complex neural network models with enormous numbers of parameters, currently reaching into the trillions, making their training almost impossible without t...


Authors:
Romero Sandí, Hairol
Núñez, Gabriel
Rojas, Elvis
Resource type:
Research article
Publication date:
2024
Institution:
Universidad Autónoma de Bucaramanga - UNAB
Repository:
Repositorio UNAB
Language:
spa
OAI Identifier:
oai:repository.unab.edu.co:20.500.12749/26661
Online access:
http://hdl.handle.net/20.500.12749/26661
https://doi.org/10.29375/25392115.5054
Keywords:
Deep learning
Parallelism
Artificial neural networks
Rights
License
http://purl.org/coar/access_right/c_abf2