3D Human pose estimation from egocentric inputs
Egocentric pose estimation is essential for developing embodied AI systems capable of interacting naturally with humans and their environments. This thesis addresses the challenges of first-person pose estimation through a series of interconnected studies. The first study, BoDiffusion, presents a generative model that synthesizes full-body motion from sparse inputs. The second study, Ego-Exo4D, establishes a benchmark for pose estimation in real-life settings with diverse activities. The final study, EgoCast, focuses on current pose estimation and forecasting in the wild, integrating visual and proprioceptive inputs to handle dynamic and unscripted environments. Together, these contributions provide robust, temporally consistent methods for real-world 3D pose estimation.
- Authors: Escobar Palomeque, María Camila
- Resource type: Doctoral thesis
- Publication date: 2024
- Institution: Universidad de los Andes
- Repository: Séneca: repositorio Uniandes
- Language: eng
- OAI Identifier: oai:repositorio.uniandes.edu.co:1992/75400
- Online access: https://hdl.handle.net/1992/75400
- Keywords: Egocentric vision; Pose estimation; Pose forecasting; Ingeniería
- Rights: openAccess
- License: Attribution-NonCommercial-NoDerivatives 4.0 International
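The OAI identifier above can be used to fetch this record programmatically via the OAI-PMH protocol. A minimal sketch follows; note that the endpoint path `/oai/request` is the usual DSpace default and is an assumption here, not something stated in this record.

```python
from urllib.parse import urlencode

# Usual DSpace OAI-PMH endpoint path (assumed; not confirmed by this record).
OAI_ENDPOINT = "https://repositorio.uniandes.edu.co/oai/request"


def build_get_record_url(identifier: str, metadata_prefix: str = "oai_dc") -> str:
    """Build an OAI-PMH GetRecord request URL for a single record.

    GetRecord is the standard OAI-PMH verb for retrieving one item's
    metadata; oai_dc (Dublin Core) is the prefix every repository must support.
    """
    params = {
        "verb": "GetRecord",
        "identifier": identifier,
        "metadataPrefix": metadata_prefix,
    }
    return f"{OAI_ENDPOINT}?{urlencode(params)}"


# The identifier comes directly from this record.
url = build_get_record_url("oai:repositorio.uniandes.edu.co:1992/75400")
print(url)
```

The response is an XML document whose `<metadata>` element carries the Dublin Core fields listed below (title, creator, subject, rights, and so on).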
id |
UNIANDES2_85538b8c7ead5a9efd00fac742d249af |
---|---|
oai_identifier_str |
oai:repositorio.uniandes.edu.co:1992/75400 |
network_acronym_str |
UNIANDES2 |
network_name_str |
Séneca: repositorio Uniandes |
repository_id_str |
|
dc.title.eng.fl_str_mv |
3D Human pose estimation from egocentric inputs |
dc.creator.fl_str_mv |
Escobar Palomeque, María Camila |
dc.contributor.advisor.none.fl_str_mv |
Arbeláez Escalante, Pablo Andrés |
dc.contributor.author.none.fl_str_mv |
Escobar Palomeque, María Camila |
dc.contributor.jury.none.fl_str_mv |
Giraldo Trujillo, Luis Felipe; Kevis-Kokitsi Maninis; Thabet, Ali |
dc.contributor.researchgroup.none.fl_str_mv |
Facultad de Ingeniería |
dc.subject.keyword.eng.fl_str_mv |
Egocentric vision; Pose estimation; Pose forecasting |
topic |
Egocentric vision; Pose estimation; Pose forecasting; Ingeniería |
dc.subject.themes.spa.fl_str_mv |
Ingeniería |
description |
Egocentric pose estimation is essential for developing embodied AI systems capable of interacting naturally with humans and their environments. This thesis addresses the challenges of first-person pose estimation through a series of interconnected studies. The first study, BoDiffusion, presents a generative model that synthesizes full-body motion from sparse inputs. The second study, Ego-Exo4D, establishes a benchmark for pose estimation in real-life settings with diverse activities. The final study, EgoCast, focuses on current pose estimation and forecasting in the wild, integrating visual and proprioceptive inputs to handle dynamic and unscripted environments. Together, these contributions provide robust, temporally consistent methods for real-world 3D pose estimation. |
publishDate |
2024 |
dc.date.issued.none.fl_str_mv |
2024-12-12 |
dc.date.accessioned.none.fl_str_mv |
2025-01-14T15:49:01Z |
dc.date.available.none.fl_str_mv |
2025-01-14T15:49:01Z |
dc.type.none.fl_str_mv |
Trabajo de grado - Doctorado |
dc.type.driver.none.fl_str_mv |
info:eu-repo/semantics/doctoralThesis |
dc.type.version.none.fl_str_mv |
info:eu-repo/semantics/acceptedVersion |
dc.type.coar.none.fl_str_mv |
http://purl.org/coar/resource_type/c_db06 |
dc.type.content.none.fl_str_mv |
Text |
dc.type.redcol.none.fl_str_mv |
https://purl.org/redcol/resource_type/TD |
format |
http://purl.org/coar/resource_type/c_db06 |
status_str |
acceptedVersion |
dc.identifier.uri.none.fl_str_mv |
https://hdl.handle.net/1992/75400 |
dc.identifier.instname.none.fl_str_mv |
instname:Universidad de los Andes |
dc.identifier.reponame.none.fl_str_mv |
reponame:Repositorio Institucional Séneca |
dc.identifier.repourl.none.fl_str_mv |
repourl:https://repositorio.uniandes.edu.co/ |
url |
https://hdl.handle.net/1992/75400 |
identifier_str_mv |
instname:Universidad de los Andes
reponame:Repositorio Institucional Séneca
repourl:https://repositorio.uniandes.edu.co/ |
dc.language.iso.none.fl_str_mv |
eng |
language |
eng |
dc.rights.en.fl_str_mv |
Attribution-NonCommercial-NoDerivatives 4.0 International |
dc.rights.uri.none.fl_str_mv |
http://creativecommons.org/licenses/by-nc-nd/4.0/ |
dc.rights.accessrights.none.fl_str_mv |
info:eu-repo/semantics/openAccess |
dc.rights.coar.none.fl_str_mv |
http://purl.org/coar/access_right/c_abf2 |
rights_invalid_str_mv |
Attribution-NonCommercial-NoDerivatives 4.0 International
http://creativecommons.org/licenses/by-nc-nd/4.0/
http://purl.org/coar/access_right/c_abf2 |
eu_rights_str_mv |
openAccess |
dc.format.extent.none.fl_str_mv |
85 pages |
dc.format.mimetype.none.fl_str_mv |
application/pdf |
dc.publisher.none.fl_str_mv |
Universidad de los Andes |
dc.publisher.program.none.fl_str_mv |
Doctorado en Ingeniería |
dc.publisher.faculty.none.fl_str_mv |
Facultad de Ingeniería |
publisher.none.fl_str_mv |
Universidad de los Andes |
institution |
Universidad de los Andes |
bitstream.url.fl_str_mv |
https://repositorio.uniandes.edu.co/bitstreams/161754c2-4bf5-41a7-be3c-d3572861d5e0/download
https://repositorio.uniandes.edu.co/bitstreams/20e0500d-12dc-4867-96d9-317a0573e9b1/download
https://repositorio.uniandes.edu.co/bitstreams/9936e6d2-0126-4922-9654-a1adde904017/download
https://repositorio.uniandes.edu.co/bitstreams/3d830678-4cbd-4101-a99f-5a74ad5d1123/download
https://repositorio.uniandes.edu.co/bitstreams/1d0d8b5a-0b01-4fc9-8eb5-224d215071af/download
https://repositorio.uniandes.edu.co/bitstreams/ede8a343-0c34-410d-83fe-fd3e3074c1f4/download
https://repositorio.uniandes.edu.co/bitstreams/a08f3924-e3c9-4df5-9814-1de472808875/download
https://repositorio.uniandes.edu.co/bitstreams/71cf6db1-af8c-4666-a6d4-5d78178ee652/download |
bitstream.checksum.fl_str_mv |
7d28a10d09cf20e2059dab4d74c4d9a2
f594fe8c01b59680f85e693a349232d5
4460e5956bc1d1639be9ae6146a50347
ae9e573a68e7f92501b6913cc846c39f
eed94123b3664134e91c8d7afdb3e60e
3f7db7e9a1130c0929dec54179d91ecd
f2111e44fdd5140f8e093425da20ba06
db507722fc7e945a9aa0fbc25f2f5f50 |
bitstream.checksumAlgorithm.fl_str_mv |
MD5 MD5 MD5 MD5 MD5 MD5 MD5 MD5 |
repository.name.fl_str_mv |
Repositorio institucional Séneca |
repository.mail.fl_str_mv |
adminrepositorio@uniandes.edu.co |
_version_ |
1831927721772449792 |