Symbiosis between Human, Robot and Extended Reality

The facilities of the Colivri laboratory were used for practical work related to augmented reality.

Authors:
Vargas Cuadros, Andrés Felipe
Resource type:
Undergraduate degree project
Publication date:
2024
Institution:
Universidad de los Andes
Repository:
Séneca: Uniandes institutional repository
Language:
eng
OAI Identifier:
oai:repositorio.uniandes.edu.co:1992/74929
Online access:
https://hdl.handle.net/1992/74929
Keywords:
Robot
Symbiosis
Gestures
Dynamics
Engineering
Rights:
openAccess
License:
Attribution 4.0 International (http://creativecommons.org/licenses/by/4.0/)
Alternative title: Simbiosis entre humano, robot y realidad aumentada
Advisor: Camargo Leyva, Jonathan
Date issued: 2024-07-31
dc.relation.references.none.fl_str_mv EPSON. (s.f.). Epson VT6L All-in-One 6-Axis Robot. Obtenido de https://epson.com/6-axis-built-in-controller-industrial-robot
EPSON. (s.f.). EPSON VT6L ALL-IN-ONE 6-AXIS ROBOT. Obtenido de https://www.industrialcontrol.com/epson-vt6l-allinone-6axis-robot
Microsoft. (s.f.). Obtenido de https://support.xbox.com/en-US/help/hardware-network/controller/xbox-accessories-elite-series-2
Parejo, J. C. (2008). La representación Denavit-Hartenberg. Obtenido de La representación Denavit-Hartenberg
Extent: 44 pages
Format: application/pdf
Publisher: Universidad de los Andes
Program: Ingeniería Mecánica
Faculty: Facultad de Ingeniería
Department: Departamento de Ingeniería Mecánica
Abstract:
My research project focused on combining inverse and direct kinematics to enable a robotic arm to move in 3D space. This required a deep dive into the principles of both to achieve precise control over the arm's movements. The project also aimed to understand how modern virtual reality (VR) applications are implemented in Unreal Engine, from creating a skeleton mesh to integrating it with Blueprints for seamless functionality.

Inverse kinematics (IK) and direct kinematics (DK) are fundamental concepts in robotics, each playing a critical role in controlling robotic arms. Direct kinematics calculates the position and orientation of the robot's end effector (the "hand" or tool at the end of the arm) from given joint parameters. It provides a straightforward way to determine where the end effector will be when each joint is at a specific angle, which is essential for basic movements and positioning within a defined workspace. Inverse kinematics works in the opposite direction: given a desired position and orientation of the end effector, it calculates the joint parameters necessary to achieve that position.
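For intuition, direct kinematics can be sketched for a two-link planar arm (a minimal illustration with made-up link lengths, not the 6-axis geometry used in the project):

```python
import math

def forward_kinematics(l1, l2, theta1, theta2):
    """End-effector position of a 2-link planar arm.

    l1, l2: link lengths; theta1, theta2: joint angles in radians
    (theta2 measured relative to link 1).
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Both joints at 0: arm fully extended along x, so the tip sits at (l1 + l2, 0).
print(forward_kinematics(1.0, 1.0, 0.0, 0.0))
```

Inverse kinematics must invert this mapping, recovering the joint angles from a desired end-effector position.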
This is a more complex problem because it involves solving equations with multiple variables, often requiring iterative methods or optimization techniques. IK is crucial for tasks where the end effector must reach a specific point in space, such as picking up an object or performing precise assembly work. To practically apply these kinematic principles, we used MATLAB & Simulink for simulation and validation. These tools allowed us to model the kinematic equations and visualize the robotic arm's movements. By inputting various parameters and observing the resulting motions, we could refine our understanding and ensure the accuracy of our calculations. This step was vital in developing a robust foundation before moving to the implementation phase in Unreal Engine. In MATLAB & Simulink, we simulated different scenarios for both inverse and direct kinematics. For direct kinematics, we input specific joint angles and observed the end effector's position. For inverse kinematics, we specified a target position and calculated the joint angles required to reach it. This dual approach ensured that our robotic arm could move accurately and efficiently within a 3D space. With a solid understanding of kinematics, the next step was to create a virtual model of the robotic arm in Unreal Engine. This process began with developing a detailed skeleton mesh, which serves as the underlying structure for the robotic arm. The skeleton mesh consists of bones and joints, each representing a part of the robotic arm. These bones are hierarchically connected, mimicking the physical structure of the robot. In Blender, we created the 3D model of the robotic arm, including all its components. Each part was carefully designed to reflect the actual robotic arm's dimensions and functionalities. Once the model was complete, we exported it as an STL file and imported it into Unreal Engine. 
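Returning to the kinematics: for a two-link planar arm the inverse problem has a closed-form solution via the law of cosines (a sketch under simplified assumptions; the project's 6-axis arm requires the full Denavit-Hartenberg treatment and often numerical methods):

```python
import math

def inverse_kinematics(l1, l2, x, y):
    """Joint angles (elbow-down branch) placing a 2-link planar arm's
    end effector at (x, y). Raises ValueError if the target is out of reach."""
    d2 = x * x + y * y
    # Law of cosines gives the cosine of the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A round trip through the forward equations is the usual sanity check when validating such solvers in MATLAB & Simulink or elsewhere.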
Within Unreal Engine, the model was converted into two meshes: one for the skeleton and another for the assembled parts. Unreal Engine's Blueprints system provides a powerful visual scripting environment for developing interactive elements within a VR application. To control the robotic arm, we used Blueprints to define its behavior and responses to user inputs. This involved programming the kinematic equations and ensuring that the arm's movements were realistic and precise. We started by setting up the skeleton mesh in Unreal Engine, ensuring that each bone was correctly positioned and oriented. Next, we created Blueprints to handle the inverse and direct kinematics calculations. These Blueprints allowed us to input joint angles or target positions and automatically compute the necessary movements. To implement the inverse kinematics, we used Unreal Engine's animation system to drive the arm's joints based on the calculated angles. This system enabled real-time control and provided a visual representation of the arm's movements. For direct kinematics, we programmed Blueprints to take user inputs, such as joystick movements or keyboard commands, and translate them into joint angles. One of the project's innovative aspects was the incorporation of gesture-based controls. Using the VR Expansion Plugin (VRE) and MetaXR, we enabled the system to recognize and respond to predefined gestures. This capability significantly enhanced the user experience, allowing for intuitive control of the robotic arm. We developed Blueprints that could record and recognize gestures. For example, a specific hand movement could trigger the robotic arm to perform a particular action. This was achieved by recording the gesture data and assigning it to corresponding actions in the Blueprints. The VRE plugin facilitated this process by providing tools for gesture recognition and data management. To ensure the application ran smoothly, we focused on optimizing performance. 
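The record-and-recognize approach to gestures described above can be illustrated outside Unreal with a toy template matcher: a recorded gesture is a sequence of 2D points, and a new trace matches the closest template whose average point-to-point distance falls below a threshold (the gesture names, traces, and threshold here are invented; the actual recognition in the project is handled by the VR Expansion Plugin):

```python
def avg_distance(trace, template):
    """Mean Euclidean distance between two equal-length point sequences."""
    assert len(trace) == len(template)
    total = 0.0
    for (x1, y1), (x2, y2) in zip(trace, template):
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return total / len(trace)

def recognize(trace, gestures, threshold=0.2):
    """Return the name of the closest recorded gesture, or None if no match."""
    best_name, best_dist = None, threshold
    for name, template in gestures.items():
        d = avg_distance(trace, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name

# Hypothetical recorded gestures: a horizontal swipe and a vertical swipe.
gestures = {
    "swipe_right": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],
    "swipe_up": [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],
}
print(recognize([(0.0, 0.05), (0.5, 0.0), (1.0, 0.05)], gestures))  # → swipe_right
```

In a real pipeline the traces would first be resampled to a common length and normalized for scale, which this sketch omits.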
Optimization involved fine-tuning the Blueprints and minimizing computational overhead. One critical consideration was offloading gesture-recognition processing from the VR headset to an external API. By developing a script to read MediaPipe data directly in Unreal Engine, we reduced the processing load on the headset, resulting in a more responsive and immersive experience.

The project demonstrated how kinematic principles can be combined with modern VR technologies to control a robotic arm in a 3D environment. The success of the gesture-based control system opens up numerous possibilities for future applications. For instance, this approach can be extended to other types of robots or machinery, providing a flexible and intuitive control method. Moreover, integrating AI tools with Unreal Engine can further enhance the system's capabilities: AI-driven gesture recognition can improve accuracy and adapt to individual user preferences, and advanced AI algorithms can optimize the kinematic calculations, making the system more efficient and responsive.
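As a final illustration, the headset-offloading idea can be sketched as an external process streaming landmark data to the engine over UDP. The message shape, host, and port below are assumptions for illustration; the 21-landmark (x, y, z) layout follows MediaPipe Hands' output format, but the capture itself (camera plus MediaPipe) is omitted:

```python
import json
import socket

def encode_landmarks(landmarks):
    """Serialize (x, y, z) landmark tuples to a compact JSON payload for UDP."""
    return json.dumps({"landmarks": [list(p) for p in landmarks]}).encode()

def send_landmarks(landmarks, host="127.0.0.1", port=9999):
    """Fire-and-forget UDP send; an Unreal-side reader would parse the JSON
    and drive the gesture Blueprints from the decoded landmark positions."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(encode_landmarks(landmarks), (host, port))
```

Keeping the payload small and stateless is what makes this kind of offloading cheap for the headset: it only decodes coordinates instead of running the vision model itself.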