Detection and Segmentation of Malignant Melanoma Regions in Dermoscopic Images Using Machine Learning
Melanomas are among the deadliest types of skin cancer, requiring early detection to improve survival rates. This project employs deep learning techniques, specifically U-Net-based architectures, to segment and identify melanomas in an open dataset of dermoscopic images. For the segmentation task, the proposed models achieved a best Dice score of 0.88 and an Intersection over Union (IoU) of 0.80, demonstrating their effectiveness in delineating lesion boundaries. In the classification task, the approaches ranged from Convolutional Neural Networks (CNNs) to embedding extraction combined with machine learning models. The best-performing classification model attained an overall accuracy of 0.84, with an F1-score of 0.45 for the melanoma class. These results highlight the potential of the developed methodologies to enhance diagnostic precision and support dermatologists in identifying high-risk cases more effectively.
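The Dice score and IoU reported above are standard overlap metrics for binary lesion masks. The snippet below is a minimal NumPy sketch of how they are typically computed; it is illustrative only (the function names and toy masks are invented here) and does not reproduce the thesis's actual evaluation code.

```python
# Illustrative sketch of Dice and IoU for binary segmentation masks.
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2|P ∩ T| / (|P| + |T|) for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def iou_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """IoU = |P ∩ T| / |P ∪ T| for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

# Toy 4x4 example: the prediction covers 3 of the 4 lesion pixels.
target = np.zeros((4, 4), dtype=np.uint8); target[1:3, 1:3] = 1
pred = np.zeros((4, 4), dtype=np.uint8); pred[1:3, 1:2] = 1; pred[1, 2] = 1
print(f"Dice={dice_score(pred, target):.3f}, IoU={iou_score(pred, target):.3f}")
```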
- Authors:
- Martínez Novoa, Santiago
- Resource type:
- Undergraduate degree project (Trabajo de grado de pregrado)
- Publication date:
- 2025
- Institution:
- Universidad de los Andes
- Repository:
- Séneca: repositorio Uniandes
- Language:
- eng
- OAI Identifier:
- oai:repositorio.uniandes.edu.co:1992/75910
- Online access:
- https://hdl.handle.net/1992/75910
- Keywords:
- Deep learning
- Melanoma detection
- U-Net
- Segmentation
- Convolutional neural networks
- Ingeniería
- Rights:
- openAccess
- License:
- Attribution 4.0 International
id: UNIANDES2_7e92655684fb2412088fc9dcb8ef49ba
oai_identifier_str: oai:repositorio.uniandes.edu.co:1992/75910
network_acronym_str: UNIANDES2
network_name_str: Séneca: repositorio Uniandes
dc.title.eng.fl_str_mv: Detection and Segmentation of Malignant Melanoma Regions in Dermoscopic Images Using Machine Learning
dc.creator.fl_str_mv: Martínez Novoa, Santiago
dc.contributor.advisor.none.fl_str_mv: Reyes Gómez, Juan Pablo
dc.contributor.author.none.fl_str_mv: Martínez Novoa, Santiago
dc.contributor.researchgroup.none.fl_str_mv: Facultad de Ingeniería::COMIT - Comunicaciones y Tecnología de Información
dc.subject.keyword.eng.fl_str_mv: Deep learning; Melanoma detection; U-Net; Segmentation; Convolutional neural networks
dc.subject.themes.spa.fl_str_mv: Ingeniería
description: Melanomas are among the deadliest types of skin cancer, requiring early detection to improve survival rates. This project employs deep learning techniques, specifically U-Net-based architectures, to segment and identify melanomas in an open dataset of dermoscopic images. For the segmentation task, the proposed models achieved a best Dice score of 0.88 and an Intersection over Union (IoU) of 0.80, demonstrating their effectiveness in delineating lesion boundaries. In the classification task, the approaches ranged from Convolutional Neural Networks (CNNs) to embedding extraction combined with machine learning models. The best-performing classification model attained an overall accuracy of 0.84, with an F1-score of 0.45 for the melanoma class. These results highlight the potential of the developed methodologies to enhance diagnostic precision and support dermatologists in identifying high-risk cases more effectively.
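As a rough illustration of the "embedding extraction combined with machine learning models" idea mentioned in the description, the sketch below pulls global-average-pooled features from a frozen ImageNet backbone and fits a linear classifier on them. The choice of MobileNetV2, LogisticRegression, the placeholder data, and the label convention are assumptions made for this example, not the thesis's actual pipeline.

```python
# A minimal sketch, assuming TensorFlow/Keras and scikit-learn are available;
# backbone, classifier, and data are illustrative stand-ins, not the thesis code.
import numpy as np
import tensorflow as tf
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Frozen ImageNet backbone used purely as a feature extractor
# (global-average-pooled embeddings of size 1280 for MobileNetV2).
backbone = tf.keras.applications.MobileNetV2(
    include_top=False, pooling="avg", weights="imagenet", input_shape=(224, 224, 3)
)
preprocess = tf.keras.applications.mobilenet_v2.preprocess_input

def extract_embeddings(images: np.ndarray) -> np.ndarray:
    """images: array of shape (N, 224, 224, 3) with values in [0, 255]."""
    return backbone.predict(preprocess(images.astype("float32")), verbose=0)

# Placeholder data standing in for dermoscopic crops and melanoma labels
# (1 = melanoma); deliberately imbalanced to mirror the scenario.
x_train = np.random.randint(0, 256, size=(16, 224, 224, 3))
y_train = np.array([0] * 12 + [1] * 4)

emb = extract_embeddings(x_train)
# class_weight="balanced" is one common way to cope with the rare melanoma class.
clf = LogisticRegression(max_iter=1000, class_weight="balanced").fit(emb, y_train)
print("train F1 (melanoma class):", f1_score(y_train, clf.predict(emb)))
```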
dc.date.accessioned.none.fl_str_mv: 2025-01-31T15:33:32Z
dc.date.available.none.fl_str_mv: 2025-01-31T15:33:32Z
dc.date.issued.none.fl_str_mv: 2025-01-29
dc.type.none.fl_str_mv: Trabajo de grado - Pregrado
dc.type.driver.none.fl_str_mv: info:eu-repo/semantics/bachelorThesis
dc.type.version.none.fl_str_mv: info:eu-repo/semantics/acceptedVersion
dc.type.coar.none.fl_str_mv: http://purl.org/coar/resource_type/c_7a1f
dc.type.content.none.fl_str_mv: Text
dc.type.redcol.none.fl_str_mv: http://purl.org/redcol/resource_type/TP
dc.identifier.uri.none.fl_str_mv: https://hdl.handle.net/1992/75910
dc.identifier.instname.none.fl_str_mv: instname:Universidad de los Andes
dc.identifier.reponame.none.fl_str_mv: reponame:Repositorio Institucional Séneca
dc.identifier.repourl.none.fl_str_mv: repourl:https://repositorio.uniandes.edu.co/
dc.language.iso.none.fl_str_mv: eng
dc.rights.en.fl_str_mv: Attribution 4.0 International
dc.rights.uri.none.fl_str_mv: http://creativecommons.org/licenses/by/4.0/
dc.rights.accessrights.none.fl_str_mv: info:eu-repo/semantics/openAccess
dc.rights.coar.none.fl_str_mv: http://purl.org/coar/access_right/c_abf2
dc.format.extent.none.fl_str_mv: 44 pages
dc.format.mimetype.none.fl_str_mv: application/pdf
dc.publisher.none.fl_str_mv: Universidad de los Andes
dc.publisher.program.none.fl_str_mv: Ingeniería de Sistemas y Computación
dc.publisher.faculty.none.fl_str_mv: Facultad de Ingeniería
dc.publisher.department.none.fl_str_mv: Departamento de Ingeniería de Sistemas y Computación
repository.name.fl_str_mv: Repositorio institucional Séneca
repository.mail.fl_str_mv: adminrepositorio@uniandes.edu.co