{"id":2100,"date":"2024-08-25T19:31:20","date_gmt":"2024-08-25T19:31:20","guid":{"rendered":"https:\/\/labs.cs.queensu.ca\/perklab\/?post_type=qsc_member&#038;p=2100"},"modified":"2024-08-25T19:31:21","modified_gmt":"2024-08-25T19:31:21","slug":"laura-connolly","status":"publish","type":"qsc_member","link":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/laura-connolly\/","title":{"rendered":"Laura\u00a0Connolly"},"content":{"rendered":"<div class=\"wp-block-columns is-layout-flex wp-block-columns-is-layout-flex qsc-member-single-core-info-container\">\n\t<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-photo-column\">\n\t\t<img loading=\"lazy\" decoding=\"async\" width=\"250\" height=\"225\" src=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909.png\" class=\"qsc-member-single-photo wp-post-image\" alt=\"\" srcset=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909.png 796w, https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909-300x269.png 300w, https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909-768x690.png 768w\" sizes=\"auto, (max-width: 250px) 100vw, 250px\" \/>\n\t<\/div>\n\t<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-info-column\">\n\t\t<div class=\"qsc-member-name\"><h1>Laura\u00a0Connolly<\/h1><\/div>\n\t\t<div class=\"qsc-member-position\">PhD Student<\/div>\n\t\t<div class=\"qsc-member-department\">Department of Electrical and Computer Engineering (Smith Engineering)<\/div>\n\t\t<div class=\"qsc-member-organization\">Queen&#8217;s University<\/div>\n\t\t<div class=\"qsc-member-contact\">\n\t\t\t<div class=\"qsc-member-email\"><a href=\"mailto:laura.connolly@queensu.ca\">laura.connolly@queensu.ca<\/a><\/div>\n\t\t\t<div 
class=\"qsc-member-socials\">\n\t\t\t<a href=\"https:\/\/www.linkedin.com\/in\/laura-connolly-0aab43144\/\" title=\"LinkedIn\"><i class=\"fa-brands fa-linkedin\"><\/i><\/a>\n\t\t\t<a href=\"https:\/\/scholar.google.com\/citations?user=9E-xfIwAAAAJ&amp;hl=en\" title=\"Google Scholar\"><i class=\"fa-brands fa-google-scholar\"><\/i><\/a>\n\t\t\t<a href=\"https:\/\/github.com\/LauraConnolly\" title=\"GitHub\"><i class=\"fa-brands fa-github\"><\/i><\/a>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/div>\n<div class=\"qsc-member-bio\">\n\t\n<h2 class=\"wp-block-heading\">Biography<\/h2>\n\n\n\n<p>Laura Connolly is a Ph.D. Candidate in Electrical Engineering at Queen\u2019s University in Kingston, ON, Canada. She is mentored by Dr. Gabor Fichtinger, Dr. Parvin Mousavi, and Dr. Russell H. Taylor. She recently completed her second visiting studentship at Johns Hopkins University in Baltimore, MD, USA, in the Laboratory for Computational Sensing and Robotics (LCSR).<\/p>\n\n\n\n<p>Laura began researching computer-integrated surgery in 2018 as an undergraduate researcher before entering her graduate studies in 2020. 
Her research is now primarily focused on the application of robotics and image guidance for margin detection in breast-conserving surgery.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Publications<\/h2>\n\n\n<div class=\"teachpress_pub_list\"><form name=\"tppublistform\" method=\"get\"><a name=\"tppubs\" id=\"tppubs\"><\/a><\/form><div class=\"teachpress_publication_list\"><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Elkind, Emese;  Tun, Aung Tin;  Radcliffe, Olivia;  Connolly, Laura;  Davison, Colleen;  Purkey, Eva;  Mousavi, Parvin;  Fichtinger, Gabor;  Thornton, Kanchana<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/\" target=\"blank\">Developing low-cost 3D-printed prosthetics with a functional wrist for patients along the Thai-Myanmar border<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_publisher\">INOVAIT Image-Guided Therapy (IGT) x Imaging Network Ontario (ImNO), <\/span><span class=\"tp_pub_additional_year\">2025<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1163\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1163','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1163\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1163','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1163\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1163','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1163\" 
style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Elkind2025b,<br \/>\r\ntitle = {Developing low-cost 3D-printed prosthetics with a functional wrist for patients along the Thai-Myanmar border},<br \/>\r\nauthor = {Emese Elkind and Aung Tin Tun and Olivia Radcliffe and Laura Connolly and Colleen Davison and Eva Purkey and Parvin Mousavi and Gabor Fichtinger and Kanchana Thornton <br \/>\r\n},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-03-05},<br \/>\r\nurldate = {2025-03-05},<br \/>\r\npublisher = {INOVAIT Image-Guided Therapy (IGT) x Imaging Network Ontario (ImNO)},<br \/>\r\nabstract = {INTRODUCTION: Inadequacies in the Burmese healthcare system, heightened by the 2021 military coup and related civil war in Myanmar and the COVID-19 pandemic, have contributed to an influx of refugees to Thailand to seek medical aid. An estimated 1.5 million Myanmar nationals have entered Thailand since January 2023 [5]. Without immigration status, these refugees are unable to receive healthcare. Burma Children Medical Fund (BCMF) is a nonprofit based in Mae Sot, Tak, Thailand that focuses on funding underserved Burmese communities\u2019 medical treatment and providing support services, including accessible prosthetics for refugees who have experienced limb loss [1]. Prosthetics in lower-income countries are usually passive, meaning they lack mechanisms to restore critical limb functions such as gripping, rotation, or complex hand movements. Therefore, patients cannot fully perform their daily functions, impacting their abilities to work and affecting family caretakers. BCMF aims to make prosthetics that work best in low-resource settings using open-source designs, which only allow for fixed hand positions. The usage of prosthetic arms depends heavily on their functionality and comfort. 
Patients are more likely to consistently use prosthetics if they aid them in returning to normalcy. In this study, we present a design for an interchangeable and functional prosthetic wrist that enables critical hand motions such as rotation.<br \/>\r\nMETHODS: BCMF currently provides custom-fitted, low-cost, 3D-printed prostheses that are found on Thingiverse, a public library of 3D designs. One such design is the Kwawu Arm 2.0 [2], which can be adjusted with OpenSCAD [4], software for modifying 3D CAD models to fit the recipient's measurements. To maintain BCMF\u2019s workflow, the interchangeable wrist model was created using the 3D design software Autodesk Fusion 360 and open-source Quick-Connect Wrist designs found on Thingiverse [3]. The wrist was merged onto the Kwawu Arm, printed, assembled, and tested for durability and comfort both with and without patients. This is an iterative process where patient feedback ensures the prosthetics cater to the diverse needs of the recipients. <br \/>\r\nRESULTS: Since the launch of the prosthetics project in 2019, BCMF has provided 3D-printed prosthetics to 76 patients. The interchangeable hand provides a solution to many patients' everyday activities, can rotate the hand 360 degrees (Fig.2), and has been tested on and used by one patient thus far (Fig.1). <br \/>\r\nCONCLUSIONS: The BCMF prosthetics project provides a low-cost solution to healthcare challenges in the context of the poly-crisis experienced in Myanmar, enhancing the resilience and adaptability of affected refugee communities. The collaboration between BCMF and Queen\u2019s University demonstrates the potential for future partnerships between educational institutions and NGOs to address health care access disparities. Future work includes continuing to fill the gap between open-source models and patient-specific needs, refining the 3D-printing workflow by creating customizable, generalized designs. 
We also plan to test the interchangeable wrist with more patients and develop body-powered prosthetic designs to support more critical movements.<br \/>\r\nREFERENCES: [1]Burma Children Medical Fund - Mae Sot, Thailand. BCMF | Burma Children Medical Fund - Mae Sot, Thailand - Operating to give people a future. (n.d.). https:\/\/burmachildren.com\/ [2]Buchanan, J. (2018, March 27). Kwawu Arm 2.0 - Prosthetic - socket version. Thingiverse. https:\/\/www.thingiverse.com\/thing:2841281 [3]NIOP. (2022, February 9). NIOP Q-C V1 quick-connect wrist. Thingiverse. http:\/\/www.thingiverse.com\/thing:5238794 [4]OpenSCAD. The Programmers Solid 3D CAD Modeller. (n.d.). https:\/\/openscad.org\/ [5]UN. Overview of Myanmar nationals in Thailand. IOM UN migration. https:\/\/thailand.iom.int\/sites\/g\/files\/tmzbdl1371\/files\/documents\/2024-10\/overview-of-myanmar-nationals-in-thailand-october-24.pdf <br \/>\r\n},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1163','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1163\" style=\"display:none;\"><div class=\"tp_abstract_entry\">INTRODUCTION: Inadequacies in the Burmese healthcare system, heightened by the 2021 military coup and related civil war in Myanmar and the COVID-19 pandemic, have contributed to an influx of refugees to Thailand to seek medical aid. An estimated 1.5 million Myanmar nationals have entered Thailand since January 2023 [5]. Without immigration status, these refugees are unable to receive healthcare. Burma Children Medical Fund (BCMF) is a nonprofit based in Mae Sot, Tak, Thailand that focuses on funding underserved Burmese communities\u2019 medical treatment and providing support services, including accessible prosthetics for refugees who have experienced limb loss [1]. 
Prosthetics in lower-income countries are usually passive, meaning they lack mechanisms to restore critical limb functions such as gripping, rotation, or complex hand movements. Therefore, patients cannot fully perform their daily functions, impacting their abilities to work and affecting family caretakers. BCMF aims to make prosthetics that work best in low-resource settings using open-source designs, which only allow for fixed hand positions. The usage of prosthetic arms depends heavily on their functionality and comfort. Patients are more likely to consistently use prosthetics if they aid them in returning to normalcy. In this study, we present a design for an interchangeable and functional prosthetic wrist that enables critical hand motions such as rotation.<br \/>\r\nMETHODS: BCMF currently provides custom-fitted, low-cost, 3D-printed prostheses that are found on Thingiverse, a public library of 3D designs. One such design is the Kwawu Arm 2.0 [2], which can be adjusted with OpenSCAD [4], software for modifying 3D CAD models to fit the recipient's measurements. To maintain BCMF\u2019s workflow, the interchangeable wrist model was created using the 3D design software Autodesk Fusion 360 and open-source Quick-Connect Wrist designs found on Thingiverse [3]. The wrist was merged onto the Kwawu Arm, printed, assembled, and tested for durability and comfort both with and without patients. This is an iterative process where patient feedback ensures the prosthetics cater to the diverse needs of the recipients. <br \/>\r\nRESULTS: Since the launch of the prosthetics project in 2019, BCMF has provided 3D-printed prosthetics to 76 patients. The interchangeable hand provides a solution to many patients' everyday activities, can rotate the hand 360 degrees (Fig.2), and has been tested on and used by one patient thus far (Fig.1). 
<br \/>\r\nCONCLUSIONS: The BCMF prosthetics project provides a low-cost solution to healthcare challenges in the context of the poly-crisis experienced in Myanmar, enhancing the resilience and adaptability of affected refugee communities. The collaboration between BCMF and Queen\u2019s University demonstrates the potential for future partnerships between educational institutions and NGOs to address health care access disparities. Future work includes continuing to fill the gap between open-source models and patient-specific needs, refining the 3D-printing workflow by creating customizable, generalized designs. We also plan to test the interchangeable wrist with more patients and develop body-powered prosthetic designs to support more critical movements.<br \/>\r\nREFERENCES: [1]Burma Children Medical Fund - Mae Sot, Thailand. BCMF | Burma Children Medical Fund - Mae Sot, Thailand - Operating to give people a future. (n.d.). https:\/\/burmachildren.com\/ [2]Buchanan, J. (2018, March 27). Kwawu Arm 2.0 - Prosthetic - socket version. Thingiverse. https:\/\/www.thingiverse.com\/thing:2841281 [3]NIOP. (2022, February 9). NIOP Q-C V1 quick-connect wrist. Thingiverse. http:\/\/www.thingiverse.com\/thing:5238794 [4]OpenSCAD. The Programmers Solid 3D CAD Modeller. (n.d.). https:\/\/openscad.org\/ [5]UN. Overview of Myanmar nationals in Thailand. IOM UN migration. 
https:\/\/thailand.iom.int\/sites\/g\/files\/tmzbdl1371\/files\/documents\/2024-10\/overview-of-myanmar-nationals-in-thailand-october-24.pdf <br \/>\r\n<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1163','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1163\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/eelkind_imno2025_poster_wrist\/<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1163','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Elkind, Emese;  Radcliffe, Olivia;  Tun, Aung Tin;  Connolly, Laura;  Davison, Colleen;  Purkey, Eva;  Fichtinger, Gabor;  Thornton, Kanchana<\/p><p class=\"tp_pub_title\">Strengthening Low-cost Prosthetic Solutions in Thailand\/Myanmar Through Academic Institution-NGO Collaboration  <span class=\"tp_pub_label_award\" title=\"Honorable Mention\"><i class=\"fas fa-award\"><\/i> Honorable Mention<\/span> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Health &amp; Human Rights Conference, <\/span><span class=\"tp_pub_additional_organization\">Queen's University <\/span><span class=\"tp_pub_additional_publisher\">School of Medicine, <\/span><span class=\"tp_pub_additional_year\">2025<\/span><span class=\"tp_pub_additional_note\">, (3rd place)<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1162\" class=\"tp_show\" 
onclick=\"teachpress_pub_showhide('1162','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1162\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1162','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1162\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Elkind2025,<br \/>\r\ntitle = {Strengthening Low-cost Prosthetic Solutions in Thailand\/Myanmar Through Academic Institution-NGO Collaboration },<br \/>\r\nauthor = {Emese Elkind and Olivia Radcliffe and Aung Tin Tun and Laura Connolly and Colleen Davison and Eva Purkey and Gabor Fichtinger and Kanchana Thornton<br \/>\r\n},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-02-22},<br \/>\r\nurldate = {2025-02-22},<br \/>\r\nbooktitle = {Health & Human Rights Conference},<br \/>\r\npublisher = {School of Medicine},<br \/>\r\norganization = {Queen's University},<br \/>\r\nabstract = {The ongoing civil war in Myanmar, along with the related coup in 2021, has displaced millions of refugees to Thailand, where many lack immigration status and cannot access medical care. The Burma Children Medical Fund (BCMF) [1] addresses these challenges by providing funding and support for medical treatment, including a 3D-printed prosthetics program initiated in 2019 for individuals with limb loss. Due to limited Computer-Aided Design (CAD) experience, BCMF staff have turned to open-source prosthetic designs. We aim to establish an academia-NGO partnership to strengthen BCMF\u2019s efforts, provide technical support, and broaden outreach to underserved communities needing low-cost, body-powered prosthetic devices. Our collaboration includes Queen\u2019s University volunteers traveling to BCMF\u2019s workshop for on-ground support and continuing remote assistance. 
As BCMF utilizes open-source prosthetic designs from platforms such as Thingiverse [2], we wanted to maintain the 3D printing workflow while addressing gaps in open-source prosthetic offerings. We identified three critical needs: devices for short-below-elbow amputees, above-elbow amputees, and a detachable, rotatable wrist. In response, we modified BCMF\u2019s most used prosthetic design to customize the model for these specific needs. We conducted iterative testing for durability and comfort, ensuring constant communication between staff and recipients, allowing patient feedback to guide our designs. Over the past two years, Queen\u2019s University has sent two volunteers to BCMF, with another planned for this year. So far, five recipients use our short-below-elbow prosthetic design, and one has received a quick connect wrist. In addition, we are currently collaborating remotely on a new prosthetic design for above-elbow amputees. This partnership between Queen\u2019s University and BCMF improves access to low-cost prosthetic solutions, expands BCMF\u2019s recipient pool, and demonstrates the potential for future partnerships between educational institutions and NGOs to address disparities in healthcare access. <br \/>\r\nReferences<br \/>\r\n[1] Burma Children Medical Fund - Mae Sot, Thailand. BCMF | Burma Children Medical Fund - Mae Sot, Thailand - Operating to give people a future. https:\/\/burmachildren.com\/ <br \/>\r\n[2] Buchanan, J. (2018, March 27). Kwawu Arm 2.0 - Prosthetic - socket version. Thingiverse. 
https:\/\/www.thingiverse.com\/thing:2841281 },<br \/>\r\nnote = {3rd place},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1162','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1162\" style=\"display:none;\"><div class=\"tp_abstract_entry\">The ongoing civil war in Myanmar, along with the related coup in 2021, has displaced millions of refugees to Thailand, where many lack immigration status and cannot access medical care. The Burma Children Medical Fund (BCMF) [1] addresses these challenges by providing funding and support for medical treatment, including a 3D-printed prosthetics program initiated in 2019 for individuals with limb loss. Due to limited Computer-Aided Design (CAD) experience, BCMF staff have turned to open-source prosthetic designs. We aim to establish an academia-NGO partnership to strengthen BCMF\u2019s efforts, provide technical support, and broaden outreach to underserved communities needing low-cost, body-powered prosthetic devices. Our collaboration includes Queen\u2019s University volunteers traveling to BCMF\u2019s workshop for on-ground support and continuing remote assistance. As BCMF utilizes open-source prosthetic designs from platforms such as Thingiverse [2], we wanted to maintain the 3D printing workflow while addressing gaps in open-source prosthetic offerings. We identified three critical needs: devices for short-below-elbow amputees, above-elbow amputees, and a detachable, rotatable wrist. In response, we modified BCMF\u2019s most used prosthetic design to customize the model for these specific needs. We conducted iterative testing for durability and comfort, ensuring constant communication between staff and recipients, allowing patient feedback to guide our designs. 
Over the past two years, Queen\u2019s University has sent two volunteers to BCMF, with another planned for this year. So far, five recipients use our short-below-elbow prosthetic design, and one has received a quick connect wrist. In addition, we are currently collaborating remotely on a new prosthetic design for above-elbow amputees. This partnership between Queen\u2019s University and BCMF improves access to low-cost prosthetic solutions, expands BCMF\u2019s recipient pool, and demonstrates the potential for future partnerships between educational institutions and NGOs to address disparities in healthcare access. <br \/>\r\nReferences<br \/>\r\n[1] Burma Children Medical Fund - Mae Sot, Thailand. BCMF | Burma Children Medical Fund - Mae Sot, Thailand - Operating to give people a future. https:\/\/burmachildren.com\/ <br \/>\r\n[2] Buchanan, J. (2018, March 27). Kwawu Arm 2.0 - Prosthetic - socket version. Thingiverse. https:\/\/www.thingiverse.com\/thing:2841281 <\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1162','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Hashtrudi-Zaad, Kian;  Farvolden, Coleman;  Connolly, Laura;  Barr, Colton;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\">Robotic tracking of a resection cavity using a low cost bench-top robotic arm and electromagnetics <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 13408, <\/span><span class=\"tp_pub_additional_pages\">pp. 
217-222, <\/span><span class=\"tp_pub_additional_year\">2025<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1170\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1170','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1170\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1170','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1170\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{hashtrudi-zaad2025,<br \/>\r\ntitle = {Robotic tracking of a resection cavity using a low cost bench-top robotic arm and electromagnetics},<br \/>\r\nauthor = {Kian Hashtrudi-Zaad and Coleman Farvolden and Laura Connolly and Colton Barr and Gabor Fichtinger},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-01-01},<br \/>\r\nvolume = {13408},<br \/>\r\npages = {217-222},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {INTRODUCTION <br \/>\r\nRoughly 40% of breast cancer patients are required to undergo corrective surgery after tumour resection via breast-conserving surgery (BCS). Sweeping of the cavity, resulting from the tumour resection, by spectroscopy and ultrasound imaging is emerging as a potential solution for identifying leftover cancer. However, the use of imaging modalities in the cavity is challenging as breast tissue is soft, malleable, and moves frequently. This paper presents and verifies an approach for tracking the relative motion of a resection cavity with a robotic arm. <br \/>\r\nMETHODS <br \/>\r\nWe use electromagnetic tracking and a low cost 6-axis robotic arm to track a simulated resection cavity. We embed an electromagnetic sensor in a 3D printed retractor that is designed to hold the cavity open. 
An open-source module in 3D Slicer is then used to detect cavity motion from the retractor and command the robotic arm to follow the \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1170','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1170\" style=\"display:none;\"><div class=\"tp_abstract_entry\">INTRODUCTION <br \/>\r\nRoughly 40% of breast cancer patients are required to undergo corrective surgery after tumour resection via breast-conserving surgery (BCS). Sweeping of the cavity, resulting from the tumour resection, by spectroscopy and ultrasound imaging is emerging as a potential solution for identifying leftover cancer. However, the use of imaging modalities in the cavity is challenging as breast tissue is soft, malleable, and moves frequently. This paper presents and verifies an approach for tracking the relative motion of a resection cavity with a robotic arm. <br \/>\r\nMETHODS <br \/>\r\nWe use electromagnetic tracking and a low cost 6-axis robotic arm to track a simulated resection cavity. We embed an electromagnetic sensor in a 3D printed retractor that is designed to hold the cavity open. 
An open-source module in 3D Slicer is then used to detect cavity motion from the retractor and command the robotic arm to follow the \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1170','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Farvolden, Coleman;  Hashtrudi-Zaad, Kian;  Connolly, Laura;  Barr, Colton;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\">An accessible six-axis testbed for image-guided robotics research <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 13408, <\/span><span class=\"tp_pub_additional_pages\">pp. 458-463, <\/span><span class=\"tp_pub_additional_year\">2025<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1169\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1169','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1169\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1169','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1169\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{farvolden2025,<br \/>\r\ntitle = {An accessible six-axis testbed for image-guided robotics research},<br \/>\r\nauthor = {Coleman Farvolden and Kian Hashtrudi-Zaad and Laura Connolly and Colton Barr and Gabor Fichtinger},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-01-01},<br \/>\r\nvolume = {13408},<br \/>\r\npages = {458-463},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE: Cancer can recur after tumor resection surgery if tumor tissue is missed and left behind. 
We hypothesize that intraoperative robotic imaging could be used to inspect the surgical cavity and localize residual cancer tissue. This technique has the potential to improve the success rate of tumor resection surgery. Towards this, we propose and evaluate a benchtop testbed for robotic manipulation of an optical imaging probe. We use low-cost hardware and open-source software to construct the testbed and describe the implementation so that it can be easily adopted to support similar research. METHODS: We implemented a reusable, open-source module in 3D Slicer for reading position coordinates and motion planning with an inexpensive 6-axis robotic arm in Robot Operating System (ROS). For demonstration, a custom end-effector was used to fix an optical probe to the robot. The accuracy of the testbed \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1169','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1169\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE: Cancer can recur after tumor resection surgery if tumor tissue is missed and left behind. We hypothesize that intraoperative robotic imaging could be used to inspect the surgical cavity and localize residual cancer tissue. This technique has the potential to improve the success rate of tumor resection surgery. Towards this, we propose and evaluate a benchtop testbed for robotic manipulation of an optical imaging probe. We use low-cost hardware and open-source software to construct the testbed and describe the implementation so that it can be easily adopted to support similar research. METHODS: We implemented a reusable, open-source module in 3D Slicer for reading position coordinates and motion planning with an inexpensive 6-axis robotic arm in Robot Operating System (ROS). 
For demonstration, a custom end-effector was used to fix an optical probe to the robot. The accuracy of the testbed \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1169','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Elkind, Emese;  Tun, Aung Tin;  Radcliffe, Olivia;  Connolly, Laura;  Davison, Colleen;  Purkey, Eva;  Mousavi, Parvin;  Fichtinger, Gabor;  Thornton, Kanchana<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CCGH2024.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CCGH2024.pdf\" target=\"blank\">Enhancing healthcare access by developing low-cost 3D printed prosthetics along the Thai-Myanmar border<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_publisher\">Canadian Association for Global Health, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1156\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1156','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1156\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1156','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1156\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1156','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1156\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Elkind2024b,<br 
\/>\r\ntitle = {Enhancing healthcare access by developing low-cost 3D printed prosthetics along the Thai-Myanmar border},<br \/>\r\nauthor = {Emese Elkind and Aung Tin Tun and Olivia Radcliffe and Laura Connolly and Colleen Davison and Eva Purkey and Parvin Mousavi and Gabor Fichtinger and Kanchana Thornton <br \/>\r\n},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CCGH2024.pdf},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-10-25},<br \/>\r\nurldate = {2024-10-25},<br \/>\r\npublisher = {Canadian Association for Global Health},<br \/>\r\nabstract = {Background\/Objective<br \/>\r\nInadequacies in the Burmese healthcare system, heightened by the 2021 military coup of the civil war in Myanmar and the COVID-19 pandemic, have driven thousands of refugees to Thailand seeking medical aid. Without immigration status, these refugees, especially those who have experienced limb loss, are challenged by the inability to receive healthcare. Burma Children Medical Fund (BCMF, www.burmachildren.com) based in Mae Sot, Tak, Thailand focuses on funding underserved Burmese communities\u2019 medical treatment and providing support services.  <br \/>\r\nProsthetics in lower-income countries are usually passive, therefore, patients cannot fully perform their daily functions, impacting their abilities to work and affecting family caretakers. BCMF aims to make body-powered prosthetics that work best in low-resource settings using open-source designs, which only allow for fixed hand positions. The usage of prosthetic arms depends heavily on their functionality and comfort. Patients are more likely to consistently use prosthetics if it aids them in returning to normalcy and reducing family burdens. My objective is to design an interchangeable hand to enable critical rotational movements.<br \/>\r\nMethodology<br \/>\r\nThe BCMF prosthetics project makes custom-fitted, low-cost, 3D-printed prostheses. 
BCMF uses open-source prosthetic models such as the Kwawu Arm 2.0, which provides an OpenSCAD (openscad.org) file for adjusting the model to the recipient's measurements. To maintain BCMF\u2019s workflow, the interchangeable wrist model was created using the 3D design software, Autodesk Fusion 360, and designs from NIOP Q-C v1 and v2 Quick-Connect Wrist. The wrist was merged onto the Kwawu Arm, printed, assembled, and tested. This is an iterative process where patient feedback ensures the prosthetics cater to the diverse needs of the recipients. <br \/>\r\nResults<br \/>\r\nSince the launch of the prosthetics project in 2019, BCMF has provided 3D-printed prosthetics to 76 patients. The interchangeable hand provides a solution to many patients' everyday activities and can rotate the hand 360 degrees.<br \/>\r\nConclusions<br \/>\r\nThis project provides a low-cost solution to healthcare challenges in the context of poly-crisis experienced in Myanmar, enhancing the resilience and adaptability of affected refugee communities.<br \/>\r\nRelevance to Sub-Theme<br \/>\r\nThis presentation aligns with sub-theme 2 by developing and testing methods to improve healthcare access and quality in areas affected by war, migration, poverty, and racial disparities.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1156','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1156\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Background\/Objective<br \/>\r\nInadequacies in the Burmese healthcare system, heightened by the 2021 military coup of the civil war in Myanmar and the COVID-19 pandemic, have driven thousands of refugees to Thailand seeking medical aid. 
Without immigration status, these refugees, especially those who have experienced limb loss, are often unable to receive healthcare. Burma Children Medical Fund (BCMF, www.burmachildren.com), based in Mae Sot, Tak, Thailand, focuses on funding underserved Burmese communities\u2019 medical treatment and providing support services.  <br \/>\r\nProsthetics in lower-income countries are usually passive; therefore, patients cannot fully perform their daily functions, impacting their abilities to work and affecting family caretakers. BCMF aims to make body-powered prosthetics that work best in low-resource settings using open-source designs, which only allow for fixed hand positions. The usage of prosthetic arms depends heavily on their functionality and comfort. Patients are more likely to consistently use prosthetics if they aid them in returning to normalcy and reducing family burdens. My objective is to design an interchangeable hand to enable critical rotational movements.<br \/>\r\nMethodology<br \/>\r\nThe BCMF prosthetics project makes custom-fitted, low-cost, 3D-printed prostheses. BCMF uses open-source prosthetic models such as the Kwawu Arm 2.0, which provides an OpenSCAD (openscad.org) file for adjusting the model to the recipient's measurements. To maintain BCMF\u2019s workflow, the interchangeable wrist model was created using the 3D design software Autodesk Fusion 360 and designs from the NIOP Q-C v1 and v2 Quick-Connect Wrist. The wrist was merged onto the Kwawu Arm, printed, assembled, and tested. This is an iterative process where patient feedback ensures the prosthetics cater to the diverse needs of the recipients. <br \/>\r\nResults<br \/>\r\nSince the launch of the prosthetics project in 2019, BCMF has provided 3D-printed prosthetics to 76 patients.
The interchangeable hand provides a solution to many patients' everyday activities and can rotate the hand 360 degrees.<br \/>\r\nConclusions<br \/>\r\nThis project provides a low-cost solution to healthcare challenges in the context of poly-crisis experienced in Myanmar, enhancing the resilience and adaptability of affected refugee communities.<br \/>\r\nRelevance to Sub-Theme<br \/>\r\nThis presentation aligns with sub-theme 2 by developing and testing methods to improve healthcare access and quality in areas affected by war, migration, poverty, and racial disparities.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1156','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1156\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CCGH2024.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CC[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/10\/EElkind_CC[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1156','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Kumar, Aravind S;  Mehta, Kapi Ketan;  Al-Zogbi, Lidia;  Kazanzides, Peter;  Mousavi, Parvin;  Fichtinger, Gabor;  Krieger, Axel;  Tokuda, Junichi;  Taylor, Russell H;  Leonard, Simon;  Deguet, Anton<\/p><p class=\"tp_pub_title\">SlicerROS2: A Research and Development Module for Image-Guided Robotic Interventions <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span 
class=\"tp_pub_additional_journal\">IEEE Transactions on Medical Robotics and Bionics, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1154\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1154','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1154\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{connolly2024,<br \/>\r\ntitle = {SlicerROS2: A Research and Development Module for Image-Guided Robotic Interventions},<br \/>\r\nauthor = {Laura Connolly and Aravind S Kumar and Kapi Ketan Mehta and Lidia Al-Zogbi and Peter Kazanzides and Parvin Mousavi and Gabor Fichtinger and Axel Krieger and Junichi Tokuda and Russell H Taylor and Simon Leonard and Anton Deguet},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\njournal = {IEEE Transactions on Medical Robotics and Bionics},<br \/>\r\npublisher = {IEEE},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1154','tp_bibtex')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Fooladgar, Fahimeh;  Jamzad, Amoon;  Kaufmann, Martin;  Syeda, Ayesha;  Ren, Kevin;  Abolmaesumi, Purang;  Rudan, John F;  McKay, Doug;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1\" target=\"blank\">ImSpect: Image-driven self-supervised learning for surgical margin evaluation with mass spectrometry<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> 
<\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_pages\">pp. 1-8, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_987\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('987','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_987\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('987','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_987\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('987','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_987\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2024e,<br \/>\r\ntitle = {ImSpect: Image-driven self-supervised learning for surgical margin evaluation with mass spectrometry},<br \/>\r\nauthor = {Laura Connolly and Fahimeh Fooladgar and Amoon Jamzad and Martin Kaufmann and Ayesha Syeda and Kevin Ren and Purang Abolmaesumi and John F Rudan and Doug McKay and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\npages = {1-8},<br \/>\r\npublisher = {Springer International Publishing},<br \/>\r\nabstract = {Purpose <br \/>\r\nReal-time assessment of surgical margins is critical for favorable outcomes in cancer patients. 
The iKnife is a mass spectrometry device that has demonstrated potential for margin detection in cancer surgery. Previous studies have shown that using deep learning on iKnife data can facilitate real-time tissue characterization. However, none of the existing literature on the iKnife facilitate the use of publicly available, state-of-the-art pretrained networks or datasets that have been used in computer vision and other domains. <br \/>\r\nMethods <br \/>\r\nIn a new framework we call ImSpect, we convert 1D iKnife data, captured during basal cell carcinoma (BCC) surgery, into 2D images in order to capitalize on state-of-the-art image classification networks. We also use self-supervision to leverage large amounts of unlabeled, intraoperative data to accommodate the data requirements of these networks. <br \/>\r\nResults <br \/>\r\nThrough extensive ablation \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('987','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_987\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Purpose <br \/>\r\nReal-time assessment of surgical margins is critical for favorable outcomes in cancer patients. The iKnife is a mass spectrometry device that has demonstrated potential for margin detection in cancer surgery. Previous studies have shown that using deep learning on iKnife data can facilitate real-time tissue characterization. However, none of the existing literature on the iKnife facilitate the use of publicly available, state-of-the-art pretrained networks or datasets that have been used in computer vision and other domains. 
<br \/>\r\nMethods <br \/>\r\nIn a new framework we call ImSpect, we convert 1D iKnife data, captured during basal cell carcinoma (BCC) surgery, into 2D images in order to capitalize on state-of-the-art image classification networks. We also use self-supervision to leverage large amounts of unlabeled, intraoperative data to accommodate the data requirements of these networks. <br \/>\r\nResults <br \/>\r\nThrough extensive ablation \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('987','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_987\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1\" target=\"_blank\">https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03106-1<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('987','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_proceedings\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Radcliffe, Olivia;  Connolly, Laura;  Ungi, Tamas;  Yeo, Caitlin;  Rudan, John F.;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2654015\" title=\"Navigated surgical resection cavity inspection for breast conserving surgery\" target=\"blank\">Navigated surgical resection cavity inspection for breast conserving surgery<\/a> <span class=\"tp_pub_type tp_  proceedings\">Proceedings<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_651\" class=\"tp_show\" 
onclick=\"teachpress_pub_showhide('651','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_651\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('651','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_651\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('651','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_651\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@proceedings{nokey,<br \/>\r\ntitle = {Navigated surgical resection cavity inspection for breast conserving surgery},<br \/>\r\nauthor = {Olivia Radcliffe and Laura Connolly and Tamas Ungi and Caitlin Yeo and John F. Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1117\/12.2654015},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-04-03},<br \/>\r\nabstract = {Up to 40% of Breast Conserving Surgery (BCS) patients must undergo repeat surgery because cancer is left behind in the resection cavity. The mobility of the breast resection cavity makes it difficult to localize residual cancer and, therefore, cavity shaving is a common technique for cancer removal. Cavity shaving involves removing an additional layer of tissue from the entire resection cavity, often resulting in unnecessary healthy tissue loss. 
In this study, we demonstrated a navigation system and open-source software module that facilitates visualization of the breast resection cavity for targeted localization of residual cancer.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {proceedings}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('651','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_651\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Up to 40% of Breast Conserving Surgery (BCS) patients must undergo repeat surgery because cancer is left behind in the resection cavity. The mobility of the breast resection cavity makes it difficult to localize residual cancer and, therefore, cavity shaving is a common technique for cancer removal. Cavity shaving involves removing an additional layer of tissue from the entire resection cavity, often resulting in unnecessary healthy tissue loss. 
In this study, we demonstrated a navigation system and open-source software module that facilitates visualization of the breast resection cavity for targeted localization of residual cancer.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('651','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_651\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2654015\" title=\"Follow DOI:https:\/\/doi.org\/10.1117\/12.2654015\" target=\"_blank\">doi:https:\/\/doi.org\/10.1117\/12.2654015<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('651','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Morton, David;  Connolly, Laura;  Groves, Leah;  Sunderland, Kyle;  Jamzad, Amoon;  Rudan, John F;  Fichtinger, Gabor;  Ungi, Tamas;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" target=\"blank\">Tracked tissue sensing for tumor bed inspection<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 12466, <\/span><span class=\"tp_pub_additional_pages\">pp. 
378-385, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1006\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023x,<br \/>\r\ntitle = {Tracked tissue sensing for tumor bed inspection},<br \/>\r\nauthor = {David Morton and Laura Connolly and Leah Groves and Kyle Sunderland and Amoon Jamzad and John F Rudan and Gabor Fichtinger and Tamas Ungi and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\nvolume = {12466},<br \/>\r\npages = {378-385},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {Up to 30% of breast-conserving surgery patients require secondary surgery to remove cancerous tissue missed in the initial intervention. We hypothesize that tracked tissue sensing can improve the success rate of breast-conserving surgery. Tissue sensor tracking allows the surgeon to intraoperatively scan the tumor bed for leftover cancerous tissue. In this study, we characterize the performance of our tracked optical scanning testbed using an experimental pipeline. 
We assess the Dice similarity coefficient, accuracy, and latency of the testbed.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1006','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1006\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Up to 30% of breast-conserving surgery patients require secondary surgery to remove cancerous tissue missed in the initial intervention. We hypothesize that tracked tissue sensing can improve the success rate of breast-conserving surgery. Tissue sensor tracking allows the surgeon to intraoperatively scan the tumor bed for leftover cancerous tissue. In this study, we characterize the performance of our tracked optical scanning testbed using an experimental pipeline. We assess the Dice similarity coefficient, accuracy, and latency of the testbed.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1006','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1006\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1006','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Radcliffe, Olivia;  Connolly, 
Laura;  Ungi, Tamas;  Yeo, Caitlin;  Rudan, John F;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/Navigated-surgical-resection-cavity-inspection-for-breast-conserving-surgery\/10.1117\/12.2654015.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/Navigated-surgical-resection-cavity-inspection-for-breast-conserving-surgery\/10.1117\/12.2654015.short\" target=\"blank\">Navigated surgical resection cavity inspection for breast conserving surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 12466, <\/span><span class=\"tp_pub_additional_pages\">pp. 234-241, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1002\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1002','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1002\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1002','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1002\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1002','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1002\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023t,<br \/>\r\ntitle = {Navigated surgical resection cavity inspection for breast conserving surgery},<br \/>\r\nauthor = {Olivia Radcliffe and Laura Connolly and Tamas Ungi and Caitlin Yeo and John F Rudan and Gabor Fichtinger and 
Parvin Mousavi},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/Navigated-surgical-resection-cavity-inspection-for-breast-conserving-surgery\/10.1117\/12.2654015.short},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\nvolume = {12466},<br \/>\r\npages = {234-241},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {Up to 40% of Breast Conserving Surgery (BCS) patients must undergo repeat surgery because cancer is left behind in the resection cavity. The mobility of the breast resection cavity makes it difficult to localize residual cancer and, therefore, cavity shaving is a common technique for cancer removal. Cavity shaving involves removing an additional layer of tissue from the entire resection cavity, often resulting in unnecessary healthy tissue loss. In this study, we demonstrated a navigation system and open-source software module that facilitates visualization of the breast resection cavity for targeted localization of residual cancer.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1002','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1002\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Up to 40% of Breast Conserving Surgery (BCS) patients must undergo repeat surgery because cancer is left behind in the resection cavity. The mobility of the breast resection cavity makes it difficult to localize residual cancer and, therefore, cavity shaving is a common technique for cancer removal. Cavity shaving involves removing an additional layer of tissue from the entire resection cavity, often resulting in unnecessary healthy tissue loss. 
In this study, we demonstrated a navigation system and open-source software module that facilitates visualization of the breast resection cavity for targeted localization of residual cancer.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1002','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1002\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/Navigated-surgical-resection-cavity-inspection-for-breast-conserving-surgery\/10.1117\/12.2654015.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124660Z\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1002','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Jamzad, Amoon;  Fooladgar, Fahimeh;  Connolly, Laura;  Srikanthan, Dilakshan;  Syeda, Ayesha;  Kaufmann, Martin;  Ren, Kevin YM;  Merchant, Shaila;  Engel, Jay;  Varma, Sonal;  Fichtinger, Gabor;  Rudan, John F;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53\" title=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53\" target=\"blank\">Bridging Ex-Vivo Training and Intra-operative Deployment for Surgical Margin Assessment with Evidential Graph Transformer<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_pages\">pp. 
562-571, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_939\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('939','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_939\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('939','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_939\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('939','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_939\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023g,<br \/>\r\ntitle = {Bridging Ex-Vivo Training and Intra-operative Deployment for Surgical Margin Assessment with Evidential Graph Transformer},<br \/>\r\nauthor = {Amoon Jamzad and Fahimeh Fooladgar and Laura Connolly and Dilakshan Srikanthan and Ayesha Syeda and Martin Kaufmann and Kevin YM Ren and Shaila Merchant and Jay Engel and Sonal Varma and Gabor Fichtinger and John F Rudan and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\npages = {562-571},<br \/>\r\npublisher = {Springer Nature Switzerland},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nThe use of intra-operative mass spectrometry along with Graph Transformer models showed promising results for margin detection on ex-vivo data. Although highly interpretable, these methods lack the ability to handle the uncertainty associated with intra-operative decision making. 
In this paper for the first time, we propose Evidential Graph Transformer network, a combination of attention mapping and uncertainty estimation to increase the performance and interpretability of surgical margin assessment. <br \/>\r\nMETHODS <br \/>\r\nThe Evidential Graph Transformer was formulated to output the uncertainty estimation along with intermediate attentions. The performance of the model was compared with different baselines in an ex-vivo cross-validation scheme, with extensive ablation study. The association of the model with clinical features were explored. The model was further validated for a prospective ex-vivo data, as \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('939','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_939\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nThe use of intra-operative mass spectrometry along with Graph Transformer models showed promising results for margin detection on ex-vivo data. Although highly interpretable, these methods lack the ability to handle the uncertainty associated with intra-operative decision making. In this paper for the first time, we propose Evidential Graph Transformer network, a combination of attention mapping and uncertainty estimation to increase the performance and interpretability of surgical margin assessment. <br \/>\r\nMETHODS <br \/>\r\nThe Evidential Graph Transformer was formulated to output the uncertainty estimation along with intermediate attentions. The performance of the model was compared with different baselines in an ex-vivo cross-validation scheme, with extensive ablation study. The association of the model with clinical features were explored. 
The model was further validated on prospective ex-vivo data, as \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('939','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_939\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53\" title=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53\" target=\"_blank\">https:\/\/link.springer.com\/chapter\/10.1007\/978-3-031-43990-2_53<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('939','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Fooladgar, Fahimeh;  Jamzad, Amoon;  Connolly, Laura;  Santilli, Alice;  Kaufmann, Martin;  Ren, Kevin;  Abolmaesumi, Purang;  Rudan, John;  McKay, Doug;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1007\/s11548-022-02764-3\" title=\"Uncertainty estimation for margin detection in cancer surgery using mass spectrometry\" target=\"blank\">Uncertainty estimation for margin detection in cancer surgery using mass spectrometry<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_30\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('30','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span 
class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_30\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('30','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_30\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Fooladgar2022,<br \/>\r\ntitle = {Uncertainty estimation for margin detection in cancer surgery using mass spectrometry},<br \/>\r\nauthor = {Fahimeh Fooladgar and Amoon Jamzad and Laura Connolly and Alice Santilli and Martin Kaufmann and Kevin Ren and Purang Abolmaesumi and John Rudan and Doug McKay and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1007\/s11548-022-02764-3},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-09-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('30','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_30\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1007\/s11548-022-02764-3\" title=\"Follow DOI:https:\/\/doi.org\/10.1007\/s11548-022-02764-3\" target=\"_blank\">doi:https:\/\/doi.org\/10.1007\/s11548-022-02764-3<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('30','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Deguet, Anton;  Leonard, Simon;  Tokuda, Junichi;  Ungi, Tamas;  Krieger, Axel;  Kazanzides, Peter;  Mousavi, Parvin;  Fichtinger, Gabor;  Taylor, Russell H.<\/p><p class=\"tp_pub_title\"><a 
class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.3390\/s22145336\" title=\"Bridging 3D Slicer and ROS2 for Image-Guided Robotic Interventions\" target=\"blank\">Bridging 3D Slicer and ROS2 for Image-Guided Robotic Interventions<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Sensors, <\/span><span class=\"tp_pub_additional_volume\">vol. 22, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_21\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('21','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_21\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('21','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_21\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Connolly2022c,<br \/>\r\ntitle = {Bridging 3D Slicer and ROS2 for Image-Guided Robotic Interventions},<br \/>\r\nauthor = {Laura Connolly and Anton Deguet and Simon Leonard and Junichi Tokuda and Tamas Ungi and Axel Krieger and Peter Kazanzides and Parvin Mousavi and Gabor Fichtinger and Russell H. 
Taylor},<br \/>\r\ndoi = {https:\/\/doi.org\/10.3390\/s22145336},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-07-01},<br \/>\r\njournal = {Sensors},<br \/>\r\nvolume = {22},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('21','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_21\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.3390\/s22145336\" title=\"Follow DOI:https:\/\/doi.org\/10.3390\/s22145336\" target=\"_blank\">doi:https:\/\/doi.org\/10.3390\/s22145336<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('21','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Nikniazi, Arash;  Poushimin, Rana;  Nunzi, Jean Michel;  Rudan, John;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2611964\" title=\"Feasibility of combined optical and acoustic imaging for surgical cavity scanning\" target=\"blank\">Feasibility of combined optical and acoustic imaging for surgical cavity scanning<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 
12034, <\/span><span class=\"tp_pub_additional_address\">San Diego (online), <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_23\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('23','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_23\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('23','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_23\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('23','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_23\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2022,<br \/>\r\ntitle = {Feasibility of combined optical and acoustic imaging for surgical cavity scanning},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Arash Nikniazi and Rana Poushimin and Jean Michel Nunzi and John Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1117\/12.2611964},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-04-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2022: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\nvolume = {12034},<br \/>\r\naddress = {San Diego (online)},<br \/>\r\nabstract = {PURPOSE: Over 30% of breast conserving surgery patients must undergo repeat surgery to address incomplete tumor resection. We hypothesize that the addition of a robotic cavity scanning system can improve the success rates of these procedures by performing additional, intraoperative imaging to detect left-over cancer cells. In this study, we assess the feasibility of a combined optical and acoustic imaging approach for this cavity scanning system. 
METHODS: Dual-layer tissue phantoms are imaged with both throughput broadband spectroscopy and an endocavity ultrasound probe. The absorbance and transmittance of the incident light from the broadband source are used to characterize each tissue sample optically. Additionally, a temporally enhanced ultrasound approach is used to distinguish the heterogeneity of the tissue sample by classifying individual pixels in the ultrasound image with a support vector machine. The goal of this combined approach is to use optical characterization to classify the tissue surface, and acoustic characterization to classify the sample heterogeneity. RESULTS: Both optical and acoustic characterization demonstrated promising preliminary results. The class of each tissue sample is distinctly separable based on the transmittance and absorption of the broadband light. Additionally, an SVM trained on the temporally enhanced ultrasound signals for each tissue type showed 82% linear separability of labelled temporally enhanced ultrasound sequences in our test set. CONCLUSIONS: By combining broadband and ultrasound imaging, we demonstrate a potential non-destructive imaging approach for this robotic cavity scanning system. With this approach, our system can detect both surface-level tissue characteristics and depth information. Applying this to breast conserving surgery can help inform the surgeon about the tissue composition of the resection cavity after initial tumor resection.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('23','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_23\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE: Over 30% of breast conserving surgery patients must undergo repeat surgery to address incomplete tumor resection. 
We hypothesize that the addition of a robotic cavity scanning system can improve the success rates of these procedures by performing additional, intraoperative imaging to detect left-over cancer cells. In this study, we assess the feasibility of a combined optical and acoustic imaging approach for this cavity scanning system. METHODS: Dual-layer tissue phantoms are imaged with both throughput broadband spectroscopy and an endocavity ultrasound probe. The absorbance and transmittance of the incident light from the broadband source are used to characterize each tissue sample optically. Additionally, a temporally enhanced ultrasound approach is used to distinguish the heterogeneity of the tissue sample by classifying individual pixels in the ultrasound image with a support vector machine. The goal of this combined approach is to use optical characterization to classify the tissue surface, and acoustic characterization to classify the sample heterogeneity. RESULTS: Both optical and acoustic characterization demonstrated promising preliminary results. The class of each tissue sample is distinctly separable based on the transmittance and absorption of the broadband light. Additionally, an SVM trained on the temporally enhanced ultrasound signals for each tissue type showed 82% linear separability of labelled temporally enhanced ultrasound sequences in our test set. CONCLUSIONS: By combining broadband and ultrasound imaging, we demonstrate a potential non-destructive imaging approach for this robotic cavity scanning system. With this approach, our system can detect both surface-level tissue characteristics and depth information. 
Applying this to breast conserving surgery can help inform the surgeon about the tissue composition of the resection cavity after initial tumor resection.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('23','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_23\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2611964\" title=\"Follow DOI:https:\/\/doi.org\/10.1117\/12.2611964\" target=\"_blank\">doi:https:\/\/doi.org\/10.1117\/12.2611964<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('23','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Deguet, Anton;  Leonard, Simon;  Tokuda, Junichi;  Ungi, Tamas;  Krieger, Axel;  Kazanzides, Peter;  Mousavi, Parvin;  Fichtinger, Gabor;  Taylor, Russell H<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336\" title=\"https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336\" target=\"blank\">Bridging 3D Slicer and ROS2 for image-guided robotic interventions<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Sensors, <\/span><span class=\"tp_pub_additional_volume\">vol. 22, <\/span><span class=\"tp_pub_additional_issue\">iss. 14, <\/span><span class=\"tp_pub_additional_pages\">pp. 
5336, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_849\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('849','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_849\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('849','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_849\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('849','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_849\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2022e,<br \/>\r\ntitle = {Bridging 3D Slicer and ROS2 for image-guided robotic interventions},<br \/>\r\nauthor = {Laura Connolly and Anton Deguet and Simon Leonard and Junichi Tokuda and Tamas Ungi and Axel Krieger and Peter Kazanzides and Parvin Mousavi and Gabor Fichtinger and Russell H Taylor},<br \/>\r\nurl = {https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-01-01},<br \/>\r\njournal = {Sensors},<br \/>\r\nvolume = {22},<br \/>\r\nissue = {14},<br \/>\r\npages = {5336},<br \/>\r\npublisher = {MDPI},<br \/>\r\nabstract = {Developing image-guided robotic systems requires access to flexible, open-source software. For image guidance, the open-source medical imaging platform 3D Slicer is one of the most adopted tools that can be used for research and prototyping. Similarly, for robotics, the open-source middleware suite robot operating system (ROS) is the standard development framework. In the past, there have been several \u201cad hoc\u201d attempts made to bridge both tools; however, they are all reliant on middleware and custom interfaces. 
Additionally, none of these attempts have been successful in bridging access to the full suite of tools provided by ROS or 3D Slicer. Therefore, in this paper, we present the SlicerROS2 module, which was designed for the direct use of ROS2 packages and libraries within 3D Slicer. The module was developed to enable real-time visualization of robots, accommodate different robot configurations, and facilitate data transfer in both directions (between ROS and Slicer). We demonstrate the system on multiple robots with different configurations, evaluate the system performance and discuss an image-guided robotic intervention that can be prototyped with this module. This module can serve as a starting point for clinical system development that reduces the need for custom interfaces and time-intensive platform setup.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('849','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_849\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Developing image-guided robotic systems requires access to flexible, open-source software. For image guidance, the open-source medical imaging platform 3D Slicer is one of the most adopted tools that can be used for research and prototyping. Similarly, for robotics, the open-source middleware suite robot operating system (ROS) is the standard development framework. In the past, there have been several \u201cad hoc\u201d attempts made to bridge both tools; however, they are all reliant on middleware and custom interfaces. Additionally, none of these attempts have been successful in bridging access to the full suite of tools provided by ROS or 3D Slicer. Therefore, in this paper, we present the SlicerROS2 module, which was designed for the direct use of ROS2 packages and libraries within 3D Slicer. 
The module was developed to enable real-time visualization of robots, accommodate different robot configurations, and facilitate data transfer in both directions (between ROS and Slicer). We demonstrate the system on multiple robots with different configurations, evaluate the system performance and discuss an image-guided robotic intervention that can be prototyped with this module. This module can serve as a starting point for clinical system development that reduces the need for custom interfaces and time-intensive platform setup.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('849','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_849\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336\" title=\"https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336\" target=\"_blank\">https:\/\/www.mdpi.com\/1424-8220\/22\/14\/5336<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('849','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Nikniazi, Arash;  Poushimin, Rana;  Lasso, Andras;  Sunderland, Kyle R.;  Ungi, Tamas;  Nunzi, Jean Michel;  Rudan, John;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly2022b.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly2022b.pdf\" target=\"blank\">An open-source testbed for developing image-guided robotic tumor-bed inspection<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span 
class=\"tp_pub_additional_booktitle\">Imaging Network of Ontario (ImNO) Symposium, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_28\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('28','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_28\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('28','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_28\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{connolly2022b,<br \/>\r\ntitle = {An open-source testbed for developing image-guided robotic tumor-bed inspection},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Arash Nikniazi and Rana Poushimin and Andras Lasso and Kyle R. Sunderland and Tamas Ungi and Jean Michel Nunzi and John Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly2022b.pdf},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-01-01},<br \/>\r\nurldate = {2022-01-01},<br \/>\r\nbooktitle = {Imaging Network of Ontario (ImNO) Symposium},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('28','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_28\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly2022b.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly20[...]\" 
target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/01\/Connolly20[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('28','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Deguet, Anton;  Sunderland, Kyle R.;  Lasso, Andras;  Ungi, Tamas;  Rudan, John;  Taylor, Russell H.;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1109\/ICAS49788.2021.9551149\" title=\"An open-source platform for cooperative semi-autonomous robotic surgery\" target=\"blank\">An open-source platform for cooperative semi-autonomous robotic surgery<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">IEEE International Conference on Autonomous Systems, <\/span><span class=\"tp_pub_additional_organization\">IEEE <\/span><span class=\"tp_pub_additional_publisher\">IEEE, <\/span><span class=\"tp_pub_additional_address\">Montreal, Quebec, <\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_38\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('38','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_38\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('38','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_38\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2021,<br \/>\r\ntitle = {An open-source platform for cooperative semi-autonomous robotic surgery},<br \/>\r\nauthor = {Laura 
Connolly and Anton Deguet and Kyle R. Sunderland and Andras Lasso and Tamas Ungi and John Rudan and Russell H. Taylor and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1109\/ICAS49788.2021.9551149},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-10-01},<br \/>\r\nurldate = {2021-10-01},<br \/>\r\nbooktitle = {IEEE International Conference on Autonomous Systems},<br \/>\r\npublisher = {IEEE},<br \/>\r\naddress = {Montreal, Quebec},<br \/>\r\norganization = {IEEE},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('38','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_38\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1109\/ICAS49788.2021.9551149\" title=\"Follow DOI:https:\/\/doi.org\/10.1109\/ICAS49788.2021.9551149\" target=\"_blank\">doi:https:\/\/doi.org\/10.1109\/ICAS49788.2021.9551149<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('38','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Kaufmann, Martin;  Farquharson, Catriona E.;  Ren, Kevin;  Rudan, John;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.3390\/jimaging7100203\" title=\"Combined Mass Spectrometry and Histopathology Imaging for Perioperative Tissue Assessment in Cancer Surgery\" target=\"blank\">Combined Mass Spectrometry and Histopathology Imaging for Perioperative Tissue Assessment in Cancer Surgery<\/a> <span class=\"tp_pub_type tp_  
article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Journal of Imaging, <\/span><span class=\"tp_pub_additional_volume\">vol. 7, <\/span><span class=\"tp_pub_additional_number\">no. 203, <\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_33\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('33','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_33\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('33','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_33\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('33','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_33\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Connolly2021c,<br \/>\r\ntitle = {Combined Mass Spectrometry and Histopathology Imaging for Perioperative Tissue Assessment in Cancer Surgery},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Martin Kaufmann and Catriona E. Farquharson and Kevin Ren and John Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\ndoi = {https:\/\/doi.org\/10.3390\/jimaging7100203},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-10-01},<br \/>\r\njournal = {Journal of Imaging},<br \/>\r\nvolume = {7},<br \/>\r\nnumber = {203},<br \/>\r\nabstract = {Mass spectrometry is an effective imaging tool for evaluating biological tissue to detect cancer. 
With the assistance of deep learning, this technology can be used as a perioperative tissue assessment tool that will facilitate informed surgical decisions. To achieve such a system requires the development of a database of mass spectrometry signals and their corresponding pathology labels. Assigning correct labels, in turn, necessitates precise spatial registration of histopathology and mass spectrometry data. This is a challenging task due to the domain differences and noisy nature of images. In this study, we create a registration framework for mass spectrometry and pathology images as a contribution to the development of perioperative tissue assessment. In doing so, we explore two opportunities in deep learning for medical image registration, namely, unsupervised, multi-modal deformable image registration and evaluation of the registration. We test this system on prostate needle biopsy cores that were imaged with desorption electrospray ionization mass spectrometry (DESI) and show that we can successfully register DESI and histology images to achieve accurate alignment and, consequently, labelling for future training. This automation is expected to improve the efficiency and development of a deep learning architecture that will benefit the use of mass spectrometry imaging for cancer diagnosis.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('33','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_33\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Mass spectrometry is an effective imaging tool for evaluating biological tissue to detect cancer. 
With the assistance of deep learning, this technology can be used as a perioperative tissue assessment tool that will facilitate informed surgical decisions. To achieve such a system requires the development of a database of mass spectrometry signals and their corresponding pathology labels. Assigning correct labels, in turn, necessitates precise spatial registration of histopathology and mass spectrometry data. This is a challenging task due to the domain differences and noisy nature of images. In this study, we create a registration framework for mass spectrometry and pathology images as a contribution to the development of perioperative tissue assessment. In doing so, we explore two opportunities in deep learning for medical image registration, namely, unsupervised, multi-modal deformable image registration and evaluation of the registration. We test this system on prostate needle biopsy cores that were imaged with desorption electrospray ionization mass spectrometry (DESI) and show that we can successfully register DESI and histology images to achieve accurate alignment and, consequently, labelling for future training. 
This automation is expected to improve the efficiency and development of a deep learning architecture that will benefit the use of mass spectrometry imaging for cancer diagnosis.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('33','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_33\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.3390\/jimaging7100203\" title=\"Follow DOI:https:\/\/doi.org\/10.3390\/jimaging7100203\" target=\"_blank\">doi:https:\/\/doi.org\/10.3390\/jimaging7100203<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('33','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Kaufmann, Martin;  Farquharson, Catriona E;  Ren, Kevin;  Rudan, John F;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203\" title=\"https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203\" target=\"blank\">Combined mass spectrometry and histopathology imaging for perioperative tissue assessment in cancer surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Journal of Imaging, <\/span><span class=\"tp_pub_additional_volume\">vol. 7, <\/span><span class=\"tp_pub_additional_issue\">iss. 10, <\/span><span class=\"tp_pub_additional_pages\">pp. 
203, <\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_879\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('879','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_879\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('879','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_879\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('879','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_879\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2021f,<br \/>\r\ntitle = {Combined mass spectrometry and histopathology imaging for perioperative tissue assessment in cancer surgery},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Martin Kaufmann and Catriona E Farquharson and Kevin Ren and John F Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-01-01},<br \/>\r\njournal = {Journal of Imaging},<br \/>\r\nvolume = {7},<br \/>\r\nissue = {10},<br \/>\r\npages = {203},<br \/>\r\npublisher = {MDPI},<br \/>\r\nabstract = {Mass spectrometry is an effective imaging tool for evaluating biological tissue to detect cancer. With the assistance of deep learning, this technology can be used as a perioperative tissue assessment tool that will facilitate informed surgical decisions. To achieve such a system requires the development of a database of mass spectrometry signals and their corresponding pathology labels. Assigning correct labels, in turn, necessitates precise spatial registration of histopathology and mass spectrometry data. 
This is a challenging task due to the domain differences and noisy nature of images. In this study, we create a registration framework for mass spectrometry and pathology images as a contribution to the development of perioperative tissue assessment. In doing so, we explore two opportunities in deep learning for medical image registration, namely, unsupervised, multi-modal deformable image registration and evaluation of the registration. We test this system on prostate needle biopsy cores that were imaged with desorption electrospray ionization mass spectrometry (DESI) and show that we can successfully register DESI and histology images to achieve accurate alignment and, consequently, labelling for future training. This automation is expected to improve the efficiency and development of a deep learning architecture that will benefit the use of mass spectrometry imaging for cancer diagnosis.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('879','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_879\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Mass spectrometry is an effective imaging tool for evaluating biological tissue to detect cancer. With the assistance of deep learning, this technology can be used as a perioperative tissue assessment tool that will facilitate informed surgical decisions. To achieve such a system requires the development of a database of mass spectrometry signals and their corresponding pathology labels. Assigning correct labels, in turn, necessitates precise spatial registration of histopathology and mass spectrometry data. This is a challenging task due to the domain differences and noisy nature of images. 
In this study, we create a registration framework for mass spectrometry and pathology images as a contribution to the development of perioperative tissue assessment. In doing so, we explore two opportunities in deep learning for medical image registration, namely, unsupervised, multi-modal deformable image registration and evaluation of the registration. We test this system on prostate needle biopsy cores that were imaged with desorption electrospray ionization mass spectrometry (DESI) and show that we can successfully register DESI and histology images to achieve accurate alignment and, consequently, labelling for future training. This automation is expected to improve the efficiency and development of a deep learning architecture that will benefit the use of mass spectrometry imaging for cancer diagnosis.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('879','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_879\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203\" title=\"https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203\" target=\"_blank\">https:\/\/www.mdpi.com\/2313-433X\/7\/10\/203<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('879','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Akbarifar, Faranak;  Jamzad, Amoon;  Santilli, Alice;  Kaufmann, Martin;  Janssen, Natasja;  Connolly, Laura;  Ren, K;  Vanderbeck, Kaitlin;  Wang, Ami;  McKay, Doug;  Rudan, John;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" 
href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/Graph-based-analysis-of-mass-spectrometry-data-for-tissue-characterization\/10.1117\/12.2582045.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/Graph-based-analysis-of-mass-spectrometry-data-for-tissue-characterization\/10.1117\/12.2582045.short\" target=\"blank\">Graph-based analysis of mass spectrometry data for tissue characterization with application in basal cell carcinoma surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 11598, <\/span><span class=\"tp_pub_additional_pages\">pp. 279-285, <\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_869\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('869','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_869\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('869','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_869\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('869','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_869\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2021e,<br \/>\r\ntitle = {Graph-based analysis of mass spectrometry data for tissue characterization with application in basal cell carcinoma surgery},<br \/>\r\nauthor = {Faranak Akbarifar and Amoon Jamzad and Alice Santilli and Martin Kaufmann and Natasja Janssen and Laura Connolly and K Ren and Kaitlin Vanderbeck and Ami Wang and Doug McKay 
and John Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/Graph-based-analysis-of-mass-spectrometry-data-for-tissue-characterization\/10.1117\/12.2582045.short},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-01-01},<br \/>\r\nvolume = {11598},<br \/>\r\npages = {279-285},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nBasal Cell Carcinoma (BCC) is the most common cancer in the world. Surgery is the standard treatment and margin assessment is used to evaluate the outcome. The presence of cancerous cells at the edge of resected tissue i.e., positive margin, can negatively impact patient outcomes and increase the probability of cancer recurrence. Novel mass spectrometry technologies paired with machine learning can provide surgeons with real-time feedback about margins to eliminate the need for resurgery. To our knowledge, this is the first study to report the performance of cancer detection using Graph Convolutional Networks (GCN) on mass spectrometry data from resected BCC samples. <br \/>\r\nMETHODS <br \/>\r\nThe dataset used in this study is a subset of an ongoing clinical data acquired by our group and annotated with the help of a trained pathologist. There is a total number of 190 spectra in this dataset, including 127 \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('869','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_869\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nBasal Cell Carcinoma (BCC) is the most common cancer in the world. Surgery is the standard treatment and margin assessment is used to evaluate the outcome. 
The presence of cancerous cells at the edge of resected tissue i.e., positive margin, can negatively impact patient outcomes and increase the probability of cancer recurrence. Novel mass spectrometry technologies paired with machine learning can provide surgeons with real-time feedback about margins to eliminate the need for resurgery. To our knowledge, this is the first study to report the performance of cancer detection using Graph Convolutional Networks (GCN) on mass spectrometry data from resected BCC samples. <br \/>\r\nMETHODS <br \/>\r\nThe dataset used in this study is a subset of an ongoing clinical data acquired by our group and annotated with the help of a trained pathologist. There is a total number of 190 spectra in this dataset, including 127 \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('869','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_869\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/Graph-based-analysis-of-mass-spectrometry-data-for-tissue-characterization\/10.1117\/12.2582045.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11598\/1159812\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('869','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Deguet, Anton;  Sunderland, Kyle;  Lasso, Andras;  Ungi, Tamas;  Rudan, John F;  Taylor, Russell H;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" 
href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/\" target=\"blank\">An open-source platform for cooperative, semi-autonomous robotic surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_pages\">pp. 1-5, <\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_868\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('868','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_868\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('868','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_868\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('868','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_868\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2021d,<br \/>\r\ntitle = {An open-source platform for cooperative, semi-autonomous robotic surgery},<br \/>\r\nauthor = {Laura Connolly and Anton Deguet and Kyle Sunderland and Andras Lasso and Tamas Ungi and John F Rudan and Russell H Taylor and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-01-01},<br \/>\r\npages = {1-5},<br \/>\r\npublisher = {IEEE},<br \/>\r\nabstract = {Introduction <br \/>\r\nIn this paper, we present and assess a proof of concept platform for semi-autonomous, cooperative robotic surgery. 
The platform is easily reproducible thanks to simple hardware components and open-source software. Moreover, the design accommodates open, soft tissue surgeries that recent advancements in surgical robotics do not generally focus on. <br \/>\r\nMethods <br \/>\r\nThe system is made up of an inexpensive robotic manipulator, a navigation system and a software interface. Accuracy measurement is performed on a rigid phantom that mimics the conditions of breast conserving surgery (BCS) as an example of a surgical use case. <br \/>\r\nResults <br \/>\r\nThe average target registration error (TRE) and fiducial registration error (FRE) of the system is within 1 mm. This indicates that the navigation system is sufficient for certain surgical applications such as BCS. The platform can also be easily replicated and used in a lab or \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('868','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_868\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Introduction <br \/>\r\nIn this paper, we present and assess a proof of concept platform for semi-autonomous, cooperative robotic surgery. The platform is easily reproducible thanks to simple hardware components and open-source software. Moreover, the design accommodates open, soft tissue surgeries that recent advancements in surgical robotics do not generally focus on. <br \/>\r\nMethods <br \/>\r\nThe system is made up of an inexpensive robotic manipulator, a navigation system and a software interface. Accuracy measurement is performed on a rigid phantom that mimics the conditions of breast conserving surgery (BCS) as an example of a surgical use case. 
<br \/>\r\nResults <br \/>\r\nThe average target registration error (TRE) and fiducial registration error (FRE) of the system is within 1 mm. This indicates that the navigation system is sufficient for certain surgical applications such as BCS. The platform can also be easily replicated and used in a lab or \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('868','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_868\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/\" target=\"_blank\">https:\/\/ieeexplore.ieee.org\/abstract\/document\/9551149\/<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('868','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Sunderland, Kyle R.;  Lasso, Andras;  Deguet, Anton;  Ungi, Tamas;  Rudan, John;  Taylor, Russell H.;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2021a_1.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2021a_1.pdf\" target=\"blank\">A platform for robot-assisted Intraoperative imaging in breast conserving surgery<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Imaging Network of Ontario Symposium, <\/span><span class=\"tp_pub_additional_publisher\">Imaging Network of Ontario Symposium, <\/span><span class=\"tp_pub_additional_address\">Online, 
<\/span><span class=\"tp_pub_additional_year\">2021<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_39\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('39','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_39\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('39','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_39\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2021b,<br \/>\r\ntitle = {A platform for robot-assisted Intraoperative imaging in breast conserving surgery},<br \/>\r\nauthor = {Laura Connolly and Kyle R. Sunderland and Andras Lasso and Anton Deguet and Tamas Ungi and John Rudan and Russell H. Taylor and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2021a_1.pdf},<br \/>\r\nyear  = {2021},<br \/>\r\ndate = {2021-01-01},<br \/>\r\nurldate = {2021-01-01},<br \/>\r\nbooktitle = {Imaging Network of Ontario Symposium},<br \/>\r\npublisher = {Imaging Network of Ontario Symposium},<br \/>\r\naddress = {Online},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('39','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_39\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2021a_1.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]\" 
target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('39','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Yates, Lauren;  Connolly, Laura;  Jamzad, Amoon;  Asselin, Mark;  Rubino, Rachel;  Yam, Scott;  Ungi, Tamas;  Lasso, Andras;  Nicol, Christopher;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" target=\"blank\">Robotic tissue scanning with biophotonic probe<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 11315, <\/span><span class=\"tp_pub_additional_pages\">pp. 
330-335, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_952\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2020o,<br \/>\r\ntitle = {Robotic tissue scanning with biophotonic probe},<br \/>\r\nauthor = {Lauren Yates and Laura Connolly and Amoon Jamzad and Mark Asselin and Rachel Rubino and Scott Yam and Tamas Ungi and Andras Lasso and Christopher Nicol and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\nvolume = {11315},<br \/>\r\npages = {330-335},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nRaman spectroscopy is an optical imaging technique used to characterize tissue via molecular analysis. The use of Raman spectroscopy for real-time intraoperative tissue classification requires fast analysis with minimal human intervention. In order to have accurate predictions and classifications, a large and reliable database of tissue classifications with spectra results is required. 
We have developed a system that can be used to generate an efficient scanning path for robotic scanning of tissues using Raman spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA camera mounted to a robotic controller is used to take an image of a tissue slide. The corners of the tissue slides within the sample image are identified, and the size of the slide is calculated. The image is cropped to fit the size of the slide and the image is manipulated to identify the tissue contour. A grid set to fit around the size of the tissue is calculated and a grid \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_952\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nRaman spectroscopy is an optical imaging technique used to characterize tissue via molecular analysis. The use of Raman spectroscopy for real-time intraoperative tissue classification requires fast analysis with minimal human intervention. In order to have accurate predictions and classifications, a large and reliable database of tissue classifications with spectra results is required. We have developed a system that can be used to generate an efficient scanning path for robotic scanning of tissues using Raman spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA camera mounted to a robotic controller is used to take an image of a tissue slide. The corners of the tissue slides within the sample image are identified, and the size of the slide is calculated. The image is cropped to fit the size of the slide and the image is manipulated to identify the tissue contour. 
A grid set to fit around the size of the tissue is calculated and a grid \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_952\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Santilli, Alice ML;  Jamzad, Amoon;  Janssen, Natasja NY;  Kaufmann, Martin;  Connolly, Laura;  Vanderbeck, Kaitlin;  Wang, Ami;  McKay, Doug;  Rudan, John F;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9\" target=\"blank\">Perioperative margin detection in basal cell carcinoma using a deep learning framework: a feasibility study<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_volume\">vol. 15, <\/span><span class=\"tp_pub_additional_pages\">pp. 
887-896, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_782\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('782','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_782\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('782','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_782\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('782','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_782\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2020b,<br \/>\r\ntitle = {Perioperative margin detection in basal cell carcinoma using a deep learning framework: a feasibility study},<br \/>\r\nauthor = {Alice ML Santilli and Amoon Jamzad and Natasja NY Janssen and Martin Kaufmann and Laura Connolly and Kaitlin Vanderbeck and Ami Wang and Doug McKay and John F Rudan and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\nvolume = {15},<br \/>\r\npages = {887-896},<br \/>\r\npublisher = {Springer International Publishing},<br \/>\r\nabstract = {Purpose <br \/>\r\nBasal cell carcinoma (BCC) is the most commonly diagnosed cancer and the number of diagnosis is growing worldwide due to increased exposure to solar radiation and the aging population. Reduction of positive margin rates when removing BCC leads to fewer revision surgeries and consequently lower health care costs, improved cosmetic outcomes and better patient care. 
In this study, we propose the first use of a perioperative mass spectrometry technology (iKnife) along with a deep learning framework for detection of BCC signatures from tissue burns. <br \/>\r\nMethods <br \/>\r\nResected surgical specimen were collected and inspected by a pathologist. With their guidance, data were collected by burning regions of the specimen labeled as BCC or normal, with the iKnife. Data included 190 scans of which 127 were normal and 63 were BCC. A data \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('782','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_782\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Purpose <br \/>\r\nBasal cell carcinoma (BCC) is the most commonly diagnosed cancer and the number of diagnosis is growing worldwide due to increased exposure to solar radiation and the aging population. Reduction of positive margin rates when removing BCC leads to fewer revision surgeries and consequently lower health care costs, improved cosmetic outcomes and better patient care. In this study, we propose the first use of a perioperative mass spectrometry technology (iKnife) along with a deep learning framework for detection of BCC signatures from tissue burns. <br \/>\r\nMethods <br \/>\r\nResected surgical specimen were collected and inspected by a pathologist. With their guidance, data were collected by burning regions of the specimen labeled as BCC or normal, with the iKnife. Data included 190 scans of which 127 were normal and 63 were BCC. 
A data \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('782','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_782\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9\" target=\"_blank\">https:\/\/link.springer.com\/article\/10.1007\/s11548-020-02152-9<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('782','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Kaufmann, Martin;  Rubino, Rachel;  Sedghi, Alireza;  Ungi, Tamas;  Asselin, Mark;  Yam, Scott;  Rudan, John;  Nicol, Christopher;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2549343\" title=\"Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study\" target=\"blank\">Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 
11315, <\/span><span class=\"tp_pub_additional_organization\">SPIE <\/span><span class=\"tp_pub_additional_publisher\">SPIE, <\/span><span class=\"tp_pub_additional_address\">Houston, Texas, United States, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_49\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('49','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_49\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('49','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_49\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2020a,<br \/>\r\ntitle = {Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Martin Kaufmann and Rachel Rubino and Alireza Sedghi and Tamas Ungi and Mark Asselin and Scott Yam and John Rudan and Christopher Nicol and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2020a.pdf},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1117\/12.2549343},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\nurldate = {2020-01-01},<br \/>\r\nbooktitle = {Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling},<br \/>\r\nvolume = {11315},<br \/>\r\npublisher = {SPIE},<br \/>\r\naddress = {Houston, Texas, United States},<br \/>\r\norganization = {SPIE},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('49','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_49\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2020a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2549343\" title=\"Follow DOI:https:\/\/doi.org\/10.1117\/12.2549343\" target=\"_blank\">doi:https:\/\/doi.org\/10.1117\/12.2549343<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('49','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Ungi, Tamas;  Lasso, Andras;  Vaughan, Thomas;  Asselin, Mark;  Mousavi, Parvin;  Yam, Scott;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2512481\" title=\"Mechanically-Controlled Spectroscopic Imaging for Tissue Classification\" target=\"blank\">Mechanically-Controlled Spectroscopic Imaging for Tissue Classification<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 
10951, <\/span><span class=\"tp_pub_additional_address\">San Diego, California, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_72\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('72','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_72\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('72','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_72\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2019a,<br \/>\r\ntitle = {Mechanically-Controlled Spectroscopic Imaging for Tissue Classification},<br \/>\r\nauthor = {Laura Connolly and Tamas Ungi and Andras Lasso and Thomas Vaughan and Mark Asselin and Parvin Mousavi and Scott Yam and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2019a_3.pdf},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1117\/12.2512481},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-03-01},<br \/>\r\nurldate = {2019-03-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\nvolume = {10951},<br \/>\r\naddress = {San Diego, California},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('72','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_72\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2019a_3.pdf\" 
title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2512481\" title=\"Follow DOI:https:\/\/doi.org\/10.1117\/12.2512481\" target=\"_blank\">doi:https:\/\/doi.org\/10.1117\/12.2512481<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('72','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Ungi, Tamas;  Lasso, Andras;  Vaughan, Thomas;  Asselin, Mark;  Mousavi, Parvin;  Yam, Scott;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" target=\"blank\">Mechanically controlled spectroscopic imaging for tissue classification<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 10951, <\/span><span class=\"tp_pub_additional_pages\">pp. 
632-640, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_956\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2019n,<br \/>\r\ntitle = {Mechanically controlled spectroscopic imaging for tissue classification},<br \/>\r\nauthor = {Laura Connolly and Tamas Ungi and Andras Lasso and Thomas Vaughan and Mark Asselin and Parvin Mousavi and Scott Yam and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nvolume = {10951},<br \/>\r\npages = {632-640},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nRaman Spectroscopy is amongst several optical imaging techniques that have the ability to characterize tissue non-invasively. To use these technologies for intraoperative tissue classification, fast and efficient analysis of optical data is required with minimal operator intervention. Additionally, there is a need for a reliable database of optical signatures to account for variable conditions. 
We developed a software system with an inexpensive, flexible mechanical framework to facilitate automated scanning of tissue and validate spectroscopic scans with histologic ground truths. This system will be used, in the future, to train a machine learning algorithm to distinguish between different tissue types using Raman Spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA sample of chicken breast tissue is mounted to a microscope slide following a biopsy of fresh frozen tissue. Landmarks for registration and evaluation are marked on the \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_956\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nRaman Spectroscopy is amongst several optical imaging techniques that have the ability to characterize tissue non-invasively. To use these technologies for intraoperative tissue classification, fast and efficient analysis of optical data is required with minimal operator intervention. Additionally, there is a need for a reliable database of optical signatures to account for variable conditions. We developed a software system with an inexpensive, flexible mechanical framework to facilitate automated scanning of tissue and validate spectroscopic scans with histologic ground truths. This system will be used, in the future, to train a machine learning algorithm to distinguish between different tissue types using Raman Spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA sample of chicken breast tissue is mounted to a microscope slide following a biopsy of fresh frozen tissue. 
Landmarks for registration and evaluation are marked on the \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_956\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><\/div><\/div>\n\n<\/div>\n","protected":false},"featured_media":1930,"template":"","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-2100","qsc_member","type-qsc_member","status-publish","has-post-thumbnail","hentry"],"acf":[],"spectra_custom_meta":{"_edit_lock":["1724616829:11"],"_thumbnail_id":["1930"],"_uag_custom_page_level_css":[""],"theme-transparent-header-meta":[""],"adv-header-id-meta":[""],"stick-header-meta":[""],"footnotes":[""],"_edit_last":["11"],"field_qsc_member_acf_email":["laura.connolly@queensu.ca"],"_field_qsc_member_acf_email":["qsc_member_acf_email"],"qsc_member_acf_position":["PhD Student"],"_qsc_member_acf_position":["field_qsc_member_acf_position"],"qsc_member_acf_department":["a:1:{i:0;s:69:\"Department of Electrical and Computer Engineering (Smith Engineering)\";}"],"_qsc_member_acf_department":["field_qsc_member_acf_department"],"field_qsc_member_acf_organization":["Queen's 
University"],"_field_qsc_member_acf_organization":["qsc_member_acf_organization"],"field_qsc_member_acf_linkedin":["https:\/\/www.linkedin.com\/in\/laura-connolly-0aab43144\/"],"_field_qsc_member_acf_linkedin":["qsc_member_acf_linkedin"],"field_qsc_member_acf_gscholar":["https:\/\/scholar.google.com\/citations?user=9E-xfIwAAAAJ&amp;hl=en"],"_field_qsc_member_acf_gscholar":["qsc_member_acf_gscholar"],"field_qsc_member_acf_github":["https:\/\/github.com\/LauraConnolly"],"_field_qsc_member_acf_github":["qsc_member_acf_github"],"field_qsc_member_acf_researchgate":[""],"_field_qsc_member_acf_researchgate":["qsc_member_acf_researchgate"],"field_qsc_member_acf_web":[""],"_field_qsc_member_acf_web":["qsc_member_acf_web"],"field_qsc_member_acf_program_status":["Current"],"_field_qsc_member_acf_program_status":["qsc_member_acf_program_status"],"field_qsc_member_acf_start_year":[""],"_field_qsc_member_acf_start_year":["qsc_member_acf_start_year"],"field_qsc_member_acf_end_year":[""],"_field_qsc_member_acf_end_year":["qsc_member_acf_end_year"],"_uag_css_file_name":["uag-css-2100.css"],"_uag_page_assets":["a:9:{s:3:\"css\";s:263:\".uag-blocks-common-selector{z-index:var(--z-index-desktop) !important}@media (max-width: 976px){.uag-blocks-common-selector{z-index:var(--z-index-tablet) !important}}@media (max-width: 767px){.uag-blocks-common-selector{z-index:var(--z-index-mobile) 
!important}}\n\";s:2:\"js\";s:0:\"\";s:18:\"current_block_list\";a:9:{i:0;s:12:\"core\/heading\";i:1;s:14:\"core\/paragraph\";i:2;s:14:\"core\/shortcode\";i:3;s:11:\"core\/search\";i:4;s:10:\"core\/group\";i:5;s:17:\"core\/latest-posts\";i:6;s:20:\"core\/latest-comments\";i:7;s:13:\"core\/archives\";i:8;s:15:\"core\/categories\";}s:8:\"uag_flag\";b:0;s:11:\"uag_version\";s:10:\"1771033544\";s:6:\"gfonts\";a:0:{}s:10:\"gfonts_url\";s:0:\"\";s:12:\"gfonts_files\";a:0:{}s:14:\"uag_faq_layout\";b:0;}"]},"uagb_featured_image_src":{"full":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909.png",796,715,false],"thumbnail":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909-150x150.png",150,150,true],"medium":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909-300x269.png",300,269,true],"medium_large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909-768x690.png",768,690,true],"large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-1012x1024.png",1012,1024,true],"1536x1536":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909.png",796,715,false],"2048x2048":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/04\/HeadShotNew-e1714324119909.png",796,715,false]},"uagb_author_info":{"display_name":"Khyle Sewpersaud","author_link":"https:\/\/labs.cs.queensu.ca\/perklab\/author\/"},"uagb_comment_info":0,"uagb_excerpt":"Laura\u00a0Connolly PhD Student Department of Electrical and Computer Engineering (Smith Engineering) Queen&#8217;s University laura.connolly@queensu.ca Biography Laura Connolly is a Ph.D. Candidate in Electrical Engineering at Queen\u2019s University in Kingston, ON, Canada. She is mentored by Dr. 
Gabor Fichtinger, Dr. Parvin Mousavi and Dr. Russell H. Taylor. She has recently completed her second visiting studentship at&hellip;","_links":{"self":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2100","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member"}],"about":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/types\/qsc_member"}],"version-history":[{"count":1,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2100\/revisions"}],"predecessor-version":[{"id":2101,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2100\/revisions\/2101"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media\/1930"}],"wp:attachment":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media?parent=2100"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}