{"id":2126,"date":"2024-08-25T20:54:09","date_gmt":"2024-08-25T20:54:09","guid":{"rendered":"https:\/\/labs.cs.queensu.ca\/perklab\/?post_type=qsc_member&#038;p=2126"},"modified":"2024-08-25T20:54:10","modified_gmt":"2024-08-25T20:54:10","slug":"chris-yeung","status":"publish","type":"qsc_member","link":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/chris-yeung\/","title":{"rendered":"Chris Yeung"},"content":{"rendered":"<div class=\"wp-block-columns is-layout-flex wp-block-columns-is-layout-flex qsc-member-single-core-info-container\">\n\t<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-photo-column\">\n\t\t<img loading=\"lazy\" decoding=\"async\" width=\"250\" height=\"250\" src=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg\" class=\"qsc-member-single-photo wp-post-image\" alt=\"\" \/>\n\t<\/div>\n\t<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-info-column\">\n\t\t<div class=\"qsc-member-name\"><h1>Chris Yeung<\/h1><\/div>\n\t\t<div class=\"qsc-member-position\">PhD Student<\/div>\n\t\t<div class=\"qsc-member-department\">School of Computing<\/div>\n\t\t<div class=\"qsc-member-organization\">Queen&#8217;s University<\/div>\n\t\t<div class=\"qsc-member-contact\">\n\t\t\t<div class=\"qsc-member-email\"><a href=\"mailto:chris.yeung@queensu.ca\">chris.yeung@queensu.ca<\/a><\/div>\n\t\t\t<div class=\"qsc-member-socials\">\n\t\t\t<a href=\"https:\/\/www.linkedin.com\/in\/chriscyyeung\/\" title=\"LinkedIn\"><i class=\"fa-brands fa-linkedin\"><\/i><\/a>\n\t\t\t<a href=\"https:\/\/github.com\/chriscyyeung\" title=\"GitHub\"><i class=\"fa-brands fa-github\"><\/i><\/a>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/div>\n<div class=\"qsc-member-bio\">\n\t\n<h2 class=\"wp-block-heading\">Biography<\/h2>\n\n\n\n<p>Chris is a PhD student in the School of Computing under the supervision of Dr. Gabor Fichtinger and Dr. 
Parvin Mousavi. Chris\u2019 research focuses on deep learning for medical image segmentation and on developing a navigation system for breast cancer surgery.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Publications<\/h2>\n\n\n<div class=\"teachpress_pub_list\"><form name=\"tppublistform\" method=\"get\"><a name=\"tppubs\" id=\"tppubs\"><\/a><\/form><div class=\"teachpress_publication_list\"><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Othman, Amira;  Kaufmann, Martin;  Koster, Teaghan;  Jamzad, Amoon;  Ungi, Tamas;  Rodgers, Jessica;  Mcmullen, Julie;  Yeung, Chris;  Janssen, Natasja;  Solberg, Kathryn;  Cheesman, Joanna;  Rudan, John;  Mousavi, Parvin;  Fichtinger, Gabor;  Hoyos, Andrea Gallo;  Jabs, Doris;  Engel, Jay;  Merchant, Shaila;  Walker, Ross;  Ren, Kevin;  Varma, Sonal<\/p><p class=\"tp_pub_title\">211 Three-Dimensional Navigated Mass Spectrometry for Intraoperative Margin Assessment During Breast Cancer Surgery <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Laboratory Investigation, <\/span><span class=\"tp_pub_additional_volume\">vol. 105, <\/span><span class=\"tp_pub_additional_number\">no. 
3, <\/span><span class=\"tp_pub_additional_year\">2025<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1165\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1165','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1165\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1165','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1165\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{othman2025,<br \/>\r\ntitle = {211 Three-Dimensional Navigated Mass Spectrometry for Intraoperative Margin Assessment During Breast Cancer Surgery},<br \/>\r\nauthor = {Amira Othman and Martin Kaufmann and Teaghan Koster and Amoon Jamzad and Tamas Ungi and Jessica Rodgers and Julie Mcmullen and Chris Yeung and Natasja Janssen and Kathryn Solberg and Joanna Cheesman and John Rudan and Parvin Mousavi and Gabor Fichtinger and Andrea Gallo Hoyos and Doris Jabs and Jay Engel and Shaila Merchant and Ross Walker and Kevin Ren and Sonal Varma},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-01-01},<br \/>\r\njournal = {Laboratory Investigation},<br \/>\r\nvolume = {105},<br \/>\r\nnumber = {3},<br \/>\r\npublisher = {Elsevier},<br \/>\r\nabstract = {Background <br \/>\r\nIntraoperative frozen sections are not routine in Breast cancer (BC), hence, patients with positive margin need reoperation for margin clearance. Technologies that can identify residual cancer in real-time during the surgery can be of immense help in reducing the morbidity, healthcare utilization and, the prognosis in BC. Rapid evaporative ionization mass spectrometry (REIMS) is a mass spectrometric technique that can chemically profile the surgical cauterization plume to classify the tissue as either cancerous, suspicious or non-cancerous. 
A plastic tube with solvent is attached to the cautery knife (i-knife) and it passes the smoke generated from cautery of the tissue to the mass spectrometric machine located in the OR. The spectra generated from this smoke solution are assessed in real-time to help classify the tissue. Our goal was to compare the accuracy of REIMS with histology (the gold standard) to \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1165','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1165\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Background <br \/>\r\nIntraoperative frozen sections are not routine in Breast cancer (BC), hence, patients with positive margin need reoperation for margin clearance. Technologies that can identify residual cancer in real-time during the surgery can be of immense help in reducing the morbidity, healthcare utilization and, the prognosis in BC. Rapid evaporative ionization mass spectrometry (REIMS) is a mass spectrometric technique that can chemically profile the surgical cauterization plume to classify the tissue as either cancerous, suspicious or non-cancerous. A plastic tube with solvent is attached to the cautery knife (i-knife) and it passes the smoke generated from cautery of the tissue to the mass spectrometric machine located in the OR. The spectra generated from this smoke solution are assessed in real-time to help classify the tissue. 
Our goal was to compare the accuracy of REIMS with histology (the gold standard) to \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1165','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_proceedings\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Kim, Andrew S.;  Yeung, Chris;  Szabo, Robert;  Sunderland, Kyle;  Hisey, Rebecca;  Morton, David;  Kikinis, Ron;  Diao, Babacar;  Mousavi, Parvin;  Ungi, Tamas;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1117\/12.3006533\" title=\"Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation\" target=\"blank\">Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation<\/a> <span class=\"tp_pub_type tp_  proceedings\">Proceedings<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_publisher\">SPIE, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_652\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@proceedings{Kim2024,<br \/>\r\ntitle = {Percutaneous nephrostomy needle guidance using real-time 3D 
anatomical visualization with live ultrasound segmentation},<br \/>\r\nauthor = {Andrew S. Kim and Chris Yeung and Robert Szabo and Kyle Sunderland and Rebecca Hisey and David Morton and Ron Kikinis and Babacar Diao and Parvin Mousavi and Tamas Ungi and Gabor Fichtinger},<br \/>\r\neditor = {Maryam E. Rettmann and Jeffrey H. Siewerdsen},<br \/>\r\ndoi = {10.1117\/12.3006533},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-03-29},<br \/>\r\nurldate = {2024-03-29},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {<br \/>\r\nPURPOSE: Percutaneous nephrostomy is a commonly performed procedure to drain urine to provide relief in patients with hydronephrosis. Conventional percutaneous nephrostomy needle guidance methods can be difficult, expensive, or not portable. We propose an open-source real-time 3D anatomical visualization aid for needle guidance with live ultrasound segmentation and 3D volume reconstruction using free, open-source software. METHODS: Basic hydronephrotic kidney phantoms were created, and recordings of these models were manually segmented and used to train a deep learning model that makes live segmentation predictions to perform live 3D volume reconstruction of the fluid-filled cavity. Participants performed 5 needle insertions with the visualization aid and 5 insertions with ultrasound needle guidance on a kidney phantom in randomized order, and these were recorded. Recordings of the trials were analyzed for needle tip distance to the center of the target calyx, needle insertion time, and success rate. Participants also completed a survey on their experience. RESULTS: Using the visualization aid showed significantly higher accuracy, while needle insertion time and success rate were not statistically significant at our sample size. Participants mostly responded positively to the visualization aid, and 80% found it easier to use than ultrasound needle guidance. 
CONCLUSION: We found that our visualization aid produced increased accuracy and an overall positive experience. We demonstrated that our system is functional and stable and believe that the workflow with this system can be applied to other procedures. This visualization aid system is effective on phantoms and is ready for translation with clinical data.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {proceedings}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_652\" style=\"display:none;\"><div class=\"tp_abstract_entry\"><br \/>\r\nPURPOSE: Percutaneous nephrostomy is a commonly performed procedure to drain urine to provide relief in patients with hydronephrosis. Conventional percutaneous nephrostomy needle guidance methods can be difficult, expensive, or not portable. We propose an open-source real-time 3D anatomical visualization aid for needle guidance with live ultrasound segmentation and 3D volume reconstruction using free, open-source software. METHODS: Basic hydronephrotic kidney phantoms were created, and recordings of these models were manually segmented and used to train a deep learning model that makes live segmentation predictions to perform live 3D volume reconstruction of the fluid-filled cavity. Participants performed 5 needle insertions with the visualization aid and 5 insertions with ultrasound needle guidance on a kidney phantom in randomized order, and these were recorded. Recordings of the trials were analyzed for needle tip distance to the center of the target calyx, needle insertion time, and success rate. Participants also completed a survey on their experience. RESULTS: Using the visualization aid showed significantly higher accuracy, while needle insertion time and success rate were not statistically significant at our sample size. 
Participants mostly responded positively to the visualization aid, and 80% found it easier to use than ultrasound needle guidance. CONCLUSION: We found that our visualization aid produced increased accuracy and an overall positive experience. We demonstrated that our system is functional and stable and believe that the workflow with this system can be applied to other procedures. This visualization aid system is effective on phantoms and is ready for translation with clinical data.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_652\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1117\/12.3006533\" title=\"Follow DOI:10.1117\/12.3006533\" target=\"_blank\">doi:10.1117\/12.3006533<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Kaufmann, Martin;  Jamzad, Amoon;  Ungi, Tamas;  Rodgers, Jessica R;  Koster, Teaghan;  Yeung, Chris;  Ehrlich, Josh;  Santilli, Alice;  Asselin, Mark;  Janssen, Natasja;  McMullen, Julie;  Solberg, Kathryn;  Cheesman, Joanna;  Carlo, Alessia Di;  Ren, Kevin Yi Mi;  Varma, Sonal;  Merchant, Shaila;  Engel, Cecil Jay;  Walker, G Ross;  Gallo, Andrea;  Jabs, Doris;  Mousavi, Parvin;  Fichtinger, Gabor;  Rudan, John F<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" title=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" target=\"blank\">Abstract PO2-23-07: Three-dimensional navigated mass spectrometry for intraoperative margin assessment during 
breast cancer surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Cancer Research, <\/span><span class=\"tp_pub_additional_volume\">vol. 84, <\/span><span class=\"tp_pub_additional_issue\">iss. 9_Supplement, <\/span><span class=\"tp_pub_additional_pages\">pp. PO2-23-07-PO2-23-07, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_985\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2024c,<br \/>\r\ntitle = {Abstract PO2-23-07: Three-dimensional navigated mass spectrometry for intraoperative margin assessment during breast cancer surgery},<br \/>\r\nauthor = {Martin Kaufmann and Amoon Jamzad and Tamas Ungi and Jessica R Rodgers and Teaghan Koster and Chris Yeung and Josh Ehrlich and Alice Santilli and Mark Asselin and Natasja Janssen and Julie McMullen and Kathryn Solberg and Joanna Cheesman and Alessia Di Carlo and Kevin Yi Mi Ren and Sonal Varma and Shaila Merchant and Cecil Jay Engel and G Ross Walker and Andrea Gallo and Doris Jabs and Parvin Mousavi and Gabor Fichtinger and John F Rudan},<br \/>\r\nurl = 
{https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\njournal = {Cancer Research},<br \/>\r\nvolume = {84},<br \/>\r\nissue = {9_Supplement},<br \/>\r\npages = {PO2-23-07-PO2-23-07},<br \/>\r\npublisher = {The American Association for Cancer Research},<br \/>\r\nabstract = {Positive resection margins occur in approximately 25% of breast cancer (BCa) surgeries, requiring re-operation. Margin status is not routinely available during surgery; thus, technologies that identify residual cancer on the specimen or cavity are needed to provide intraoperative decision support that may reduce positive margin rates. Rapid evaporative ionization mass spectrometry (REIMS) is an emerging technique that chemically profiles the plume generated by tissue cauterization to classify the ablated tissue as either cancerous or non-cancerous, on the basis of detected lipid species. Although REIMS can distinguish cancer and non-cancerous breast tissue by the signals generated, it does not indicate the location of the classified tissue in real-time. Our objective was to combine REIMS with spatio-temporal navigation (navigated REIMS), and to compare performance of navigated REIMS with conventional \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_985\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Positive resection margins occur in approximately 25% of breast cancer (BCa) surgeries, requiring re-operation. Margin status is not routinely available during surgery; thus, technologies that identify residual cancer on the specimen or cavity are needed to provide intraoperative decision support that may reduce positive margin rates. 
Rapid evaporative ionization mass spectrometry (REIMS) is an emerging technique that chemically profiles the plume generated by tissue cauterization to classify the ablated tissue as either cancerous or non-cancerous, on the basis of detected lipid species. Although REIMS can distinguish cancer and non-cancerous breast tissue by the signals generated, it does not indicate the location of the classified tissue in real-time. Our objective was to combine REIMS with spatio-temporal navigation (navigated REIMS), and to compare performance of navigated REIMS with conventional \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_985\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" title=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" target=\"_blank\">https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Yeung, Chris;  Ungi, Tamas;  Hu, Zoe;  Jamzad, Amoon;  Kaufmann, Martin;  Walker, Ross;  Merchant, Shaila;  Engel, Cecil Jay;  Jabs, Doris;  Rudan, John;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y\" target=\"blank\">From quantitative metrics to clinical success: assessing the utility of deep learning for tumor segmentation in breast 
surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_pages\">pp. 1-9, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_986\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('986','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_986\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('986','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_986\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('986','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_986\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{yeung2024,<br \/>\r\ntitle = {From quantitative metrics to clinical success: assessing the utility of deep learning for tumor segmentation in breast surgery},<br \/>\r\nauthor = {Chris Yeung and Tamas Ungi and Zoe Hu and Amoon Jamzad and Martin Kaufmann and Ross Walker and Shaila Merchant and Cecil Jay Engel and Doris Jabs and John Rudan and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\nurldate = {2024-01-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\npages = {1-9},<br \/>\r\npublisher = {Springer International Publishing},<br \/>\r\nabstract = {Purpose <br \/>\r\nPreventing positive margins is essential for ensuring 
favorable patient outcomes following breast-conserving surgery (BCS). Deep learning has the potential to enable this by automatically contouring the tumor and guiding resection in real time. However, evaluation of such models with respect to pathology outcomes is necessary for their successful translation into clinical practice. <br \/>\r\nMethods <br \/>\r\nSixteen deep learning models based on established architectures in the literature are trained on 7318 ultrasound images from 33 patients. Models are ranked by an expert based on their contours generated from images in our test set. Generated contours from each model are also analyzed using recorded cautery trajectories of five navigated BCS cases to predict margin status. Predicted margins are compared with pathology reports. <br \/>\r\nResults <br \/>\r\nThe best-performing model using both quantitative evaluation and our visual \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('986','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_986\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Purpose <br \/>\r\nPreventing positive margins is essential for ensuring favorable patient outcomes following breast-conserving surgery (BCS). Deep learning has the potential to enable this by automatically contouring the tumor and guiding resection in real time. However, evaluation of such models with respect to pathology outcomes is necessary for their successful translation into clinical practice. <br \/>\r\nMethods <br \/>\r\nSixteen deep learning models based on established architectures in the literature are trained on 7318 ultrasound images from 33 patients. Models are ranked by an expert based on their contours generated from images in our test set. 
Generated contours from each model are also analyzed using recorded cautery trajectories of five navigated BCS cases to predict margin status. Predicted margins are compared with pathology reports. <br \/>\r\nResults <br \/>\r\nThe best-performing model using both quantitative evaluation and our visual \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('986','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_986\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y\" target=\"_blank\">https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03133-y<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('986','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Hashtrudi-Zaad, Kian;  Ungi, Tamas;  Yeung, Chris;  Baum, Zachary;  Cernelev, Pavel-Dumitru;  Hage, Anthony N;  Schlenger, Christopher;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\">Expert-guided optimization of ultrasound segmentation models for 3D spine imaging <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_pages\">pp. 
680-685, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1151\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1151','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1151\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1151','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1151\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{hashtrudi-zaad2024,<br \/>\r\ntitle = {Expert-guided optimization of ultrasound segmentation models for 3D spine imaging},<br \/>\r\nauthor = {Kian Hashtrudi-Zaad and Tamas Ungi and Chris Yeung and Zachary Baum and Pavel-Dumitru Cernelev and Anthony N Hage and Christopher Schlenger and Gabor Fichtinger},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\npages = {680-685},<br \/>\r\npublisher = {IEEE},<br \/>\r\nabstract = {We explored ultrasound for imaging bones, specifically the spine, as a safer and more accessible alternative to conventional X-ray. We aimed to improve how well deep learning segmentation models filter bone signals from ultrasound frames with the goal of using these segmented images for reconstructing the 3-dimensional spine volume.Our dataset consisted of spatially tracked ultrasound scans from 25 patients. Image frames from these scans were also manually annotated to provide training data for image segmentation deep learning. To find the optimal automatic segmentation method, we assessed five different artificial neural network models and their variations by hyperparameter tuning. Our main contribution is a new approach for model selection, employing an Elo rating system to efficiently rank trained models based on their visual performance as assessed by clinical users. 
This method addresses the \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1151','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1151\" style=\"display:none;\"><div class=\"tp_abstract_entry\">We explored ultrasound for imaging bones, specifically the spine, as a safer and more accessible alternative to conventional X-ray. We aimed to improve how well deep learning segmentation models filter bone signals from ultrasound frames with the goal of using these segmented images for reconstructing the 3-dimensional spine volume.Our dataset consisted of spatially tracked ultrasound scans from 25 patients. Image frames from these scans were also manually annotated to provide training data for image segmentation deep learning. To find the optimal automatic segmentation method, we assessed five different artificial neural network models and their variations by hyperparameter tuning. Our main contribution is a new approach for model selection, employing an Elo rating system to efficiently rank trained models based on their visual performance as assessed by clinical users. 
This method addresses the \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1151','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Orosz, G\u00e1bor;  Szab\u00f3, R\u00f3bert Zsolt;  Ungi, Tam\u00e1s;  Barr, Colton;  Yeung, Chris;  Fichtinger, G\u00e1bor;  G\u00e1l, J\u00e1nos;  Haidegger, Tam\u00e1s<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_137.pdf\" title=\"https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_137.pdf\" target=\"blank\">Lung Ultrasound Imaging and Image Processing with Artificial Intelligence Methods for Bedside Diagnostic Examinations<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Acta Polytechnica Hungarica, <\/span><span class=\"tp_pub_additional_volume\">vol. 20, <\/span><span class=\"tp_pub_additional_issue\">iss. 
8, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_915\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('915','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_915\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('915','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_915\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('915','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_915\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023d,<br \/>\r\ntitle = {Lung Ultrasound Imaging and Image Processing with Artificial Intelligence Methods for Bedside Diagnostic Examinations},<br \/>\r\nauthor = {G\u00e1bor Orosz and R\u00f3bert Zsolt Szab\u00f3 and Tam\u00e1s Ungi and Colton Barr and Chris Yeung and G\u00e1bor Fichtinger and J\u00e1nos G\u00e1l and Tam\u00e1s Haidegger},<br \/>\r\nurl = {https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_137.pdf},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\njournal = {Acta Polytechnica Hungarica},<br \/>\r\nvolume = {20},<br \/>\r\nissue = {8},<br \/>\r\nabstract = {Artificial Intelligence-assisted radiology has shown to offer significant benefits in clinical care. Physicians often face challenges in identifying the underlying causes of acute respiratory failure. One method employed by experts is the utilization of bedside lung ultrasound, although it has a significant learning curve. In our study, we explore the potential of a Machine Learning-based automated decision-support system to assist inexperienced practitioners in interpreting lung ultrasound scans. 
This system incorporates medical ultrasound, advanced data processing techniques, and a neural network implementation to achieve its objective. The article provides a comprehensive overview of the steps involved in data preparation and the implementation of the neural network. The accuracy and error rate of the most effective model are presented, accompanied by illustrative examples of their predictions. Furthermore, the paper concludes with an evaluation of the results, identification of limitations, and recommendations for future enhancements.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('915','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_915\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Artificial Intelligence-assisted radiology has shown to offer significant benefits in clinical care. Physicians often face challenges in identifying the underlying causes of acute respiratory failure. One method employed by experts is the utilization of bedside lung ultrasound, although it has a significant learning curve. In our study, we explore the potential of a Machine Learning-based automated decision-support system to assist inexperienced practitioners in interpreting lung ultrasound scans. This system incorporates medical ultrasound, advanced data processing techniques, and a neural network implementation to achieve its objective. The article provides a comprehensive overview of the steps involved in data preparation and the implementation of the neural network. The accuracy and error rate of the most effective model are presented, accompanied by illustrative examples of their predictions. 
Furthermore, the paper concludes with an evaluation of the results, identification of limitations, and recommendations for future enhancements.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('915','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_915\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_137.pdf\" title=\"https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_1[...]\" target=\"_blank\">https:\/\/acta.uni-obuda.hu\/Orosz_Szabo_Ungi_Barr_Yeung_Fichtinger_Gal_Haidegger_1[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('915','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Szab\u00f3, R\u00f3bert Zsolt;  Orosz, G\u00e1bor;  Ungi, Tam\u00e1s;  Barr, Colton;  Yeung, Chris;  Incze, Roland;  Fichtinger, Gabor;  G\u00e1l, J\u00e1nos;  Haidegger, Tam\u00e1s<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/\" target=\"blank\">Automation of lung ultrasound imaging and image processing for bedside diagnostic examinations<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_pages\">pp. 
000779-000784, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_940\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('940','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_940\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('940','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_940\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('940','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_940\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023h,<br \/>\r\ntitle = {Automation of lung ultrasound imaging and image processing for bedside diagnostic examinations},<br \/>\r\nauthor = {R\u00f3bert Zsolt Szab\u00f3 and G\u00e1bor Orosz and Tam\u00e1s Ungi and Colton Barr and Chris Yeung and Roland Incze and Gabor Fichtinger and J\u00e1nos G\u00e1l and Tam\u00e1s Haidegger},<br \/>\r\nurl = {https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\npages = {000779-000784},<br \/>\r\npublisher = {IEEE},<br \/>\r\nabstract = {The causes of acute respiratory failure can be difficult to identify for physicians. Experts can differentiate these causes using bedside lung ultrasound, but lung ultrasound has a considerable learning curve. We investigate if an automated decision-support system could help novices interpret lung ultrasound scans. The system utilizes medical ultrasound, data processing, and a neural network implementation to achieve this goal. The article details the steps taken in the data preparation, and the implementation of the neural network. 
The best model\u2019s accuracy and error rate are presented, along with examples of its predictions. The paper concludes with an evaluation of the results, identification of limitations, and suggestions for future improvements.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('940','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_940\" style=\"display:none;\"><div class=\"tp_abstract_entry\">The causes of acute respiratory failure can be difficult to identify for physicians. Experts can differentiate these causes using bedside lung ultrasound, but lung ultrasound has a considerable learning curve. We investigate if an automated decision-support system could help novices interpret lung ultrasound scans. The system utilizes medical ultrasound, data processing, and a neural network implementation to achieve this goal. The article details the steps taken in the data preparation, and the implementation of the neural network. The best model\u2019s accuracy and error rate are presented, along with examples of its predictions. 
The paper concludes with an evaluation of the results, identification of limitations, and suggestions for future improvements.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('940','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_940\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/\" target=\"_blank\">https:\/\/ieeexplore.ieee.org\/abstract\/document\/10158672\/<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('940','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Yeung, Chris;  Ehrlich, Joshua;  Jamzad, Amoon;  Kaufmann, Martin;  Rudan, John;  Engel, Cecil Jay;  Mousavi, Parvin;  Ungi, Tamas;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/Cautery-trajectory-analysis-for-evaluation-of-resection-margins-in-breast\/10.1117\/12.2654497.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/Cautery-trajectory-analysis-for-evaluation-of-resection-margins-in-breast\/10.1117\/12.2654497.short\" target=\"blank\">Cautery trajectory analysis for evaluation of resection margins in breast-conserving surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 12466, <\/span><span class=\"tp_pub_additional_pages\">pp. 
495-501, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_999\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('999','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_999\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('999','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_999\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('999','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_999\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023q,<br \/>\r\ntitle = {Cautery trajectory analysis for evaluation of resection margins in breast-conserving surgery},<br \/>\r\nauthor = {Chris Yeung and Joshua Ehrlich and Amoon Jamzad and Martin Kaufmann and John Rudan and Cecil Jay Engel and Parvin Mousavi and Tamas Ungi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/Cautery-trajectory-analysis-for-evaluation-of-resection-margins-in-breast\/10.1117\/12.2654497.short},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\nvolume = {12466},<br \/>\r\npages = {495-501},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {After breast-conserving surgery, positive margins occur when breast cancer cells are found on the resection margin, leading to a higher chance of recurrence and the need for repeat surgery. The NaviKnife is an electromagnetic tracking-based surgical navigation system that helps to provide visual and spatial feedback to the surgeon. In this study, we conduct a gross evaluation of this navigation system with respect to resection margins. 
The trajectory of the surgical cautery relative to ultrasound-visible tumor will be visualized, and its distance and location from the tumor will be compared with pathology reports. Six breast-conserving surgery cases that resulted in positive margins were performed using the NaviKnife system. Trackers were placed on the surgical tools and their positions in three-dimensional space were recorded throughout the procedure. The closest distance between the cautery and the tumor \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('999','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_999\" style=\"display:none;\"><div class=\"tp_abstract_entry\">After breast-conserving surgery, positive margins occur when breast cancer cells are found on the resection margin, leading to a higher chance of recurrence and the need for repeat surgery. The NaviKnife is an electromagnetic tracking-based surgical navigation system that helps to provide visual and spatial feedback to the surgeon. In this study, we conduct a gross evaluation of this navigation system with respect to resection margins. The trajectory of the surgical cautery relative to ultrasound-visible tumor will be visualized, and its distance and location from the tumor will be compared with pathology reports. Six breast-conserving surgery cases that resulted in positive margins were performed using the NaviKnife system. Trackers were placed on the surgical tools and their positions in three-dimensional space were recorded throughout the procedure. 
The closest distance between the cautery and the tumor \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('999','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_999\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/Cautery-trajectory-analysis-for-evaluation-of-resection-margins-in-breast\/10.1117\/12.2654497.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/1246622\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('999','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Hu, Zoe;  Fauerbach, Paola V. Nasute;  Yeung, Chris;  Ungi, Tamas;  Rudan, John;  Engel, C. Jay;  Mousavi, Parvin;  Fichtinger, Gabor;  Jabs, Doris<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1007\/s11548-022-02658-4\" title=\"Real-time automatic tumor segmentation for ultrasound-guided breast-conserving surgery navigation\" target=\"blank\">Real-time automatic tumor segmentation for ultrasound-guided breast-conserving surgery navigation<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_volume\">vol. 17, <\/span><span class=\"tp_pub_additional_number\">no. 9, <\/span><span class=\"tp_pub_additional_pages\">pp. 
1663\u20131672, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_5\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('5','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_5\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('5','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_5\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Hu2022,<br \/>\r\ntitle = {Real-time automatic tumor segmentation for ultrasound-guided breast-conserving surgery navigation},<br \/>\r\nauthor = {Zoe Hu and Paola V. Nasute Fauerbach and Chris Yeung and Tamas Ungi and John Rudan and C. Jay Engel and Parvin Mousavi and Gabor Fichtinger and Doris Jabs},<br \/>\r\ndoi = {10.1007\/s11548-022-02658-4},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-05-01},<br \/>\r\nurldate = {2022-05-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\nvolume = {17},<br \/>\r\nnumber = {9},<br \/>\r\npages = {1663\u20131672},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('5','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_5\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1007\/s11548-022-02658-4\" title=\"Follow DOI:10.1007\/s11548-022-02658-4\" target=\"_blank\">doi:10.1007\/s11548-022-02658-4<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('5','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><\/div><\/div>\n\n<\/div>\n","protected":false},"featured_media":238,"template":"","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-2126","qsc_member","type-qsc_member","status-publish","has-post-thumbnail","hentry"],"acf":[],"spectra_custom_meta":{"_edit_lock":["1730838437:7"],"_thumbnail_id":["238"],"_uag_custom_page_level_css":[""],"theme-transparent-header-meta":[""],"adv-header-id-meta":[""],"stick-header-meta":[""],"footnotes":[""],"_edit_last":["11"],"field_qsc_member_acf_email":["chris.yeung@queensu.ca"],"_field_qsc_member_acf_email":["qsc_member_acf_email"],"qsc_member_acf_position":["PhD Student"],"_qsc_member_acf_position":["field_qsc_member_acf_position"],"qsc_member_acf_department":["a:1:{i:0;s:19:\"School of Computing\";}"],"_qsc_member_acf_department":["field_qsc_member_acf_department"],"field_qsc_member_acf_organization":["Queen's 
University"],"_field_qsc_member_acf_organization":["qsc_member_acf_organization"],"field_qsc_member_acf_linkedin":["https:\/\/www.linkedin.com\/in\/chriscyyeung\/"],"_field_qsc_member_acf_linkedin":["qsc_member_acf_linkedin"],"field_qsc_member_acf_gscholar":[""],"_field_qsc_member_acf_gscholar":["qsc_member_acf_gscholar"],"field_qsc_member_acf_github":["https:\/\/github.com\/chriscyyeung"],"_field_qsc_member_acf_github":["qsc_member_acf_github"],"field_qsc_member_acf_researchgate":[""],"_field_qsc_member_acf_researchgate":["qsc_member_acf_researchgate"],"field_qsc_member_acf_web":[""],"_field_qsc_member_acf_web":["qsc_member_acf_web"],"field_qsc_member_acf_program_status":["Current"],"_field_qsc_member_acf_program_status":["qsc_member_acf_program_status"],"field_qsc_member_acf_start_year":[""],"_field_qsc_member_acf_start_year":["qsc_member_acf_start_year"],"field_qsc_member_acf_end_year":[""],"_field_qsc_member_acf_end_year":["qsc_member_acf_end_year"],"_uag_css_file_name":["uag-css-2126.css"],"_uag_page_assets":["a:9:{s:3:\"css\";s:263:\".uag-blocks-common-selector{z-index:var(--z-index-desktop) !important}@media (max-width: 976px){.uag-blocks-common-selector{z-index:var(--z-index-tablet) !important}}@media (max-width: 767px){.uag-blocks-common-selector{z-index:var(--z-index-mobile) 
!important}}\n\";s:2:\"js\";s:0:\"\";s:18:\"current_block_list\";a:9:{i:0;s:12:\"core\/heading\";i:1;s:14:\"core\/paragraph\";i:2;s:14:\"core\/shortcode\";i:3;s:11:\"core\/search\";i:4;s:10:\"core\/group\";i:5;s:17:\"core\/latest-posts\";i:6;s:20:\"core\/latest-comments\";i:7;s:13:\"core\/archives\";i:8;s:15:\"core\/categories\";}s:8:\"uag_flag\";b:0;s:11:\"uag_version\";s:10:\"1771033544\";s:6:\"gfonts\";a:0:{}s:10:\"gfonts_url\";s:0:\"\";s:12:\"gfonts_files\";a:0:{}s:14:\"uag_faq_layout\";b:0;}"],"_uagb_previous_block_counts":["a:90:{s:21:\"uagb\/advanced-heading\";i:0;s:15:\"uagb\/blockquote\";i:0;s:12:\"uagb\/buttons\";i:0;s:18:\"uagb\/buttons-child\";i:0;s:19:\"uagb\/call-to-action\";i:0;s:15:\"uagb\/cf7-styler\";i:0;s:11:\"uagb\/column\";i:0;s:12:\"uagb\/columns\";i:0;s:14:\"uagb\/container\";i:0;s:21:\"uagb\/content-timeline\";i:0;s:27:\"uagb\/content-timeline-child\";i:0;s:14:\"uagb\/countdown\";i:0;s:12:\"uagb\/counter\";i:0;s:8:\"uagb\/faq\";i:0;s:14:\"uagb\/faq-child\";i:0;s:10:\"uagb\/forms\";i:0;s:17:\"uagb\/forms-accept\";i:0;s:19:\"uagb\/forms-checkbox\";i:0;s:15:\"uagb\/forms-date\";i:0;s:16:\"uagb\/forms-email\";i:0;s:17:\"uagb\/forms-hidden\";i:0;s:15:\"uagb\/forms-name\";i:0;s:16:\"uagb\/forms-phone\";i:0;s:16:\"uagb\/forms-radio\";i:0;s:17:\"uagb\/forms-select\";i:0;s:19:\"uagb\/forms-textarea\";i:0;s:17:\"uagb\/forms-toggle\";i:0;s:14:\"uagb\/forms-url\";i:0;s:14:\"uagb\/gf-styler\";i:0;s:15:\"uagb\/google-map\";i:0;s:11:\"uagb\/how-to\";i:0;s:16:\"uagb\/how-to-step\";i:0;s:9:\"uagb\/icon\";i:0;s:14:\"uagb\/icon-list\";i:0;s:20:\"uagb\/icon-list-child\";i:0;s:10:\"uagb\/image\";i:0;s:18:\"uagb\/image-gallery\";i:0;s:13:\"uagb\/info-box\";i:0;s:18:\"uagb\/inline-notice\";i:0;s:11:\"uagb\/lottie\";i:0;s:21:\"uagb\/marketing-button\";i:0;s:10:\"uagb\/modal\";i:0;s:18:\"uagb\/popup-builder\";i:0;s:16:\"uagb\/post-button\";i:0;s:18:\"uagb\/post-carousel\";i:0;s:17:\"uagb\/post-excerpt\";i:0;s:14:\"uagb\/post-grid\";i:0;s:15:\"uagb\/post-image\";i:0;s
:17:\"uagb\/post-masonry\";i:0;s:14:\"uagb\/post-meta\";i:0;s:18:\"uagb\/post-taxonomy\";i:0;s:18:\"uagb\/post-timeline\";i:0;s:15:\"uagb\/post-title\";i:0;s:20:\"uagb\/restaurant-menu\";i:0;s:26:\"uagb\/restaurant-menu-child\";i:0;s:11:\"uagb\/review\";i:0;s:12:\"uagb\/section\";i:0;s:14:\"uagb\/separator\";i:0;s:11:\"uagb\/slider\";i:0;s:17:\"uagb\/slider-child\";i:0;s:17:\"uagb\/social-share\";i:0;s:23:\"uagb\/social-share-child\";i:0;s:16:\"uagb\/star-rating\";i:0;s:23:\"uagb\/sure-cart-checkout\";i:0;s:22:\"uagb\/sure-cart-product\";i:0;s:15:\"uagb\/sure-forms\";i:0;s:22:\"uagb\/table-of-contents\";i:0;s:9:\"uagb\/tabs\";i:0;s:15:\"uagb\/tabs-child\";i:0;s:18:\"uagb\/taxonomy-list\";i:0;s:9:\"uagb\/team\";i:0;s:16:\"uagb\/testimonial\";i:0;s:14:\"uagb\/wp-search\";i:0;s:19:\"uagb\/instagram-feed\";i:0;s:10:\"uagb\/login\";i:0;s:17:\"uagb\/loop-builder\";i:0;s:18:\"uagb\/loop-category\";i:0;s:20:\"uagb\/loop-pagination\";i:0;s:15:\"uagb\/loop-reset\";i:0;s:16:\"uagb\/loop-search\";i:0;s:14:\"uagb\/loop-sort\";i:0;s:17:\"uagb\/loop-wrapper\";i:0;s:13:\"uagb\/register\";i:0;s:19:\"uagb\/register-email\";i:0;s:24:\"uagb\/register-first-name\";i:0;s:23:\"uagb\/register-last-name\";i:0;s:22:\"uagb\/register-password\";i:0;s:30:\"uagb\/register-reenter-password\";i:0;s:19:\"uagb\/register-terms\";i:0;s:22:\"uagb\/register-username\";i:0;}"]},"uagb_featured_image_src":{"full":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false],"thumbnail":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",150,150,false],"medium":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false],"medium_large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false],"large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/si
tes\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false],"1536x1536":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false],"2048x2048":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2023\/09\/cyeung-e1694031381570.jpg",250,250,false]},"uagb_author_info":{"display_name":"Khyle Sewpersaud","author_link":"https:\/\/labs.cs.queensu.ca\/perklab\/author\/"},"uagb_comment_info":0,"uagb_excerpt":"Chris Yeung PhD Student School of Computing Queen&#8217;s University chris.yeung@queensu.ca Biography Chris is a PhD student in the School of Computing under the supervision of Dr. Gabor Fichtinger and Dr. Parvin Mousavi. Chris\u2019 research is in deep learning for medical image segmentation and developing a navigation system for breast cancer surgery. Publications Othman, Amira; Kaufmann,&hellip;","_links":{"self":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2126","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member"}],"about":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/types\/qsc_member"}],"version-history":[{"count":1,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2126\/revisions"}],"predecessor-version":[{"id":2127,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2126\/revisions\/2127"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media\/238"}],"wp:attachment":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media?parent=2126"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}