{"id":2342,"date":"2024-02-25T16:11:50","date_gmt":"2024-02-25T16:11:50","guid":{"rendered":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/mark-asselin\/"},"modified":"2024-02-25T16:11:50","modified_gmt":"2024-02-25T16:11:50","slug":"mark-asselin","status":"publish","type":"qsc_member","link":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/mark-asselin\/","title":{"rendered":"Mark Asselin"},"content":{"rendered":"<div class=\"wp-block-columns is-layout-flex wp-block-columns-is-layout-flex qsc-member-single-core-info-container\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-photo-column\">\n\t\t<img loading=\"lazy\" decoding=\"async\" width=\"250\" height=\"250\" src=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0-300x300.jpg\" class=\"qsc-member-single-photo wp-post-image\" alt=\"\" srcset=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0-300x300.jpg 300w, https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0-150x150.jpg 150w, https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg 480w\" sizes=\"auto, (max-width: 250px) 100vw, 250px\" \/>\n\t<\/div>\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-info-column\">\n<div class=\"qsc-member-name\">\n<h1>Mark Asselin<\/h1>\n<\/div>\n<div class=\"qsc-member-position\">Master&#8217;s Student<\/div>\n<div class=\"qsc-member-department\">School of Computing<\/div>\n<div class=\"qsc-member-organization\">Queen&#8217;s University<\/div>\n<div class=\"qsc-member-date-range\">Member from <em>May 2017<\/em> to <em>present<\/em><\/div>\n<div class=\"qsc-member-contact\">\n<div class=\"qsc-member-socials\">\n\t\t\t<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"qsc-member-bio\">\n\tMark is a master&#8217;s student in biomedical computing with previous experience in 
electrical engineering. Please see his website at <a href=\"https:\/\/markasselin.github.io\/\">https:\/\/markasselin.github.io\/<\/a> for more information about him.<br \/>\n<div class=\"teachpress_pub_list\"><form name=\"tppublistform\" method=\"get\"><a name=\"tppubs\" id=\"tppubs\"><\/a><\/form><div class=\"teachpress_publication_list\"><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Kaufmann, Martin;  Jamzad, Amoon;  Ungi, Tamas;  Rodgers, Jessica R;  Koster, Teaghan;  Yeung, Chris;  Ehrlich, Josh;  Santilli, Alice;  Asselin, Mark;  Janssen, Natasja;  McMullen, Julie;  Solberg, Kathryn;  Cheesman, Joanna;  Carlo, Alessia Di;  Ren, Kevin Yi Mi;  Varma, Sonal;  Merchant, Shaila;  Engel, Cecil Jay;  Walker, G Ross;  Gallo, Andrea;  Jabs, Doris;  Mousavi, Parvin;  Fichtinger, Gabor;  Rudan, John F<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" title=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" target=\"blank\">Abstract PO2-23-07: Three-dimensional navigated mass spectrometry for intraoperative margin assessment during breast cancer surgery<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Cancer Research, <\/span><span class=\"tp_pub_additional_volume\">vol. 84, <\/span><span class=\"tp_pub_additional_issue\">iss. 9_Supplement, <\/span><span class=\"tp_pub_additional_pages\">pp. 
PO2-23-07-PO2-23-07, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_985\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('985','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_985\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2024c,<br \/>\r\ntitle = {Abstract PO2-23-07: Three-dimensional navigated mass spectrometry for intraoperative margin assessment during breast cancer surgery},<br \/>\r\nauthor = {Martin Kaufmann and Amoon Jamzad and Tamas Ungi and Jessica R Rodgers and Teaghan Koster and Chris Yeung and Josh Ehrlich and Alice Santilli and Mark Asselin and Natasja Janssen and Julie McMullen and Kathryn Solberg and Joanna Cheesman and Alessia Di Carlo and Kevin Yi Mi Ren and Sonal Varma and Shaila Merchant and Cecil Jay Engel and G Ross Walker and Andrea Gallo and Doris Jabs and Parvin Mousavi and Gabor Fichtinger and John F Rudan},<br \/>\r\nurl = {https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-01-01},<br \/>\r\njournal = {Cancer Research},<br \/>\r\nvolume = {84},<br \/>\r\nissue = {9_Supplement},<br \/>\r\npages = {PO2-23-07-PO2-23-07},<br \/>\r\npublisher = {The American Association for Cancer Research},<br \/>\r\nabstract = {Positive resection margins occur in approximately 25% of breast cancer (BCa) surgeries, requiring re-operation. 
Margin status is not routinely available during surgery; thus, technologies that identify residual cancer on the specimen or cavity are needed to provide intraoperative decision support that may reduce positive margin rates. Rapid evaporative ionization mass spectrometry (REIMS) is an emerging technique that chemically profiles the plume generated by tissue cauterization to classify the ablated tissue as either cancerous or non-cancerous, on the basis of detected lipid species. Although REIMS can distinguish cancer and non-cancerous breast tissue by the signals generated, it does not indicate the location of the classified tissue in real-time. Our objective was to combine REIMS with spatio-temporal navigation (navigated REIMS), and to compare performance of navigated REIMS with conventional \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_985\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Positive resection margins occur in approximately 25% of breast cancer (BCa) surgeries, requiring re-operation. Margin status is not routinely available during surgery; thus, technologies that identify residual cancer on the specimen or cavity are needed to provide intraoperative decision support that may reduce positive margin rates. Rapid evaporative ionization mass spectrometry (REIMS) is an emerging technique that chemically profiles the plume generated by tissue cauterization to classify the ablated tissue as either cancerous or non-cancerous, on the basis of detected lipid species. Although REIMS can distinguish cancer and non-cancerous breast tissue by the signals generated, it does not indicate the location of the classified tissue in real-time. 
Our objective was to combine REIMS with spatio-temporal navigation (navigated REIMS), and to compare performance of navigated REIMS with conventional \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_985\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" title=\"https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683\" target=\"_blank\">https:\/\/aacrjournals.org\/cancerres\/article\/84\/9_Supplement\/PO2-23-07\/743683<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('985','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/patents.google.com\/patent\/US11456165B2\/en\" title=\"https:\/\/patents.google.com\/patent\/US11456165B2\/en\" target=\"blank\">Spatio-temporal localization for mass spectrometry sample analysis<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1011\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1011','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1011\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1011','tp_links')\" title=\"Show links and resources\" 
style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1011\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1011','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1011\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2022n,<br \/>\r\ntitle = {Spatio-temporal localization for mass spectrometry sample analysis},<br \/>\r\nauthor = {Mark Asselin and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/patents.google.com\/patent\/US11456165B2\/en},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-01-01},<br \/>\r\nurldate = {2022-01-01},<br \/>\r\nabstract = {In a method for spatially localizing mass-spectrometry analysis of an analyte derived from an energy event, an electrical device is used to deliver an energy event to a substrate, and the analyte produced is analyzed using mass spectrometry. Electrical signals sent to and received from the electrical device under different modes of operation are sensed and classified according to each different mode of operation. A location of the electrical device is tracked in three dimensions during the energy event, and a processor is used to perform spatial-temporal alignment of the mass-spectrometry, the determined modes of operation of the electrical device, and the tracked location of the electrical device, wherein mass spectrometry data corresponding to the determined modes of the electrical device are identified and localized within the site of the energy event. 
The substrate may be tissue in a surgical site, and the electrical \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1011','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1011\" style=\"display:none;\"><div class=\"tp_abstract_entry\">In a method for spatially localizing mass-spectrometry analysis of an analyte derived from an energy event, an electrical device is used to deliver an energy event to a substrate, and the analyte produced is analyzed using mass spectrometry. Electrical signals sent to and received from the electrical device under different modes of operation are sensed and classified according to each different mode of operation. A location of the electrical device is tracked in three dimensions during the energy event, and a processor is used to perform spatial-temporal alignment of the mass-spectrometry, the determined modes of operation of the electrical device, and the tracked location of the electrical device, wherein mass spectrometry data corresponding to the determined modes of the electrical device are identified and localized within the site of the energy event. 
The substrate may be tissue in a surgical site, and the electrical \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1011','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1011\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/patents.google.com\/patent\/US11456165B2\/en\" title=\"https:\/\/patents.google.com\/patent\/US11456165B2\/en\" target=\"_blank\">https:\/\/patents.google.com\/patent\/US11456165B2\/en<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1011','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Ehrlich, Josh;  Jamzad, Amoon;  Asselin, Mark;  Rodgers, Jessica Robin;  Kaufmann, Martin;  Haidegger, Tamas;  Rudan, John;  Mousavi, Parvin;  Fichtinger, Gabor;  Ungi, Tamas<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808\" title=\"https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808\" target=\"blank\">Sensor-Based Automated Detection of Electrosurgical Cautery States<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Sensors, <\/span><span class=\"tp_pub_additional_volume\">vol. 22, <\/span><span class=\"tp_pub_additional_issue\">iss. 15, <\/span><span class=\"tp_pub_additional_pages\">pp. 
5808, <\/span><span class=\"tp_pub_additional_year\">2022<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_917\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('917','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_917\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('917','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_917\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('917','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_917\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2022i,<br \/>\r\ntitle = {Sensor-Based Automated Detection of Electrosurgical Cautery States},<br \/>\r\nauthor = {Josh Ehrlich and Amoon Jamzad and Mark Asselin and Jessica Robin Rodgers and Martin Kaufmann and Tamas Haidegger and John Rudan and Parvin Mousavi and Gabor Fichtinger and Tamas Ungi},<br \/>\r\nurl = {https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808},<br \/>\r\nyear  = {2022},<br \/>\r\ndate = {2022-01-01},<br \/>\r\njournal = {Sensors},<br \/>\r\nvolume = {22},<br \/>\r\nissue = {15},<br \/>\r\npages = {5808},<br \/>\r\npublisher = {MDPI},<br \/>\r\nabstract = {In computer-assisted surgery, it is typically required to detect when the tool comes into contact with the patient. In activated electrosurgery, this is known as the energy event. By continuously tracking the electrosurgical tools\u2019 location using a navigation system, energy events can help determine locations of sensor-classified tissues. Our objective was to detect the energy event and determine the settings of electrosurgical cautery\u2014robustly and automatically based on sensor data. 
This study aims to demonstrate the feasibility of using the cautery state to detect surgical incisions, without disrupting the surgical workflow. We detected current changes in the wires of the cautery device and grounding pad using non-invasive current sensors and an oscilloscope. An open-source software was implemented to apply machine learning on sensor data to detect energy events and cautery settings. Our methods classified each cautery state at an average accuracy of 95.56% across different tissue types and energy level parameters altered by surgeons during an operation. Our results demonstrate the feasibility of automatically identifying energy events during surgical incisions, which could be an important safety feature in robotic and computer-integrated surgery. This study provides a key step towards locating tissue classifications during breast cancer operations and reducing the rate of positive margins.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('917','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_917\" style=\"display:none;\"><div class=\"tp_abstract_entry\">In computer-assisted surgery, it is typically required to detect when the tool comes into contact with the patient. In activated electrosurgery, this is known as the energy event. By continuously tracking the electrosurgical tools\u2019 location using a navigation system, energy events can help determine locations of sensor-classified tissues. Our objective was to detect the energy event and determine the settings of electrosurgical cautery\u2014robustly and automatically based on sensor data. This study aims to demonstrate the feasibility of using the cautery state to detect surgical incisions, without disrupting the surgical workflow. 
We detected current changes in the wires of the cautery device and grounding pad using non-invasive current sensors and an oscilloscope. An open-source software was implemented to apply machine learning on sensor data to detect energy events and cautery settings. Our methods classified each cautery state at an average accuracy of 95.56% across different tissue types and energy level parameters altered by surgeons during an operation. Our results demonstrate the feasibility of automatically identifying energy events during surgical incisions, which could be an important safety feature in robotic and computer-integrated surgery. This study provides a key step towards locating tissue classifications during breast cancer operations and reducing the rate of positive margins.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('917','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_917\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808\" title=\"https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808\" target=\"_blank\">https:\/\/www.mdpi.com\/1424-8220\/22\/15\/5808<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('917','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Pinter, Csaba;  Lasso, Andras;  Choueib, Saleh;  Asselin, Mark;  Fillion-Robin, Jean-Christophe;  Vimort, Jean-Baptiste;  Martin, Ken;  Jolley, Matthew A.;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1109\/TMRB.2020.2983199\" title=\"SlicerVR for Medical Intervention Training and Planning in Immersive Virtual Reality\" target=\"blank\">SlicerVR for Medical Intervention Training and 
Planning in Immersive Virtual Reality<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">IEEE Transactions on Medical Robotics and Bionics, <\/span><span class=\"tp_pub_additional_volume\">vol. 2, <\/span><span class=\"tp_pub_additional_number\">no. 2, <\/span><span class=\"tp_pub_additional_pages\">pp. 108-117, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_56\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('56','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_56\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('56','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_56\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('56','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_56\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Pinter2020,<br \/>\r\ntitle = {SlicerVR for Medical Intervention Training and Planning in Immersive Virtual Reality},<br \/>\r\nauthor = {Csaba Pinter and Andras Lasso and Saleh Choueib and Mark Asselin and Jean-Christophe 
Fillion-Robin and Jean-Baptiste Vimort and Ken Martin and Matthew A. Jolley and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/03\/Pinter2020a_0.pdf},<br \/>\r\ndoi = {10.1109\/TMRB.2020.2983199},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-03-01},<br \/>\r\nurldate = {2020-03-01},<br \/>\r\njournal = {IEEE Transactions on Medical Robotics and Bionics},<br \/>\r\nvolume = {2},<br \/>\r\nnumber = {2},<br \/>\r\npages = {108-117},<br \/>\r\nabstract = {&lt;p&gt;Virtual reality (VR) provides immersive visualization that has proved to be useful in a variety of medical applications. Currently, however, no free open-source software platform exists that would provide comprehensive support for translational clinical researchers in prototyping experimental VR scenarios in training, planning or guiding medical interventions. By integrating VR functions in 3D Slicer, an established medical image analysis and visualization platform, SlicerVR enables virtual reality experience by a single click. It provides functions to navigate and manipulate the virtual scene, as well as various settings to abate the feeling of motion sickness. SlicerVR allows for shared collaborative VR experience both locally and remotely. We present illustrative scenarios created with SlicerVR in a wide spectrum of applications, including echocardiography, neurosurgery, spine surgery, brachytherapy, intervention training and personalized patient education. 
SlicerVR is freely available under BSD type license as an extension to 3D Slicer and it has been downloaded over 7,800 times at the time of writing this article.&lt;\/p&gt;},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('56','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_56\" style=\"display:none;\"><div class=\"tp_abstract_entry\">&lt;p&gt;Virtual reality (VR) provides immersive visualization that has proved to be useful in a variety of medical applications. Currently, however, no free open-source software platform exists that would provide comprehensive support for translational clinical researchers in prototyping experimental VR scenarios in training, planning or guiding medical interventions. By integrating VR functions in 3D Slicer, an established medical image analysis and visualization platform, SlicerVR enables virtual reality experience by a single click. It provides functions to navigate and manipulate the virtual scene, as well as various settings to abate the feeling of motion sickness. SlicerVR allows for shared collaborative VR experience both locally and remotely. We present illustrative scenarios created with SlicerVR in a wide spectrum of applications, including echocardiography, neurosurgery, spine surgery, brachytherapy, intervention training and personalized patient education. 
SlicerVR is freely available under BSD type license as an extension to 3D Slicer and it has been downloaded over 7,800 times at the time of writing this article.&lt;\/p&gt;<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('56','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_56\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/03\/Pinter2020a_0.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/03\/Pinter2020[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/03\/Pinter2020[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1109\/TMRB.2020.2983199\" title=\"Follow DOI:10.1109\/TMRB.2020.2983199\" target=\"_blank\">doi:10.1109\/TMRB.2020.2983199<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('56','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Jamzad, Amoon;  Kaufmann, Martin;  Rubino, Rachel;  Sedghi, Alireza;  Ungi, Tamas;  Asselin, Mark;  Yam, Scott;  Rudan, John;  Nicol, Christopher;  Fichtinger, Gabor;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2549343\" title=\"Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study\" target=\"blank\">Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> 
<\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 11315, <\/span><span class=\"tp_pub_additional_organization\">SPIE <\/span><span class=\"tp_pub_additional_publisher\">SPIE, <\/span><span class=\"tp_pub_additional_address\">Houston, Texas, United States, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_49\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('49','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_49\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('49','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_49\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2020a,<br \/>\r\ntitle = {Classification of tumor signatures from electrosurgical vapors using mass spectrometry and machine learning: a feasibility study},<br \/>\r\nauthor = {Laura Connolly and Amoon Jamzad and Martin Kaufmann and Rachel Rubino and Alireza Sedghi and Tamas Ungi and Mark Asselin and Scott Yam and John Rudan and Christopher Nicol and Gabor Fichtinger and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2020a.pdf},<br \/>\r\ndoi = {https:\/\/doi.org\/10.1117\/12.2549343},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\nurldate = {2020-01-01},<br \/>\r\nbooktitle = {Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling},<br \/>\r\nvolume = {11315},<br \/>\r\npublisher = {SPIE},<br \/>\r\naddress = {Houston, Texas, United States},<br \/>\r\norganization = {SPIE},<br \/>\r\nkeywords = {},<br 
\/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('49','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_49\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2020a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/https:\/\/doi.org\/10.1117\/12.2549343\" title=\"Follow DOI:https:\/\/doi.org\/10.1117\/12.2549343\" target=\"_blank\">doi:https:\/\/doi.org\/10.1117\/12.2549343<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('49','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Barr, Colton;  Lasso, Andras;  Asselin, Mark;  Pieper, Steve;  Robertson, Faith C.;  Gormley, William B.;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1117\/12.2549723\" title=\"Towards portable image guidance and automatic patient registration using an RGB-D camera and video projector\" target=\"blank\">Towards portable image guidance and automatic patient registration using an RGB-D camera and video projector<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling, <\/span><span 
class=\"tp_pub_additional_volume\">vol. 11315, <\/span><span class=\"tp_pub_additional_organization\">SPIE <\/span><span class=\"tp_pub_additional_publisher\">SPIE, <\/span><span class=\"tp_pub_additional_address\">Houston, Texas, United States, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_57\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('57','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_57\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('57','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_57\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{BarrC2020,<br \/>\r\ntitle = {Towards portable image guidance and automatic patient registration using an RGB-D camera and video projector},<br \/>\r\nauthor = {Colton Barr and Andras Lasso and Mark Asselin and Steve Pieper and Faith C. Robertson and William B. 
Gormley and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Barr2020.pdf},<br \/>\r\ndoi = {10.1117\/12.2549723},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\nurldate = {2020-01-01},<br \/>\r\nbooktitle = {Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions and Modeling},<br \/>\r\nvolume = {11315},<br \/>\r\npublisher = {SPIE},<br \/>\r\naddress = {Houston, Texas, United States},<br \/>\r\norganization = {SPIE},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('57','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_57\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Barr2020.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Barr2020.p[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Barr2020.p[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1117\/12.2549723\" title=\"Follow DOI:10.1117\/12.2549723\" target=\"_blank\">doi:10.1117\/12.2549723<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('57','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Yates, Lauren;  Connolly, Laura;  Jamzad, Amoon;  Asselin, Mark;  Rubino, Rachel;  Yam, Scott;  Ungi, Tamas;  Lasso, Andras;  Nicol, Christopher;  Mousavi, Parvin;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" 
href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" target=\"blank\">Robotic tissue scanning with biophotonic probe<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 11315, <\/span><span class=\"tp_pub_additional_pages\">pp. 330-335, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_952\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('952','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_952\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2020o,<br \/>\r\ntitle = {Robotic tissue scanning with biophotonic probe},<br \/>\r\nauthor = {Lauren Yates and Laura Connolly and Amoon Jamzad and Mark Asselin and Rachel Rubino and Scott Yam and Tamas Ungi and Andras Lasso and Christopher Nicol and Parvin Mousavi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short},<br 
\/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\nvolume = {11315},<br \/>\r\npages = {330-335},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nRaman spectroscopy is an optical imaging technique used to characterize tissue via molecular analysis. The use of Raman spectroscopy for real-time intraoperative tissue classification requires fast analysis with minimal human intervention. In order to have accurate predictions and classifications, a large and reliable database of tissue classifications with spectra results is required. We have developed a system that can be used to generate an efficient scanning path for robotic scanning of tissues using Raman spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA camera mounted to a robotic controller is used to take an image of a tissue slide. The corners of the tissue slides within the sample image are identified, and the size of the slide is calculated. The image is cropped to fit the size of the slide and the image is manipulated to identify the tissue contour. A grid set to fit around the size of the tissue is calculated and a grid \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_952\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nRaman spectroscopy is an optical imaging technique used to characterize tissue via molecular analysis. The use of Raman spectroscopy for real-time intraoperative tissue classification requires fast analysis with minimal human intervention. In order to have accurate predictions and classifications, a large and reliable database of tissue classifications with spectra results is required. 
We have developed a system that can be used to generate an efficient scanning path for robotic scanning of tissues using Raman spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA camera mounted to a robotic controller is used to take an image of a tissue slide. The corners of the tissue slides within the sample image are identified, and the size of the slide is calculated. The image is cropped to fit the size of the slide and the image is manipulated to identify the tissue contour. A grid set to fit around the size of the tissue is calculated and a grid \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_952\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/Robotic-tissue-scanning-with-biophotonic-probe\/10.1117\/12.2549635.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/11315\/1131519\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('952','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Pinter, Csaba;  Lasso, Andras;  Choueib, Saleh;  Asselin, Mark;  Fillion-Robin, Jean-Christophe;  Vimort, Jean-Baptiste;  Martin, Ken;  Jolley, Matthew A;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/\" target=\"blank\">SlicerVR for medical intervention training and planning in immersive virtual 
reality<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">IEEE transactions on medical robotics and bionics, <\/span><span class=\"tp_pub_additional_volume\">vol. 2, <\/span><span class=\"tp_pub_additional_issue\">iss. 2, <\/span><span class=\"tp_pub_additional_pages\">pp. 108-117, <\/span><span class=\"tp_pub_additional_year\">2020<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_757\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('757','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_757\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('757','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_757\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('757','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_757\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2020,<br \/>\r\ntitle = {SlicerVR for medical intervention training and planning in immersive virtual reality},<br \/>\r\nauthor = {Csaba Pinter and Andras Lasso and Saleh Choueib and Mark Asselin and Jean-Christophe Fillion-Robin and Jean-Baptiste Vimort and Ken Martin and Matthew A Jolley and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/},<br \/>\r\nyear  = {2020},<br \/>\r\ndate = {2020-01-01},<br \/>\r\njournal = {IEEE transactions on medical robotics and bionics},<br \/>\r\nvolume = {2},<br \/>\r\nissue = {2},<br \/>\r\npages = {108-117},<br \/>\r\npublisher = {IEEE},<br \/>\r\nabstract = {Virtual reality (VR) provides immersive visualization that has 
proved to be useful in a variety of medical applications. Currently, however, no free open-source software platform exists that would provide comprehensive support for translational clinical researchers in prototyping experimental VR scenarios in training, planning or guiding medical interventions. By integrating VR functions in 3D Slicer, an established medical image analysis and visualization platform, SlicerVR enables virtual reality experience by a single click. It provides functions to navigate and manipulate the virtual scene, as well as various settings to abate the feeling of motion sickness. SlicerVR allows for shared collaborative VR experience both locally and remotely. We present illustrative scenarios created with SlicerVR in a wide spectrum of applications, including echocardiography, neurosurgery, spine surgery, brachytherapy, intervention \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('757','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_757\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Virtual reality (VR) provides immersive visualization that has proved to be useful in a variety of medical applications. Currently, however, no free open-source software platform exists that would provide comprehensive support for translational clinical researchers in prototyping experimental VR scenarios in training, planning or guiding medical interventions. By integrating VR functions in 3D Slicer, an established medical image analysis and visualization platform, SlicerVR enables virtual reality experience by a single click. It provides functions to navigate and manipulate the virtual scene, as well as various settings to abate the feeling of motion sickness. SlicerVR allows for shared collaborative VR experience both locally and remotely. 
We present illustrative scenarios created with SlicerVR in a wide spectrum of applications, including echocardiography, neurosurgery, spine surgery, brachytherapy, intervention \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('757','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_757\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/\" title=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/\" target=\"_blank\">https:\/\/ieeexplore.ieee.org\/abstract\/document\/9047949\/<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('757','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Pinter, Csaba;  Lasso, Andras;  Asselin, Mark;  Fillion-Robin, Jean-Christophe;  Vimort, Jean-Baptiste;  Martin, Ken;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019a_0.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019a_0.pdf\" target=\"blank\">SlicerVR for image-guided therapy planning in immersive virtual reality<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">The 12th Hamlyn Symposium on Medical Robotics, 23-26 June 2019, Imperial College, London, UK, <\/span><span class=\"tp_pub_additional_address\">London, UK, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_80\" class=\"tp_show\" 
onclick=\"teachpress_pub_showhide('80','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_80\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('80','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_80\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Pinter2019a,<br \/>\r\ntitle = {SlicerVR for image-guided therapy planning in immersive virtual reality},<br \/>\r\nauthor = {Csaba Pinter and Andras Lasso and Mark Asselin and Jean-Christophe Fillion-Robin and Jean-Baptiste Vimort and Ken Martin and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019a_0.pdf},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-06-01},<br \/>\r\nurldate = {2019-06-01},<br \/>\r\nbooktitle = {The 12th Hamlyn Symposium on Medical Robotics, 23-26 June 2019, Imperial College, London, UK},<br \/>\r\npages = {91-92},<br \/>\r\naddress = {London, UK},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('80','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_80\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019a_0.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Pinter2019[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('80','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Laframboise, Jacob;  Ungi, Tamas;  Lasso, Andras;  Asselin, Mark;  Holden, M.;  Tan, Pearl;  Hookey, Lawrence;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019a.pdf\" target=\"blank\">Analyzing the curvature of the colon in different patient positions<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 10951, <\/span><span class=\"tp_pub_additional_address\">San Diego, California, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_61\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('61','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_61\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('61','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_61\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Laframboise2019a,<br \/>\r\ntitle = {Analyzing the curvature of the colon in different patient positions},<br \/>\r\nauthor = {Jacob Laframboise and Tamas Ungi and Andras Lasso and Mark Asselin and M. 
Holden and Pearl Tan and Lawrence Hookey and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019a.pdf},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-03-01},<br \/>\r\nurldate = {2019-03-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\nvolume = {10951},<br \/>\r\naddress = {San Diego, California},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('61','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_61\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframbois[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframbois[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('61','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Ungi, Tamas;  Lasso, Andras;  Vaughan, Thomas;  Asselin, Mark;  Mousavi, Parvin;  Yam, Scott;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1117\/12.2512481\" title=\"Mechanically-Controlled Spectroscopic Imaging for Tissue Classification\" target=\"blank\">Mechanically-Controlled Spectroscopic Imaging for Tissue Classification<\/a> <span class=\"tp_pub_type tp_  
conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 10951, <\/span><span class=\"tp_pub_additional_address\">San Diego, California, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_72\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('72','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_72\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('72','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_72\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Connolly2019a,<br \/>\r\ntitle = {Mechanically-Controlled Spectroscopic Imaging for Tissue Classification},<br \/>\r\nauthor = {Laura Connolly and Tamas Ungi and Andras Lasso and Thomas Vaughan and Mark Asselin and Parvin Mousavi and Scott Yam and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2019a_3.pdf},<br \/>\r\ndoi = {10.1117\/12.2512481},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-03-01},<br \/>\r\nurldate = {2019-03-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\nvolume = {10951},<br \/>\r\naddress = {San Diego, California},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('72','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_72\" 
style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly2019a_3.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Connolly20[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1117\/12.2512481\" title=\"Follow DOI:10.1117\/12.2512481\" target=\"_blank\">doi:10.1117\/12.2512481<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('72','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Laframboise, Jacob;  Ungi, Tamas;  Lasso, Andras;  Asselin, Mark;  Holden, M.;  Tan, Pearl;  Hookey, Lawrence;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019b.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019b.pdf\" target=\"blank\">Quantifying the effect of patient position on the curvature of colons<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">17th Annual Imaging Network Ontario Symposium (ImNO), <\/span><span class=\"tp_pub_additional_publisher\">Imaging Network Ontario (ImNO), <\/span><span class=\"tp_pub_additional_address\">London, Ontario, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_77\" 
class=\"tp_show\" onclick=\"teachpress_pub_showhide('77','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_77\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('77','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_77\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Laframboise2019b,<br \/>\r\ntitle = {Quantifying the effect of patient position on the curvature of colons},<br \/>\r\nauthor = {Jacob Laframboise and Tamas Ungi and Andras Lasso and Mark Asselin and M. Holden and Pearl Tan and Lawrence Hookey and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019b.pdf},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nurldate = {2019-01-01},<br \/>\r\nbooktitle = {17th Annual Imaging Network Ontario Symposium (ImNO)},<br \/>\r\npublisher = {Imaging Network Ontario (ImNO)},<br \/>\r\naddress = {London, Ontario},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('77','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_77\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframboise2019b.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframbois[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Laframbois[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('77','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Connolly, Laura;  Ungi, Tamas;  Lasso, Andras;  Vaughan, Thomas;  Asselin, Mark;  Mousavi, Parvin;  Yam, Scott;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" target=\"blank\">Mechanically controlled spectroscopic imaging for tissue classification<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 10951, <\/span><span class=\"tp_pub_additional_pages\">pp. 
632-640, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_956\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('956','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_956\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2019n,<br \/>\r\ntitle = {Mechanically controlled spectroscopic imaging for tissue classification},<br \/>\r\nauthor = {Laura Connolly and Tamas Ungi and Andras Lasso and Thomas Vaughan and Mark Asselin and Parvin Mousavi and Scott Yam and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nvolume = {10951},<br \/>\r\npages = {632-640},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nRaman Spectroscopy is amongst several optical imaging techniques that have the ability to characterize tissue non-invasively. To use these technologies for intraoperative tissue classification, fast and efficient analysis of optical data is required with minimal operator intervention. Additionally, there is a need for a reliable database of optical signatures to account for variable conditions. 
We developed a software system with an inexpensive, flexible mechanical framework to facilitate automated scanning of tissue and validate spectroscopic scans with histologic ground truths. This system will be used, in the future, to train a machine learning algorithm to distinguish between different tissue types using Raman Spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA sample of chicken breast tissue is mounted to a microscope slide following a biopsy of fresh frozen tissue. Landmarks for registration and evaluation are marked on the \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_956\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nRaman Spectroscopy is amongst several optical imaging techniques that have the ability to characterize tissue non-invasively. To use these technologies for intraoperative tissue classification, fast and efficient analysis of optical data is required with minimal operator intervention. Additionally, there is a need for a reliable database of optical signatures to account for variable conditions. We developed a software system with an inexpensive, flexible mechanical framework to facilitate automated scanning of tissue and validate spectroscopic scans with histologic ground truths. This system will be used, in the future, to train a machine learning algorithm to distinguish between different tissue types using Raman Spectroscopy. <br \/>\r\nMETHODS <br \/>\r\nA sample of chicken breast tissue is mounted to a microscope slide following a biopsy of fresh frozen tissue. 
Landmarks for registration and evaluation are marked on the \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_956\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/Mechanically-controlled-spectroscopic-imaging-for-tissue-classification\/10.1117\/12.2512481.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512E\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('956','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Kaufmann, Martin;  Wiercigroch, Julia;  Ungi, Tamas;  Lasso, Andras;  Rudan, John;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/Navigated-real-time-molecular-analysis-in-the-operating-theatre\/10.1117\/12.2512586.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/Navigated-real-time-molecular-analysis-in-the-operating-theatre\/10.1117\/12.2512586.short\" target=\"blank\">Navigated real-time molecular analysis in the operating theatre: demonstration of concept<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 10951, <\/span><span class=\"tp_pub_additional_pages\">pp. 
618-624, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_883\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('883','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_883\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('883','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_883\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('883','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_883\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2019h,<br \/>\r\ntitle = {Navigated real-time molecular analysis in the operating theatre: demonstration of concept},<br \/>\r\nauthor = {Mark Asselin and Martin Kaufmann and Julia Wiercigroch and Tamas Ungi and Andras Lasso and John Rudan and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/Navigated-real-time-molecular-analysis-in-the-operating-theatre\/10.1117\/12.2512586.short},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nvolume = {10951},<br \/>\r\npages = {618-624},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {PURPOSE <br \/>\r\nIn the operating theatre surgeons are accustomed to using spatially navigated tools in conjunction with standard clinical imaging during a procedure. This gives them a good idea where they are in the patients\u2019 anatomy but doesn\u2019t provide information about the type of tissue they are dissecting. 
In this paper we demonstrate an integrated system consisting of a spatially navigated surgical electrocautery combined with real-time molecular analysis of the dissected tissue using mass spectrometry. <br \/>\r\nMETHODS <br \/>\r\nUsing the 3D Slicer software package, we have integrated a commercially available neurosurgical navigation system with an intra-operative mass spectrometer (colloquially referred to as the intelligent knife, or iKnife) that analyzes the charged ions in the smoke created during cauterization. We demonstrate this system using a simulated patient comprised of an MRI scan from a brain cancer patient \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('883','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_883\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE <br \/>\r\nIn the operating theatre surgeons are accustomed to using spatially navigated tools in conjunction with standard clinical imaging during a procedure. This gives them a good idea where they are in the patients\u2019 anatomy but doesn\u2019t provide information about the type of tissue they are dissecting. In this paper we demonstrate an integrated system consisting of a spatially navigated surgical electrocautery combined with real-time molecular analysis of the dissected tissue using mass spectrometry. <br \/>\r\nMETHODS <br \/>\r\nUsing the 3D Slicer software package, we have integrated a commercially available neurosurgical navigation system with an intra-operative mass spectrometer (colloquially referred to as the intelligent knife, or iKnife) that analyzes the charged ions in the smoke created during cauterization. 
We demonstrate this system using a simulated patient comprised of an MRI scan from a brain cancer patient \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('883','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_883\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/Navigated-real-time-molecular-analysis-in-the-operating-theatre\/10.1117\/12.2512586.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/10951\/109512C\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('883','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Wu, Victoria;  Asselin, Mark;  Ungi, Tamas;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/doi.org\/10.1007\/s11548-019-01969-3\" title=\"Detection of Spinal Ultrasound Landmarks Using Convolutional Neural Networks\" target=\"blank\">Detection of Spinal Ultrasound Landmarks Using Convolutional Neural Networks<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">33rd International Congress &amp; Exhibition on Computer Assisted Radiology and Surgery (CARS), <\/span><span class=\"tp_pub_additional_volume\">vol. 
14, <\/span><span class=\"tp_pub_additional_publisher\">Int J CARS, <\/span><span class=\"tp_pub_additional_address\">Rennes, France, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_68\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('68','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_68\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('68','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_68\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Wu2019b,<br \/>\r\ntitle = {Detection of Spinal Ultrasound Landmarks Using Convolutional Neural Networks},<br \/>\r\nauthor = {Victoria Wu and Mark Asselin and Tamas Ungi and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Wu2019b.pdf},<br \/>\r\ndoi = {10.1007\/s11548-019-01969-3},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nurldate = {2019-01-01},<br \/>\r\nbooktitle = {33rd International Congress & Exhibition on Computer Assisted Radiology and Surgery (CARS)},<br \/>\r\nvolume = {14},<br \/>\r\npublisher = {Int J CARS},<br \/>\r\naddress = {Rennes, France},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('68','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_68\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Wu2019b.pdf\" 
title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Wu2019b.pd[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Wu2019b.pd[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/doi.org\/10.1007\/s11548-019-01969-3\" title=\"Follow DOI:10.1007\/s11548-019-01969-3\" target=\"_blank\">doi:10.1007\/s11548-019-01969-3<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('68','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Lund, Shaun;  Vaughan, Thomas;  Ungi, Tamas;  Lasso, Andras;  Asselin, Mark;  Yeo, Caitlin;  Engel, C. Jay;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\">Controlling virtual views in navigated breast conserving surgery <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_67\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('67','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_67\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('67','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_67\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Lund2019a,<br \/>\r\ntitle = {Controlling virtual views in navigated breast conserving surgery},<br \/>\r\nauthor = {Shaun Lund and Thomas Vaughan and Tamas Ungi and Andras Lasso and Mark Asselin and Caitlin 
Yeo and C. Jay Engel and Gabor Fichtinger},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nurldate = {2019-01-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019},<br \/>\r\nabstract = {&lt;p&gt;&lt;strong&gt;PURPOSE&lt;\/strong&gt;: Lumpectomy is the resection of a tumor in the breast while retaining as much healthy tissue as possible. Navigated lumpectomy seeks to improve on the traditional technique by employing computer guidance to achieve the complete excision of the cancer with optimal retention of healthy tissue. Setting up navigation in the OR relies on the manual interactions of a trained technician to align three-dimensional virtual views to the patient\u2019s physical position and maintain their alignment throughout surgery. This work develops automatic alignment tools to improve the operability of navigation software for lumpectomies.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;METHODS&lt;\/strong&gt;: Preset view buttons were developed to refine view setup to a single interaction. These buttons were tested by measuring the reduction in setup time and the number of manual interactions avoided through their use. An auto-center feature was created to ensure that three-dimensional models of anatomy and instruments were in the center of view throughout surgery. Recorded data from 32 lumpectomy cases were replayed and the number of auto-center view shifts was counted from the first cautery incision until the completion of the excision of cancerous tissue.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;RESULTS&lt;\/strong&gt;: View setup can now be performed in a single interaction compared to an average of 13 interactions (taking 83 seconds) when performed manually. The auto-center feature was activated an average of 33 times in the cases studied (n=32).&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;CONCLUSION&lt;\/strong&gt;: The auto-center feature enhances the operability of the surgical navigation system, reducing the number of manual interactions required by a technician during the surgery. This feature, along with preset camera view options, is instrumental in the shift towards a completely surgeon-operable navigated lumpectomy system.&lt;\/p&gt;},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('67','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_67\" style=\"display:none;\"><div class=\"tp_abstract_entry\">&lt;p&gt;&lt;strong&gt;PURPOSE&lt;\/strong&gt;: Lumpectomy is the resection of a tumor in the breast while retaining as much healthy tissue as possible. Navigated lumpectomy seeks to improve on the traditional technique by employing computer guidance to achieve the complete excision of the cancer with optimal retention of healthy tissue. Setting up navigation in the OR relies on the manual interactions of a trained technician to align three-dimensional virtual views to the patient\u2019s physical position and maintain their alignment throughout surgery. This work develops automatic alignment tools to improve the operability of navigation software for lumpectomies.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;METHODS&lt;\/strong&gt;: Preset view buttons were developed to refine view setup to a single interaction. These buttons were tested by measuring the reduction in setup time and the number of manual interactions avoided through their use. An auto-center feature was created to ensure that three-dimensional models of anatomy and instruments were in the center of view throughout surgery. Recorded data from 32 lumpectomy cases were replayed and the number of auto-center view shifts was counted from the first cautery incision until the completion of the excision of cancerous tissue.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;RESULTS&lt;\/strong&gt;: View setup can now be performed in a single interaction compared to an average of 13 interactions (taking 83 seconds) when performed manually. The auto-center feature was activated an average of 33 times in the cases studied (n=32).&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;CONCLUSION&lt;\/strong&gt;: The auto-center feature enhances the operability of the surgical navigation system, reducing the number of manual interactions required by a technician during the surgery. 
This feature, along with preset camera view options, is instrumental in the shift towards a completely surgeon-operable navigated lumpectomy system.&lt;\/p&gt;<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('67','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Kaufmann, Martin;  Wiercigroch, Julia;  Ungi, Tamas;  Lasso, Andras;  Rudan, John;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\">Navigated real-time molecular analysis in the operating theatre, demonstration of concept <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_74\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('74','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_74\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('74','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_74\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Asselin2019a,<br \/>\r\ntitle = {Navigated real-time molecular analysis in the operating theatre, demonstration of concept},<br \/>\r\nauthor = {Mark Asselin and Martin Kaufmann and Julia Wiercigroch and Tamas Ungi and Andras Lasso and John Rudan and Gabor Fichtinger},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nurldate = {2019-01-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019},<br \/>\r\nabstract = {&lt;p&gt;&lt;strong&gt;PURPOSE&lt;\/strong&gt;: In the operating theatre surgeons are accustomed to using spatially navigated tools in conjunction with standard clinical imaging during a procedure. This gives them a good idea where they are in the patients\u2019 anatomy but doesn\u2019t provide information about the type of tissue they are dissecting. In this paper we demonstrate an integrated system consisting of a spatially navigated surgical electrocautery combined with real-time molecular analysis of the dissected tissue using mass spectrometry.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;METHODS&lt;\/strong&gt;: Using the 3D Slicer software package, we have integrated a commercially available neurosurgical navigation system with an intra-operative mass spectrometer (colloquially referred to as the intelligent knife, or iKnife) that analyzes the charged ions in the smoke created during cauterization. We demonstrate this system using a simulated patient comprised of an MRI scan from a brain cancer patient deformably registered to a plastic skull model. On the skull model we placed porcine and bovine tissues to simulate cancerous and healthy tissue, respectively. We built a PCA\/LDA model to distinguish between these tissue types. The tissue classifications were displayed in a spatially localized manner in the pre-operative imaging, in both 2D and 3D views.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;RESULTS&lt;\/strong&gt;: We have demonstrated the feasibility of performing spatially navigated intra-operative analysis of tissues by mass spectrometry. We show that machine learning can classify our sample tissues, with an average computed confidence of 99.37% for porcine tissue and 99.36% for bovine tissue.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;CONCLUSION&lt;\/strong&gt;: In this paper we demonstrate a proof of concept system for navigated intra-operative molecular analysis. This system may enable intra-operative awareness of spatially localized tissue classification during dissection, information that is especially useful in tumor surgeries where margins may not be visible to the unassisted eye.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;Keywords&lt;\/strong&gt;: image guided therapy, intra-operative mass spectrometry, iKnife, 3D Slicer, open-source, rapid evaporative ionization mass spectrometry (REIMS)&lt;\/p&gt;},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('74','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_74\" style=\"display:none;\"><div class=\"tp_abstract_entry\">&lt;p&gt;&lt;strong&gt;PURPOSE&lt;\/strong&gt;: In the operating theatre surgeons are accustomed to using spatially navigated tools in conjunction with standard clinical imaging during a procedure. This gives them a good idea where they are in the patients\u2019 anatomy but doesn\u2019t provide information about the type of tissue they are dissecting. In this paper we demonstrate an integrated system consisting of a spatially navigated surgical electrocautery combined with real-time molecular analysis of the dissected tissue using mass spectrometry.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;METHODS&lt;\/strong&gt;: Using the 3D Slicer software package, we have integrated a commercially available neurosurgical navigation system with an intra-operative mass spectrometer (colloquially referred to as the intelligent knife, or iKnife) that analyzes the charged ions in the smoke created during cauterization. We demonstrate this system using a simulated patient comprised of an MRI scan from a brain cancer patient deformably registered to a plastic skull model. On the skull model we placed porcine and bovine tissues to simulate cancerous and healthy tissue, respectively. We built a PCA\/LDA model to distinguish between these tissue types. The tissue classifications were displayed in a spatially localized manner in the pre-operative imaging, in both 2D and 3D views.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;RESULTS&lt;\/strong&gt;: We have demonstrated the feasibility of performing spatially navigated intra-operative analysis of tissues by mass spectrometry. We show that machine learning can classify our sample tissues, with an average computed confidence of 99.37% for porcine tissue and 99.36% for bovine tissue.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;CONCLUSION&lt;\/strong&gt;: In this paper we demonstrate a proof of concept system for navigated intra-operative molecular analysis. This system may enable intra-operative awareness of spatially localized tissue classification during dissection, information that is especially useful in tumor surgeries where margins may not be visible to the unassisted eye.&lt;br \/&gt; <br \/>\r\n&lt;strong&gt;Keywords&lt;\/strong&gt;: image guided therapy, intra-operative mass spectrometry, iKnife, 3D Slicer, open-source, rapid evaporative ionization mass spectrometry (REIMS)&lt;\/p&gt;<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('74','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Perrin, Sydney;  Baum, Zachary M C;  Asselin, Mark;  Underwood, Grace;  Choueib, Saleh;  Lia, H.;  Ungi, Tamas;  Lasso, Andras;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019a.pdf\" target=\"blank\">Reproducibility of freehand calibrations for ultrasound-guided needle navigation<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_volume\">vol. 
10951, <\/span><span class=\"tp_pub_additional_address\">San Diego, California, <\/span><span class=\"tp_pub_additional_year\">2019<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_78\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('78','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_78\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('78','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_78\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Perrin2019a,<br \/>\r\ntitle = {Reproducibility of freehand calibrations for ultrasound-guided needle navigation},<br \/>\r\nauthor = {Sydney Perrin and Zachary M C Baum and Mark Asselin and Grace Underwood and Saleh Choueib and H. Lia and Tamas Ungi and Andras Lasso and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019a.pdf},<br \/>\r\nyear  = {2019},<br \/>\r\ndate = {2019-01-01},<br \/>\r\nurldate = {2019-01-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\nvolume = {10951},<br \/>\r\naddress = {San Diego, California},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('78','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_78\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019a.pdf\" 
title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Perrin2019[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('78','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Lasso, Andras;  Ungi, Tamas;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018a.pdf\" target=\"blank\">Towards webcam-based tracking for interventional navigation<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">SPIE Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling, <\/span><span class=\"tp_pub_additional_address\">Houston, Texas, <\/span><span class=\"tp_pub_additional_year\">2018<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_resource_link\"><a id=\"tp_links_sh_106\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('106','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_106\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('106','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_106\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Asselin2018a,<br \/>\r\ntitle = {Towards webcam-based tracking for interventional navigation},<br \/>\r\nauthor = {Mark Asselin and Andras Lasso and Tamas Ungi and 
Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018a.pdf},<br \/>\r\nyear  = {2018},<br \/>\r\ndate = {2018-03-01},<br \/>\r\nurldate = {2018-03-01},<br \/>\r\nbooktitle = {SPIE Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling},<br \/>\r\naddress = {Houston, Texas},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('106','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_106\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018a.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin201[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin201[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('106','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Lasso, Andras;  Nam, Hannah H;  Dinh, Patrick V;  Pinter, Csaba;  Fillion-Robin, Jean-Christophe;  Pieper, Steve;  Jhaveri, Sankhesh;  Vimort, Jean-Baptiste;  Martin, Ken;  Asselin, Mark;  McGowan, Francis X;  Kikinis, Ron;  Fichtinger, Gabor;  Jolley, Matthew A<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract\" title=\"https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract\" target=\"blank\">Interaction with volume-rendered three-dimensional echocardiographic images in 
virtual reality<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">Journal of the American Society of Echocardiography, <\/span><span class=\"tp_pub_additional_volume\">vol. 31, <\/span><span class=\"tp_pub_additional_issue\">iss. 10, <\/span><span class=\"tp_pub_additional_pages\">pp. 1158-1160, <\/span><span class=\"tp_pub_additional_year\">2018<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_789\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('789','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_789\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('789','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_789\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('789','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_789\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2018f,<br \/>\r\ntitle = {Interaction with volume-rendered three-dimensional echocardiographic images in virtual reality},<br \/>\r\nauthor = {Andras Lasso and Hannah H Nam and Patrick V Dinh and Csaba Pinter and Jean-Christophe Fillion-Robin and Steve Pieper and Sankhesh Jhaveri and Jean-Baptiste Vimort and Ken Martin and Mark Asselin and Francis X McGowan and Ron Kikinis and Gabor Fichtinger and Matthew A Jolley},<br \/>\r\nurl = {https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract},<br \/>\r\nyear  = {2018},<br \/>\r\ndate = {2018-01-01},<br \/>\r\njournal = {Journal of the American Society of Echocardiography},<br \/>\r\nvolume = {31},<br \/>\r\nissue = {10},<br \/>\r\npages 
= {1158-1160},<br \/>\r\npublisher = {Elsevier},<br \/>\r\nabstract = {Three-dimensional (3D) imaging is increasingly important in echocardiography. However, viewing of 3D images on a flat, two-dimensional screen is a barrier to comprehension of latent information. There have been previous attempts to visualize the full 3D nature of the data, but they have not been widely adopted. For example, 3D printing offers realistic interaction but is time consuming, has limited means for the observer to move into or through the model, and is not yet practical for routine clinical use. Furthermore, the heart beats, and 3D printed models are static. Stereoscopic viewing on 2D screens (as at a movie theater) is possible but is expensive, may not provide an immersive experience, and does not have integrated 3D input devices (controllers). <br \/>\r\nStereoscopic virtual reality (VR) is developing rapidly but is being driven by the video gaming industry, with features not directly applicable to the visualization of \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('789','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_789\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Three-dimensional (3D) imaging is increasingly important in echocardiography. However, viewing of 3D images on a flat, two-dimensional screen is a barrier to comprehension of latent information. There have been previous attempts to visualize the full 3D nature of the data, but they have not been widely adopted. For example, 3D printing offers realistic interaction but is time consuming, has limited means for the observer to move into or through the model, and is not yet practical for routine clinical use. Furthermore, the heart beats, and 3D printed models are static. 
Stereoscopic viewing on 2D screens (as at a movie theater) is possible but is expensive, may not provide an immersive experience, and does not have integrated 3D input devices (controllers). <br \/>\r\nStereoscopic virtual reality (VR) is developing rapidly but is being driven by the video gaming industry, with features not directly applicable to the visualization of \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('789','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_789\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract\" title=\"https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract\" target=\"_blank\">https:\/\/www.onlinejase.com\/article\/S0894-7317(18)30343-2\/abstract<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('789','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Ungi, Tamas;  Lasso, Andras;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2\" title=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2\" target=\"blank\">A training tool for ultrasound-guided central line insertion with webcam-based position tracking<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_pages\">pp. 
12-20, <\/span><span class=\"tp_pub_additional_year\">2018<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_900\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('900','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_900\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('900','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_900\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('900','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_900\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2018n,<br \/>\r\ntitle = {A training tool for ultrasound-guided central line insertion with webcam-based position tracking},<br \/>\r\nauthor = {Mark Asselin and Tamas Ungi and Andras Lasso and Gabor Fichtinger},<br \/>\r\nurl = {https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2},<br \/>\r\nyear  = {2018},<br \/>\r\ndate = {2018-01-01},<br \/>\r\npages = {12-20},<br \/>\r\npublisher = {Springer International Publishing},<br \/>\r\nabstract = {PURPOSE: This paper describes an open-source ultrasound-guided central line insertion training system. Modern clinical guidelines are increasingly recommending ultrasound guidance for this procedure due to the decrease in morbidity it provides. However, there are no adequate low-cost systems for helping new clinicians train their inter-hand coordination for this demanding procedure. METHODS: This paper details a training platform which can be recreated with any standard ultrasound machine using inexpensive components. We describe the hardware, software, and calibration procedures with the intention that a reader can recreate this system themselves. 
RESULTS: The reproducibility and accuracy of the ultrasound calibration for this system was examined. We found that across the ultrasound image the calibration error was less than 2 mm. In a small feasibility study, two participants performed 5 \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('900','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_900\" style=\"display:none;\"><div class=\"tp_abstract_entry\">PURPOSE: This paper describes an open-source ultrasound-guided central line insertion training system. Modern clinical guidelines are increasingly recommending ultrasound guidance for this procedure due to the decrease in morbidity it provides. However, there are no adequate low-cost systems for helping new clinicians train their inter-hand coordination for this demanding procedure. METHODS: This paper details a training platform which can be recreated with any standard ultrasound machine using inexpensive components. We describe the hardware, software, and calibration procedures with the intention that a reader can recreate this system themselves. RESULTS: The reproducibility and accuracy of the ultrasound calibration for this system was examined. We found that across the ultrasound image the calibration error was less than 2 mm. 
In a small feasibility study, two participants performed 5 \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('900','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_900\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2\" title=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2\" target=\"_blank\">https:\/\/link.springer.com\/chapter\/10.1007\/978-3-030-01045-4_2<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('900','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Lasso, Andras;  Nam, Hannah H.;  Dinh, Patrick V.;  Pinter, Csaba;  Fillion-Robin, Jean-Christophe;  Pieper, Steve;  Jhaveri, Sankhesh;  Vimort, Jean-Baptiste;  Martin, Ken;  Asselin, Mark;  McGowan, Francis X.;  Kikinis, Ron;  Fichtinger, Gabor;  Jolley, Matthew A.<\/p><p class=\"tp_pub_title\">Interaction with Volume-Rendered Three-Dimensional Echocardiographic Images in Virtual Reality <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">J Am Soc Echocardiogr, <\/span><span class=\"tp_pub_additional_volume\">vol. 31, <\/span><span class=\"tp_pub_additional_number\">no. 10, <\/span><span class=\"tp_pub_additional_pages\">pp. 
1158-1160, <\/span><span class=\"tp_pub_additional_year\">2018<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_95\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('95','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_95\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('95','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_95\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{Lasso2018,<br \/>\r\ntitle = {Interaction with Volume-Rendered Three-Dimensional Echocardiographic Images in Virtual Reality},<br \/>\r\nauthor = {Andras Lasso and Hannah H. Nam and Patrick V. Dinh and Csaba Pinter and Jean-Christophe Fillion-Robin and Steve Pieper and Sankhesh Jhaveri and Jean-Baptiste Vimort and Ken Martin and Mark Asselin and Francis X. McGowan and Ron Kikinis and Gabor Fichtinger and Matthew A. Jolley},<br \/>\r\nyear  = {2018},<br \/>\r\ndate = {2018-01-01},<br \/>\r\nurldate = {2018-01-01},<br \/>\r\njournal = {J Am Soc Echocardiogr},<br \/>\r\nvolume = {31},<br \/>\r\nnumber = {10},<br \/>\r\npages = {1158-1160},<br \/>\r\nabstract = {&lt;p&gt;Three-dimensional (3D) imaging is increasingly important in echocardiography. However, viewing of 3D images on a flat, two-dimensional screen is a barrier to comprehension of latent information. There have been previous attempts to visualize the full 3D nature of the data, but they have not been widely adopted. For example, 3D printing offers realistic interaction but is time consuming, has limited means for the observer to move into or through the model, and is not yet practical for routine clinical use. Furthermore, the heart beats, and 3D printed models are static. 
Stereoscopic viewing on 2D screens (as at a movie theater) is possible but is expensive, may not provide an immersive experience, and does not have integrated 3D input devices (controllers).&lt;\/p&gt;},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('95','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_95\" style=\"display:none;\"><div class=\"tp_abstract_entry\">&lt;p&gt;Three-dimensional (3D) imaging is increasingly important in echocardiography. However, viewing of 3D images on a flat, two-dimensional screen is a barrier to comprehension of latent information. There have been previous attempts to visualize the full 3D nature of the data, but they have not been widely adopted. For example, 3D printing offers realistic interaction but is time consuming, has limited means for the observer to move into or through the model, and is not yet practical for routine clinical use. Furthermore, the heart beats, and 3D printed models are static. 
Stereoscopic viewing on 2D screens (as at a movie theater) is possible but is expensive, may not provide an immersive experience, and does not have integrated 3D input devices (controllers).&lt;\/p&gt;<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('95','tp_abstract')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_conference\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Asselin, Mark;  Ungi, Tamas;  Lasso, Andras;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2\" title=\"A Training Tool for Ultrasound-Guided Central Line Insertion with Webcam-Based Position Tracking\" target=\"blank\">A Training Tool for Ultrasound-Guided Central Line Insertion with Webcam-Based Position Tracking<\/a> <span class=\"tp_pub_type tp_  conference\">Conference<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_booktitle\">Simulation, Image Processing, and Ultrasound Systems for Assisted Diagnosis and Navigation, <\/span><span class=\"tp_pub_additional_year\">2018<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_107\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('107','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_107\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('107','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_107\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('107','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_107\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@conference{Asselin2018c,<br \/>\r\ntitle = {A Training Tool for 
Ultrasound-Guided Central Line Insertion with Webcam-Based Position Tracking},<br \/>\r\nauthor = {Mark Asselin and Tamas Ungi and Andras Lasso and Gabor Fichtinger},<br \/>\r\nurl = {http:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2<br \/>\r\nhttps:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018c.pdf},<br \/>\r\ndoi = {10.1007\/978-3-030-01045-4_2},<br \/>\r\nyear  = {2018},<br \/>\r\ndate = {2018-01-01},<br \/>\r\nurldate = {2018-01-01},<br \/>\r\nbooktitle = {Simulation, Image Processing, and Ultrasound Systems for Assisted Diagnosis and Navigation},<br \/>\r\nabstract = {&lt;p&gt;PURPOSE: This paper describes an open-source ultrasound-guided central line insertion training system. Modern clinical guidelines are increasingly recommending ultrasound guidance for this procedure due to the decrease in morbidity it provides. However, there are no adequate low-cost systems for helping new clinicians train their inter-hand coordination for this demanding procedure. METHODS: This paper details a training platform which can be recreated with any standard ultrasound machine using inexpensive components. We describe the hardware, software, and calibration procedures with the intention that a reader can recreate this system themselves. RESULTS: The reproducibility and accuracy of the ultrasound calibration for this system was examined. We found that across the ultrasound image the calibration error was less than 2 mm. In a small feasibility study, two participants performed 5 needle insertions each with an average of slightly above 2 mm error. 
CONCLUSION: We conclude that the accuracy of the system is sufficient for clinician training.&lt;\/p&gt;},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {conference}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('107','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_107\" style=\"display:none;\"><div class=\"tp_abstract_entry\">&lt;p&gt;PURPOSE: This paper describes an open-source ultrasound-guided central line insertion training system. Modern clinical guidelines are increasingly recommending ultrasound guidance for this procedure due to the decrease in morbidity it provides. However, there are no adequate low-cost systems for helping new clinicians train their inter-hand coordination for this demanding procedure. METHODS: This paper details a training platform which can be recreated with any standard ultrasound machine using inexpensive components. We describe the hardware, software, and calibration procedures with the intention that a reader can recreate this system themselves. RESULTS: The reproducibility and accuracy of the ultrasound calibration for this system was examined. We found that across the ultrasound image the calibration error was less than 2&amp;nbsp;mm. In a small feasibility study, two participants performed 5 needle insertions each with an average of slightly above 2&amp;nbsp;mm error. 
CONCLUSION: We conclude that the accuracy of the system is sufficient for clinician training.&lt;\/p&gt;<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('107','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_107\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"http:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2\" title=\"http:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2\" target=\"_blank\">http:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2<\/a><\/li><li><i class=\"fas fa-file-pdf\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin2018c.pdf\" title=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin201[...]\" target=\"_blank\">https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/Asselin201[...]<\/a><\/li><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1007\/978-3-030-01045-4_2\" title=\"Follow DOI:10.1007\/978-3-030-01045-4_2\" target=\"_blank\">doi:10.1007\/978-3-030-01045-4_2<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('107','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><\/div><\/div>\n<\/div>\n","protected":false},"featured_media":732,"template":"","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-2342","qsc_member","type-qsc_member","status-publish","has-post-thumbnail","hentry"],"acf":[],"spectra_custom_meta":{"_thumbnail_id":["732"],"field_qsc_member_acf_email":[""],"_field_qsc_member_acf_email":["qsc_member_acf_email"],"qsc_member_acf_position":["Masters Student"],"_qsc_member_acf_position":["field_qsc_member_acf_position"],"qsc_member_acf_department":["a:1:{i:0;s:19:\"School of Computing\";}"],"_qsc_member_acf_department":["field_qsc_member_acf_department"],"field_qsc_member_acf_organization":["Queen's 
University"],"_field_qsc_member_acf_organization":["qsc_member_acf_organization"],"field_qsc_member_acf_linkedin":[""],"_field_qsc_member_acf_linkedin":["qsc_member_acf_linkedin"],"field_qsc_member_acf_gscholar":[""],"_field_qsc_member_acf_gscholar":["qsc_member_acf_gscholar"],"field_qsc_member_acf_github":[""],"_field_qsc_member_acf_github":["qsc_member_acf_github"],"field_qsc_member_acf_researchgate":[""],"_field_qsc_member_acf_researchgate":["qsc_member_acf_researchgate"],"field_qsc_member_acf_web":[""],"_field_qsc_member_acf_web":["qsc_member_acf_web"],"field_qsc_member_acf_program_status":["Past"],"_field_qsc_member_acf_program_status":["qsc_member_acf_program_status"],"field_qsc_member_acf_start_year":["May 2017"],"_field_qsc_member_acf_start_year":["qsc_member_acf_start_year"],"field_qsc_member_acf_end_year":[""],"_field_qsc_member_acf_end_year":["qsc_member_acf_end_year"],"_uag_css_file_name":["uag-css-2342.css"],"_uag_page_assets":["a:9:{s:3:\"css\";s:263:\".uag-blocks-common-selector{z-index:var(--z-index-desktop) !important}@media (max-width: 976px){.uag-blocks-common-selector{z-index:var(--z-index-tablet) !important}}@media (max-width: 767px){.uag-blocks-common-selector{z-index:var(--z-index-mobile) 
!important}}\n\";s:2:\"js\";s:0:\"\";s:18:\"current_block_list\";a:7:{i:0;s:11:\"core\/search\";i:1;s:10:\"core\/group\";i:2;s:12:\"core\/heading\";i:3;s:17:\"core\/latest-posts\";i:4;s:20:\"core\/latest-comments\";i:5;s:13:\"core\/archives\";i:6;s:15:\"core\/categories\";}s:8:\"uag_flag\";b:0;s:11:\"uag_version\";s:10:\"1771033544\";s:6:\"gfonts\";a:0:{}s:10:\"gfonts_url\";s:0:\"\";s:12:\"gfonts_files\";a:0:{}s:14:\"uag_faq_layout\";b:0;}"]},"uagb_featured_image_src":{"full":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg",480,480,false],"thumbnail":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0-150x150.jpg",150,150,true],"medium":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0-300x300.jpg",300,300,true],"medium_large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg",480,480,false],"large":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg",480,480,false],"1536x1536":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg",480,480,false],"2048x2048":["https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/uploads\/sites\/3\/2024\/02\/MarkAsselin_0.jpg",480,480,false]},"uagb_author_info":{"display_name":"Doug Martin","author_link":"https:\/\/labs.cs.queensu.ca\/perklab\/author\/"},"uagb_comment_info":0,"uagb_excerpt":"Mark Asselin Masters Student School of Computing Queen&#8217;s University Member from May 2017 to present Mark is a masters student in biomedical computing with previous experience in electrical engineering. Please see his website at https:\/\/markasselin.github.io\/ for more information about him. 
Kaufmann, Martin; Jamzad, Amoon; Ungi, Tamas; Rodgers, Jessica R; Koster, Teaghan; Yeung, Chris; Ehrlich, Josh;&hellip;","_links":{"self":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2342","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member"}],"about":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/types\/qsc_member"}],"version-history":[{"count":0,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2342\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media\/732"}],"wp:attachment":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media?parent=2342"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}