{"id":2422,"date":"2024-05-03T15:21:29","date_gmt":"2024-05-03T15:21:29","guid":{"rendered":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/david-morton\/"},"modified":"2024-05-03T15:21:29","modified_gmt":"2024-05-03T15:21:29","slug":"david-morton","status":"publish","type":"qsc_member","link":"https:\/\/labs.cs.queensu.ca\/perklab\/members\/david-morton\/","title":{"rendered":"David Morton"},"content":{"rendered":"<div class=\"wp-block-columns is-layout-flex wp-block-columns-is-layout-flex qsc-member-single-core-info-container\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-photo-column\">\n\t\t<img decoding=\"async\" src=\"https:\/\/labs.cs.queensu.ca\/perklab\/wp-content\/plugins\/qsc-members\/images\/missing-image-placeholder.png\" class=\"qsc-member-single-photo\"\/>\n\t<\/div>\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow qsc-member-single-info-column\">\n<div class=\"qsc-member-name\">\n<h1>David Morton<\/h1>\n<\/div>\n<div class=\"qsc-member-position\">Masters Student<\/div>\n<div class=\"qsc-member-department\"><\/div>\n<div class=\"qsc-member-organization\">Queen&#8217;s University<\/div>\n<div class=\"qsc-member-date-range\">Member from <em>September 2021<\/em> to <em>present<\/em><\/div>\n<div class=\"qsc-member-contact\">\n<div class=\"qsc-member-socials\">\n\t\t\t<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"qsc-member-bio\">\n\tDavid completed his BASc in Electrical Engineering at Queen&#8217;s University, Canada, in 2021. He is now an MASc candidate in the Department of Electrical and Computer Engineering and a member of both the Laboratory for Percutaneous Surgery (Perk Lab) and the Medical Informatics Laboratory (Med-i Lab). David&#8217;s current research applies spectroscopy and machine learning to classify biological tissues, with a focus on breast and skin cancer applications. 
His work has been funded through the OGS award and the NSERC-CREATE program.<br \/>\n<div class=\"teachpress_pub_list\"><form name=\"tppublistform\" method=\"get\"><a name=\"tppubs\" id=\"tppubs\"><\/a><\/form><div class=\"teachpress_publication_list\"><div class=\"tp_publication tp_publication_proceedings\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Kim, Andrew S.;  Yeung, Chris;  Szabo, Robert;  Sunderland, Kyle;  Hisey, Rebecca;  Morton, David;  Kikinis, Ron;  Diao, Babacar;  Mousavi, Parvin;  Ungi, Tamas;  Fichtinger, Gabor<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/dx.doi.org\/10.1117\/12.3006533\" title=\"Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation\" target=\"blank\">Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation<\/a> <span class=\"tp_pub_type tp_  proceedings\">Proceedings<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_publisher\">SPIE, <\/span><span class=\"tp_pub_additional_year\">2024<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_652\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('652','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_652\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@proceedings{Kim2024,<br \/>\r\ntitle = {Percutaneous nephrostomy needle guidance using 
real-time 3D anatomical visualization with live ultrasound segmentation},<br \/>\r\nauthor = {Andrew S. Kim and Chris Yeung and Robert Szabo and Kyle Sunderland and Rebecca Hisey and David Morton and Ron Kikinis and Babacar Diao and Parvin Mousavi and Tamas Ungi and Gabor Fichtinger},<br \/>\r\neditor = {Maryam E. Rettmann and Jeffrey H. Siewerdsen},<br \/>\r\ndoi = {10.1117\/12.3006533},<br \/>\r\nyear  = {2024},<br \/>\r\ndate = {2024-03-29},<br \/>\r\nurldate = {2024-03-29},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {<br \/>\r\nPURPOSE: Percutaneous nephrostomy is a commonly performed procedure to drain urine to provide relief in patients with hydronephrosis. Conventional percutaneous nephrostomy needle guidance methods can be difficult, expensive, or not portable. We propose an open-source real-time 3D anatomical visualization aid for needle guidance with live ultrasound segmentation and 3D volume reconstruction using free, open-source software. METHODS: Basic hydronephrotic kidney phantoms were created, and recordings of these models were manually segmented and used to train a deep learning model that makes live segmentation predictions to perform live 3D volume reconstruction of the fluid-filled cavity. Participants performed 5 needle insertions with the visualization aid and 5 insertions with ultrasound needle guidance on a kidney phantom in randomized order, and these were recorded. Recordings of the trials were analyzed for needle tip distance to the center of the target calyx, needle insertion time, and success rate. Participants also completed a survey on their experience. RESULTS: Using the visualization aid showed significantly higher accuracy, while needle insertion time and success rate were not statistically significant at our sample size. Participants mostly responded positively to the visualization aid, and 80% found it easier to use than ultrasound needle guidance. 
CONCLUSION: We found that our visualization aid produced increased accuracy and an overall positive experience. We demonstrated that our system is functional and stable and believe that the workflow with this system can be applied to other procedures. This visualization aid system is effective on phantoms and is ready for translation with clinical data.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {proceedings}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_652\" style=\"display:none;\"><div class=\"tp_abstract_entry\"><br \/>\r\nPURPOSE: Percutaneous nephrostomy is a commonly performed procedure to drain urine to provide relief in patients with hydronephrosis. Conventional percutaneous nephrostomy needle guidance methods can be difficult, expensive, or not portable. We propose an open-source real-time 3D anatomical visualization aid for needle guidance with live ultrasound segmentation and 3D volume reconstruction using free, open-source software. METHODS: Basic hydronephrotic kidney phantoms were created, and recordings of these models were manually segmented and used to train a deep learning model that makes live segmentation predictions to perform live 3D volume reconstruction of the fluid-filled cavity. Participants performed 5 needle insertions with the visualization aid and 5 insertions with ultrasound needle guidance on a kidney phantom in randomized order, and these were recorded. Recordings of the trials were analyzed for needle tip distance to the center of the target calyx, needle insertion time, and success rate. Participants also completed a survey on their experience. RESULTS: Using the visualization aid showed significantly higher accuracy, while needle insertion time and success rate were not statistically significant at our sample size. 
Participants mostly responded positively to the visualization aid, and 80% found it easier to use than ultrasound needle guidance. CONCLUSION: We found that our visualization aid produced increased accuracy and an overall positive experience. We demonstrated that our system is functional and stable and believe that the workflow with this system can be applied to other procedures. This visualization aid system is effective on phantoms and is ready for translation with clinical data.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_652\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"ai ai-doi\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/dx.doi.org\/10.1117\/12.3006533\" title=\"Follow DOI:10.1117\/12.3006533\" target=\"_blank\">doi:10.1117\/12.3006533<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('652','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Pose-D\u00edez-de-la-Lastra, Alicia;  Ungi, Tamas;  Morton, David;  Fichtinger, Gabor;  Pascau, Javier<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0\" target=\"blank\">Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">International Journal of Computer Assisted Radiology and Surgery, <\/span><span class=\"tp_pub_additional_volume\">vol. 
18, <\/span><span class=\"tp_pub_additional_issue\">iss. 11, <\/span><span class=\"tp_pub_additional_pages\">pp. 2023-2032, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_938\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('938','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_938\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('938','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_938\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('938','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_938\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023f,<br \/>\r\ntitle = {Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning},<br \/>\r\nauthor = {Alicia Pose-D\u00edez-de-la-Lastra and Tamas Ungi and David Morton and Gabor Fichtinger and Javier Pascau},<br \/>\r\nurl = {https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\njournal = {International Journal of Computer Assisted Radiology and Surgery},<br \/>\r\nvolume = {18},<br \/>\r\nissue = {11},<br \/>\r\npages = {2023-2032},<br \/>\r\npublisher = {Springer International Publishing},<br \/>\r\nabstract = {Purpose <br \/>\r\nUp to date, there has been a lack of software infrastructure to connect 3D Slicer to any augmented reality (AR) device. This work describes a novel connection approach using Microsoft HoloLens 2 and OpenIGTLink, with a demonstration in pedicle screw placement planning. 
<br \/>\r\nMethods <br \/>\r\nWe developed an AR application in Unity that is wirelessly rendered onto Microsoft HoloLens 2 using Holographic Remoting. Simultaneously, Unity connects to 3D Slicer using the OpenIGTLink communication protocol. Geometrical transform and image messages are transferred between both platforms in real time. Through the AR glasses, a user visualizes a patient\u2019s computed tomography overlaid onto virtual 3D models showing anatomical structures. We technically evaluated the system by measuring message transference latency between the platforms. Its functionality was assessed in pedicle screw placement planning \u2026},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('938','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_938\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Purpose <br \/>\r\nUp to date, there has been a lack of software infrastructure to connect 3D Slicer to any augmented reality (AR) device. This work describes a novel connection approach using Microsoft HoloLens 2 and OpenIGTLink, with a demonstration in pedicle screw placement planning. <br \/>\r\nMethods <br \/>\r\nWe developed an AR application in Unity that is wirelessly rendered onto Microsoft HoloLens 2 using Holographic Remoting. Simultaneously, Unity connects to 3D Slicer using the OpenIGTLink communication protocol. Geometrical transform and image messages are transferred between both platforms in real time. Through the AR glasses, a user visualizes a patient\u2019s computed tomography overlaid onto virtual 3D models showing anatomical structures. We technically evaluated the system by measuring message transference latency between the platforms. 
Its functionality was assessed in pedicle screw placement planning \u2026<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('938','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_938\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0\" title=\"https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0\" target=\"_blank\">https:\/\/link.springer.com\/article\/10.1007\/s11548-023-02977-0<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('938','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><div class=\"tp_publication tp_publication_article\"><div class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Morton, David;  Connolly, Laura;  Groves, Leah;  Sunderland, Kyle;  Jamzad, Amoon;  Rudan, John F;  Fichtinger, Gabor;  Ungi, Tamas;  Mousavi, Parvin<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" target=\"blank\">Tracked tissue sensing for tumor bed inspection<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_volume\">vol. 12466, <\/span><span class=\"tp_pub_additional_pages\">pp. 
378-385, <\/span><span class=\"tp_pub_additional_year\">2023<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_resource_link\"><a id=\"tp_links_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_links')\" title=\"Show links and resources\" style=\"cursor:pointer;\">Links<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_1006\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('1006','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_1006\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{fichtinger2023x,<br \/>\r\ntitle = {Tracked tissue sensing for tumor bed inspection},<br \/>\r\nauthor = {David Morton and Laura Connolly and Leah Groves and Kyle Sunderland and Amoon Jamzad and John F Rudan and Gabor Fichtinger and Tamas Ungi and Parvin Mousavi},<br \/>\r\nurl = {https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short},<br \/>\r\nyear  = {2023},<br \/>\r\ndate = {2023-01-01},<br \/>\r\nvolume = {12466},<br \/>\r\npages = {378-385},<br \/>\r\npublisher = {SPIE},<br \/>\r\nabstract = {Up to 30% of breast-conserving surgery patients require secondary surgery to remove cancerous tissue missed in the initial intervention. We hypothesize that tracked tissue sensing can improve the success rate of breast-conserving surgery. Tissue sensor tracking allows the surgeon to intraoperatively scan the tumor bed for leftover cancerous tissue. In this study, we characterize the performance of our tracked optical scanning testbed using an experimental pipeline. 
We assess the Dice similarity coefficient, accuracy, and latency of the testbed.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1006','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_1006\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Up to 30% of breast-conserving surgery patients require secondary surgery to remove cancerous tissue missed in the initial intervention. We hypothesize that tracked tissue sensing can improve the success rate of breast-conserving surgery. Tissue sensor tracking allows the surgeon to intraoperatively scan the tumor bed for leftover cancerous tissue. In this study, we characterize the performance of our tracked optical scanning testbed using an experimental pipeline. We assess the Dice similarity coefficient, accuracy, and latency of the testbed.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('1006','tp_abstract')\">Close<\/a><\/p><\/div><div class=\"tp_links\" id=\"tp_links_1006\" style=\"display:none;\"><div class=\"tp_links_entry\"><ul class=\"tp_pub_list\"><li><i class=\"fas fa-globe\"><\/i><a class=\"tp_pub_list\" href=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/Tracked-tissue-sensing-for-tumor-bed-inspection\/10.1117\/12.2654217.short\" title=\"https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/[...]\" target=\"_blank\">https:\/\/www.spiedigitallibrary.org\/conference-proceedings-of-spie\/12466\/124661K\/[...]<\/a><\/li><\/ul><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" 
onclick=\"teachpress_pub_showhide('1006','tp_links')\">Close<\/a><\/p><\/div><\/div><\/div><\/div><\/div>\n<\/div>\n","protected":false},"featured_media":0,"template":"","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-2422","qsc_member","type-qsc_member","status-publish","hentry"],"acf":[],"spectra_custom_meta":{"field_qsc_member_acf_email":[""],"_field_qsc_member_acf_email":["qsc_member_acf_email"],"qsc_member_acf_position":["Masters Student"],"_qsc_member_acf_position":["field_qsc_member_acf_position"],"qsc_member_acf_department":[""],"_qsc_member_acf_department":["field_qsc_member_acf_department"],"field_qsc_member_acf_organization":["Queen's 
University"],"_field_qsc_member_acf_organization":["qsc_member_acf_organization"],"field_qsc_member_acf_linkedin":[""],"_field_qsc_member_acf_linkedin":["qsc_member_acf_linkedin"],"field_qsc_member_acf_gscholar":[""],"_field_qsc_member_acf_gscholar":["qsc_member_acf_gscholar"],"field_qsc_member_acf_github":[""],"_field_qsc_member_acf_github":["qsc_member_acf_github"],"field_qsc_member_acf_researchgate":[""],"_field_qsc_member_acf_researchgate":["qsc_member_acf_researchgate"],"field_qsc_member_acf_web":[""],"_field_qsc_member_acf_web":["qsc_member_acf_web"],"field_qsc_member_acf_program_status":["Past"],"_field_qsc_member_acf_program_status":["qsc_member_acf_program_status"],"field_qsc_member_acf_start_year":["September 2021"],"_field_qsc_member_acf_start_year":["qsc_member_acf_start_year"],"field_qsc_member_acf_end_year":[""],"_field_qsc_member_acf_end_year":["qsc_member_acf_end_year"],"_uag_css_file_name":["uag-css-2422.css"],"_uag_page_assets":["a:9:{s:3:\"css\";s:263:\".uag-blocks-common-selector{z-index:var(--z-index-desktop) !important}@media (max-width: 976px){.uag-blocks-common-selector{z-index:var(--z-index-tablet) !important}}@media (max-width: 767px){.uag-blocks-common-selector{z-index:var(--z-index-mobile) !important}}\n\";s:2:\"js\";s:0:\"\";s:18:\"current_block_list\";a:7:{i:0;s:11:\"core\/search\";i:1;s:10:\"core\/group\";i:2;s:12:\"core\/heading\";i:3;s:17:\"core\/latest-posts\";i:4;s:20:\"core\/latest-comments\";i:5;s:13:\"core\/archives\";i:6;s:15:\"core\/categories\";}s:8:\"uag_flag\";b:0;s:11:\"uag_version\";s:10:\"1771033544\";s:6:\"gfonts\";a:0:{}s:10:\"gfonts_url\";s:0:\"\";s:12:\"gfonts_files\";a:0:{}s:14:\"uag_faq_layout\";b:0;}"]},"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"Doug Martin","author_link":"https:\/\/labs.cs.queensu.ca\/perklab\/author\/"},"uagb_comment_info":0,"uagb_excerpt":"David Morton 
Masters Student Queen&#8217;s University Member from September 2021 to present David completed his BASc in Electrical Engineering in 2021 from Queen&#8217;s University, Canada. He is now a MASc candidate in the Department of Electrical and Computer Engineering. He is a member of both the Laboratory for Percutaneous Surgery (Perk Lab) and the Medical&hellip;","_links":{"self":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2422","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member"}],"about":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/types\/qsc_member"}],"version-history":[{"count":0,"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/qsc_member\/2422\/revisions"}],"wp:attachment":[{"href":"https:\/\/labs.cs.queensu.ca\/perklab\/wp-json\/wp\/v2\/media?parent=2422"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}