David Morton
David completed his BASc in Electrical Engineering at Queen's University, Canada, in 2021. He is now a MASc candidate in the Department of Electrical and Computer Engineering, where he is a member of both the Laboratory for Percutaneous Surgery (Perk Lab) and the Medical Informatics Laboratory (Med-i Lab). His current research uses spectroscopy and machine learning to classify biological tissues, with a focus on breast and skin cancer applications. His work has been funded through an Ontario Graduate Scholarship (OGS) and the NSERC CREATE program.
Kim, Andrew S.; Yeung, Chris; Szabo, Robert; Sunderland, Kyle; Hisey, Rebecca; Morton, David; Kikinis, Ron; Diao, Babacar; Mousavi, Parvin; Ungi, Tamas; Fichtinger, Gabor
Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation Proceedings
In: SPIE, 2024.
@proceedings{Kim2024,
title = {Percutaneous nephrostomy needle guidance using real-time 3D anatomical visualization with live ultrasound segmentation},
author = {Andrew S. Kim and Chris Yeung and Robert Szabo and Kyle Sunderland and Rebecca Hisey and David Morton and Ron Kikinis and Babacar Diao and Parvin Mousavi and Tamas Ungi and Gabor Fichtinger},
editor = {Maryam E. Rettmann and Jeffrey H. Siewerdsen},
doi = {10.1117/12.3006533},
year = {2024},
date = {2024-03-29},
urldate = {2024-03-29},
publisher = {SPIE},
abstract = {
PURPOSE: Percutaneous nephrostomy is a commonly performed procedure to drain urine to provide relief in patients with hydronephrosis. Conventional percutaneous nephrostomy needle guidance methods can be difficult, expensive, or not portable. We propose an open-source real-time 3D anatomical visualization aid for needle guidance with live ultrasound segmentation and 3D volume reconstruction using free, open-source software. METHODS: Basic hydronephrotic kidney phantoms were created, and recordings of these models were manually segmented and used to train a deep learning model that makes live segmentation predictions to perform live 3D volume reconstruction of the fluid-filled cavity. Participants performed 5 needle insertions with the visualization aid and 5 insertions with ultrasound needle guidance on a kidney phantom in randomized order, and these were recorded. Recordings of the trials were analyzed for needle tip distance to the center of the target calyx, needle insertion time, and success rate. Participants also completed a survey on their experience. RESULTS: Using the visualization aid showed significantly higher accuracy, while needle insertion time and success rate were not statistically significant at our sample size. Participants mostly responded positively to the visualization aid, and 80% found it easier to use than ultrasound needle guidance. CONCLUSION: We found that our visualization aid produced increased accuracy and an overall positive experience. We demonstrated that our system is functional and stable and believe that the workflow with this system can be applied to other procedures. This visualization aid system is effective on phantoms and is ready for translation with clinical data.},
keywords = {},
pubstate = {published},
tppubtype = {proceedings}
}
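The workflow described in the abstract, segmenting each live ultrasound frame with a trained network and accumulating the labeled pixels into a reconstructed 3D volume, can be sketched in a few lines. The following is a minimal illustration only, not the authors' code: the model file name is hypothetical, and a real system would resample each mask through the tracked image-to-world transform (e.g., as SlicerIGT's volume reconstruction does) rather than the simplified accumulation shown here.

# Illustrative sketch (not the authors' implementation): live segmentation of
# ultrasound frames with accumulation into a 3D label volume.
import numpy as np
import torch

model = torch.jit.load("kidney_seg.pt")  # hypothetical traced segmentation model
model.eval()

volume = np.zeros((256, 256, 256), dtype=np.uint8)  # reconstructed fluid cavity

def segment_frame(frame: np.ndarray) -> np.ndarray:
    """Segment one grayscale ultrasound frame into a binary mask."""
    with torch.no_grad():
        x = torch.from_numpy(frame).float()[None, None] / 255.0  # NCHW tensor
        return (torch.sigmoid(model(x))[0, 0] > 0.5).numpy()

def accumulate(mask: np.ndarray, pose: np.ndarray) -> None:
    """Map masked pixels through the tracked 4x4 probe pose into the volume."""
    ij = np.argwhere(mask)                                # pixel rows/cols
    pts = np.c_[ij, np.zeros(len(ij)), np.ones(len(ij))]  # homogeneous coords
    xyz = (pose @ pts.T).T[:, :3].astype(int)             # world coordinates
    xyz = np.clip(xyz, 0, volume.shape[0] - 1)
    volume[xyz[:, 0], xyz[:, 1], xyz[:, 2]] = 1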
Pose-Díez-de-la-Lastra, Alicia; Ungi, Tamas; Morton, David; Fichtinger, Gabor; Pascau, Javier
Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 18, iss. 11, pp. 2023-2032, 2023.
@article{fichtinger2023f,
title = {Real-time integration between Microsoft HoloLens 2 and 3D Slicer with demonstration in pedicle screw placement planning},
author = {Alicia Pose-Díez-de-la-Lastra and Tamas Ungi and David Morton and Gabor Fichtinger and Javier Pascau},
url = {https://link.springer.com/article/10.1007/s11548-023-02977-0},
year = {2023},
date = {2023-01-01},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {18},
issue = {11},
pages = {2023-2032},
publisher = {Springer International Publishing},
abstract = {Purpose
To date, there has been a lack of software infrastructure to connect 3D Slicer to any augmented reality (AR) device. This work describes a novel connection approach using Microsoft HoloLens 2 and OpenIGTLink, with a demonstration in pedicle screw placement planning.
Methods
We developed an AR application in Unity that is wirelessly rendered onto Microsoft HoloLens 2 using Holographic Remoting. Simultaneously, Unity connects to 3D Slicer using the OpenIGTLink communication protocol. Geometrical transform and image messages are transferred between both platforms in real time. Through the AR glasses, a user visualizes a patient’s computed tomography overlaid onto virtual 3D models showing anatomical structures. We technically evaluated the system by measuring message transference latency between the platforms. Its functionality was assessed in pedicle screw placement planning …},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
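The real-time message exchange described in this abstract runs over OpenIGTLink, an open network protocol for image-guided therapy. As a minimal illustration of the same idea, the open-source pyigtl package can stream geometrical transform messages that 3D Slicer receives through its OpenIGTLinkIF module; this sketch is not the authors' Unity/C# implementation, and the device name and pose are placeholders.

# Illustrative sketch using pyigtl (not the paper's Unity code): stream a 4x4
# transform over OpenIGTLink so 3D Slicer can consume it in real time.
import time
import numpy as np
import pyigtl  # pip install pyigtl

server = pyigtl.OpenIGTLinkServer(port=18944)  # OpenIGTLink's default port

t = 0.0
while True:
    if not server.is_connected():
        time.sleep(0.1)
        continue
    # Placeholder pose: rotate a tool about the z-axis as stand-in tracking data.
    matrix = np.eye(4)
    c, s = np.cos(t), np.sin(t)
    matrix[:2, :2] = [[c, -s], [s, c]]
    message = pyigtl.TransformMessage(matrix, device_name="ToolToReference")
    server.send_message(message)  # appears in Slicer via the OpenIGTLinkIF module
    time.sleep(0.02)  # ~50 Hz update rate
    t += 0.02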
Morton, David; Connolly, Laura; Groves, Leah; Sunderland, Kyle; Jamzad, Amoon; Rudan, John F; Fichtinger, Gabor; Ungi, Tamas; Mousavi, Parvin
Tracked tissue sensing for tumor bed inspection Journal Article
In: Proceedings of SPIE, vol. 12466, pp. 378-385, 2023.
@article{fichtinger2023x,
title = {Tracked tissue sensing for tumor bed inspection},
author = {David Morton and Laura Connolly and Leah Groves and Kyle Sunderland and Amoon Jamzad and John F Rudan and Gabor Fichtinger and Tamas Ungi and Parvin Mousavi},
url = {https://www.spiedigitallibrary.org/conference-proceedings-of-spie/12466/124661K/Tracked-tissue-sensing-for-tumor-bed-inspection/10.1117/12.2654217.short},
year = {2023},
date = {2023-01-01},
volume = {12466},
pages = {378-385},
publisher = {SPIE},
abstract = {Up to 30% of breast-conserving surgery patients require secondary surgery to remove cancerous tissue missed in the initial intervention. We hypothesize that tracked tissue sensing can improve the success rate of breast-conserving surgery. Tissue sensor tracking allows the surgeon to intraoperatively scan the tumor bed for leftover cancerous tissue. In this study, we characterize the performance of our tracked optical scanning testbed using an experimental pipeline. We assess the Dice similarity coefficient, accuracy, and latency of the testbed.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
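The Dice similarity coefficient reported in this evaluation is a standard overlap metric between a predicted and a reference binary mask. A minimal NumPy illustration (not the study's own evaluation code):

# Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) for two binary masks.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Return Dice overlap in [0, 1]; 1.0 means the masks match exactly."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Example: two 4x4 squares offset by one pixel overlap in a 3x3 region.
a = np.zeros((8, 8), bool); a[2:6, 2:6] = True
b = np.zeros((8, 8), bool); b[3:7, 3:7] = True
print(dice_coefficient(a, b))  # 2*9 / (16+16) = 0.5625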