7th Annual Surgery, Intervention, and Engineering Symposium
Date: December 12, 2018
Keynote delivered by
William R. Jarnagin, MD,
Leslie H. Blumgart, MD, Chair in Surgery
Chief, Hepatopancreatobiliary Service
Memorial Sloan Kettering Cancer Center
Professor of Surgery
Weill Cornell Medical College
1:55 p.m. – 2:55 p.m.
Light Hall Room 202
Innovations in Surgical Data Science with Oncologic Application
Intrahepatic cholangiocarcinoma (IHC) is a devastating, largely incurable cancer with rising worldwide incidence and few effective treatment options. Systemic chemotherapy is the mainstay of treatment but offers limited benefit. Advances in genomics and radiomics are starting to improve our ability to classify tumors, stratify risk, and tailor treatment. We are developing imaging tools to inform the care, treatment, and cure of cancer patients. We will discuss our recent work to optimally select patients for therapy, elucidate mechanisms of cancer progression, identify high-risk patients, and guide surgical resection with imaging technology.
Invited Lecture #1
Jin U. Kang, PhD,
Jacob Suter Jammer Professor (primary)
Department of Electrical and Computer Engineering
The Johns Hopkins University
Department of Dermatology
Johns Hopkins Medicine
1:00 p.m. – 1:45 p.m.
Light Hall Room 202
Image-Guided Advanced Surgical Systems and Techniques for Microsurgery
Advances in 3D optical imaging and sensing technologies are enabling the development of the next generation of smart surgical devices and systems. In this intelligent “smart” surgical platform, optical sensors/imagers, robotics, and computers are combined with surgical devices and systems to attain surgical outcomes beyond free-hand human capabilities. In our laboratories, we have been developing real-time intraoperative optical coherence tomography systems specifically for practical, smart microsurgical tools and 3D image-guided surgical systems that enhance the surgeon’s ability to visualize optically transparent tissues, identify and track visually transparent tissue edges and tools, maintain safe surgical positions, detect early instrument contact with tissue, and assess the depth of instrument penetration into tissue. These innovations enhance the surgeon’s ability to achieve surgical objectives, diminish surgical risk, and improve outcomes. In this talk, I will summarize our recent efforts in optical image-guided robotic surgical procedures and discuss future directions.
Invited Lecture #2
S. Kevin Zhou, PhD
Professor, Institute of Computing Technology,
Chinese Academy of Sciences
3:45 p.m. – 4:30 p.m.
Light Hall Room 202
Machine Learning + Knowledge Modeling: Medical Image Recognition, Segmentation, and Parsing
“Machine learning + knowledge modeling” approaches, which combine machine learning with domain knowledge, enable us to achieve state-of-the-art performance on many medical image recognition, segmentation, and parsing tasks. In this talk, we present real success stories of such approaches and then elaborate on deep learning, the most powerful machine learning method. We demonstrate that an extra performance boost is achieved when deep learning is combined with knowledge modeling.
Invited Lecture #3
Gregory S. Fischer, PhD
William Smith Dean’s Professor, Mechanical Engineering & Robotics Engineering
Director, Automation and Interventional Medicine (AIM) Robotics Research Laboratory
Worcester Polytechnic Institute
4:30 p.m. – 5:15 p.m.
Light Hall Room 202
Image-Guided Robotic Surgery: In-situ MRI Guidance for Enhancing Robot-Assisted Cancer Therapy
Interactively updated intraoperative medical imaging affords the opportunity to monitor and guide interventional procedures. This real-time feedback enables “closed-loop medicine,” in which we ensure that the treatment plan is implemented as intended. To take full advantage of robots in surgery, we work toward integrating real-time medical imaging with the interventional procedure, providing the surgeon with as much information as possible during a procedure and using that information to produce better outcomes. MRI offers high-resolution 3D imaging with high soft-tissue contrast, multi-modality imaging for tumor localization, thermal monitoring, and interactively updated imaging, making it ideal for monitoring and guiding interventions. However, the high magnetic field, time-varying magnetic gradients, strong RF signals, and high sensitivity to RF noise make leveraging these capabilities difficult. We have developed a modular approach to MRI-compatible robotics, including the software, control hardware, and mechanical systems, and have used this approach to develop robotic systems for image-guided diagnosis and therapy of prostate cancer and for stereotactic neurosurgical interventions in which we can perform surgical manipulation under live MR imaging. A system for percutaneous access to the prostate has been used in a 30-patient trial for prostate cancer biopsy at Brigham and Women’s Hospital in Boston, MA, and an MRI-compatible stereotactic neurosurgery robot intended for conformal thermal ablation using interstitial therapeutic ultrasound is in preclinical trials at UMass Medical School in Worcester, MA. This robot combines precision alignment of the probe based on intraoperative imaging that can account for brain shift, monitoring of probe insertion, and live thermal monitoring of the delivered dose.
Current work, funded under the NIH NCI Academic-Industry Partnership program, focuses on coupling real-time feedback with interactive robotic control of the ablation probe to produce precise conformal ablation boundaries for irregularly shaped glioblastomas. Other work in the WPI Automation and Interventional Medicine (AIM) Robotics Research Laboratory includes task automation and intelligent teleoperation of the da Vinci surgical robot, soft wearable assistive robots, and socially assistive robots for augmenting autism therapy.