eagle-i Harvard University

Surgical Planning Laboratory (BWH)

Director: Kikinis, Ron

Location: Surgical Planning Laboratory, Department of Radiology, ASBI, L1-050, Brigham & Women's Hospital, 75 Francis Street, Boston, MA 02115

Summary:

The Surgical Planning Laboratory is advancing the future of health care by bringing the power of computation and imaging to new areas of medicine. The lab collaborates with groups within Brigham and Women's Hospital, with other researchers at Harvard Medical School, with local universities such as Harvard and MIT, and with gifted clinicians, researchers, and engineers throughout the world. The core mission of the SPL is the extraction of medically relevant information from diagnostic imaging data.

This laboratory also has a large publications database.

Affiliations:

People:

Resources:

Services

  • Training and tutorials ( Training service )

    The Surgical Planning Laboratory offers online training courses and live events on a variety of topics, including training on 3D Slicer. The lab also routinely hosts visitors from a variety of backgrounds to learn and/or participate in various research projects.

    The SPL is an umbrella organization that provides a number of labs with IT services and access to its rich scientific environment. For more information about the closely affiliated labs and efforts, please see the projects page.

Software

  • 3D Slicer ( Software )

    Slicer, or 3D Slicer, is a free, open-source software package for visualization and image analysis. 3D Slicer is designed to run natively on multiple platforms, including Windows, Linux, and Mac OS X. A short scripting sketch follows the feature list below.

    Features include:

    * Improved Interactive Editor
    * New Color Module
    * Improved Volume Rendering
    * EM Segmenter, simple version
    * Fast Marching Segmentation
    * Robust Statistics Segmentation
    * New Registration module
    * New Slices module
    * Fiducial based tractography
    * Improved SceneSnapshot Screen Capture functionality
    * 4D Image Viewer
    * Compare View and Cross Hairs
    * Support for Extension Server for installing plug-ins
    * Improved DICOM Support
    * MRML scenes and all data can be loaded from and saved to XNAT Desktop
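
    As a quick illustration of scripted use, recent versions of 3D Slicer expose a built-in Python console; the minimal sketch below loads a volume and inspects its voxel data. The file path is hypothetical, and the slicer.util helpers shown reflect current Slicer releases, so the exact calls may differ in older versions.

        # Run inside 3D Slicer's built-in Python console.
        import slicer

        # Load a volume from disk; Slicer creates a volume node and displays it
        # in the slice viewers. (Path is a placeholder.)
        volumeNode = slicer.util.loadVolume("/path/to/sample_volume.nrrd")

        # Access the voxel data as a NumPy array for quick inspection.
        voxels = slicer.util.arrayFromVolume(volumeNode)
        print(voxels.shape, voxels.min(), voxels.max())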

  • CT atlas of the abdomen ( Software )

    The Surgical Planning Laboratory at Brigham and Women's Hospital, Harvard Medical School, developed the SPL Abdominal Atlas. The atlas was derived from a computed tomography (CT) scan using semi-automated image segmentation and three-dimensional reconstruction techniques. The current version consists of:

    * the original CT scan;
    * a set of detailed label maps;
    * a set of three-dimensional models of the labeled anatomical structures;
    * an MRML file that allows loading all of the data into 3D Slicer for visualization (see the tutorial associated with the atlas);
    * several pre-defined 3D views (“anatomy teaching files”).

    The SPL Abdominal Atlas provides important reference information for surgical planning, anatomy teaching, and template-driven segmentation. Visualization of the data requires Slicer 3, which is freely available. We are pleased to make this atlas available to our colleagues for free download. Please note that the data are distributed under the Slicer license. By downloading these data, you agree to acknowledge our contribution in any publications that result from the use of this atlas. Please acknowledge the following grants: P41 RR13218, R01 MH050740.
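
    As a rough illustration of the MRML file mentioned above, the whole atlas can be pulled into Slicer from its scene file via the Python console. The file names below are hypothetical, and the slicer.util calls reflect recent Slicer releases rather than the Slicer 3 version referenced in the text.

        import slicer

        # Loading the scene brings in the CT volume, label maps, and 3D models
        # referenced by the MRML file in a single step. (Paths are placeholders.)
        slicer.util.loadScene("/path/to/abdominal_atlas.mrml")

        # Individual pieces can also be loaded on their own, e.g. a label map:
        labelNode = slicer.util.loadLabelVolume("/path/to/abdominal_labels.nrrd")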

  • SPL Automated Segmentation of Brain Tumors Image Datasets ( Software )

    An automated brain tumor segmentation method was developed and validated against manual segmentation with three-dimensional magnetic resonance images in 10 patients with meningiomas and low-grade gliomas (Kaus et al., 2001). The automated method (operator time, 5-10 minutes) allowed rapid identification of brain and tumor tissue with an accuracy and reproducibility comparable to those of manual segmentation (operator time, 3-5 hours), making automated segmentation practical for low-grade gliomas and meningiomas. As a free service to interested parties, we make available the image datasets used in our study, the results of our algorithms, and open-source software (3D Slicer) for data access and processing.
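
    The validation above compares automated and manual label maps for the same cases; a common way to quantify such agreement is a volume-overlap score such as the Dice coefficient. The sketch below is a generic NumPy example on two binary tumor masks (file names hypothetical, NIfTI format assumed), not the SPL validation code itself.

        import numpy as np
        import nibabel as nib  # assumes the label maps are stored as NIfTI files

        # Binarize the automated and manual segmentations of the same scan.
        auto = nib.load("auto_tumor_mask.nii.gz").get_fdata() > 0
        manual = nib.load("manual_tumor_mask.nii.gz").get_fdata() > 0

        # Dice coefficient: 2*|A intersect B| / (|A| + |B|); 1.0 means perfect overlap.
        dice = 2.0 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())
        print(f"Dice overlap: {dice:.3f}")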

  • SPL brain tumor resection image datasets ( Software )

    The accuracy of neurosurgical navigation systems is seriously compromised by brain shift, i.e. changes in the spatial position of the lesion and surrounding brain tissue that inevitably occur during the surgical procedure in response to surgical manipulation (resection, retraction, CSF leakage) and the administration of anesthetic drugs. These changes in brain spatial configuration, summarized under the generic term of brain shift, occur according to a non-linear pattern and lead to significant mis-registration between pre-operative image data (MR, CT) and the intraoperative brain configuration.

    Non-rigid registration techniques are increasingly being employed to maintain an accurate alignment between pre-operative and intra-operative images. These techniques estimate transformations that model not only affine parameters (global translation, rotation, scale, and shear) but also local deformations. Higher-order transformation models with an increased number of parameters, along with significant computing capabilities, are usually required for this purpose.

    Our group was the first to demonstrate the feasibility of a non-rigid registration approach capable of compensating for volumetric brain deformations within the time constraints imposed by neurosurgery (Archip et al., 2007). Augmented-reality visualizations of functionally eloquent brain structures (based on pre-operative anatomic, functional, and diffusion tensor MRI, non-rigidly registered with intra-operative MR image updates) were presented to the surgeon in near real time during brain tumor resections. We are pleased to make available the image datasets used in our study, the results of our algorithms, and open-source software (3D Slicer) for data access and processing to interested parties, as a free service.
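
    As a generic illustration of the non-rigid registration idea (not the SPL pipeline from Archip et al., 2007), the sketch below uses the SimpleITK library to estimate a B-spline deformation that aligns a pre-operative MR volume with an intra-operative update; file names and parameter values are hypothetical.

        import SimpleITK as sitk

        fixed = sitk.ReadImage("intraop_mr.nrrd", sitk.sitkFloat32)   # intra-operative update
        moving = sitk.ReadImage("preop_mr.nrrd", sitk.sitkFloat32)    # pre-operative scan

        # A coarse B-spline control-point grid models local deformations on top of
        # any prior rigid/affine alignment.
        mesh_size = [8, 8, 8]
        tx = sitk.BSplineTransformInitializer(fixed, mesh_size)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
        reg.SetInitialTransform(tx, inPlace=True)
        reg.SetInterpolator(sitk.sitkLinear)

        outTx = reg.Execute(fixed, moving)

        # Resample the pre-operative image into the intra-operative space.
        warped = sitk.Resample(moving, fixed, outTx, sitk.sitkLinear, 0.0, moving.GetPixelID())
        sitk.WriteImage(warped, "preop_mr_warped.nrrd")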

  • SPL-PNL Brain Atlas ( Software )

    The initial version of the SPL-PNL Brain Atlas was developed by the Surgical Planning Laboratory in collaboration with the Harvard Neuroscience Laboratory at the Brockton VA Medical Center. Dr. Martha Shenton, one of the original people involved with the Harvard Neuroscience Laboratory, is now the Director of the Psychiatry Neuroimaging Laboratory (PNL), Department of Psychiatry, Brigham and Women's Hospital, Harvard Medical School. The atlas was derived from a volumetric T1-weighted MR scan of a healthy volunteer using semi-automated image segmentation and three-dimensional reconstruction techniques. Over the years, the original atlas has undergone several revisions. The current version consists of:

    * the original volumetric whole-brain MRI of a healthy volunteer;
    * a set of detailed label maps;
    * 160+ three-dimensional models of the labeled anatomical structures;
    * an MRML file that allows loading all of the data into 3D Slicer for visualization (see the tutorial associated with the atlas);
    * several pre-defined 3D views for the motor, visual, and limbic systems, diencephalon, brain stem, and left cerebral hemisphere.

    The SPL-PNL Brain Atlas provides important reference information for surgical planning. It has been used for template-driven segmentation and also as a powerful neuroanatomy teaching tool. Visualization of the data requires Slicer 3, which is freely available. We are pleased to make this brain atlas available to our colleagues for free download. Please note that the data are distributed under the Slicer license. By downloading these data, you agree to acknowledge our contribution in any publications that result from the use of this atlas. Please acknowledge the following grants: P41 RR13218, R01 MH050740.
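
    As a rough illustration of working with the atlas's three-dimensional models, the sketch below loads one surface model into Slicer's Python console and adjusts how it is displayed. The file name is hypothetical, and the calls reflect recent Slicer releases rather than the Slicer 3 version referenced in the text.

        import slicer

        # Load one of the atlas's surface models and tint it in the 3D view.
        # (Path is a placeholder; the full atlas scene can be loaded via its MRML
        # file, as shown for the abdominal atlas above.)
        modelNode = slicer.util.loadModel("/path/to/left_hippocampus.vtk")
        displayNode = modelNode.GetDisplayNode()
        displayNode.SetColor(0.9, 0.6, 0.2)   # RGB components in [0, 1]
        displayNode.SetOpacity(0.7)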


Web Links:

Last updated: 2012-05-09T12:50:04.135-05:00

Copyright © 2016 by the President and Fellows of Harvard College
The eagle-i Consortium is supported by NIH Grant #5U24RR029825-02