Quantitative Endoscopy - Virtual Interaction

Armin Haeberling and Christian Wengert, 2007
Computer Vision Lab, ETH Zuerich


Manipulating small objects such as needles, screws, or plates inside the human body during minimally invasive surgery can be very difficult for less experienced surgeons due to the loss of 3D depth perception. This paper presents an approach for tracking a suturing needle with a standard endoscope. The resulting pose of the needle is used to render artificial 3D cues on the 2D screen, supporting surgeons during tissue suturing. Additionally, if an external tracking device reports the endoscope's position, the suturing needle can be tracked in a hybrid fashion with sub-millimeter accuracy. Finally, a visual navigation aid can be incorporated if a 3D surface is reconstructed intraoperatively from video or registered from preoperative imaging.

Download in pdf format
@techreport{HaeberlingWengert2007,
  author = {Armin Haeberling and Christian Wengert},
  title = {Quantitative Endoscopy - Virtual Interaction},
  year = {2007},
  month = {August},
  institution = {Computer Vision Lab, ETH Zuerich},
  keywords = {}
}