[243]

Citation:

F. Ryden, H. J. Chizeck, S. Nia Kosari, H. King, B. Hannaford, 'Using Kinect and a Haptic Interface for Implementation of Real-Time Virtual Fixture,' Robotics: Science and Systems, Workshop on RGB-D: Advanced Reasoning with Depth Cameras, Los Angeles, June 2011.

Abstract

Haptic virtual fixtures are a potential tool for improving the safety of robotic and telerobotic surgery: they can “push back” on the surgeon to prevent unintended surgical tool movements into protected zones. Previous work has suggested generating virtual fixtures from preoperative images such as CT scans; however, these are difficult to establish and register in dynamic environments. This paper demonstrates automatic generation of real-time haptic virtual fixtures using a low-cost Xbox Kinect™ depth camera connected to a virtual environment, which allows generation of virtual fixtures and calculation of haptic forces that are then passed on to a haptic device. The paper demonstrates that haptic forces can be successfully rendered from real-time environments containing both stationary and moving objects. This approach has the potential to generate virtual fixtures from the patient in real time during robotic surgery.
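The abstract does not specify the force-rendering law used in the paper, but the core idea of a forbidden-region virtual fixture over a depth-camera point cloud can be sketched as a spring-like repulsive force from the nearest surface point. The function name, activation distance `d0`, and stiffness `k` below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def virtual_fixture_force(tool_pos, points, d0=0.05, k=200.0):
    """Hypothetical forbidden-region fixture force (not the paper's exact method).

    tool_pos: (3,) haptic tool tip position, metres.
    points:   (N, 3) point cloud from the depth camera.
    d0:       assumed activation distance of the fixture (m).
    k:        assumed virtual spring stiffness (N/m).
    Returns a (3,) force vector pushing the tool away from the nearest surface point.
    """
    diffs = tool_pos - points              # vectors from each cloud point to the tool
    dists = np.linalg.norm(diffs, axis=1)  # distance to every point
    i = np.argmin(dists)                   # nearest surface point
    d = dists[i]
    if d >= d0 or d == 0.0:
        return np.zeros(3)                 # outside the fixture: no force rendered
    n = diffs[i] / d                       # unit direction away from the surface
    return k * (d0 - d) * n                # spring force grows as the tool approaches
```

In a real-time loop this would run at the haptic update rate against the latest Kinect frame, with the returned force commanded to the haptic device; handling moving objects falls out naturally because the point cloud is refreshed every frame.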
