The ArthroNav Project

Computer Assisted Navigation in Orthopedic Surgery using Endoscopic Images

Experimental Setup (ArthroSync)

ArthroSync is a software and hardware infrastructure developed for the synchronized acquisition and fusion of data provided by an optical tracking system and a set of N cameras (including an arthroscope). The output of the system is real-time video enhanced with pose information for the cameras and other tools.

The system is composed of the following physical components:
a) Endoscopic System
b) Optical tracking System (Optotrak Certus System)
c) Marker tools attached to cameras (arthroscope and PointGrey camera) and instruments. Optical tracking of these marker tools allows us to estimate the pose of the cameras and tools with respect to the reference frame of the tracking system (see the pose-composition sketch below).

Optical tracking system

Endoscopic system

Marker tools attached to cameras
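
The way a tracked marker yields a camera pose can be summarized as a composition of rigid transformations: the tracker reports the pose of the marker tool in its own reference frame, and a fixed marker-to-camera transformation (obtained in a prior calibration step that is not detailed here) maps that pose to the camera. The following is only a minimal sketch of this idea using Eigen; the function and variable names, and the calibration transform itself, are assumptions.

    // Minimal sketch (assumptions: Eigen for rigid transforms; the fixed
    // marker-to-camera transform T_marker_camera comes from a prior,
    // hypothetical calibration step not described in the text).
    #include <Eigen/Geometry>

    // Pose of the camera in the tracker reference frame, obtained by
    // composing the tracked marker pose with the fixed calibration transform.
    Eigen::Isometry3d cameraPoseInTrackerFrame(
        const Eigen::Isometry3d& T_tracker_marker,   // reported by the Optotrak
        const Eigen::Isometry3d& T_marker_camera)    // fixed, from calibration
    {
        return T_tracker_marker * T_marker_camera;
    }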


SYNCHRONIZATION:

Synchronous data acquisition is an essential requirement for combining the tracking system with the arthroscopic video. The ArthroNav project uses the arthroscopic video enhanced with the pose information to construct a complete surgical navigation system. If the data from the different sources did not correspond to the exact same time instant, this would be a major source of error, with a direct impact on the accuracy and reliability of the final navigator: since the cameras and instruments move during acquisition, even a small timing offset between a video frame and the corresponding pose sample translates directly into a spatial error.

Synchronization board


The data captured by ArthroSync is synchronized using hardware developed specifically for this purpose (see the figure above). This board allows us to use the following trigger sources to synchronize the devices:
  • a) Frame trigger information of the optical tracking system
  • b) Frame trigger information of any PointGrey camera with a General Purpose Input/Output (GPIO) connector (a configuration sketch for this case is given after this list)
  • c) Timing signal of the composite video output of the arthroscope
  • d) Pushbutton for asynchronous triggering of the optical tracking system and PointGrey cameras
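
As an illustration of case b), the sketch below shows how a PointGrey camera could be set to wait for an external trigger on its GPIO connector, so that it exposes one frame per pulse coming from the synchronization board. It uses the FlyCapture2 SDK; the camera index and GPIO pin below are assumptions, and the SDK version actually used in ArthroSync is not specified here.

    // Minimal sketch (assumptions: FlyCapture2 SDK, first camera on the bus,
    // trigger signal wired to GPIO pin 0; error handling omitted for brevity).
    #include <FlyCapture2.h>

    int main()
    {
        FlyCapture2::BusManager busMgr;
        FlyCapture2::PGRGuid guid;
        busMgr.GetCameraFromIndex(0, &guid);         // first PointGrey camera found

        FlyCapture2::Camera cam;
        cam.Connect(&guid);

        // Enable external (hardware) triggering on GPIO0: the camera now
        // exposes one frame per pulse produced by the synchronization board.
        FlyCapture2::TriggerMode trigger;
        cam.GetTriggerMode(&trigger);
        trigger.onOff  = true;
        trigger.mode   = 0;                          // standard external trigger
        trigger.source = 0;                          // GPIO pin 0 (assumption)
        cam.SetTriggerMode(&trigger);

        cam.StartCapture();
        FlyCapture2::Image frame;
        cam.RetrieveBuffer(&frame);                  // blocks until a trigger arrives

        cam.StopCapture();
        cam.Disconnect();
        return 0;
    }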



SOFTWARE ARCHITECTURE:

ArthroSync can be used in Windows (under Cygwin) or in Linux (the Linux version is currently under development). A simplified scheme of the application is shown below.

Application scheme


The user writes the configure.xml file, which is parsed by the ArthroNavMain component. This configuration file is used for selecting the devices that will be considered (e.g. optical tracking system, arthroscope), setting their parameters (e.g. frame frequency, number of rigid bodies), and introducing user-defined modules into the application loop (visualization, storage, and/or processing).
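
The actual schema of configure.xml is not described here. As a purely hypothetical illustration, a configuration of this kind could be read with an off-the-shelf parser such as TinyXML; all element and attribute names in the sketch below are assumptions.

    // Minimal sketch (assumptions: TinyXML parser; hypothetical schema in which
    // each <device> element selects one data source and carries its parameters:
    //   <configuration>
    //     <device type="optotrak" rigid_bodies="2"/>
    //     <device type="arthroscope" fps="25"/>
    //     <module name="visualization"/>
    //   </configuration>
    #include <cstdio>
    #include <tinyxml.h>

    int main()
    {
        TiXmlDocument doc("configure.xml");
        if (!doc.LoadFile())
            return 1;                                 // missing or malformed file

        TiXmlElement* root = doc.RootElement();
        if (!root)
            return 1;

        // Enumerate the devices selected by the user and their parameters.
        for (TiXmlElement* dev = root->FirstChildElement("device");
             dev != 0; dev = dev->NextSiblingElement("device"))
        {
            const char* type = dev->Attribute("type");
            int fps = 0;
            dev->QueryIntAttribute("fps", &fps);      // stays 0 if the attribute is absent
            std::printf("device: %s (fps=%d)\n", type ? type : "?", fps);
        }

        // Enumerate the user-defined modules to insert into the application loop.
        for (TiXmlElement* mod = root->FirstChildElement("module");
             mod != 0; mod = mod->NextSiblingElement("module"))
        {
            std::printf("module: %s\n", mod->Attribute("name"));
        }
        return 0;
    }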

FUNCTIONALITIES OF THE CURRENT ARTHROSYNC VERSION:

The current version (v1.0) performs synchronous acquisition of data from the optical tracker and the cameras (DV-25 video from the arthroscope and IEEE 1394 video from standard cameras). In addition, we implemented visualization and storage modules that can easily be inserted into the software application. Tests were performed to confirm the synchronism and the real-time processing.
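
The module mechanism itself is not detailed in the text; a plug-in interface of the following kind would be one natural way to let visualization, storage, or processing modules be inserted into the application loop. The class and member names below are hypothetical.

    // Minimal sketch (assumption: hypothetical plug-in interface; ArthroSync's
    // real module API is not documented here). Each synchronized sample bundles
    // the video frames with the corresponding poses, and every registered module
    // is called once per sample inside the application loop.
    #include <vector>

    struct SyncedSample {
        // one image buffer per camera and one pose (4x4, row-major) per tracked tool
        std::vector<const unsigned char*> frames;
        std::vector<double>               poses;
        double                            timestamp;   // seconds
    };

    class Module {
    public:
        virtual ~Module() {}
        virtual void process(const SyncedSample& sample) = 0;  // visualization, storage, ...
    };

    // Application loop body: call every user-defined module on the current sample.
    void dispatchSample(const std::vector<Module*>& modules, const SyncedSample& sample)
    {
        for (size_t i = 0; i < modules.size(); ++i)
            modules[i]->process(sample);
    }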

An example of a visualization module is shown in the video. The application has three "device" windows:
a) Top left - arthroscopic video
b) Bottom left - Flea2 PointGrey camera
c) Right - an animation that uses the estimated poses of the bone and arthroscope
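
A visualization module of this kind could be implemented, for example, on top of OpenCV's HighGUI windows. The sketch below only suggests how the three windows might be refreshed each time a synchronized sample arrives; the window names are hypothetical and the frames and the rendered animation are assumed to be produced elsewhere in the application.

    // Minimal sketch (assumption: OpenCV used only for display; the images are
    // filled in by the acquisition and rendering parts of the application).
    #include <opencv2/highgui/highgui.hpp>

    void showDeviceWindows(const cv::Mat& arthroscopeFrame,
                           const cv::Mat& flea2Frame,
                           const cv::Mat& poseAnimation)
    {
        cv::imshow("Arthroscope (top left)", arthroscopeFrame);
        cv::imshow("Flea2 PointGrey camera (bottom left)", flea2Frame);
        cv::imshow("Pose animation (right)", poseAnimation);
        cv::waitKey(1);   // let HighGUI refresh the windows without blocking
    }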