Interactive Manipulation of Real-Time Visualisation of Medical Volume Data Using Two-Handed VR Techniques

Kai Köchy -- kai(at)cs.tu-berlin.de
Manfred Krauss -- krauss(at)ukbf.fu-berlin.de

5/29/1998
TU Berlin, Computer Science, CG


Abstract:

We propose a system for interactive two-handed manipulation of a real-time visualisation of medical volume data. To achieve real-time visualisation, views from every angle are precalculated. A cutting plane is modelled using VR techniques; it can be moved through the voxel data set and shows the original medical information. The left hand supports the right hand's positioning in 3D space. Furthermore, two specially designed gloves (gauntlets) for real-time user interaction via gestures are employed.

Introduction:

For complex medical diagnoses, a single x-ray of the patient is no longer sufficient; images from multiple angles are needed to support decisions. A classical x-ray is a superimposition of many organs and bones. In computed tomography, single image slices of the relevant body region are acquired and assembled into a data volume.

The pixel information in the image slices represents volume information called voxels (volume elements). Volume rendering offers the opportunity to display surfaces and transparent volumes at the same time, as described in [DRCA88] and [LEVO90]. Surface rendering, in contrast, takes only the voxels on the outer boundary of the segmented object into account, not the voxels inside.
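The cited volume-rendering literature accumulates colour and opacity along viewing rays, which is what lets opaque surfaces and transparent volumes appear together. A minimal front-to-back compositing sketch of this standard technique (not necessarily the authors' exact algorithm; names are illustrative):

```python
# Front-to-back compositing of classified voxel samples along one ray.
# Each sample is a (colour, opacity) pair, ordered front to back.

def composite_ray(samples):
    """Accumulate colour and opacity along a ray, front to back."""
    colour, alpha = 0.0, 0.0
    for c, a in samples:
        colour += (1 - alpha) * a * c   # remaining transparency weights the sample
        alpha += (1 - alpha) * a
        if alpha >= 0.99:               # early ray termination: ray is opaque
            break
    return colour, alpha
```

A fully opaque first sample hides everything behind it, while semi-transparent samples let deeper voxels contribute; this is the property that distinguishes volume rendering from pure surface rendering.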

Starting from a program for interactive 3D visualisation of medical volume data using precalculated views, developed by Patrick Neumann [NEUM96], a two-handed user interface in the spirit of [SZGE97] was added. The newly introduced gauntlets and two 3D trackers are integrated for easier and more intuitive interaction.

Method:

In the left hand the user holds an imaginary cube containing the medical volume data, which can be rotated like a globe around the azimuth and elevation axes.

Volume visualisation is performed by any preferred rendering algorithm; views along the latitude and longitude of the globe are precalculated.
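The precalculation scheme can be sketched as a lookup table keyed by quantised viewing angles: views are rendered offline on a fixed angular grid, and at run time the tracker angles are snapped to the nearest grid point. This is a minimal sketch under assumed parameters (the paper does not state the angular step; all names are hypothetical):

```python
# Precalculated-view lookup: render once per grid point, snap at run time.

AZIMUTH_STEP = 10    # degrees between precalculated views (assumed)
ELEVATION_STEP = 10

def render_view(azimuth, elevation):
    """Placeholder for the offline volume renderer."""
    return f"image(az={azimuth}, el={elevation})"

# Offline phase: one rendered image per (azimuth, elevation) grid point.
views = {
    (az, el): render_view(az, el)
    for az in range(0, 360, AZIMUTH_STEP)
    for el in range(-90, 91, ELEVATION_STEP)
}

def lookup_view(azimuth, elevation):
    """Snap arbitrary tracker angles to the nearest precalculated view."""
    az = round(azimuth / AZIMUTH_STEP) * AZIMUTH_STEP % 360
    el = max(-90, min(90, round(elevation / ELEVATION_STEP) * ELEVATION_STEP))
    return views[(az, el)]
```

The run-time cost is a single dictionary lookup, which is what makes the rotation interaction real-time regardless of how expensive the offline renderer is.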

On the two-dimensional view, a region of interest can be defined with the right hand by choosing a start point and extruding a rectangle. With a special gesture of the left hand, the globe changes into a cutting plane.

Within the previously marked region, the user can move the modelled cutting plane through the voxel data set. The actual intersection of the plane with the data cube is shown as a grey-level image containing the original voxel information.
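Extracting that grey-level image amounts to resampling the voxel cube on a 2D grid spanned by the plane. A minimal sketch (not the authors' code; a tiny synthetic volume and nearest-neighbour lookup stand in for the real data and interpolation):

```python
# Sample the voxel cube where the cutting plane intersects it.
# The plane is given by an origin and two in-plane basis vectors u, v.

def make_volume(n):
    """Synthetic n x n x n volume whose voxel value encodes its depth z."""
    return [[[z for x in range(n)] for y in range(n)] for z in range(n)]

def sample_plane(volume, origin, u, v, width, height):
    """Return a width x height grey-level image on the plane origin + i*u + j*v."""
    n = len(volume)
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            x = origin[0] + i * u[0] + j * v[0]
            y = origin[1] + i * u[1] + j * v[1]
            z = origin[2] + i * u[2] + j * v[2]
            # Nearest-neighbour lookup, clamped to the cube bounds.
            xi = max(0, min(n - 1, round(x)))
            yi = max(0, min(n - 1, round(y)))
            zi = max(0, min(n - 1, round(z)))
            row.append(volume[zi][yi][xi])
        image.append(row)
    return image

vol = make_volume(8)
# An axial cut at depth z = 3: every pixel shows the original voxel value 3.
cut = sample_plane(vol, origin=(0, 0, 3), u=(1, 0, 0), v=(0, 1, 0),
                   width=8, height=8)
```

Oblique cuts work the same way with non-axis-aligned u and v; a production system would use trilinear instead of nearest-neighbour interpolation.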

The initially planar cutting plane can be deformed with the right hand: the user picks one or more control points of the plane and translates them along one of the three axes in space. A new surface based on linear interpolation or a spline patch is then generated.

Conclusion:

This method allows the simultaneous visualisation of voxels at different depths of the volume data. For example, it is possible to view the patient's spinal column in a single cut. The rendered image outside the region of interest preserves the spatial correspondence within the medical context.

References:

[DRCA88] Drebin, R.A.; Carpenter, L.; Hanrahan, P.; Volume Rendering; Computer Graphics, Vol. 22(4); 1988; pages 65-74

[LEVO90] Levoy, M.; A hybrid ray tracer for rendering polygon and volume data; IEEE Computer Graphics & Applications, Vol. 10(3); 1990; pages 33-40

[NEUM96] Neumann, Patrick; final thesis: "Interaktive dreidimensionale Visualisierung von medizinischen Volumendaten durch Vorberechnung von Ansichten"; Technical University of Berlin, Germany; November 1996

[SZGE97] Szalavari, Zsolt; Gervautz, Michael; The Personal Interaction Panel - a Two-Handed Interface for Augmented Reality; Eurographics '97, Volume 16, Number 3; Blackwell Publishers; 1997



Oral presentation at EuroPACS'98, Barcelona, Spain