One unavoidable drawback of multi-dimensional transfer functions is the increased memory consumption needed to store all of the transfer function variables at each voxel sample point. This storage is required because a hardware-based approach cannot compute these quantities on the fly. Combined with the quantized normal volume (which takes three bytes per voxel instead of two, due to pixel field alignment restrictions), we require six bytes per voxel to represent the dataset. This restricts the current implementation, with 104 MB of texture memory, to 256×256×128 datasets. Future work will expand the dataset size using parallel hardware rendering methods.
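The six-byte-per-voxel budget can be checked with a short calculation. The following sketch is illustrative only (it is not part of the implementation described above) and simply computes the texture footprint for a given volume resolution:

```python
def dataset_bytes(dim_x, dim_y, dim_z, bytes_per_voxel=6):
    """Texture memory needed to store the per-voxel data
    (3 bytes quantized normal + 3 bytes transfer function variables)."""
    return dim_x * dim_y * dim_z * bytes_per_voxel

# 256 x 256 x 128 voxels at 6 bytes each -> 48 MB,
# which fits within the 104 MB of texture memory.
mb = dataset_bytes(256, 256, 128) / (1024 * 1024)
```

At this rate a 256×256×128 volume occupies 48 MB, while doubling each dimension would require 384 MB and exceed the available texture memory, which motivates the parallel rendering approach mentioned above.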
Utilizing multi-dimensional transfer functions opens the possibility of rendering multi-variate volume data, such as a fluid flow simulation or meteorological data. One challenge here is determining which quantities are mapped to the transfer function axes, and whether to use data values directly, or some dependent quantity, such as a spatial derivative.
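As one concrete example of a dependent quantity, a spatial derivative such as the gradient magnitude of a scalar field could serve as a transfer function axis. The following central-difference sketch is purely illustrative; the nested-list volume layout `vol[z][y][x]` is an assumption for the example, not a format used by the system:

```python
def gradient_magnitude(vol, x, y, z):
    """Central-difference gradient magnitude of a scalar field
    stored as a nested list vol[z][y][x] (interior voxels only)."""
    dx = (vol[z][y][x + 1] - vol[z][y][x - 1]) * 0.5
    dy = (vol[z][y + 1][x] - vol[z][y - 1][x]) * 0.5
    dz = (vol[z + 1][y][x] - vol[z - 1][y][x]) * 0.5
    return (dx * dx + dy * dy + dz * dz) ** 0.5
```

Such a derived quantity would be precomputed and stored per voxel, since, as noted above, the hardware cannot compute it on the fly.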
Future commodity graphics cards will provide an avenue for expanded rendering features. Specifically, both the nVidia and ATI graphics cards support a number of per-pixel operations that can significantly enhance the computation of diffuse and specular shading (assuming a small number of light sources). These features, however, come at the expense of redundancy and truncation in the normal representation. Pixel texture shading, on the other hand, allows arbitrarily complex lighting and non-photorealistic effects. The trade-off between these two representations is normal interpolation: quantized normals do not interpolate easily, whereas vector component normals do. Vector component normals, however, require a normalization step after component-wise interpolation if the dot product is to accurately compute the diffuse and specular lighting components. This normalization step is not yet supported by these cards.
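The interpolation issue can be made concrete with a small sketch. The Python code below is a hypothetical illustration of the missing hardware step, not shader code: component-wise interpolation of two unit normals yields a shortened vector, and without renormalization the dot product underestimates the diffuse term.

```python
import math

def lerp_normal(n0, n1, t):
    """Component-wise interpolation of two unit normals, followed by
    the renormalization step that current hardware does not provide."""
    v = [(1 - t) * a + t * b for a, b in zip(n0, n1)]
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def diffuse(normal, light):
    """Lambertian diffuse term: max(N . L, 0) for unit vectors."""
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

# Halfway between +x and +y, the raw interpolant (0.5, 0.5, 0)
# has length ~0.707; renormalization restores a unit normal.
n = lerp_normal([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], 0.5)
```

Performing this division per pixel is exactly the operation that the lighting computation needs and that the cards discussed above cannot yet apply after interpolation.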
Direct manipulation widgets and spatial interaction techniques lend themselves well to immersive environments. We would like to experiment with dual-domain interaction in a stereo, tracked environment. We speculate that an immersive environment could make interacting with a 3D transfer function more natural and intuitive. We would also like to perform usability studies on our direct manipulation widgets and dual-domain interaction technique, as well as perceptual studies on 2D and 3D transfer functions for volume rendering.