Konstantin Shkurko

Ph.D.
School of Computing
University of Utah

Advisors: Erik Brunvand, Cem Yuksel

School of Computing
50 S Central Campus Dr, RM 3190
Salt Lake City, UT 84112

kshkurko AT cs DOT utah DOT edu
kis9 AT cornell DOT edu

Time Interval Ray Tracing for Motion Blur

Konstantin Shkurko, Cem Yuksel, Daniel Kopta, Ian Mallett, and Erik Brunvand

TVCG (IEEE Transactions on Visualization and Computer Graphics), 2017


Abstract: We introduce a new motion blur computation method for ray tracing that provides an analytical approximation of motion blurred visibility per ray. Rather than relying on timestamped rays and Monte Carlo sampling to resolve the motion blur, we associate a time interval with rays and directly evaluate when and where each ray intersects with animated object faces. Based on our simplifications, the volume swept by each animated face is represented using a triangulation of the surface of this volume. Thus, we can resolve motion blur through ray intersections with stationary triangles, and we can use any standard ray tracing acceleration structure without modifications to account for the time dimension. Rays are intersected with these triangles to analytically determine the time interval and positions of the intersections with the moving objects. Furthermore, we explain an adaptive strategy to efficiently shade the intersection intervals. As a result, we can produce noise-free motion blur for both primary and secondary rays. We also provide a general framework for emulating various camera shutter mechanisms and an artistic modification that amplifies the visibility of moving objects for emphasizing the motion in videos or static images.
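
Code sketch: To make the core idea concrete, below is a minimal, illustrative C++ sketch (not the paper's implementation or exact formulation). It assumes linear per-vertex motion over a normalized shutter interval [0, 1], triangulates the boundary of the volume swept by one animated triangle, and intersects a ray against those stationary triangles; the types and functions (Vec3, BoundaryTri, sweptBoundary, visibleInterval) and the barycentric time-recovery heuristic on the side faces are our own simplifications for illustration.

// Minimal illustrative sketch (not the paper's code): triangulate the boundary
// of the volume swept by one triangle under linear per-vertex motion over the
// shutter interval t in [0, 1], then intersect a ray against those stationary
// triangles to recover an approximate [tEnter, tExit] visibility interval.
#include <algorithm>
#include <cmath>
#include <optional>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray { Vec3 o, d; };  // origin and direction

// One stationary triangle on the swept-volume boundary. Each vertex stores the
// shutter time (0 or 1) of the animated vertex it came from, so a hit's
// barycentric coordinates yield an approximate time on the side faces.
struct BoundaryTri { Vec3 p0, p1, p2; float t0, t1, t2; };

// Build the triangulated boundary of the prism-like volume swept by a triangle
// moving from (a0, b0, c0) at shutter open to (a1, b1, c1) at shutter close:
// two caps plus three side quads, each quad split into two triangles.
std::vector<BoundaryTri> sweptBoundary(Vec3 a0, Vec3 b0, Vec3 c0,
                                       Vec3 a1, Vec3 b1, Vec3 c1) {
    std::vector<BoundaryTri> tris;
    tris.push_back({a0, b0, c0, 0.0f, 0.0f, 0.0f});  // cap at t = 0
    tris.push_back({a1, b1, c1, 1.0f, 1.0f, 1.0f});  // cap at t = 1
    const Vec3 sides[3][4] = {{a0, b0, b1, a1}, {b0, c0, c1, b1}, {c0, a0, a1, c1}};
    for (const auto& q : sides) {
        tris.push_back({q[0], q[1], q[2], 0.0f, 0.0f, 1.0f});
        tris.push_back({q[0], q[2], q[3], 0.0f, 1.0f, 1.0f});
    }
    return tris;
}

// Standard Moeller-Trumbore ray-triangle test; returns (rayT, shutterTime),
// where shutterTime interpolates the per-vertex times by barycentrics.
std::optional<std::pair<float, float>> intersect(const Ray& r, const BoundaryTri& t) {
    const Vec3 e1 = t.p1 - t.p0, e2 = t.p2 - t.p0;
    const Vec3 p = cross(r.d, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return std::nullopt;  // ray parallel to triangle
    const float inv = 1.0f / det;
    const Vec3 s = r.o - t.p0;
    const float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    const Vec3 q = cross(s, e1);
    const float v = dot(r.d, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float rayT = dot(e2, q) * inv;
    if (rayT < 0.0f) return std::nullopt;  // hit behind the ray origin
    const float shutterTime = (1.0f - u - v) * t.t0 + u * t.t1 + v * t.t2;
    return std::make_pair(rayT, shutterTime);
}

// Collect the shutter-time interval over which the ray crosses the swept volume.
std::optional<std::pair<float, float>> visibleInterval(const Ray& r,
        const std::vector<BoundaryTri>& boundary) {
    float tEnter = 1.0f, tExit = 0.0f;
    bool hit = false;
    for (const auto& tri : boundary) {
        if (auto h = intersect(r, tri)) {
            tEnter = std::min(tEnter, h->second);
            tExit = std::max(tExit, h->second);
            hit = true;
        }
    }
    if (!hit) return std::nullopt;
    return std::make_pair(tEnter, tExit);
}

In the paper, these per-face intervals drive adaptive shading and the emulation of different shutter mechanisms; the sketch above only recovers an approximate visibility interval and uses ad-hoc names.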

Files: preprint pdf (high) (pdf, 41 MB), preprint pdf (low) (pdf, 8.5 MB), BibTex (bib, 1 KB), TVCG, supplement pdf (high) (pdf, 42 MB), supplement pdf (low) (pdf, 7.5 MB)

Media: Videos were created to illustrate the speed of the entire system. We tested three datasets (930 and 770 slices at a resolution of 512²). Output images at a resolution of 1024² were generated at rates above 10 Hz, with the actual refresh rate shown in the top-right corner of the videos. We used 256 cores in parallel, each running at 2.6 GHz, to compute each frame.

The video is available at 1080p resolution (H.264, mp4, 52 MB).

Time Interval Ray Tracing for Motion Blur video

Images: Left: Stratified sampling, Center: Ours (anti-aliased), Right: Reference

Slinky example

Dragon Sponza example

Acknowledgements: This material is supported in part by the National Science Foundation under Grant No. 1409129. Thiago Ize and Peter Shirley provided helpful feedback. Cem Yuksel provided the Slinky, Clothball, and Lightsaber scenes, and combined the Sponza atrium by Marko Dabrovic with the Stanford Dragon for the Dragon-Sponza scene. We also thank the anonymous reviewers for their time and helpful feedback.