Amortized Supersampling
ACM Transactions on Graphics (Proc. SIGGRAPH Asia), December 2009
Lei Yang, Diego Nehab, Pedro V. Sander, Pitchaya Sitthi-amorn, Jason Lawrence, Hugues Hoppe

Abstract
We present a real-time rendering scheme that reuses shading samples
from earlier time frames to achieve practical antialiasing of
procedural shaders. Using a reprojection strategy, we maintain several
sets of shading estimates at subpixel precision, and incrementally
update them so that for most pixels only one new shaded
sample is evaluated per frame. The key difficulty is to prevent accumulated
blurring during successive reprojections. We present a
theoretical analysis of the blur introduced by reprojection methods.
Based on this analysis, we introduce a nonuniform spatial filter,
an adaptive recursive temporal filter, and a principled scheme for
locally estimating the spatial blur. Our scheme is appropriate for
antialiasing shading attributes that vary slowly over time. It works
in a single rendering pass on commodity graphics hardware, and
offers results that surpass 4×4 stratified supersampling in quality,
at a fraction of the cost.
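The recursive temporal filtering idea in the abstract can be sketched as a per-pixel exponentially weighted moving average: each frame, the single newly shaded sample is blended into the reprojected history. The snippet below is an illustrative simplification, not the paper's GPU implementation; reprojection is taken as the identity (a static scene), and the blend weight `alpha` is a hypothetical parameter standing in for the paper's adaptively chosen weight.

```python
def update_estimate(history, new_sample, alpha=0.25):
    """Blend one newly shaded sample into the reprojected history.

    history:    previous frame's accumulated estimate for this pixel,
                assumed already reprojected into the current frame
    new_sample: the single new shading sample evaluated this frame
    alpha:      blend weight; smaller values average over more past
                frames (stronger antialiasing) but respond more
                slowly to temporal shading changes
    """
    return alpha * new_sample + (1.0 - alpha) * history

# For a shading signal that is constant over time, the recursive
# filter converges to the true value, so the amortized estimate is
# consistent while costing only one shading evaluation per frame.
estimate = 0.0
for _ in range(100):
    estimate = update_estimate(estimate, 1.0)
```

A fixed `alpha` illustrates the blur-versus-responsiveness trade-off the paper analyzes: too small and the estimate lags changing shading, too large and too few samples are averaged; the paper's adaptive recursive filter adjusts this weight locally.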