Feature Flow (Practical Temporal Consistency)
We present a simple and efficient method for introducing temporal consistency into a large class of optimization-driven, image-based computer graphics problems. Our method extends recent work in edge-aware filtering, approximating costly global regularization with a fast iterative joint filtering operation.
We build on a recent advance in edge-aware filtering, the domain transform, which embeds geodesic distances in a transformed domain where filtering can be performed with a separable Gaussian filter. As a result, extending the filter from the spatial to the spatiotemporal domain incurs only a linear (rather than quadratic) increase in runtime.
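To make the filtering idea concrete, the following is a minimal NumPy sketch of one horizontal pass of a recursive domain-transform filter in the spirit of Gastal and Oliveira. The function name, parameter defaults, and single-axis treatment are illustrative assumptions, not this paper's implementation, which alternates separable passes over rows, columns, and time.

```python
# A minimal sketch (hypothetical names) of edge-aware filtering along image rows
# using the domain transform with a recursive (exponential) filter.
import numpy as np

def domain_transform_rows(img, guide, sigma_s=60.0, sigma_r=0.4):
    """Filter `img` (H x W x C, float) along rows, guided by `guide` (H x W x C, float in [0,1])."""
    H, W = guide.shape[:2]
    # Horizontal color derivatives of the guide accumulate geodesic distance.
    dIdx = np.zeros((H, W))
    dIdx[:, 1:] = np.abs(np.diff(guide, axis=1)).sum(axis=2)
    # Domain-transform increments: 1 + (sigma_s / sigma_r) * |I'(x)|.
    dt = 1.0 + (sigma_s / sigma_r) * dIdx
    # Feedback coefficient of the recursive filter (here sigma_s is used directly;
    # the full algorithm shrinks it over several iterations).
    a = np.exp(-np.sqrt(2.0) / sigma_s)
    out = img.astype(np.float64).copy()
    # Left-to-right pass: out[x] = (1 - a^d) * in[x] + a^d * out[x-1].
    for x in range(1, W):
        w = a ** dt[:, x]
        out[:, x] = (1.0 - w)[:, None] * out[:, x] + w[:, None] * out[:, x - 1]
    # Right-to-left pass for a symmetric response.
    for x in range(W - 2, -1, -1):
        w = a ** dt[:, x + 1]
        out[:, x] = (1.0 - w)[:, None] * out[:, x] + w[:, None] * out[:, x + 1]
    return out
```

Because each pass is a 1D recursion over the transformed coordinates, adding the temporal axis only adds another set of 1D passes, which is why the cost grows linearly rather than quadratically.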
This representation yields substantial gains in both memory requirements and running time. These gains allow us to process entire shots at once, taking advantage of supporting information across distant frames, which is difficult with existing approaches due to the computational burden of video data.
Our method filters along motion paths using an iterative approach that simultaneously uses and estimates per-pixel optical flow vectors. We demonstrate its utility by creating temporally consistent results for a number of applications, including optical flow, disparity estimation, colorization, scribble propagation, sparse data up-sampling, and visual saliency computation.
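As an illustration of filtering along motion paths, the sketch below warps the previous frame's filtered result to the current frame along dense optical flow and blends the two, repeating for several passes. OpenCV's Farneback flow is used purely as a stand-in; the method described above jointly estimates and filters its own flow, and the function names, blend weight, and iteration count here are assumptions for the sketch.

```python
# A minimal sketch (hypothetical names) of iterative temporal filtering along
# motion paths: re-estimate flow, warp the previous result, blend with the data.
import numpy as np
import cv2

def warp_with_flow(prev, flow):
    """Backward-warp `prev` onto the current frame's grid using dense flow (H x W x 2)."""
    H, W = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(W), np.arange(H))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(prev, map_x, map_y, interpolation=cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_REPLICATE)

def temporal_filter(frames, data, n_iters=3, blend=0.7):
    """Smooth per-frame `data` (e.g. disparity maps) along motion paths.

    `frames` are grayscale uint8 video frames; `data` are float32 maps of the same size.
    """
    data = [d.astype(np.float32) for d in data]
    for _ in range(n_iters):
        # Flow from frame t back to frame t-1, re-estimated each pass.
        flows = [cv2.calcOpticalFlowFarneback(frames[t], frames[t - 1], None,
                                              0.5, 3, 15, 3, 5, 1.2, 0)
                 for t in range(1, len(frames))]
        out = [data[0]]
        for t in range(1, len(data)):
            # Align the previous filtered result to frame t, then blend.
            warped = warp_with_flow(out[t - 1], flows[t - 1])
            out.append(blend * warped + (1.0 - blend) * data[t])
        data = out
    return data
```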