ffmpeg: adaptively stretch contrast

Consider a (color) video recorded under poor conditions. Its frames' brightest pixels are well below pure white, its darkest ones well above pure black. Can each frame's contrast be stretched closer to the full brightness range, like an audio compressor/limiter? Can this be done in ffmpeg, using some of its many filters?

Of course, simply slamming each frame's brightest value to white and darkest to black makes the exposure stutter like something shot in 1910. How much to stretch each frame's contrast should depend on the brightest and darkest values of the surrounding frames, probably ±10 seconds.

One could extract all the frames, get each frame's white and black points with imagemagick, store them in an array, smooth the array, stretch each frame's contrast again with imagemagick, and rebuild the video. But that seems positively medieval.
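The smoothing step of that manual pipeline, at least, is simple to sketch. Below is a minimal, hypothetical version assuming the per-frame black points have already been measured and dumped to a text file (the values are made up); a plain moving average over a small frame window does the job:

```shell
# Hypothetical per-frame black points (0-255), one value per frame
printf '%s\n' 12 40 11 13 38 12 > blackpoints.txt

# Smooth with a moving average over a window of +-2 frames,
# shrinking the window at the clip boundaries
awk '{v[NR]=$1} END {
  w = 2
  for (i = 1; i <= NR; i++) {
    s = 0; n = 0
    for (j = i - w; j <= i + w; j++)
      if (j >= 1 && j <= NR) { s += v[j]; n++ }
    printf "%.1f\n", s / n
  }
}' blackpoints.txt > smoothed.txt

cat smoothed.txt
```

The white points would be smoothed the same way, and the clip's frame rate determines how many frames a ±10 second window spans (e.g. 500 frames at 25 fps).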


Solution 1:

The histeq filter comes close to what you want. But you can also look at the various flavors of LUT filters, and also lut3d.
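A minimal histeq invocation might look like this. To keep the example self-contained, the input is a synthetic low-contrast clip generated with the lavfi testsrc2 source; the filenames and the strength/intensity values are illustrative, not a tuned recommendation:

```shell
# Make a short synthetic clip, then crush its luma range to simulate
# poorly exposed footage (y' = 70 + val/2 keeps Y within roughly 70-197)
ffmpeg -y -f lavfi -i "testsrc2=duration=2:size=320x240:rate=25" \
       -vf "lutyuv=y='70+val/2'" low_contrast.mp4

# histeq equalizes each frame's histogram; a lower strength tempers
# the per-frame effect and reduces pumping between frames
ffmpeg -y -i low_contrast.mp4 \
       -vf "histeq=strength=0.1:intensity=0.2" equalized.mp4
```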

The lut filters are more customizable in that you can actually look at a frame and determine what values to use. Moreover, if there is wide variance in the input material vis-à-vis luminance levels, the output might judder when a "per frame" correction is applied. Applying a uniform strength of correction to all frames would produce a smoother result.
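A uniform correction of that sort can be expressed with lutyuv once the overall luma range is known (it can be measured, for instance, with the signalstats filter, which tags frames with YMIN/YMAX metadata). A sketch, again using a synthetic input and a made-up "measured" range of roughly 70–197:

```shell
# Synthetic low-contrast input (same trick as above: compress Y into ~70-197)
ffmpeg -y -f lavfi -i "testsrc2=duration=1:size=320x240:rate=25" \
       -vf "lutyuv=y='70+val/2'" dim.mp4

# One fixed stretch applied to every frame: map [70,197] onto [0,255];
# clip() guards against out-of-range results
ffmpeg -y -i dim.mp4 \
       -vf "lutyuv=y='clip((val-70)*255/127,0,255)'" stretched.mp4
```

Because the same expression is applied to every frame, there is no per-frame adaptation and therefore no judder; the trade-off is that the chosen range must suit the whole clip.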