How to apply "filters" to AVCaptureVideoPreviewLayer

Probably the most performant way of handling this would be to use OpenGL ES for filtering and display of these video frames. You won't be able to do much with an AVCaptureVideoPreviewLayer directly, aside from adjusting its opacity when overlaid with another view or layer.

I have a sample application here where I grab frames from the camera and apply OpenGL ES 2.0 shaders to process the video in realtime for display. In this application (explained in detail here), I was using color-based filtering to track objects in the camera view, but others have modified this code to do some neat video processing effects. All GPU-based filters in this application that display to the screen run at 60 FPS on my iPhone 4.
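
For a sense of what one of these filters looks like, here is a minimal, hypothetical GLSL ES 2.0 fragment shader, stored as an Objective-C string for upload via glShaderSource(). It keeps only the pixels near a target color, in the spirit of the color-based filtering described above; the uniform names and threshold logic are illustrative, not taken from that sample application.

// Hypothetical color-threshold fragment shader (not the sample app's actual shader).
// Pixels within 'threshold' of 'targetColor' pass through; everything else is zeroed out.
static NSString *const kColorThresholdFragmentShader =
    @"precision highp float;\n"
    @"varying vec2 textureCoordinate;\n"
    @"uniform sampler2D videoFrame;\n"
    @"uniform vec3 targetColor;\n"
    @"uniform float threshold;\n"
    @"void main()\n"
    @"{\n"
    @"    vec4 pixelColor = texture2D(videoFrame, textureCoordinate);\n"
    @"    float d = distance(pixelColor.rgb, targetColor);\n"
    @"    gl_FragColor = (d < threshold) ? pixelColor : vec4(0.0);\n"
    @"}";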

The only iOS device out there that supports video, yet doesn't have an OpenGL ES 2.0 capable GPU, is the iPhone 3G. If you need to target that device as well, you might be able to take the base code for video capture and generation of OpenGL ES textures, and then use the filter code from Apple's GLImageProcessing sample application. That application is built around OpenGL ES 1.1, support for which is present on all iOS devices.

However, I highly encourage looking at the use of OpenGL ES 2.0 for this, because you can pull off many more kinds of effects using shaders than you can with the fixed-function OpenGL ES 1.1 pipeline.

(Edit: 2/13/2012) As an update on the above, I've now created an open source framework called GPUImage that encapsulates this kind of custom image filtering. It also handles capturing video and displaying it to the screen after being filtered, requiring as few as six lines of code to set all of this up. For more on the framework, you can read my more detailed announcement.
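
To give a feel for the API, here is a minimal sketch of a live filtered camera preview with GPUImage, along the lines of the framework's README; the sepia filter and view-controller context are just an example, and videoCamera should be kept in a strong property so it isn't deallocated.

// Minimal GPUImage camera -> filter -> view chain (sketch; keep videoCamera strongly referenced).
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                                       cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];   // any GPUImage filter works here
GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filteredView];

[videoCamera addTarget:sepiaFilter];      // camera output feeds the filter
[sepiaFilter addTarget:filteredView];     // filtered frames are drawn to the on-screen view
[videoCamera startCameraCapture];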


I would recommend looking at the Rosy Writer example from the iOS Developer Library. Brad Larson's GPUImage library is pretty awesome, but it seems like a little overkill for this question.

If you are just interested in adding OpenGL shaders (a.k.a. filters) to an AVCaptureVideoPreviewLayer, the workflow is to send the output of the capture session to an OpenGL view for rendering.

// Ask the video data output for the pixel format the renderer expects, and deliver frames on a serial queue.
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(_renderer.inputPixelFormat) };
[videoOut setSampleBufferDelegate:self queue:_videoDataOutputQueue];
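
For the delegate to actually be called, the output has to be attached to a running capture session; a rough sketch follows, assuming a _captureSession that has already been configured with a camera input (the queue name is illustrative):

// Illustrative session wiring; _videoDataOutputQueue must exist before the setSampleBufferDelegate: call above.
_videoDataOutputQueue = dispatch_queue_create("videoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
videoOut.alwaysDiscardsLateVideoFrames = YES;   // drop late frames instead of stalling the preview

if ([_captureSession canAddOutput:videoOut]) {
    [_captureSession addOutput:videoOut];
}
[_captureSession startRunning];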

Then in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, send the sample buffer to the OpenGL renderer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef sourcePixelBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
    [_renderer copyRenderedPixelBuffer:sourcePixelBuffer];
}

In the OpenGL renderer, attach the sourcePixelBuffer to a texture, and then you can filter it in your OpenGL shaders. A shader is a program that runs on a per-pixel basis. The Rosy Writer example also demonstrates filtering techniques other than OpenGL.
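
A rough sketch of that attachment using the Core Video texture cache, assuming a BGRA pixel buffer, an existing EAGLContext, and a CVOpenGLESTextureCacheRef created once up front (the names here are illustrative, not the ones Rosy Writer uses):

// Upload the captured pixel buffer into a GL texture via the texture cache.
// _textureCache was created earlier with CVOpenGLESTextureCacheCreate() and the renderer's EAGLContext.
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            _textureCache,
                                                            sourcePixelBuffer,
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            (GLsizei)CVPixelBufferGetWidth(sourcePixelBuffer),
                                                            (GLsizei)CVPixelBufferGetHeight(sourcePixelBuffer),
                                                            GL_BGRA,
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &texture);
if (err == kCVReturnSuccess) {
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... bind the filtering shader program and draw a full-screen quad here ...
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}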


Apple's AVCamFilter example does it all.
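
The sample itself is written in Swift and ships both Metal- and Core Image-based renderers, but the core idea can be sketched in Objective-C to match the rest of this thread: grab the pixel buffer from the video data output and run it through a CIFilter before display. The filter name and buffer handling below are just an illustration, not the sample's actual code.

// Hypothetical Core Image pass over one captured frame (sketch, not AVCamFilter's code).
CIContext *ciContext = [CIContext contextWithOptions:nil];            // create once and reuse per frame
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
[filter setValue:inputImage forKey:kCIInputImageKey];
CIImage *outputImage = filter.outputImage;

// A real renderer would draw outputImage into its own pixel buffer pool or a Metal/GL view;
// rendering back into the source buffer is shown here only for brevity.
[ciContext render:outputImage toCVPixelBuffer:pixelBuffer];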
