How can I reduce the file size of a video created with UIImagePickerController?

I have an app that allows a user to record a video with UIImagePickerController and then upload it to YouTube. The problem is that the video file UIImagePickerController creates is HUGE, even when the video is only 5 seconds long. For example, a 5-second video is 16-20 megabytes. I want to keep the video at 540p or 720p quality, but I want to reduce the file size.

I've been experimenting with AVFoundation and AVAssetExportSession to try to get a smaller file size. I've tried the following code:

AVAsset *video = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1.mp4"];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
}];

But this hasn't reduced the file size at all. I know what I'm doing is possible because in Apple's Photos app, when you select "share on YouTube", it automatically processes the video file so it's small enough to upload. I want to do the same thing in my app.

How can I accomplish this?


Note that AVAssetExportPresetPassthrough simply copies the source samples without re-encoding them, which is why your export didn't shrink the file. To actually recompress the video, you need to control the compression settings yourself. With AVCaptureSession and AVAssetWriter you can set them like this:

NSDictionary *settings = @{AVVideoCodecKey:AVVideoCodecH264,
                           AVVideoWidthKey:@(video_width),
                           AVVideoHeightKey:@(video_height),
                           AVVideoCompressionPropertiesKey:
                               @{AVVideoAverageBitRateKey:@(desired_bitrate),
                                 AVVideoProfileLevelKey:AVVideoProfileLevelH264Main31, /* Or whatever profile & level you wish to use */
                                 AVVideoMaxKeyFrameIntervalKey:@(desired_keyframe_interval)}};

AVAssetWriterInput* writer_input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];
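For completeness, here's a minimal sketch of how that input plugs into a writer; output_url is a placeholder and error handling is omitted:

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:output_url
                                                 fileType:AVFileTypeMPEG4
                                                    error:&error];
[writer addInput:writer_input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
// ...append sample buffers to writer_input as they become available, then:
// [writer_input markAsFinished];
// [writer finishWritingWithCompletionHandler:^{ /* done */ }];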

Edit: I guess if you insist on using UIImagePickerController to record the movie in the first place, you'll have to use AVAssetReader's copyNextSampleBuffer and AVAssetWriter's appendSampleBuffer methods to do the transcode.


yourfriendzak is right: setting cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow; isn't the solution here. The solution is to reduce the data rate, or bit rate, which is what jgh is suggesting.

I have three methods. The first implements the UIImagePickerController delegate method:

// For responding to the user accepting a newly-captured picture or movie
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {

// Handle movie capture
NSURL *movieURL = [info objectForKey:UIImagePickerControllerMediaURL];

NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self randomString]] stringByAppendingString:@".mp4"]];

// Compress movie first
[self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
}

The second method converts the video to a lower bitrate, not to lower dimensions.

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
                               outputURL:(NSURL*)outputURL
{
//setup video writer
AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];

AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

CGSize videoSize = videoTrack.naturalSize;

NSDictionary *videoWriterCompressionSettings = @{ AVVideoAverageBitRateKey : @1250000 };

NSDictionary *videoWriterSettings = @{ AVVideoCodecKey : AVVideoCodecH264,
                                       AVVideoCompressionPropertiesKey : videoWriterCompressionSettings,
                                       AVVideoWidthKey : @(videoSize.width),
                                       AVVideoHeightKey : @(videoSize.height) };

AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                         assetWriterInputWithMediaType:AVMediaTypeVideo
                                         outputSettings:videoWriterSettings];

videoWriterInput.expectsMediaDataInRealTime = YES;

videoWriterInput.transform = videoTrack.preferredTransform;

AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

[videoWriter addInput:videoWriterInput];

//setup video reader
NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];

AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];

[videoReader addOutput:videoReaderOutput];

//setup audio writer
AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeAudio
                                        outputSettings:nil];

audioWriterInput.expectsMediaDataInRealTime = NO;

[videoWriter addInput:audioWriterInput];

//setup audio reader
AVAssetTrack* audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];

[audioReader addOutput:audioReaderOutput];    

[videoWriter startWriting];

//start writing from video reader
[videoReader startReading];

[videoWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);

[videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:
 ^{

     while ([videoWriterInput isReadyForMoreMediaData]) {

         CMSampleBufferRef sampleBuffer;

         if ([videoReader status] == AVAssetReaderStatusReading &&
             (sampleBuffer = [videoReaderOutput copyNextSampleBuffer])) {

             [videoWriterInput appendSampleBuffer:sampleBuffer];
             CFRelease(sampleBuffer);
         }

         else {

             [videoWriterInput markAsFinished];

             if ([videoReader status] == AVAssetReaderStatusCompleted) {

                 //start writing from audio reader
                 [audioReader startReading];

                 // Note: no second call to startSessionAtSourceTime: here. The
                 // writing session was already started above, and starting it
                 // again raises an exception.

                 dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);

                 [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{

                     while (audioWriterInput.readyForMoreMediaData) {

                         CMSampleBufferRef sampleBuffer;

                         if ([audioReader status] == AVAssetReaderStatusReading &&
                             (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {

                            [audioWriterInput appendSampleBuffer:sampleBuffer];
                            CFRelease(sampleBuffer);
                         }

                         else {

                             [audioWriterInput markAsFinished];

                             if ([audioReader status] == AVAssetReaderStatusCompleted) {

                                 [videoWriter finishWritingWithCompletionHandler:^(){
                                     [self sendMovieFileAtURL:outputURL];
                                 }];

                             }
                         }
                     }

                 }];
             }
         }
     }
 }];
}

When it succeeds, the third method, sendMovieFileAtURL:, is called; it uploads the compressed video at outputURL to the server.

Note that I've enabled ARC in my project, so you will have to add some release calls if ARC is turned off in yours.
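The body of sendMovieFileAtURL: depends on your server; a minimal sketch, assuming a plain HTTP POST endpoint (the URL is a placeholder), might look like this:

- (void)sendMovieFileAtURL:(NSURL *)outputURL {
    // Placeholder endpoint; substitute your own upload URL
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"POST";
    [request setValue:@"video/mp4" forHTTPHeaderField:@"Content-Type"];

    NSURLSessionUploadTask *task = [[NSURLSession sharedSession]
        uploadTaskWithRequest:request
                     fromFile:outputURL
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                NSLog(@"%@", error ? error.localizedDescription : @"upload finished");
            }];
    [task resume];
}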


On UIImagePickerController you have a videoQuality property of type UIImagePickerControllerQualityType, which is applied to recorded movies as well as to the ones you pick from the library (that happens during the transcoding phase).
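For example (assuming picker is your UIImagePickerController instance):

picker.videoQuality = UIImagePickerControllerQualityType640x480;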

Or, if you have to deal with an existing asset (a file) that doesn't come from the library, you might want to look at these presets:

AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality

and

AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080

and pass one of them to the initializer of AVAssetExportSession (see the sketch at the end of this answer). I'm afraid you'll have to experiment with them for your particular content, as there is no precise description of what "low" and "medium" quality mean, or of which settings are used for the 640x480 or 1280x720 preset. The only useful information in the docs is the following:

Export Preset Names for Device-Appropriate QuickTime Files

You use these export options to produce QuickTime .mov files with a video size appropriate to the current device. The export will not scale the video up from a smaller size. Video is compressed using H.264; audio is compressed using AAC.

Some devices cannot support some sizes.

Aside from that, I do not remember having precise control over quality, such as frame rate or freeform sizing, in AVFoundation.

I was wrong: there is a way to tweak all the parameters you mention, and it is AVAssetWriter indeed: How do I export UIImage array as a movie?

btw, here is a link to a similar question with a code sample: iPhone:Programmatically compressing recorded video to share?
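Here's the sketch mentioned above. It reuses videoURL and pathToSavedVideosDirectory from the question, and the 960x540 preset is just an example; check compatibility first:

AVAsset *asset = [AVAsset assetWithURL:videoURL];

// See which presets this asset/device combination actually supports
NSLog(@"%@", [AVAssetExportSession exportPresetsCompatibleWithAsset:asset]);

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                 presetName:AVAssetExportPreset960x540];
session.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1-small.mp4"];
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = YES;

[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export done: %@", session.outputURL);
    } else {
        NSLog(@"export failed: %@", session.error);
    }
}];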


Erik's answer may have been correct at the time he wrote it, but now with iOS 8 it's just crashing left and right; I've spent a few hours on it myself.

You need a PhD to work with AVAssetWriter - it's non-trivial: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_Export.html#//apple_ref/doc/uid/TP40010188-CH9-SW1

There's an amazing library that does exactly what you want: a drop-in replacement for AVAssetExportSession with more crucial features like changing the bit rate: https://github.com/rs/SDAVAssetExportSession

Here's how to use it:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{

  SDAVAssetExportSession *encoder = [[SDAVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]]];
  NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
  NSString *documentsDirectory = [paths objectAtIndex:0];
  self.myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"lowerBitRate-%d.mp4", arc4random() % 1000]]; // .mp4 to match AVFileTypeMPEG4 below
  NSURL *url = [NSURL fileURLWithPath:self.myPathDocs];
  encoder.outputURL=url;
  encoder.outputFileType = AVFileTypeMPEG4;
  encoder.shouldOptimizeForNetworkUse = YES;

  encoder.videoSettings = @
  {
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoCompressionPropertiesKey: @
    {
    AVVideoAverageBitRateKey: @2300000, // Lower bit rate here
    AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    },
  };
  encoder.audioSettings = @
  {
  AVFormatIDKey: @(kAudioFormatMPEG4AAC),
  AVNumberOfChannelsKey: @2,
  AVSampleRateKey: @44100,
  AVEncoderBitRateKey: @128000,
  };

  [encoder exportAsynchronouslyWithCompletionHandler:^
  {
    int status = encoder.status;

    if (status == AVAssetExportSessionStatusCompleted)
    {
      AVAsset *asset = [AVAsset assetWithURL:encoder.outputURL];
      NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
      AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
      float frameRate = [videoTrack nominalFrameRate];
      float bps = [videoTrack estimatedDataRate];
      NSLog(@"Frame rate == %f",frameRate);
      NSLog(@"bps rate == %f",bps/(1024.0 * 1024.0));
      NSLog(@"Video export succeeded");
      // encoder.outputURL <- this is what you want!!
    }
    else if (status == AVAssetExportSessionStatusCancelled)
    {
      NSLog(@"Video export cancelled");
    }
    else
    {
      NSLog(@"Video export failed with error: %@ (%d)", encoder.error.localizedDescription, encoder.error.code);
    }
  }];
}

Erik Wegener's code rewritten for Swift 3:

class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: @escaping () -> ()) {
            //setup video writer
            let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
            let videoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
            let videoSize = videoTrack.naturalSize
            let videoWriterCompressionSettings = [
                AVVideoAverageBitRateKey : Int(1250000) // ~1.25 Mbps, matching the Objective-C answer above
            ]

            let videoWriterSettings:[String : AnyObject] = [
                AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
                AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
                AVVideoWidthKey : Int(videoSize.width) as AnyObject,
                AVVideoHeightKey : Int(videoSize.height) as AnyObject
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
            videoWriterInput.expectsMediaDataInRealTime = true
            videoWriterInput.transform = videoTrack.preferredTransform
            let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileTypeQuickTimeMovie)
            videoWriter.add(videoWriterInput)
            //setup video reader
            let videoReaderSettings:[String : AnyObject] = [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
            ]

            let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
            let videoReader = try! AVAssetReader(asset: videoAsset)
            videoReader.add(videoReaderOutput)
            //setup audio writer
            let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
            audioWriterInput.expectsMediaDataInRealTime = false
            videoWriter.add(audioWriterInput)
            //setup audio reader
            let audioTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
            let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
            let audioReader = try! AVAssetReader(asset: videoAsset)
            audioReader.add(audioReaderOutput)
            videoWriter.startWriting()

            //start writing from video reader
            videoReader.startReading()
            videoWriter.startSession(atSourceTime: kCMTimeZero)
            let processingQueue = DispatchQueue(label: "processingQueue1")
            videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                while videoWriterInput.isReadyForMoreMediaData {
                    let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
                    if videoReader.status == .reading && sampleBuffer != nil {
                        videoWriterInput.append(sampleBuffer!)
                    }
                    else {
                        videoWriterInput.markAsFinished()
                        if videoReader.status == .completed {
                            //start writing from audio reader
                            audioReader.startReading()
                            // Note: no second startSession(atSourceTime:) here; the
                            // session was already started above, and restarting it
                            // raises an exception.
                            let processingQueue = DispatchQueue(label: "processingQueue2")
                            audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                                while audioWriterInput.isReadyForMoreMediaData {
                                    let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                                    if audioReader.status == .reading && sampleBuffer != nil {
                                        audioWriterInput.append(sampleBuffer!)
                                    }
                                    else {
                                        audioWriterInput.markAsFinished()
                                        if audioReader.status == .completed {
                                            videoWriter.finishWriting(completionHandler: {() -> Void in
                                                onDone();
                                            })
                                        }
                                    }
                                }
                            })
                        }
                    }
                }
            })
        }
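A possible call site, staying in the same Swift 3 style; VideoCompressor is a hypothetical class hosting the method above, and the output path is a placeholder:

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    picker.dismiss(animated: true)
    guard let movieURL = info[UIImagePickerControllerMediaURL] as? URL else { return }

    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("compressed.mp4")
    try? FileManager.default.removeItem(at: outputURL) // the writer fails if the file already exists

    VideoCompressor.convertVideoToLowQuailtyWithInputURL(inputURL: movieURL as NSURL, outputURL: outputURL as NSURL) {
        print("compression done, upload \(outputURL)")
    }
}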