How to do Slow Motion video in iOS

Slower or faster, with or without an audio track

I have tried this and was able to slow down the asset.

compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration) did the trick.
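In essence, all the approach does is insert the asset's video track into a mutable composition and then stretch that time range. Here is a minimal sketch of just that idea, without the validation, audio handling, and export that the full class below adds (the helper name is mine, purely illustrative):

import AVFoundation

/// Minimal sketch: slow a video down by an integer factor by stretching
/// the composition track's time range. Error handling is kept minimal here.
func slowedComposition(for asset: AVAsset, scale: Int64) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first!
    let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video,
                                                            preferredTrackID: kCMPersistentTrackID_Invalid)

    let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
    try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)

    /// Multiply the duration to slow the clip down; divide by the scale to speed it up
    let slowedDuration = CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
    compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: slowedDuration)
    return composition
}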

I made a class that will help you generate a slower video from an AVAsset. A plus point is that you can also make it faster, and another plus point is that it handles the audio too.

Here is my custom class sample:

import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
       return VSVideoSpeeder()
    }()

    /// Valid scale range is 1x to 3x. Nothing will happen if the scale is out of range. The exporter will be nil if the URL is invalid or an asset instance cannot be created.
    func scaleAsset(fromURL url: URL, by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, Invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(asset.duration.value / scale, asset.duration.timescale) : CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
        let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if video contains the audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize Exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
            /// Note: Use a writable directory path (e.g. the documents directory) if you are testing on a device.

            if FileManager.default.fileExists(atPath: outputFileURL.path) {
                try FileManager.default.removeItem(at: outputFileURL)
            }

            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })

        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }

}

I took 1x, 2x, and 3x as the valid scales. The class contains the proper validation and handling. Below is a sample of how to use this function.

let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed:
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        case .completed:
            print("Scaled video has been generated successfully!")
        case .unknown, .waiting, .exporting, .cancelled:
            break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}

This line handles the audio scaling:

compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
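Note that time-scaling the audio track changes its playback rate, and what that sounds like depends on the exporter's time-pitch algorithm: .varispeed lets the pitch fall or rise with the speed change, while .spectral and .timeDomain keep the original pitch. If you want to control this explicitly, here is a small sketch of how the exporter could be configured before exportAsynchronously is called (the helper is illustrative, not part of VSVideoSpeeder):

import AVFoundation

/// Illustrative helper: build an exporter for the scaled composition with an
/// explicit time-pitch algorithm for the slowed/sped-up audio.
func makeExporter(for composition: AVMutableComposition, outputFileURL: URL) -> AVAssetExportSession? {
    let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputFileURL
    exporter?.outputFileType = AVFileType.mov
    exporter?.shouldOptimizeForNetworkUse = true
    /// .spectral or .timeDomain preserve pitch; .varispeed shifts it with the speed change
    exporter?.audioTimePitchAlgorithm = .spectral
    return exporter
}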


I have managed to add slow motion to a video, including the audio, with the proper output orientation.

- (void)SlowMotion:(NSURL *)URl
{
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; //self.inputAsset;

AVAsset *currentAsset = [AVAsset assetWithURL:URl];
AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

NSError *audioInsertError =nil;
BOOL audioInsertResult =[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                       ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                        atTime:kCMTimeZero
                                                         error:&audioInsertError];

if (!audioInsertResult || nil != audioInsertError) {
    //handle error
    return;
}

CMTime duration =kCMTimeZero;
duration=CMTimeAdd(duration, currentAsset.duration);
//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];

NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:@"slowMotion.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) {
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
}
NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];

//export
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];
assetExport.outputURL = _filePath;
assetExport.outputFileType = AVFileTypeQuickTimeMovie;
assetExport.shouldOptimizeForNetworkUse = YES;
[assetExport exportAsynchronouslyWithCompletionHandler:^{

    switch ([assetExport status]) {
        case AVAssetExportSessionStatusFailed:
        {
            NSLog(@"Export session failed with error: %@", [assetExport error]);
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(nil);
            });
        }
            break;
        case AVAssetExportSessionStatusCompleted:
        {
            NSLog(@"Successful");
            NSURL *outputURL = assetExport.outputURL;

            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
                [self writeExportedVideoToAssetsLibrary:outputURL];
            }
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(_filePath);
            });
        }
            break;
        default:
            break;
    }
}];


}

- (void)writeExportedVideoToAssetsLibrary:(NSURL *)url {
NSURL *exportURL = url;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
    [library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error){
        dispatch_async(dispatch_get_main_queue(), ^{
            if (error) {
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
                                                                    message:[error localizedRecoverySuggestion]
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
            if(!error)
            {
               // [activityView setHidden:YES];
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Sucess"
                                                                    message:@"video added to gallery successfully"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
 #if !TARGET_IPHONE_SIMULATOR
            [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
        });
    }];
} else {
    NSLog(@"Video could not be exported to assets library.");
}

}

You can scale a video using the AVFoundation and CoreMedia frameworks. Take a look at this AVMutableCompositionTrack method:

- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;

Sample:

AVURLAsset* videoAsset = nil; //self.inputAsset;

//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                               preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];

//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];

(The audio track from videoAsset should probably also be added to mixComposition.)