iOS audio/video playback, editing, and synthesis (imitating Miaopai: adding background music to a video)

Analysis:
This demo imitates Miaopai's video synthesis. In Miaopai you shoot a video and then enter an editing interface where you can add background music, choosing any music you like; you can also adjust the volume of both the background music and the original video before synthesizing. Synthesizing audio and video takes time, yet sliding the volume controls and switching music cause no stutter, so presumably the synthesis is not performed in this editing interface: there, the audio and video are only played back together, and the actual synthesis runs when the publish button is tapped.
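To make that inference concrete, here is a minimal sketch (all handler names are hypothetical, not from the demo): the volume sliders only touch the players, which is why they cause no stutter, while the slow synthesis is deferred to the publish action. self.player (AVPlayer) and self.BGMPlayer (AVAudioPlayer) match the playback code later in this post.

// Hypothetical sketch of the inferred UI wiring
- (void)videoVolumeSliderChanged:(UISlider *)slider {
    self.player.volume = slider.value;    // live adjustment, no synthesis
}

- (void)BGMVolumeSliderChanged:(UISlider *)slider {
    self.BGMPlayer.volume = slider.value; // live adjustment, no synthesis
}

- (void)publishButtonTapped {
    // Only here does the slow audio/video synthesis run (see section 3)
}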

From this analysis, three features need to be implemented:
1. Trim the audio to the length of the video
2. Play the audio and video in sync
3. Synthesize the audio and video

1. Trimming the audio

First, generate an output path based on the current time:

#pragma mark - output path
+ (NSURL *)exporterPath {
    NSInteger nowInter = (long)[[NSDate date] timeIntervalSince1970];
    NSString *fileName = [NSString stringWithFormat:@"output%ld.mp4", (long)nowInter];
    NSString *documentsDirectory = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).lastObject;
    NSString *outputFilePath = [documentsDirectory stringByAppendingPathComponent:fileName];
    // Remove any leftover file at this path
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    }
    return [NSURL fileURLWithPath:outputFilePath];
}

Then trim the audio to the length of the captured video, so that the audio and video can play back in sync:

#pragma mark - audio/video trimming
/**
 Trim an audio or video resource. To trim video (MP4 format), change the two AVFileTypeAppleM4A values below to AVFileTypeAppleM4V.

 @param assetURL path of the audio/video resource
 @param startTime trim start time
 @param endTime trim end time
 @param completionHandle callback invoked when trimming finishes
 */
+ (void)cutAudioVideoResourcePath:(NSURL *)assetURL startTime:(CGFloat)startTime endTime:(CGFloat)endTime complition:(void (^)(NSURL *outputPath, BOOL isSucceed))completionHandle {
    // Load the material
    AVAsset *asset = [AVAsset assetWithURL:assetURL];
    // Create the export session
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
    // Trim: set the exported time range
    CMTime start = CMTimeMakeWithSeconds(startTime, asset.duration.timescale);
    CMTime duration = CMTimeMakeWithSeconds(endTime - startTime, asset.duration.timescale);
    exporter.timeRange = CMTimeRangeMake(start, duration);
    // Output path
    NSURL *outputPath = [self exporterPath];
    exporter.outputURL = outputPath;
    // Output format
    exporter.outputFileType = AVFileTypeAppleM4A;
    exporter.shouldOptimizeForNetworkUse = YES;
    // Callback after the export finishes
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed: {
                NSLog(@"trimming failed: %@", [[exporter error] description]);
                completionHandle(outputPath, NO);
                break;
            }
            case AVAssetExportSessionStatusCancelled: {
                completionHandle(outputPath, NO);
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                completionHandle(outputPath, YES);
                break;
            }
            default: {
                completionHandle(outputPath, NO);
                break;
            }
        }
    }];
}

2. Synchronized looping playback of audio and video

The comments below explain each step, so no further commentary is needed:

// Add a view to host playback
UIView *playView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, 400)];
[self.view addSubview:playView];
// Wrap the resource path in an AVPlayerItem
AVPlayerItem *playItem = [[AVPlayerItem alloc] initWithURL:[self filePathName:@"abc.mp4"]];
// AVPlayer needs an AVPlayerItem to play
self.player = [[AVPlayer alloc] initWithPlayerItem:playItem];
self.player.volume = 0.5; // default volume; the range is 0-1
// An AVPlayerLayer is needed to display the video
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
playerLayer.frame = playView.frame; // the playerLayer's frame must be set
[playView.layer addSublayer:playerLayer]; // add the AVPlayerLayer to the view's layer
// Register for the end-of-playback notification to loop the video
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(repeatPlay) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];

// Background music: compute the video's length, then trim the audio to match
AVAsset *asset = [AVAsset assetWithURL:[self filePathName:@"abc.mp4"]];
CMTime duration = asset.duration;
CGFloat videoDuration = duration.value / (float)duration.timescale;
NSLog(@"%f", videoDuration);
// Trim the audio with the tool from step 1
__weak typeof(self) weakSelf = self;
[EditAudioVideo cutAudioVideoResourcePath:[self filePathName:@"123.mp3"] startTime:0 endTime:videoDuration complition:^(NSURL *outputPath, BOOL isSucceed) {
    // On success, outputPath is the path of the trimmed audio
    NSError *error;
    weakSelf.BGMPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:outputPath error:&error];
    if (error == nil) {
        weakSelf.BGMPlayer.numberOfLoops = -1; // loop forever
        weakSelf.BGMPlayer.volume = 0.5;
        [weakSelf.BGMPlayer prepareToPlay]; // preload the audio into memory for smoother playback
        // Start audio and video at the same time so they play in sync
        [weakSelf.BGMPlayer play];
        [weakSelf.player play];
    } else {
        NSLog(@"%@", error);
    }
}];
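The repeatPlay selector registered for AVPlayerItemDidPlayToEndTimeNotification is not shown in the original code; a minimal sketch, assuming the same player properties as above, could look like this:

// Hypothetical implementation of the looping selector registered above
- (void)repeatPlay {
    // Seek the video back to the start and resume playback
    [self.player seekToTime:kCMTimeZero];
    [self.player play];
    // Rewind the background music too, so the two loops stay aligned
    self.BGMPlayer.currentTime = 0;
    [self.BGMPlayer play];
}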

3. Synthesizing the audio and video

1. Load the audio and video resources

// Load the materials
AVAsset *asset = [AVAsset assetWithURL:assetURL];
AVAsset *audioAsset = [AVAsset assetWithURL:BGMPath];

2. Separate the tracks
An audio file has only an audio track, while a recorded video has two tracks: an audio track and a video track. The audio and video resources must be separated into tracks before they can be edited.

// Separate the materials into tracks
AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; // video track
AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; // background-music audio track

3. Edit the tracks; all track editing must be done inside an AVMutableComposition editing environment

// Editing environment
AVMutableComposition *composition = [[AVMutableComposition alloc] init];

Create a video track in the composition for the video material:

AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

Insert the video material into the editing environment:

[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

Create an audio track in the composition for the background music:

AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

Insert the background-music material into the editing environment:

[audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];

Optionally keep the video's original soundtrack; add it only if needed, using the same approach as above:

AVMutableCompositionTrack *originalAudioCompositionTrack = nil;
if (needOriginalVoice) {
    AVAssetTrack *originalAudioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    originalAudioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [originalAudioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:originalAudioAssetTrack atTime:kCMTimeZero error:nil];
}

Finally, hand the configured AVMutableComposition to the export session:

// Export session for the composed material
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];

One pitfall: controlling a track's volume through AVMutableCompositionTrack's preferredVolume property does not work, and I don't know why (corrections from anyone who does are welcome). So the volume can only be adjusted with an audio mix:

#pragma mark - adjust the volumes for synthesis
+ (AVAudioMix *)buildAudioMixWithVideoTrack:(AVCompositionTrack *)videoTrack VideoVolume:(float)videoVolume BGMTrack:(AVCompositionTrack *)BGMTrack BGMVolume:(float)BGMVolume controlVolumeRange:(CMTime)volumeRange {
    // Create the audio mix
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    // Get the video's soundtrack and set its volume
    AVMutableAudioMixInputParameters *Videoparameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:videoTrack];
    [Videoparameters setVolume:videoVolume atTime:volumeRange];
    // Set the background-music volume
    AVMutableAudioMixInputParameters *BGMparameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:BGMTrack];
    [BGMparameters setVolume:BGMVolume atTime:volumeRange];
    // The array of mix parameters
    audioMix.inputParameters = @[Videoparameters, BGMparameters];
    return audioMix;
}

Given the video and audio resource paths and the desired volume for each, we can now synthesize. The complete synthesis code follows:

/**
 Synthesize audio and video.

 @param assetURL path of the original video
 @param BGMPath path of the background music
 @param needOriginalVoice whether to keep the original video's voice track
 @param videoVolume video volume
 @param BGMVolume background-music volume
 @param completionHandle callback invoked after synthesis
 */
+ (void)editVideoSynthesizeVieoPath:(NSURL *)assetURL BGMPath:(NSURL *)BGMPath needOriginalVoice:(BOOL)needOriginalVoice videoVolume:(CGFloat)videoVolume BGMVolume:(CGFloat)BGMVolume complition:(void (^)(NSURL *outputPath, BOOL isSucceed))completionHandle {
    // Load the materials
    AVAsset *asset = [AVAsset assetWithURL:assetURL];
    AVAsset *audioAsset = [AVAsset assetWithURL:BGMPath];
    // Separate the materials into tracks
    AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; // video track
    AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; // background-music audio track
    // Editing environment
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    // Add the video material to a video track
    AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
    // Add the background music to an audio track
    AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
    // Optionally keep the video's original soundtrack
    AVMutableCompositionTrack *originalAudioCompositionTrack = nil;
    if (needOriginalVoice) {
        AVAssetTrack *originalAudioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        originalAudioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [originalAudioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration) ofTrack:originalAudioAssetTrack atTime:kCMTimeZero error:nil];
    }
    // Export session for the composed material
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    // Volume control
    CMTime duration = videoAssetTrack.timeRange.duration;
    CGFloat videoDuration = duration.value / (float)duration.timescale;
    exporter.audioMix = [self buildAudioMixWithVideoTrack:originalAudioCompositionTrack VideoVolume:videoVolume BGMTrack:audioCompositionTrack BGMVolume:BGMVolume controlVolumeRange:CMTimeMake(0, videoDuration)];
    // Set the output path
    NSURL *outputPath = [self exporterPath];
    exporter.outputURL = outputPath;
    // Specify the output format
    exporter.outputFileType = AVFileTypeMPEG4;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed: {
                NSLog(@"synthesis failed: %@", [[exporter error] description]);
                completionHandle(outputPath, NO);
                break;
            }
            case AVAssetExportSessionStatusCancelled: {
                completionHandle(outputPath, NO);
                break;
            }
            case AVAssetExportSessionStatusCompleted: {
                completionHandle(outputPath, YES);
                break;
            }
            default: {
                completionHandle(outputPath, NO);
                break;
            }
        }
    }];
}
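As a usage sketch (assuming this method lives in the same EditAudioVideo class as the trimming tool, and reusing the filePathName: helper from the playback code; the call site itself is hypothetical), the publish button's handler could invoke it like this:

// Hypothetical call site, e.g. inside the publish button's action
[EditAudioVideo editVideoSynthesizeVieoPath:[self filePathName:@"abc.mp4"]
                                    BGMPath:[self filePathName:@"123.mp3"]
                          needOriginalVoice:YES
                                videoVolume:0.5
                                  BGMVolume:0.5
                                 complition:^(NSURL *outputPath, BOOL isSucceed) {
    if (isSucceed) {
        NSLog(@"synthesized file saved to %@", outputPath); // e.g. save or upload it here
    }
}];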

The code can be downloaded from GitHub: https://github.com/D-james/AudioVideoEdit. The repository is fairly large because the audio and video files used for synthesis are committed as well.

Reference: "AV Foundation Development Secrets: Practice and Master Audio-Visual Processing for iOS & OS X Applications".