AVAudioFoundation (2): audio and video playback

This article is from: AVAudioFoundation (2): audio and video playback | www.samirchen.com

The main content of this article comes from the AVFoundation Programming Guide.

To play an AVAsset, use AVPlayer. During playback, you can use an AVPlayerItem instance to manage the overall playback state of the asset, and AVPlayerItemTrack instances to manage the playback state of each track. For video rendering, use AVPlayerLayer.

Play Asset

AVPlayer is a controller that manages asset playback. Its functions include starting playback, stopping playback, seeking, and so on. You can use AVPlayer to play a single asset. If you want to play a sequence of assets, use AVQueuePlayer, a subclass of AVPlayer.

AVPlayer also exposes the current playback state so that we can adjust the user interaction accordingly. We need to direct the AVPlayer's output to a specific Core Animation layer, usually an AVPlayerLayer or AVSynchronizedLayer instance.

It’s important to note that you can create multiple AVPlayerLayer objects from one AVPlayer instance, but only the most recently created layer actually renders the video on screen.

Although AVPlayer ultimately plays an asset, we do not hand it an AVAsset directly; instead we provide an AVPlayerItem instance. AVPlayerItem manages the playback state of the asset associated with it, and it contains a set of AVPlayerItemTrack instances corresponding to the audio and video tracks in the asset. Their relationship is roughly as follows:

[Figure: the relationship between AVPlayer, AVPlayerItem, AVPlayerItemTrack, and AVAsset]

Note: the original figure is from Apple's official documentation, but it contains an error: the box for AVPlayerItemTrack is labeled AVAsset. It has been corrected here.

This design means that we can use multiple players to play the same asset at the same time, with each player rendering it in a different way. The figure below shows a scenario in which two different AVPlayer instances play the same AVAsset with different settings. During playback, you can also disable the playback of specific tracks.

[Figure: two AVPlayer instances playing the same AVAsset with different settings]

We can load an asset over the network, but simply initializing an AVPlayerItem does not mean it can be played immediately. We can use KVO to observe the AVPlayerItem's status property to find out when it is ready to play, and then decide on subsequent behavior.

Deal with different types of Asset

The way we configure an asset for playback depends somewhat on the type of asset. There are generally two different types:

  • 1) File-based assets, such as local video files, photo albums, shared resource libraries, and so forth.
  • 2) Streaming assets, such as HLS streams.

Loading a file-based asset generally involves the following steps:

  • Create an AVURLAsset instance from the file URL.
  • Create an AVPlayerItem instance from the AVURLAsset instance.
  • Associate the AVPlayerItem instance with an AVPlayer instance.
  • Use KVO to observe the status property of the AVPlayerItem until it becomes ready to play, that is, until loading is complete.

To create and play an HTTP Live Stream (HLS) resource, you can follow these steps:

  • Initialize an AVPlayerItem instance directly from the resource URL, because you cannot directly create an AVAsset to represent an HLS resource.
  • Once you associate the AVPlayerItem with an AVPlayer, it begins preparing for playback; when everything is ready, the AVPlayerItem creates the AVAsset and AVAssetTrack instances corresponding to the contents of the HLS stream.
  • To get the duration of the stream, use KVO to observe the duration property of the AVPlayerItem; it is updated to the correct value once the resource is ready to play.
NSURL *url = [NSURL URLWithString:@"<#Live stream URL#>"];
// You may find a test stream at <http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8>.
self.playerItem = [AVPlayerItem playerItemWithURL:url];
[self.playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

When you don’t know what type of resource a URL corresponds to, you can do this:

  • Try to initialize an AVURLAsset from the URL and load its tracks property. If tracks loads successfully, create an AVPlayerItem instance from the asset.
  • If tracks fails to load, create an AVPlayerItem instance directly from the URL, and use KVO to observe the status property of the AVPlayer to know when it becomes playable.
  • If both attempts fail, clean up the AVPlayerItem.
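The fallback logic above can be sketched roughly as follows. This is only a sketch, not code from the guide; it assumes the same `ItemStatusContext` KVO context and `playerItem`/`player` properties used elsewhere in this article.

```objc
// Sketch: playing a URL of unknown type.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (status == AVKeyValueStatusLoaded) {
            // Inspectable (e.g. file-based) asset: build the item from the asset.
            self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
        } else {
            // Likely a streaming resource such as HLS: build the item from the URL
            // and learn whether it is playable by observing its status via KVO.
            self.playerItem = [AVPlayerItem playerItemWithURL:url];
        }
        [self.playerItem addObserver:self forKeyPath:@"status" options:0 context:&ItemStatusContext];
        self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
    });
}];
```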

Play an AVPlayerItem

Call the play interface of AVPlayer to start playback.

- (IBAction)play:sender {
    [player play];
}

In addition to simple playback, you can also set the playback rate by setting the rate property.

player.rate = 0.5;
player.rate = 2.0;

Setting the rate to 1.0 means normal playback, and setting it to 0.0 pauses playback (equivalent to calling pause).

In addition to forward playback, some assets also support reverse playback, but you need to check a few properties first:

  • canPlayReverse: supports setting the playback rate to -1.0.
  • canPlaySlowReverse: supports setting the playback rate between -1.0 and 0.0.
  • canPlayFastReverse: supports setting the playback rate to values less than -1.0.

You can adjust the playback position through the seekToTime: interface. However, this interface favors performance over precision and does not guarantee accuracy.

CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn];

You can use the seekToTime:toleranceBefore:toleranceAfter: interface if you want to make precise adjustments.

CMTime fiveSecondsIn = CMTimeMake(5, 1);
[player seekToTime:fiveSecondsIn toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];

It is important to note that seeking with zero tolerance can be expensive, so it is generally only used when building sophisticated media editing features.

We can detect the end of playback by listening for AVPlayerItemDidPlayToEndTimeNotification. After playback ends, we can reset the playback position to zero with seekToTime:; otherwise, calling play again will have no effect.

// Register with the notification center after creating the player item.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:<#The player item#>];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [player seekToTime:kCMTimeZero];
}

In addition, we can set the player's actionAtItemEnd property to control its behavior when an item finishes playing; AVPlayerActionAtItemEndPause, for example, pauses playback at the end of the item.
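For example, a minimal sketch:

```objc
// Pause when the current item finishes playing.
self.player.actionAtItemEnd = AVPlayerActionAtItemEndPause;
```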

Play multiple AVPlayerItem

We can use AVQueuePlayer to play multiple AVPlayerItem sequentially. AVQueuePlayer is a subclass of AVPlayer.

NSArray *items = <#An array of player items#>;
AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:items];

By calling play, the items play in sequence. You can also call advanceToNextItem to skip to the next item, and use insertItem:afterItem:, removeItem:, and removeAllItems to manage the playback queue.

Before inserting an item, you can check with canInsertItem:afterItem: whether the insertion is possible; passing nil as afterItem asks whether the item can be appended to the end of the queue.

AVPlayerItem *anItem = <#Get a player item#>;
if ([queuePlayer canInsertItem:anItem afterItem:nil]) {
    [queuePlayer insertItem:anItem afterItem:nil];
}

Monitoring playback status

We can monitor aspects of the playback state of the AVPlayer and of the AVPlayerItem being played. This is useful for handling state that is not directly under your control, for example:

  • If the user switches to another application via multitasking, the player's rate property drops to 0.
  • When playing remote media (such as network video), monitoring the AVPlayerItem's loadedTimeRanges and seekableTimeRanges tells you which portions of the resource can be played and seeked.
  • When playing an HTTP Live Stream, the player's currentItem may change.
  • When playing an HTTP Live Stream, the AVPlayerItem's tracks may change. This can happen when the stream switches to a different encoding.
  • When playback fails, the status of the AVPlayer or AVPlayerItem may change.
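As a sketch of the second point above, observing loadedTimeRanges could look like this. The context pointer name is chosen here for illustration, not taken from the guide:

```objc
// KVO context chosen for this sketch.
static void *LoadedTimeRangesContext = &LoadedTimeRangesContext;

// Register once, e.g. right after creating the player item.
[playerItem addObserver:self
             forKeyPath:@"loadedTimeRanges"
                options:NSKeyValueObservingOptionNew
                context:LoadedTimeRangesContext];

// In observeValueForKeyPath:ofObject:change:context:, inspect how much is buffered.
NSArray *ranges = playerItem.loadedTimeRanges;
if (ranges.count > 0) {
    CMTimeRange range = [ranges.firstObject CMTimeRangeValue];
    Float64 bufferedEnd = CMTimeGetSeconds(CMTimeRangeGetEnd(range));
    NSLog(@"Buffered up to %.1f s", bufferedEnd);
}
```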

Respond to changes in the status property

By using KVO to observe the status property of the AVPlayer and of the AVPlayerItem being played, we can be notified of state changes. For example, when a playback error occurs, the status may change to AVPlayerStatusFailed or AVPlayerItemStatusFailed, and we can then respond accordingly.

Note that AVFoundation does not specify which thread notifications are delivered on, so if you need to update the user interface after receiving a notification, you must dispatch to the main thread.

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == <#Player status context#>) {
        AVPlayer *thePlayer = (AVPlayer *)object;
        if ([thePlayer status] == AVPlayerStatusFailed) {
            NSError *error = [<#The AVPlayer object#> error];
            // Respond to error: for example, display an alert sheet.
            return;
        }
        // Deal with other status change if appropriate.
    }
    // Deal with other change notifications if appropriate.
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    return;
}

Tracking visual content readiness status

We can observe the readyForDisplay property of an AVPlayerLayer instance to be notified when the layer has visual content ready to render.

Based on this, we can wait until the player's visual content is ready before inserting the player layer into the layer tree to show it to the user.
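A sketch of that idea, assuming a `playerLayer` created elsewhere; the context pointer name is chosen here for illustration:

```objc
// KVO context chosen for this sketch.
static void *ReadyForDisplayContext = &ReadyForDisplayContext;

// Register before the layer is attached anywhere.
[playerLayer addObserver:self
              forKeyPath:@"readyForDisplay"
                 options:0
                 context:ReadyForDisplayContext];

// Later, in observeValueForKeyPath:ofObject:change:context:,
// attach the layer only once it can actually render frames.
if (context == ReadyForDisplayContext && playerLayer.readyForDisplay) {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.view.layer addSublayer:playerLayer];
    });
}
```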

Track playback time change

We can use AVPlayer's addPeriodicTimeObserverForInterval:queue:usingBlock: and addBoundaryTimeObserverForTimes:queue:usingBlock: interfaces to track changes in the current playback position, so that we can update the user interface with, for example, the current playback time and the remaining playback time.

  • addPeriodicTimeObserverForInterval:queue:usingBlock: periodically notifies us of the current playback time in the callback block as playback time changes.
  • addBoundaryTimeObserverForTimes:queue:usingBlock: lets us pass in a set of times (an array of CMTime values wrapped in NSValue); the callback block is invoked when playback reaches those times.

Both interfaces return an opaque observer object. We must keep a strong reference to this object while the time observation is active, and pass it to removeTimeObserver: when we no longer need it.

In addition, AVFoundation does not guarantee that the block is invoked for every time change or boundary time. For example, if a previous invocation of the block has not finished when the next callback is due, AVFoundation skips that invocation. So make sure the work done in the callback block is not too expensive.

// Assume a property: @property (strong) id playerObserver;

Float64 durationSeconds = CMTimeGetSeconds([<#An asset#> duration]);
CMTime firstThird = CMTimeMakeWithSeconds(durationSeconds/3.0, 1);
CMTime secondThird = CMTimeMakeWithSeconds(durationSeconds*2.0/3.0, 1);
NSArray *times = @[[NSValue valueWithCMTime:firstThird], [NSValue valueWithCMTime:secondThird]];

self.playerObserver = [<#A player#> addBoundaryTimeObserverForTimes:times queue:NULL usingBlock:^{
    NSString *timeDescription = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, [self.player currentTime]));
    NSLog(@"Passed a boundary at %@", timeDescription);
}];
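A periodic observer, by contrast, might be sketched as follows. This is only a sketch; the `timeLabel` outlet is an assumption, not part of the original example:

```objc
// Sketch: update a (hypothetical) timeLabel every half second of playback.
CMTime interval = CMTimeMakeWithSeconds(0.5, NSEC_PER_SEC);
self.playerObserver = [self.player addPeriodicTimeObserverForInterval:interval
                                                                queue:dispatch_get_main_queue()
                                                           usingBlock:^(CMTime time) {
    // The block runs on the queue we passed in (the main queue here),
    // so it is safe to touch the UI.
    self.timeLabel.text = [NSString stringWithFormat:@"%.1f s", CMTimeGetSeconds(time)];
}];
```

Note that the block strongly captures self; in production code you would typically capture a weak reference to avoid a retain cycle.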

End of play

Listen for the AVPlayerItemDidPlayToEndTimeNotification notification, as described above; the details are not repeated here.

A complete example

The example here shows how to use AVPlayer to play a video file. It includes the following steps:

  • Configure a UIView whose backing layer is an AVPlayerLayer.
  • Create an AVPlayer instance.
  • Create an AVPlayerItem instance from a file-based asset, and use KVO to observe its status property.
  • Respond to the notification that the AVPlayerItem is ready to play by enabling a button.
  • Play the AVPlayerItem, and seek back to its starting position once playback completes.

First, the PlayerView:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

@implementation PlayerView

+ (Class)layerClass {
    return [AVPlayerLayer class];
}

- (AVPlayer *)player {
    return [(AVPlayerLayer *)[self layer] player];
}

- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)[self layer] setPlayer:player];
}

@end

A simple PlayerViewController:

@class PlayerView;

@interface PlayerViewController : UIViewController

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic, weak) IBOutlet PlayerView *playerView;
@property (nonatomic, weak) IBOutlet UIButton *playButton;

- (IBAction)loadAssetFromFile:sender;
- (IBAction)play:sender;
- (void)syncUI;

@end

The syncUI method:

- (void)syncUI {
    if ((self.player.currentItem != nil) &&
        ([self.player.currentItem status] == AVPlayerItemStatusReadyToPlay)) {
        self.playButton.enabled = YES;
    } else {
        self.playButton.enabled = NO;
    }
}

In viewDidLoad, call syncUI first:

- (void) viewDidLoad {[super viewDidLoad]; [self syncUI];}

Create and asynchronously load the AVURLAsset; when loading succeeds, create the player item, initialize the player, and add the observers:

static const NSString *ItemStatusContext;

- (IBAction)loadAssetFromFile:sender {
    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:<#@"VideoFileName"#> withExtension:<#@"extension"#>];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    NSString *tracksKey = @"tracks";

    [asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
        // The completion block goes here.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

            if (status == AVKeyValueStatusLoaded) {
                self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
                // Ensure that this is done before the playerItem is associated with the player.
                [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionInitial context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemDidReachEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
                [self.playerView setPlayer:self.player];
            } else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}

Respond to the status change in the KVO callback:

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == &ItemStatusContext) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self syncUI];
        });
        return;
    }
    [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    return;
}

Playback, and handling completion of playback:

- (IBAction)play:sender {
    [self.player play];
}

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [self.player seekToTime:kCMTimeZero];
}