AVFoundation-02: Assets

Summary

AVFoundation is the framework for creating and working with time-based audio and video media. Its design reflects the current hardware environment and application needs: it relies heavily on multithreading, making extensive use of blocks and GCD to move expensive processing onto background queues and take full advantage of multi-core hardware. Hardware acceleration is applied automatically, so applications perform well on most devices. The framework is also built for 64-bit processors and can leverage all of their advantages.

(Figure: the iOS media environment)

AVAsset

AVFoundation is a powerful, extensible framework covering a wide range of media capture, playback, and processing functionality. Unlike the older, file-oriented audio classes, its design revolves around the concept of "assets". The most important asset class is AVAsset, which sits at the core of AVFoundation's design and plays a critical role in almost every feature. AVAsset is an abstract class that models a media resource as a single whole, exposing static properties such as its title, duration, and metadata.
AVAsset frees developers from two important concerns. First, it provides a layer of abstraction over the underlying media format: whether the content is video or audio, developers and the framework deal only with the notion of an asset, interacting with different formats through one uniform interface without worrying about codec details. Second, it hides the location of the resource. We create an asset from a URL, which may point to a local file or to a remote resource.

AVAssetTrack

An AVAsset is not itself the media data; rather, it serves as a container for time-based media. It consists of one or more tracks, together with metadata describing the asset. We use the AVAssetTrack class to represent a uniformly typed media stream stored in the asset, one instance modeling each track. The most common tracks are audio and video streams, but AVAssetTrack can also represent text, subtitles, closed captions, and other media types.

(Figure: the composition of an AVAsset)

Within an AVAsset, you can retrieve a specific AVAssetTrack by its track ID:

```objc
- (nullable AVAssetTrack *)trackWithTrackID:(CMPersistentTrackID)trackID;
```

In addition to looking a track up by its track ID, AVAsset provides three other ways to access tracks:

```objc
@property (nonatomic, readonly) NSArray<AVAssetTrack *> *tracks;
- (NSArray<AVAssetTrack *> *)tracksWithMediaType:(NSString *)mediaType;
- (NSArray<AVAssetTrack *> *)tracksWithMediaCharacteristic:(NSString *)mediaCharacteristic;
```

The tracks property contains every track in the current asset; we can traverse it to find the track we need. -tracksWithMediaType: returns an array of the tracks of the specified media type; if the asset contains no track of that type, an empty array is returned. AVMediaFormat defines the following media types:

```objc
AVF_EXPORT NSString *const AVMediaTypeVideo          NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeAudio          NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeText           NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeClosedCaption  NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeSubtitle       NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeTimecode       NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMediaTypeMetadata       NS_AVAILABLE(10_8, 6_0);
AVF_EXPORT NSString *const AVMediaTypeMuxed          NS_AVAILABLE(10_7, 4_0);
```

-tracksWithMediaCharacteristic: returns an array of the tracks that have the specified media characteristic. If no track in the asset has that characteristic, an empty array is returned. AVMediaFormat defines the following media characteristics:

```objc
NSString *const AVMediaTypeMetadataObject;
NSString *const AVMediaCharacteristicVisual;
NSString *const AVMediaCharacteristicAudible;
NSString *const AVMediaCharacteristicLegible;
NSString *const AVMediaCharacteristicFrameBased;
NSString *const AVMediaCharacteristicIsMainProgramContent;
NSString *const AVMediaCharacteristicIsAuxiliaryContent;
NSString *const AVMediaCharacteristicContainsOnlyForcedSubtitles;
NSString *const AVMediaCharacteristicTranscribesSpokenDialogForAccessibility;
NSString *const AVMediaCharacteristicDescribesMusicAndSoundForAccessibility;
NSString *const AVMediaCharacteristicEasyToRead;
NSString *const AVMediaCharacteristicDescribesVideoForAccessibility;
NSString *const AVMediaCharacteristicLanguageTranslation;
NSString *const AVMediaCharacteristicDubbedTranslation;
NSString *const AVMediaCharacteristicVoiceOverTranslation;
```
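Putting the three accessors together, a minimal sketch might look like this (the bundled file `test.mp4` is a hypothetical example):

```objc
#import <AVFoundation/AVFoundation.h>

NSURL *videoURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp4"]];
AVAsset *asset = [AVAsset assetWithURL:videoURL];

// All tracks, regardless of type
for (AVAssetTrack *track in asset.tracks) {
    NSLog(@"track %d: %@", track.trackID, track.mediaType);
}

// Only the video tracks; an empty array if the asset has none
NSArray<AVAssetTrack *> *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];

// Only the tracks that can be heard (typically the audio tracks)
NSArray<AVAssetTrack *> *audibleTracks = [asset tracksWithMediaCharacteristic:AVMediaCharacteristicAudible];
```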

Create resources

AVAsset is an abstract class and cannot be instantiated directly. When you create an instance with the assetWithURL: method, you actually get an instance of its concrete subclass, AVURLAsset.

```objc
NSURL *mp3URL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVAsset *asset = [AVAsset assetWithURL:mp3URL];
```

We can also create an AVURLAsset instance directly, passing options to request more precise timing information. Of course, loading may then take more time in exchange for accurate duration and timing data.

```objc
NSDictionary *dict = @{AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)};
NSURL *mp3URL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:mp3URL options:dict];
```

Photo library

Video captured by the user with the camera or a third-party capture app is usually stored in the user's photo library. We can access it and create AVAsset objects through the AssetsLibrary framework.

```objc
ALAssetsLibrary *assetLib = [[ALAssetsLibrary alloc] init];
[assetLib enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group setAssetsFilter:[ALAssetsFilter allVideos]];
    [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndex:0]
                            options:0
                         usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
        if (result) {
            NSURL *url = [[result defaultRepresentation] url];
            AVAsset *asset = [AVAsset assetWithURL:url];
        }
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"%@", [error localizedDescription]);
}];
```

Asynchronous loading

AVAsset has a variety of useful methods and properties that provide information about the resource, such as its duration, creation date, and metadata. To handle media files efficiently, AVAsset defers loading a property until it is actually requested. Property access, however, is always synchronous: if a requested property has not been loaded yet, the call blocks. To avoid this, AVAsset and AVAssetTrack both implement the AVAsynchronousKeyValueLoading protocol, whose interfaces let you query and load property values asynchronously.

```objc
// Query the load status of a given property
- (AVKeyValueStatus)statusOfValueForKey:(NSString *)key error:(NSError * _Nullable * _Nullable)outError;
// Load the given properties asynchronously
- (void)loadValuesAsynchronouslyForKeys:(NSArray<NSString *> *)keys completionHandler:(nullable void (^)(void))handler;
```
```objc
NSURL *mp3URL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp3"]];
AVURLAsset *asset = [AVURLAsset assetWithURL:mp3URL];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    switch (status) {
        case AVKeyValueStatusLoaded:
            break;
        case AVKeyValueStatusLoading:
            break;
        case AVKeyValueStatusUnknown:
            break;
        case AVKeyValueStatusFailed:
            break;
        case AVKeyValueStatusCancelled:
            break;
        default:
            break;
    }
}];
```

Each call to -loadValuesAsynchronouslyForKeys:completionHandler: invokes its completionHandler exactly once, regardless of how many keys are passed in. Inside the handler, call -statusOfValueForKey:error: once for each requested property; you cannot assume that all properties report the same status value.
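As a sketch of that per-key behavior, loading two keys in one call and checking each status separately (continuing with an asset created as above):

```objc
[asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    // The handler fires once, after both keys have been processed
    NSError *error = nil;
    AVKeyValueStatus tracksStatus   = [asset statusOfValueForKey:@"tracks" error:&error];
    AVKeyValueStatus durationStatus = [asset statusOfValueForKey:@"duration" error:&error];
    if (tracksStatus == AVKeyValueStatusLoaded && durationStatus == AVKeyValueStatusLoaded) {
        // Safe to read asset.tracks and asset.duration without blocking
    }
}];
```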

Metadata

Both AVAsset and AVAssetTrack support querying metadata. In most cases we use the metadata provided by AVAsset, but AVAssetTrack is needed for track-level metadata. Individual metadata values are read through the AVMetadataItem class.

```objc
// Metadata items in the common key space
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *commonMetadata;
// All metadata items of the asset, across every format it contains
@property (nonatomic, readonly) NSArray<AVMetadataItem *> *metadata NS_AVAILABLE(10_10, 8_0);
// The metadata formats available in the asset; the possible formats are defined in AVMetadataFormat
@property (nonatomic, readonly) NSArray<NSString *> *availableMetadataFormats;
// Metadata items of a specific format
- (NSArray<AVMetadataItem *> *)metadataForFormat:(NSString *)format;
```
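For example, a minimal sketch that pulls the title out of the common key space (it assumes an asset whose commonMetadata has already been loaded):

```objc
// Filter the common metadata down to the items carrying the title key
NSArray<AVMetadataItem *> *titleItems =
    [AVMetadataItem metadataItemsFromArray:asset.commonMetadata
                                   withKey:AVMetadataCommonKeyTitle
                                  keySpace:AVMetadataKeySpaceCommon];
AVMetadataItem *title = titleItems.firstObject;
NSLog(@"title: %@", title.stringValue);
```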

The common metadata keys defined by AVFoundation, such as title, creator, subject, and publisher:

```objc
AVF_EXPORT NSString *const AVMetadataCommonKeyTitle             NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyCreator           NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeySubject           NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyDescription       NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyPublisher         NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyContributor       NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyCreationDate      NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyLastModifiedDate  NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyType              NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyFormat            NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyIdentifier        NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeySource            NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyLanguage          NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyRelation          NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyLocation          NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyCopyrights        NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyAlbumName         NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyAuthor            NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyArtist            NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyArtwork           NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyMake              NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeyModel             NS_AVAILABLE(10_7, 4_0);
AVF_EXPORT NSString *const AVMetadataCommonKeySoftware          NS_AVAILABLE(10_7, 4_0);
```

Chapter metadata

Assets can carry a special kind of metadata: chapters. A chapter is represented by AVTimedMetadataGroup, a collection of metadata that is valid only for a specific time range; that is, the metadata contained in a chapter applies only within that chapter's time segment.

```objc
// The locales of the chapters available in the current asset
@property (readonly) NSArray<NSLocale *> *availableChapterLocales;
// Chapter metadata groups selected by locale and common keys; each group's metadata is
// valid only within that chapter's time range
- (NSArray<AVTimedMetadataGroup *> *)chapterMetadataGroupsWithTitleLocale:(NSLocale *)locale
                                            containingItemsWithCommonKeys:(nullable NSArray<NSString *> *)commonKeys NS_AVAILABLE(10_7, 4_3);
// Given a list of preferred languages, returns an array of chapter metadata groups;
// groups matching a language earlier in the list are preferred
- (NSArray<AVTimedMetadataGroup *> *)chapterMetadataGroupsBestMatchingPreferredLanguages:(NSArray<NSString *> *)preferredLanguages NS_AVAILABLE(10_8, 6_0);
```
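A sketch of reading chapter titles with the second method (it assumes an asset whose chapter metadata has already been loaded):

```objc
NSArray<NSString *> *languages = [NSLocale preferredLanguages];
NSArray<AVTimedMetadataGroup *> *chapters =
    [asset chapterMetadataGroupsBestMatchingPreferredLanguages:languages];
for (AVTimedMetadataGroup *chapter in chapters) {
    // Each group carries the time range in which its metadata is valid
    CMTimeRange range = chapter.timeRange;
    NSArray<AVMetadataItem *> *titles =
        [AVMetadataItem metadataItemsFromArray:chapter.items
                                       withKey:AVMetadataCommonKeyTitle
                                      keySpace:AVMetadataKeySpaceCommon];
    NSLog(@"chapter \"%@\" starts at %.1fs",
          titles.firstObject.stringValue, CMTimeGetSeconds(range.start));
}
```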

Media selection

A media file may contain several options with the same media characteristic, for example a video with two subtitle tracks. To select among such options, AVAsset provides the following API:

```objc
// The media characteristics for which the current asset offers selectable options;
// an array of strings representing those characteristics
@property (nonatomic, readonly) NSArray<NSString *> *availableMediaCharacteristicsWithMediaSelectionOptions NS_AVAILABLE(10_8, 5_0);
// Given a media characteristic, returns the group of selectable options. For example,
// passing the subtitle characteristic returns the asset's selectable subtitle options.
- (nullable AVMediaSelectionGroup *)mediaSelectionGroupForMediaCharacteristic:(NSString *)mediaCharacteristic NS_AVAILABLE(10_8, 5_0);
// The default option of each media selection group
@property (nonatomic, readonly) AVMediaSelection *preferredMediaSelection NS_AVAILABLE(10_11, 9_0);
```
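For example, a sketch that lists an asset's selectable subtitle options (subtitles fall under the legible characteristic):

```objc
if ([asset.availableMediaCharacteristicsWithMediaSelectionOptions containsObject:AVMediaCharacteristicLegible]) {
    AVMediaSelectionGroup *subtitleGroup =
        [asset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible];
    for (AVMediaSelectionOption *option in subtitleGroup.options) {
        // displayName is a human-readable name; extendedLanguageTag is the BCP-47 language tag
        NSLog(@"subtitle: %@ (%@)", option.displayName, option.extendedLanguageTag);
    }
}
```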

Reference resources

AVFoundation Development Secrets: Practical Mastery of Audio-Visual Processing Technology in iOS & OS X Applications

Sample code: https://github.com/QinminiOS/AVFoundation