Summary of various methods of sound playback in iOS

Preface

With the recent holiday, couples have been showing off their affection everywhere, and we programmers were quietly fed yet another round of "dog food". I have been busy with a project and haven't written anything for more than two months, so today, with some free time, I want to pull out the music playback part of iOS (including sound effects) and summarize it.

Main part:

1. sound playback,
2. music playback (local, network),
3. audio queue service

1. sound playback (AudioToolbox/AudioToolbox.h)

Audio files must be packaged as .caf, .aif, or .wav (that is what the official documentation says; in actual testing some .mp3 files also play). The sound must be no longer than 30 seconds; that limit is not mine, it comes from Apple's API documentation.

(Figure: AudioServices_h.png)
Create a SystemSoundID for the sound; playback and disposal of the sound are then driven by that ID.

AudioServicesCreateSystemSoundID(CFURLRef inFileURL, SystemSoundID *outSystemSoundID)

Play sound

AudioServicesPlaySystemSound(SystemSoundID inSystemSoundID)

Available from iOS 9, plays with a completion block callback

AudioServicesPlaySystemSoundWithCompletion(SystemSoundID inSystemSoundID, void (^__nullable inCompletionBlock)(void))

Play with vibration

AudioServicesPlayAlertSound(SystemSoundID inSystemSoundID)

Available from iOS 9, plays the alert with a completion block callback

AudioServicesPlayAlertSoundWithCompletion(SystemSoundID inSystemSoundID, void (^__nullable inCompletionBlock)(void))

Before iOS 9, how do you know when a sound effect has finished playing? Use the following method:

AudioServicesAddSystemSoundCompletion (SystemSoundID inSystemSoundID, CFRunLoopRef __nullable inRunLoop, CFStringRef __nullable inRunLoopMode, AudioServicesSystemSoundCompletionProc inCompletionRoutine, void * __nullable inClientData)

Dispose of the sound

AudioServicesDisposeSystemSoundID(SystemSoundID inSystemSoundID)

Here is a demo of the methods above. It plays a few sound effects; trying to play the 48-second MP3 will fail, since it exceeds the 30-second limit.

static SystemSoundID soundID = 0;

- (IBAction)play:(id)sender {
    // NSString *str = [[NSBundle mainBundle] pathForResource:@"vcyber_waiting" ofType:@"wav"];
    NSString *str = [[NSBundle mainBundle] pathForResource:@"28s" ofType:@"mp3"];
    // NSString *str = [[NSBundle mainBundle] pathForResource:@"48S" ofType:@"mp3"];
    NSURL *url = [NSURL fileURLWithPath:str];
    AudioServicesCreateSystemSoundID((__bridge CFURLRef _Nonnull)(url), &soundID);
    // AudioServicesAddSystemSoundCompletion(soundID, NULL, NULL, soundCompleteCallBack, NULL);
    // AudioServicesPlaySystemSound(soundID);
    // AudioServicesPlayAlertSound(soundID);
    // AudioServicesPlaySystemSoundWithCompletion(soundID, ^{
    //     NSLog(@"finished playing");
    //     AudioServicesDisposeSystemSoundID(soundID);
    // });
    AudioServicesPlayAlertSoundWithCompletion(soundID, ^{
        NSLog(@"finished playing");
    });
}

void soundCompleteCallBack(SystemSoundID soundID, void *clientData) {
    NSLog(@"finished playing");
    AudioServicesDisposeSystemSoundID(soundID);
}

- (IBAction)stop:(id)sender {
    AudioServicesDisposeSystemSoundID(soundID);
}

2. local music playback

AVAudioPlayer

AVAudioPlayer is the most commonly used class for playing local music, and most people should already be familiar with it, so I won't go into depth here; I'll just show its basic usage and the delegate methods. The code below is commented in detail.

@interface LocalMusicViewController () <AVAudioPlayerDelegate>
/** player */
@property (nonatomic, strong) AVAudioPlayer *player;
/** playback progress bar */
@property (weak, nonatomic) IBOutlet UIProgressView *progress;
/** slider that changes the playback position */
@property (weak, nonatomic) IBOutlet UISlider *progressSlide;
/** slider that changes the volume */
@property (weak, nonatomic) IBOutlet UISlider *volum;
/** timer that updates the progress bar and slider */
@property (nonatomic, strong) NSTimer *timer;
@end

@implementation LocalMusicViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    NSError *err;
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"1" withExtension:@"mp3"];
    // initialize the player
    _player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&err];
    self.volum.value = 0.5;
    // set the player volume
    _player.volume = self.volum.value;
    // set the delegate
    _player.delegate = self;
    // set the playback rate
    _player.rate = 1;
    // set the number of loops; a negative value loops forever
    _player.numberOfLoops = -1;
    // prepare to play
    [_player prepareToPlay];
    self.progress.progress = 0;
    self.progressSlide.value = 0;
    _timer = [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(change) userInfo:nil repeats:YES];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
}

- (void)change {
    self.progress.progress = _player.currentTime / _player.duration;
}

- (IBAction)progressChange:(UISlider *)sender {
    // change the current playback position
    _player.currentTime = sender.value * _player.duration;
    self.progress.progress = sender.value;
}

- (IBAction)volumChange:(UISlider *)sender {
    // change the volume
    _player.volume = sender.value;
}

- (IBAction)player:(id)sender {
    // play
    [_player play];
}

- (IBAction)stop:(id)sender {
    // pause
    [_player stop];
}

#pragma mark --AVAudioPlayerDelegate
/** called when playback finishes; not called when playback is interrupted by pause or stop */
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
}

/** called when a decoding error occurs during playback */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error {
}

/** playback was interrupted */
- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 8_0) {
}

/** the interruption ended */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags NS_DEPRECATED_IOS(6_0, 8_0) {
}

/** the interruption ended */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags NS_DEPRECATED_IOS(4_0, 6_0) {
}

/** replaced by the method above */
- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player NS_DEPRECATED_IOS(2_2, 6_0) {
}
@end

Network music player (AVPlayer)

AVPlayer is the class most commonly used to play music and video from the network; it can buffer network data and then play it. When playing video you must also create an AVPlayerLayer to display the picture; if you are only playing sound, you do not need to create that object. A small sketch of the video case follows, and after it a brief demonstration of playing music from the network.
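Just to make the video case concrete, here is a minimal sketch of my own (not part of the demo below) of attaching an AVPlayerLayer; the videoView container passed in is an assumed view in the hosting controller.

#import <AVFoundation/AVFoundation.h>

- (void)attachPlayerLayerToView:(UIView *)videoView {
    // Only needed for video; audio-only playback works without a layer.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = videoView.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [videoView.layer addSublayer:playerLayer];
}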

1. Create an AVPlayerItem from a network URL

There are many ways to initialize an AVPlayerItem; here I create it directly with the initWithURL: method.

- (AVPlayerItem *)getItemWithIndex:(NSInteger)index {
    NSURL *url = [NSURL URLWithString:self.musicArray[index]];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithURL:url];
    // KVO to monitor the playback status
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // KVO to monitor the buffered time ranges
    [item addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
    // notification posted when the item finishes playing
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playOver:) name:AVPlayerItemDidPlayToEndTimeNotification object:item];
    return item;
}
2. Implement the KVO callback and use keyPath to determine which property changed
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey, id> *)change context:(void *)context {
    AVPlayerItem *item = object;
    if ([keyPath isEqualToString:@"status"]) {
        switch (self.player.status) {
            case AVPlayerStatusUnknown:
                NSLog(@"unknown status, cannot play");
                break;
            case AVPlayerStatusReadyToPlay:
                NSLog(@"ready to play");
                break;
            case AVPlayerStatusFailed:
                NSLog(@"failed to load, network-related problem");
                break;
            default:
                break;
        }
    }
    if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *array = item.loadedTimeRanges;
        // the buffered time range
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue];
        // total buffered length
        NSTimeInterval totalBufferTime = CMTimeGetSeconds(timeRange.start) + CMTimeGetSeconds(timeRange.duration);
        self.bufferProgress.progress = totalBufferTime / CMTimeGetSeconds(item.duration);
    }
}
3. Lazy-load the AVPlayer
- (AVPlayer *)player {
    if (!_player) {
        // get an item for the first URL in the array and use it to initialize the AVPlayer
        AVPlayerItem *item = [self getItemWithIndex:self.currentIndex];
        // initialize the player
        _player = [[AVPlayer alloc] initWithPlayerItem:item];
        __weak typeof(self) weakSelf = self;
        // addPeriodicTimeObserverForInterval:queue:usingBlock: monitors playback progress.
        // The block is called back at the given CMTime interval (and when playback starts and stops)
        // and is used to obtain the current playback time.
        // It returns an observer object that must be removed when playback is finished.
        _timeObserver = [_player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
            float current = CMTimeGetSeconds(time);
            if (current) {
                [weakSelf.progressView setProgress:current / CMTimeGetSeconds(item.duration) animated:YES];
                weakSelf.progressSlide.value = current / CMTimeGetSeconds(item.duration);
            }
        }];
    }
    return _player;
}
4. play and pause
// play
- (IBAction)play:(id)sender {
    [self.player play];
}
// pause
- (IBAction)pause:(id)sender {
    [self.player pause];
}
5. Next and previous track
- (IBAction)next:(UIButton *)sender {
    [self removeObserver];
    self.currentIndex++;
    if (self.currentIndex >= self.musicArray.count) {
        self.currentIndex = 0;
    }
    // this method replaces the current item with a new one
    [self.player replaceCurrentItemWithPlayerItem:[self getItemWithIndex:self.currentIndex]];
    [self.player play];
}

- (IBAction)last:(UIButton *)sender {
    [self removeObserver];
    self.currentIndex--;
    if (self.currentIndex < 0) {
        self.currentIndex = 0;
    }
    // this method replaces the current item with a new one
    [self.player replaceCurrentItemWithPlayerItem:[self getItemWithIndex:self.currentIndex]];
    [self.player play];
}

// Before playing another track, remove the observers on the current item
// and also remove the did-play-to-end notification observer
- (void)removeObserver {
    [self.player.currentItem removeObserver:self forKeyPath:@"status"];
    [self.player.currentItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
6. Control the playback position. There are several methods for this. If you do not need precise seeking, use - (void)seekToTime:(CMTime)time; if you want accurate seeking, use - (void)seekToTime:(CMTime)time toleranceBefore:(CMTime)toleranceBefore toleranceAfter:(CMTime)toleranceAfter (a sketch of the precise variant follows the snippet below).
- (IBAction)changeProgress:(UISlider *)sender {
    if (self.player.status == AVPlayerStatusReadyToPlay) {
        [self.player seekToTime:CMTimeMake(CMTimeGetSeconds(self.player.currentItem.duration) * sender.value, 1)];
    }
}
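If you do need frame-accurate seeking, a sketch of the tolerance-based variant might look like the following; it is the same slider action as above, just with zero tolerances, which trades speed for precision.

- (IBAction)changeProgressPrecisely:(UISlider *)sender {
    if (self.player.status == AVPlayerStatusReadyToPlay) {
        Float64 seconds = CMTimeGetSeconds(self.player.currentItem.duration) * sender.value;
        // zero tolerance before and after = land exactly on the requested time (slower)
        [self.player seekToTime:CMTimeMakeWithSeconds(seconds, NSEC_PER_SEC)
                toleranceBefore:kCMTimeZero
                 toleranceAfter:kCMTimeZero];
    }
}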

3. Audio queue service (Audio Queue Services)

Audio Queue Services live in the AudioToolbox framework, the framework used for streaming network media; they can do both audio playback and recording. An audio queue service is made up of three parts:
1. Three buffers (Buffers): each buffer is a temporary warehouse for audio data.
2. A buffer queue (Buffer Queue): an ordered queue holding the audio buffers.
3. A callback (CallBack): a custom callback function.
During playback, audio data is read into a buffer; once a buffer is filled it goes into the buffer queue, and the next buffer keeps being filled. When playback starts, audio is played from the first buffer in the queue. Once a buffer finishes playing, the callback function fires, the audio in the next buffer starts playing, and the finished buffer is refilled and put back into the queue. Here is the official diagram of the whole process:

(Figure: Playback_Audio_Queues.png)

AudioQueue workflow:
1. Create the AudioQueue and create a buffer array (BufferArray) to hold AudioQueueBufferRef instances.
2. Create AudioQueueBufferRefs with AudioQueueAllocateBuffer (usually 2 or 3) and put them into the buffer array.
3. When data arrives, take a buffer out of the BufferArray, memcpy the data into it, and then insert the buffer into the AudioQueue with AudioQueueEnqueueBuffer.
4. Once the AudioQueue has buffers enqueued, call AudioQueueStart to start playing. (How many buffers to enqueue before starting is up to you, as long as playback is never starved.)
5. After the AudioQueue has consumed a buffer, the callback is invoked on another thread and hands the buffer back; put it back into the BufferArray for reuse.
6. Return to step 3 and repeat until playback ends. (A minimal sketch of steps 3 to 5 follows this list.)
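Here is a minimal sketch of steps 3 to 5. MyPlayerState, FillBytes and the constant-bit-rate PCM assumption are my own simplifications for illustration, not a complete player.

#import <AudioToolbox/AudioToolbox.h>

typedef struct {
    AudioQueueRef queue;
    // ... your own data source (file handle, ring buffer of network data, etc.)
} MyPlayerState;

// Assumed helper: copies up to inCapacity bytes of audio into ioData, returns bytes written.
static UInt32 FillBytes(MyPlayerState *state, void *ioData, UInt32 inCapacity);

// Step 5: the AudioQueue hands a used buffer back on its own thread; refill and re-enqueue it.
static void MyAQOutputCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
    MyPlayerState *state = (MyPlayerState *)inUserData;
    UInt32 bytes = FillBytes(state, inBuffer->mAudioData, inBuffer->mAudioDataBytesCapacity);
    if (bytes == 0) {
        AudioQueueStop(inAQ, false);   // no more data: let the queued audio drain, then stop
        return;
    }
    inBuffer->mAudioDataByteSize = bytes;
    // For constant-bit-rate PCM no packet descriptions are needed, so pass 0 / NULL.
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}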

Commonly used API
Create AudioQueue
The first parameter describes the format of the audio to be played, an AudioStreamBasicDescription; this is the format information parsed out with AudioFileStream or AudioFile. The second parameter, AudioQueueOutputCallback, is the callback invoked after a Buffer has been used. The third parameter is the user/context object passed to that callback. The fourth parameter, inCallbackRunLoop, is the RunLoop the callback should run on; if you pass NULL the callback runs on the AudioQueue's own internal thread, so NULL is usually fine. The fifth parameter, inCallbackRunLoopMode, is the RunLoop mode; passing NULL is equivalent to kCFRunLoopCommonModes, so NULL is fine here too. The sixth parameter, inFlags, is a reserved field with no effect at present; pass 0. The seventh parameter returns the created AudioQueue instance. The return value tells you whether creation succeeded (OSStatus == noErr).

extern OSStatus AudioQueueNewOutput(const AudioStreamBasicDescription *inFormat,
                                    AudioQueueOutputCallback inCallbackProc,
                                    void * __nullable inUserData,
                                    CFRunLoopRef __nullable inCallbackRunLoop,
                                    CFStringRef __nullable inCallbackRunLoopMode,
                                    UInt32 inFlags,
                                    AudioQueueRef __nullable * __nonnull outAQ);

The following function takes basically the same parameters; it just replaces the RunLoop with a dispatch queue:

extern OSStatus AudioQueueNewOutputWithDispatchQueue(AudioQueueRef __nullable * __nonnull outAQ,
                                                     const AudioStreamBasicDescription *inFormat,
                                                     UInt32 inFlags,
                                                     dispatch_queue_t inCallbackDispatchQueue,
                                                     AudioQueueOutputCallbackBlock inCallbackBlock);
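Continuing the sketch above, creating the output queue could look like this; the AudioStreamBasicDescription is assumed to have been parsed elsewhere (for example with AudioFileStream).

static OSStatus CreateOutputQueue(const AudioStreamBasicDescription *asbd, MyPlayerState *state) {
    // NULL RunLoop / NULL mode: the callback runs on the queue's internal thread.
    return AudioQueueNewOutput(asbd, MyAQOutputCallback, state,
                               NULL, NULL, 0 /* reserved */, &state->queue);
}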
Create Buffer
The first parameter is the AudioQueue instance; the second is the Buffer size in bytes; the third returns the allocated AudioQueueBufferRef.

extern OSStatus AudioQueueAllocateBuffer(AudioQueueRef inAQ,
                                         UInt32 inBufferByteSize,
                                         AudioQueueBufferRef __nullable * __nonnull outBuffer);

The next method has one extra parameter, inNumberPacketDescriptions, which specifies how many packet descriptions the generated Buffer should hold:

extern OSStatus AudioQueueAllocateBufferWithPacketDescriptions(AudioQueueRef inAQ,
                                                               UInt32 inBufferByteSize,
                                                               UInt32 inNumberPacketDescriptions,
                                                               AudioQueueBufferRef __nullable * __nonnull outBuffer);
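Allocating the usual two or three buffers (step 2 of the workflow) might then look like this; kBufferCount and kBufferSize are example values of my own choosing.

enum { kBufferCount = 3, kBufferSize = 0x10000 };   // 64 KB per buffer

static OSStatus AllocateBuffers(MyPlayerState *state, AudioQueueBufferRef outBuffers[kBufferCount]) {
    for (int i = 0; i < kBufferCount; i++) {
        OSStatus status = AudioQueueAllocateBuffer(state->queue, kBufferSize, &outBuffers[i]);
        if (status != noErr) {
            return status;
        }
    }
    return noErr;
}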
Releasing buffer
The first parameter is the AudioQueue instance; the second specifies the buffer to free.

extern OSStatus AudioQueueFreeBuffer(AudioQueueRef inAQ, AudioQueueBufferRef inBuffer);
Insert buffer
The first parameter is the AudioQueue instance; the second specifies the Buffer; the third is the number of packets in the data; the fourth is the packet descriptions.

extern OSStatus AudioQueueEnqueueBuffer(AudioQueueRef inAQ,
                                        AudioQueueBufferRef inBuffer,
                                        UInt32 inNumPacketDescs,
                                        const AudioStreamPacketDescription * __nullable inPacketDescs);

The method above is enough for basic needs; the following one allows extra operations (trimming frames, parameter events, a start time) when enqueuing a buffer:

extern OSStatus AudioQueueEnqueueBufferWithParameters(AudioQueueRef inAQ,
                                                      AudioQueueBufferRef inBuffer,
                                                      UInt32 inNumPacketDescs,
                                                      const AudioStreamPacketDescription * __nullable inPacketDescs,
                                                      UInt32 inTrimFramesAtStart,
                                                      UInt32 inTrimFramesAtEnd,
                                                      UInt32 inNumParamValues,
                                                      const AudioQueueParameterEvent * __nullable inParamValues,
                                                      const AudioTimeStamp * __nullable inStartTime,
                                                      AudioTimeStamp * __nullable outActualStartTime) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
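And step 3, copying raw bytes into a free buffer and enqueuing it, could be sketched like this (again assuming constant-bit-rate PCM, so no packet descriptions are needed, and that length does not exceed the buffer capacity).

static OSStatus EnqueueBytes(MyPlayerState *state, AudioQueueBufferRef buffer,
                             const void *data, UInt32 length) {
    memcpy(buffer->mAudioData, data, length);   // copy your own audio bytes into the buffer
    buffer->mAudioDataByteSize = length;
    return AudioQueueEnqueueBuffer(state->queue, buffer, 0, NULL);
}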
Begin play
The first parameter is the AudioQueue instance; the second is the time at which to start playing; passing NULL starts playback right away.

extern OSStatus AudioQueueStart(AudioQueueRef inAQ, const AudioTimeStamp * __nullable inStartTime);
Decode the data. This is rarely called directly, because decoding starts automatically when playback begins.
extern OSStatus AudioQueuePrime(AudioQueueRef inAQ, UInt32 inNumberOfFramesToPrepare, UInt32 * __nullable outNumberOfFramesPrepared);
Stop playing
The second parameter is a Boolean that controls whether playback stops immediately; if false, playback stops only after all enqueued buffers have finished playing.

extern OSStatus AudioQueueStop(AudioQueueRef inAQ, Boolean inImmediate) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
Pause play
extern OSStatus AudioQueuePause(AudioQueueRef inAQ) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
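A pause/resume toggle is then trivial, since pausing keeps the enqueued buffers; this is just a sketch on top of the MyPlayerState struct from earlier.

static void TogglePause(MyPlayerState *state, Boolean paused) {
    if (paused) {
        AudioQueuePause(state->queue);
    } else {
        AudioQueueStart(state->queue, NULL);  // NULL start time = resume as soon as possible
    }
}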
Reset decoder
This method resets the decoder after the buffers already in the queue have finished playing, so that the decoder state does not affect the next piece of audio, for example when switching songs. If it is used together with AudioQueueStop(AQ, false) it has no effect, because Stop with false already does the same work.

extern OSStatus AudioQueueFlush(AudioQueueRef inAQ) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
Reset AudioQueue
Resetting the AudioQueue clears all buffers that have already been enqueued and triggers the AudioQueueOutputCallback for them; calling AudioQueueStop triggers the same behaviour. Calling this method directly is usually done when seeking, to discard the remaining buffers (another way to seek is to call AudioQueueStop first and then start again once the seek is complete).

extern OSStatus AudioQueueReset(AudioQueueRef inAQ) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
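As a sketch of the alternative seek path mentioned above (stop first, start again after the seek), it could look like this; SeekDataSourceTo is an assumed helper that repositions whatever data source you are reading from.

static void SeekDataSourceTo(MyPlayerState *state, SInt64 packetOffset);  // assumed helper

static void SeekByRestarting(MyPlayerState *state, SInt64 packetOffset) {
    AudioQueueStop(state->queue, true);     // stop immediately, discarding queued audio
    SeekDataSourceTo(state, packetOffset);  // move the read position of your own source
    // ... re-enqueue a few filled buffers here, exactly as when playback first started ...
    AudioQueueStart(state->queue, NULL);    // resume from the new position
}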
Get play time
The call fills in an AudioTimeStamp; the playback time is read from that structure.

extern OSStatus AudioQueueGetCurrentTime(AudioQueueRef inAQ,
                                         AudioQueueTimelineRef __nullable inTimeline,
                                         AudioTimeStamp * __nullable outTimeStamp,
                                         Boolean * __nullable outTimelineDiscontinuity) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
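Converting the returned AudioTimeStamp into seconds is just a division by the stream's sample rate; a sketch, assuming you keep the mSampleRate from your AudioStreamBasicDescription around.

static Float64 CurrentPlaybackSeconds(MyPlayerState *state, Float64 sampleRate) {
    AudioTimeStamp timeStamp = {0};
    OSStatus status = AudioQueueGetCurrentTime(state->queue, NULL, &timeStamp, NULL);
    if (status != noErr) {
        return 0;
    }
    // mSampleTime is in sample frames since the queue started playing
    return timeStamp.mSampleTime / sampleRate;
}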
Destroy AudioQueue
The parameters have basically the same meaning as those of AudioQueueStop.

extern OSStatus AudioQueueDispose(AudioQueueRef inAQ, Boolean inImmediate) __OSX_AVAILABLE_STARTING(__MAC_10_5, __IPHONE_2_0);
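Teardown can then mirror AudioQueueStop; a sketch:

static void DisposePlayer(MyPlayerState *state, Boolean immediately) {
    AudioQueueStop(state->queue, immediately);
    AudioQueueDispose(state->queue, immediately);  // also frees the buffers allocated for this queue
    state->queue = NULL;
}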
AudioQueue parameter
Parameters are read and written with AudioQueueGetParameter and AudioQueueSetParameter. Parameter list:

CF_ENUM(AudioQueueParameterID) {
    kAudioQueueParam_Volume         = 1,
    kAudioQueueParam_PlayRate       = 2,
    kAudioQueueParam_Pitch          = 3,
    kAudioQueueParam_VolumeRampTime = 4,
    kAudioQueueParam_Pan            = 13
};
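Setting and reading a parameter is a one-liner each; for example, volume (a sketch).

static void SetHalfVolume(AudioQueueRef queue) {
    // Parameters are plain Float32 values set directly on the queue.
    AudioQueueSetParameter(queue, kAudioQueueParam_Volume, 0.5f);     // half volume
    Float32 volume = 0;
    AudioQueueGetParameter(queue, kAudioQueueParam_Volume, &volume);  // read it back
}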
AudioQueue property
Properties are read and written with AudioQueueGetPropertySize, AudioQueueGetProperty, and AudioQueueSetProperty. Property list:

CF_ENUM(AudioQueuePropertyID) {
    kAudioQueueProperty_IsRunning               = 'aqrn', // value is UInt32
    kAudioQueueDeviceProperty_SampleRate        = 'aqsr', // value is Float64
    kAudioQueueDeviceProperty_NumberChannels    = 'aqdc', // value is UInt32
    kAudioQueueProperty_CurrentDevice           = 'aqcd', // value is CFStringRef
    kAudioQueueProperty_MagicCookie             = 'aqmc', // value is void*
    kAudioQueueProperty_MaximumOutputPacketSize = 'xops', // value is UInt32
    kAudioQueueProperty_StreamDescription       = 'aqft', // value is AudioStreamBasicDescription
    kAudioQueueProperty_ChannelLayout           = 'aqcl', // value is AudioChannelLayout
    kAudioQueueProperty_EnableLevelMetering     = 'aqme', // value is UInt32
    kAudioQueueProperty_CurrentLevelMeter       = 'aqmv', // value is array of AudioQueueLevelMeterState, 1 per channel
    kAudioQueueProperty_CurrentLevelMeterDB     = 'aqmd', // value is array of AudioQueueLevelMeterState, 1 per channel
    kAudioQueueProperty_DecodeBufferSizeFrames  = 'dcbf', // value is UInt32
    kAudioQueueProperty_ConverterError          = 'qcve', // value is UInt32
    kAudioQueueProperty_EnableTimePitch         = 'q_tp', // value is UInt32, 0/1
    kAudioQueueProperty_TimePitchAlgorithm      = 'qtpa', // value is UInt32. See values below.
    kAudioQueueProperty_TimePitchBypass         = 'qtpb', // value is UInt32, 1=bypassed
};
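Reading a property goes through a size/value pair; for example, checking whether the queue is actually running (a sketch).

static Boolean QueueIsRunning(AudioQueueRef queue) {
    UInt32 isRunning = 0;
    UInt32 size = sizeof(isRunning);
    AudioQueueGetProperty(queue, kAudioQueueProperty_IsRunning, &isRunning, &size);
    return isRunning != 0;
}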
Methods for listening for property changes
AudioQueueAddPropertyListener AudioQueueRemovePropertyListener
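A sketch of listening for kAudioQueueProperty_IsRunning, which is handy for knowing when the queue really starts and stops.

static void IsRunningChanged(void *inUserData, AudioQueueRef inAQ, AudioQueuePropertyID inID) {
    UInt32 isRunning = 0;
    UInt32 size = sizeof(isRunning);
    AudioQueueGetProperty(inAQ, inID, &isRunning, &size);
    NSLog(@"queue is now %@", isRunning ? @"running" : @"stopped");
}

// somewhere after creating the queue:
// AudioQueueAddPropertyListener(queue, kAudioQueueProperty_IsRunning, IsRunningChanged, NULL);
// ... and when tearing down:
// AudioQueueRemovePropertyListener(queue, kAudioQueueProperty_IsRunning, IsRunningChanged, NULL);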

Summary:

Everything here is (admittedly) limited by my own ability; AudioQueue can in fact do a lot more. If you want to study AudioQueue in more detail, I recommend two GitHub projects: AudioStreamer and FreeStreamer. Both of them implement playback using AudioQueue.