iOS Audio Unit (Part Two)

The previous article introduced some of the basic concepts and usage of iOS Audio Unit. This one follows up by reading an open-source audio engine framework from GitHub: studying how Audio Unit technology is applied in real projects, and what considerations go into code structure and design, deepens your understanding of the technology and helps you absorb it into your own projects. The powerful open-source audio libraries I have come across include AudioKit, TheAmazingAudioEngine, EZAudio, and DOUAudioStreamer; each has a different emphasis. Here I take TheAmazingAudioEngine as the example and walk through it in detail.

TheAmazingAudioEngine

This framework is a mature audio technology framework usable in both iOS and macOS apps. Built on Core Audio's Remote IO, it offers good performance and low-latency processing. Unlike the open-source AudioKit library, which provides a rich set of individual tools, TheAmazingAudioEngine supplies a complete solution covering audio capture, processing, and playback, so you can quickly build a highly integrated audio-creation app. AudioKit is the better choice if you do not yet understand audio processing well enough, or if you need more basic audio building blocks to assemble a particular scenario yourself.

The diagram below is a summary class diagram of the TheAmazingAudioEngine open-source framework. The sections that follow describe the design ideas behind several core modules in detail, and how these techniques can be applied in your own designs.

(Figure: AmazingAudioEngineDiagram.jpg, summary class diagram of TheAmazingAudioEngine)

Main modules

AEBufferStack

This class stores and wraps audio data as a stack-shaped buffer pool, and it is the medium through which the stages of the engine's processing pipeline interact. For example: captured audio data is pushed onto the buffer stack; effects, analysis, and recording work on the existing top buffers; multiple audio channels are mixed down on the stack and rendered to the output buffer; and when all processing is finished, the buffers are popped and released. The AEBufferStack-related structures are defined as follows:

const UInt32 AEBufferStackMaxFramesPerSlice = 4096;
static const int kDefaultPoolSize = 16;

typedef struct _AEBufferStackBufferLinkedList {
    void * buffer;
    struct _AEBufferStackBufferLinkedList * next;
} AEBufferStackPoolEntry;

typedef struct {
    void * bytes;
    AEBufferStackPoolEntry * free;
    AEBufferStackPoolEntry * used;
} AEBufferStackPool;

struct AEBufferStack {
    int               poolSize;
    int               maxChannelsPerBuffer;
    UInt32            frameCount;
    int               stackCount;
    AEBufferStackPool audioPool;
    AEBufferStackPool bufferListPool;
};

AEBufferStack.h and .m define helper methods for the many stack operations. Because the buffer stack is a singly linked structure, push and pop operations relink the corresponding nodes; externally, callers only deal with the AudioBufferList objects on the stack, without having to consider these details. AudioBufferList is the basic object of the Audio Unit render callback.
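As an illustration of this stack discipline, here is a minimal sketch (my own, not from the framework) of how a render callback might push, peek at, and pop buffers using the helpers declared in AEBufferStack.h; treat the exact signatures as assumptions to verify against the header:

static void renderStep(AEBufferStack * stack) {
    // Push one new buffer onto the top of the stack (drawn from the pool)
    const AudioBufferList * abl = AEBufferStackPush(stack, 1);
    if ( !abl ) return; // pool exhausted
    // ... fill or process abl here: capture, synthesis, effects ...
    // Peek at the top buffer (index 0 is the top of the stack)
    const AudioBufferList * top = AEBufferStackGet(stack, 0);
    (void)top;
    // When processing is done, pop to return the buffer to the pool
    AEBufferStackPop(stack, 1);
}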

AERenderer

This is the core of the rendering engine, mainly responsible for driving the audio processing flow. It usually sits between audio production and processing; put simply, it drives a series of processing operations in order, and the sub-renderer (AESubrendererModule) is often used the same way for intermediate stages of the pipeline. It mainly exposes the AERenderLoopBlock callback: in this callback the user applies the audio processing, with data passed around through the AERenderContext. For example, create an AERenderer instance and hand it to an AEAudioUnitOutput; when the output unit's render callback fires, the renderer runs and invokes the loop block you provided, where you can apply effects to the output buffer; after the callback returns, the buffer is sent to the hardware for playback.
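A minimal sketch of that wiring, condensed from the full pipeline example at the end of this article:

AERenderer * renderer = [AERenderer new];
self.output = [[AEAudioUnitOutput alloc] initWithRenderer:renderer];
renderer.block = ^(const AERenderContext * _Nonnull context) {
    // push and process buffers here (players, effects, ...)
    AERenderContextOutput(context, 1); // send the top buffer to the hardware
};
[self.output start:NULL];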

AEAudioUnitOutput

This is the outward-facing wrapper around AEIOAudioUnit (the class that actually wraps the core Audio Unit). Working together with AERenderer, it is mainly responsible for the playback side of the engine's processing flow. The framework also provides a separate AEAudioUnitInputModule to handle capture, because Audio Unit is designed differently on the two platforms: on iOS, input and output are provided by a single IO unit, while on macOS they are two independent units.

AEAudioFileOutput

This class encapsulates writing audio data to a file; combined with AERenderer, the audio can be processed before it is written. The sample code is as follows:

- (NSError *)createTestFile {
    AERenderer * renderer = [AERenderer new];
    __block NSError * error = nil;
    AEAudioFileOutput * output =
        [[AEAudioFileOutput alloc] initWithRenderer:renderer
                                                URL:self.fileURL
                                               type:AEAudioFileTypeAIFFInt16
                                         sampleRate:44100
                                       channelCount:1
                                              error:&error];
    if ( !output ) {
        return error;
    }

    AEOscillatorModule * osc = [[AEOscillatorModule alloc] initWithRenderer:renderer];
    osc.frequency = 440;

    renderer.block = ^(const AERenderContext * context) {
        AEModuleProcess(osc, context);
        AERenderContextOutput(context, 1);
    };

    __block BOOL done = NO;
    [output runForDuration:kTestFileLength completionBlock:^(NSError * e) {
        done = YES;
        error = e;
    }];
    while ( !done ) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode
                                 beforeDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
    }
    [output finishWriting];
    return error;
}

AEModule

This is the basic unit of audio processing. The AEModule base class provides the address of the processing method through AEModuleProcessFunc and a unified entry point: during rendering, calling AEModuleProcess(module, context) hands the specific module instance and the incoming render context (with its audio buffers) over for synchronous processing. The main derived processing classes are as follows:

AEAudioUnitModule

Inherited from the AEModule base class, this is the base class for Audio Unit-backed sound and effect modules; AEAudioFilePlayerModule and AEBandpassModule, for example, both derive from it. The base class encapsulates unified Audio Unit creation, configuration, and the processing entry point defined by AEModule, so a concrete processing module only needs to specify its Audio Unit type and the related configuration parameters.
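As a hedged illustration (this class is not from the framework), a custom effect module built on this base might look roughly like the following; the initializer and the AEAudioComponentDescriptionMake helper follow the pattern used by the bundled modules, but check the headers for the exact signatures:

@interface MyBandpassModule : AEAudioUnitModule
@end

@implementation MyBandpassModule
- (instancetype)initWithRenderer:(AERenderer *)renderer {
    // Only the Audio Unit type needs to be specified; the base class handles
    // unit creation, configuration, and the AEModule process entry point.
    return [super initWithRenderer:renderer
              componentDescription:AEAudioComponentDescriptionMake(
                                       kAudioUnitManufacturer_Apple,
                                       kAudioUnitType_Effect,
                                       kAudioUnitSubType_BandPassFilter)];
}
@end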

AEMixerModule

The mixing module. It holds an array of ordinary processing modules (e.g. AEAudioFilePlayerModule); during processing it enumerates every module in the array and calls its processFunc, mixes the resulting stack buffers one by one, and applies each unit's volume and balance.

AEAudioFilePlayerModule

The audio playback module. Derived from AEAudioUnitModule, it provides playback of every audio format supported by the iOS platform. The work is concentrated in its AEModuleProcessFunc implementation, which keeps reading data from the audio file onto the stack buffer, and also implements looping and fade-in/fade-out at the start and end of playback.
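A short, hedged usage sketch, with renderer and url taken from the earlier examples; the loop property and playAtTime: method shown here match the player API as I recall it, so verify against AEAudioFilePlayerModule.h:

AEAudioFilePlayerModule * player =
    [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url error:NULL];
player.loop = YES;      // wrap around to the start when the file ends
[player playAtTime:0];  // 0 host ticks = start playing immediately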

AEBandpassModule

One of the audio effect modules, derived from AEAudioUnitModule; it defines only the setters and getters for the parameters of the Bandpass effect. The other audio effect modules are similar.

AEManagedValue

An Objective-C wrapper for managing the life cycle of object and void* memory pointers. It uses pthread_mutex_t and OSAtomic operations to keep multi-threaded access safe while avoiding lock operations on the real-time thread.
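The typical pattern, sketched under the assumption that the renderer and module come from the earlier examples: the main thread swaps the object in, and the real-time thread fetches it lock-free through the C accessor:

AEManagedValue * managedFilter = [AEManagedValue new];
managedFilter.objectValue = [[AEBandpassModule alloc] initWithRenderer:renderer]; // main thread

renderer.block = ^(const AERenderContext * _Nonnull context) {
    // Real-time thread: fetch the current value without taking a lock
    __unsafe_unretained AEBandpassModule * filter =
        (__bridge AEBandpassModule *)AEManagedValueGetValue(managedFilter);
    if ( filter ) AEModuleProcess(filter, context);
    AERenderContextOutput(context, 1);
};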

AERealtimeWatchdog

The real-time watchdog. By overriding system calls such as malloc/free and the socket send/recv APIs, it monitors whether the Audio Unit worker thread (AURemoteIO::IOThread) calls any of these functions that hurt real-time behavior, and prints out the offending call, for example a call to malloc. The core code is below:

// Signatures for the functions we'll override
typedef void * (*malloc_t)(size_t);

// Overrides
void * malloc(size_t sz) {
    static malloc_t funcptr = NULL;
    if ( !funcptr ) funcptr = (malloc_t)dlsym(RTLD_NEXT, "malloc");
    if ( AERealtimeWatchdogIsOnRealtimeThread() ) AERealtimeWatchdogUnsafeActivityWarning("malloc");
    return funcptr(sz);
}

// Print function
static void AERealtimeWatchdogUnsafeActivityWarning(const char * activity) {
#ifndef REPORT_EVERY_INFRACTION
    static BOOL once = NO;
    if ( !once ) {
        once = YES;
#endif
        printf("AERealtimeWatchdog: Caught unsafe %s on realtime thread. "
               "Put a breakpoint on %s to debug\n", activity, __FUNCTION__);
#ifndef REPORT_EVERY_INFRACTION
    }
#endif
}

// Confirm the current thread is the AURemoteIO thread
BOOL AERealtimeWatchdogIsOnRealtimeThread(void) {
    pthread_t thread = pthread_self();
    char name[21] = {0};
    if ( pthread_getname_np(thread, name, sizeof(name)) == 0
         && !strcmp(name, "AURemoteIO::IOThread") ) {
        return YES;
    }
    return NO;
}

AEAudioBufferListUtilities

This toolset encapsulates common basic operations on AudioBufferList, such as initialization, copying, and release, along with the corresponding stack operations.
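A hedged sketch of typical calls; the helper names follow the AEAudioBufferList* convention in the header, but treat the exact signatures as assumptions to verify:

AudioBufferList * abl = AEAudioBufferListCreate(512); // allocate a list holding 512 frames
AEAudioBufferListSilence(abl, 0, 512);                // zero all 512 frames from offset 0
AEAudioBufferListFree(abl);                           // free the buffers and the list itself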

AEDSPUtilities

This toolkit wraps some of the audio signal processing algorithms in Accelerate.framework/vDSP.h, such as gain, smoothing, mixing, and other operations. The exposed interfaces take AudioBufferList, which makes them easy to call.
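For example, applying a scalar gain, which is a thin wrapper over a vDSP vector multiply; the signature follows AEDSPUtilities.h as I recall it, and abl/frames are assumed to come from the previous sketch:

// Halve the amplitude of every channel over the given number of frames
AEDSPApplyGain(abl, 0.5f, frames);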

AEMainThreadEndpoint

A wrapper around AECircularBuffer and NSThread that provides a message and data passing mechanism between the main thread and the audio processing thread. The circular buffer is the carrier of the messages; AEMainThreadEndpoint services it on the main thread, synchronizing events through a semaphore, and the circular buffer internally uses atomic operations to keep multi-threaded access safe.
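A hedged sketch of the message flow; the handler block signature and the AEMainThreadEndpointSend C function follow the API as I recall it, so verify against AEMainThreadEndpoint.h:

// Created on the main thread; the handler is invoked on the main thread
AEMainThreadEndpoint * endpoint =
    [[AEMainThreadEndpoint alloc] initWithHandler:^(const void * data, size_t length) {
        float peak = *(const float *)data;
        NSLog(@"Peak level: %f", peak); // safe to touch UI/Objective-C here
    }];

// On the audio thread, inside the render block (no locks, no allocation):
float peak = 0.8f;
AEMainThreadEndpointSend(endpoint, &peak, sizeof(peak));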

TheAmazingAudioEngine sample flow

The example code below plays three audio files in real time and records the result: first a band-pass effect is applied to the file1 audio; then file2 and file3 are mixed and a delay effect is applied to the mix; finally the buffer data is output to the hardware for playback and recorded to the file at the specified URL.

// Create our renderer and output
AERenderer * renderer = [AERenderer new];
self.output = [[AEAudioUnitOutput alloc] initWithRenderer:renderer];

// Create the players
AEAudioFilePlayerModule * file1 = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url1 error:NULL];
AEAudioFilePlayerModule * file2 = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url2 error:NULL];
AEAudioFilePlayerModule * file3 = [[AEAudioFilePlayerModule alloc] initWithRenderer:renderer URL:url3 error:NULL];

// Create the filters
AEBandpassModule * filter1 = [[AEBandpassModule alloc] initWithRenderer:renderer];
AEDelayModule * filter2 = [[AEDelayModule alloc] initWithRenderer:renderer];

// Create the recorder
AEAudioFileRecorderModule * recorder = [[AEAudioFileRecorderModule alloc] initWithRenderer:renderer URL:outputUrl error:NULL];

renderer.block = ^(const AERenderContext * _Nonnull context) {
    AEModuleProcess(file1, context);     // Run player (pushes 1)
    AEModuleProcess(filter1, context);   // Run filter (edits top buffer)
    AEModuleProcess(file2, context);     // Run player (pushes 1)
    AEModuleProcess(file3, context);     // Run player (pushes 1)
    AEBufferStackMix(context->stack, 2); // Mix top 2 buffers
    AEModuleProcess(filter2, context);   // Run filter (edits top buffer)
    AERenderContextOutput(context, 1);   // Put top buffer onto output
    AEModuleProcess(recorder, context);  // Run recorder (uses top buffer)
};

[self.output start:NULL];

Some takeaways

This brief walk through the TheAmazingAudioEngine framework touches a lot of techniques, many of which I encountered for the first time, and they are very interesting. This article does not dig into any of them deeply; after all, each point could be expanded into an article of its own. If you are interested, download the code and read it yourself.

In addition, the audio technology field subdivides into many areas: professional playback apps, VoIP and calling apps, professional creation tools, large games, and so on. Different kinds of products require different audio technologies; TheAmazingAudioEngine's architecture, for instance, leans toward audio-creation apps. There is no universally applicable engine framework or design. As developers, we should learn the platform's audio technologies point by point and then apply them to our target scenarios.