iOS video playback process and principles

The author is an iOS development engineer. Many application scenarios today call for video playback, and iOS apps are no exception; that is the background for writing this article.

Recently a classmate of mine, also an iOS engineer, told me about a candidate he had interviewed. The candidate's resume claimed experience building video playback, but when asked about the basic principles behind it, he drew a complete blank and could only say that he had called a few iOS SDK APIs to get video playing. My classmate said he had only wanted to probe the candidate's computer-science fundamentals; if the candidate had simply said the word "decoding", that alone would have satisfied him, yet the candidate said nothing of the sort. It reminded me of a college roommate who ran into the same thing at a campus-recruitment interview a few years ago. He had also built a video feature, and the interviewer asked him two questions. First: if you are given a very unusual video format to play, what do you do? He was stumped. Second: if you are given a large 1 GB video to play, what do you do? Again he was stumped, and naturally the interview ended in defeat.

In fact, for an iOS app alone, building a video playback feature only requires calling a few common APIs: MPMoviePlayerController before iOS 9, and, from iOS 9 on, the recommended AVPlayer from the AVFoundation framework, which supports streaming. That, however, is what people mean by an "API caller". To go further as a professional you need a solid foundation and to understand not only how things work but why; this is also the biggest difference between a well-grounded engineer and an ordinary programmer, and it is what prompted this article. The content here is partly my own accumulated notes and partly drawn from working with a video solution provider called Poly Weishi, who gave us a lot of help. My level is limited, so please feel free to point out any shortcomings.
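To make the "API caller" part concrete, here is a minimal sketch of playing a remote video with AVPlayer inside an AVPlayerViewController from AVKit. The URL is a placeholder, not a real stream, and the class name is illustrative.

import UIKit
import AVFoundation
import AVKit

// Minimal sketch: play a remote video with AVPlayer.
// The URL below is a placeholder; replace it with a real media URL.
final class PlayerViewController: AVPlayerViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = URL(string: "https://example.com/sample.m3u8") else { return }
        // AVPlayer hides the whole pipeline described below:
        // protocol resolution, demuxing, decoding, and A/V synchronization.
        player = AVPlayer(url: url)
        player?.play()
    }
}

A few lines like these are enough to ship a playback feature, which is exactly why the underlying principles are so easy to skip over.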

Video and audio technology mainly covers the following areas: container (encapsulation) formats, video compression coding, and audio compression coding. If network transmission is involved, streaming media protocols come into play as well.

For a video player to play a video file from the Internet, it must go through the following steps: protocol resolution, de-encapsulation (demuxing), audio/video decoding, and audio/video synchronization. When playing a local file, the protocol step is not needed, leaving de-encapsulation, audio/video decoding, and audio/video synchronization. The process is shown in the figure below.

[Figure: video playback flow chart]

The function of protocol resolution is to parse streaming-protocol data into the corresponding standard container-format data. When video and audio are transmitted over the network, various streaming protocols are used, such as HTTP, RTMP, or MMS. While carrying audio and video data, these protocols also carry signaling data, which includes playback control (play, pause, stop) and descriptions of network status, among other things. Protocol resolution strips out the signaling data and keeps only the audio and video data. For example, data transmitted over RTMP, after protocol resolution, is output as FLV-format data.
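In an iOS app, AVFoundation handles this protocol layer itself for HTTP(S) and HLS sources; RTMP and MMS are not supported by AVPlayer and usually require an FFmpeg-based player that implements those protocols on its own. A rough sketch, with a placeholder URL, of waiting until the protocol/playlist has been resolved before starting playback:

import AVFoundation

// Sketch: for HTTP(S)/HLS sources, AVFoundation performs the protocol
// layer internally. The URL below is a placeholder.
let url = URL(string: "https://example.com/live/stream.m3u8")!
let item = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: item)

// Observe the item's status to know when the protocol/playlist has been
// resolved and media data is ready; keep the observation alive while playing.
let statusObservation = item.observe(\.status, options: [.new]) { item, _ in
    switch item.status {
    case .readyToPlay:
        player.play()
    case .failed:
        print("Failed to load stream:", item.error ?? "unknown error")
    default:
        break
    }
}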

The function of de-encapsulation (demuxing) is to separate the container-format data into compressed audio-stream data and compressed video-stream data. There are many container formats, such as MP4, MKV, RMVB, TS, FLV, and AVI; their job is to pack compressed video data and compressed audio data together according to a defined layout. For example, FLV-format data, after demuxing, is output as an H.264-encoded video stream and an AAC-encoded audio stream.
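On iOS you can see the result of demuxing by asking AVFoundation for an asset's tracks: the container yields separate video and audio tracks, each with its own compressed format. A sketch, assuming the async loading APIs (iOS 15+) and an illustrative function name:

import AVFoundation

// Sketch: inspecting how a container file holds separate compressed
// video and audio streams.
func inspectTracks(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    let audioTracks = try await asset.loadTracks(withMediaType: .audio)

    for track in videoTracks + audioTracks {
        // Each track carries its own compression format description,
        // e.g. 'avc1' for H.264 video or 'aac ' for AAC audio.
        let descriptions = try await track.load(.formatDescriptions)
        for description in descriptions {
            let fourCC = CMFormatDescriptionGetMediaSubType(description)
            print(track.mediaType.rawValue, "codec fourCC:", fourCC)
        }
    }
}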

The function of decoding is to turn compressed video/audio data into uncompressed raw video/audio data. Audio compression coding standards include AAC, MP3, AC-3, and so on; video compression coding standards include H.264, MPEG-2, VC-1, and so on. Decoding is the most important and most complicated link in the whole chain. Through decoding, compressed video data is output as uncompressed color data such as YUV420P or RGB, and compressed audio data is output as uncompressed audio samples such as PCM data.
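Here is a sketch of driving the decoder directly with AVAssetReader, asking it for uncompressed YUV 4:2:0 pixel buffers from a video track. The function name is illustrative and the file URL is supplied by the caller; this is a minimal sketch rather than a production decoder loop.

import AVFoundation

// Sketch: decoding a local file's video track into uncompressed
// YUV 4:2:0 frames with AVAssetReader.
func decodeVideoFrames(at url: URL) async throws {
    let asset = AVURLAsset(url: url)
    guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [
            // Ask the decoder for bi-planar YUV 4:2:0 (NV12) pixel buffers.
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ]
    )
    reader.add(output)
    guard reader.startReading() else {
        throw reader.error ?? CocoaError(.fileReadUnknown)
    }

    // Each returned CMSampleBuffer wraps a raw, decoded CVPixelBuffer.
    while let sample = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
            print("decoded frame:",
                  CVPixelBufferGetWidth(pixelBuffer), "x",
                  CVPixelBufferGetHeight(pixelBuffer))
        }
    }
}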

The function of audio/video synchronization is to use the timing information obtained from the de-encapsulation module to synchronize the decoded video and audio data, and then hand that data to the system's graphics and audio output for playback.
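AVPlayer performs this synchronization internally; in a custom pipeline the usual idea is to compare each decoded frame's presentation timestamp (PTS) with a master clock, most often the audio clock. A simplified sketch, where masterClockSeconds and render are illustrative placeholders rather than real APIs:

import AVFoundation
import CoreMedia

// Sketch: a video frame is displayed only when the master clock
// (commonly the audio clock) reaches the frame's presentation timestamp.
func schedule(videoSample: CMSampleBuffer,
              masterClockSeconds: Double,
              render: @escaping (CMSampleBuffer) -> Void) {
    let pts = CMSampleBufferGetPresentationTimeStamp(videoSample)
    let frameTime = CMTimeGetSeconds(pts)

    if frameTime <= masterClockSeconds {
        // The frame is due (or late): display it immediately.
        render(videoSample)
    } else {
        // The frame is early: wait until the master clock catches up.
        let delay = frameTime - masterClockSeconds
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            render(videoSample)
        }
    }
}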