[iOS] Encapsulating a Weibo-style play-while-downloading video player


Note: the framework has since iterated to version 2.0. I rebuilt the entire framework with a much better-designed API, and wrote a separate article about the version 2.0 implementation: "[iOS] How I refactored JPVideoPlayer". The ideas in this article are still valid, but the implementation details are out of date. See my GitHub for the current code.

Tips: this topic is split into two articles:
01. [iOS] Weibo-style play-while-downloading: encapsulating the player — how to encapsulate a video player that plays while it downloads and caches the finished file (this article).
02. [iOS] Weibo-style play-while-downloading: auto-play in a tableView — how to auto-play videos while a tableView scrolls, smoothly, without blocking threads or stuttering, and what strategy to use to decide which cell should play as the tableView scrolls.


A word about Weibo's video feature: it is powered by Miaopai ("seconds shoot"), a team dedicated to video processing. Weibo videos are usually long, so playing while downloading is a necessity. Speaking of video playback, one cannot avoid mentioning WeChat's short video: WeChat clips are limited to 15 seconds and, after the WeChat team's processing, a clip is kept under 2MB. So WeChat downloads the whole file first and then plays it from disk — download-then-play. The WeChat team has open-sourced that implementation.

I searched for a long time but found no complete write-up of how Weibo's home feed plays video while downloading. My own project needed exactly that, so I had to build it myself. The final effect:

(Figure: demo of the final effect)

This in-list play-while-downloading player covers the following main functional points:

  • 1. It must play while it downloads.
  • 2. If a video finishes downloading, cache it, and on the next load check the local cache for a complete copy first. This matters a lot for saving user traffic and improving the experience. To achieve it we have to manually intervene in how the system player loads its data internally; the details come later.
  • 3. It must not block any thread or stutter; scrolling must stay silky smooth. This is the most important point for the user experience.
  • 4. When the tableView scrolls, what policy should decide which cell plays video?

Maybe you're in a hurry and just want this working in your project as soon as possible: then go straight to GitHub and download the source. One note: points 1 and 2 above are fully handled inside the encapsulation, so you need not worry about them; points 3 and 4 you will have to adapt to your own project — I only provide a template with very extensive comments.

Next, let’s see how I implemented these functions.

First: basic usage of AVPlayer

Let's start with the most basic player encapsulation.

01, AVPlayer

AVPlayer video playback involves several classes:

  • AVURLAsset, a subclass of AVAsset, handles the network connection and requests the data.
  • AVPlayerItem builds a dynamic model of the media resource and tracks the state of the resource AVPlayer is playing — in plain words, the data steward.
  • AVPlayer, the player itself, which decodes the data into images and sound.
  • AVPlayerLayer, the image layer; AVPlayer's frames are rendered through it.

Note the AVPlayer model: you do not call play on your own initiative; you wait for the AVPlayerItem to tell you "I'm ready, you can play now". So we need to observe the AVPlayerItem's state, which we get by adding a KVO observer for its status:

// add an observer for the status key
[_currentPlayerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

Handle the playback logic in the observation callback: once you hear that the player is ready, call play.
Note: if you add the AVPlayerLayer to the cell before the player is ready, the layer responsible for displaying the image stays black until the data arrives and the first frame is decoded. In an auto-playing list this must absolutely be avoided: wait until the player has image output, and only then add the preview layer to the cell.

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSString *, id> *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItem *playerItem = (AVPlayerItem *)object;
        AVPlayerItemStatus status = playerItem.status;
        switch (status) {
            case AVPlayerItemStatusUnknown: {
                break;
            }
            case AVPlayerItemStatusReadyToPlay: {
                [self.player play];
                self.player.muted = self.mute;
                // now it is safe to show the image layer
                [self handleShowViewSublayers];
                break;
            }
            case AVPlayerItemStatusFailed: {
                break;
            }
            default:
                break;
        }
    }
}

At this point you can already play a network or local video. During playback the pipeline is: request data over the network → assemble the data → decode the data → output image and sound. All of these stages happen inside AVFoundation; the classes listed above handle them for us automatically.

(Figure: system processing.png — AVFoundation's default loading pipeline)

To play while downloading, and to cache, we must get hold of the player's data — that is, manually intervene in the data loading process. We need to insert our own module between the network layer and the decoding layer: the red block in the diagram below.

(Figure: manual intervention.png — our module inserted between the network and decoding layers)

02, AVAssetResourceLoaderDelegate

  • To insert our own module into the player's request pipeline we rely on AVAssetResourceLoaderDelegate. The AVURLAsset we use exposes an AVAssetResourceLoader property: @property (nonatomic, readonly) AVAssetResourceLoader *resourceLoader;
  • AVAssetResourceLoader is responsible for loading the data. The key point: as long as we conform to AVAssetResourceLoaderDelegate and become the loader's delegate, it will ask us for data through its delegate methods. That is our cut-in point for intervening in the data.

- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest;
- (void)resourceLoader:(AVAssetResourceLoader *)resourceLoader didCancelLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest;
  • AVAssetResourceLoader loads the resources AVURLAsset needs through the delegate object you provide. The important caveat: the delegate is only invoked when AVURLAsset does not know how to load the URL by itself. So we solve this with a workaround: replace the scheme of the target video URL with a custom scheme the system does not recognize.
    * Before we formally start intervening in the data, one important point: video is large, continuous media data, so when requesting it we want a streaming strategy — splitting the large continuous media into a large number of small segments.
  • - (NSURL *)getSchemeVideoURL:(NSURL *)url {
        // NSURLComponents is the mutable counterpart of NSURL, so the URL can be edited.
        // AVAssetResourceLoader only consults your delegate when AVURLAsset does not
        // know how to load the URL itself, so we replace the scheme of the target
        // video URL with one the system cannot recognize.
        NSURLComponents *components = [[NSURLComponents alloc] initWithURL:url resolvingAgainstBaseURL:NO];
        components.scheme = @"systemCannotRecognition";
        return [components URL];
    }

Second: manually intervening in the system player's data loading

01, downloading large files with NSURLSession

Before NSURLSession, everyone used NSURLConnection. As of Xcode 7, NSURLConnection is deprecated, and the widely used AFNetworking has completely abandoned it in favor of NSURLSession. Here is how to use NSURLSession:

// NSURLComponents is the mutable counterpart of NSURL; the scheme can be changed back
NSURLComponents *actualURLComponents = [[NSURLComponents alloc] initWithURL:url resolvingAgainstBaseURL:NO];
actualURLComponents.scheme = @"http";
// create the request
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[actualURLComponents URL] cachePolicy:NSURLRequestReloadIgnoringCacheData timeoutInterval:20.0];
// set the requested byte range
if (offset > 0 && self.videoLength > 0) {
    [request addValue:[NSString stringWithFormat:@"bytes=%ld-%ld", (unsigned long)offset, (unsigned long)self.videoLength - 1] forHTTPHeaderField:@"Range"];
}
// [self.session invalidateAndCancel]; // reset
// create the session and set the delegate
self.session = [NSURLSession sessionWithConfiguration:[NSURLSessionConfiguration defaultSessionConfiguration] delegate:self delegateQueue:[NSOperationQueue mainQueue]];
// create the data task
NSURLSessionDataTask *dataTask = [self.session dataTaskWithRequest:request];
// start downloading
[dataTask resume];

We receive the downloaded data in NSURLSession's delegate methods, and as it arrives we use an NSOutputStream to write it to a temporary folder on disk. When the request ends we decide whether the download succeeded: if it did, we move the file into the folder for completed downloads; if it failed, we delete the temporary data.

// 1. called when the server responds
- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask didReceiveResponse:(NSURLResponse *)response completionHandler:(void (^)(NSURLSessionResponseDisposition))completionHandler;
// 2. called (possibly many times) as data arrives from the server
- (void)URLSession:(NSURLSession *)session dataTask:(NSURLSessionDataTask *)dataTask didReceiveData:(NSData *)data;
// 3. called when the request ends, whether it succeeded or failed; on failure, error is non-nil
- (void)URLSession:(NSURLSession *)session task:(NSURLSessionTask *)task didCompleteWithError:(NSError *)error;

02, the AVAssetResourceLoader delegate

For better encapsulation and maintainability, create a new class to interface with the system player. As mentioned above, as long as this class conforms to the AVAssetResourceLoaderDelegate protocol, it is qualified to request data on the player's behalf. The system then passes each request through

- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest;

This delegate method hands us the loading request (loadingRequest). When a request arrives, first store it in an array. Why an array? Because the time between receiving a request and having the data to fulfill it is indeterminate, and more requests can arrive while a download is still in flight.

When we get the request, we need to call the NSURLSession downloader above to download the file.

- (void)dealLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest {
    NSURL *interceptedURL = [loadingRequest.request URL];
    NSRange range = NSMakeRange(loadingRequest.dataRequest.currentOffset, MAXFLOAT);
    if (self.manager) {
        if (self.manager.downLoadingOffset > 0) {
            [self processPendingRequests];
        }
        // if the start of the new range is more than 300KB past the current
        // cached position, or the request seeks backwards before the current
        // offset, restart the download at the requested location
        if (self.manager.offset + self.manager.downLoadingOffset + 1024 * 300 < range.location ||
            self.manager.offset > range.location) {
            [self.manager setUrl:interceptedURL offset:range.location];
        }
    } else {
        self.manager = [JPDownloadManager new];
        self.manager.delegate = self;
        [self.manager setUrl:interceptedURL offset:0];
    }
}

Whenever new data lands, check whether the downloaded length now satisfies a pending request. If it does, read the corresponding bytes from the temporary file on disk, fill the request with them, and remove the request from the pending array. The player receives the data and starts decoding.

// check whether the request can be satisfied from cache, and fill it with data
- (BOOL)respondWithDataForRequest:(AVAssetResourceLoadingDataRequest *)dataRequest {
    // start of the requested range
    long long startOffset = dataRequest.requestedOffset;
    // the current position within the request, if it has been partially served
    if (dataRequest.currentOffset != 0) {
        startOffset = dataRequest.currentOffset;
    }
    // the player dragged past the end of the cached data
    if (startOffset > (self.manager.offset + self.manager.downLoadingOffset)) {
        return NO;
    }
    // the player dragged before the start of the cached data
    if (startOffset < self.manager.offset) {
        return NO;
    }
    NSData *fileData = [NSData dataWithContentsOfFile:_videoPath options:NSDataReadingMappedIfSafe error:nil];
    NSInteger unreadBytes = self.manager.downLoadingOffset - ((NSInteger)startOffset - self.manager.offset);
    NSUInteger numberOfBytesToRespondWith = MIN((NSUInteger)dataRequest.requestedLength, (NSUInteger)unreadBytes);
    [dataRequest respondWithData:[fileData subdataWithRange:NSMakeRange((NSUInteger)startOffset - self.manager.offset, numberOfBytesToRespondWith)]];
    long long endOffset = startOffset + dataRequest.requestedLength;
    BOOL didRespondFully = (self.manager.offset + self.manager.downLoadingOffset) >= endOffset;
    return didRespondFully;
}

At this point, the manual intervention in video loading is complete, and the video plays normally.

(Figure: JPVideoPlayer.png — the complete data flow with our module inserted)

03, loading cached data

What remains is this: when the same video is played again, check whether a cached copy already exists on disk. With NSFileManager we can check whether the file exists at the expected path and decide whether the cache can be used.

NSFileManager *manager = [NSFileManager defaultManager];
NSString *savePath = [self fileSavePath];
savePath = [savePath stringByAppendingPathComponent:self.suggestFileName];
if ([manager fileExistsAtPath:savePath]) {
    // this file has already been downloaded
    return;
}

So far, the player package is complete.

The next article, "[iOS] Weibo-style play-while-downloading: auto-play in a tableView", covers how to play videos automatically while the tableView scrolls — smoothly, without blocking threads or stuttering — and what strategy to use to decide which cell should play as the tableView scrolls.

04, updates

  • 2016.10.09:
    When switching videos quickly, the video cell swallowed sliding events, so while the current cell was playing the tableView could not receive scroll events and appeared frozen. Thanks to @big wall 66370 for reporting the bug; see JPVideoPlayer on my GitHub for details.
  • 2016.11.04:
    Jianshu friend @Mr. dishes reported a singleton issue with duplicate observers: the player singleton added a playback-finished observer on every init call, so the notification handler fired repeatedly, which could cause stutter. The latest version fixes this; see JPVideoPlayer on GitHub.
  • 2016.11.08:
    Thanks to Jianshu author @laomeng (http://www.jianshu.com/users/9f6960a40be6/timeline), who ran real-device tests for me on an iPhone 5s (iOS 9.3.5), iPhone 6 Plus (iOS 10.0.2), iPhone 6s (mainland China model, iOS 9.3.2), iPhone 6s Plus (iOS 10.0.0), and iPhone 7 Plus (Hong Kong model, iOS 10.1.1). With only an iPhone 6s and 6s Plus at hand, I had missed the stutter @laomeng found on older devices: playing a locally cached video could freeze the UI for 2-3 seconds because the main thread was blocked. The fixed version shows no stutter on any of the devices above.
  • 2016.11.10:
    After closing the player, the video kept playing in the background. The bug has been fixed and committed; see JPVideoPlayer for details. Thanks to the Jianshu friend @us_ who reported it.
  • 2016.11.18:
    1. Fixed a bug where some videos, especially small files, could fail to play.
    2. Added a cache management tool: call the -getSize: method to get the cache size asynchronously, and -clearVideoCacheForUrl: or -clearAllVideoCache to clear the cache.
  • 2017.05.02:
    Some users reported that certain videos would not play while downloading; for the solution, see this blog post.


My article collection

The link below is the index of all my articles. Every implementation discussed in them has a GitHub address with the full source code. If one of these articles happens to help you in real development, or shows you a different way to do something, and you find it useful, give it a like — after all, "of those who like a post, 99% are handsome guys and pretty girls, and none of them are single".

My collection index

You can also follow my Jianshu column on iOS development; the articles there are all solid, practical material.
If you have questions, leave a comment at the end of the article, message me on Weibo @Panpan_HKbuy, or visit my GitHub.