AVPlayer

AVPlayerLayer is a subclass of CALayer. AVPlayer can only render its output into an AVPlayerLayer, so we need to instantiate a UIView and change the type of the layer it creates by default (a plain CALayer, which cannot host AVPlayer output).
        
    1. Change the CALayer type that UIView creates by default
    +(Class)layerClass
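A minimal sketch of that override, in a hypothetical UIView subclass used to host the player (the sample code later in this post instead adds an AVPlayerLayer as a sublayer and resizes it in layoutSubviews — either approach works):

```objc
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerHostView : UIView
@end

@implementation PlayerHostView
// Make the view's backing layer an AVPlayerLayer so it can display AVPlayer output.
+ (Class)layerClass {
    return [AVPlayerLayer class];
}
@end
```

With this override, `(AVPlayerLayer *)view.layer` tracks the view's bounds automatically, so no manual frame management is needed.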

2. The media object
    AVPlayerItem
    
    3. Use KVO to observe changes to AVPlayerItem's status property; only start playback once status becomes AVPlayerItemStatusReadyToPlay

4. Observe playback progress with the following method
    - (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block;

5. The struct dedicated to representing media time
    CMTime
    
    ps: CMTimeMake(a, b) creates a CMTime struct; a is the current frame index and b is how many frames play per second, so the represented time is a/b seconds
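A couple of hedged examples of what those two fields mean (CMTimeGetSeconds is the CoreMedia helper that performs value/timescale):

```objc
CMTime t = CMTimeMake(90, 30);                    // 90 units at a timescale of 30 per second
Float64 seconds = CMTimeGetSeconds(t);            // 90 / 30 = 3.0 seconds
CMTime precise = CMTimeMakeWithSeconds(2.5, 600); // 2.5 s at the common 600 timescale
```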

1. Then observe the playerItem's status and loadedTimeRange properties; status has three states ==> that sentence comes from the original tutorial, but the values it then lists belong to AVPlayer's status (presumably a slip by that author). In fact both AVPlayerItem and __AVPlayer__ have a status property, and both can be observed via KVO.
The enum in the documentation:

typedef NS_ENUM(NSInteger, AVPlayerItemStatus) {
    AVPlayerItemStatusUnknown,
    AVPlayerItemStatusReadyToPlay,
    AVPlayerItemStatusFailed
};

And AVPlayer's status enum:

typedef NS_ENUM(NSInteger, AVPlayerStatus) {
    AVPlayerStatusUnknown,
    AVPlayerStatusReadyToPlay,
    AVPlayerStatusFailed
};

Look carefully at the difference between these two! The point I want to make here:
AVPlayerItemStatus describes the state of the item currently being played (think of it as the URL or video file — loading it can succeed or fail).
AVPlayerStatus describes the state of the player itself.

While coding I once hit a case where AVPlayer's status was
AVPlayerStatusReadyToPlay yet the video simply would not play; after switching the KVO observation to the AVPlayerItem, it returned AVPlayerItemStatusFailed.

When coding, prefer the item's status — it is the more accurate signal.
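A minimal sketch of observing the item's status rather than the player's (the full version appears in VideoView.m below); `observeItem:` is a hypothetical setup helper introduced just for this fragment:

```objc
// In setup, observe the item (not the player):
- (void)observeItem:(AVPlayerItem *)playerItem {
    [playerItem addObserver:self forKeyPath:@"status"
                    options:NSKeyValueObservingOptionNew context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        if (self.item.status == AVPlayerItemStatusReadyToPlay) {
            [self.player play];
        } else if (self.item.status == AVPlayerItemStatusFailed) {
            // The item's error explains the failure, even when the
            // AVPlayer itself still reports AVPlayerStatusReadyToPlay.
            NSLog(@"item failed: %@", self.item.error);
        }
    }
}
```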
2. addPeriodicTimeObserverForInterval
Adding a time observer to the AVPlayer helps us track playback progress.
But be sure to remove it when you are done. The app will not crash if you forget, but the observer is never released and keeps consuming memory (tracking this down cost me a whole morning the first time; honestly I had no idea where the problem was because I barely knew AVPlayer at that point).
Apple's note in the documentation:

@result
An object conforming to the NSObject
protocol. You must retain this returned value as long as you want the
time observer to be invoked by the player.
Pass this object to -removeTimeObserver: to cancel time observation.
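Following that note, the add/remove pairing can be sketched like this (assuming a `_timeObserver` ivar to hold the returned token):

```objc
// Keep the returned opaque token; you must pass the same object back later.
_timeObserver = [_player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                                      queue:dispatch_get_main_queue()
                                                 usingBlock:^(CMTime time) {
    NSLog(@"played %.1f s", CMTimeGetSeconds(time));
}];

// Later, e.g. in dealloc: without this call the observer (and everything
// its block captures) stays alive and keeps consuming memory.
[_player removeTimeObserver:_timeObserver];
_timeObserver = nil;
```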

3. The CMTime struct
The linked tutorial passes CMTimeMake(1, 1), which simply invokes the block once per second.
More generally, CMTimeMake(a, b) invokes the block every a/b seconds.
A helpful article on this struct:
https://zwo28.wordpress.com/2015/03/06/%E8%A7%86%E9%A2%91%E5%90%88%E6%88%90%E4%B8%ADcmtime%E7%9A%84%E7%90%86%E8%A7%A3%EF%BC%8C%E4%BB%A5%E5%8F%8A%E5%88%A9%E7%94%A8cmtime%E5%AE%9E%E7%8E%B0%E8%BF%87%E6%B8%A1%E6%95%88%E6%9E%9C/

4. To jump to a position while dragging a slider, use AVPlayer's seekToTime: method.
The simplest example: suppose a video is 20 s long and you want to jump to the 10 s mark (_player is the AVPlayer object):
[_player seekToTime:CMTimeMake(10, 1)]; — pass 1 as the second argument and the target second as the first. That is what my experiments showed; work through the value/timescale meaning yourself.
5. How do you return to the beginning after playing to the end?
[_player seekToTime:kCMTimeZero];
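Putting points 4 and 5 together — a hedged sketch of a slider-driven seek plus a play-to-end handler, assuming `_player`, `_item`, and a slider whose value runs from 0 to 1:

```objc
- (void)sliderChanged:(UISlider *)slider {
    Float64 duration = CMTimeGetSeconds(_item.asset.duration);
    // CMTimeMakeWithSeconds keeps sub-second precision, unlike CMTimeMake(seconds, 1).
    [_player seekToTime:CMTimeMakeWithSeconds(slider.value * duration, 600)];
}

// Registered for AVPlayerItemDidPlayToEndTimeNotification:
- (void)movieToEnd:(NSNotification *)notic {
    [_player seekToTime:kCMTimeZero]; // rewind to the start
    [_player play];                   // and replay if looping is desired
}
```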

Below is my code — it is far from perfect, so corrections are welcome, thanks.

AVViewController.h

#import <UIKit/UIKit.h>
@interface AVViewController : UIViewController
@end

AVViewController.m

#import "AVViewController.h"
#import "VideoView.h"

@interface AVViewController () <VideoSomeDelegate>
@property (nonatomic, strong) VideoView *videoView;
@property (nonatomic, strong) NSMutableArray<NSLayoutConstraint *> *array;
@property (nonatomic, strong) UISlider *videoSlider;
@property (nonatomic, strong) NSMutableArray<NSLayoutConstraint *> *sliderArray;
@end

@implementation AVViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.view setBackgroundColor:[UIColor whiteColor]];
    [self initVideoView];
}

- (void)initVideoView {
    //NSString *path = [[NSBundle mainBundle] pathForResource:@"some" ofType:@"mp4"]; // for local playback; also change the URL code in VideoView.m when playing a local file
    NSString *path = @"http://static.tripbe.com/videofiles/20121214/9533522808.f4v.mp4";
    _videoView = [[VideoView alloc] initWithUrl:path delegate:self];
    _videoView.someDelegate = self;
    [_videoView setTranslatesAutoresizingMaskIntoConstraints:NO];
    [self.view addSubview:_videoView];
    [self initVideoSlider];
    if (self.traitCollection.verticalSizeClass == UIUserInterfaceSizeClassCompact) {
        [self installLandspace];
    } else {
        [self installVertical];
    }
}
- (void)installVertical {
    if (_array != nil) {
        [self.view removeConstraints:_array];
        [_array removeAllObjects];
        [self.view removeConstraints:_sliderArray];
        [_sliderArray removeAllObjects];
    } else {
        _array = [NSMutableArray array];
        _sliderArray = [NSMutableArray array];
    }
    id topGuide = self.topLayoutGuide;
    NSDictionary *dic = @{@"top":@100, @"height":@180, @"edge":@20, @"space":@80};
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|-(edge)-[_videoSlider]-(edge)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[topGuide]-(top)-[_videoView(==height)]-(space)-[_videoSlider]" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoView, topGuide, _videoSlider)]];
    [self.view addConstraints:_array];
}

- (void)installLandspace {
    if (_array != nil) {
        [self.view removeConstraints:_array];
        [_array removeAllObjects];
        [self.view removeConstraints:_sliderArray];
        [_sliderArray removeAllObjects];
    } else {
        _array = [NSMutableArray array];
        _sliderArray = [NSMutableArray array];
    }
    id topGuide = self.topLayoutGuide;
    NSDictionary *dic = @{@"edge":@20, @"space":@30};
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[topGuide][_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView, topGuide)]];
    [self.view addConstraints:_array];
    [_sliderArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|-(edge)-[_videoSlider]-(edge)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [_sliderArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:[_videoSlider]-(space)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [self.view addConstraints:_sliderArray];
}
- (void)initVideoSlider {
    _videoSlider = [[UISlider alloc] init];
    [_videoSlider setTranslatesAutoresizingMaskIntoConstraints:NO];
    [_videoSlider setThumbImage:[UIImage imageNamed:@"sliderButton"] forState:UIControlStateNormal];
    [self.view addSubview:_videoSlider];
}

- (void)willTransitionToTraitCollection:(UITraitCollection *)newCollection withTransitionCoordinator:(id <UIViewControllerTransitionCoordinator>)coordinator {
    [super willTransitionToTraitCollection:newCollection withTransitionCoordinator:coordinator];
    [coordinator animateAlongsideTransition:^(id <UIViewControllerTransitionCoordinatorContext> context) {
        if (newCollection.verticalSizeClass == UIUserInterfaceSizeClassCompact) {
            [self installLandspace];
        } else {
            [self installVertical];
        }
        [self.view setNeedsLayout];
    } completion:nil];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
}

#pragma mark -
- (void)flushCurrentTime:(NSString *)timeString sliderValue:(float)sliderValue {
    _videoSlider.value = sliderValue;
}
@end

VideoView.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@protocol VideoSomeDelegate <NSObject>
@required
- (void)flushCurrentTime:(NSString *)timeString sliderValue:(float)sliderValue;
//- (void)flushVideoLength:(float)videoLength;
@end

@interface VideoView : UIView
@property (nonatomic, strong) NSString *playerUrl;
@property (nonatomic, readonly) AVPlayerItem *item;
@property (nonatomic, readonly) AVPlayerLayer *playerLayer;
@property (nonatomic, readonly) AVPlayer *player;
@property (nonatomic, weak) id <VideoSomeDelegate> someDelegate;
- (id)initWithUrl:(NSString *)path delegate:(id<VideoSomeDelegate>)delegate;
@end

@interface VideoView (Guester)
- (void)addSwipeView;
@end

VideoView.m

#import "VideoView.h"
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MPVolumeView.h>
typedef enum {
    ChangeNone,
    ChangeVoice,
    ChangeLigth,
    ChangeCMTime
} Change;

@interface VideoView ()
@property (nonatomic, readwrite) AVPlayerItem *item;
@property (nonatomic, readwrite) AVPlayerLayer *playerLayer;
@property (nonatomic, readwrite) AVPlayer *player;
@property (nonatomic, strong) id timeObser;
@property (nonatomic, assign) float videoLength;
@property (nonatomic, assign) Change changeKind;
@property (nonatomic, assign) CGPoint lastPoint;
//Gesture
@property (nonatomic, strong) UIPanGestureRecognizer *panGesture;
@property (nonatomic, strong) MPVolumeView *volumeView;
@property (nonatomic, weak) UISlider *volumeSlider;
@property (nonatomic, strong) UIView *darkView;
@end

@implementation VideoView

- (id)initWithUrl:(NSString *)path delegate:(id<VideoSomeDelegate>)delegate {
    if (self = [super init]) {
        _playerUrl = path;
        _someDelegate = delegate;
        [self setBackgroundColor:[UIColor blackColor]];
        [self setUpPlayer];
        [self addSwipeView];
    }
    return self;
}

- (void)setUpPlayer {
    //Local file:
    //NSURL *url = [NSURL fileURLWithPath:_playerUrl];
    NSURL *url = [NSURL URLWithString:_playerUrl];
    NSLog(@"%@", url);
    _item = [[AVPlayerItem alloc] initWithURL:url];
    _player = [AVPlayer playerWithPlayerItem:_item];
    _playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
    _playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.layer addSublayer:_playerLayer];
    [self addVideoKVO];
    [self addVideoTimerObserver];
    [self addVideoNotic];
}

#pragma mark - KVO
- (void)addVideoKVO {
    [_item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    [_item addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
    [_item addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
}

- (void)removeVideoKVO {
    [_item removeObserver:self forKeyPath:@"status"];
    [_item removeObserver:self forKeyPath:@"loadedTimeRanges"];
    [_item removeObserver:self forKeyPath:@"playbackBufferEmpty"];
}

- (void)observeValueForKeyPath:(nullable NSString *)keyPath ofObject:(nullable id)object change:(nullable NSDictionary<NSString *, id> *)change context:(nullable void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItemStatus status = _item.status;
        switch (status) {
            case AVPlayerItemStatusReadyToPlay: {
                NSLog(@"AVPlayerItemStatusReadyToPlay");
                [_player play];
                _videoLength = floor(_item.asset.duration.value * 1.0 / _item.asset.duration.timescale);
            }
                break;
            case AVPlayerItemStatusUnknown: {
                NSLog(@"AVPlayerItemStatusUnknown");
            }
                break;
            case AVPlayerItemStatusFailed: {
                NSLog(@"AVPlayerItemStatusFailed");
                NSLog(@"%@", _item.error);
            }
                break;
            default:
                break;
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
    } else if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
    }
}
#pragma mark - Notic
- (void)addVideoNotic {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieToEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieJumped:) name:AVPlayerItemTimeJumpedNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieStalle:) name:AVPlayerItemPlaybackStalledNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(backGroundPauseMoive) name:UIApplicationDidEnterBackgroundNotification object:nil];
}

- (void)removeVideoNotic {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemPlaybackStalledNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemTimeJumpedNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)movieToEnd:(NSNotification *)notic {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}

- (void)movieJumped:(NSNotification *)notic {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}

- (void)movieStalle:(NSNotification *)notic {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}

- (void)backGroundPauseMoive {
    NSLog(@"%@", NSStringFromSelector(_cmd));
}

#pragma mark - TimerObserver
- (void)addVideoTimerObserver {
    __weak typeof(self) self_ = self;
    _timeObser = [_player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:NULL usingBlock:^(CMTime time) {
        float currentTimeValue = time.value * 1.0 / time.timescale / self_.videoLength;
        NSString *currentString = [self_ getStringFromCMTime:time];
        if ([self_.someDelegate respondsToSelector:@selector(flushCurrentTime:sliderValue:)]) {
            [self_.someDelegate flushCurrentTime:currentString sliderValue:currentTimeValue];
        } else {
            NSLog(@"no response");
        }
        NSLog(@"%@", self_.someDelegate);
    }];
}

- (void)removeVideoTimerObserver {
    NSLog(@"%@", NSStringFromSelector(_cmd));
    [_player removeTimeObserver:_timeObser];
}

#pragma mark - Utils
- (NSString *)getStringFromCMTime:(CMTime)time {
    float currentTimeValue = (CGFloat)time.value / time.timescale; // current playback time in seconds
    NSDate *currentDate = [NSDate dateWithTimeIntervalSince1970:currentTimeValue];
    NSCalendar *calendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierGregorian];
    NSInteger unitFlags = NSCalendarUnitHour | NSCalendarUnitMinute | NSCalendarUnitSecond;
    NSDateComponents *components = [calendar components:unitFlags fromDate:currentDate];
    if (currentTimeValue >= 3600) {
        return [NSString stringWithFormat:@"%ld:%ld:%ld", components.hour, components.minute, components.second];
    } else {
        return [NSString stringWithFormat:@"%ld:%ld", components.minute, components.second];
    }
}

- (NSString *)getVideoLengthFromTimeLength:(float)timeLength {
    NSDate *date = [NSDate dateWithTimeIntervalSince1970:timeLength];
    NSCalendar *calendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierGregorian];
    NSInteger unitFlags = NSCalendarUnitHour | NSCalendarUnitMinute | NSCalendarUnitSecond;
    NSDateComponents *components = [calendar components:unitFlags fromDate:date];
    if (timeLength >= 3600) {
        return [NSString stringWithFormat:@"%ld:%ld:%ld", components.hour, components.minute, components.second];
    } else {
        return [NSString stringWithFormat:@"%ld:%ld", components.minute, components.second];
    }
}

- (void)layoutSubviews {
    [super layoutSubviews];
    _playerLayer.frame = self.bounds;
}

#pragma mark - release
- (void)dealloc {
    NSLog(@"%@", NSStringFromSelector(_cmd));
    [self removeVideoTimerObserver];
    [self removeVideoNotic];
    [self removeVideoKVO];
}

@end

#pragma mark - VideoView (Guester)
@implementation VideoView (Guester)

- (void)addSwipeView {
_panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(swipeAction:)];
[self addGestureRecognizer:_panGesture];
[self setUpDarkView];
}
- (void)setUpDarkView {
    _darkView = [[UIView alloc] init];
    [_darkView setTranslatesAutoresizingMaskIntoConstraints:NO];
    [_darkView setBackgroundColor:[UIColor blackColor]];
    _darkView.alpha = 0.0;
    [self addSubview:_darkView];
    NSMutableArray *darkArray = [NSMutableArray array];
    [darkArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_darkView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_darkView)]];
    [darkArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[_darkView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_darkView)]];
    [self addConstraints:darkArray];
}

- (void)swipeAction:(UIPanGestureRecognizer *)gesture {
    switch (gesture.state) {
        case UIGestureRecognizerStateBegan: {
            _changeKind = ChangeNone;
            _lastPoint = [gesture locationInView:self];
        }
            break;
        case UIGestureRecognizerStateChanged: {
            [self getChangeKindValue:[gesture locationInView:self]];
        }
            break;
        case UIGestureRecognizerStateEnded: {
            if (_changeKind == ChangeCMTime) {
                [self changeEndForCMTime:[gesture locationInView:self]];
            }
            _changeKind = ChangeNone;
            _lastPoint = CGPointZero;
        }
            break;
        default:
            break;
    }
}
- (void)getChangeKindValue:(CGPoint)pointNow {
    switch (_changeKind) {
        case ChangeNone: {
            [self changeForNone:pointNow];
        }
            break;
        case ChangeCMTime: {
            [self changeForCMTime:pointNow];
        }
            break;
        case ChangeLigth: {
            [self changeForLigth:pointNow];
        }
            break;
        case ChangeVoice: {
            [self changeForVoice:pointNow];
        }
            break;
        default:
            break;
    }
}
- (void)changeForNone:(CGPoint)pointNow {
    if (fabs(pointNow.x - _lastPoint.x) > fabs(pointNow.y - _lastPoint.y)) {
        _changeKind = ChangeCMTime;
    } else {
        float halfWight = self.bounds.size.width / 2;
        if (_lastPoint.x < halfWight) {
            _changeKind = ChangeLigth;
        } else {
            _changeKind = ChangeVoice;
        }
        _lastPoint = pointNow;
    }
}

- (void)changeForCMTime:(CGPoint)pointNow {
    float number = fabs(pointNow.x - _lastPoint.x);
    if (pointNow.x > _lastPoint.x && number > 10) {
        float currentTime = _player.currentTime.value / _player.currentTime.timescale;
        float tobeTime = currentTime + number * 0.5;
        NSLog(@"forward to time:%f", tobeTime);
    } else if (pointNow.x < _lastPoint.x && number > 10) {
        float currentTime = _player.currentTime.value / _player.currentTime.timescale;
        float tobeTime = currentTime - number * 0.5;
        NSLog(@"back to time:%f", tobeTime);
    }
}

- (void)changeForLigth:(CGPoint)pointNow {
    float number = fabs(pointNow.y - _lastPoint.y);
    if (pointNow.y > _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self minLigth];
    } else if (pointNow.y < _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self upperLigth];
    }
}

- (void)changeForVoice:(CGPoint)pointNow {
    float number = fabs(pointNow.y - _lastPoint.y);
    if (pointNow.y > _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self minVolume];
    } else if (pointNow.y < _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self upperVolume];
    }
}

- (void)changeEndForCMTime:(CGPoint)pointNow {
    if (pointNow.x > _lastPoint.x) {
        NSLog(@"end for CMTime Upper");
        float length = fabs(pointNow.x - _lastPoint.x);
        [self upperCMTime:length];
    } else {
        NSLog(@"end for CMTime min");
        float length = fabs(pointNow.x - _lastPoint.x);
        [self mineCMTime:length];
    }
}
- (void)upperLigth {
    if (_darkView.alpha >= 0.1) {
        _darkView.alpha = _darkView.alpha - 0.1;
    }
}

- (void)minLigth {
    if (_darkView.alpha <= 1.0) {
        _darkView.alpha = _darkView.alpha + 0.1;
    }
}

- (void)upperVolume {
    if (self.volumeSlider.value <= 1.0) {
        self.volumeSlider.value = self.volumeSlider.value + 0.1;
    }
}

- (void)minVolume {
    if (self.volumeSlider.value >= 0.0) {
        self.volumeSlider.value = self.volumeSlider.value - 0.1;
    }
}
#pragma mark - CMTime
- (void)upperCMTime:(float)length {
    float currentTime = _player.currentTime.value / _player.currentTime.timescale;
    float tobeTime = currentTime + length * 0.5;
    if (tobeTime > _videoLength) {
        [_player seekToTime:_item.asset.duration];
    } else {
        [_player seekToTime:CMTimeMake(tobeTime, 1)];
    }
}

- (void)mineCMTime:(float)length {
    float currentTime = _player.currentTime.value / _player.currentTime.timescale;
    float tobeTime = currentTime - length * 0.5;
    if (tobeTime <= 0) {
        [_player seekToTime:kCMTimeZero];
    } else {
        [_player seekToTime:CMTimeMake(tobeTime, 1)];
    }
}

- (MPVolumeView *)volumeView {
    if (_volumeView == nil) {
        _volumeView = [[MPVolumeView alloc] init];
        _volumeView.hidden = YES;
        [self addSubview:_volumeView];
    }
    return _volumeView;
}

- (UISlider *)volumeSlider {
    if (_volumeSlider == nil) {
        NSLog(@"%@", [self.volumeView subviews]);
        for (UIView *subView in [self.volumeView subviews]) {
            if ([subView.class.description isEqualToString:@"MPVolumeSlider"]) {
                _volumeSlider = (UISlider *)subView;
                break;
            }
        }
    }
    return _volumeSlider;
}

@end

If this post helped you, please follow me!
