iOS Video Compression

一. The dimensions along which a video can be compressed

  1. Width and height
  2. Bit rate
  3. Frame rate
  4. Codec
  5. GOP

二. If the video has an audio track, it can be compressed too

  1. Bit rate
  2. Sample rate. Compared with the video itself, however, shrinking the audio track saves relatively little of the total size.

三. Parameter details

  1. Width/height: the pixel dimensions of the video or image
  2. Bit rate: the amount of data transferred per unit of time (per second)
  3. Frame rate: the number of frames shown per unit of time (per second)
  4. Codec: usually H.264 or H.265 (x264, x265)
  5. GOP: the distance between two keyframes, i.e. how many non-keyframes sit between them

For short video or live streaming, H.264 is usually enough: bandwidth today is plentiful, so the somewhat larger data volume is acceptable. H.265 compresses better than H.264, but transcoding it costs noticeably more power, so weigh the trade-off.

The "h" vs. "x" distinction roughly maps to hardware vs. software encoding. Hardware encoding/decoding uses dedicated silicon, usually the GPU; vendors implement the H.264/H.265 specs for their own GPU or CPU, so hardware codecs tend to be fast and power-efficient. Software encoders/decoders such as x264/x265 implement the same specs through higher-level system APIs, so they are generally less efficient, but they are portable and independent of the hardware.

Game-streaming platforms such as Douyu and Quanmin TV currently use H.265, because viewers and broadcasters are mostly on PCs with enough horsepower, and it also supports switching resolution (width/height) dynamically.

Mobile entertainment ("show") live streaming usually uses H.264, with a video size around 1000x2500, a frame rate between 16 and 24 fps, and a bit rate around 2008*1024 b/s; the bit rate can be made dynamic to adapt to device performance and network fluctuations. Bit rate affects quite a few things, such as how hard the picture, and in particular the non-keyframes, are compressed.

GOP: if the frame rate is 16 fps, the GOP is typically set to 32, so two keyframes cover 2 s of video. The flip side is that both keyframes must be fully loaded before playback can start, so in a live stream the minimum delay between broadcaster and viewer is 2 s. Tune this to the business requirements.

四. Code implementation

Method declaration
+ (void)compressWithVideoURL:(NSURL *)videoURL          // source video URL
                   outputURL:(NSURL *)outputURL         // output URL for the compressed file
                         fps:(NSInteger)fps             // frame rate
                     bitRate:(NSInteger)bitRate         // bit rate
              dimensionScale:(CGFloat)dimensionScale    // scale factor applied to width/height
                  completion:(void(^)(BOOL success))completion;
// Create the video asset
AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];
NSError *readerError;
// Asset reader
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:videoAsset error:&readerError];
NSError *writerError;
// Asset writer
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeMPEG4 error:&writerError];
// Fetch the video track
AVAssetTrack *videoTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
AVAssetWriterInput *videoInput;
AVAssetReaderOutput *videoOutput;
if (videoTrack) {
    NSDictionary *videoOutputSetting = @{
        (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_422YpCbCr8]
    };

    videoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOutputSetting];

    NSDictionary *videoCompressProperties = @{
        AVVideoAverageBitRateKey : @(bitRate),        // bit rate
        AVVideoExpectedSourceFrameRateKey : @(fps),   // frame rate
        AVVideoProfileLevelKey : AVVideoProfileLevelH264HighAutoLevel
    };

    CGSize videoSize = [self sizeWithVideoURL:videoURL];
    CGFloat videoWidth = videoSize.width;
    CGFloat videoHeight = videoSize.height;
    NSDictionary *videoCompressSettings = @{
        AVVideoCodecKey : AVVideoCodecTypeH264,
        AVVideoWidthKey : @(videoWidth * dimensionScale),
        AVVideoHeightKey : @(videoHeight * dimensionScale),
        AVVideoCompressionPropertiesKey : videoCompressProperties,
        AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill
    };

    videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressSettings];
    // Preserve the source orientation
    videoInput.transform = videoTrack.preferredTransform;
    if ([reader canAddOutput:videoOutput]) {
        [reader addOutput:videoOutput];
    }

    if ([writer canAddInput:videoInput]) {
        [writer addInput:videoInput];
    }
}
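The snippet above only wires the reader and writer together; the samples still have to be pumped from one to the other. A minimal sketch of that transfer loop, reusing the `completion` block from the method declaration (the serial `queue` name is an assumption, and audio handling is omitted as in the original):

```objc
[reader startReading];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t queue = dispatch_queue_create("video.compress", DISPATCH_QUEUE_SERIAL);
[videoInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (videoInput.isReadyForMoreMediaData) {
        CMSampleBufferRef sampleBuffer = [videoOutput copyNextSampleBuffer];
        if (sampleBuffer) {
            [videoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else {
            // Source exhausted: finish this input and close the file.
            [videoInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{
                if (completion) {
                    completion(writer.status == AVAssetWriterStatusCompleted);
                }
            }];
            break;
        }
    }
}];
```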

The audio side is much the same and is omitted here.

Code: https://github.com/Alienchang/VideoCompresser


一、Looping with AVPlayer by observing playback end

  1. Register for the player's play-to-end notification
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidPlayToEndTime:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:nil];
  2. In the end handler, seek the video back to the start
- (void)playerItemDidPlayToEndTime:(NSNotification *)notification {
    [self.player seekToTime:kCMTimeZero];
    // The player pauses at the end by default, so resume playback.
    [self.player play];
}

二、Looping with AVPlayerLooper

A native API that can fully replace method one. Note that the AVQueuePlayer and the AVPlayerLooper must both be kept strongly referenced (e.g. in properties), otherwise looping stops as soon as they are deallocated.

NSURL *url = nil;  // URL of the video to loop
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL:url];
AVQueuePlayer *player = [AVQueuePlayer queuePlayerWithItems:@[playerItem]];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
// Keep a strong reference to the looper, e.g. in a property.
AVPlayerLooper *playerLooper = [AVPlayerLooper playerLooperWithPlayer:player templateItem:playerItem];
[player play];

三、Looping with MPMoviePlayerController

Note that MPMoviePlayerController has been deprecated since iOS 9; prefer AVPlayer in new code.

self.url = [NSURL fileURLWithPath:[[NSBundle mainBundle]pathForResource:@"new" ofType:@"mp4"]];
self.player = [[MPMoviePlayerController alloc] initWithContentURL:self.url];
[self.view addSubview:self.player.view];
self.player.shouldAutoplay = YES;
self.player.repeatMode = MPMovieRepeatModeOne;
[self.player.view setFrame:self.view.bounds];
self.player.scalingMode = MPMovieScalingModeAspectFill;
[self.player play];

四、Looping with ijkplayer

IJKFFMoviePlayerController from the ijkplayer demo is used for testing. IJKFFMoviePlayerController has no built-in looping, so it needs a small patch.

First add a BOOL property named loop to IJKFFMoviePlayerController, then wire up ijkmp_set_loop around line 421 of the .m file:

- (void)play
{
    if (!_mediaPlayer)
        return;

    [self setScreenOn:_keepScreenOnWhilePlaying];

    [self startHudTimer];
    if (self.loop) {
        ijkmp_set_loop(_mediaPlayer, INT_MAX);
    } else {
        ijkmp_set_loop(_mediaPlayer, 1);
    }
    
    ijkmp_start(_mediaPlayer);
}

Usage

self.videoPlayer = [[IJKFFMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:path] withOptions:options];
self.videoPlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth|UIViewAutoresizingFlexibleHeight;
self.videoPlayer.view.frame = self.view.bounds;
self.videoPlayer.loop = YES;
self.videoPlayer.scalingMode = IJKMPMovieScalingModeAspectFill;
self.videoPlayer.shouldAutoplay = YES;
[self.view addSubview:self.videoPlayer.view];
[self.videoPlayer prepareToPlay];
[self.videoPlayer play];


一、Video capture flow

1. Initialize the capture session
2. Configure the capture device
3. Configure the capture input
4. Configure the capture output and its parameters
5. Start capturing
6. Handle the captured data in the callback

Initialize the capture session

- (void)setupCaptureSession {
    self.captureSession = [AVCaptureSession new];
    [self.captureSession beginConfiguration];
    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        [self.captureSession setSessionPreset:AVCaptureSessionPreset1280x720];
    }
    [self.captureSession commitConfiguration];
}

Configure the capture device

- (void)setupDevice {
    // Grab the front camera here
    self.device = [self captureDeviceWithPosition:(AVCaptureDevicePositionFront)];
    // Live streaming typically uses 16-18 fps; film uses 24
    [self setFPS:16];
}
/// Fetch the capture device at the given position
- (AVCaptureDevice *)captureDeviceWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDevice *deviceRet = nil;
    if (position != AVCaptureDevicePositionUnspecified) {
        NSArray<AVCaptureDeviceType> *deviceTypes = @[AVCaptureDeviceTypeBuiltInWideAngleCamera,    // wide-angle lens
                                                      AVCaptureDeviceTypeBuiltInDualCamera];        // normally the main camera
        
        // Pass the requested position through instead of hard-coding the front camera
        AVCaptureDeviceDiscoverySession *sessionDiscovery = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:deviceTypes mediaType:AVMediaTypeVideo position:position];
        
        NSArray<AVCaptureDevice *> *devices = sessionDiscovery.devices; // currently available AVCaptureDevices
        for (AVCaptureDevice *device in devices) {
            if ([device position] == position) {
                deviceRet = device;
            }
        }
    }
    return deviceRet;
}
// Frame rate at which the device outputs frames
- (void)setFPS:(NSInteger)fps {
    if (fps > 0) {
        AVFrameRateRange *frameRateRange = self.device.activeFormat.videoSupportedFrameRateRanges.firstObject;
        if(!frameRateRange) {
            // No camera format available
            return;
        }

        if (fps >= frameRateRange.maxFrameRate) {
            fps = frameRateRange.maxFrameRate;
        } else if (fps <= frameRateRange.minFrameRate) {
            fps = frameRateRange.minFrameRate;
        }

        CMTime frameDuration = CMTimeMake(1 , (int)fps);
        [self.captureSession beginConfiguration];
        NSError *error = nil;
        if ([self.device lockForConfiguration:&error]) {
            self.device.activeVideoMaxFrameDuration = frameDuration;
            self.device.activeVideoMinFrameDuration = frameDuration;
            [self.device unlockForConfiguration];
        } else {
            // Failed to lock the device for configuration
        }
        [self.captureSession commitConfiguration];
    }
}

Configure the capture input

- (void)setupInput {
    NSError *error = nil;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:&error];
    [self.captureSession beginConfiguration];
    if ([self.captureSession canAddInput:self.deviceInput]) {
        [self.captureSession addInput:self.deviceInput];
    }
    [self.captureSession commitConfiguration];
}

Configure the capture output and its parameters

- (void)setupOutput {
    self.deviceOutput = [AVCaptureVideoDataOutput new];
    // Drop frames that arrive while the delegate is still busy with the previous one
    self.deviceOutput.alwaysDiscardsLateVideoFrames = YES;
    // Output color space: YUV 4:2:0 (RGB is also possible)
    self.deviceOutput.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
    };
    // Set the sample-buffer delegate and its callback queue
    [self.deviceOutput setSampleBufferDelegate:self queue:self.bufferQueue];
    self.captureConnection = [self.deviceOutput connectionWithMediaType:AVMediaTypeVideo];
    
    // Set the output image orientation
    if ([self.captureConnection isVideoOrientationSupported]) {
        [self.captureConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }
    
    // Mirror the output or not
    if ([self.captureConnection isVideoMirroringSupported]) {
        [self.captureConnection setVideoMirrored:YES];
    }
    
    [self.captureSession beginConfiguration];
    if ([self.captureSession canAddOutput:self.deviceOutput]) {
        [self.captureSession addOutput:self.deviceOutput];
    }
    [self.captureSession commitConfiguration];
}

Start capturing

- (void)startCapture {
    if (!(self.device && self.deviceInput && self.deviceOutput)) {
        return;
    }
    
    if (self.captureSession && ![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}

Handle the captured data in the callback

#pragma mark -- AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Grab each frame's pixel buffer
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"%@",pixelBuffer);
}
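Each callback delivers one frame as a CVPixelBufferRef. As a sketch of what the handler might do instead of just logging the buffer (a hypothetical helper, not part of the demo), its dimensions and pixel format can be read directly:

```objc
// Sketch: inspect a frame before handing it to the processing pipeline.
static void InspectPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // Matches the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange set on the output
    OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);
    NSLog(@"%zux%zu, format %u", width, height, (unsigned)format);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```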

Demo for this post: https://github.com/Alienchang/TestiOSCapture


刘畅

Recording life and learning

Senior iOS developer

China