Java: Implementation Approaches for Reading Video from an IO Stream and Extracting Frames

Date: 2022-11-28 16:40:46

  There are many ways to obtain a video-processing object: reading a local file, reading a URL, reading a camera, and so on. A workable approach for parsing a video directly from a stream, however, is hard to find. Two approaches for handling a video stream are described below (the scenario here is a user-uploaded video, and both require ffmpeg + opencv to be installed on the server):

  1. Save the IO stream to a local file, then read it

      This approach involves little technical complexity: with java.io plus OpenCV's VideoCapture, frame extraction and the other video operations can be implemented directly.

      1) Save locally

        For convenience, saving to disk uses the org.apache.commons.io.FileUtils.copyInputStreamToFile(InputStream, File) method directly:

// videoFile is the uploaded MultipartFile; path and getRandomFileName() are local helpers
InputStream videoInputStream = videoFile.getInputStream();
File file = new File(path + getRandomFileName() + ".mp4");
FileUtils.copyInputStreamToFile(videoInputStream, file);

      2) Video parsing

        Here the video can be parsed directly with the VideoCapture class from OpenCV (built with ffmpeg integration):

VideoCapture videoCapture = new VideoCapture(file.getPath());
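
        Since a VideoCapture constructed with a source is opened immediately (see section 2 below), it is worth verifying that the open actually succeeded before reading frames. A minimal check, assuming the org.opencv 2.4.x binding used here:

if (!videoCapture.isOpened()) {
    // the source could not be opened, e.g. unsupported codec or a bad path
    throw new IllegalStateException("failed to open video: " + file.getPath());
}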

      3) Business requirement

        The project requires taking 20 frames from the first two seconds of the video and storing them as a collection of Mat matrices.

// the capture-property constants below come from javacv (the bytedeco javacpp-presets)
final double timeCount = 2.0;  // business requirement: sample the first two seconds
final double EPSILON = 1e-6;   // floating-point correction value (any small value works)
double rawFps = videoCapture.get(opencv_highgui.CV_CAP_PROP_FPS); // raw frame rate of the video
double validFps = Math.min(10.0, rawFps);                         // cap at 10 fps -> at most 20 frames in 2 s
double validTimeGap = 1.0 / validFps;                             // seconds between sampled frames
List<Mat> frameList = new ArrayList<>();
try {
    double currentTime = 0.0;
    while (currentTime + EPSILON < timeCount) {
        // seek to the target position (in milliseconds), then read one frame
        videoCapture.set(opencv_highgui.CV_CAP_PROP_POS_MSEC, currentTime * 1000);
        Mat frame = new Mat();
        videoCapture.read(frame);
        frameList.add(frame);
        currentTime += validTimeGap;
    }
} catch (Exception e) {
    log.error("read frame error : ", e);
} finally {
    videoCapture.release(); // release the native capture (see section 3)
}

  2. Read the stream directly

    Reading the stream directly is backed by the FFmpegFrameGrabber class from Bytedeco's javacv; a salute to the Bytedeco team here.

    1) Read the IO stream and build an FFmpegFrameGrabber

InputStream inputStream = videoFile.getInputStream();
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputStream);

    2) Business requirement

    FFmpegFrameGrabber and VideoCapture differ in how they are opened and closed: a VideoCapture initialized directly through its constructor is opened without a manual open() call, while FFmpegFrameGrabber has a dedicated method that starts the video parsing: start().

grabber.start();
// sample frames from the first timeCount seconds and convert each kept one to an OpenCV Mat
// (fpsDefine and timeCount are the business constants from above: 10 fps and 2 seconds)
List<Mat> mats = new ArrayList<>();
OpenCVFrameConverter.ToMat toMat = new OpenCVFrameConverter.ToMat();
double fps = grabber.getFrameRate();
double each = Math.ceil(fps / fpsDefine); // keep one frame out of every `each`
double count = fps * timeCount;           // total number of frames to walk through
for (int i = 0; i < count; i++) {
    Frame frame = grabber.grabImage();    // frames must be grabbed sequentially
    if (i % each == 0.0) {
        opencv_core.Mat mat = toMat.convert(frame);
        if (mat != null) {
            // wrap the cloned bytedeco Mat's native address in an org.opencv.core.Mat
            Mat matUse = new Mat(mat.clone().address());
            mats.add(matUse);
            mat.release();
        }
    }
}

  3. Similarities and differences between the two approaches

    1. Bytedeco's javacv package ships converters between Frame, Mat and BufferedImage; in practice, mind the conversion between the bytedeco Mat and OpenCV's own Mat, as sketched below.
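
    A minimal sketch of those conversions, assuming a started grabber as above (OpenCVFrameConverter and Java2DFrameConverter are javacv classes; the address() trick is the same one used in the sampling loop):

OpenCVFrameConverter.ToMat toMatConverter = new OpenCVFrameConverter.ToMat();
Java2DFrameConverter toImageConverter = new Java2DFrameConverter();

Frame frame = grabber.grabImage();                            // any grabbed video frame
opencv_core.Mat bytedecoMat = toMatConverter.convert(frame);  // Frame -> bytedeco Mat
BufferedImage image = toImageConverter.convert(frame);        // Frame -> BufferedImage

// bytedeco Mat -> org.opencv.core.Mat by wrapping the native address
org.opencv.core.Mat openCvMat = new org.opencv.core.Mat(bytedecoMat.address());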

    2. Both rely on native ffmpeg + opencv methods (a brief note on native-library loading follows the dependency lists), yet their pom dependencies differ:

    VideoCapture:

        <dependency>
            <groupId>org.opencv</groupId>
            <artifactId>opencv</artifactId>
            <version>2.4.13</version>
        </dependency>
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacv</artifactId>
            <version>1.4.3</version>
        </dependency>

    FFmpegFrameGrabber:

        <dependency>
            <groupId>org.opencv</groupId>
            <artifactId>opencv</artifactId>
            <version>2.4.13</version>
        </dependency>
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacv</artifactId>
            <version>1.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.bytedeco.javacpp-presets</groupId>
            <artifactId>opencv</artifactId>
            <version>3.4.3-1.4.3</version>
            <classifier>linux-x86_64</classifier>
        </dependency>
        <dependency>
            <groupId>org.bytedeco.javacpp-presets</groupId>
            <artifactId>ffmpeg</artifactId>
            <version>4.0.2-1.4.3</version>
            <classifier>linux-x86_64</classifier>
        </dependency>
        <dependency>
            <groupId>org.bytedeco</groupId>
            <artifactId>javacpp</artifactId>
            <version>1.4.3</version>
        </dependency>
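
    As mentioned in point 2, both paths end up in native code, but they bind to it differently. A hedged sketch, assuming the OpenCV native library is installed on the server and visible on java.library.path: the plain org.opencv binding does not load its native library by itself, so the VideoCapture code needs an explicit load before first use, whereas the bytedeco javacpp-presets artifacts with the linux-x86_64 classifier bundle their natives and javacpp loads them automatically when the classes are first used.

// required once for the org.opencv (VideoCapture) path; the bytedeco path needs no such call
static {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // resolves to opencv_java2413 for 2.4.13
}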

    3. Both require releasing native resources manually. This covers the IO stream, the video capture/grabber, and the Mat matrices, and the release methods differ:

    VideoCapture: release() (a sketch of this cleanup follows the grabber example below)

    FFmpegFrameGrabber: stop()

        finally {
            try {
                inputStream.close();
            } catch (IOException e) {
                log.error("close InputStream error : ", e);
            }
            try {
                grabber.stop();
            } catch (FrameGrabber.Exception e) {
                log.error("stop grabber error : ", e);
            }
            for (Mat mat : mats) {
                if (mat != null) {
                    mat.release();
                }
            }
        }
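
    For the VideoCapture approach the cleanup is analogous; a minimal sketch, run once the sampled frames from section 1 have been consumed:

        finally {
            videoCapture.release(); // closes the file and frees the native capture resources
            for (Mat frame : frameList) {
                if (frame != null) {
                    frame.release(); // frees each sampled frame's native memory
                }
            }
        }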