git.sesse.net Git - ffmpeg/commit
dnn: add openvino as one of dnn backend
author    Guo, Yejun <yejun.guo@intel.com>
          Mon, 25 May 2020 07:38:09 +0000 (15:38 +0800)
committer Guo, Yejun <yejun.guo@intel.com>
          Thu, 2 Jul 2020 01:36:34 +0000 (09:36 +0800)
commit    ff37ebaf30e675227655d9055069471bb45e5ceb
tree      dac364e4522189807697df39cbd81362fa0d5ac1
parent    1884d887bae30d702ac4d059fe80646e8d2f294b
dnn: add openvino as one of dnn backend

OpenVINO is a Deep Learning Deployment Toolkit available at
https://github.com/openvinotoolkit/openvino. It supports CPU, GPU
and heterogeneous plugins to accelerate deep learning inference.
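A backend such as the new dnn_backend_openvino.c drives OpenVINO through
its inference engine C API. The following is a minimal sketch of that
flow, not the backend itself; model.xml/model.bin are placeholder file
names and error handling is reduced to single checks:

```c
/* Sketch of the OpenVINO inference engine C API flow: create a core,
 * read an IR model, load it onto a device, then run inference.
 * File names are placeholders; real code checks every status. */
#include <stdio.h>
#include <c_api/ie_c_api.h>

int main(void)
{
    ie_core_t *core = NULL;
    ie_network_t *network = NULL;
    ie_executable_network_t *exe_network = NULL;
    ie_infer_request_t *request = NULL;

    /* Create the core and read the IR model (.xml topology + .bin weights) */
    if (ie_core_create("", &core) != OK ||
        ie_core_read_network(core, "model.xml", "model.bin", &network) != OK)
        return 1;

    /* Load the network onto the CPU plugin (the MKL-DNN path) */
    ie_config_t config = { NULL, NULL, NULL };
    if (ie_core_load_network(core, network, "CPU", &config, &exe_network) != OK)
        return 1;

    /* Create an inference request and run it synchronously */
    if (ie_exec_network_create_infer_request(exe_network, &request) != OK ||
        ie_infer_request_infer(request) != OK)
        return 1;

    /* Release everything in reverse order of creation */
    ie_infer_request_free(&request);
    ie_exec_network_free(&exe_network);
    ie_network_free(&network);
    ie_core_free(&core);
    printf("inference done\n");
    return 0;
}
```

Input and output buffers would be attached to the request via blobs
before ie_infer_request_infer() is called; that part is omitted here.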

Please refer to https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md
for how to build openvino (the C library is built at the same time). Add
the cmake option -DENABLE_MKL_DNN=ON to enable the CPU path. With the
default options, the header files and libraries are installed to
/usr/local/deployment_tools/inference_engine/ on my system.

To build FFmpeg with OpenVINO, taking my system as an example, run:
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/deployment_tools/inference_engine/lib/intel64/:/usr/local/deployment_tools/inference_engine/external/tbb/lib/
$ ../ffmpeg/configure --enable-libopenvino --extra-cflags=-I/usr/local/deployment_tools/inference_engine/include/ --extra-ldflags=-L/usr/local/deployment_tools/inference_engine/lib/intel64
$ make
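Once a DNN filter is wired up to the new backend (filter support lands
in follow-up patches), selecting it could look like the following; the
model file, tensor names, and image files here are all placeholders:

```shell
# Hypothetical invocation: run the dnn_processing filter with the
# openvino backend on an OpenVINO IR model. model.xml and the
# input/output tensor names (x, y) are placeholders.
ffmpeg -i input.jpg -vf \
  dnn_processing=dnn_backend=openvino:model=model.xml:input=x:output=y \
  output.jpg
```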

Here are the features provided by OpenVINO inference engine:
- support for more DNN model formats
It supports TensorFlow, Caffe, ONNX, MXNet and Kaldi by converting them
into the OpenVINO format with a Python script, and a torch model
can first be converted into ONNX and then into the OpenVINO format.

See the script at https://github.com/openvinotoolkit/openvino/tree/master/model-optimizer/mo.py,
which also performs some optimization at the model level.
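For example, converting a TensorFlow frozen graph to the OpenVINO IR
format with that Model Optimizer script might look like this (model.pb
and the output directory are placeholder names):

```shell
# Convert a TensorFlow frozen graph into OpenVINO IR form, producing
# model.xml (topology) and model.bin (weights) in ./ir_model.
python3 mo.py --input_model model.pb --output_dir ./ir_model
```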

- optimization at the inference stage
It optimizes for x86 CPUs with SSE, AVX, etc.

It also optimizes for Intel GPUs based on OpenCL.
(only Intel GPUs are supported because Intel OpenCL extensions are used
for the optimization)

Signed-off-by: Guo, Yejun <yejun.guo@intel.com>
Signed-off-by: Pedro Arthur <bygrandao@gmail.com>
configure
libavfilter/dnn/Makefile
libavfilter/dnn/dnn_backend_openvino.c [new file with mode: 0644]
libavfilter/dnn/dnn_backend_openvino.h [new file with mode: 0644]
libavfilter/dnn/dnn_interface.c
libavfilter/dnn_interface.h