dnn/openvino: support running inference via GPU
author    Ting Fu <ting.fu@intel.com>
          Wed, 9 Sep 2020 01:52:18 +0000 (09:52 +0800)
committer Guo, Yejun <yejun.guo@intel.com>
          Sat, 12 Sep 2020 08:15:30 +0000 (16:15 +0800)
commit    87cb24a1ca4a76e5a5a9969e3058a94d952e1b37
tree      8048e13e9e02229710e347e957d7bd2493e23246
parent    a406dde1d21b9f253f996e94a2fd2045898f9c37
dnn/openvino: support running inference via GPU

To enable the OpenVINO GPU backend, please:
1. install the required OpenCL drivers, see: https://github.com/intel/compute-runtime/releases/tag/19.41.14441
2. build the OpenVINO C library with GPU enabled: configure cmake with -DENABLE_CLDNN=ON
3. then run make, and add the OpenVINO C library to your environment variables
For detailed steps, please refer to: https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md
A small check that the GPU plugin is actually visible after these steps is sketched below.
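
The following standalone sketch is not part of this commit; it assumes the
c_api/ie_c_api.h header and the ie_core_* entry points of the OpenVINO C
library built above, and simply lists the devices the installed plugins
expose. "GPU" should appear in the output if the clDNN plugin and the
OpenCL drivers are set up correctly:

    #include <stdio.h>
    #include <c_api/ie_c_api.h>

    int main(void)
    {
        ie_core_t *core = NULL;
        ie_available_devices_t devices = {0};

        /* create an inference engine core with the default plugin config */
        if (ie_core_create("", &core) != OK)
            return 1;

        /* query the devices exposed by the installed plugins */
        if (ie_core_get_available_devices(core, &devices) == OK) {
            for (size_t i = 0; i < devices.num_devices; i++)
                printf("%s\n", devices.devices[i]);
            ie_core_available_devices_free(&devices);
        }

        ie_core_free(&core);
        return 0;
    }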

To run inference on GPU, please add: options=device=GPU
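
Roughly speaking (a sketch only, not the literal patch; the exact wiring
inside dnn_backend_openvino.c may differ), the device string parsed from
that option is forwarded as the device name when the network is loaded,
which is what selects the GPU plugin:

    /* hypothetical helper; "device" comes from options=device=... */
    static IEStatusCode load_on_device(ie_core_t *core, ie_network_t *network,
                                       const char *device, /* "CPU" or "GPU" */
                                       ie_executable_network_t **exe_network)
    {
        ie_config_t config = {NULL, NULL, NULL};
        return ie_core_load_network(core, network, device, &config, exe_network);
    }

A filter invocation would then look roughly like
-vf dnn_processing=dnn_backend=openvino:model=...:input=...:output=...:options=device=GPU
(the model/input/output values are placeholders).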

Signed-off-by: Ting Fu <ting.fu@intel.com>
Signed-off-by: Guo, Yejun <yejun.guo@intel.com>
libavfilter/dnn/dnn_backend_openvino.c