%define so_ver 2400

%global desc %{expand: \
OpenVINO is an open-source toolkit for optimizing and deploying deep learning
models from cloud to edge. It accelerates deep learning inference across various
use cases, such as generative AI, video, audio, and language with models from
popular frameworks like PyTorch, TensorFlow, ONNX, and more.}

Name: openvino
Version: 2024.0.0
Release: %autorelease
Summary: Toolkit for optimizing and deploying AI inference
License: Apache-2.0
URL: https://github.com/openvinotoolkit/openvino
Source0: %url/archive/%{version}/%{name}-%{version}.tar.gz
Source1: https://github.com/openvinotoolkit/oneDNN/archive/f82148befdbdc9576ec721c9d500155ee4de8060/onednn-f82148b.tar.gz
Source2: https://github.com/openvinotoolkit/mlas/archive/d1bc25ec4660cddd87804fcf03b2411b5dfb2e94/mlas-d1bc25e.tar.gz
Source3: dependencies.cmake
Patch0: openvino-fedora.patch

ExclusiveArch: x86_64

BuildRequires: cmake
BuildRequires: gcc-c++
BuildRequires: patchelf
BuildRequires: pugixml-devel
BuildRequires: pybind11-devel
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: ShellCheck
BuildRequires: zlib-ng-compat-devel
%ifarch x86_64
BuildRequires: xbyak-devel
%endif
BuildRequires: tbb-devel

Provides: bundled(onednn)
Provides: bundled(mlas)

Requires: lib%{name}-ir-frontend = %{version}
Requires: lib%{name}-pytorch-frontend = %{version}
Requires: numpy

Recommends: %{name}-auto-plugin = %{version}
Recommends: %{name}-auto-batch-plugin = %{version}
Recommends: %{name}-hetero-plugin = %{version}
Recommends: %{name}-intel-cpu-plugin = %{version}

%description
%{desc}

%package devel
Summary: Development files for %{name}
Requires: %{name}%{?_isa} = %{version}-%{release}

%description devel
The %{name}-devel package contains libraries and header files for
developing applications that use %{name}.

%package auto-plugin
Summary: OpenVINO Auto Plugin
Requires: %{name}%{?_isa} = %{version}-%{release}

%description auto-plugin
The AUTO plugin provides a unified device that enables developers to code
deep learning applications once and deploy them anywhere.

%package auto-batch-plugin
Summary: OpenVINO Auto Batch Plugin
Requires: %{name}%{?_isa} = %{version}-%{release}

%description auto-batch-plugin
The Auto Batch plugin performs automatic batching on the fly to improve
device utilization by grouping inference requests together, without
programming effort from the user.

%package hetero-plugin
Summary: OpenVINO Hetero Plugin
Requires: %{name}%{?_isa} = %{version}-%{release}

%description hetero-plugin
Heterogeneous execution automatically splits inference of one model between
several devices, so that accelerators process the heaviest parts of the model,
unsupported operations fall back to devices such as the CPU, and all available
hardware is used more efficiently during a single inference.

%package intel-cpu-plugin
Summary: OpenVINO Intel CPU Plugin
Requires: %{name}%{?_isa} = %{version}-%{release}

%description intel-cpu-plugin
The CPU plugin is developed to achieve high-performance inference of neural
networks on Intel x86-64 and Arm CPUs. The 11th generation and later Intel
CPUs provide an even further performance boost, especially with INT8 models.

%package -n lib%{name}-ir-frontend
Summary: OpenVINO IR Frontend
Requires: %{name}%{?_isa} = %{version}-%{release}

%description -n lib%{name}-ir-frontend
The primary function of the OpenVINO IR Frontend is to load an OpenVINO IR
into memory.
%package -n lib%{name}-pytorch-frontend
Summary: OpenVINO PyTorch Frontend
Requires: %{name}%{?_isa} = %{version}-%{release}

%description -n lib%{name}-pytorch-frontend
The PyTorch Frontend is a C++-based OpenVINO Frontend component that is
responsible for reading and converting a PyTorch model to an ov::Model object
that can be further serialized into the Intermediate Representation (IR)
format.

%package -n python3-%{name}
Summary: OpenVINO Python API
Requires: %{name}%{?_isa} = %{version}-%{release}

%description -n python3-%{name}
The OpenVINO Python API allows users to use the OpenVINO library in their
Python code. It provides bindings to basic and advanced APIs from the
OpenVINO runtime.

%prep
%autosetup -p1

# Remove the bundled third-party deps and use the packaged dependencies.cmake instead
rm -rf thirdparty/*
cp %{SOURCE3} thirdparty/

# Unpack the intel_cpu plugin third-party deps (oneDNN and MLAS)
tar xf %{SOURCE1}
cp -r oneDNN-*/* src/plugins/intel_cpu/thirdparty/onednn
tar xf %{SOURCE2}
cp -r mlas-*/* src/plugins/intel_cpu/thirdparty/mlas

%build
%cmake \
    -DCMAKE_BUILD_TYPE=RelWithDebInfo \
    -DCMAKE_CXX_FLAGS="%{optflags} -Wformat -Wformat-security" \
    -DENABLE_CLANG_FORMAT=OFF \
    -DCMAKE_COMPILE_WARNING_AS_ERROR=OFF \
    -DENABLE_QSPECTRE=OFF \
    -DENABLE_INTEGRITYCHECK=OFF \
    -DENABLE_SANITIZER=OFF \
    -DENABLE_UB_SANITIZER=OFF \
    -DENABLE_THREAD_SANITIZER=OFF \
    -DENABLE_COVERAGE=OFF \
    -DENABLE_FASTER_BUILD=OFF \
    -DENABLE_CPPLINT=OFF \
    -DENABLE_CPPLINT_REPORT=OFF \
    -DENABLE_GAPI_PREPROCESSING=OFF \
    -DENABLE_NCC_STYLE=OFF \
    -DENABLE_UNSAFE_LOCATIONS=OFF \
    -DENABLE_FUZZING=OFF \
    -DENABLE_PROFILING_ITT=OFF \
    -DENABLE_PKGCONFIG_GEN=ON \
    -DENABLE_STRICT_DEPENDENCIES=OFF \
    -DENABLE_DEBUG_CAPS=ON \
    -DENABLE_AUTO=ON \
    -DENABLE_AUTO_BATCH=ON \
    -DENABLE_HETERO=ON \
    -DENABLE_INTEL_CPU=ON \
    -DENABLE_MLAS_FOR_CPU=ON \
    -DENABLE_MLAS_FOR_CPU_DEFAULT=ON \
    -DENABLE_INTEL_GNA=OFF \
    -DENABLE_INTEL_GPU=OFF \
    -DENABLE_ONEDNN_FOR_GPU=OFF \
    -DENABLE_MULTI=ON \
    -DENABLE_PROXY=ON \
    -DENABLE_TEMPLATE=ON \
    -DENABLE_OV_ONNX_FRONTEND=OFF \
    -DENABLE_OV_PADDLE_FRONTEND=OFF \
    -DENABLE_OV_IR_FRONTEND=ON \
    -DENABLE_OV_PYTORCH_FRONTEND=ON \
    -DENABLE_OV_TF_FRONTEND=OFF \
    -DENABLE_OV_TF_LITE_FRONTEND=OFF \
    -DENABLE_PYTHON=ON \
    -DOV_CPACK_PYTHONDIR=%{python3_sitearch} \
    -DENABLE_JS=OFF \
    -DENABLE_SYSTEM_LIBS_DEFAULT=ON \
    -DENABLE_SYSTEM_OPENCL=OFF \
    -DENABLE_SYSTEM_PUGIXML=ON \
    -DENABLE_SYSTEM_PROTOBUF=OFF \
    -DTHREADING=TBB \
    -DENABLE_SYSTEM_TBB=ON \
    -DTBB_LIB_INSTALL_DIR=%{_libdir} \
    -DENABLE_TBBBIND_2_5=OFF \
    -DENABLE_TBB_RELEASE_ONLY=ON \
    -DENABLE_SAMPLES=OFF \
    -DENABLE_TESTS=OFF \
    -DBUILD_SHARED_LIBS=ON
%cmake_build

%install
%cmake_install

# Remove unneeded files
rm -rfv %{buildroot}%{python3_sitearch}/requirements.txt
rm -rfv %{buildroot}%{python3_sitearch}/openvino/preprocess/torchvision/requirements.txt

%files
%license LICENSE
%doc CONTRIBUTING.md README.md
%{_libdir}/lib%{name}.so.%{version}
%{_libdir}/lib%{name}.so.%{so_ver}
%{_libdir}/lib%{name}_c.so.%{version}
%{_libdir}/lib%{name}_c.so.%{so_ver}

%files devel
%{_includedir}/%{name}
%{_libdir}/lib%{name}.so
%{_libdir}/lib%{name}_c.so
%{_libdir}/lib%{name}_pytorch_frontend.so
%{_libdir}/cmake/openvino-%{version}
%{_libdir}/pkgconfig/%{name}.pc

%files auto-plugin
%{_libdir}/%{name}-%{version}/lib%{name}_auto_plugin.so

%files auto-batch-plugin
%{_libdir}/%{name}-%{version}/lib%{name}_auto_batch_plugin.so

%files hetero-plugin
%{_libdir}/%{name}-%{version}/lib%{name}_hetero_plugin.so

%files intel-cpu-plugin
%{_libdir}/%{name}-%{version}/lib%{name}_intel_cpu_plugin.so

%files -n lib%{name}-ir-frontend
%{_libdir}/lib%{name}_ir_frontend.so.%{version}
%{_libdir}/lib%{name}_ir_frontend.so.%{so_ver}

%files -n lib%{name}-pytorch-frontend
%{_libdir}/lib%{name}_pytorch_frontend.so.%{version}
%{_libdir}/lib%{name}_pytorch_frontend.so.%{so_ver}

%files -n python3-%{name}
%{python3_sitearch}/%{name}

%changelog
%autochangelog