Summer update for OpenCV 4.x has been released.

DNN:
- TFLite models support, including int8 quantized models. #23161, #23409
- Enabled DNN module build without Protobuf dependency. #23604
- Improved layers => supported more models:
  - ONNX: Layer normalization #23047, GELU #23219 and QLinearSoftmax #23655.
  - Support for ONNX Split, Slice, Clip (Relu6) and Conv with auto_pad. #23319
  - Support for ONNX Sub, PRelu, ConvTranspose. #23401
  - Reduce refactored for robustness and potential follow-up improvements. #23613
  - Fixes for Segment Anything Model by Meta. #23491
  - Fixes in the nary element-wise layer's broadcasting.
- Further increased DNN speed on ARM and X86 by improving convolution, covering 1D and 3D cases, and supporting convolution + element-wise op fusion.
- Added a full FP16 computation branch on the ARMv8 platform, 1.5x faster than FP32 (FP16 Winograd is still pending). #22275
- Vulkan backend refactored for better performance and robustness. #23349
- Added the blobFromImageParam API to build network inputs with pre-processing. #22750

G-API:
- Streamlined preprocessing in the OpenVINO Inference Engine (IE) API 1.0 backend. #23668, #23786
- Aligned the OpenVINO IE API 1.0 backend with the latest OpenVINO 2023.0 (as some features were removed there). Note: this backend will be deprecated after OpenVINO removes API 1.0 support in its subsequent releases. #23796
- Introduced a brand new OpenVINO API 2.0 backend. #23595
- Implemented the required inference operations for the OpenVINO API 2.0 backend. #23799
- Microsoft® ONNX Runtime inference backend: updated the G-API ONNX RT backend to support ONNX RT version 1.14.1, which is now the minimum version required to build this backend. #23597
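The newly imported ONNX layers follow the standard operator definitions. As a reference for what the DNN module now computes, here is a minimal NumPy sketch of LayerNormalization (assuming the default normalization over the last axis) and the exact, erf-based GELU; this is an illustration of the math, not OpenCV's implementation:

```python
import numpy as np
from math import erf

def layer_norm(x, scale, bias, eps=1e-5, axis=-1):
    # ONNX LayerNormalization: normalize over the trailing axis,
    # then apply a learned elementwise affine transform.
    mean = x.mean(axis=axis, keepdims=True)
    var = x.var(axis=axis, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * scale + bias

def gelu(x):
    # Exact (erf-based) GELU, as in the ONNX definition.
    return 0.5 * x * (1.0 + np.vectorize(erf)(x / np.sqrt(2.0)))

x = np.array([[1.0, 2.0, 3.0, 4.0]])
y = layer_norm(x, scale=np.ones(4), bias=np.zeros(4))
print(y.mean())                 # ~0: the row is normalized
print(gelu(np.array([0.0]))[0]) # 0.0: GELU passes zero through
```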
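The nary element-wise broadcast fixes concern NumPy/ONNX-style broadcasting semantics: shapes are aligned from the trailing axis, and each dimension must either match or be 1 (in which case it is stretched). A quick NumPy illustration of the rule the layer has to implement:

```python
import numpy as np

# Shapes align from the right; size-1 axes stretch to match.
a = np.ones((2, 3, 4))   # e.g. an NCHW-like activation
b = np.ones((3, 1))      # broadcast against the last two axes of a
c = a + b                # b is treated as (1, 3, 1) -> result (2, 3, 4)
print(c.shape)           # (2, 3, 4)
```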
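The blobFromImageParam API mentioned above (the name here is as given in the changelog; the released symbol may differ slightly) bundles the usual pre-processing parameters into one struct. Conceptually it applies the classic blobFromImage pipeline: optional channel swap, mean subtraction, scaling, and HWC-to-NCHW layout. A plain-NumPy sketch of that transform, with illustrative scale and mean values:

```python
import numpy as np

def blob_from_image(img, scalefactor=1.0, mean=(0.0, 0.0, 0.0), swap_rb=False):
    # img: HxWxC image -> 1xCxHxW float32 "blob" (NCHW), mirroring the
    # swap / mean-subtract / scale steps done before network inference.
    x = img.astype(np.float32)
    if swap_rb:
        x = x[..., ::-1]                                  # BGR <-> RGB
    x = (x - np.asarray(mean, dtype=np.float32)) * scalefactor
    x = x.transpose(2, 0, 1)                              # HWC -> CHW
    return x[np.newaxis, ...]                             # add batch dim

img = np.zeros((4, 5, 3), dtype=np.uint8)
blob = blob_from_image(img, scalefactor=1 / 255.0,
                       mean=(104, 117, 123), swap_rb=True)
print(blob.shape)  # (1, 3, 4, 5)
```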