From 14e789ea8783c696c8461b59a6972dcab62ed939 Mon Sep 17 00:00:00 2001
From: Roman Kazantsev
Date: Thu, 9 Jan 2025 17:40:56 +0400
Subject: [PATCH] [DOCS][Keras] Document Keras 3 models support and OpenVINO backend (#28345)

**Details:** Document Keras 3 models support and OpenVINO backend

**Ticket:** TBD

---------

Signed-off-by: Kazantsev, Roman
Co-authored-by: Karol Blaszczak
---
 README.md                                     |  1 +
 .../model-preparation/convert-model-keras.rst | 87 +++++++++++++++++++
 .../model-preparation/convert-model-to-ir.rst |  1 +
 3 files changed, 89 insertions(+)
 create mode 100644 docs/articles_en/openvino-workflow/model-preparation/convert-model-keras.rst

diff --git a/README.md b/README.md
index 7e9b173530de61..8019bb892023f2 100644
--- a/README.md
+++ b/README.md
@@ -127,6 +127,7 @@ Learn how to run LLMs and GenAI with [Samples](https://github.com/openvinotoolki
 - [OpenVINO Execution Provider for ONNX Runtime](https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html) - use OpenVINO as a backend with your existing ONNX Runtime code.
 - [LlamaIndex](https://docs.llamaindex.ai/en/stable/examples/llm/openvino/) - build context-augmented GenAI applications with the LlamaIndex framework and enhance runtime performance with OpenVINO.
 - [LangChain](https://python.langchain.com/docs/integrations/llms/openvino/) - integrate OpenVINO with the LangChain framework to enhance runtime performance for GenAI applications.
+- [Keras 3](https://github.com/keras-team/keras) - Keras 3 is a multi-backend deep learning framework that lets you switch model inference to the OpenVINO backend through the standard Keras API.
 
 Check out the [Awesome OpenVINO](https://github.com/openvinotoolkit/awesome-openvino) repository to discover a collection of community-made AI projects based on OpenVINO!

diff --git a/docs/articles_en/openvino-workflow/model-preparation/convert-model-keras.rst b/docs/articles_en/openvino-workflow/model-preparation/convert-model-keras.rst
new file mode 100644
index 00000000000000..a3e456e4495354
--- /dev/null
+++ b/docs/articles_en/openvino-workflow/model-preparation/convert-model-keras.rst
@@ -0,0 +1,87 @@
Converting a Keras Model
========================


.. meta::
   :description: Learn how to convert a model from the
                 Keras format to an OpenVINO model.


This article explains how to convert Keras 3 models to the OpenVINO Intermediate Representation (IR) format.
For instructions on converting Keras 2 models, refer to :doc:`TensorFlow Model Conversion <convert-model-tensorflow>`.

To convert a Keras 3 model, first export it to a lightweight TensorFlow SavedModel artifact
and then convert that artifact to an OpenVINO model with the ``convert_model`` function.
Here is a code example:

.. code-block:: py
   :force:

   import keras_hub
   import openvino as ov

   model = keras_hub.models.BertTextClassifier.from_preset(
       "bert_base_en_uncased",
       num_classes=4,
       preprocessor=None,
   )

   # export the Keras 3 model to a TensorFlow SavedModel directory
   model.export("bert_base")

   # convert the SavedModel to an OpenVINO model
   ov_model = ov.convert_model("bert_base")


.. note::

   The resulting OpenVINO IR model can be saved to disk with no additional Keras-specific steps.
   Use the standard ``ov.save_model(ov_model, 'model.xml')`` call.
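Once converted, the model can be compiled and run with OpenVINO Runtime right away. The snippet below is a
minimal sketch that continues the example above; the input names (``token_ids``, ``segment_ids``,
``padding_mask``) and the dummy ``(1, 12)`` shapes are assumptions based on the KerasHub BERT signature,
so check ``ov_model.inputs`` for the names your converted model actually exposes:

.. code-block:: py
   :force:

   import numpy as np
   import openvino as ov

   # inspect the converted model to confirm its input names and shapes
   print(ov_model.inputs)

   # compile the model for a specific device, for example CPU
   compiled_model = ov.compile_model(ov_model, "CPU")

   # dummy BERT-style inputs; a real application would pass tokenized text (assumed input names)
   features = {
       "token_ids": np.ones((1, 12), dtype="int32"),
       "segment_ids": np.zeros((1, 12), dtype="int32"),
       "padding_mask": np.ones((1, 12), dtype="int32"),
   }

   # run inference; the result maps output ports to numpy arrays
   results = compiled_model(features)
   logits = results[compiled_model.output(0)]

``ov.compile_model`` targets the ``AUTO`` device by default, so the explicit ``"CPU"`` argument above is only an example.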
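The same export-and-convert flow applies to any Keras 3 model, not only KerasHub presets. The sketch below
uses a small, hypothetical ``keras.Sequential`` model for illustration; it assumes TensorFlow is installed
and a backend that supports ``export()`` (for example, the TensorFlow backend), since ``export()`` produces
a TensorFlow SavedModel artifact:

.. code-block:: py
   :force:

   import keras
   import openvino as ov

   # a small illustrative Keras 3 model; any built Keras 3 model follows the same flow
   model = keras.Sequential([
       keras.Input(shape=(28, 28)),
       keras.layers.Flatten(),
       keras.layers.Dense(64, activation="relu"),
       keras.layers.Dense(10, activation="softmax"),
   ])

   # export to a TensorFlow SavedModel directory, convert it, and save the IR
   model.export("simple_model")
   ov_model = ov.convert_model("simple_model")
   ov.save_model(ov_model, "simple_model.xml")

The resulting ``simple_model.xml`` and ``simple_model.bin`` pair can then be loaded with OpenVINO Runtime like any other IR.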
Alternatively, a model exported to the TensorFlow SavedModel format can be converted to OpenVINO IR
with the ``ovc`` command-line tool. Here is an example:

.. code-block:: sh
   :force:

   ovc bert_base


Run inference in Keras 3 with the OpenVINO backend
###################################################

Starting with release 3.8, Keras provides native integration with the OpenVINO backend for accelerated inference.
This integration lets you apply OpenVINO performance optimizations directly within the Keras workflow,
speeding up inference on OpenVINO-supported hardware.

To switch to the OpenVINO backend in Keras 3, set the ``KERAS_BACKEND`` environment variable to ``"openvino"``
or specify the backend in the local configuration file at ``~/.keras/keras.json``.
Note that the backend must be selected before ``keras`` is imported.
Here is an example of how to run inference with a model (trained with the PyTorch, JAX, or TensorFlow backend)
in Keras 3, using the OpenVINO backend:

.. code-block:: py
   :force:

   import os

   # the backend must be set before keras is imported
   os.environ["KERAS_BACKEND"] = "openvino"

   import numpy as np
   import keras
   import keras_hub

   features = {
       "token_ids": np.ones(shape=(2, 12), dtype="int32"),
       "segment_ids": np.array([[0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 0, 0]] * 2),
       "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0]] * 2),
   }

   # load a pretrained model from KerasHub
   bert = keras_hub.models.BertTextClassifier.from_preset(
       "bert_base_en_uncased",
       num_classes=4,
       preprocessor=None,
   )

   predictions = bert.predict(features)

.. note::

   The OpenVINO backend may currently lack support for some operations.
   Operation coverage is being expanded in upcoming Keras releases.

diff --git a/docs/articles_en/openvino-workflow/model-preparation/convert-model-to-ir.rst b/docs/articles_en/openvino-workflow/model-preparation/convert-model-to-ir.rst
index dd2fc35c56e92b..f36a235cf79f77 100644
--- a/docs/articles_en/openvino-workflow/model-preparation/convert-model-to-ir.rst
+++ b/docs/articles_en/openvino-workflow/model-preparation/convert-model-to-ir.rst
@@ -14,6 +14,7 @@ Convert to OpenVINO IR
    Convert from TensorFlow Lite
    Convert from PaddlePaddle
    Convert from JAX/Flax
+   Convert from Keras <convert-model-keras>