[release branch] [DOC] Links, typos and public package (#2506)
Fix links /nightly/ -> /2024/
Fix some of the links from /main/ to /releases/2024/2
Fix typos CVS-144337
Public OpenVINO link
dkalinowski authored Jun 17, 2024
1 parent 25a47cd commit 31ad50a
Showing 68 changed files with 178 additions and 178 deletions.
6 changes: 3 additions & 3 deletions Makefile
@@ -154,11 +154,11 @@ ifeq ($(findstring ubuntu,$(BASE_OS)),ubuntu)
ifeq ($(BASE_OS_TAG),20.04)
OS=ubuntu20
INSTALL_DRIVER_VERSION ?= "22.43.24595"
-DLDT_PACKAGE_URL ?= http://s3.toolbox.iotg.sclab.intel.com/ov-packages/l_openvino_toolkit_ubuntu20_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
+DLDT_PACKAGE_URL ?= https://storage.openvinotoolkit.org/repositories/openvino/packages/2024.2/linux/l_openvino_toolkit_ubuntu20_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
else ifeq ($(BASE_OS_TAG),22.04)
OS=ubuntu22
INSTALL_DRIVER_VERSION ?= "23.22.26516"
-DLDT_PACKAGE_URL ?= http://s3.toolbox.iotg.sclab.intel.com/ov-packages/l_openvino_toolkit_ubuntu22_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
+DLDT_PACKAGE_URL ?= https://storage.openvinotoolkit.org/repositories/openvino/packages/2024.2/linux/l_openvino_toolkit_ubuntu22_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
endif
endif
ifeq ($(BASE_OS),redhat)
@@ -173,7 +173,7 @@ ifeq ($(BASE_OS),redhat)
endif
DIST_OS=redhat
INSTALL_DRIVER_VERSION ?= "23.22.26516"
-DLDT_PACKAGE_URL ?= http://s3.toolbox.iotg.sclab.intel.com/ov-packages/l_openvino_toolkit_rhel8_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
+DLDT_PACKAGE_URL ?= https://storage.openvinotoolkit.org/repositories/openvino/packages/2024.2/linux/l_openvino_toolkit_rhel8_2024.2.0.15519.5c0f38f83f6_x86_64.tgz
endif

OVMS_CPP_DOCKER_IMAGE ?= openvino/model_server
40 changes: 20 additions & 20 deletions README.md
@@ -15,22 +15,22 @@ OpenVINO™ Model Server (OVMS) is a high-performance system for serving models.

![OVMS picture](docs/ovms_high_level.png)

-The models used by the server need to be stored locally or hosted remotely by object storage services. For more details, refer to [Preparing Model Repository](https://docs.openvino.ai/nightly/ovms_docs_models_repository.html) documentation. Model server works inside [Docker containers](https://docs.openvino.ai/nightly/ovms_docs_deploying_server.html#deploying-model-server-in-docker-container), on [Bare Metal](https://docs.openvino.ai/nightly/ovms_docs_deploying_server.html#deploying-model-server-on-baremetal-without-container), and in [Kubernetes environment](https://docs.openvino.ai/nightly/ovms_docs_deploying_server.html#deploying-model-server-in-kubernetes).
-Start using OpenVINO Model Server with a fast-forward serving example from the [Quickstart guide](https://docs.openvino.ai/nightly/ovms_docs_quick_start_guide.html) or explore [Model Server features](https://docs.openvino.ai/nightly/ovms_docs_features.html).
+The models used by the server need to be stored locally or hosted remotely by object storage services. For more details, refer to [Preparing Model Repository](https://docs.openvino.ai/2024/ovms_docs_models_repository.html) documentation. Model server works inside [Docker containers](https://docs.openvino.ai/2024/ovms_docs_deploying_server.html#deploying-model-server-in-docker-container), on [Bare Metal](https://docs.openvino.ai/2024/ovms_docs_deploying_server.html#deploying-model-server-on-baremetal-without-container), and in [Kubernetes environment](https://docs.openvino.ai/2024/ovms_docs_deploying_server.html#deploying-model-server-in-kubernetes).
+Start using OpenVINO Model Server with a fast-forward serving example from the [Quickstart guide](https://docs.openvino.ai/2024/ovms_docs_quick_start_guide.html) or explore [Model Server features](https://docs.openvino.ai/2024/ovms_docs_features.html).

Read [release notes](https://github.com/openvinotoolkit/model_server/releases) to find out what’s new.

### Key features:
-- **[NEW]** [Efficient Text Generation via OpenAI API - preview](https://docs.openvino.ai/nightly/ovms_docs_llm_reference.html)
-- [Python code execution](https://docs.openvino.ai/nightly/ovms_docs_python_support_reference.html)
-- [gRPC streaming](https://docs.openvino.ai/nightly/ovms_docs_streaming_endpoints.html)
-- [MediaPipe graphs serving](https://docs.openvino.ai/nightly/ovms_docs_mediapipe.html)
-- Model management - including [model versioning](https://docs.openvino.ai/nightly/ovms_docs_model_version_policy.html) and [model updates in runtime](https://docs.openvino.ai/nightly/ovms_docs_online_config_changes.html)
-- [Dynamic model inputs](https://docs.openvino.ai/nightly/ovms_docs_shape_batch_layout.html)
-- [Directed Acyclic Graph Scheduler](https://docs.openvino.ai/nightly/ovms_docs_dag.html) along with [custom nodes in DAG pipelines](https://docs.openvino.ai/nightly/ovms_docs_custom_node_development.html)
-- [Metrics](https://docs.openvino.ai/nightly/ovms_docs_metrics.html) - metrics compatible with Prometheus standard
+- **[NEW]** [Efficient Text Generation via OpenAI API - preview](https://docs.openvino.ai/2024/ovms_docs_llm_reference.html)
+- [Python code execution](https://docs.openvino.ai/2024/ovms_docs_python_support_reference.html)
+- [gRPC streaming](https://docs.openvino.ai/2024/ovms_docs_streaming_endpoints.html)
+- [MediaPipe graphs serving](https://docs.openvino.ai/2024/ovms_docs_mediapipe.html)
+- Model management - including [model versioning](https://docs.openvino.ai/2024/ovms_docs_model_version_policy.html) and [model updates in runtime](https://docs.openvino.ai/2024/ovms_docs_online_config_changes.html)
+- [Dynamic model inputs](https://docs.openvino.ai/2024/ovms_docs_shape_batch_layout.html)
+- [Directed Acyclic Graph Scheduler](https://docs.openvino.ai/2024/ovms_docs_dag.html) along with [custom nodes in DAG pipelines](https://docs.openvino.ai/2024/ovms_docs_custom_node_development.html)
+- [Metrics](https://docs.openvino.ai/2024/ovms_docs_metrics.html) - metrics compatible with Prometheus standard
- Support for multiple frameworks, such as TensorFlow, PaddlePaddle and ONNX
-- Support for [AI accelerators](https://docs.openvino.ai/nightly/about-openvino/compatibility-and-support/supported-devices.html)
+- Support for [AI accelerators](https://docs.openvino.ai/2024/about-openvino/compatibility-and-support/supported-devices.html)

**Note:** OVMS has been tested on Red Hat and Ubuntu. The latest publicly released Docker images are based on Ubuntu and UBI.
They are stored in:
@@ -40,26 +40,26 @@ They are stored in:

## Run OpenVINO Model Server

-A demonstration on how to use OpenVINO Model Server can be found in [our quick-start guide](https://docs.openvino.ai/nightly/ovms_docs_quick_start_guide.html).
+A demonstration on how to use OpenVINO Model Server can be found in [our quick-start guide](https://docs.openvino.ai/2024/ovms_docs_quick_start_guide.html).
For more information on using Model Server in various scenarios you can check the following guides:

-* [Model repository configuration](https://docs.openvino.ai/nightly/ovms_docs_models_repository.html)
+* [Model repository configuration](https://docs.openvino.ai/2024/ovms_docs_models_repository.html)

-* [Deployment options](https://docs.openvino.ai/nightly/ovms_docs_deploying_server.html)
+* [Deployment options](https://docs.openvino.ai/2024/ovms_docs_deploying_server.html)

-* [Performance tuning](https://docs.openvino.ai/nightly/ovms_docs_performance_tuning.html)
+* [Performance tuning](https://docs.openvino.ai/2024/ovms_docs_performance_tuning.html)

-* [Directed Acyclic Graph Scheduler](https://docs.openvino.ai/nightly/ovms_docs_dag.html)
+* [Directed Acyclic Graph Scheduler](https://docs.openvino.ai/2024/ovms_docs_dag.html)

-* [Custom nodes development](https://docs.openvino.ai/nightly/ovms_docs_custom_node_development.html)
+* [Custom nodes development](https://docs.openvino.ai/2024/ovms_docs_custom_node_development.html)

-* [Serving stateful models](https://docs.openvino.ai/nightly/ovms_docs_stateful_models.html)
+* [Serving stateful models](https://docs.openvino.ai/2024/ovms_docs_stateful_models.html)

* [Deploy using a Kubernetes Helm Chart](https://github.com/openvinotoolkit/operator/tree/main/helm-charts/ovms)

* [Deployment using Kubernetes Operator](https://operatorhub.io/operator/ovms-operator)

-* [Using binary input data](https://docs.openvino.ai/nightly/ovms_docs_binary_input.html)
+* [Using binary input data](https://docs.openvino.ai/2024/ovms_docs_binary_input.html)



@@ -73,7 +73,7 @@ For more information on using Model Server in various scenarios you can check the following guides:

* [RESTful API](https://restfulapi.net/)

-* [Benchmarking results](https://docs.openvino.ai/nightly/openvino_docs_performance_benchmarks.html)
+* [Benchmarking results](https://docs.openvino.ai/2024/openvino_docs_performance_benchmarks.html)

* [Speed and Scale AI Inference Operations Across Multiple Architectures](https://techdecoded.intel.io/essentials/speed-and-scale-ai-inference-operations-across-multiple-architectures/?elq_cid=3646480_ts1607680426276&erpm_id=6470692_ts1607680426276) - webinar recording
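The guides listed above all revolve around the same serving interface. As a rough, hedged illustration of the KServe-compatible REST API they document, the sketch below sends one inference request with the `requests` library; the server address, model name, input name and tensor shape are assumptions for illustration, not values taken from this repository.

```python
# Hedged sketch: one inference call against a model server's KServe v2 REST API.
# Assumptions: server on localhost:8000, model named "model", input named "input",
# expecting a 1x3x224x224 FP32 tensor.
import numpy as np
import requests

data = np.random.rand(1, 3, 224, 224).astype(np.float32)

payload = {
    "inputs": [
        {
            "name": "input",               # assumed input tensor name
            "shape": list(data.shape),
            "datatype": "FP32",
            "data": data.flatten().tolist(),
        }
    ]
}

response = requests.post(
    "http://localhost:8000/v2/models/model/infer",  # KServe v2 infer endpoint
    json=payload,
    timeout=10,
)
response.raise_for_status()
print(response.json()["outputs"][0]["shape"])
```

Only the URL and tensor metadata change between models; the payload structure is the standard KServe v2 inference request.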

2 changes: 1 addition & 1 deletion client/go/kserve-api/Dockerfile
@@ -26,7 +26,7 @@ RUN go install google.golang.org/protobuf/cmd/protoc-gen-go@…
RUN go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@…

# Compile API
-RUN wget https://raw.githubusercontent.com/openvinotoolkit/model_server/main/src/kfserving_api/grpc_predict_v2.proto
+RUN wget https://raw.githubusercontent.com/openvinotoolkit/model_server/releases/2024/2/src/kfserving_api/grpc_predict_v2.proto
RUN echo 'option go_package = "./grpc-client";' >> grpc_predict_v2.proto
RUN protoc --go_out="./" --go-grpc_out="./" ./grpc_predict_v2.proto

2 changes: 1 addition & 1 deletion client/java/kserve-api/pom.xml
@@ -64,7 +64,7 @@
</goals>
<configuration>
<url>
-https://raw.githubusercontent.com/openvinotoolkit/model_server/main/src/kfserving_api/grpc_predict_v2.proto</url>
+https://raw.githubusercontent.com/openvinotoolkit/model_server/releases/2024/2/src/kfserving_api/grpc_predict_v2.proto</url>
<outputFileName>grpc_predict_v2.proto</outputFileName>
<outputDirectory>src/main/proto</outputDirectory>
</configuration>
4 changes: 2 additions & 2 deletions client/python/ovmsclient/lib/README.md
@@ -6,7 +6,7 @@ OVMS client library contains only the necessary dependencies, so the whole package…

As OpenVINO Model Server API is compatible with TensorFlow Serving, it's possible to use `ovmsclient` with TensorFlow Serving instances on: Predict, GetModelMetadata and GetModelStatus endpoints.

-See [API documentation](https://github.com/openvinotoolkit/model_server/blob/main/client/python/ovmsclient/lib/docs/README.md) for details on what the library provides.
+See [API documentation](https://github.com/openvinotoolkit/model_server/blob/releases/2024/2/client/python/ovmsclient/lib/docs/README.md) for details on what the library provides.

```bash
git clone https://github.com/openvinotoolkit/model_server.git
@@ -136,4 +136,4 @@ results = client.predict(inputs=inputs, model_name="model")
#
```

-For more details on `ovmsclient` see [API reference](https://github.com/openvinotoolkit/model_server/blob/main/client/python/ovmsclient/lib/docs/README.md)
+For more details on `ovmsclient` see [API reference](https://github.com/openvinotoolkit/model_server/blob/releases/2024/2/client/python/ovmsclient/lib/docs/README.md)
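The Predict, GetModelMetadata and GetModelStatus calls mentioned above map to `predict`, `get_model_metadata` and `get_model_status` in `ovmsclient`. A minimal, hedged sketch of a gRPC session follows; the address, model name and input name are illustrative assumptions, not values from this repository.

```python
# Hedged ovmsclient sketch over gRPC, assuming a model server on localhost:9000
# serving a model named "model" with a single input named "input".
import numpy as np
from ovmsclient import make_grpc_client

client = make_grpc_client("localhost:9000")

# Status and metadata queries (GetModelStatus / GetModelMetadata equivalents)
print(client.get_model_status(model_name="model"))
print(client.get_model_metadata(model_name="model"))

# Predict call with a dummy tensor
inputs = {"input": np.zeros((1, 3, 224, 224), dtype=np.float32)}
results = client.predict(inputs=inputs, model_name="model")
print(results)
```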
4 changes: 2 additions & 2 deletions client/python/ovmsclient/lib/docs/pypi_overview.md
@@ -9,7 +9,7 @@ The `ovmsclient` package works both with OpenVINO&trade; Model Server and TensorFlow Serving.
The `ovmsclient` can replace `tensorflow-serving-api` package with reduced footprint and simplified interface.


-See [API reference](https://github.com/openvinotoolkit/model_server/blob/main/client/python/ovmsclient/lib/docs/README.md) for usage details.
+See [API reference](https://github.com/openvinotoolkit/model_server/blob/releases/2024/2/client/python/ovmsclient/lib/docs/README.md) for usage details.


## Usage example
@@ -38,4 +38,4 @@ results = client.predict(inputs=inputs, model_name="model")

```

-Learn more on `ovmsclient` [documentation site](https://github.com/openvinotoolkit/model_server/tree/main/client/python/ovmsclient/lib).
+Learn more on `ovmsclient` [documentation site](https://github.com/openvinotoolkit/model_server/tree/releases/2024/2/client/python/ovmsclient/lib).
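Besides the gRPC client, the library also provides `make_http_client` for its REST endpoint. A hedged sketch under the same kind of assumptions (address, model name and input tensor are illustrative only):

```python
# Hedged ovmsclient sketch over REST, assuming a model server REST endpoint on
# localhost:8000 and a model named "model" with input "input".
import numpy as np
from ovmsclient import make_http_client

client = make_http_client("localhost:8000")
inputs = {"input": np.ones((1, 10), dtype=np.float32)}
results = client.predict(inputs=inputs, model_name="model")
print(results)
```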
@@ -26,7 +26,7 @@ def reload_servables(self):
ConfigStatusResponse object with all models and their versions statuses
Raises:
-Exceptions for different serving reponses...
+Exceptions for different serving responses...
Examples:
@@ -50,7 +50,7 @@ def get_servables(self):
ConfigStatusResponse object with all models and their versions statuses
Raises:
-Exceptions for different serving reponses...
+Exceptions for different serving responses...
Examples:
