Merge pull request #1206 from run-ai/support-matrix-integration (#1207)

Merged 1 commit on Oct 31, 2024
54 changes: 27 additions & 27 deletions docs/platform-admin/integrations/integration-overview.md
@@ -11,33 +11,33 @@ The Run:ai community portal is password protected and access is provided to cust

## Integrations

| Tool | Category | Run:ai support details | Additional Links |
| ------------------ | ------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Spark | Orchestration | <div style="width: 300px;"> It is possible to schedule Spark workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. </div> | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI](https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI){target=_blank} |
| Kubeflow Pipelines | Orchestration | It is possible to schedule Kubeflow Pipelines with the Run:ai scheduler. For details, please contact Run:ai customer support. | Run:ai customer success community portal:<br>[https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Apache Airflow | Orchestration | It is possible to schedule Airflow workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow){target=_blank} |
| Argo Workflows | Orchestration | It is possible to schedule Argo workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows){target=_blank} |
| Seldon Core | Orchestration | It is possible to schedule Seldon Core workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core){target=_blank} |
| Triton | Orchestration | Usage via Docker base image | Quickstart inference [example](../../Researcher/Walkthroughs/quickstart-inference.md) |
| Jupyter Notebook | Development | Run:ai provides integrated support with Jupyter Notebooks | Quickstart example: [https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/](https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/) |
| Jupyter Hub | Development | It is possible to submit Run:ai workloads via JupyterHub. For more information, please contact Run:ai customer support. | |
| PyCharm | Development | Containers created by Run:ai can be accessed via PyCharm | PyCharm [example](../../Researcher/tools/dev-pycharm.md) |
| VS Code | Development | - Containers created by Run:ai can be accessed via Visual Studio Code.<br>- You can automatically launch Visual Studio Code Web from the Run:ai console | VS Code [example](../../Researcher/tools/dev-vscode.md) and VS Code quickstart [example](../../Researcher/Walkthroughs/quickstart-vscode.md) |
| Kubeflow Notebooks | Development | It is possible to launch a Kubeflow notebook with the Run:ai scheduler. For details, please contact Run:ai customer support. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Ray | Training, inference, data processing | It is possible to schedule Ray jobs with the Run:ai scheduler. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray](https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray){target=_blank} |
| TensorBoard | Experiment tracking | Run:ai comes with a preset TensorBoard [Environment](../workloads/assets/environments.md) asset. | TensorBoard [example](../../Researcher/tools/dev-tensorboard.md). <br> Additional [sample](https://github.com/run-ai/use-cases/tree/master/runai_tensorboard_demo_with_resnet){target=_blank} |
| Weights & Biases | Experiment tracking | It is possible to schedule W&B workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases](https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases){target=_blank} <br> Additional samples [here](https://github.com/run-ai/use-cases/tree/master/runai_wandb){target=_blank} |
| ClearML | Experiment tracking | It is possible to schedule ClearML workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. | [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML){target=_blank} |
| MLflow | Model Serving | It is possible to use MLflow together with the Run:ai scheduler. For details, please contact Run:ai customer support. | Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow){target=_blank} <br> Additional MLflow [sample](https://github.com/run-ai/use-cases/tree/master/runai_mlflow_demo){target=_blank} |
| Hugging Face | Repositories | Run:ai provides an out-of-the-box integration with Hugging Face | |
| Docker Registry | Repositories | Run:ai allows using a Docker registry as a Credentials asset | Credentials [documentation](../workloads/assets/credentials.md) |
| S3 | Storage | Run:ai communicates with S3 by defining it as a data source asset | Data source [documentation](../workloads/assets/datasources.md) |
| GitHub | Storage | Run:ai communicates with GitHub by defining it as a data source asset | Data source [documentation](../workloads/assets/datasources.md) |
| TensorFlow | Training | Run:ai provides out-of-the-box support for submitting TensorFlow workloads | TensorFlow [via API](../../Researcher/cli-reference/new-cli/runai_tensorflow.md) or workload submission [via user interface](../../Researcher/workloads/trainings.md) |
| PyTorch | Training | Run:ai provides out-of-the-box support for submitting PyTorch workloads | PyTorch [via API](../../Researcher/cli-reference/new-cli/runai_pytorch.md) or workload submission [via user interface](../../Researcher/workloads/trainings.md) |
| [Kubeflow MPI](https://www.kubeflow.org/docs/components/training/user-guides/mpi/){target=_blank} | Training | Run:ai provides out-of-the-box support for submitting MPI workloads | MPI [via API](../../Researcher/cli-reference/new-cli/runai_mpi.md) or workload submission [via user interface](../../Researcher/workloads/trainings.md) |
| [XGBoost](https://xgboost.readthedocs.io/en/stable/){target=_blank} | Training | Run:ai provides out-of-the-box support for submitting XGBoost workloads | XGBoost [via API](../../Researcher/cli-reference/new-cli/runai_xgboost.md) or workload submission [via user interface](../../Researcher/workloads/trainings.md) |
| [Karpenter](https://karpenter.sh){target=_blank} | Cost Optimization | Run:ai provides out-of-the-box support for Karpenter to save cloud costs | Integration notes with Karpenter can be found [here](karpenter.md) |
| Tool | Category | Run:ai support details | Additional Information |
| ------------------ | ----------------| --------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Triton | Orchestration | Supported | Usage via Docker base image. Quickstart inference [example](../../Researcher/Walkthroughs/quickstart-inference.md) |
| Spark | Orchestration | | <div style="width: 300px;"> It is possible to schedule Spark workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. </div> Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI](https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI){target=_blank} |
| Kubeflow Pipelines | Orchestration | | It is possible to schedule Kubeflow Pipelines with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal:<br>[https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Apache Airflow | Orchestration | | It is possible to schedule Airflow workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow){target=_blank} |
| Argo Workflows | Orchestration | | It is possible to schedule Argo workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows){target=_blank} |
| Seldon Core | Orchestration | | It is possible to schedule Seldon Core workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core){target=_blank} |
| Jupyter Notebook | Development | Supported | Run:ai provides integrated support with Jupyter Notebooks. Quickstart example: [https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/](https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/) |
| Jupyter Hub | Development | | It is possible to submit Run:ai workloads via JupyterHub. For more information, please contact Run:ai customer support. |
| PyCharm | Development | Supported | Containers created by Run:ai can be accessed via PyCharm. PyCharm [example](../../Researcher/tools/dev-pycharm.md) |
| VS Code | Development | Supported | - Containers created by Run:ai can be accessed via Visual Studio Code. [example](../../Researcher/tools/dev-vscode.md)<br>- You can automatically launch Visual Studio Code Web from the Run:ai console. [example](../../Researcher/Walkthroughs/quickstart-vscode.md) |
| Kubeflow Notebooks | Development | | It is possible to launch a Kubeflow notebook with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Ray | Training, inference, data processing | | It is possible to schedule Ray jobs with the Run:ai scheduler. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray](https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray){target=_blank} |
| TensorBoard | Experiment tracking | Supported | Run:ai comes with a preset TensorBoard [Environment](../workloads/assets/environments.md) asset. TensorBoard [example](../../Researcher/tools/dev-tensorboard.md). <br> Additional [sample](https://github.com/run-ai/use-cases/tree/master/runai_tensorboard_demo_with_resnet){target=_blank} |
| Weights & Biases | Experiment tracking | | It is possible to schedule W&B workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases](https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases){target=_blank} <br> Additional samples [here](https://github.com/run-ai/use-cases/tree/master/runai_wandb){target=_blank} |
| ClearML | Experiment tracking | | It is possible to schedule ClearML workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML){target=_blank} |
| MLflow | Model Serving | | It is possible to use MLflow together with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow){target=_blank} <br> Additional MLflow [sample](https://github.com/run-ai/use-cases/tree/master/runai_mlflow_demo){target=_blank} |
| Hugging Face | Repositories | Supported | Run:ai provides an out-of-the-box integration with Hugging Face |
| Docker Registry | Repositories | Supported | Run:ai allows using a Docker registry as a [Credentials](../workloads/assets/credentials.md) asset. |
| S3 | Storage | Supported | Run:ai communicates with S3 by defining it as a [data source](../workloads/assets/datasources.md) asset. |
| GitHub | Storage | Supported | Run:ai communicates with GitHub by defining it as a [data source](../workloads/assets/datasources.md) asset. |
| TensorFlow | Training | Supported | Run:ai provides out-of-the-box support for submitting TensorFlow workloads [via API](../../Researcher/cli-reference/new-cli/runai_tensorflow.md) or by submitting workloads [via user interface](../../Researcher/workloads/trainings.md). |
| PyTorch | Training | Supported | Run:ai provides out-of-the-box support for submitting PyTorch workloads [via API](../../Researcher/cli-reference/new-cli/runai_pytorch.md) or by submitting workloads [via user interface](../../Researcher/workloads/trainings.md). |
| [Kubeflow MPI](https://www.kubeflow.org/docs/components/training/user-guides/mpi/){target=_blank} | Training | Supported | Run:ai provides out-of-the-box support for submitting MPI workloads [via API](../../Researcher/cli-reference/new-cli/runai_mpi.md) or by submitting workloads [via user interface](../../Researcher/workloads/trainings.md). |
| [XGBoost](https://xgboost.readthedocs.io/en/stable/){target=_blank} | Training | Supported | Run:ai provides out-of-the-box support for submitting XGBoost workloads [via API](../../Researcher/cli-reference/new-cli/runai_xgboost.md) or by submitting workloads [via user interface](../../Researcher/workloads/trainings.md). |
| [Karpenter](https://karpenter.sh){target=_blank} | Cost Optimization | Supported | Run:ai provides out-of-the-box support for Karpenter to save cloud costs. Integration notes with Karpenter can be found [here](karpenter.md). |
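
Several of the orchestration rows above (Spark, Apache Airflow, Argo Workflows, Ray) rely on the same underlying mechanism: the third-party tool creates Kubernetes pods, and those pods are handed to the Run:ai scheduler. The sketch below illustrates the idea with the official Kubernetes Python client. It is a minimal, illustrative example rather than the integration recipe from the linked articles: the scheduler name `runai-scheduler`, the `project` label value `team-a`, the `runai-team-a` namespace, and the container image are all assumptions to adapt to your own cluster.

```python
# Minimal sketch: submit a GPU pod to the Run:ai scheduler using the official
# Kubernetes Python client (pip install kubernetes). All names below are
# illustrative assumptions -- adjust them to match your cluster setup.
from kubernetes import client, config

config.load_kube_config()  # reads the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(
        name="demo-training-pod",
        labels={"project": "team-a"},  # assumed Run:ai project name
    ),
    spec=client.V1PodSpec(
        scheduler_name="runai-scheduler",  # hand scheduling to Run:ai
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="example.com/demo/train:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # request one GPU
                ),
            )
        ],
    ),
)

# "runai-team-a" assumes the namespace associated with the Run:ai project.
client.CoreV1Api().create_namespaced_pod(namespace="runai-team-a", body=pod)
```

Tools such as Airflow or Argo Workflows set the same fields (`schedulerName` and the project label) through their own pod templates; the community-portal articles linked in the table cover each tool's specific configuration.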

## Kubernetes Workloads Integration
