Merge pull request #19 from RedHatQuickCourses/lab_updates
Lab updates
kknoxrht authored Nov 11, 2024
2 parents fb0f95b + ad6b6cc commit 79a1858
Showing 34 changed files with 729 additions and 214 deletions.
Binary file modified .DS_Store
5 changes: 5 additions & 0 deletions .gitignore
@@ -106,3 +106,8 @@ dist

# Local VSCode settings
.vscode/settings*
.DS_Store
modules/.DS_Store
modules/chapter2/.DS_Store
modules/chapter3/.DS_Store
modules/ROOT/.DS_Store
Binary file modified modules/.DS_Store
Binary file modified modules/ROOT/.DS_Store
Binary file added modules/ROOT/images/demo_platform_catalog.gif
39 changes: 22 additions & 17 deletions modules/ROOT/pages/index.adoc
@@ -15,7 +15,7 @@ There will be some challenges along the way, all designed to teach us about a co
If you're ready, let’s get started!


- IMPORTANT: The hands-on labs in this course were created and tested with RHOAI v2.10 & later versions. Labs will work without any changes in minor dot release upgrades of the product. https://github.com/RedHatQuickCourses/llm-on-rhoai[Please open issues in this repository if you face any problem.]
+ IMPORTANT: The hands-on labs in this course were updated to and tested with RHOAI v2.14. Labs should work without any changes in minor dot release upgrades of the product. https://github.com/RedHatQuickCourses/llm-on-rhoai[Please open issues in this repository if you face any problem], or feel free to send me a note on Slack.


== Authors
@@ -34,26 +34,31 @@ The PTL team acknowledges the valuable contributions of the following Red Hat as

== Classroom Environment

- We will use the https://demo.redhat.com/catalog?item=babylon-catalog-prod%2Fopenshift-cnv.ocpmulti-wksp-cnv.prod[*Red Hat OpenShift Container Platform Cluster*] catalog item in the Red Hat Demo Platform (RHDP) to run the hands-on exercises in this course.
+ We will use the https://catalog.demo.redhat.com/catalog?search=openshift+on+aws&item=babylon-catalog-prod%2Fsandboxes-gpte.ocp-wksp.prod[*Red Hat OpenShift Container Platform Cluster (AWS)*] catalog item in the Red Hat Demo Platform (RHDP) to run the hands-on exercises in this course.

[TIP]
- If you are planning on starting this course now, go ahead & launch the workshop. It takes ~15 minutes to provision it, which is just enough time to finish the introduction section.
+ If you are planning on starting this course now, go ahead and launch the lab environment. It takes ~45 minutes to provision, which is enough time to read through and understand the concepts on a first pass, then follow along hands-on on a second pass.

- video::demohub_resources_v4.mp4[width=640]
+ .Animated - Walkthrough of Demo Hub order selections.
+ image::demo_platform_catalog.gif[width=640]

When ordering this catalog item in RHDP:

- . Select Practice/Enablement for the Activity field
- . Select Learning about the Product for the Purpose field
- . Leave the Salesforce ID field blank
- . Scroll to the bottom, read the usage cost section, then check the box to confirm acceptance of terms and conditions
- . Click order

+ . Select the *Red Hat OpenShift Container Platform Cluster (AWS)* catalog item.
+ . Select *order* from the pop-up lab menu page.
+ . Select *Practice/Enablement* for the Activity field.
+ . Select *Learning about the Product* for the Purpose field.
+ . Leave the Salesforce ID field *blank*.
+ . Scroll to the bottom, read the usage cost section, then *check the box* to confirm acceptance of the terms and conditions.
+ . Click *order*.

- For Red Hat partners who do not have access to RHDP, provision an environment using the Red Hat Hybrid Cloud Console. Unfortunately, the labs will NOT work on the trial sandbox environment. You need to provision an OpenShift AI cluster on-premises, or in the supported cloud environments by following the product documentation at https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.10/html/installing_and_uninstalling_openshift_ai_self-managed/index[Product Documentation for installing Red Hat OpenShift AI 2.10].
+ For Red Hat partners who do not have access to RHDP, provision an environment using the Red Hat Hybrid Cloud Console. Unfortunately, the labs will NOT work on the trial sandbox environment. You need to provision an OpenShift AI cluster on-premises, or in one of the supported cloud environments, by following the https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.14/html/installing_and_uninstalling_openshift_ai_self-managed/index[Product Documentation for installing Red Hat OpenShift AI 2.14].

== Prerequisites

@@ -65,16 +70,16 @@ For Red Hat partners who do not have access to RHDP, provision an environment us

The overall objectives of this course include:

- * Utilize Red Hat OpenShift AI to serve & interact with an LLM
+ * Utilize Red Hat OpenShift AI to serve & interact with an LLM.

- * Install Red Hat OpenShift AI operators & dependencies
+ * Install Red Hat OpenShift AI operators & dependencies.

- * Add a custom model serving runtime
+ * Add a custom model serving runtime.

- * Create a data science project, workbench & data connections
+ * Create a data science project, workbench & data connections.

- * Load an LLM model into the Ollama runtime framework
+ * Load an LLM model into the Ollama runtime framework.

- * Import (from git repositories), interact with LLM model via Jupyter Notebooks
+ * Import (from git repositories) and interact with the LLM model via Jupyter Notebooks.

- * Experiment with the Mistral LLM and Llama3 large language models
+ * Experiment with the Mistral LLM and Llama3 large language models.
Binary file modified modules/chapter2/.DS_Store
Binary file added modules/chapter2/images/authorino_install.gif
Binary file added modules/chapter2/images/data_science_project.gif
Binary file added modules/chapter2/images/dsc_install_214.gif
Binary file added modules/chapter2/images/dsp_create_214.gif
Binary file added modules/chapter2/images/serverless_operator.gif
Binary file added modules/chapter2/images/servicemesh_install.gif
6 changes: 3 additions & 3 deletions modules/chapter2/nav.adoc
@@ -1,4 +1,4 @@
* xref:index.adoc[]
- ** xref:section1.adoc[]
- ** xref:section2.adoc[]
- ** xref:section3.adoc[]
+ ** xref:rhoai_install_guide.adoc[]
+ //** xref:section2.adoc[]
+ // ** xref:section3.adoc[]
2 changes: 1 addition & 1 deletion modules/chapter2/pages/index.adoc
@@ -7,7 +7,7 @@ OpenShift AI is supported in two configurations:
For information about OpenShift AI on a Red Hat managed environment, see https://access.redhat.com/documentation/en-us/red_hat_openshift_ai_cloud_service/1[Product Documentation for Red Hat OpenShift AI Cloud Service].

* Self-managed software that you can install on-premise or on the public cloud in a self-managed environment, such as *OpenShift Container Platform*.
- For information about OpenShift AI as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.10[Product Documentation for Red Hat OpenShift AI Self-Managed 2.10].
+ For information about OpenShift AI as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see https://docs.redhat.com/en/documentation/red_hat_openshift_ai_self-managed/2.14[Product Documentation for Red Hat OpenShift AI Self-Managed 2.14].

In this course we cover installation of *Red Hat OpenShift AI self-managed* using the OpenShift Web Console.

171 changes: 171 additions & 0 deletions modules/chapter2/pages/rhoai_install_guide.adoc
@@ -0,0 +1,171 @@
= OpenShift AI Using the Web Console

*Red{nbsp}Hat OpenShift AI* is available as an operator via the OpenShift Operator Hub. You will install the *Red{nbsp}Hat OpenShift AI operator* and dependencies using the OpenShift web console in this section.

== Lab Exercise: Installation of Red Hat OpenShift AI

This section covers the process for installing the dependent operators using the OpenShift Web Console (~15 minutes).

IMPORTANT: The installation requires a user with the _cluster-admin_ role.

This exercise uses the Red Hat Demo Platform, specifically the *Red Hat OpenShift Container Platform Cluster (AWS)* resource. If you haven't already, launch the lab environment before continuing.

. Log in to Red Hat OpenShift as a user who has the _cluster-admin_ role assigned.

. Navigate to **Operators** -> **OperatorHub** and search for each of the following operators individually.

NOTE: You do not have to wait for one operator to finish installing before starting the next. For this lab, you can skip the optional GPU operators because no accelerator is required.

* Red Hat OpenShift Serverless

* Red Hat OpenShift Service Mesh

* Red Hat Authorino (technical preview)

* GPU Support

** Node Feature Discovery Operator (optional)

** NVIDIA GPU Operator (optional)


=== Installation of Red Hat OpenShift Serverless Operator

The following section discusses installing the *Red{nbsp}Hat OpenShift Serverless* operator.

.Animated - Operator Hub installation of Serverless Operator in OpenShift
image::serverless_operator.gif[width=600]

1. Log in to Red{nbsp}Hat OpenShift as a user who has the _cluster-admin_ role assigned.

2. Navigate to **Operators** -> **OperatorHub** and search for *Red{nbsp}Hat OpenShift Serverless*.

3. Click on the *Red{nbsp}Hat OpenShift Serverless* operator. In the pop-up window, select the *stable* channel and the most recent version of the serverless operator. Click on **Install** to open the operator's installation view.


4. In the `Install Operator` page, select the default values for all the fields and click *Install*.


5. A window showing the installation progress will pop up.

6. When the installation finishes, the operator is ready to be used by *Red{nbsp}Hat OpenShift AI*.


*Red{nbsp}Hat OpenShift Serverless* is now successfully installed.
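
For readers who prefer the CLI, the console flow above can also be expressed as OLM manifests. The following is only a minimal sketch: it assumes the Serverless operator is published as the `serverless-operator` package in the `stable` channel of the `redhat-operators` catalog, so verify the package name and channel in your cluster's OperatorHub before applying anything.

[source,yaml]
----
# Sketch of the console install above as OLM objects (package/channel names assumed).
apiVersion: v1
kind: Namespace
metadata:
  name: openshift-serverless
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: serverless-operators
  namespace: openshift-serverless
spec: {}          # empty spec: the operator watches all namespaces
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: serverless-operator
  namespace: openshift-serverless
spec:
  channel: stable                 # channel picked in the console pop-up
  name: serverless-operator       # assumed package name
  source: redhat-operators
  sourceNamespace: openshift-marketplace
----

Applied with `oc apply -f`, this should be equivalent to the console steps. The same Namespace/OperatorGroup/Subscription pattern covers the Service Mesh and Authorino operators in the next two sections, substituting their own package names and channels (for example, `tech-preview-v1` for Authorino).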

=== Installation of Red Hat OpenShift Service Mesh Operator

The following section discusses installing the *Red{nbsp}Hat OpenShift Service Mesh* operator.

.Animated - Operator Hub installation of Service Mesh Operator in OpenShift
image::servicemesh_install.gif[width=600]

1. Log in to Red{nbsp}Hat OpenShift as a user who has the _cluster-admin_ role assigned.

2. Navigate to **Operators** -> **OperatorHub** and search for *Red{nbsp}Hat OpenShift Service Mesh*.

3. Click on the *Red{nbsp}Hat OpenShift Service Mesh* operator. In the pop-up window, select the *stable* channel and the most recent version of the service mesh operator. Click on **Install** to open the operator's installation view.

4. In the `Install Operator` page, select the default values for all the fields and click *Install*.

5. A window showing the installation progress will pop up.

6. When the installation finishes, the operator is ready to be used by *Red{nbsp}Hat OpenShift AI*.

*Red{nbsp}Hat OpenShift Service Mesh* is now successfully installed.

=== Installation of Red Hat Authorino Operator

The following section discusses installing the *Red{nbsp}Hat - Authorino* operator.

.Animated - Operator Hub installation of Authorino (tech preview) Operator in OpenShift
image::authorino_install.gif[width=600]

1. Log in to Red{nbsp}Hat OpenShift as a user who has the _cluster-admin_ role assigned.

2. Navigate to **Operators** -> **OperatorHub** and search for *Red{nbsp}Hat Authorino*.

3. Click on the *Red{nbsp}Hat Authorino* operator. In the pop-up window, select the *tech-preview-v1* channel and the most recent version of the operator. Click on **Install** to open the operator's installation view.

4. In the `Install Operator` page, select the default values for all the fields and click *Install*.

5. A window showing the installation progress will pop up.

6. When the installation finishes, the operator is ready to be used by *Red{nbsp}Hat OpenShift AI*.

*Red{nbsp}Hat Authorino* is now successfully installed.


[TIP]
Installing these operators before the OpenShift AI Operator lets OpenShift AI detect their availability sooner and adjust its initial configuration, shifting management of these components to OpenShift AI.

== Installation of Red Hat OpenShift AI Operator

.Animated - Operator Hub installation of OpenShift AI Operator on OpenShift
image::openshiftai_install_214.gif[width=600]

. Navigate to **Operators** -> **OperatorHub** and search for *OpenShift AI*.


. Click on the `Red{nbsp}Hat OpenShift AI` operator. In the pop-up window that opens, select the latest version in the *fast* channel (any version equal to or greater than 2.14), and click on **Install** to open the operator's installation view.

. In the `Install Operator` page, leave all of the options as default and click on the *Install* button to start the installation.

. The operator installation progress window will pop up. The installation may take a couple of minutes.
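
The equivalent CLI install is sketched below for reference only. It assumes the self-managed OpenShift AI operator is published as the `rhods-operator` package and is installed into the `redhat-ods-operator` namespace; confirm both against the product documentation for your version before using it.

[source,yaml]
----
# Sketch of the console install above (package name and namespace assumed).
apiVersion: v1
kind: Namespace
metadata:
  name: redhat-ods-operator
---
apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: rhods-operator
  namespace: redhat-ods-operator
---
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: rhods-operator
  namespace: redhat-ods-operator
spec:
  channel: fast                   # fast channel, version 2.14 or later as noted above
  name: rhods-operator            # assumed package name
  source: redhat-operators
  sourceNamespace: openshift-marketplace
----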


== Create OpenShift AI Data Science Cluster

The next step is to create an OpenShift AI *Data Science Cluster (DSC)*.

_A DataScienceCluster is the deployment plan, in the form of a YAML outline, for the Data Science Cluster API. Manually editing the YAML configuration lets you adjust the settings of the OpenShift AI DSC._
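
As an illustration only, a minimal DataScienceCluster manifest might look like the sketch below. The API group and component names are assumptions based on recent OpenShift AI releases; the console's create form generates a similar default object, and you can edit its YAML there instead of applying a file by hand.

[source,yaml]
----
# Illustrative DataScienceCluster sketch (API version and component names assumed).
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc
spec:
  components:
    dashboard:
      managementState: Managed    # the OpenShift AI dashboard
    workbenches:
      managementState: Managed    # Jupyter workbenches
    kserve:
      managementState: Managed    # single-model serving; relies on Serverless and Service Mesh
    modelmeshserving:
      managementState: Managed    # multi-model serving
    datasciencepipelines:
      managementState: Removed    # components not needed for this lab can be set to Removed
    codeflare:
      managementState: Removed
----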

.Animated - Create Data Science Cluster to enable OpenShift AI on OpenShift
image::dsc_install_214.gif[width=600]

Return to the OpenShift navigation menu, select *Installed Operators*, and click the OpenShift AI Operator name to open the operator.

. *Select the option to create a Data Science Cluster.*

. *Click Create* to deploy the Data Science Cluster.


== OpenShift AI Install Summary

Congratulations, you have successfully completed the installation of OpenShift AI on an OpenShift Container Platform cluster. OpenShift AI is now running and available from its own dashboard!


* We installed the operators required by OpenShift AI:
** Red Hat OpenShift Serverless
** Red Hat OpenShift ServiceMesh
** Red Hat Authorino (technical preview)
** OpenShift AI Operator



== Create a Data Science Project

Navigate to the application launcher (the grid of squares) at the top right of the OCP dashboard, then select OpenShift AI. At the login screen, use the OCP admin credentials to log in to OpenShift AI.

.Animated - Create data science project from OpenShift AI dashboard
image::data_science_project.gif[width=600]

Explore the dashboard navigation menus to familiarize yourself with the options.

Navigate to & select the Data Science Projects section.

. Select the Create Data Science Project button.

. Enter a name for your project, such as *ollama-model* (this differs from the animated example).

. The resource name should be populated automatically.

. Optionally add a description to the data science project.

. Select Create.


Once complete, you should land on the "ollama-model" project page in the Data Science Projects section of the OpenShift AI dashboard.
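
For reference, a data science project is backed by an ordinary OpenShift project (namespace). The sketch below assumes the OpenShift AI dashboard lists namespaces that carry the `opendatahub.io/dashboard` label; in this lab, creating the project from the dashboard as described above is the simpler path.

[source,yaml]
----
# Sketch of the same project created as a labeled namespace (label/annotation names assumed).
apiVersion: v1
kind: Namespace
metadata:
  name: ollama-model
  labels:
    opendatahub.io/dashboard: "true"          # marks the namespace as a data science project
  annotations:
    openshift.io/display-name: ollama-model   # display name shown in the dashboard
----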