
Commit a125f64

OSDOCS-13287: adds RHOAI to MicroShift
1 parent a8f1d57 commit a125f64

File tree

9 files changed: +108 -13 lines changed

_attributes/attributes-microshift.adoc

+2

@@ -6,6 +6,8 @@
 :OCP: OpenShift Container Platform
 :ocp-version: 4.18
 :oc-first: pass:quotes[OpenShift CLI (`oc`)]
+:rhoai-full: Red{nbsp}Hat OpenShift AI
+:rhoai: RHOAI
 //OpenShift Kubernetes Engine
 :oke: OpenShift Kubernetes Engine
 :product-title-first: Red Hat build of MicroShift (MicroShift)

_topic_maps/_topic_map_ms.yml

+20 -13

@@ -99,19 +99,6 @@ Topics:
 - Name: Listing update package contents
   File: microshift-list-update-contents
 ---
-Name: Support
-Dir: microshift_support
-Distros: microshift
-Topics:
-- Name: The etcd service
-  File: microshift-etcd
-- Name: The sos report tool
-  File: microshift-sos-report
-- Name: Getting your cluster ID
-  File: microshift-getting-cluster-id
-- Name: Getting support
-  File: microshift-getting-support
----
 Name: Configuring
 Dir: microshift_configuring
 Distros: microshift
@@ -198,6 +185,13 @@ Topics:
 - Name: Understanding storage migration
   File: microshift-storage-migration
 ---
+Name: Using AI models
+Dir: microshift_ai
+Distros: microshift
+Topics:
+- Name: Using artificial intelligence with MicroShift
+  File: microshift-rhoai
+---
 Name: Running applications
 Dir: microshift_running_apps
 Distros: microshift
@@ -239,6 +233,19 @@ Topics:
 - Name: Automated recovery from manual backups
   File: microshift-auto-recover-manual-backup
 ---
+Name: Support
+Dir: microshift_support
+Distros: microshift
+Topics:
+- Name: The etcd service
+  File: microshift-etcd
+- Name: The sos report tool
+  File: microshift-sos-report
+- Name: Getting your cluster ID
+  File: microshift-getting-cluster-id
+- Name: Getting support
+  File: microshift-getting-support
+---
 Name: Troubleshooting
 Dir: microshift_troubleshooting
 Distros: microshift

microshift_ai/_attributes

+1

@@ -0,0 +1 @@
../_attributes/

microshift_ai/images

+1

@@ -0,0 +1 @@
../images/

microshift_ai/microshift-rhoai.adoc

+16

@@ -0,0 +1,16 @@
:_mod-docs-content-type: ASSEMBLY
[id="microshift-rh-openshift-ai"]
include::_attributes/attributes-microshift.adoc[]
= Using Red Hat OpenShift AI with {microshift-short}
:context: microshift-rh-openshift-ai

toc::[]

Learn how to serve artificial intelligence (AI) models on your {microshift-short} edge deployments in a lightweight manner.

include::modules/microshift-rhoai-con.adoc[leveloffset=+1]

include::modules/microshift-rhoai-install.adoc[leveloffset=+1]
//hmm, do we want to include again in the install books?
microshift_ai/modules

+1

@@ -0,0 +1 @@
../modules/

microshift_ai/snippets

+1

@@ -0,0 +1 @@
../snippets/

modules/microshift-rhoai-con.adoc

+12

@@ -0,0 +1,12 @@
// Module included in the following assemblies:
//
// * microshift_ai/microshift-rhoai.adoc

:_mod-docs-content-type: CONCEPT
[id="microshift-rhoai-con_{context}"]
= How {rhoai-full} works in {microshift-short}

You can train an artificial intelligence (AI) model in the cloud or data center on {OCP}, then run your model in your edge deployments on {microshift-short}.

{rhoai-full} ({rhoai}) is a platform for data scientists and developers of AI and machine learning (AI/ML) applications. {rhoai} provides an environment to develop, train, serve, test, and monitor AI/ML models and applications on-premise or in the cloud.

modules/microshift-rhoai-install.adoc

+54

@@ -0,0 +1,54 @@
// Module included in the following assemblies:
//
// * microshift_ai/microshift-rhoai.adoc

:_mod-docs-content-type: PROCEDURE
[id="microshift-rhoai-install_{context}"]
= Installing the {rhoai-full} RPM

To use AI models in {microshift-short} deployments, use this procedure to install the {rhoai-full} ({rhoai}) RPM on a new {microshift-short} installation. You can also install the RPM on an existing {microshift-short} instance if you restart the system afterward.

.Prerequisites

* The system requirements for installing {microshift-short} have been met.
* You have root user access to your machine.
* You configured your LVM VG with the capacity needed for the PVs of your workload.
//Do we need to config the CSI?
* You added memory as required for your AI model.
//I assume there is some kind of memory requirement for using an AI model?

.Procedure
. Install the {microshift-short} AI-model-serving RPM package by running the following command:
+
[source,terminal]
----
$ sudo dnf install microshift-ai-model-serving
----
+
[TIP]
====
If you create your manifests and custom resources (CRs) while you are completing your installation of {microshift-short}, you can avoid restarting the service a second time to apply them.
====

. As a root user, restart the {microshift-short} service by entering the following command:
+
[source,terminal]
----
$ sudo systemctl restart microshift
----

//Q: Can we check the namespace to verify?
//Kserve manifest and ServingRuntimes CRs are deployed in a `redhat-ods-applications` namespace.
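
[Editorial note: the commented question above asks how to verify the installation. Assuming, per that comment, that the KServe manifests and `ServingRuntime` CRs are deployed into the `redhat-ods-applications` namespace, a check after the restart might look like the following sketch; this is not a documented verification step.]

```shell
# Sketch: list the ServingRuntime CRs applied by the RPM's manifests.
# The namespace is taken from the source comment above, not from tested output.
$ oc get servingruntimes -n redhat-ods-applications
```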

.Next steps

. Configure your hardware, the operating system, and additional components to use any accelerators your deployment requires.
. Create manifests that include the following resources:
* A `ServingRuntime` CR in your namespace.
* An `InferenceService` CR that references the `ServingRuntime` CR and a model.
* A `Route` CR.

// Kserve creates Deployment and other resources.

. After the resources from the previous step become ready, make HTTP or gRPC calls to the model server.
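
[Editorial note: for orientation, the three resources named in the next steps could be sketched as below. Every name, namespace, image, and model URI here is a hypothetical placeholder and is not taken from this commit; consult the KServe API reference for the exact fields your model server needs.]

```yaml
# Hypothetical example only: names, image, and storage URI are placeholders.
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: example-runtime        # placeholder runtime name
  namespace: ai-demo           # your workload namespace
spec:
  supportedModelFormats:
    - name: onnx
  containers:
    - name: kserve-container
      image: <model-server-image>          # placeholder image reference
---
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-model
  namespace: ai-demo
spec:
  predictor:
    model:
      modelFormat:
        name: onnx
      runtime: example-runtime             # references the ServingRuntime above
      storageUri: <model-storage-uri>      # placeholder model location
---
apiVersion: route.openshift.io/v1
kind: Route
metadata:
  name: example-model
  namespace: ai-demo
spec:
  to:
    kind: Service
    name: example-model-predictor          # hypothetical Service created by KServe
```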
