Composer AI and Building Productivity Apps

What is Composer AI?

Composer AI is a services-led engagement designed to accelerate the adoption of generative AI.

It’s a full-stack AI deployment for OpenShift that includes all the components needed for a generative AI solution: the required operators, data sources, LLMs, and a custom application for querying the LLM.

Architecture

Composer AI Architecture

The above is a high-level architecture diagram of what’s included in Composer AI. The major components are the Conductor API, the Composer AI application (the custom application for querying the LLM), a vLLM-served Granite model running on Red Hat OpenShift AI, an Elasticsearch vector database, and OpenShift GitOps (Argo CD), which deploys and manages everything.

Today’s Lab

As mentioned above, when Composer AI is installed using the default GitOps approach or provisioned through the demo.redhat.com catalog, the cluster comes with an instance of vLLM serving a Granite model and an Elasticsearch vector database preinstalled.
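As a point of reference, vLLM exposes an OpenAI-compatible REST API, so the preinstalled Granite model can be queried directly over HTTP. The sketch below is a minimal example; the route URL and model name are placeholders to replace with the values from your own cluster.

    import requests

    # Placeholder route to the vLLM inference service; replace with the route
    # exposed on your cluster (for example, from the OpenShift AI dashboard).
    VLLM_URL = "https://vllm-composer.apps.example.com/v1/chat/completions"
    MODEL = "granite-3-8b-instruct"  # assumed model name; check your deployment

    payload = {
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "What does OpenShift GitOps do?"}
        ],
        "max_tokens": 256,
    }

    # vLLM follows the standard OpenAI chat-completions schema, so the response
    # body contains a list of choices holding the generated message.
    response = requests.post(VLLM_URL, json=payload, timeout=60)
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])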

For this lab, however, only the Conductor API and the Composer AI application are included on the provisioned cluster. This allows you to install your own instance of vLLM using Red Hat OpenShift AI, along with your own Elasticsearch database. You will also create new assistants in Composer AI and ingest and use data from Red Hat documentation, to more closely mimic what you would be doing in a customer environment.
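To give a feel for what the ingestion step involves, here is a minimal sketch that indexes a documentation chunk, together with its embedding, into an Elasticsearch dense_vector index. The connection details, index name, and embedding model are illustrative assumptions, not the lab’s actual configuration.

    from elasticsearch import Elasticsearch
    from sentence_transformers import SentenceTransformer

    # Illustrative connection details; point this at the Elasticsearch
    # instance you deploy during the lab.
    es = Elasticsearch(
        "https://elasticsearch.example.com:9200",
        basic_auth=("elastic", "changeme"),
        verify_certs=False,
    )

    INDEX = "redhat-docs"  # assumed index name for this sketch
    embedder = SentenceTransformer("all-MiniLM-L6-v2")  # 384-dimensional embeddings

    # Create an index whose mapping stores the raw text alongside a dense_vector
    # field that the assistant can later use for similarity search.
    if not es.indices.exists(index=INDEX):
        es.indices.create(
            index=INDEX,
            mappings={
                "properties": {
                    "text": {"type": "text"},
                    "embedding": {"type": "dense_vector", "dims": 384},
                }
            },
        )

    chunk = "OpenShift GitOps uses Argo CD to keep cluster state in sync with Git."
    es.index(index=INDEX, document={"text": chunk, "embedding": embedder.encode(chunk).tolist()})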

Here are some important links to help you on your journey:

Important Credentials

The cluster admin user can be used to access all of the resources on the cluster, including the ArgoCD instance and the OpenShift AI dashboard.

Cluster admin user: username

Cluster admin password: password