
Getting started with OCI Generative AI

For developers wanting to leverage OCI Generative AI, this tutorial demonstrates how to use LangChain for seamless integration. It includes a working Python example and covers the essential prerequisites.

Introduction

Oracle Cloud Infrastructure (OCI) Generative AI is a suite of services that lets you leverage powerful large language models (LLMs) for tasks like text generation, summarization, and more. This guide shows how to get started with OCI Generative AI using LangChain, which simplifies interaction with the underlying APIs.

Supported models

Available models may vary by region. Please refer to the official documentation for the most up-to-date list:

https://docs.oracle.com/en-us/iaas/Content/generative-ai/pretrained-models.htm

This page lists the pre-trained models currently supported by OCI Generative AI, such as Cohere and Meta Llama models. It is highly recommended to check this page to choose the appropriate model for your use case.

Example

OCI Generative AI's native API can be complex to work with directly. Integrating with LangChain abstracts away much of that complexity, letting you interact with the service in a more streamlined and intuitive way. The example below demonstrates this end to end.

Prerequisites

Before you begin, ensure you have the following prerequisites in place:

  1. OCI CLI installation and configuration: If you haven't already, install and configure the OCI CLI. Refer to this guide for instructions: link to our guide. This step is necessary to authenticate your requests to OCI.
  2. Install required pip packages: Install the langchain-community and oci Python packages using pip:
    ```bash
    pip install langchain-community oci
    ```
    langchain-community provides the necessary integrations with OCI Generative AI, while oci is the Oracle Cloud Infrastructure SDK for Python.
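Before moving on, you can confirm both packages are importable with a small stdlib-only check (the `missing_packages` helper below is our own illustration, not part of either library):

```python
import importlib.util

def missing_packages(names):
    """Return the package names from `names` that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Note: the pip package langchain-community is imported as langchain_community.
gaps = missing_packages(["langchain_community", "oci"])
print("All set!" if not gaps else f"Missing: {', '.join(gaps)}")
```

If anything is reported missing, re-run the pip command above before continuing.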

Check compartment-id

You'll need your compartment ID to authenticate your requests. Use the following OCI CLI command to list your compartments and identify the correct ID:

```bash
oci iam compartment list
```

A compartment is a logical container within your Oracle Cloud Infrastructure tenancy. It's used to organize and isolate your cloud resources. The output of this command will be a JSON list of compartments. Find the compartment you want to use and copy its id.
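The CLI wraps its results in a top-level `data` array of compartment objects. As a sketch of extracting name/id pairs from that output (the sample below is illustrative, trimmed to just the two fields we need):

```python
import json

# Illustrative sample of `oci iam compartment list` output,
# trimmed to the "name" and "id" fields.
raw = """
{
  "data": [
    {"name": "dev", "id": "ocid1.compartment.oc1..aaaa"},
    {"name": "prod", "id": "ocid1.compartment.oc1..bbbb"}
  ]
}
"""

# Map compartment names to their OCIDs for easy lookup.
compartments = {c["name"]: c["id"] for c in json.loads(raw)["data"]}
print(compartments["dev"])  # The OCID to paste into compartment_id below
```

If you prefer to stay in the shell, the OCI CLI also supports JMESPath filtering via its --query option.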

Code example

Here's a Python code example that uses LangChain to send a message to OCI Generative AI and stream the response:

```python
from langchain_community.chat_models import ChatOCIGenAI
from langchain_core.messages import HumanMessage

compartment_id = "<your-compartment-id>"  # Replace with your actual compartment ID

chat = ChatOCIGenAI(
    compartment_id=compartment_id,
    model_id="meta.llama-3.1-70b-instruct",  # Select a suitable model from the supported models list
    service_endpoint="https://inference.generativeai.<region>.oci.oraclecloud.com",  # Replace <region> with your region, e.g. us-chicago-1
    is_stream=True,  # Enable streaming for real-time output
    # model_kwargs={"temperature": 0, "max_tokens": 512},  # Optional parameters for the model
)

messages = [
    HumanMessage(content="What is the capital of Australia?"),
]

for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)  # Print the streamed content
```

Explanation of the code:

  • The ChatOCIGenAI class initializes the connection to OCI Generative AI.
  • model_id specifies the model to use. Make sure to choose a model appropriate for your task and available in your region.
  • The HumanMessage object contains the prompt.
  • The chat.stream(messages) method sends the message to the model and returns a generator that yields response chunks.
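To make the streaming loop concrete without calling the service, here is a stdlib-only mock: `fake_stream` and `Chunk` are our stand-ins, not part of LangChain, but the consumer loop assembles the pieces exactly the way the real one does:

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """Stand-in for the chunk objects yielded by chat.stream()."""
    content: str

def fake_stream(text, size=8):
    """Yield `text` in small pieces, mimicking a streamed model reply."""
    for i in range(0, len(text), size):
        yield Chunk(text[i:i + size])

reply = ""
for chunk in fake_stream("The capital of Australia is Canberra."):
    reply += chunk.content  # Same consumption pattern as the real loop
print(reply)
```

Each chunk carries only a fragment of the reply; printing (or concatenating) them in order reconstructs the full response as it arrives.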

Run the code

Save the code as oci-generative-ai.py and run it from your terminal:

```bash
python oci-generative-ai.py
```

You should see the following output (or similar):

> The capital of Australia is Canberra.

This confirms that you have successfully connected to OCI Generative AI using LangChain and received a response.