This will help you get started with ChatAbso chat models. For detailed documentation of all ChatAbso features and configurations, head to the API reference. You can also review the full Abso router documentation.

Overview

Integration details

| Class | Package | Serializable | JS support | Downloads | Version |
| :--- | :--- | :---: | :---: | :---: | :---: |
| ChatAbso | langchain-abso | | | PyPI - Downloads | PyPI - Version |

Setup

To access ChatAbso models, you’ll need to create an OpenAI account, get an API key, and install the langchain-abso integration package.

Credentials

Head to (TODO: link) to sign up for ChatAbso and generate an API key. Once you've done this, set the OPENAI_API_KEY environment variable:
import getpass
import os

if not os.getenv("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

Installation

The LangChain ChatAbso integration lives in the langchain-abso package:
pip install -qU langchain-abso

Instantiation

Now we can instantiate our model object and generate chat completions:
from langchain_abso import ChatAbso

llm = ChatAbso(fast_model="gpt-4.1", slow_model="o3-mini")

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
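The ("system", "...") and ("human", "...") tuples above are LangChain shorthand for full message objects; conceptually they correspond to the role/content dicts used by OpenAI-compatible APIs, with "human" and "ai" mapping to "user" and "assistant". A minimal sketch of that mapping (the `to_openai_messages` helper is hypothetical, for illustration only, and not part of langchain-abso):

```python
# Hypothetical helper showing how LangChain's (role, content) tuples
# correspond to OpenAI-style message dicts. Not part of langchain-abso.
def to_openai_messages(messages):
    # "human" and "ai" are LangChain aliases for "user" and "assistant".
    role_map = {"human": "user", "ai": "assistant", "system": "system"}
    return [
        {"role": role_map.get(role, role), "content": content}
        for role, content in messages
    ]

messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
print(to_openai_messages(messages))
```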

API reference

For detailed documentation of all ChatAbso features and configurations, head to the API reference: python.langchain.com/api_reference/en/latest/chat_models/langchain_abso.chat_models.ChatAbso.html