Integrate TaskingAI with LangChain

May 10, 2024

LangChain is a comprehensive framework designed for developers to create and deploy sophisticated language-based AI applications. By integrating TaskingAI into LangChain, developers can harness a broad spectrum of AI models from various providers via a unified API that also supports OpenAI-standard responses. This integration enriches LangChain's capabilities, making it an ideal solution for developing advanced AI-driven applications.

TaskingAI is a developer-friendly cloud platform for building and running LLM agents for AI-native applications. Its OpenAI-compatible API makes it straightforward to integrate TaskingAI with existing frameworks such as LangChain, so developers can tap into its AI capabilities without changing their tooling.

Here's a step-by-step tutorial for accessing TaskingAI services using OpenAI-compatible APIs for both models and assistants.

Setting up the Environment

First, install the langchain-openai module through pip. This step is only needed if the module isn't already installed; you can check by importing it in your Python environment (a quick check is shown after the install command). To install, run:

pip install langchain-openai
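If you're not sure whether the module is already present, a quick check like the following works; this is just a convenience sketch using Python's standard importlib, not a required step:

import importlib.util

# find_spec returns None when langchain-openai is not installed
if importlib.util.find_spec("langchain_openai") is None:
    print("langchain-openai is missing; run: pip install langchain-openai")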

Once completed, set up authentication as required by the langchain-openai module. This involves assigning your TaskingAI API key to the OPENAI_API_KEY environment variable. You may choose to do this step before the installation, if preferred. Here's the code to do it:

import os
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "YOUR_TASKINGAI_API_KEY"
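If you'd rather not hardcode the key in source, a common pattern is to prompt for it only when the variable isn't already set; this is an optional sketch, not something TaskingAI requires:

import os
from getpass import getpass

# Prompt for the TaskingAI API key only if it hasn't been set already
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("Enter your TaskingAI API key: ")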

Integrating a TaskingAI Model

Configure a LangChain chat model to use TaskingAI by setting the model ID and pointing base_url at TaskingAI's OpenAI-compatible endpoint. This lets the LangChain client talk directly to TaskingAI and its diverse range of AI models.

llm = ChatOpenAI(
    model="YOUR_TASKINGAI_MODEL_ID",
    base_url="https://oapi.tasking.ai/v1"
)

response = llm.invoke("How can TaskingAI help us build agents?")
print(response)

This integration demonstrates how a TaskingAI model can be utilized within the LangChain framework, enabling sophisticated interactions with the AI.
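Because the client behaves like any other LangChain chat model, it also composes with the rest of the framework. Here's a minimal sketch that reuses the llm object above in a simple prompt-to-string chain; the prompt wording is just an illustration:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Compose the TaskingAI-backed model into a prompt | model | parser chain
prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {topic}")
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"topic": "how TaskingAI models plug into LangChain"}))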

Integrating a TaskingAI Agent

To integrate a TaskingAI agent with LangChain, set up the client the same way, but pass your assistant ID in place of a model ID. The agent itself is configured on the TaskingAI platform, where you can attach tools and retrieval mechanisms to enhance its capabilities.

agent = ChatOpenAI(
    model="YOUR_TASKINGAI_ASSISTANT_ID",
    base_url="https://oapi.tasking.ai/v1"
)

response = agent.invoke("How can TaskingAI help us build agents?")
print(response)

This snippet illustrates how a TaskingAI agent can manage complex queries and utilize integrated tools and data within the LangChain framework, providing not just answers but comprehensive solutions.
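Since the assistant is reached through the same OpenAI-compatible interface, LangChain's standard streaming call can be used as well. A small sketch, assuming the agent object above and that the endpoint streams like a regular chat-completions API:

# Stream the assistant's reply chunk by chunk instead of waiting for the full message
for chunk in agent.stream("How can TaskingAI help us build agents?"):
    print(chunk.content, end="", flush=True)
print()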

Conclusion

Integrating TaskingAI with LangChain empowers developers to create more robust and versatile language-based applications. For detailed information about TaskingAI, please refer to the documentation.
