Yuan2.0
This notebook shows how to use the YUAN2 API in LangChain with langchain_community.chat_models.ChatYuan2.
Yuan2.0 is a new-generation fundamental large language model developed by IEIT System. We have published all three models: Yuan 2.0-102B, Yuan 2.0-51B, and Yuan 2.0-2B, and we provide relevant scripts for pretraining, fine-tuning, and inference services for other developers. Yuan2.0 is based on Yuan1.0 and utilizes a wider range of high-quality pre-training data and instruction fine-tuning datasets to enhance the model's understanding of semantics, mathematics, reasoning, code, knowledge, and other aspects.
Getting started
Installation
Yuan2.0 provides an OpenAI-compatible API, and ChatYuan2 is integrated into LangChain as a chat model by using the OpenAI client. Therefore, ensure the openai package is installed in your Python environment. Run the following command:
%pip install --upgrade --quiet openai
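As an optional sanity check, you can confirm which version of the client was installed:
import openai

# Print the installed openai client version as a quick sanity check.
print(openai.__version__)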
Importing the Required Modules
After installation, import the necessary modules into your Python script:
from langchain_community.chat_models import ChatYuan2
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
Setting Up Your API server
Set up your OpenAI-compatible API server by following yuan2 openai api server.
If you deployed the API server locally, you can simply set yuan2_api_key="EMPTY" or any other value you want.
Just make sure the yuan2_api_base is set correctly.
yuan2_api_key = "your_api_key"
yuan2_api_base = "http://127.0.0.1:8001/v1"
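Optionally, you can verify the server is reachable before wiring it into LangChain by querying it with the plain OpenAI client. This is a minimal sketch that assumes the server above is running and exposes the standard /v1/models endpoint:
from openai import OpenAI

# Point the stock OpenAI client at the locally deployed Yuan2.0 server.
client = OpenAI(api_key=yuan2_api_key, base_url=yuan2_api_base)

# List the models served by the endpoint; a local deployment typically reports "yuan2".
for model in client.models.list().data:
    print(model.id)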
Initialize the ChatYuan2 Model
Here's how to initialize the chat model:
chat = ChatYuan2(
    yuan2_api_base=yuan2_api_base,
    yuan2_api_key=yuan2_api_key,
    temperature=1.0,
    model_name="yuan2",
    max_retries=3,
    streaming=False,
)
Basic Usage
Invoke the model with system and human messages like this:
messages = [
    # "You are an AI assistant."
    SystemMessage(content="你是一个人工智能助手。"),
    # "Hello, who are you?"
    HumanMessage(content="你好,你是谁?"),
]
print(chat.invoke(messages))
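Because ChatYuan2 is a regular LangChain chat model, it can also be composed with prompt templates via the LCEL pipe operator. The sketch below is illustrative; the prompt text and question are placeholders:
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are an AI assistant."),
        ("human", "{question}"),
    ]
)

# Pipe the prompt into the chat model to form a runnable chain.
chain = prompt | chat

response = chain.invoke({"question": "What is the capital of China?"})
print(response.content)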
Basic Usage with streaming
For token-by-token output, enable streaming and attach a callback handler:
from langchain_core.callbacks import StreamingStdOutCallbackHandler
chat = ChatYuan2(
    yuan2_api_base=yuan2_api_base,
    yuan2_api_key=yuan2_api_key,
    temperature=1.0,
    model_name="yuan2",
    max_retries=3,
    streaming=True,
    # Print tokens to stdout as they are generated.
    callbacks=[StreamingStdOutCallbackHandler()],
)
messages = [
    # "You are a travel assistant."
    SystemMessage(content="你是个旅游小助手。"),
    # "Tell me about fun things to do in Beijing."
    HumanMessage(content="给我介绍一下北京有哪些好玩的。"),
]
chat.invoke(messages)
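If you prefer to consume the stream yourself rather than rely on a callback handler, the model also exposes the standard stream method. A minimal sketch (the callback handler above is not required for this):
# Iterate over streamed chunks and print them as they arrive.
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)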