The langchain.chat_models.init_chat_model() function behaves essentially the same as using langchain-openai or langchain-deepseek directly.
(So the corresponding package must be installed beforehand, e.g. pip install langchain-openai.)
1. Example using the OpenAI interface
import os
os.environ["OPENAI_API_KEY"] = OPENAI_KEY  # OPENAI_KEY: your OpenAI API key string
from langchain.chat_models import init_chat_model
model = init_chat_model("gpt-4o-mini", model_provider="openai")
response = model.invoke("Hello, world!")
print(response.content)
Output: Hello! How can I assist you today?
2. Another format: specifying the provider and parameters directly in init_chat_model()
import os
os.environ["OPENAI_API_KEY"] = OPENAI_KEY
from langchain.chat_models import init_chat_model
gpt_4o_mini = init_chat_model("openai:gpt-4o-mini", temperature=0)
response = gpt_4o_mini.invoke("what's your name")
print(response.content)
Output: I’m called ChatGPT. How can I assist you today?
3. Example of specifying the model only in invoke()
↓ There is no need to specify "openai" here, because LangChain infers the provider on its own
import os
os.environ["OPENAI_API_KEY"] = OPENAI_KEY
from langchain.chat_models import init_chat_model
configurable_model = init_chat_model(temperature=0)
response = configurable_model.invoke(
    "what's your name",
    config={"configurable": {"model": "gpt-4o-mini"}}
)
print(response.content)
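As noted above, when no model is fixed in init_chat_model(), LangChain infers the provider from the model name supplied at runtime. A rough, hypothetical sketch of that idea (not LangChain's actual implementation) could look like this:

```python
# Hypothetical sketch of provider inference from a model name.
# LangChain's real logic lives inside init_chat_model; this only
# illustrates mapping well-known name prefixes to providers.

_PREFIX_TO_PROVIDER = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google_genai",
    "deepseek-": "deepseek",
}

def infer_provider(model: str) -> str:
    """Return a provider name for a model string like 'gpt-4o-mini'.

    An explicit 'provider:model' string always wins; otherwise fall
    back to prefix matching and raise if nothing matches.
    """
    if ":" in model:
        return model.split(":", 1)[0]
    for prefix, provider in _PREFIX_TO_PROVIDER.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"Cannot infer provider for {model!r}")

print(infer_provider("gpt-4o-mini"))                 # openai
print(infer_provider("openai:gpt-4o-mini"))          # openai
print(infer_provider("claude-3-5-sonnet-20240620"))  # anthropic
```

This is why `config={"configurable": {"model": "gpt-4o-mini"}}` alone is enough in the example above.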
4. Example of setting parameter defaults in init_chat_model() that invoke() can either use or override
# pip install langchain langchain-openai langchain-anthropic
from langchain.chat_models import init_chat_model
configurable_model_with_default = init_chat_model(
    "openai:gpt-4o",
    configurable_fields="any",  # this allows us to configure other params like temperature, max_tokens, etc. at runtime.
    config_prefix="foo",
    temperature=0
)
configurable_model_with_default.invoke("what's your name")
# GPT-4o response with temperature 0
configurable_model_with_default.invoke(
    "what's your name",
    config={
        "configurable": {
            "foo_model": "anthropic:claude-3-5-sonnet-20240620",
            "foo_temperature": 0.6
        }
    }
)
# Claude-3.5 sonnet response with temperature 0.6
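The config_prefix above namespaces the runtime keys: with config_prefix="foo", the runtime config keys become foo_model, foo_temperature, and so on. A hypothetical sketch of that key mapping (again, not LangChain's actual code):

```python
# Hypothetical sketch of how a config_prefix namespaces runtime keys.
# With config_prefix="foo", only keys starting with "foo_" are picked
# up, and the prefix is stripped before the value is applied.

def extract_prefixed_params(configurable: dict, prefix: str) -> dict:
    marker = f"{prefix}_" if prefix else ""
    return {
        key[len(marker):]: value
        for key, value in configurable.items()
        if key.startswith(marker)
    }

runtime_config = {
    "foo_model": "anthropic:claude-3-5-sonnet-20240620",
    "foo_temperature": 0.6,
    "bar_model": "ignored",  # different prefix, untouched
}
print(extract_prefixed_params(runtime_config, "foo"))
# {'model': 'anthropic:claude-3-5-sonnet-20240620', 'temperature': 0.6}
```

Prefixing is useful when several configurable models share one config dict and their keys must not collide.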
5. Example with tool calling
# pip install langchain langchain-openai langchain-anthropic
from langchain.chat_models import init_chat_model
from pydantic import BaseModel, Field
class GetWeather(BaseModel):
    '''Get the current weather in a given location'''
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

class GetPopulation(BaseModel):
    '''Get the current population in a given location'''
    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")
configurable_model = init_chat_model(
    "gpt-4o",
    configurable_fields=("model", "model_provider"),
    temperature=0
)
configurable_model_with_tools = configurable_model.bind_tools([GetWeather, GetPopulation])
configurable_model_with_tools.invoke(
    "Which city is hotter today and which is bigger: LA or NY?"
)
# GPT-4o response with tool calls
configurable_model_with_tools.invoke(
    "Which city is hotter today and which is bigger: LA or NY?",
    config={"configurable": {"model": "claude-3-5-sonnet-20240620"}}
)
# Claude-3.5 sonnet response with tools
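Under the hood, bind_tools converts each Pydantic class into a JSON-schema tool definition that is sent to the provider. For GetWeather, the payload looks roughly like the following (hand-written, simplified illustration in the OpenAI function-calling format, not the exact dict LangChain produces):

```python
# Illustrative, hand-written approximation of the tool definition that
# bind_tools derives from the GetWeather Pydantic class above
# (OpenAI-style function-calling format; simplified).

get_weather_tool = {
    "type": "function",
    "function": {
        "name": "GetWeather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
            },
            "required": ["location"],
        },
    },
}

print(get_weather_tool["function"]["name"])  # GetWeather
```

The class docstring becomes the tool description and each Field description annotates its parameter, which is why both are worth writing carefully.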
6. References for advanced usage
Introduction to LangChain v0.3: https://python.langchain.com/docs/introduction/
init_chat_model function documentation: https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html
There you can look up each model's corresponding parameter values, and it also includes two best-practice examples of using this interface.
langchain_openai.chat_models.base.ChatOpenAI documentation: https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html
This page covers more specific usage tips.
The full list of providers is available here: https://python.langchain.com/docs/integrations/providers/