LangChain v0.3 tutorial: calling off-the-shelf LLM APIs for Q&A with langchain.chat_models.init_chat_model

诸神缄默不语 - index of my technical blog posts and videos

The langchain.chat_models.init_chat_model() function is effectively the same as using a provider package such as langchain-openai or langchain-deepseek directly.
(So you need to install the corresponding package in advance, e.g. pip install langchain-openai.)

1. Example using the OpenAI interface

import os

os.environ["OPENAI_API_KEY"] = OPENAI_KEY  # replace OPENAI_KEY with your own OpenAI API key

from langchain.chat_models import init_chat_model

model = init_chat_model("gpt-4o-mini", model_provider="openai")

response = model.invoke("Hello, world!")
print(response.content)

Output: Hello! How can I assist you today?

2. An example in another format, passing the arguments inside init_chat_model()

import os

os.environ["OPENAI_API_KEY"] = OPENAI_KEY  # replace OPENAI_KEY with your own OpenAI API key

from langchain.chat_models import init_chat_model

gpt_4o_mini = init_chat_model("openai:gpt-4o-mini", temperature=0)

response = gpt_4o_mini.invoke("what's your name")
print(response.content)

Output: I'm called ChatGPT. How can I assist you today?
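The "openai:gpt-4o-mini" form packs the provider and the model name into one colon-separated string. A minimal sketch of how such a string can be split into its two parts (an illustration only, not LangChain's actual parser):

```python
def split_model_string(model: str) -> tuple:
    """Split a "provider:model" string into (provider, model_name).

    Returns (None, model) when no provider prefix is present.
    Illustrative only -- not LangChain's internal implementation.
    """
    if ":" in model:
        provider, _, name = model.partition(":")
        return provider, name
    return None, model

print(split_model_string("openai:gpt-4o-mini"))  # ('openai', 'gpt-4o-mini')
print(split_model_string("gpt-4o-mini"))         # (None, 'gpt-4o-mini')
```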

3. An example where the model is only specified in invoke()

↓ Here you don't need to specify openai as the provider, because LangChain will infer it from the model name.

import os

os.environ["OPENAI_API_KEY"] = OPENAI_KEY  # replace OPENAI_KEY with your own OpenAI API key

from langchain.chat_models import init_chat_model

configurable_model = init_chat_model(temperature=0)

response = configurable_model.invoke(
    "what's your name",
    config={"configurable": {"model": "gpt-4o-mini"}}
)
print(response.content)
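The provider inference above works off well-known model-name prefixes (e.g. gpt-… implies openai, claude… implies anthropic). A rough sketch of the idea, using a hypothetical prefix table; LangChain's real lookup covers many more providers:

```python
# Hypothetical prefix table for illustration; not LangChain's actual mapping.
_PREFIX_TO_PROVIDER = {
    "gpt-": "openai",
    "claude": "anthropic",
    "gemini": "google_genai",
    "command": "cohere",
}

def infer_provider(model_name: str):
    """Guess the provider from the model name's prefix; None if unknown."""
    for prefix, provider in _PREFIX_TO_PROVIDER.items():
        if model_name.startswith(prefix):
            return provider
    return None

print(infer_provider("gpt-4o-mini"))                    # openai
print(infer_provider("claude-3-5-sonnet-20240620"))     # anthropic
```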

4. An example that sets parameter defaults in init_chat_model(), which invoke() can either use or override

# pip install langchain langchain-openai langchain-anthropic
from langchain.chat_models import init_chat_model

configurable_model_with_default = init_chat_model(
    "openai:gpt-4o",
    configurable_fields="any",  # this allows us to configure other params like temperature, max_tokens, etc at runtime.
    config_prefix="foo",
    temperature=0
)

configurable_model_with_default.invoke("what's your name")
# GPT-4o response with temperature 0

configurable_model_with_default.invoke(
    "what's your name",
    config={
        "configurable": {
            "foo_model": "anthropic:claude-3-5-sonnet-20240620",
            "foo_temperature": 0.6
        }
    }
)
# Claude-3.5 sonnet response with temperature 0.6
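With config_prefix="foo", the runtime keys are looked up as foo_model, foo_temperature, and so on, which keeps multiple configurable models in one chain from clashing. A sketch of the key-mapping idea (illustrative only, not LangChain internals):

```python
def extract_prefixed_config(configurable: dict, prefix: str) -> dict:
    """Pick out keys like 'foo_model' and strip the 'foo_' prefix."""
    marker = f"{prefix}_"
    return {
        key[len(marker):]: value
        for key, value in configurable.items()
        if key.startswith(marker)
    }

runtime_config = {
    "foo_model": "anthropic:claude-3-5-sonnet-20240620",
    "foo_temperature": 0.6,
    "bar_model": "gpt-4o",  # belongs to a different prefixed model, so it is ignored
}
print(extract_prefixed_config(runtime_config, "foo"))
# {'model': 'anthropic:claude-3-5-sonnet-20240620', 'temperature': 0.6}
```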

5. Example with tool calling

# pip install langchain langchain-openai langchain-anthropic
from langchain.chat_models import init_chat_model
from pydantic import BaseModel, Field

class GetWeather(BaseModel):
    '''Get the current weather in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

class GetPopulation(BaseModel):
    '''Get the current population in a given location'''

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

configurable_model = init_chat_model(
    "gpt-4o",
    configurable_fields=("model", "model_provider"),
    temperature=0
)

configurable_model_with_tools = configurable_model.bind_tools([GetWeather, GetPopulation])
configurable_model_with_tools.invoke(
    "Which city is hotter today and which is bigger: LA or NY?"
)
# GPT-4o response with tool calls

configurable_model_with_tools.invoke(
    "Which city is hotter today and which is bigger: LA or NY?",
    config={"configurable": {"model": "claude-3-5-sonnet-20240620"}}
)
# Claude-3.5 sonnet response with tools
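Note that the model does not run the tools itself: the reply carries structured tool calls (on the message's tool_calls attribute, each with a name and args) that your own code must execute. A minimal dispatch sketch over plain dicts in that shape; the two handlers here are hypothetical stand-ins for real weather/population lookups:

```python
# Hypothetical handlers standing in for real weather / population lookups.
def get_weather(location: str) -> str:
    return f"Sunny in {location}"

def get_population(location: str) -> int:
    return 1_000_000

# Map the tool name the model emits to the function that implements it.
TOOL_REGISTRY = {"GetWeather": get_weather, "GetPopulation": get_population}

def run_tool_calls(tool_calls: list) -> list:
    """Execute each requested tool call and collect the results."""
    return [TOOL_REGISTRY[call["name"]](**call["args"]) for call in tool_calls]

# Dicts mirroring the {"name": ..., "args": ...} shape of a model's tool calls.
calls = [
    {"name": "GetWeather", "args": {"location": "Los Angeles, CA"}},
    {"name": "GetPopulation", "args": {"location": "New York, NY"}},
]
print(run_tool_calls(calls))  # ['Sunny in Los Angeles, CA', 1000000]
```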

6. References for advanced usage

Introduction to LangChain v0.3: https://python.langchain.com/docs/introduction/

Documentation for the init_chat_model function: https://python.langchain.com/api_reference/langchain/chat_models/langchain.chat_models.base.init_chat_model.html
There you can look up the parameter values supported for each model, plus two best-practice examples of using this interface.

Documentation for langchain_openai.chat_models.base.ChatOpenAI: https://python.langchain.com/api_reference/openai/chat_models/langchain_openai.chat_models.base.ChatOpenAI.html
It covers more, and more specific, usage tips.

The full list of providers is available here: https://python.langchain.com/docs/integrations/providers/
