LangChain MCP Adapters - Making Anthropic MCP Tools Compatible with LangChain/LangGraph


I. About LangChain MCP Adapters

1. Project Overview

A lightweight adapter library that makes Anthropic Model Context Protocol (MCP) tools compatible with the LangChain and LangGraph ecosystems.



2. Features

  1. MCP tool conversion: converts MCP tools into standard LangChain tools that can be used directly in LangGraph agents

  2. Multi-server support: provides a client implementation that connects to multiple MCP servers at once and loads their tools


II. Installation

pip install langchain-mcp-adapters

III. Usage Examples

1. Basic Usage

# Install dependencies
pip install langchain-mcp-adapters langgraph langchain-openai

# Set environment variables
export OPENAI_API_KEY=<your_api_key>

Server implementation

Create an MCP server that performs basic arithmetic (here, addition and multiplication).

# math_server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b

if __name__ == "__main__":
    mcp.run(transport="stdio")
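The `@mcp.tool()` decorator registers each function, along with its name, type hints, and docstring, so the server can advertise it to clients. The `ToolRegistry` class below is a minimal stdlib sketch of that registration pattern, purely illustrative and not part of the MCP SDK:

```python
import inspect

class ToolRegistry:
    """Illustrative stand-in for FastMCP's tool registration (not the real API)."""
    def __init__(self):
        self.tools = {}

    def tool(self):
        def decorator(fn):
            # Record the function plus the metadata a client would see
            self.tools[fn.__name__] = {
                "fn": fn,
                "description": inspect.getdoc(fn),
                "signature": str(inspect.signature(fn)),
            }
            return fn
        return decorator

registry = ToolRegistry()

@registry.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

print(registry.tools["add"]["description"])  # Add two numbers
print(registry.tools["add"]["fn"](3, 5))     # 8
```

The real FastMCP additionally derives a JSON schema from the type hints, which is what the adapter later converts into a LangChain tool schema.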

Client usage
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent

server_params = StdioServerParameters(
    command="python",
    args=["/path/to/math_server.py"],
)

async with stdio_client(server_params) as (read, write):
    async with ClientSession(read, write) as session:
        await session.initialize()
        tools = await load_mcp_tools(session)
        agent = create_react_agent("openai:gpt-4.1", tools)
        agent_response = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})
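The snippet above uses `async with` at top level, which only works in an already-async context such as a notebook. In a plain script, wrap the body in an `async def main()` and drive it with `asyncio.run`. A minimal, self-contained sketch of that pattern, with a stub standing in for the stdio client and session:

```python
import asyncio
from contextlib import asynccontextmanager

@asynccontextmanager
async def fake_client_session():
    # Stand-in for stdio_client(...) / ClientSession(...) from the snippet above
    yield "session"

async def main():
    async with fake_client_session() as session:
        # ... initialize, load tools, and invoke the agent here ...
        return f"ran with {session}"

result = asyncio.run(main())
print(result)  # ran with session
```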

2. Connecting to Multiple Servers

Server setup
# weather_server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
async def get_weather(location: str) -> str:
    """Get the weather for the given location"""
    return "It's always sunny in New York"

if __name__ == "__main__":
    mcp.run(transport="sse")

Client usage
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async with MultiServerMCPClient(
    {
        "math": {
            "command": "python",
            "args": ["/path/to/math_server.py"],
            "transport": "stdio",
        },
        "weather": {
            "url": "http://localhost:8000/sse",
            "transport": "sse",
        }
    }
) as client:
    agent = create_react_agent("openai:gpt-4.1", client.get_tools())
    math_response = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})
    weather_response = await agent.ainvoke({"messages": "what is the weather in nyc?"})
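`ainvoke` returns the full graph state, and the model's final answer is the last entry in its `messages` list. A sketch of extracting it, using plain dicts as stand-ins for LangChain message objects (with real objects you would read `.content` as an attribute):

```python
# Hypothetical response shape; real entries are LangChain message objects
agent_response = {
    "messages": [
        {"role": "user", "content": "what's (3 + 5) x 12?"},
        {"role": "assistant", "content": "96"},
    ]
}

final_answer = agent_response["messages"][-1]["content"]
print(final_answer)  # 96
```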

3. Streamable HTTP Transport

MCP now supports a streamable HTTP transport.

Start the example server:

cd examples/servers/streamable-http-stateless/
uv run mcp-simple-streamablehttp-stateless --port 3000

To use it with the Python MCP SDK's streamablehttp_client:

# Use server from examples/servers/streamable-http-stateless/

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

from langgraph.prebuilt import create_react_agent
from langchain_mcp_adapters.tools import load_mcp_tools

async with streamablehttp_client("http://localhost:3000/mcp") as (read, write, _):
    async with ClientSession(read, write) as session:
        # Initialize the connection
        await session.initialize() 

        # Get tools
        tools = await load_mcp_tools(session)
        agent = create_react_agent("openai:gpt-4.1", tools)
        math_response = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})

To use it with MultiServerMCPClient:

# Use server from examples/servers/streamable-http-stateless/
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async with MultiServerMCPClient(
    {
        "math": {
            "transport": "streamable_http",
            "url": "http://localhost:3000/mcp"
        },
    }
) as client:
    agent = create_react_agent("openai:gpt-4.1", client.get_tools())
    math_response = await agent.ainvoke({"messages": "what's (3 + 5) x 12?"})

4. LangGraph StateGraph Integration

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.prebuilt import ToolNode, tools_condition

from langchain.chat_models import init_chat_model
model = init_chat_model("openai:gpt-4.1")

async with MultiServerMCPClient(
    {
        "math": {
            "command": "python",
            # Make sure to update to the full absolute path to your math_server.py file
            "args": ["./examples/math_server.py"],
            "transport": "stdio",
        },
        "weather": {
            # make sure you start your weather server on port 8000
            "url": "http://localhost:8000/sse",
            "transport": "sse",
        }
    }
) as client:
    tools = client.get_tools()
    def call_model(state: MessagesState):
        response = model.bind_tools(tools).invoke(state["messages"])
        return {"messages": response}

    builder = StateGraph(MessagesState)
    builder.add_node(call_model)
    builder.add_node(ToolNode(tools))
    builder.add_edge(START, "call_model")
    builder.add_conditional_edges(
        "call_model",
        tools_condition,
    )
    builder.add_edge("tools", "call_model")
    graph = builder.compile()
    math_response = await graph.ainvoke({"messages": "what's (3 + 5) x 12?"})
    weather_response = await graph.ainvoke({"messages": "what is the weather in nyc?"})
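`tools_condition` routes execution to the tool node when the model's last message requests tool calls, and ends the graph otherwise. A stdlib sketch of that routing decision, with dicts standing in for the message objects the real function inspects:

```python
def route(state: dict) -> str:
    """Illustrative version of tools_condition's routing decision."""
    last = state["messages"][-1]
    # If the model asked for tool calls, go to the tool node; otherwise stop
    if last.get("tool_calls"):
        return "tools"
    return "__end__"

print(route({"messages": [{"content": "96"}]}))                       # __end__
print(route({"messages": [{"tool_calls": [{"name": "multiply"}]}]}))  # tools
```

This is why the graph above only needs `add_conditional_edges("call_model", tools_condition)` plus the `"tools" -> "call_model"` edge: the loop keeps running until the model stops requesting tools.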

5. LangGraph API Server Integration

Tip: see this guide to get started with the LangGraph API server.

To run a LangGraph agent that uses MCP tools inside a LangGraph API server, use the following setup.

Create a graph.py file:

# graph.py
from contextlib import asynccontextmanager
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_anthropic import ChatAnthropic

model = ChatAnthropic(model="claude-3-5-sonnet-latest")

@asynccontextmanager
async def make_graph():
    async with MultiServerMCPClient(
        {
            "math": {
                "command": "python", 
                # Make sure to update to the full absolute path to your math_server.py file
                "args": ["/path/to/math_server.py"],
                "transport": "stdio",
            },
            "weather": { 
                # make sure you start your weather server on port 8000
                "url": "http://localhost:8000/sse",  
                "transport": "sse",  
            }
        }
    ) as client:
        agent = create_react_agent(model, client.get_tools()) 
        yield agent

In langgraph.json, make sure make_graph is specified as the graph's entry point:

{
  "dependencies": ["."],
  "graphs": {
    "agent": "./graph.py:make_graph"
  }
}
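With this in place, the graph can be served locally via the LangGraph CLI. The package and command names below are the commonly documented ones; treat them as an assumption and verify against the current LangGraph docs:

```shell
# Install the LangGraph CLI and start a local dev server from the project root
pip install "langgraph-cli[inmem]"
langgraph dev
```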

伊织 xAI 2025-05-14 (Wed)
