AutoGen AttributeError: 'str' object has no attribute 'model'

Contents

1. The Error

2. The Fix


1. The Error

Running AutoGen locally fails at runtime with the error below.

(autogen) D:\code\autogen>python main.py
user_proxy (to assistant):

Tell me a joke about NVDA and TESLA stock prices.

--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\code\autogen\main.py", line 12, in <module>
    user_proxy.initiate_chat(
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1018, in initiate_chat
    self.send(msg2send, recipient, silent=silent)
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 655, in send
    recipient.receive(message, self, request_reply, silent)
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 818, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1972, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1340, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
  File "D:\Software\python3.10.9\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1359, in _generate_oai_reply_from_client
    response = llm_client.create(
  File "D:\Software\python3.10.9\lib\site-packages\autogen\oai\client.py", line 755, in create
    response.cost = client.cost(response)
  File "D:\Software\python3.10.9\lib\site-packages\autogen\oai\client.py", line 326, in cost
    model = response.model
AttributeError: 'str' object has no attribute 'model'

2. The Fix

The llm_config was originally set up as follows.

llm_config = {"model": "moonshot-v1-8k", 
              "api_key": str_key,
              "base_url": "http://172.29.19.146:3001"}

The fix is to append /v1 to the base_url after the port, as shown below. Without the /v1 prefix the request misses the OpenAI-compatible API path, so the server evidently returns a plain error string instead of a completion object; AutoGen then calls response.model on that string and raises the AttributeError.

llm_config = {"model": "moonshot-v1-8k", 
              "api_key": str_key,
              "base_url": "http://172.29.19.146:3001/v1"}
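To guard against this misconfiguration up front, the base_url can be normalized before building llm_config. This is a minimal sketch, not an AutoGen API: normalize_base_url is a hypothetical helper name, and the api_key value is a placeholder.

```python
def normalize_base_url(base_url: str) -> str:
    """Ensure base_url ends with the OpenAI-compatible "/v1" path prefix."""
    url = base_url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url


llm_config = {
    "model": "moonshot-v1-8k",
    "api_key": "sk-...",  # placeholder; use your real key (str_key above)
    "base_url": normalize_base_url("http://172.29.19.146:3001"),
}
print(llm_config["base_url"])  # http://172.29.19.146:3001/v1
```

The helper is idempotent, so it is safe to apply even when the URL already carries the /v1 suffix.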

References:

[1] [BUG] Adding an OpenAI model raises 'str' object has no attribute 'model_dump' · Issue #63 · 1Panel-dev/MaxKB · GitHub
