LangChain's Conversation Buffer Memory

This article demonstrates how to use ConversationBufferMemory from the LangChain library to save and load conversation context. The memory stores input and output messages and can return the history either as a string or as a list of messages. It also shows how to integrate this memory into a ConversationChain for continuous, multi-turn interaction.



This notebook shows how to use ConversationBufferMemory. This memory allows for storing messages and then extracting them into a variable.

We can first extract it as a string.

Example code:

from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory()
memory.save_context({"input": "hi"}, {"output": "whats up"})
memory.load_memory_variables({})

Output:

    {'history': 'Human: hi\nAI: whats up'}

We can also get the history as a list of messages (this is useful if you are using it with a chat model).

Example code:

memory = ConversationBufferMemory(return_messages=True)
memory.save_context({"input": "hi"}, {"output": "whats up"})
memory.load_memory_variables({})

Output:

    {'history': [HumanMessage(content='hi', additional_kwargs={}),
      AIMessage(content='whats up', additional_kwargs={})]}
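
Since `return_messages=True` yields `HumanMessage` and `AIMessage` objects, the history can be passed directly to a chat model. Here is a minimal sketch of that, using the same legacy `langchain` import paths as the rest of this post and assuming an OpenAI API key is configured:

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

chat = ChatOpenAI(temperature=0)

# Pull the stored history back out as a list of message objects
history = memory.load_memory_variables({})["history"]

# Append the next human turn and send the full message list to the chat model
response = chat(history + [HumanMessage(content="What did you just say?")])
print(response.content)
```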

Using in a chain

Finally, let's take a look at using it in a chain (setting verbose=True so we can see the prompt).

Example code:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain


llm = OpenAI(temperature=0)
conversation = ConversationChain(
    llm=llm, 
    verbose=True, 
    memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

    Current conversation:

    Human: Hi there!
    AI:

    > Finished chain.

    " Hi there! It's nice to meet you. How can I help you today?"

Example code:

conversation.predict(input="I'm doing well! Just having a conversation with an AI.")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

    Current conversation:
    Human: Hi there!
    AI:  Hi there! It's nice to meet you. How can I help you today?
    Human: I'm doing well! Just having a conversation with an AI.
    AI:

    > Finished chain.

    " That's great! It's always nice to have a conversation with someone new. What would you like to talk about?"

Example code:

conversation.predict(input="Tell me about yourself.")

Output:

    > Entering new ConversationChain chain...
    Prompt after formatting:
    The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

    Current conversation:
    Human: Hi there!
    AI:  Hi there! It's nice to meet you. How can I help you today?
    Human: I'm doing well! Just having a conversation with an AI.
    AI:  That's great! It's always nice to have a conversation with someone new. What would you like to talk about?
    Human: Tell me about yourself.
    AI:

    > Finished chain.

    " Sure! I'm an AI created to help people with their everyday tasks. I'm programmed to understand natural language and provide helpful information. I'm also constantly learning and updating my knowledge base so I can provide more accurate and helpful answers."

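At this point the chain's memory holds the entire transcript. As a quick check (a sketch continuing the session above), you can read the accumulated buffer back directly from the chain:

```python
# The memory attached to the chain has recorded every turn so far
print(conversation.memory.buffer)
# Human: Hi there!
# AI:  Hi there! It's nice to meet you. How can I help you today?
# ...
```
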
Done!

To use `langchain.memory.ConversationBufferMemory` in code, we first need to install the `langchain` library. This library provides an in-memory conversation manager that we can use to store and restore conversation history. Below is a simplified example that uses `ConversationBufferMemory` in a small Flask application, together with a call to the OpenAI completion API, so that the conversation history accumulates across requests.

First, install the necessary libraries:

```bash
pip install langchain flask openai
```

Then, create `app.py`:

```python
from flask import Flask, request, jsonify
from langchain.memory import ConversationBufferMemory
import openai

# Initialize the Flask app and the memory manager
app = Flask(__name__)
memory = ConversationBufferMemory()

# OpenAI API key and model settings (replace with your own key)
openai.api_key = 'your_openai_api_key'
model = 'davinci'

@app.route('/chat', methods=['POST'])
def chat():
    # Get the user input from the request body
    user_input = request.json.get('prompt')

    # Call the (legacy) OpenAI completion API
    ai_response = openai.Completion.create(
        engine=model,
        prompt=user_input,
        max_tokens=50,
        temperature=0.7,
    )
    ai_reply = ai_response.choices[0].text

    # Save this turn (user input and AI reply) into memory
    memory.save_context({"input": user_input}, {"output": ai_reply})

    # Return the full conversation so far
    conversation = memory.load_memory_variables({})["history"]
    return jsonify({"conversation": conversation})

# Start the server
if __name__ == '__main__':
    app.run(debug=True)
```

In this example, `ConversationBufferMemory` stores each user input (`user_input`) and AI reply (`ai_reply`) via `save_context`, so every request extends the conversation history. Note that this sketch only sends the latest `user_input` to the model; to give the model the full context, you would prepend the stored history to the prompt.
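
To try the endpoint, you could POST a prompt to the running server. This test snippet is a sketch assuming Flask's default address and the `/chat` route defined above:

```python
import requests

# Send one turn to the /chat endpoint and print the accumulated history
resp = requests.post(
    "http://127.0.0.1:5000/chat",
    json={"prompt": "Hello!"},
)
print(resp.json()["conversation"])
```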