Prompts for Chat Models in LangChain

https://python.langchain.com.cn/docs/modules/model_io/models/chat/how_to/prompts

This article is based on LangChain's official documentation (langchain.com.cn) and explains, in simplified terms, prompts for chat models, which are built around messages rather than plain text. All code, examples, and knowledge points from the original source are preserved unchanged.

1. Key Feature of Chat Model Prompts

Prompts for chat models are structured around messages (e.g., system messages, human messages, AI messages) rather than single blocks of text.

  • Use MessagePromptTemplate (and its subclasses like SystemMessagePromptTemplate, HumanMessagePromptTemplate) to create reusable message templates.
  • Combine multiple MessagePromptTemplates into a ChatPromptTemplate.
  • Use ChatPromptTemplate.format_prompt() to generate a PromptValue, which can be converted to a string or message objects (for chat models).
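
To make the flow above concrete, here is a minimal pure-Python sketch of what a chat prompt template conceptually does. This is illustrative only, not LangChain's actual implementation: each message template pairs a role with a format string, and formatting fills in the placeholders to produce role-tagged messages.

```python
# Conceptual sketch only: illustrates the idea behind MessagePromptTemplate
# and ChatPromptTemplate, not LangChain's real classes.

def format_chat_prompt(message_templates, **variables):
    """Fill each (role, template) pair and return chat-style messages."""
    return [
        {"role": role, "content": template.format(**variables)}
        for role, template in message_templates
    ]

templates = [
    ("system", "You are a helpful assistant that translates "
               "{input_language} to {output_language}."),
    ("human", "{text}"),
]

messages = format_chat_prompt(
    templates,
    input_language="English",
    output_language="French",
    text="I love programming.",
)
print(messages)
```

The key idea is that the output is a structured list of messages, not one flat string, which is exactly what chat models consume.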

2. Step 1: Import Required Modules

The code below imports all necessary classes, exactly as in the original documentation:

from langchain import PromptTemplate
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

Note: A chat model instance (e.g., ChatOpenAI) is required to run the final step, but its import is not shown in the original documentation. Consistent with the original code, we refer to it simply as chat.

3. Method 1: Create Message Templates with from_template

This is a concise way to build MessagePromptTemplates directly from template strings.

Step 3.1: Create System and Human Message Templates

# System message template: Defines the assistant's role (translator)
template = "You are a helpful assistant that translates {input_language} to {output_language}."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)

# Human message template: Defines the user's input (text to translate)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

Step 3.2: Combine into ChatPromptTemplate

chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

Step 3.3: Format and Run the Prompt

Use format_prompt() to fill in the placeholders, convert to messages, and pass to the chat model. The original code and output are preserved exactly:

# Get formatted messages and pass to the chat model
chat(chat_prompt.format_prompt(
    input_language="English", 
    output_language="French", 
    text="I love programming."
).to_messages())

Output (exactly as in the original):

AIMessage(content="J'adore la programmation.", additional_kwargs={})
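
For intuition, the PromptValue produced by format_prompt() can be rendered two ways: to_messages() for chat models and to_string() for plain-text LLMs. Below is a rough pure-Python sketch of the two renderings; the exact flattened string layout is an assumption for illustration, not LangChain's verbatim output.

```python
# Rough sketch of the two renderings of a formatted chat prompt.
# The string layout in to_string() is an assumption, not LangChain's exact format.

messages = [
    ("System", "You are a helpful assistant that translates English to French."),
    ("Human", "I love programming."),
]

def to_messages(msgs):
    """Chat-model form: keep the role/content structure."""
    return [{"role": role.lower(), "content": content} for role, content in msgs]

def to_string(msgs):
    """Plain-LLM form: flatten roles and contents into one string."""
    return "\n".join(f"{role}: {content}" for role, content in msgs)

print(to_messages(messages))
print(to_string(messages))
```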

4. Method 2: Create Message Templates with External PromptTemplate

For more flexibility, you can first define a PromptTemplate and then pass it to SystemMessagePromptTemplate.

Step 4.1: Create an External PromptTemplate

prompt = PromptTemplate(
    template="You are a helpful assistant that translates {input_language} to {output_language}.",
    input_variables=["input_language", "output_language"],  # Explicitly list variables
)

Step 4.2: Wrap into SystemMessagePromptTemplate

system_message_prompt = SystemMessagePromptTemplate(prompt=prompt)

Note: You can combine this system message prompt with the same human_message_prompt (from Method 1) into a ChatPromptTemplate and run it—same as Step 3.2 and 3.3.
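
The point of Method 2 is that the external prompt object declares its input variables explicitly, which allows validation before formatting. Here is a small pure-Python sketch of that idea; these are illustrative stand-ins, not LangChain's real classes.

```python
# Sketch of Method 2's idea: an external prompt object that declares its
# input variables explicitly, then gets wrapped as a system message.
# Illustrative only; these are not LangChain's real classes.

class SimplePromptTemplate:
    def __init__(self, template, input_variables):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs):
        # The explicit variable list lets us validate inputs up front.
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise KeyError(f"missing variables: {sorted(missing)}")
        return self.template.format(**kwargs)

prompt = SimplePromptTemplate(
    template="You are a helpful assistant that translates "
             "{input_language} to {output_language}.",
    input_variables=["input_language", "output_language"],
)

# Wrapping the external prompt as a "system" message, as in Method 2.
system_message = {"role": "system",
                  "content": prompt.format(input_language="English",
                                           output_language="French")}
print(system_message)
```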

Key Takeaways

  • Chat model prompts are built with message templates (e.g., SystemMessagePromptTemplate).
  • Two ways to create message templates: from_template (concise) or external PromptTemplate (flexible).
  • ChatPromptTemplate.from_messages() combines multiple message templates.
  • format_prompt().to_messages() converts the template to chat-model-compatible messages.
Appendix: Fixing `No module named 'langchain_community'`

When using LangChain to call a locally deployed Ollama model, you may hit the error `No module named 'langchain_community'`. Possible fixes:

1. Confirm the installation. After LangChain 0.1.13 (early 2024) the package was restructured, and some components (loaders, tools, and so on) were split out into the separate `langchain_community` package, which must be installed on its own:

```bash
pip install --upgrade langchain_community
```

2. Import from the package root. Use absolute imports from the package root rather than relative paths. The link between root imports and this particular error is not entirely clear, but path problems can occasionally trigger errors like this.

3. Check the environment. If you run the code in an IDE such as PyCharm, make sure `langchain_community` is installed in the virtual environment the IDE is using. An IDE pointed at the wrong Python interpreter will fail to find a module that is installed elsewhere.

A simple example showing how to import and use `ChatOllama` correctly:

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.prompts import ChatPromptTemplate

# Connect to the local Ollama service
llm = ChatOllama(
    model="model-name",  # replace with your model's name
    base_url="http://localhost:11434",
    temperature=0.3,
    top_p=0.9,
    num_ctx=4096
)

# Build the prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a professional technical expert"),
    ("human", "{input}")
])

# Build the chain
chain = prompt | llm

# Run the conversation
response = chain.invoke({"input": "replace this with your question"})
print("Model reply:")
print(response.content)
```
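
The environment check in point 3 can also be done from inside Python itself. This small diagnostic snippet prints which interpreter is running and whether `langchain_community` resolves in it, which quickly exposes wrong-interpreter problems:

```python
# Quick diagnostic: confirm which interpreter is active and whether a
# package resolves in it. Useful when an IDE points at the wrong venv.
import importlib.util
import sys

def where_is(package_name):
    """Return the package's file location if importable here, else None."""
    spec = importlib.util.find_spec(package_name)
    return spec.origin if spec else None

print("interpreter:", sys.executable)
print("langchain_community:", where_is("langchain_community"))
# A stdlib module should always resolve, as a sanity check:
print("json:", where_is("json"))
```

If `langchain_community` prints None while `pip show langchain_community` succeeds in a terminal, the IDE and the terminal are using different interpreters.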