Deepseek API

Version: 1.0.0

Before using the DeepSeek API, please create an API key first.

Authentication

  • HTTP: Bearer Auth

Security Scheme Type:

http

HTTP Authorization Scheme:

bearer
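
As a minimal sketch of this bearer scheme, the key is simply sent in the Authorization header of every request. Reading the key from a DEEPSEEK_API_KEY environment variable is an assumption for illustration, not a requirement of the API; the request body mirrors the chat completion examples later in this article.

import os
import requests

# Assumption for illustration: the API key is stored in the DEEPSEEK_API_KEY
# environment variable instead of being hard-coded in the script.
api_key = os.environ["DEEPSEEK_API_KEY"]

# Every request carries the key as an HTTP Bearer token in the Authorization header.
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",
}

payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hi"}],
}

response = requests.post("https://api.deepseek.com/chat/completions",
                         headers=headers, json=payload)
print(response.json())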

Contact

DeepSeek technical support: api-service@deepseek.com

Terms of Service

https://platform.deepseek.com/downloads/DeepSeek开放平台用户协议.html

License

MIT

Deepseek Chat Completion

Given the input context, the model completes the conversation.

See the official API documentation: Chat Completion | DeepSeek API Docs

curl example

curl -L -X POST 'https://api.deepseek.com/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-H 'Authorization: Bearer <TOKEN>' \
--data-raw '{
  "messages": [
    {
      "content": "You are a helpful assistant",
      "role": "system"
    },
    {
      "content": "Hi",
      "role": "user"
    }
  ],
  "model": "deepseek-chat",
  "frequency_penalty": 0,
  "max_tokens": 2048,
  "presence_penalty": 0,
  "response_format": {
    "type": "text"
  },
  "stop": null,
  "stream": false,
  "stream_options": null,
  "temperature": 1,
  "top_p": 1,
  "tools": null,
  "tool_choice": "none",
  "logprobs": false,
  "top_logprobs": null
}'

Python example (OpenAI SDK)

from openai import OpenAI

# for backward compatibility, you can still use `https://api.deepseek.com/v1` as `base_url`.
client = OpenAI(api_key="<your API key>", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    max_tokens=1024,
    temperature=0.7,
    stream=False
)

print(response.choices[0].message.content)

Python example (Requests)

import requests
import json

url = "https://api.deepseek.com/chat/completions"

payload = json.dumps({
  "messages": [
    {
      "content": "You are a helpful assistant",
      "role": "system"
    },
    {
      "content": "Hi",
      "role": "user"
    }
  ],
  "model": "deepseek-chat",
  "frequency_penalty": 0,
  "max_tokens": 2048,
  "presence_penalty": 0,
  "response_format": {
    "type": "text"
  },
  "stop": None,
  "stream": False,
  "stream_options": None,
  "temperature": 1,
  "top_p": 1,
  "tools": None,
  "tool_choice": "none",
  "logprobs": False,
  "top_logprobs": None
})
headers = {
  'Content-Type': 'application/json',
  'Accept': 'application/json',
  'Authorization': 'Bearer <TOKEN>'
}

response = requests.request("POST", url, headers=headers, data=payload)

print(response.text)

Go example

package main

import (
  "fmt"
  "io"
  "net/http"
  "strings"
)

func main() {

  url := "https://api.deepseek.com/chat/completions"
  method := "POST"

  payload := strings.NewReader(`{
  "messages": [
    {
      "content": "You are a helpful assistant",
      "role": "system"
    },
    {
      "content": "Hi",
      "role": "user"
    }
  ],
  "model": "deepseek-chat",
  "frequency_penalty": 0,
  "max_tokens": 2048,
  "presence_penalty": 0,
  "response_format": {
    "type": "text"
  },
  "stop": null,
  "stream": false,
  "stream_options": null,
  "temperature": 1,
  "top_p": 1,
  "tools": null,
  "tool_choice": "none",
  "logprobs": false,
  "top_logprobs": null
}`)

  client := &http.Client{}
  req, err := http.NewRequest(method, url, payload)

  if err != nil {
    fmt.Println(err)
    return
  }
  req.Header.Add("Content-Type", "application/json")
  req.Header.Add("Accept", "application/json")
  req.Header.Add("Authorization", "Bearer <TOKEN>")

  res, err := client.Do(req)
  if err != nil {
    fmt.Println(err)
    return
  }
  defer res.Body.Close()

  body, err := io.ReadAll(res.Body)
  if err != nil {
    fmt.Println(err)
    return
  }
  fmt.Println(string(body))
}

Node.js example

import OpenAI from "openai";

// for backward compatibility, you can still use `https://api.deepseek.com/v1` as `baseURL`.
const openai = new OpenAI({
  baseURL: 'https://api.deepseek.com',
  apiKey: '<your API key>'
});

async function main() {
  const completion = await openai.chat.completions.create({
    messages: [{ role: "system", content: "You are a helpful assistant." }],
    model: "deepseek-chat",
  });

  console.log(completion.choices[0].message.content);
}

main();

Deepseek FIM Completion (Beta)

The FIM (Fill-In-the-Middle) completion API.

To use this feature, set base_url="https://api.deepseek.com/beta".

See the official documentation: FIM Completion (Beta) | DeepSeek API Docs

curl example

curl -L -X POST 'https://api.deepseek.com/beta/completions' \
-H 'Content-Type: application/json' \
-H 'Accept: application/json' \
-H 'Authorization: Bearer <TOKEN>' \
--data-raw '{
  "model": "deepseek-chat",
  "prompt": "Once upon a time, ",
  "echo": false,
  "frequency_penalty": 0,
  "logprobs": 0,
  "max_tokens": 1024,
  "presence_penalty": 0,
  "stop": null,
  "stream": false,
  "stream_options": null,
  "suffix": null,
  "temperature": 1,
  "top_p": 1
}'

Python example

from openai import OpenAI

# user should set `base_url="https://api.deepseek.com/beta"` to use this feature.
client = OpenAI(
  api_key="<your API key>",
  base_url="https://api.deepseek.com/beta",
)
response = client.completions.create(
  model="deepseek-chat",
  prompt="def fib(a):",
  suffix="    return fib(a-1) + fib(a-2)",
  max_tokens=128)
print(response.choices[0].text)

Go example

package main

import (
  "fmt"
  "io"
  "net/http"
  "strings"
)

func main() {

  url := "https://api.deepseek.com/beta/completions"
  method := "POST"

  payload := strings.NewReader(`{
  "model": "deepseek-chat",
  "prompt": "Once upon a time, ",
  "echo": false,
  "frequency_penalty": 0,
  "logprobs": 0,
  "max_tokens": 1024,
  "presence_penalty": 0,
  "stop": null,
  "stream": false,
  "stream_options": null,
  "suffix": null,
  "temperature": 1,
  "top_p": 1
}`)

  client := &http.Client{}
  req, err := http.NewRequest(method, url, payload)

  if err != nil {
    fmt.Println(err)
    return
  }
  req.Header.Add("Content-Type", "application/json")
  req.Header.Add("Accept", "application/json")
  req.Header.Add("Authorization", "Bearer <TOKEN>")

  res, err := client.Do(req)
  if err != nil {
    fmt.Println(err)
    return
  }
  defer res.Body.Close()

  body, err := io.ReadAll(res.Body)
  if err != nil {
    fmt.Println(err)
    return
  }
  fmt.Println(string(body))
}

Node.js example

import OpenAI from "openai";

// user should set `baseURL="https://api.deepseek.com/beta"` to use this feature.
const openai = new OpenAI({
  baseURL: 'https://api.deepseek.com/beta',
  apiKey: '<your API key>'
});
async function main() {
  const completion = await openai.completions.create({
    model: "deepseek-chat",
    prompt: "def fib(a):",
    suffix: "    return fib(a-1) + fib(a-2)",
    max_tokens: 128
  });
  console.log(completion);
}
main();

DeepSeek API Usage Guide

1. Get an API key

  • Log in to the DeepSeek platform and create an API key in the console.
  • The key is used for authentication and must be carried in the request header.

2. Basic API call

import requests

url = "https://api.deepseek.com/v1/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer YOUR_API_KEY"  # replace with your actual key
}
data = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Explain quantum computing"}],
    "temperature": 0.7,
    "stream": False  # toggle streaming / non-streaming output
}

response = requests.post(url, headers=headers, json=data)
print(response.json())

3. Core feature modules

  • Chat prefix continuation: generate text that continues coherently from the given context
  • FIM completion: code completion using the <|fim▁begin|>, <|fim▁hole|>, and <|fim▁end|> markers
  • JSON output: set response_format: { "type": "json_object" } (see the sketch after the streaming example below)
  • Function calling: define callable functions via the tools parameter (see the sketch after the streaming example below)

4. Streaming output example

# set stream=True
data["stream"] = True
with requests.post(url, headers=headers, json=data, stream=True) as res:
    for chunk in res.iter_lines():
        if chunk:
            print(chunk.decode("utf-8"))
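
The JSON output and function calling items above are only named in the guide. The following is a minimal sketch of how they map onto the OpenAI SDK pattern used in the earlier examples; the get_weather tool, its parameter schema, and the prompts are illustrative assumptions, not taken from the official docs.

from openai import OpenAI

client = OpenAI(api_key="<your API key>", base_url="https://api.deepseek.com")

# JSON output: the prompt explicitly asks for JSON and response_format selects JSON mode.
json_response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "Reply with a JSON object."},
        {"role": "user", "content": "List three prime numbers as JSON."},
    ],
    response_format={"type": "json_object"},
)
print(json_response.choices[0].message.content)

# Function calling: describe a callable tool; the model may answer with a tool call
# instead of plain text. The get_weather tool below is a hypothetical example.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]
tool_response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "What is the weather in Hangzhou?"}],
    tools=tools,
)
print(tool_response.choices[0].message.tool_calls)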