Spring AI: Integrating a Local Ollama Large Model

As AI technology evolves rapidly, DeepSeek, a leading Chinese large-model research team, has open-sourced its R1 series of models, which deliver "low cost, high performance" and rival top international models in areas such as mathematical reasoning and code generation. Meanwhile, Spring AI, the framework in the Spring ecosystem dedicated to AI engineering, gives Java developers a convenient way to integrate AI through its modular design and support for multiple model providers. Combining the two opens a new technical path for building enterprise-grade intelligent applications.

For how to set up DeepSeek locally, see the CSDN blog post: 基于DeepSeek R1 微调自己的大模型&Ollama本地部署 (fine-tuning your own model with DeepSeek R1 and deploying it locally with Ollama).

Setting up the Spring AI project

Create a plain Maven project in IDEA.

Add the project dependencies:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.4</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <artifactId>springAi</artifactId>
    <version>1.0.0</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>22</maven.compiler.source>
        <maven.compiler.target>22</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spring-ai.version>1.0.0-M5</spring-ai.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <!-- Spring AI starter dependency for Ollama -->
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
            <version>${spring-ai.version}</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>


    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>3.2.4</version>
                <configuration>
                    <excludes>
                        <exclude>
                            <groupId>org.projectlombok</groupId>
                            <artifactId>lombok</artifactId>
                        </exclude>
                    </excludes>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <!-- Milestone repository: the Spring AI artifacts are not yet in Maven Central -->
    <repositories>
        <repository>
            <id>spring-milestones</id>
            <url>https://repo.spring.io/milestone</url>
        </repository>
    </repositories>

</project>

Note that the milestone repository must be declared, because the spring-ai packages have not yet been published to Maven Central.

 

Configuring application.properties

The property keys must match the jars you actually imported. Many configuration examples found online simply do not work with these dependencies: the implementation classes they rely on are not in the jars, or do not exist at all.

Digging into the source that wires up OllamaChatModel, you can find the auto-configuration class:

@AutoConfiguration(after = RestClientAutoConfiguration.class)
@ConditionalOnClass(OllamaApi.class)
@EnableConfigurationProperties({ OllamaChatProperties.class, OllamaEmbeddingProperties.class,
		OllamaConnectionProperties.class, OllamaInitializationProperties.class })
@ImportAutoConfiguration(classes = { RestClientAutoConfiguration.class, WebClientAutoConfiguration.class })
public class OllamaAutoConfiguration {
...
}

The @EnableConfigurationProperties annotation points to the property configuration classes.
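The exact shape of those classes depends on the Spring AI version, but for 1.0.0-M5 they look roughly like the simplified sketch below (fields abbreviated; check the actual source in your jar). The important part is the prefixes, which determine the valid property keys:

// Simplified sketch of the Ollama property classes (abbreviated, not the full source)
@ConfigurationProperties("spring.ai.ollama")
public class OllamaConnectionProperties {
    // base URL of the local Ollama server
    private String baseUrl = "http://localhost:11434";
}

@ConfigurationProperties("spring.ai.ollama.chat")
public class OllamaChatProperties {
    // name of the model to chat with, as known to Ollama
    private String model;
    // generation options, bound under spring.ai.ollama.chat.options.*
    private OllamaOptions options;
}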

The resulting application.properties:

server.port=8080
spring.application.name=SpringAi

# local ollama chat model
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.model=divine:latest

# model generation options (bound to OllamaOptions under spring.ai.ollama.chat.options.*)
spring.ai.ollama.chat.options.temperature=0.7
spring.ai.ollama.chat.options.num-predict=2048
# streaming is not a property; it is chosen at call time (stream() instead of call())

Configuration class:

package com.test.ai;


import org.springframework.ai.autoconfigure.ollama.OllamaConnectionDetails;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.client.JdkClientHttpRequestFactory;
import org.springframework.web.client.RestClient;
import org.springframework.web.reactive.function.client.WebClient;

import java.net.http.HttpClient;
import java.time.Duration;

@Configuration
public class OllamaConfig {

    // RestClient with an extended read timeout, so long-running generations do not time out
    @Bean
    @Qualifier("OllamaRestClientBuilder")
    public RestClient.Builder ollamaRestClientBuilder() {
        JdkClientHttpRequestFactory requestFactory = new JdkClientHttpRequestFactory(
                HttpClient.newHttpClient());
        requestFactory.setReadTimeout(Duration.ofMinutes(3));
        return RestClient.builder().requestFactory(requestFactory);
    }


    // WebClient builder used for streaming calls (only a default header is set here)
    @Bean
    @Qualifier("OllamaWebClientBuilder")
    public WebClient.Builder ollamaWebClientBuilder() {
        return WebClient.builder()
                .defaultHeader("Content-Type", "application/json")
                ;
    }


    // Provide our own OllamaApi built from the customized builders above,
    // taking the place of the auto-configured one
    @Bean
    public OllamaApi ollamaApi(OllamaConnectionDetails connectionDetails,
                               @Qualifier("OllamaRestClientBuilder") RestClient.Builder restClientBuilder,
                               @Qualifier("OllamaWebClientBuilder") WebClient.Builder ollamaWebClientBuilder) {
        return new OllamaApi(connectionDetails.getBaseUrl(), restClientBuilder, ollamaWebClientBuilder);
    }

}
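
Note that the WebClient builder above does not actually configure a timeout. If the streaming path also needs a longer timeout, one option (a sketch, assuming the default Reactor Netty connector is on the classpath) is to set the response timeout on the underlying HttpClient and use that in place of the ollamaWebClientBuilder bean above:

import org.springframework.http.client.reactive.ReactorClientHttpConnector;
import reactor.netty.http.client.HttpClient;

    // WebClient builder with a 3-minute response timeout via Reactor Netty
    @Bean
    @Qualifier("OllamaWebClientBuilder")
    public WebClient.Builder ollamaWebClientBuilder() {
        HttpClient httpClient = HttpClient.create()
                .responseTimeout(Duration.ofMinutes(3));
        return WebClient.builder()
                .clientConnector(new ReactorClientHttpConnector(httpClient))
                .defaultHeader("Content-Type", "application/json");
    }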

REST controller:

package com.test.ai;

import jakarta.annotation.Resource;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {

    @Resource
    private OllamaChatModel chatModel;

    @GetMapping("/chat")
    public String generate(@RequestParam String prompt) {
        return chatModel.call(prompt);
    }
}
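
If you want the response streamed chunk by chunk instead of as one blocking string, a streaming endpoint can be added. The sketch below assumes the stream(String) convenience method on the chat model and that Reactor is on the classpath (it comes in transitively with the Ollama starter's WebClient support):

import org.springframework.http.MediaType;
import reactor.core.publisher.Flux;

    // Streaming variant: emits chunks as the model generates them, served as server-sent events
    @GetMapping(value = "/chat/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> stream(@RequestParam String prompt) {
        return chatModel.stream(prompt);
    }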

Application bootstrap class:

package com.test.ai;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringAiApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringAiApplication.class, args);
    }
}
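
With Ollama running locally on port 11434 and the configured model already pulled, start the application and sanity-check the endpoint, for example with curl "http://localhost:8080/chat?prompt=hello" (hypothetical prompt; adjust the port if you change server.port).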

 
