[LangChain4j Quick Start] Connect Java to Large AI Models in 5 Minutes, with Hands-On Spring Boot Integration!
Preface: When Java Meets Large Language Models
With the AI wave sweeping the globe, how can Java developers quickly embrace large language models? LangChain4j, an AI development framework built specifically for Java, makes integrating models such as ChatGPT and GPT-4o-mini remarkably easy thanks to its minimalist API design and strong extensibility. This article walks you through connecting Spring Boot to GPT-4o-mini in 5 minutes, with hands-on code and step-by-step explanations, to kick off your AI application development journey!
1. Environment Setup: Lightning-Fast Configuration
1.1 Add the Key Dependencies
Add the LangChain4j core library and the OpenAI extension to your Spring Boot project's pom.xml:
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>1.0.0-beta3</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j</artifactId>
    <version>1.0.0-beta3</version>
</dependency>
1.2 Get an API Key (Zero Barrier to Entry!)
- Production use: obtain an API key from the OpenAI platform
- Quick trial: use the official demo key (limited quota, gpt-4o-mini only)
// Use the demo key directly in the configuration class
.baseUrl("http://langchain4j.dev/demo/openai/v1")
.apiKey("demo")
- ⚠️ What if you don't have an API key?
If you don't have your own OpenAI API key, don't worry. You can temporarily use the demo key, which is provided free of charge for demonstration purposes. Note that when the demo key is used, all requests to the OpenAI API go through an official proxy server, which injects a real key before forwarding your request to the OpenAI API. The proxy does not collect or use your data in any way. The demo key has a limited quota, is restricted to the gpt-4o-mini model, and should only be used for demos.
OpenAiChatModel model = OpenAiChatModel.builder()
        .baseUrl("http://langchain4j.dev/demo/openai/v1")
        .apiKey("demo")
        .modelName("gpt-4o-mini")
        .build();
1.3 Configure application.yml
spring:
  application:
    name: AI
  ai:
    openai:
      api-key: "demo"
      base-url: "http://langchain4j.dev/demo/openai/v1"
2. Core Implementation: Build an AI Chat Endpoint in 3 Steps
2.1 Model Configuration (the Intelligence Engine)
@Configuration
public class LangChain4jConfig {

    @Bean
    public ChatLanguageModel chatLanguageModel() {
        return OpenAiChatModel.builder()
                .baseUrl("http://langchain4j.dev/demo/openai/v1")
                .apiKey("demo")           // remove baseUrl when switching to a real key
                .modelName("gpt-4o-mini") // latest lightweight model
                .build();
    }
}
Key points:
- baseUrl is only required when using the demo key
- modelName selects the model version; the lightweight, high-performing gpt-4o-mini is recommended
When you switch to a real key, drop the baseUrl and load the key from configuration rather than hardcoding it, as shown in the sketch below.
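As a minimal sketch of that idea: assuming you keep the key in application.yml under property names of your own choosing (the langchain4j.openai.* keys below are just an example, not something the framework requires), the same bean could be wired like this.

package org.example.ai.config;

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LangChain4jConfig {

    // Property names are illustrative; map them to whatever keys you use in application.yml
    @Value("${langchain4j.openai.api-key}")
    private String apiKey;

    @Value("${langchain4j.openai.model-name:gpt-4o-mini}")
    private String modelName;

    @Bean
    public ChatLanguageModel chatLanguageModel() {
        return OpenAiChatModel.builder()
                .apiKey(apiKey)       // real key injected from configuration / environment
                .modelName(modelName) // no baseUrl needed when calling OpenAI directly
                .build();
    }
}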
2.2 Chat Controller (the Conversation Brain)
@RestController
@RequestMapping("/ai")
@RequiredArgsConstructor // Lombok generates the constructor that injects the final fields
public class ChatController {

    private final ChatLanguageModel model;
    private final ChatMemory chatMemory; // automatically keeps the conversation context

    @GetMapping(value = "/chat", produces = "text/plain;charset=utf-8")
    public Mono<String> chat(@RequestParam String message) {
        // Prepend a persona prompt ("You are Xiaozhi, an AI") before the user's message
        UserMessage userMsg = UserMessage.from("你叫小智,是一个人工智能\n" + message);
        chatMemory.add(userMsg);
        AiMessage aiMsg = model.chat(chatMemory.messages()).aiMessage();
        chatMemory.add(aiMsg);
        return Mono.just(aiMsg.text());
    }
}
The ChatMemory bean injected above is defined in a separate configuration class:

package org.example.ai.config;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class CommonConfiguration {

    /**
     * ChatMemory bean limited by message count (a sliding window)
     */
    @Bean
    public ChatMemory messageWindowChatMemory(ChatMemoryStore chatMemoryStore) {
        return MessageWindowChatMemory.builder()
                .id("session-1")                  // conversation ID
                .maxMessages(10)                  // maximum number of messages kept
                .chatMemoryStore(chatMemoryStore) // backing store
                .build();
    }

    /**
     * Simple in-memory store implementation
     */
    @Bean
    public ChatMemoryStore inMemoryChatMemoryStore() {
        return new InMemoryChatMemoryStore();
    }
}
Highlights:
- ChatMemory automatically maintains the conversation context
- Forcing UTF-8 encoding in the response avoids garbled Chinese output
- Reactive programming support (Mono)
3. Verify the Result: Your First AI Endpoint
Start the application and open:
http://localhost:8080/ai/chat?message=讲个程序员笑话
Expected response:
Sure, master! Why do programmers always mix up Halloween and Christmas?
Because Oct 31 == Dec 25! (octal 31 equals decimal 25)
Adding the Front-End Code (for example as src/main/resources/static/index.html in a standard Spring Boot project)
<!DOCTYPE html>
<html lang="zh-CN">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>小智AI助手</title>
    <script src="https://unpkg.com/vue@3/dist/vue.global.js"></script>
    <style>
        /* Modern chat UI styles */
        :root { --primary: #4CAF50; --bg: #f5f5f5; --user-bg: #e3f2fd; --ai-bg: #ffffff; }
        body { margin: 0; font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif; background: var(--bg); }
        .chat-container { max-width: 800px; margin: 20px auto; border-radius: 12px; box-shadow: 0 2px 15px rgba(0,0,0,0.1); background: white; height: 90vh; display: flex; flex-direction: column; }
        .messages { flex: 1; overflow-y: auto; padding: 20px; display: flex; flex-direction: column; gap: 15px; }
        .message { max-width: 70%; padding: 12px 16px; border-radius: 18px; animation: fadeIn 0.3s ease; }
        .user-message { background: var(--user-bg); align-self: flex-end; border-bottom-right-radius: 4px; }
        .ai-message { background: var(--ai-bg); align-self: flex-start; border-bottom-left-radius: 4px; box-shadow: 0 2px 4px rgba(0,0,0,0.05); }
        .loading-dots { display: inline-block; font-size: 24px; }
        .loading-dots::after { content: '...'; animation: dots 1.5s infinite; }
        .input-area { padding: 20px; border-top: 1px solid #eee; display: flex; gap: 10px; }
        input { flex: 1; padding: 12px; border: 1px solid #ddd; border-radius: 25px; font-size: 16px; outline: none; transition: 0.3s; }
        input:focus { border-color: var(--primary); box-shadow: 0 0 0 3px rgba(76,175,80,0.1); }
        button { padding: 12px 24px; background: var(--primary); border: none; border-radius: 25px; color: white; cursor: pointer; transition: 0.3s; }
        button:disabled { opacity: 0.7; cursor: not-allowed; }
        @keyframes dots { 0%, 20% { content: '.'; } 40% { content: '..'; } 60%, 100% { content: '...'; } }
        @keyframes fadeIn { from { opacity: 0; transform: translateY(10px); } to { opacity: 1; transform: translateY(0); } }
    </style>
</head>
<body>
<div id="app"><div class="chat-container"><div class="messages"><div v-for="(msg, index) in messages":key="index":class="['message', msg.role === 'user' ? 'user-message' : 'ai-message']">{{ msg.content }}</div><div v-if="loading" class="message ai-message"><span class="loading-dots"></span></div></div><div class="input-area"><inputv-model="inputMessage"@keyup.enter="sendMessage"placeholder="和小智聊天吧~":disabled="loading"><button @click="sendMessage" :disabled="!inputMessage || loading">{{ loading ? '思考中' : '发送' }}</button></div></div>
</div>

<script>
    const { createApp } = Vue;

    createApp({
        data() {
            return {
                messages: [],
                inputMessage: '',
                loading: false
            };
        },
        methods: {
            async sendMessage() {
                if (!this.inputMessage.trim() || this.loading) return;

                const userMessage = this.inputMessage;
                this.messages.push({ role: 'user', content: userMessage });
                this.inputMessage = '';
                this.loading = true;

                try {
                    const response = await fetch(`/ai/chat?message=${encodeURIComponent(userMessage)}`);
                    const text = await response.text();
                    this.messages.push({ role: 'assistant', content: text });
                } catch (error) {
                    this.messages.push({
                        role: 'assistant',
                        content: '哎呀,小智暂时无法连接,请稍后再试~'
                    });
                } finally {
                    this.loading = false;
                    this.scrollToBottom();
                }
            },
            scrollToBottom() {
                this.$nextTick(() => {
                    const container = document.querySelector('.messages');
                    container.scrollTop = container.scrollHeight;
                });
            }
        },
        mounted() {
            // Initial greeting message
            this.messages.push({
                role: 'assistant',
                content: '你好!我是小智,有什么可以帮您的?'
            });
        }
    }).mount('#app');
</script>
</body>
</html>
Complete Code
package org.example.ai.config;

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LangChain4jConfig {

    @Bean
    public ChatLanguageModel chatLanguageModel() {
        return OpenAiChatModel.builder()
                .baseUrl("http://langchain4j.dev/demo/openai/v1")
                .apiKey("demo")
                .modelName("gpt-4o-mini")
                .build();
    }
}
package org.example.ai.config;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class CommonConfiguration {

    /**
     * ChatMemory bean limited by message count (a sliding window)
     */
    @Bean
    public ChatMemory messageWindowChatMemory(ChatMemoryStore chatMemoryStore) {
        return MessageWindowChatMemory.builder()
                .id("session-1")                  // conversation ID
                .maxMessages(10)                  // maximum number of messages kept
                .chatMemoryStore(chatMemoryStore) // backing store
                .build();
    }

    /**
     * Simple in-memory store implementation
     */
    @Bean
    public ChatMemoryStore inMemoryChatMemoryStore() {
        return new InMemoryChatMemoryStore();
    }
}
package org.example.ai.controller;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import lombok.RequiredArgsConstructor;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/ai")
@RequiredArgsConstructor
public class ChatController {

    private final ChatLanguageModel chatLanguageModel;
    private final ChatMemory chatMemory;

    @RequestMapping(value = "/chat", produces = "text/plain;charset=utf-8")
    public Mono<String> chat(@RequestParam(required = false, defaultValue = "") String message) {
        UserMessage userMessage = UserMessage.from("你叫小智,是一个人工智能\n" + message);
        chatMemory.add(userMessage);
        AiMessage aiMessage = chatLanguageModel.chat(chatMemory.messages()).aiMessage();
        chatMemory.add(aiMessage);
        return Mono.just(aiMessage.text());
    }
}
4. Production-Grade Optimization Tips
- Key security: read the key from application.yml (or an environment variable) instead of hardcoding it:
  langchain4j:
    openai:
      api-key: ${OPENAI_API_KEY}
- Rate limiting and graceful degradation: integrate Resilience4j to throttle requests
- Performance monitoring: export metrics to Prometheus via Micrometer
- Context management: persist conversations with a durable ChatMemoryStore implementation (see the sketch after this list)
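To illustrate the last point, here is a minimal sketch of a Redis-backed ChatMemoryStore, assuming Spring Data Redis is on the classpath and using LangChain4j's ChatMessageSerializer/ChatMessageDeserializer JSON helpers. The class name and key prefix are my own choices, not part of the framework.

package org.example.ai.config;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.ChatMessageDeserializer;
import dev.langchain4j.data.message.ChatMessageSerializer;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import org.springframework.data.redis.core.StringRedisTemplate;

import java.util.List;

// Sketch: persists each conversation's messages as a JSON string in Redis
public class RedisChatMemoryStore implements ChatMemoryStore {

    private final StringRedisTemplate redis;

    public RedisChatMemoryStore(StringRedisTemplate redis) {
        this.redis = redis;
    }

    private String key(Object memoryId) {
        return "chat-memory:" + memoryId; // key prefix is an arbitrary choice
    }

    @Override
    public List<ChatMessage> getMessages(Object memoryId) {
        String json = redis.opsForValue().get(key(memoryId));
        return json == null ? List.of() : ChatMessageDeserializer.messagesFromJson(json);
    }

    @Override
    public void updateMessages(Object memoryId, List<ChatMessage> messages) {
        redis.opsForValue().set(key(memoryId), ChatMessageSerializer.messagesToJson(messages));
    }

    @Override
    public void deleteMessages(Object memoryId) {
        redis.delete(key(memoryId));
    }
}

Register it as a @Bean in place of inMemoryChatMemoryStore and the existing MessageWindowChatMemory configuration keeps working, now with history that survives restarts.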
Summary: Why Choose LangChain4j?
Through this hands-on walkthrough we have experienced:
✅ Minimal integration: connect to GPT-4o-mini in 5 lines of code
✅ Works out of the box: automatic memory management and streaming response support
✅ Rich ecosystem: 20+ model providers supported, easy to switch between
✅ Spring Boot friendly: auto-configuration plus dependency injection
Next steps to explore:
- Try AI Services (LangChain4j's declarative assistant API) for more complex logic (see the sketch below)
- Combine embeddings to build a RAG knowledge base
- Use the Tools API to let the model call external services
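For the first item, here is a minimal sketch of a declarative AI Service. The Assistant interface, its method, and the wiring are illustrative; only AiServices, @SystemMessage, and the builder methods come from LangChain4j, and it is worth verifying them against the exact version you use.

package org.example.ai.service;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.SystemMessage;

public class AssistantDemo {

    // Declarative AI service: LangChain4j generates the implementation at runtime
    interface Assistant {
        @SystemMessage("你叫小智,是一个人工智能")
        String chat(String userMessage);
    }

    public static Assistant buildAssistant(ChatLanguageModel model, ChatMemory chatMemory) {
        return AiServices.builder(Assistant.class)
                .chatLanguageModel(model) // the same bean configured earlier in this article
                .chatMemory(chatMemory)   // reuses the MessageWindowChatMemory bean
                .build();
    }
}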
Visit the LangChain4j website now to start your own AI application development journey! If you run into any problems while following along, feel free to leave a comment and let's discuss~