
LangChain4j

Added: Feb 25, 2026
Category: Agent & Tooling
License model: Open Source
Tags: Workflow Automation, Java, Spring Boot, LangChain, Model Context Protocol, RAG, AI Agents, Agent Framework, SDK, Agent & Tooling, Model & Inference Framework, Developer Tools & Coding, Protocol, API & Integration

An open-source framework simplifying LLM integration in Java apps, supporting RAG, tool calling, and agents. Provides a unified API for major models like OpenAI and Gemini, with deep integration for Spring Boot and Quarkus.

LangChain4j is an LLM integration framework designed for the Java ecosystem, incorporating ideas from LangChain and Haystack while offering a type-safe API.

Core Features

Unified API Abstraction

  • 20+ LLM Providers: OpenAI, Azure OpenAI, Google Gemini (AI/Vertex AI), Anthropic, Mistral AI, Ollama, Hugging Face, AWS Bedrock, IBM Watsonx, and more
  • 30+ Vector Databases: Pinecone, Milvus, Qdrant, Chroma, PgVector, Elasticsearch, Weaviate, MongoDB Atlas, Oracle, Cassandra, and more

Advanced LLM Capabilities

  • RAG (Retrieval Augmented Generation): From simple to advanced RAG pipelines, including query transformation, re-ranking, multi-path retrieval
  • Tool Calling (Function Calling): Enables LLMs to invoke Java methods with MCP (Model Context Protocol) support
  • Agents: Built-in agentic patterns with autonomous planning and execution
  • Chat Memory: Multiple memory management implementations (window memory, persistent memory, per-user isolation)
  • Structured Outputs: Support for boolean, Enum, POJO typed outputs
  • Streaming: Token-level streaming with cancellation and callbacks
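
The streaming mode listed above follows a callback pattern. A minimal sketch, assuming the `langchain4j-open-ai` module and an `OPENAI_API_KEY` environment variable; the handler receives each chunk as it arrives:

```java
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

StreamingChatModel model = OpenAiStreamingChatModel.builder()
    .apiKey(System.getenv("OPENAI_API_KEY"))
    .modelName("gpt-4o-mini")
    .build();

model.chat("Tell me a short joke", new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse); // each token/chunk as it streams
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println(); // the aggregated response is also available here
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});
```

The same handler interface is what cancellation and callback-based post-processing hook into.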

AI Services (Declarative API)

Build LLM applications using plain Java interfaces and annotations:

  • @SystemMessage / @UserMessage for prompt templates
  • @MemoryId for multi-session memory isolation
  • @Tool for tool method registration
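
The same declarative style also drives the structured outputs mentioned earlier: declaring a POJO or record as the return type makes LangChain4j parse the model's reply into it. A hedged sketch, where `Person` and `PersonExtractor` are hypothetical names; `{{it}}` is the template placeholder for a single unnamed parameter:

```java
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.UserMessage;

// Hypothetical target type; the LLM reply is mapped onto its components
record Person(String name, int age) { }

interface PersonExtractor {
    @UserMessage("Extract the person from: {{it}}")
    Person extract(String text);
}

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("OPENAI_API_KEY"))
    .modelName("gpt-4o-mini")
    .build();

PersonExtractor extractor = AiServices.create(PersonExtractor.class, model);

Person person = extractor.extract("John is 42 years old.");
System.out.println(person); // typed result parsed from the reply
```

Boolean and enum return types work the same way, which keeps downstream code free of manual JSON parsing.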

Enterprise Framework Integration

  • Spring Boot auto-configuration (starter-based)
  • Quarkus extension (quarkus-langchain4j)
  • Helidon integration
  • Micronaut integration

Quick Start

Maven Dependency

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>1.11.0</version>
</dependency>

Minimal Example

import dev.langchain4j.model.openai.OpenAiChatModel;

OpenAiChatModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("OPENAI_API_KEY"))
    .modelName("gpt-4o-mini")
    .build();

String answer = model.chat("Say 'Hello World'");

AI Services Example

interface Assistant {
    @SystemMessage("You are a polite assistant.")
    String chat(@MemoryId int userId, @UserMessage String userMessage);
}

// Any class whose @Tool-annotated methods the LLM may invoke
class MyTools {
    @Tool("Returns today's date")
    String currentDate() {
        return java.time.LocalDate.now().toString();
    }
}

Assistant assistant = AiServices.builder(Assistant.class)
    .chatModel(model)
    .chatMemoryProvider(id -> MessageWindowChatMemory.withMaxMessages(10))
    .tools(new MyTools())
    .build();

String reply = assistant.chat(1, "What is today's date?");

Architecture

Layered Design

  • Low-level Components: ChatModel / StreamingChatModel / ChatMemory / EmbeddingModel / EmbeddingStore
  • High-level Abstractions: AiServices / RetrievalAugmentor / ContentRetriever
  • Modular Integration: One submodule per provider, interface + SPI style
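
These low-level components compose directly. A hedged sketch of a minimal in-memory RAG pipeline, assuming the `langchain4j-embeddings-all-minilm-l6-v2` module for a local embedding model (package paths may differ by version):

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.rag.content.retriever.ContentRetriever;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.rag.query.Query;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;

EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel(); // runs locally
InMemoryEmbeddingStore<TextSegment> store = new InMemoryEmbeddingStore<>();

// Ingestion: split, embed, and store documents
EmbeddingStoreIngestor.builder()
    .embeddingModel(embeddingModel)
    .embeddingStore(store)
    .build()
    .ingest(Document.from("LangChain4j supports RAG pipelines."));

// Retrieval: fetch the segments most relevant to a query
ContentRetriever retriever = EmbeddingStoreContentRetriever.builder()
    .embeddingStore(store)
    .embeddingModel(embeddingModel)
    .maxResults(2)
    .build();

retriever.retrieve(Query.from("What does LangChain4j support?"))
    .forEach(content -> System.out.println(content.textSegment().text()));
```

The same retriever can be plugged into the high-level `AiServices` builder, which is where the layered design pays off: swap the store or model without touching the service interface.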

Design Philosophy

  • Optimized for Java ecosystem with emphasis on type safety and POJO-friendliness
  • Native integration with mainstream Java frameworks

Use Cases

  1. Enterprise chatbots / intelligent customer service
  2. Document QA / knowledge retrieval
  3. Automated workflow agents
  4. Content generation / structured extraction
  5. Multi-model switching experiments
  6. Java backend AI capability enhancement

Requirements

  • JDK 17+
  • Maven or Gradle
  • License: Apache-2.0
