Memory
Memory enables LangChainGo applications to persist state between calls, maintaining conversation context and enabling sophisticated, stateful interactions.
Key Concepts
Memory in LangChainGo provides several critical capabilities:
- Conversation History: Store and retrieve past messages and responses
- Context Management: Maintain relevant context across multiple interactions
- State Persistence: Save conversation state to various storage backends
- Memory Types: Different strategies for managing conversation context
Do not share the same memory instance between different chains. Each memory instance represents the history of a single conversation and should be isolated.
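For example, a server handling many users might keep one buffer per conversation. The sketch below keys memory instances by a hypothetical session ID (names are illustrative; add locking for concurrent use):

```go
import "github.com/tmc/langchaingo/memory"

// sessions maps a hypothetical session ID to its own memory instance,
// so no two conversations ever share history.
// Not safe for concurrent use; add a mutex in a real server.
var sessions = map[string]*memory.ConversationBuffer{}

func memoryFor(sessionID string) *memory.ConversationBuffer {
	if m, ok := sessions[sessionID]; ok {
		return m
	}
	m := memory.NewConversationBuffer()
	sessions[sessionID] = m
	return m
}
```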
Memory Types
Buffer Memory
Stores all conversation messages in a simple buffer. Best for short conversations where you want complete history.
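A minimal sketch using the memory package's constructor:

```go
// Keeps the entire conversation; size grows without bound.
mem := memory.NewConversationBuffer()
```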
Window Buffer Memory
Maintains a sliding window of recent messages. Useful when you want to limit context length while preserving recent history.
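A minimal sketch; the constructor takes the window size (verify the exact signature for your version):

```go
// Keep only the most recent exchanges (window size 5).
mem := memory.NewConversationWindowBuffer(5)
```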
Token Buffer Memory
Manages memory based on token count rather than message count. Provides precise control over context size for LLM token limits.
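The token buffer needs a model so it can count tokens. A sketch assuming the OpenAI backend:

```go
llm, err := openai.New() // assumes OPENAI_API_KEY is set
if err != nil {
	log.Fatal(err)
}

// Trim stored history whenever it exceeds roughly 2000 tokens.
mem := memory.NewConversationTokenBuffer(llm, 2000)
```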
Summary Memory
Automatically summarizes older conversation history while keeping recent messages intact. Balances context preservation with memory efficiency.
Chat Message History
Provides a lower-level interface for managing individual chat messages. Useful for custom memory implementations.
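A hedged sketch using memory.NewChatMessageHistory; method names follow the chat message history interface, so check the exact signatures for your version:

```go
ctx := context.Background()

history := memory.NewChatMessageHistory()
_ = history.AddUserMessage(ctx, "Hello")
_ = history.AddAIMessage(ctx, "Hi! How can I help?")

// Inspect the raw messages.
msgs, err := history.Messages(ctx)
if err != nil {
	log.Fatal(err)
}
fmt.Println(len(msgs)) // 2
```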
Storage Options
LangChainGo memory can persist to various backends:
- In-Memory: Fast, temporary storage (default)
- File-based: Simple persistence to local files
- Database: SQL or NoSQL database integration
- Redis: High-performance, distributed memory storage
- Custom: Implement your own storage backend (see the sketch below)
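A custom backend is any type that satisfies the schema.ChatMessageHistory interface (the interface shape below is my reading of it and may differ slightly between versions). This sketch stores messages in a slice, standing in for a file, SQL, or Redis store:

```go
import (
	"context"
	"sync"

	"github.com/tmc/langchaingo/llms"
)

// SliceHistory is a toy backend intended to satisfy schema.ChatMessageHistory.
// Replace the slice with file, SQL, or Redis reads and writes as needed.
type SliceHistory struct {
	mu       sync.Mutex
	messages []llms.ChatMessage
}

func (h *SliceHistory) AddMessage(ctx context.Context, m llms.ChatMessage) error {
	h.mu.Lock()
	defer h.mu.Unlock()
	h.messages = append(h.messages, m)
	return nil
}

func (h *SliceHistory) AddUserMessage(ctx context.Context, text string) error {
	return h.AddMessage(ctx, llms.HumanChatMessage{Content: text})
}

func (h *SliceHistory) AddAIMessage(ctx context.Context, text string) error {
	return h.AddMessage(ctx, llms.AIChatMessage{Content: text})
}

func (h *SliceHistory) Clear(ctx context.Context) error {
	h.mu.Lock()
	defer h.mu.Unlock()
	h.messages = nil
	return nil
}

func (h *SliceHistory) Messages(ctx context.Context) ([]llms.ChatMessage, error) {
	h.mu.Lock()
	defer h.mu.Unlock()
	return append([]llms.ChatMessage(nil), h.messages...), nil
}

func (h *SliceHistory) SetMessages(ctx context.Context, msgs []llms.ChatMessage) error {
	h.mu.Lock()
	defer h.mu.Unlock()
	h.messages = append([]llms.ChatMessage(nil), msgs...)
	return nil
}

// Wiring it into a buffer (option name assumed; verify against your version):
// mem := memory.NewConversationBuffer(memory.WithChatHistory(&SliceHistory{}))
```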
Memory Integration Patterns
With Chains
Chains automatically handle memory integration:
```go
chain := chains.NewConversation(llm, mem)
```
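A fuller, runnable sketch, assuming the OpenAI backend and an OPENAI_API_KEY in the environment:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/memory"
)

func main() {
	ctx := context.Background()

	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	// The chain reads and writes the buffer automatically on every call.
	chain := chains.NewConversation(llm, memory.NewConversationBuffer())

	out, err := chains.Run(ctx, chain, "Hi, my name is Jim.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)

	// The second call sees the first exchange through memory.
	out, err = chains.Run(ctx, chain, "What is my name?")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```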
With Agents
Agents use memory to maintain context across tool calls:
```go
agent := agents.NewConversationalAgent(llm, tools, agents.WithMemory(mem))
```
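One way to wire this up end to end is through agents.Initialize, which returns an executor that can be run like a chain. Constructor details vary between langchaingo versions, so treat this as a sketch; it reuses the llm and ctx from the previous example and assumes the agents, tools, and chains packages are imported:

```go
mem := memory.NewConversationBuffer()

executor, err := agents.Initialize(
	llm,
	[]tools.Tool{tools.Calculator{}},
	agents.ConversationalReactDescription,
	agents.WithMemory(mem),
)
if err != nil {
	log.Fatal(err)
}

// Memory carries context from one question to the next.
answer, err := chains.Run(ctx, executor, "What is 3 to the power of 5?")
if err != nil {
	log.Fatal(err)
}
fmt.Println(answer)
```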
Manual Memory Management
For custom applications, you can manage memory directly. In the snippets below, mem is a buffer created with memory.NewConversationBuffer():
```go
// Record the user's message and the model's reply (errors ignored for brevity).
mem.ChatHistory.AddUserMessage(ctx, userInput)
mem.ChatHistory.AddAIMessage(ctx, aiResponse)

// Retrieve the full conversation history.
messages, err := mem.ChatHistory.Messages(ctx)
```
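Memory types also expose the higher-level methods that chains call under the hood. A hedged sketch, assuming the default memory key is "history":

```go
// Load whatever the memory would inject into a prompt.
vars, err := mem.LoadMemoryVariables(ctx, map[string]any{})
if err != nil {
	log.Fatal(err)
}
fmt.Println(vars["history"])

// Record a full exchange in one call.
if err := mem.SaveContext(ctx,
	map[string]any{"input": userInput},
	map[string]any{"output": aiResponse},
); err != nil {
	log.Fatal(err)
}
```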
Best Practices
- Choose Appropriate Memory Type: Select based on conversation length and context requirements
- Monitor Memory Usage: Track memory growth and implement cleanup strategies
- Handle Errors Gracefully: Implement fallback behavior when memory operations fail
- Consider Privacy: Be mindful of sensitive data in conversation history
- Test Memory Behavior: Verify memory works correctly across conversation flows