🧠 LLM Integration
Overview
The Somnia Agent Kit can delegate reasoning to a large language model through pluggable adapters. This page covers the built-in adapters for OpenAI, Ollama (local models), and DeepSeek, along with advanced usage patterns and best practices.
Supported Providers
- OpenAI (hosted, e.g. GPT-4)
- Ollama (local models)
- DeepSeek
OpenAI Adapter
Setup
import { OpenAIAdapter } from 'somnia-agent-kit';
const llm = new OpenAIAdapter({
apiKey: process.env.OPENAI_API_KEY,
defaultModel: 'gpt-4',
});
Generate Response
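The SDK's exact generation API isn't shown on this page, so the sketch below assumes the adapter exposes an async `generate(prompt)` method that returns the model's reply as a string; a local stub stands in for `OpenAIAdapter` so the example runs without the package or an API key.

```typescript
// Hypothetical sketch: the real OpenAIAdapter from 'somnia-agent-kit' is
// assumed to expose generate(prompt) returning a Promise<string>.
// A local stub stands in so the example runs offline.
interface LLMAdapter {
  generate(prompt: string): Promise<string>;
}

// Stub in place of: new OpenAIAdapter({ apiKey: ..., defaultModel: 'gpt-4' })
const llm: LLMAdapter = {
  async generate(prompt: string): Promise<string> {
    return `stub reply to: ${prompt}`;
  },
};

async function main(): Promise<void> {
  const reply = await llm.generate('What is the Somnia network?');
  console.log(reply);
}

main();
```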
With Options
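Per-call options are a common adapter pattern; the option names below (`model`, `temperature`, `maxTokens`) are assumptions about the SDK's API, demonstrated against a local stub so the example runs anywhere.

```typescript
// Hypothetical sketch: per-call generation options. The option names are
// assumptions about the adapter's API, not confirmed signatures.
interface GenerateOptions {
  model?: string;       // override the adapter's defaultModel for this call
  temperature?: number; // 0 = deterministic, higher = more varied output
  maxTokens?: number;   // cap on the length of the reply
}

interface LLMAdapter {
  generate(prompt: string, options?: GenerateOptions): Promise<string>;
}

const llm: LLMAdapter = {
  async generate(prompt: string, options: GenerateOptions = {}): Promise<string> {
    // The stub only echoes the chosen model; a real adapter forwards
    // temperature and maxTokens to the provider.
    const model = options.model ?? 'gpt-4';
    return `[${model}] stub reply to: ${prompt}`;
  },
};

async function main(): Promise<void> {
  const reply = await llm.generate('Summarize this transaction.', {
    temperature: 0.2, // low temperature for factual summaries
    maxTokens: 200,
  });
  console.log(reply);
}

main();
```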
Ollama Adapter (Local AI)
Setup
Use in Code
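As a sketch of using a local model in code: the adapter name `OllamaAdapter` and its options are assumptions, though Ollama's default local server address (`http://localhost:11434`) is its documented default. A stub keeps the example runnable without Ollama installed.

```typescript
// Hypothetical sketch of using a locally served Ollama model.
// Stub in place of something like:
//   new OllamaAdapter({ baseUrl: 'http://localhost:11434', defaultModel: 'llama3' })
interface LLMAdapter {
  generate(prompt: string): Promise<string>;
}

function makeOllamaStub(model: string): LLMAdapter {
  return {
    async generate(prompt: string): Promise<string> {
      return `[${model}] local stub reply to: ${prompt}`;
    },
  };
}

async function main(): Promise<void> {
  const llm = makeOllamaStub('llama3');
  console.log(await llm.generate('Hello from a local model'));
}

main();
```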
Available Models
DeepSeek Adapter
Setup
Generate Response
Complete Example with Agent
Advanced Usage
Streaming Responses
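Streaming lets the agent surface tokens as they arrive instead of waiting for the full reply. The method name `generateStream` and its async-iterator shape are assumptions; the stub yields a canned reply in chunks the way a real stream would.

```typescript
// Hypothetical sketch: streaming via an async iterator. generateStream is an
// assumed method name, not a confirmed SDK signature.
interface StreamingAdapter {
  generateStream(prompt: string): AsyncIterable<string>;
}

const llm: StreamingAdapter = {
  async *generateStream(prompt: string): AsyncIterable<string> {
    // The stub yields a canned reply word by word, as a real stream would.
    for (const word of ['Streaming', 'reply', 'to:', prompt]) {
      yield word + ' ';
    }
  },
};

async function main(): Promise<void> {
  let full = '';
  for await (const chunk of llm.generateStream('hello')) {
    process.stdout.write(chunk); // render tokens as they arrive
    full += chunk;
  }
  console.log('\nfull reply:', full.trim());
}

main();
```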
Chat History
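Multi-turn conversation is typically modeled as an array of role-tagged messages, the shape used by most chat-completion APIs. The `chat(messages)` method below is an assumed name demonstrated against a stub.

```typescript
// Hypothetical sketch: multi-turn chat with role-tagged history.
// chat(messages) is an assumed method name, not a confirmed SDK signature.
type Role = 'system' | 'user' | 'assistant';

interface ChatMessage {
  role: Role;
  content: string;
}

interface ChatAdapter {
  chat(messages: ChatMessage[]): Promise<string>;
}

const llm: ChatAdapter = {
  async chat(messages: ChatMessage[]): Promise<string> {
    // The stub answers the most recent message; a real model reads it all.
    const last = messages[messages.length - 1];
    return `stub reply to: ${last.content}`;
  },
};

async function main(): Promise<void> {
  const history: ChatMessage[] = [
    { role: 'system', content: 'You are a helpful blockchain assistant.' },
    { role: 'user', content: 'What is an agent?' },
    { role: 'assistant', content: 'An agent is an autonomous program.' },
    { role: 'user', content: 'Give me an example.' }, // follow-up relies on context
  ];
  const reply = await llm.chat(history);
  history.push({ role: 'assistant', content: reply }); // keep history growing
  console.log(reply);
}

main();
```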
Custom System Prompt
Best Practices
1. Use Local LLMs for Development
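One way to follow this tip is to pick the adapter from the environment: local and free in development, hosted in production. The factory below is a sketch with stubs in place of the real adapter constructors.

```typescript
// Hypothetical sketch: choose a local model in development and a hosted
// provider in production, keyed off NODE_ENV. Stubs stand in for the real
// OllamaAdapter / OpenAIAdapter instances.
interface LLMAdapter {
  generate(prompt: string): Promise<string>;
}

function makeStub(label: string): LLMAdapter {
  return {
    async generate(prompt: string): Promise<string> {
      return `[${label}] ${prompt}`;
    },
  };
}

function createLLM(env: string): LLMAdapter {
  return env === 'production'
    ? makeStub('openai')  // hosted: higher quality, costs per token
    : makeStub('ollama'); // local: free, no API key, fast iteration
}

const llm = createLLM(process.env.NODE_ENV ?? 'development');
llm.generate('hello').then(console.log);
```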
2. Handle Errors
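LLM calls fail for transient reasons (rate limits, timeouts), so wrapping `generate()` in a retry loop is a reasonable default. The flaky adapter below is a stub that simulates failures; the retry helper is a sketch, not an SDK feature.

```typescript
// Hypothetical sketch: try/catch with simple retries around generate().
interface LLMAdapter {
  generate(prompt: string): Promise<string>;
}

// Stub that fails a set number of times before succeeding, to exercise retries.
function makeFlakyStub(failures: number): LLMAdapter {
  let remaining = failures;
  return {
    async generate(prompt: string): Promise<string> {
      if (remaining-- > 0) throw new Error('rate limited'); // simulated failure
      return `stub reply to: ${prompt}`;
    },
  };
}

async function generateWithRetry(
  llm: LLMAdapter,
  prompt: string,
  maxAttempts = 3,
): Promise<string> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await llm.generate(prompt);
    } catch (err) {
      lastError = err; // log, then fall through to the next attempt
      console.error(`attempt ${attempt} failed:`, err);
    }
  }
  throw lastError; // all attempts exhausted
}

async function main(): Promise<void> {
  const llm = makeFlakyStub(2); // fails twice, then succeeds
  console.log(await generateWithRetry(llm, 'hello'));
}

main();
```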
3. Set Appropriate Temperature
4. Limit Token Usage
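Beyond setting a `maxTokens` cap on replies, you can guard the input side by trimming over-long prompts before sending them. The sketch below uses the common rough estimate of about 4 characters per token for English text; it is not an exact tokenizer.

```typescript
// Hypothetical sketch: rough client-side token budgeting. The 4 chars/token
// ratio is a common approximation for English, not a real tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function truncateToTokenBudget(text: string, maxTokens: number): string {
  const maxChars = maxTokens * 4;
  return text.length <= maxChars ? text : text.slice(0, maxChars);
}

const longInput = 'somnia '.repeat(1000); // 7000 characters
const trimmed = truncateToTokenBudget(longInput, 500);
console.log(estimateTokens(trimmed)); // → 500, within the budget
```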
Example: AI-Powered Task Execution
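As a sketch of the overall pattern: the agent asks the LLM to choose one of its registered actions for a task, then executes the choice. The action names and adapter behavior here are illustrative stubs, not the kit's actual API.

```typescript
// Hypothetical sketch: an agent that lets the LLM pick which action to run.
// All names here are illustrative; stubs keep the example self-contained.
interface LLMAdapter {
  generate(prompt: string): Promise<string>;
}

const llm: LLMAdapter = {
  async generate(_prompt: string): Promise<string> {
    // The stub always picks 'getBalance'; a real model would read the prompt.
    return 'getBalance';
  },
};

// Registry of actions the agent can perform (stubbed results).
const actions: Record<string, () => Promise<string>> = {
  getBalance: async () => 'balance: 42',
  sendTransaction: async () => 'tx sent',
};

async function runTask(task: string): Promise<string> {
  const choice = (await llm.generate(
    `Pick one action (${Object.keys(actions).join(', ')}) for: ${task}`,
  )).trim();
  const action = actions[choice];
  if (!action) throw new Error(`model chose unknown action: ${choice}`);
  return action(); // execute the chosen action
}

runTask('check my wallet balance').then(console.log);
```

Validating the model's choice against the registry (rather than executing it blindly) is the important design point: the LLM proposes, the agent's code decides.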
See Also