🧠 LLM Integration

Guide for integrating Large Language Models with your AI agents.

Overview

The SDK supports multiple LLM providers, including OpenAI, Ollama (for local models), and DeepSeek, all behind a common adapter interface.

Supported Providers

  • OpenAI - GPT-3.5, GPT-4, etc.

  • Ollama - Local LLMs (Llama, Mistral, etc.)

  • DeepSeek - DeepSeek models

OpenAI Adapter

Setup

import { OpenAIAdapter } from 'somnia-agent-kit';

const llm = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY,
  defaultModel: 'gpt-4',
});

Generate Response
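
Once the adapter is constructed, a completion call looks roughly like the sketch below. The method name `generateText` and its return shape are assumptions, not confirmed SDK API — check the package's type definitions for the exact signature.

```typescript
// Hypothetical call — method name and options object are assumptions.
const response = await llm.generateText({
  prompt: 'Summarize the latest block activity in one sentence.',
});

console.log(response); // assuming the adapter resolves to plain text
```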

With Options
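
Most chat APIs accept per-call overrides for model, temperature, and response length. A hedged sketch, assuming the same `generateText` method and these option names:

```typescript
const response = await llm.generateText({
  prompt: 'Explain gas fees to a beginner.',
  model: 'gpt-3.5-turbo', // override defaultModel for this call
  temperature: 0.3,       // lower = more deterministic
  maxTokens: 200,         // cap the length of the completion
});
```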

Ollama Adapter (Local AI)

Setup
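
Before the adapter can connect, Ollama itself must be installed and a model pulled. These are the standard Ollama CLI commands:

```shell
# Install Ollama (macOS/Linux), pull a model, and start the server.
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3
ollama serve   # listens on http://localhost:11434 by default
```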

Use in Code
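
Assuming an `OllamaAdapter` shaped like the OpenAI one above (the class name and option names are assumptions — verify against the SDK's exports):

```typescript
import { OllamaAdapter } from 'somnia-agent-kit';

// Class and option names are assumptions modeled on OpenAIAdapter.
const llm = new OllamaAdapter({
  baseUrl: 'http://localhost:11434', // Ollama's default endpoint
  defaultModel: 'llama3',
});
```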

Available Models
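
The Ollama CLI lists what is installed locally and can fetch additional models from the library (the Llama and Mistral families are common choices):

```shell
ollama list            # show models already pulled
ollama pull mistral    # fetch another model, e.g. Mistral 7B
```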

DeepSeek Adapter

Setup
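
DeepSeek exposes an OpenAI-compatible API, so a plausible setup mirrors the OpenAI adapter. The `DeepSeekAdapter` class name is an assumption; `deepseek-chat` is DeepSeek's general-purpose chat model.

```typescript
import { DeepSeekAdapter } from 'somnia-agent-kit';

// Class name is an assumption modeled on OpenAIAdapter.
const llm = new DeepSeekAdapter({
  apiKey: process.env.DEEPSEEK_API_KEY,
  defaultModel: 'deepseek-chat',
});
```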

Generate Response

Complete Example with Agent
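
A minimal sketch of wiring an adapter into an agent. The `Agent` class, its constructor options, and the `run` method are assumptions for illustration only — consult the SDK's agent documentation for the real API.

```typescript
import { Agent, OpenAIAdapter } from 'somnia-agent-kit';

const llm = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY,
  defaultModel: 'gpt-4',
});

// `Agent`, its options, and `run` are assumptions for illustration.
const agent = new Agent({
  name: 'helper',
  llm,
});

const answer = await agent.run('What can you do?');
console.log(answer);
```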

Advanced Usage

Streaming Responses
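
Whether the adapters expose streaming is not confirmed here, but the usual Node pattern is an `AsyncIterable` of text chunks. A self-contained sketch with a stubbed stream in place of a real adapter method:

```typescript
// Generic streaming pattern: consume an AsyncIterable of text chunks.
// `streamText` is a stand-in for an adapter streaming method (assumed API).
async function* streamText(_prompt: string): AsyncGenerator<string> {
  // Stub: a real adapter would yield tokens as the model produces them.
  for (const chunk of ['Hello', ', ', 'world']) yield chunk;
}

async function collect(prompt: string): Promise<string> {
  let full = '';
  for await (const chunk of streamText(prompt)) {
    process.stdout.write(chunk); // render incrementally as chunks arrive
    full += chunk;
  }
  return full;
}
```

Streaming keeps the UI responsive during long generations; the accumulated string can still be logged or stored once the stream ends.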

Chat History
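
Whether the SDK tracks conversation state itself is not confirmed; a common approach is to keep a message array yourself and trim it so the context sent to the model stays bounded. A self-contained helper:

```typescript
type Role = 'system' | 'user' | 'assistant';

interface ChatMessage {
  role: Role;
  content: string;
}

// Keep all system messages plus only the most recent `maxMessages`
// user/assistant turns, so old exchanges fall out of the context window.
function trimHistory(history: ChatMessage[], maxMessages: number): ChatMessage[] {
  const system = history.filter((m) => m.role === 'system');
  const rest = history.filter((m) => m.role !== 'system');
  return [...system, ...rest.slice(-maxMessages)];
}
```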

Custom System Prompt
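
A system message pins the agent's persona and constraints before any user input. The `chat` method and `messages` option below are assumptions; most chat-style APIs accept a system-role message first.

```typescript
// Method and option names are assumptions — verify against the SDK types.
const response = await llm.chat({
  messages: [
    {
      role: 'system',
      content: 'You are a concise blockchain assistant. Answer in two sentences or fewer.',
    },
    { role: 'user', content: 'What is a nonce?' },
  ],
});
```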

Best Practices

1. Use Local LLMs for Development
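
Local models cost nothing per call and keep prompts off third-party servers while you iterate. One way to express this is an environment-based switch (adapter class names are assumptions; the pattern is what matters):

```typescript
import { OllamaAdapter, OpenAIAdapter } from 'somnia-agent-kit';

// Free local inference in development, hosted models in production.
const llm =
  process.env.NODE_ENV === 'production'
    ? new OpenAIAdapter({ apiKey: process.env.OPENAI_API_KEY, defaultModel: 'gpt-4' })
    : new OllamaAdapter({ baseUrl: 'http://localhost:11434', defaultModel: 'llama3' });
```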

2. Handle Errors
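
LLM calls fail transiently (timeouts, rate limits), so wrap them in retry logic with exponential backoff. This helper is self-contained and generic over any async function, so it works with whatever method the adapter actually exposes:

```typescript
// Retry an async call up to `attempts` times, doubling the delay each try.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off before the next attempt: 500ms, 1s, 2s, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

Usage would look like `await withRetry(() => llm.generateText({ prompt }))`, with the hypothetical `generateText` call swapped for the SDK's real method.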

3. Set Appropriate Temperature

4. Limit Token Usage
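
Capping tokens controls both cost and latency. Besides setting a `maxTokens`-style option on the call, it helps to bound the prompt itself before sending. A self-contained sketch using the common ~4-characters-per-token heuristic for English text (use a real tokenizer for billing-accurate counts):

```typescript
// Crude heuristic: roughly 4 characters per token for English text.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Trim a prompt to approximately `maxTokens`, keeping the beginning.
function truncateToTokens(text: string, maxTokens: number): string {
  const maxChars = maxTokens * 4;
  return text.length <= maxChars ? text : text.slice(0, maxChars);
}
```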

Example: AI-Powered Task Execution
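
Putting the pieces together: an agent that turns a natural-language task into a plan before acting on it. This whole flow is an illustrative sketch — `generateText` and the dispatch step are assumptions, not confirmed SDK API.

```typescript
import { OpenAIAdapter } from 'somnia-agent-kit';

const llm = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY,
  defaultModel: 'gpt-4',
});

// Illustrative flow only — `generateText` is an assumed method name.
async function executeTask(task: string): Promise<string> {
  const plan = await llm.generateText({
    prompt: `You are a task planner. Produce a short, numbered plan for: ${task}`,
    temperature: 0.2, // low temperature for a predictable, parseable plan
  });
  // In a real agent, each plan step would be dispatched to a tool
  // or on-chain action rather than returned directly.
  return plan;
}

console.log(await executeTask('Check my wallet balance and report it.'));
```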

See Also
