Examples
Ready-to-use code examples for common use cases with Super Agent Stack.
Getting Started
Memory Examples
Examples showcasing the memory system for building context-aware AI applications.
Advanced Features
Examples showcasing the new OpenRouter features available through Super Agent Stack.
Structured Outputs
Get type-safe JSON responses with schema validation. Perfect for data extraction.
Web Search
Real-time web search with citations. Get current information with sources.
Model Routing
Fallback models for high availability. Never miss a request.
Tool Calling
Enable AI to call functions and interact with external systems; a tool-calling sketch is included at the end of the quick snippets below.
Quick Code Snippets
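All of the snippets below assume a client that speaks the OpenAI-compatible chat completions API. Here is a minimal setup sketch using the standard OpenAI Node SDK; the base URL and environment variable name are placeholders, so substitute the values for your Super Agent Stack deployment (see Getting Started).
import OpenAI from 'openai';

// Placeholder endpoint and key name -- replace with the values for your deployment.
const client = new OpenAI({
  baseURL: 'https://your-super-agent-stack-endpoint/v1',
  apiKey: process.env.SUPER_AGENT_STACK_API_KEY
});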
Basic Request
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }]
});
With Session Memory
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Remember my name is John' }],
  sessionId: crypto.randomUUID(), // UUID for session tracking
  saveToMemory: true, // Save to memory
  autoMemory: true // Enable "remember" commands
});
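To keep a conversation going across requests, generate the sessionId once and pass the same value each time. A short sketch, assuming session memory is looked up by that ID:
// Reuse one sessionId so later requests can recall earlier turns.
const sessionId = crypto.randomUUID();

const first = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Remember my name is John' }],
  sessionId,
  saveToMemory: true,
  autoMemory: true
});

const second = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'What is my name?' }],
  sessionId, // same ID as the first request
  saveToMemory: true
});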
With User Memory
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'What do you know about me?' }],
  sessionId: crypto.randomUUID(),
  useUserMemory: true, // Include cross-session memory
  userMemoryTokens: 500 // Max tokens for user memory
});
With Hybrid Search
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Find docs about authentication' }],
  useRAG: true,
  searchMode: 'hybrid', // 'vector' | 'keyword' | 'hybrid'
  keywordWeight: 0.3 // 30% keyword, 70% semantic
});
With Web Search
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o:online', // Add :online suffix
  messages: [{ role: 'user', content: 'Latest AI news' }]
});
With Structured Output
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Extract: John, 25, NYC' }],
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'person',
      schema: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          age: { type: 'number' },
          city: { type: 'string' }
        }
      }
    }
  }
});
With Model Fallbacks
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
  models: ['anthropic/claude-3.5-sonnet', 'google/gemini-2.0-flash']
});
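With Tool Calling
A minimal sketch for the Tool Calling example described above, using the standard OpenAI-style tools parameter; the get_weather tool is hypothetical, so swap in your own function definitions and handle the returned tool calls in your application.
const response = await client.chat.completions.create({
  model: 'openai/gpt-4o',
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather', // hypothetical tool name
        description: 'Get the current weather for a city',
        parameters: {
          type: 'object',
          properties: {
            city: { type: 'string' }
          },
          required: ['city']
        }
      }
    }
  ]
});

// If the model decides to use the tool, the call arrives here instead of plain text.
const toolCalls = response.choices[0].message.tool_calls;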