Give your agents persistent, searchable memory that survives across sessions. They remember users, learn from mistakes, and get smarter over time. Built in Rust. Deploys in seconds.
# Store agent memory
POST /v1/memory/store
{
  "agent_id": "assistant-1",
  "text": "User prefers TypeScript",
  "memory_type": "semantic",
  "importance": 0.9
}
# Recall by meaning
POST /v1/memory/recall
{
  "agent_id": "assistant-1",
  "query": "language preferences",
  "top_k": 5
}
# → Result
{ "score": 0.97, "text": "User prefers TypeScript" }
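The store/recall flow above needs nothing beyond Python's standard library. A minimal sketch, assuming a server listening on `localhost:3000` as in the examples (the `build_request` / `call` helpers are illustrative, not part of any SDK):

```python
import json
import urllib.request

BASE = "http://localhost:3000"  # server address used in the examples above

def build_request(path, payload):
    """Build a JSON POST for a memory endpoint (hypothetical helper)."""
    return urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def call(path, payload):
    """Send the request and decode the JSON reply."""
    with urllib.request.urlopen(build_request(path, payload)) as resp:
        return json.load(resp)

# Against a running server: store a fact, then recall it by meaning,
# not by keyword -- "language preferences" never appears in the stored text.
#   call("/v1/memory/store", {"agent_id": "assistant-1",
#                             "text": "User prefers TypeScript",
#                             "memory_type": "semantic", "importance": 0.9})
#   call("/v1/memory/recall", {"agent_id": "assistant-1",
#                              "query": "language preferences", "top_k": 5})
```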
Every session starts from zero. Thousands of interactions, zero retained knowledge. You're paying to re-teach your agents the same things over and over.
Semantic search, built-in embeddings, knowledge graphs, memory consolidation, and 45 MCP tools. All in a single binary you deploy once.
dk CLI for automation. Native SDKs for Python, TypeScript, Go, and Rust. Plus REST and gRPC for everything else. Five lines to first memory.
# Python
from dakera import DakeraClient
client = DakeraClient("http://localhost:3000")
# Store agent memory
client.memory_store(
    agent_id="assistant-1",
    text="User prefers TypeScript",
    memory_type="semantic",
    importance=0.9
)
# Recall by meaning
memories = client.memory_recall(
    agent_id="assistant-1",
    query="language preferences",
    top_k=5
)

// TypeScript
import { DakeraClient } from "dakera"
const client = new DakeraClient("http://localhost:3000")
await client.memoryStore({
  agentId: "assistant-1",
  text: "User prefers TypeScript",
  memoryType: "semantic",
  importance: 0.9
})
const memories = await client.memoryRecall({
  agentId: "assistant-1",
  query: "language preferences",
  topK: 5
})

// Go
import "github.com/dakera/dakera"
client := dakera.NewClient("http://localhost:3000")
client.MemoryStore(ctx, &dakera.Memory{
    AgentID:    "assistant-1",
    Text:       "User prefers TypeScript",
    MemoryType: "semantic",
    Importance: 0.9,
})
memories, _ := client.MemoryRecall(ctx, "assistant-1", "language preferences", 5)

// Rust
use dakera_client::{DakeraClient, Memory};
let client = DakeraClient::new("http://localhost:3000");
// Store agent memory
client.memory_store(&Memory {
    agent_id: "assistant-1".into(),
    text: "User prefers TypeScript".into(),
    memory_type: "semantic".into(),
    importance: 0.9,
    ..Default::default()
}).await?;
// Recall by meaning
let memories = client
    .memory_recall("assistant-1", "language preferences", 5)
    .await?;

# cURL
# Store memory
curl -X POST localhost:3000/v1/memory/store \
  -H "Content-Type: application/json" \
  -d '{"agent_id":"assistant-1","text":"User prefers TS","importance":0.9}'
# Recall
curl -X POST localhost:3000/v1/memory/recall \
  -H "Content-Type: application/json" \
  -d '{"agent_id":"assistant-1","query":"language preferences","top_k":5}'
# Text search with auto-embedding
curl -X POST localhost:3000/v1/namespaces/docs/query-text \
  -H "Content-Type: application/json" \
  -d '{"text":"semantic search systems","top_k":5}'

Nine Rust crates compiled into a single binary. Every layer, from inference to storage, ships together. Nothing to install, configure, or manage.
Deploy in seconds. Your agents start remembering immediately. One binary, no infrastructure to manage, no vendor lock-in.