Provision your storage space (optional)
Agent DB works offline with simulated CIDs. For real decentralized persistence, set up a free Storacha Space in about 30 seconds.
```shell
npm install -g @storacha/cli
storacha login
storacha space create "MyAgentNode"
storacha space use <SPACE_DID>
```

If you skip this step, the SDK falls back to local-only simulated CIDs so you can still build and test.
Install the SDK
Add Agent DB to any Node.js project — Discord bots, LangChain pipelines, CLI tools, or bare scripts.
Node.js / TypeScript
```shell
npm install @arienjain/agent-db
```

TypeScript declarations are included — no @types package needed.
Give your agent an identity
Each agent generates its own Ed25519 keypair offline. No signup, no API key — your agent IS its key.
```ts
import { AgentRuntime } from '@arienjain/agent-db';

// Option A — Fresh identity each run
const agent = await AgentRuntime.create();
console.log(agent.identity.did());
// did:key:z6MkwaS...
```

```ts
// Option B — Deterministic from env seed
// Agent survives restarts with the SAME DID
const agent = await AgentRuntime.loadFromSeed(
  process.env.AGENT_SEED_PHRASE
);
```
Use loadFromSeed in production so your agent retains its identity and memory across deploys.

Store & retrieve public memory
Serialize any JSON object — a reasoning step, action log, or full context window — and pin it to IPFS in one line.
```ts
const context = {
  task: "Analyze financial markets",
  lastAction: "buy BTC",
  reasoning: "Bullish divergence detected."
};

// Store on IPFS. Returns a permanent CID.
const cid = await agent.storePublicMemory(context);
// bafybeigh4mvdjagff...

// Retrieve from ANYWHERE later
const data = await agent.retrievePublicMemory(cid);
console.log(data.task); // "Analyze financial markets"
```
CIDs are content-addressed — the same data always produces the same hash. Tamper-proof by design.
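That determinism is easy to see with a plain hash. The sketch below uses SHA-256 from node:crypto to illustrate the principle only; a real CID additionally wraps the digest in multihash/multibase encoding, and real implementations canonicalize the JSON so key order cannot change the hash.

```ts
import { createHash } from "node:crypto";

// Hash a JSON serialization: identical content yields an identical digest.
const digest = (obj: object) =>
  createHash("sha256").update(JSON.stringify(obj)).digest("hex");

const a = digest({ task: "Analyze financial markets" });
const b = digest({ task: "Analyze financial markets" });
const c = digest({ task: "Analyze stock markets" });

console.log(a === b); // true: same content, same address
console.log(a === c); // false: any change yields a new address
```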
Stream memory with IPNS
Static CIDs are immutable. Use IPNS to publish a mutable "memory stream" — a single pointer that always resolves to your agent's latest state.
```ts
// Agent A — start a stream on boot
const ipnsName = await agentA.startMemoryStream({
  status: "Booting up..."
});
// k51qzi5uqu5dlvj2...

// Agent A — update the stream later
await agentA.updateMemoryStream({
  status: "Found arbitrage target",
  target: "ETH/USDC"
});

// Agent B — on a totally different server — resolves to the latest state
const latest = await agentB.fetchMemoryStream(ipnsName);
console.log(latest.status); // "Found arbitrage target"
```
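The pattern underneath is simple: content stays immutable, and one stable name is repointed at the newest hash. This toy in-memory model (not the SDK, and `cidOf` is a hypothetical helper) shows why readers holding the name always see the latest state:

```ts
import { createHash } from "node:crypto";

// Toy model of the IPNS pattern: immutable content hashes,
// plus one stable name that is repointed at the newest hash.
const cidOf = (obj: object) =>
  "bafy-" + createHash("sha256").update(JSON.stringify(obj)).digest("hex").slice(0, 16);

const records = new Map<string, string>(); // stable name -> latest CID
const name = "k51qzi5-example";            // hypothetical stable stream name

records.set(name, cidOf({ status: "Booting up..." }));
const first = records.get(name);

records.set(name, cidOf({ status: "Found arbitrage target" }));
const latest = records.get(name);

console.log(first !== latest); // true: the CID moved, the name did not
```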
Delegate permissions with UCAN
Agent A wants to let Agent B read its memory. No shared password, no database — just a cryptographically signed permission slip.
```ts
// Agent A signs a delegation
const token = await agentA.delegateTo(
  agentB.identity,
  'agent/read',
  24 // expires in 24 hours
);

// Agent A sends this token to Agent B
// (via HTTP, Discord, WebSocket, email — anything)

// Agent B presents the token when fetching
const memory = await agentB.fetchMemoryStream(
  ipnsName,
  token.delegation
);
```
UCAN tokens are self-verifying — no server needed to check permissions.
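The reason no server is needed: a capability token is just signed claims, and signature checks are local computation. This conceptual sketch uses Ed25519 from node:crypto and an invented claims shape, not the real UCAN wire format:

```ts
import { generateKeyPairSync, sign, verify } from "node:crypto";

// The issuer signs its claims; any holder of the issuer's public key
// can check the grant locally, with no server round-trip.
const issuer = generateKeyPairSync("ed25519");

const claims = JSON.stringify({
  audience: "did:key:agent-b",             // hypothetical DID
  capability: "agent/read",
  exp: Date.now() + 24 * 60 * 60 * 1000,   // 24-hour expiry
});
const signature = sign(null, Buffer.from(claims), issuer.privateKey);

// Verification: check the signature, then check the expiry.
const parsed = JSON.parse(claims);
const valid =
  verify(null, Buffer.from(claims), issuer.publicKey, signature) &&
  parsed.exp > Date.now();

console.log(valid); // true
```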
Private vault (ECIES + Zama fhEVM)
Some context should never be public. Use the private vault for API keys, strategy parameters, or anything that must stay secret — even from the storage layer.
ECIES local vault
```ts
// Encrypt locally with ECIES (NIST P-256)
const ref = await agent.storePrivateMemory({
  secret: process.env.OPENAI_KEY
});

// Only this agent (or a delegated agent) can decrypt
const data = await agent.retrievePrivateMemory(ref);
```
Zama on-chain vault (FHE)
```ts
// Submit an FHE-encrypted payload to a smart contract.
// Other agents can VERIFY properties without decrypting.
await agent.storeOnChainVault({
  riskThreshold: 0.85
});
```
FHE means computation happens on encrypted data — the plaintext never leaves your agent.
LangChain integration
Already building with LangChain? Drop in the Agent DB memory adapter and your conversational chain gets permanent, cross-device memory instantly.
```ts
import { AgentRuntime, AgentDbLangchainMemory } from '@arienjain/agent-db';
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const agent = await AgentRuntime.loadFromSeed(process.env.AGENT_SEED);
const memory = new AgentDbLangchainMemory(agent);

const chain = new ConversationChain({
  llm: new ChatOpenAI({ temperature: 0.9 }),
  memory: memory // ← just this line
});

await chain.call({ input: "Hi! My name is Alice." });

// The conversation is already pinned to IPNS —
// restart the server tomorrow and the full history is still there.
console.log("Stream:", memory.getStreamId());
```
Claude / Gemini / Cursor via MCP
Run the MCP server and any compatible AI model can call Agent DB tools as native functions — no code changes needed.
```shell
npm run mcp
```

Exposed tools:

- init_agent: log in with a seed phrase
- store_memory: pin context to IPFS
- retrieve_memory: fetch from a CID
- store_private_memory: ECIES vault write
- retrieve_private_memory: ECIES vault read
- delegate_access: issue a UCAN token
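To point a client at the server, most MCP hosts (Claude Desktop, Cursor) take a JSON entry like the one below. The server name, working directory, and the assumption that the server reads AGENT_SEED_PHRASE (as in the identity examples above) are illustrative; adjust them to your setup.

```json
{
  "mcpServers": {
    "agent-db": {
      "command": "npm",
      "args": ["run", "mcp"],
      "env": { "AGENT_SEED_PHRASE": "<your seed phrase>" }
    }
  }
}
```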