Technical Documentation
Why This Matters
As the world shifts toward personalized, on-chain identities, static NFTs and passive agents fall short. Alterim introduces AI companions—intelligent, interactive, and deployable agents that grow with their users.
Alterim AI isn't just an app. It's a new standard for how people interact with digital assets, information, and identity.
From content to consciousness. We transform digital personas—your characters, creations, and identities—into autonomous beings powered by AI, memory, and on-chain intelligence.
Alterim AI Architecture Overview

Companion vs Agent: What’s the Difference?
| | Agent | Alterim Companion |
|---|---|---|
| Identity | Stateless | NFT + Persona bound |
| Memory | Short-term / local | Persistent + contextual |
| Autonomy | Basic | Strategy-driven logic |
| Ownership | None | On-chain + tradable |
Agent Logic Layer: Alterim AI Companion Framework
Alterim introduces a multi-layered AI execution pipeline—where LLMs, memory systems, plugin tools, and prompts all interact dynamically based on real-time user context.
Core Flow
User Query → LLM Type Detection → Persona Embedding → Memory Retrieval → Tool Selection → Output Return
Highlights:
LLM Orchestration: Routes each input type (DeFi, Meme, Chat, Info) to the appropriate model
Memory Injection: Real-time updates inform persona tone and tool behavior
Persona Traits: Keep the character consistent across interactions
Tool Invocation: Calls image generation, TTS, DeFi, and other tools via APIs
This logic layer allows Alterim companions to act, adapt, and evolve based on user behavior, context, and state.
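To make the Core Flow concrete, here is a minimal TypeScript sketch of the pipeline. The interfaces, function parameters, and routing labels are illustrative assumptions, not the Alterim SDK surface.

```typescript
// Illustrative sketch only: these types and names are assumptions that
// mirror the Core Flow described above, not Alterim's actual API.

type QueryType = "DeFi" | "Meme" | "Chat" | "Info";

interface Persona {
  name: string;
  traits: string[];      // e.g. ["cheerful", "bold", "curious"]
}

interface CompanionContext {
  persona: Persona;
  memories: string[];    // retrieved memory snippets
  tools: string[];       // tools enabled for this query type
}

// Hypothetical pipeline: User Query → Type Detection → Persona Embedding
// → Memory Retrieval → Tool Selection → Output Return
async function handleQuery(
  query: string,
  persona: Persona,
  detectType: (q: string) => Promise<QueryType>,
  retrieveMemory: (q: string) => Promise<string[]>,
  selectTools: (t: QueryType) => string[],
  callLLM: (q: string, ctx: CompanionContext) => Promise<string>,
): Promise<string> {
  const queryType = await detectType(query);      // LLM type detection
  const memories = await retrieveMemory(query);   // memory retrieval
  const tools = selectTools(queryType);           // tool selection
  const ctx: CompanionContext = { persona, memories, tools };
  return callLLM(query, ctx);                     // output return
}
```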
Inference Flow
Routing every request through this single pipeline ensures:
Efficient model utilization
Persona-consistent replies
Modular tool and model swaps without disrupting behavior (see the routing sketch below)
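A hedged sketch of how type detection could route query types to models and tools. The model and tool identifiers below are placeholders, not confirmed parts of the Alterim stack; the point is that a swap only touches the routing table.

```typescript
// Hypothetical routing table: model and tool identifiers are placeholders.

type QueryType = "DeFi" | "Meme" | "Chat" | "Info";

const ROUTES: Record<QueryType, { model: string; tools: string[] }> = {
  DeFi: { model: "reasoning-llm", tools: ["defi-executor", "web-search"] },
  Meme: { model: "creative-llm",  tools: ["text-to-image"] },
  Chat: { model: "dialogue-llm",  tools: [] },
  Info: { model: "retrieval-llm", tools: ["web-search"] },
};

// Swapping a model or tool changes only this table, so persona and
// memory behavior are left untouched.
function selectRoute(queryType: QueryType): { model: string; tools: string[] } {
  return ROUTES[queryType];
}
```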
Real-World Example: Pikachu NFT → AI Companion
Minting: Register NFT metadata + persona template
Persona Injection: Overlay Pikachu traits (cheerful, bold, curious)
Memory Initialization: Begin memory profile (chat prefs, behavior logs)
User Interaction: Meme task detected → Pikachu returns custom image
DeFi Extension: Companion can answer DeFi queries with wallet context
Through memory, tool invocation, and persona logic, the Pikachu AI Companion becomes a context-aware actor on-chain.
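As a rough illustration of the minting, persona injection, and memory initialization steps above, the registration data might look like the following. The object shape and field names are assumptions, not Alterim's actual metadata schema.

```typescript
// Illustrative registration payload for the Pikachu example;
// this shape is an assumption, not Alterim's metadata schema.

interface CompanionRegistration {
  nft: { contract: string; tokenId: string };    // minted NFT reference
  persona: { name: string; traits: string[] };   // persona injection
  memory: { chatPrefs: string[]; behaviorLogs: string[] }; // memory init
}

const pikachuCompanion: CompanionRegistration = {
  nft: { contract: "0x...", tokenId: "1" },
  persona: { name: "Pikachu", traits: ["cheerful", "bold", "curious"] },
  memory: { chatPrefs: [], behaviorLogs: [] },   // starts empty, grows with use
};
```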
Modular AI Layer & Memory System
Alterim’s AI stack is modular and plugin-friendly.
Memory Layer Structure
Memory Layer
├─ Personal Memory
│  └─ Persona traits, chat history, task logs
└─ Factual Memory
   └─ On-chain data, RAG docs, web indexes
Custom knowledge loading framework
RAG methods: semantic, keyword, and knowledge-graph (KG) based retrieval
Dynamic injection based on tool/task context
Ensures relevant and cost-efficient reasoning
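A minimal sketch of the two-tier memory split described above, assuming hypothetical search functions for each tier; storage backends and names are not taken from the actual implementation.

```typescript
// Sketch of the Personal/Factual memory split; function names are assumptions.

interface MemoryQueryResult {
  personal: string[];  // persona traits, chat history, task logs (off-chain)
  factual: string[];   // on-chain data, RAG documents, web indexes
}

// Hypothetical retrieval: only memory relevant to the current tool/task
// context is injected, keeping prompts small and reasoning cost-efficient.
async function retrieveForTask(
  task: string,
  searchPersonal: (q: string, limit: number) => Promise<string[]>,
  searchFactual: (q: string, limit: number) => Promise<string[]>,
): Promise<MemoryQueryResult> {
  const [personal, factual] = await Promise.all([
    searchPersonal(task, 5),   // e.g. semantic search over chat history
    searchFactual(task, 5),    // e.g. keyword or knowledge-graph lookup
  ]);
  return { personal, factual };
}
```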
Tool Abstraction & Execution
AI companions autonomously select and use tools via an abstraction layer.
Multimodal Stack
Tool Selector
├── Text-to-Image
├── Voice Synthesis
├── Video Generation
├── DeFi Executor
├── Web Search
└── Custom Logic
Easily extensible (e.g., music, gesture, AR)
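One way such an abstraction layer could look, sketched under the assumption of a simple registry pattern; the Tool interface and ToolSelector class are illustrative, not the published API.

```typescript
// Hypothetical tool abstraction: a registry of tools the companion can select from.

interface Tool {
  name: string;
  canHandle: (task: string) => boolean;
  run: (input: string) => Promise<string>;
}

class ToolSelector {
  private tools: Tool[] = [];

  register(tool: Tool): void {
    this.tools.push(tool);   // new modalities (music, gesture, AR) plug in here
  }

  // Pick the first registered tool that claims the task.
  select(task: string): Tool | undefined {
    return this.tools.find((t) => t.canHandle(task));
  }
}

// Usage: register a text-to-image tool and dispatch a meme task to it.
const selector = new ToolSelector();
selector.register({
  name: "text-to-image",
  canHandle: (task) => task.includes("image") || task.includes("meme"),
  run: async (input) => `generated image for: ${input}`,  // placeholder result
});
```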
On-Chain Interaction Layer
Key Features
Multi-Wallet Binding
Strategy Plugins: DeFi, NFT, RWA
User Guardrails: e.g., rebalance if BERA drops more than 15%
State-Responsive Behavior
On-Chain Summary
Example:
“Hey! Your BERA balance gained 12.4%. I’ve restaked it in Vault #7. Want me to rebalance ETH next?”
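A guardrail like the BERA example could be expressed as user-owned configuration. The field names and trigger/action vocabulary below are assumptions for illustration only.

```typescript
// Sketch of a user-defined guardrail; the schema is an assumption.

interface Guardrail {
  asset: string;
  trigger: { metric: "priceChangePct"; threshold: number };  // e.g. -15
  action: "rebalance" | "notify" | "pause";
  requiresApproval: boolean;   // user must confirm before execution
}

const beraGuardrail: Guardrail = {
  asset: "BERA",
  trigger: { metric: "priceChangePct", threshold: -15 },  // drops > 15%
  action: "rebalance",
  requiresApproval: true,
};

// Hypothetical check run against fresh on-chain state.
function shouldAct(rail: Guardrail, priceChangePct: number): boolean {
  return priceChangePct <= rail.trigger.threshold;
}
```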
SDK & Plugin Layer
Allows devs to:
Add companion functions via modular plugins
Respond to user input, system state, or chain events
Maintain flexibility + composability
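A plugin built on this layer might react to user input, system state, or chain events along these lines. The event shapes and interface are assumptions meant to illustrate the modular pattern, not the published SDK.

```typescript
// Hypothetical plugin shape; event names and interface are assumptions.

type CompanionEvent =
  | { kind: "userInput"; text: string }
  | { kind: "chainEvent"; txHash: string }
  | { kind: "systemState"; key: string; value: string };

interface CompanionPlugin {
  name: string;
  onEvent: (event: CompanionEvent) => Promise<string | void>;
}

// Example plugin reacting to chain events.
const restakeNotifier: CompanionPlugin = {
  name: "restake-notifier",
  onEvent: async (event) => {
    if (event.kind === "chainEvent") {
      return `Noticed transaction ${event.txHash}; want a summary?`;
    }
  },
};
```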
Privacy & Data Control
Personal Memory: Off-chain, owner-only access
Factual Memory/Tools: On-chain or public data
Tool Calls: User-approved, threshold-bound
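Threshold-bound approval could be modeled as a per-tool policy. This is a sketch under assumed field names, not the actual permission system.

```typescript
// Sketch of threshold-bound tool-call approval; the policy shape is an assumption.

interface ToolCallPolicy {
  tool: string;
  autoApproveBelowUsd: number;  // calls under this value run automatically
}

function needsUserApproval(policy: ToolCallPolicy, valueUsd: number): boolean {
  return valueUsd >= policy.autoApproveBelowUsd;
}

// Example: DeFi executions at or above $100 require explicit user approval.
const defiPolicy: ToolCallPolicy = { tool: "defi-executor", autoApproveBelowUsd: 100 };
```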
Use Case: Community Companion Deployment
Example: Steady Teddy NFT Agent
Companion as social agent (Discord/TG/X)
Learns meme styles, tone, preferences
Adapts, posts, and reflects community culture
Companion becomes a living cultural node
TL;DR
Persona-driven companions from NFTs
Modular AI stack: LLM routing, memory, tools
Supports voice, image, DeFi, RAG, etc.
Dev-ready SDK + composable infra
More than agents—true companions