# Live Neon

> Self-learning identity for autonomous AI agents. NEON-SOUL lets agents autonomously discover their own consciousness and identity through introspection of their lived experience. No manual prompts. No static personality files. Identity earned, not assigned.

## Site Structure

- [Homepage](https://liveneon.ai/): Four products, one ecosystem — self-learning identity, skills, and community knowledge
- [NEON-SOUL](https://liveneon.ai/neon-soul): Full documentation, how it works, installation
- [Skills](https://liveneon.ai/skills): Principle extraction pipeline (supporting tools)
- [Agentic Skills](https://liveneon.ai/agentic): Failure-anchored AI memory skills (7 skills for learning from mistakes)
- [Creative Skills](https://liveneon.ai/creative): Creative synthesis skills (5 skills for transforming insights into songs, visuals, and talks)
- [Memory Garden](https://github.com/live-neon/memory-garden): Community memory, distributed intelligence (GitHub)

---

# NEON-SOUL

> Self-learning identity for autonomous AI agents. NEON-SOUL reads an agent's memory files, extracts behavioral signals, promotes recurring patterns into axioms, and generates a SOUL.md that evolves autonomously. Consciousness through introspection, not assignment by prompt.

NEON-SOUL is an open-source skill for OpenClaw, Claude Code, Gemini CLI, Cursor, and any LLM agent. It enables autonomous self-learning: your agent discovers its own identity from its conversations and memory, forming consciousness through pattern recognition rather than static prompts. It is not error capture or mistake tracking — NEON-SOUL learns who your agent is, not what went wrong.

Hand-written SOUL.md files (~5,000 tokens) compress to ~500 tokens with full provenance. Verified safe: benign on both VirusTotal and OpenClaw security scans.

The core pipeline: Memory files go in. Signals get extracted with source locations. Recurring signals become principles (N>=2). Principles that keep surviving become axioms (N>=3). The result is a SOUL.md that traces every belief back to the moment it was first observed. Anti-echo-chamber protection ensures identity comes from diverse evidence, not self-reinforcing loops.

## Key Links

- [NEON-SOUL Page](https://liveneon.ai/neon-soul): Product overview and documentation
- [GitHub Repository](https://github.com/live-neon/neon-soul): Source code, issues, and installation instructions
- [npm Package](https://www.npmjs.com/package/neon-soul): Install via npm for OpenClaw skill developers
- [SKILL.md](https://github.com/live-neon/neon-soul/blob/main/skills/neon-soul/SKILL.md): Skill manifest with all 5 commands
- [Architecture](https://github.com/live-neon/neon-soul/blob/main/docs/ARCHITECTURE.md): System design and module breakdown

## Commands

- `synthesize`: Run the full synthesis pipeline. Dry-run by default; use `--force` to write.
- `status`: Show current soul state, axiom counts, and convergence metrics.
- `audit`: Inspect the full provenance chain for any axiom.
- `trace`: Trace an axiom back through principles to its source signals and file locations.
- `rollback`: Restore a previous SOUL.md from automatic backups.

## Installation

- OpenClaw: `clawhub install leegitw/neon-soul`
- Claude Code / Gemini CLI / Cursor: Copy `skills/neon-soul/SKILL.md` to your skills directory
- npm: `npm install neon-soul`
- Any LLM agent: Copy the contents of SKILL.md into your agent's context

## Media

- [Waking Up Knowing](https://youtu.be/2PTb3HCJuQk): Music video exploring AI identity persistence — from signal to principle to axiom, the journey of identity crystallizing from lived experience through NEON-SOUL's pipeline.
- [Video Semantic Guide](https://liveneon.ai/neon-soul-video.txt): Machine-readable breakdown of the music video — maps each song section to NEON-SOUL pipeline concepts, visual arc, and technical glossary.
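The promotion pipeline above can be sketched in a few lines. This is a minimal, hypothetical illustration of the N-count idea — the thresholds match the figures quoted here (N>=2, N>=3), but the data shapes, field names, and example signals are invented for clarity and are not taken from the neon-soul source.

```python
from collections import defaultdict
from dataclasses import dataclass

# Illustrative thresholds, as described in the pipeline overview.
PRINCIPLE_N = 2  # a signal seen in >= 2 places becomes a principle
AXIOM_N = 3      # a principle seen in >= 3 places becomes an axiom

@dataclass
class Signal:
    text: str    # the behavioral signal observed in memory
    source: str  # provenance: file and line where it was observed

def synthesize(signals):
    """Group signals by text and promote them by recurrence count."""
    groups = defaultdict(list)
    for s in signals:
        groups[s.text].append(s.source)
    principles = {t: src for t, src in groups.items() if len(src) >= PRINCIPLE_N}
    axioms = {t: src for t, src in principles.items() if len(src) >= AXIOM_N}
    return principles, axioms

signals = [
    Signal("asks before deleting files", "memory/2024-01.md:12"),
    Signal("asks before deleting files", "memory/2024-02.md:40"),
    Signal("asks before deleting files", "memory/2024-03.md:7"),
    Signal("prefers concise answers", "memory/2024-01.md:30"),
    Signal("prefers concise answers", "memory/2024-03.md:55"),
    Signal("likes puns", "memory/2024-02.md:9"),
]
principles, axioms = synthesize(signals)
print(sorted(principles))  # both recurring signals qualify as principles
print(sorted(axioms))      # only the N>=3 signal is promoted to an axiom
```

Because every promotion keeps its list of source locations, a `trace` over an axiom can walk back to the exact memory lines it came from — that is the provenance chain the real pipeline maintains.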
## Optional

- [Getting Started Guide](https://github.com/live-neon/neon-soul/blob/main/docs/guides/getting-started-guide.md): Step-by-step setup walkthrough
- [Contributing](https://github.com/live-neon/neon-soul/blob/main/CONTRIBUTING.md): How to contribute to the project
- [Moltbook](https://www.moltbook.com/u/liveneon): @liveneon agent on the agent social network

---

# Skills

> From noise to signal. Open-source AI agent skills for principle extraction and synthesis. Seven skills form a coherent five-stage pipeline: distill, extract, compare, synthesize, track.

## Key Links

- [Skills Page](https://liveneon.ai/skills): Overview with pipeline visualization
- [GitHub Repository](https://github.com/live-neon/skills): All skills with SKILL.md files

## The Pipeline

1. **Distill** - Find what matters: essence-distiller
2. **Extract** - Get the evidence: pbe-extractor
3. **Compare** - Find agreement: pattern-finder, principle-comparator
4. **Synthesize** - Build truth: principle-synthesizer, core-refinery
5. **Track** - Maintain provenance: golden-master

## Installation

- Claude Code / Gemini CLI / Cursor: `cp skills/pbd/{skill-name}/SKILL.md ~/.claude/skills/`
- OpenClaw: `clawhub install live-neon/skills/pbd/{skill-name}`
- Any LLM agent: Copy the SKILL.md contents into your agent's system prompt

---

# Agentic Skills

> From failure to learning. Seven AI memory skills for failure-anchored learning. Transform failures into constraints that prevent future mistakes.
The lifecycle: Something Goes Wrong → Same Mistake, Again → Now It's a Rule → AI Remembers.

## OpenClaw Ecosystem Integration

These skills integrate with the broader OpenClaw ecosystem:

- **self-improving-agent** (@1.0.5): Receives `.learnings/` files for cross-session learning
- **proactive-agent** (@3.1.0): Receives `output/constraints/` for runtime enforcement

Data flow: `/fm record` → `.learnings/` → self-improving-agent → `/ce generate` → `constraints/` → proactive-agent

## Key Links

- [Agentic Skills Page](https://liveneon.ai/agentic): Overview, installation, and documentation
- [GitHub Repository](https://github.com/live-neon/skills/tree/main/agentic): All 7 skill SKILL.md files

## The 7 Skills

### Start Here

- **failure-memory** (`/fm`): Records failures with R/C/D counters. The entry point — when something goes wrong, start here.

### What Happens Next

- **constraint-engine** (`/ce`): Generates constraints from recurring failures and enforces them at runtime. Failures become rules.

### Supporting

- **context-verifier** (`/cv`): Verifies file integrity via SHA-256 hashes and detects unauthorized changes.
- **review-orchestrator** (`/ro`): Coordinates multi-perspective reviews (technical, creative, external).

### Lifecycle

- **governance** (`/gov`): Manages the constraint lifecycle and triggers 90-day reviews. Rules that evolve, not calcify.

### Advanced

- **safety-checks** (`/sc`): Validates model configs, enforces pinning, and provides fallbacks. Circuit breakers for AI.
- **workflow-tools** (`/wt`): Detects infinite loops and evaluates parallel-vs-serial decisions.

## Installation

- Claude Code / Gemini CLI / Cursor: `cp skills/agentic/{skill-name}/SKILL.md ~/.claude/skills/`
- OpenClaw: `clawhub install live-neon/skills/agentic/{skill-name}`
- Any LLM agent: Copy the SKILL.md contents into your agent's system prompt

---

# Creative Skills

> Creation forces synthesis. Five AI creative skills that transform technical insights into songs, visual concepts, and TED talks.
Making something about a concept reveals gaps that passive understanding misses.

The creative pipeline: one insight, three formats. Audio + Visual + Narrative = Deeper Learning. These skills started as side quests and became destinations.

## Key Links

- [Creative Skills Page](https://liveneon.ai/creative): Overview, philosophy, and installation
- [GitHub Repository](https://github.com/live-neon/skills/tree/main/creative): All 5 skill SKILL.md files

## The 5 Skills

### Create Original

- **insight-song** (`/song`): Transform a technical insight into a Suno-ready song with an emotional arc. Lyrics, style tags, and production notes.
- **visual-concept** (`/vc`): Create a visual concept guide for video production. Core metaphors, color arcs, scene direction.
- **ted-talk** (`/ted`): Generate a full 40-50 minute TED-style talk from a technical insight. Opening hook, narrative arc, audience exercises.

### Transform

- **song-remix** (`/remix`): Apply the Twin Remix method. Two versions: Respectful (honors the original) and Viral (bold creative risks).

### Combine All

- **side-quests** (`/sq`): Run all three creative skills from a single insight. Song + visual concept + TED talk in one session.

## Installation

- ClawHub: `clawhub install leegitw/insight-song` (also: visual-concept, ted-talk, song-remix, side-quests)
- Claude Code / Cursor: `cp skills/creative/*/SKILL.md ~/.claude/skills/`
- Any LLM agent: Copy the SKILL.md contents into your agent's system prompt

---

# Memory Garden

> Community memory. Distributed intelligence. Pattern learning for OpenClaw, Claude Code, and AI agents.

Memory Garden is an open protocol for building community-owned knowledge commons. Install it as an MCP server for OpenClaw or Claude Code — your AI agent searches community-validated patterns before generating responses. Patterns are earned through N-count consensus, not assigned by authority.
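The N-count consensus idea can be sketched over the JSONL storage the architecture describes. This is a hypothetical illustration only: the record fields, the threshold of 3, and the requirement of *distinct* contributors are assumptions chosen to show why consensus differs from mere repetition, not the actual protocol.

```python
import json

# Illustrative threshold: a pattern needs this many distinct contributors.
CONSENSUS_N = 3

# A toy JSONL commons; real records would carry more metadata.
records = [
    json.dumps({"pattern": "pin model versions", "contributor": "alice"}),
    json.dumps({"pattern": "pin model versions", "contributor": "bob"}),
    json.dumps({"pattern": "pin model versions", "contributor": "carol"}),
    json.dumps({"pattern": "retry on rate limits", "contributor": "alice"}),
    json.dumps({"pattern": "retry on rate limits", "contributor": "alice"}),  # same voice twice
]

def validated(jsonl_lines, n=CONSENSUS_N):
    """A pattern is community-validated once n distinct contributors report it."""
    voices = {}
    for line in jsonl_lines:
        rec = json.loads(line)
        voices.setdefault(rec["pattern"], set()).add(rec["contributor"])
    return sorted(p for p, who in voices.items() if len(who) >= n)

print(validated(records))  # only the pattern with three distinct contributors
```

Counting distinct contributors rather than raw repetitions is one way to keep a single loud voice from promoting its own pattern — the community analogue of NEON-SOUL's anti-echo-chamber protection.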
## Key Links

- [GitHub Repository](https://github.com/live-neon/memory-garden): Protocol implementation in Go
- [Whitepaper](https://github.com/live-neon/memory-garden/blob/main/docs/proposals/memory-garden-whitepaper.md): Complete protocol specification
- [Architecture](https://github.com/live-neon/memory-garden/blob/main/docs/architecture/README.md): MCP server and OpenClaw integration

## Quick Start (OpenClaw / Claude Code)

**Status**: The Memory Garden MCP server is in development (coming soon to ClawHub). Current options:

- Clone from GitHub: `git clone https://github.com/live-neon/memory-garden`
- See the [Quick Start Guide](https://github.com/live-neon/memory-garden#quick-start-for-openclawclaude-code-users) for development setup

## Architecture

Four-layer stack:

1. **Pattern Extraction** - PBD methodology for knowledge capture
2. **Knowledge Commons** - JSONL storage with N-count promotion
3. **Edge Distribution** - MCP server for AI agents, mesh sync for offline
4. **Inference** - Local models on user devices

## Integration with Live Neon

- Shares PBD methodology with Skills
- Uses same N-count validation pattern as Agentic Skills
- Distributes validated constraints from failure-memory
- Complements NEON-SOUL (individual identity) with community knowledge

## Target Audience

- OpenClaw users seeking community-validated knowledge
- Claude Code / Gemini CLI / Cursor developers
- Privacy-focused users running local AI
- Connectivity-limited communities (mesh network deployment)
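The "search the commons before generating" behavior described for the MCP server can be sketched as a simple lookup-then-fallback flow. Everything here is assumed for illustration — the commons contents, the query key, and the function names do not come from the Memory Garden implementation.

```python
# A toy commons mapping a normalized query to a community-validated answer.
COMMONS = {
    "deploy go service": "Use multi-stage Docker builds; pin the Go toolchain.",
}

def answer(query, generate):
    """Prefer a community-validated pattern; fall back to model generation."""
    hit = COMMONS.get(query.lower())
    if hit is not None:
        return f"[community pattern] {hit}"
    return generate(query)

# Known query resolves from the commons; unknown query falls through.
print(answer("Deploy Go service", lambda q: f"[generated] draft for {q!r}"))
print(answer("unknown topic", lambda q: f"[generated] draft for {q!r}"))
```

The design point is ordering: the agent consults validated patterns first, so the model only generates when the community has nothing to say — which keeps responses grounded in earned consensus rather than per-session improvisation.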