Background Intelligence, Smart Feed & Library
Your knowledge now lives in a neutral layer that works with any AI tool. Switch freely between Claude Code, Cursor, ChatGPT, or whatever ships next. Background Intelligence builds connections and writes your daily briefing. Smart Feed, Library, and Exchange v2 round out the release.
- Smart Feed: one input for captures, questions, and file imports
- Background Intelligence: the system finds connections, surfaces contradictions, and writes daily briefings on its own
- Working Memory: a daily briefing of what matters, delivered to `~/ai-now/memory.md`
- Library: search PDFs, Word docs, and presentations alongside your memories
- Access Mem Anywhere: one-click secure remote URL + API key for your other machines and agents
- Exchange v2 (Chrome): auto-captures AI conversations across 13+ platforms
- Browse-Now (Chrome): give AI agents access to your browser
- Search engine rewritten in Rust with new BM25, vector search, and SOTA local embeddings
- Alma plugin: native memory integration with tools, command palette actions, and auto-capture controls
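The rewritten search engine pairs BM25 lexical scoring with vector search over local embeddings. As a rough illustration of how such hybrid ranking works, here is a minimal Python sketch; all function names are hypothetical, and the shipping implementation is in Rust, not this code:

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized doc against the query with Okapi BM25."""
    n = len(docs)
    avgdl = sum(len(d) for d in docs) / n
    df = Counter()                       # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            s += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_rank(query_tokens, query_vec, docs, doc_vecs, alpha=0.5):
    """Blend normalized BM25 with embedding cosine similarity."""
    lex = bm25_scores(query_tokens, docs)
    top = max(lex) or 1.0                # avoid division by zero
    blended = [alpha * (l / top) + (1 - alpha) * cosine(query_vec, v)
               for l, v in zip(lex, doc_vecs)]
    return sorted(range(len(docs)), key=lambda i: blended[i], reverse=True)
```

The `alpha` weight trades lexical precision against semantic recall; real engines typically normalize and fuse the two score lists more carefully (e.g. reciprocal-rank fusion).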
- Smart Feed: one input for captures, questions, and file imports
- Unified timeline: captures, insights, flags, and daily briefings in one stream
- Background Intelligence: the system finds connections, surfaces contradictions, and writes daily briefings
- Working Memory: daily briefing at `~/ai-now/memory.md`, readable by any MCP agent
- Memory evolution: detects when memories update, enrich, confirm, or contradict each other
- Library: search PDFs, DOCX, PPTX, and Markdown alongside memories
- Access Mem Anywhere: start a secure Cloudflare tunnel from Settings (Quick link or Cloudflare account mode)
- Remote API auth hardening: every remote request requires your API key, with key rotation support
- Exchange v2: auto-captures AI conversations across 13+ platforms (Claude, ChatGPT, Gemini, and more)
- Browse-Now CLI: give AI agents access to your browser with your login sessions
- Interactive graph visualization in Claude Desktop and other MCP hosts
- Chinese full-text search
- AI-powered search reranking with human-readable explanations
- Session auto-sync: real-time watching for Claude Code, Cursor, Codex, and OpenCode
- Linux headless server deployment with systemd
- Full CLI for server setup, license, LLM config, and knowledge settings
- New providers: ChatGPT (Codex) subscription, MiniMax, Z.AI, MoonShot AI
- MCP: Graph View in Memory Search for Claude Desktop and other MCP-UI hosts
- OpenClaw plugin: native memory integration for the OpenClaw agent framework
- Alma plugin: native memory integration with tools, command palette actions, and optional auto-capture
- Raycast extension: search memories, save insights, read Working Memory
- npx Skills: one-command memory integration for any agent (search, save, Working Memory, graph exploration)
- Knowledge structure: Trace → Unit → Crystal three-layer model inspired by cognitive science
- Community detection: automatic topic clustering across the knowledge graph
- Node importance scoring: surfaces the most influential concepts in your graph
- LLM-friendly docs: every page serves clean Markdown via the `Accept: text/markdown` header, plus `/llms-full.txt` and `/llms.mdx` endpoints
- Search engine rewritten: new tokenization, index engine, and SOTA local embedding models
- MCP: 24 tools exposing the full knowledge surface (up from basic search/add/labels)
- Visual refresh: calmer, cleaner design with content-first layouts
- Search is 50% faster with native Chinese support
- Graph timeline slider: filter your knowledge graph by time range
- Real-time streaming: see agent thinking as it happens
- Graph visualization performance improved
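Node importance scoring of the kind described above is commonly done with a PageRank-style iteration over the knowledge graph. A minimal sketch under that assumption (the release notes don't specify the actual algorithm):

```python
def importance(adj, damping=0.85, iters=50):
    """PageRank-style importance over an adjacency dict {node: [neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {v: (1 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if not outs:
                # dangling node: spread its rank evenly across the graph
                for u in nodes:
                    nxt[u] += damping * rank[v] / n
            else:
                share = damping * rank[v] / len(outs)
                for u in outs:
                    nxt[u] += share
        rank = nxt
    return rank
```

A concept that many memories link to accumulates rank from all of them, so it surfaces as "influential" even if it was never captured directly.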
- Fixed: OAuth connections silently failing on macOS
- Fixed: search model downloads failing on Windows
- Fixed: session export showing the wrong message count from Cursor
- Access Mem Anywhere copy actions now survive desktop reload/restart without forcing key rotation
- Cloudflare account token parsing now accepts full command formats (`service install`, `--token`, `--token=`)
- Linux: Full headless deployment with `nmem serve` and `nmem service`
- Linux: Interactive TUI control plane for headless server management
- Community: native support in DeepChat and LobeHub
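On the remote-auth hardening: requiring an API key on every remote request while still supporting rotation usually means accepting both the current and the previous key for a grace window, compared in constant time. A hypothetical Python sketch of that pattern (class and method names are illustrative, not the product's actual API):

```python
import hmac

class ApiKeyAuth:
    """Timing-safe API-key check with rotation support (illustrative sketch)."""

    def __init__(self, current_key: str):
        self.current = current_key
        self.previous = None

    def rotate(self, new_key: str):
        # Keep the old key valid during rotation so remote agents
        # can switch over without a hard cutoff.
        self.previous = self.current
        self.current = new_key

    def verify(self, presented: str) -> bool:
        for valid in (self.current, self.previous):
            # hmac.compare_digest avoids leaking key bytes via timing
            if valid and hmac.compare_digest(presented, valid):
                return True
        return False
```

A production version would also expire the previous key after the grace window and log which key each request used.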
