Scan Report
5/100
aoms
Always-On Memory Service — persistent 4-tier memory for AI agents
This skill is a documentation-only package for a legitimate local memory service (AOMS/cortex-mem) with no executable code, no suspicious behavior, and fully transparent functionality.
Safe to install
This skill is safe to use. Before installing, verify that the cortex-mem package on PyPI comes from the publisher you expect.
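One way to act on that advice is to inspect the package's metadata before running `pip install`. The sketch below extracts two common trust signals (project URLs and release file digests) from a PyPI JSON API response; the response shape (`info.project_urls`, `urls[].digests.sha256`) is the real PyPI API, but the sample data here is invented for illustration:

```python
# Sketch: pull trust signals out of a PyPI JSON API response dict.
# A real check would fetch https://pypi.org/pypi/cortex-mem/json and
# compare the source repository and digests against a known-good record.
def trust_signals(metadata: dict) -> dict:
    info = metadata.get("info", {})
    return {
        "project_urls": info.get("project_urls") or {},
        "sha256_digests": [
            f["digests"]["sha256"]
            for f in metadata.get("urls", [])
            if "digests" in f
        ],
    }

# Abbreviated, invented sample in the shape PyPI returns.
sample = {
    "info": {"project_urls": {"Source": "https://example.invalid/cortex-mem"}},
    "urls": [{"digests": {"sha256": "deadbeef"}}],
}
print(trust_signals(sample))
```

If the source repository listed on PyPI does not match the project you expect, or release digests change without a corresponding release, treat that as a red flag.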
| Resource | Declared | Inferred | Status | Evidence |
|---|---|---|---|---|
| Filesystem | READ | READ | ✓ Aligned | Storage root configurable; workspace migration is optional and declared |
| Network | READ | READ | ✓ Aligned | localhost:9100 only for local service communication |
| Shell | NONE | NONE | — | pip install documented; no direct shell execution in skill files |
| Environment | NONE | NONE | — | No environment variable access in skill files |
| Skill Invoke | NONE | NONE | — | N/A |
| Clipboard | NONE | NONE | — | N/A |
| Browser | NONE | NONE | — | N/A |
| Database | READ | READ | ✓ Aligned | JSONL file storage is documented and local |
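The Network row's localhost-only claim can be spot-checked with a short connectivity probe. The sketch below is generic: it opens its own loopback-only listener for the demo, but the same `is_listening` helper can be pointed at port 9100 from the table (a full audit would also confirm the port is *not* reachable on external interfaces):

```python
import socket

def is_listening(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Self-contained demo: bind a loopback-only listener on an ephemeral port,
# then confirm it is reachable via 127.0.0.1. Swap in port 9100 to probe
# a running AOMS instance.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
print(is_listening("127.0.0.1", port))  # True for our demo listener
server.close()
```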
File Tree

3 files · 14.0 KB · 589 lines (Markdown)

```
├─ references/
│  ├─ api-reference.md
│  └─ openclaw-setup.md
└─ SKILL.md
```
Security Positives
✓ No executable code - documentation only (SKILL.md, api-reference.md, openclaw-setup.md)
✓ All operations are local - service explicitly binds to localhost only
✓ No credential or sensitive path access declared or implemented
✓ Data stays local - no external network calls for core functionality
✓ All features are clearly documented and explained
✓ Workspace migration is optional, explicit, and non-destructive (dry-run supported)
✓ Vector search requires an optional local Ollama instance
✓ Docker alternative available for containerized deployment
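The "no executable code" finding is easy to re-verify independently: list every file under the skill directory and flag anything that is not Markdown. The sketch below builds a throwaway fixture mirroring the file tree above; point `non_markdown_files` at the real skill directory instead:

```python
import tempfile
from pathlib import Path

def non_markdown_files(root: Path) -> list[str]:
    """Return relative paths of all files under root that are not .md."""
    return sorted(
        str(p.relative_to(root))
        for p in root.rglob("*")
        if p.is_file() and p.suffix != ".md"
    )

# Illustrative fixture matching the skill's documented layout.
root = Path(tempfile.mkdtemp())
(root / "references").mkdir()
(root / "SKILL.md").write_text("# skill")
(root / "references" / "api-reference.md").write_text("# api")
(root / "references" / "openclaw-setup.md").write_text("# setup")
print(non_markdown_files(root))  # → []
```

An empty result confirms the documentation-only claim; any `.py`, `.sh`, or binary file showing up would contradict the scan.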