Trusted — risk score 5/100
Last scan: 2 days ago
smart-keepalive
Periodically fetches trending RSS items and sends a "keepalive briefing" via OpenClaw. Supports copy polishing via openclaw agent, an optional wellness appendix, and launchd/cron helpers for scheduled deployment.
The Smart Keepalive skill is a legitimate RSS aggregation and messaging pipeline for OpenClaw with no malicious behavior detected.
Skill name: smart-keepalive
Analysis time: 39.4s
Engine: pi
Safe to install
No action required. The skill is safe to use.

Security findings (2)

Severity · Finding · Location

Low
SKILL.md lacks explicit allowed-tools declaration
The SKILL.md documents the skill's functionality in detail but does not include an allowed-tools section enumerating which tools (Read, Write, Bash, WebFetch) the skill uses. While the code itself is benign, a formal declaration would improve transparency.
Evidence: No allowed-tools section present
→ Add an allowed-tools section to SKILL.md mapping: Bash→shell:WRITE, Read→filesystem:READ, Write→filesystem:WRITE, WebFetch→network:READ
SKILL.md:1
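The remediation above could be sketched as SKILL.md frontmatter. The exact schema of the allowed-tools field depends on the skill runtime, so treat the field names and shape below as assumptions, not a verified spec:

```yaml
---
name: smart-keepalive
# Assumed field name and list shape; check the runtime's skill spec first.
allowed-tools:
  - Bash      # shell:WRITE — invokes the openclaw CLI
  - Read      # filesystem:READ — prompts/*.md templates
  - Write     # filesystem:WRITE — config.json and logs
  - WebFetch  # network:READ — RSS and weather endpoints
---
```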
Low
subprocess.run external-command invocations not formally declared in SKILL.md
The Python script uses subprocess.run to execute the openclaw CLI (agent, message send) and the node/python binaries. While this is clearly necessary for the skill's stated purpose, SKILL.md does not explicitly mention subprocess usage, creating a gap between documentation and implementation.
Evidence: subprocess.run(["node", "-v"], capture_output=True, text=True, timeout=2)
→ Document that the skill invokes the openclaw CLI via subprocess as part of the core pipeline
smart-keepalive.py:61
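The pattern flagged here can be reproduced in a few lines. This is a hedged sketch of the probe style the evidence shows (a binary check with a hard timeout); `sys.executable` stands in for `node` so the snippet runs anywhere:

```python
import subprocess
import sys

# Probe a helper binary with a hard timeout, mirroring the `node -v`
# check the scan cites at smart-keepalive.py:61. Using sys.executable
# instead of "node" keeps the sketch self-contained.
result = subprocess.run(
    [sys.executable, "--version"],
    capture_output=True,
    text=True,
    timeout=2,  # same 2-second cap as the cited call
)
print(result.returncode)  # → 0
```

Because the command is passed as a list (not `shell=True`), no shell parses the arguments, which is why the finding is only a documentation gap rather than an injection risk.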
Resource type       Declared  Inferred    Status        Evidence
Filesystem          NONE      READ+WRITE  ✓ Consistent  smart-keepalive.py:143-148 writes config.json; logs written to ~/.openclaw/logs/
Network             NONE      READ        ✓ Consistent  Fetches RSS from plink.anyfeeder.com, RSSHub instances, weather from autodev.ope…
Command execution   NONE      WRITE       ✓ Consistent  subprocess.run calls openclaw agent/send for content generation and message deli…
Environment vars    NONE      READ        ✓ Consistent  Reads KEEPALIVE_* and OPENCLAW_* env vars; no iteration for secrets
Skill invocation    NONE      READ        ✓ Consistent  Reads prompts/*.md template files for content generation
Database            NONE      NONE        —             No database access
Clipboard           NONE      NONE        —             No clipboard access
Browser             NONE      NONE        —             No browser access
13 findings
🔗 Medium · External URL · https://agentskills.io/skill-creation/best-practices (SKILL.md:9)
🔗 Medium · External URL · https://rsshub.liumingye.cn (SKILL.md:63)
🔗 Medium · External URL · https://rsshub.pseudoyu.com (SKILL.md:64)
🔗 Medium · External URL · https://sshub.rssforever.com (SKILL.md:65)
🔗 Medium · External URL · https://autodev.openspeech.cn/csp/api/v2.1/weather (smart-keepalive.py:492)
🔗 Medium · External URL · https://api.bilibili.com/x/web-interface/ranking/v2 (smart-keepalive.py:547)
🔗 Medium · External URL · https://www.bilibili.com/video/ (smart-keepalive.py:566)
🔗 Medium · External URL · https://plink.anyfeeder.com/zaobao/realtime/china (smart-keepalive.py:601)
🔗 Medium · External URL · https://plink.anyfeeder.com/zaobao/realtime/singapore (smart-keepalive.py:605)
🔗 Medium · External URL · https://plink.anyfeeder.com/zaobao/realtime/world (smart-keepalive.py:608)
🔗 Medium · External URL · https://plink.anyfeeder.com/wsj/cn (smart-keepalive.py:611)
🔗 Medium · External URL · https://sspai.com/feed (smart-keepalive.py:617)
🔗 Medium · External URL · https://rss.huxiu.com/ (smart-keepalive.py:620)

Directory structure

6 files · 69.3 KB · 1762 lines
Python 1 file · 1511 lines · Markdown 4 files · 213 lines · Shell 1 file · 38 lines
├─ 📁 prompts
│ ├─ 📝 rewrite-main.md Markdown 45L · 4.0 KB
│ ├─ 📝 status-footer.md Markdown 10L · 988 B
│ └─ 📝 wellness.md Markdown 6L · 408 B
├─ 📝 SKILL.md Markdown 152L · 9.4 KB
├─ 🐍 smart-keepalive.py Python 1511L · 53.4 KB
└─ 🔧 smart-keepalive.sh Shell 38L · 1.1 KB

Security highlights

✓ No credential harvesting — the skill only reads public RSS/weather/Bilibili APIs and sends no auth tokens
✓ No data exfiltration — fetched data stays local (logged to file or delivered via openclaw message); nothing is uploaded to third parties
✓ No hidden functionality — all behavior is traceable to documented RSS/API sources
✓ No sensitive path access — no ~/.ssh, ~/.aws, .env, or key files accessed
✓ No base64/eval/exec tricks, no remote script execution (curl|bash), no C2 indicators
✓ The shell wrapper (smart-keepalive.sh) is minimal and easily audited (38 lines)
✓ Proper timeout enforcement on subprocess calls prevents indefinite hangs
✓ Falls back to fixed template on agent failure rather than opaque behavior
✓ Input sanitization on URLs via urllib.parse.urlsplit and quote()
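The last highlight can be illustrated with a short sketch of the urlsplit()+quote() pattern the scan credits. `sanitize` is a hypothetical helper name; the actual function in smart-keepalive.py may be structured differently:

```python
from urllib.parse import quote, urlsplit

def sanitize(url: str) -> str:
    """Re-encode the path so spaces/non-ASCII can't break the request.

    Hypothetical helper illustrating the urlsplit()+quote() pattern;
    query strings and fragments are dropped for brevity.
    """
    parts = urlsplit(url)
    # quote() keeps "/" safe by default, so path structure is preserved
    return f"{parts.scheme}://{parts.netloc}{quote(parts.path)}"

print(sanitize("https://plink.anyfeeder.com/zaobao/realtime china"))
# → https://plink.anyfeeder.com/zaobao/realtime%20china
```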