Trusted — risk score 5/100
Last scanned: 18 hours ago · Rescan
ai-news-pipeline
Self-contained Chinese and international AI news RSS capture and AI-enriched report generation workflow
A clean AI news RSS pipeline that collects feeds, generates AI-enriched reports, and writes outputs — all functionality matches documentation with no malicious indicators.
Skill name: ai-news-pipeline
Analysis time: 56.4s
Engine: pi
Safe to install
Approve for deployment. No security concerns identified.

Security findings (2)

Severity Finding Location
Low
base64 HTTP Basic Auth usage not declared in SKILL.md (documentation deception)
collect_feeds.py uses base64.b64encode to build HTTP Basic Auth headers for feeds that have username/password credentials (line 160-161). SKILL.md does not mention this capability. However, the scope is narrow and benign — it is standard HTTP auth for RSS feeds, not credential exfiltration.
basic_token = base64.b64encode(f"{source.username}:{source.password}".encode("utf-8")).decode("ascii")
→ Document in SKILL.md that the skill supports authenticated RSS feeds via HTTP Basic Auth.
scripts/collect_feeds.py:160
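The flagged pattern can be sketched as a small helper; a minimal reconstruction of the header-building logic, where `username` and `password` stand in for the credential fields on the feed source object in collect_feeds.py:

```python
import base64


def basic_auth_header(username: str, password: str) -> dict:
    """Build an HTTP Basic Auth header, mirroring the flagged pattern.

    The credentials are joined as "user:pass", base64-encoded, and sent
    in the Authorization header -- standard RFC 7617 Basic Auth, not
    an obfuscation or exfiltration technique.
    """
    token = base64.b64encode(
        f"{username}:{password}".encode("utf-8")
    ).decode("ascii")
    return {"Authorization": f"Basic {token}"}
```

This is why the finding is rated low: base64 here is the Basic Auth wire format itself, so the only issue is the missing SKILL.md mention.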
Low
subprocess pipeline chaining not declared in SKILL.md (documentation deception)
All three entrypoints (run_capture_only.py, run_report_only.py, run_full_workflow.py) use subprocess.run to invoke other scripts within the skill. SKILL.md describes each entrypoint independently but does not mention that they internally use subprocess. This is legitimate pipeline orchestration but is a minor documentation gap.
subprocess.run(command, check=True, cwd=workspace, env=env)
→ Add a note in SKILL.md that entrypoints may chain into other bundled scripts via subprocess.
scripts/run_capture_only.py:27
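The orchestration pattern the entrypoints use can be sketched as follows; `run_step` and its parameters are illustrative names, not the actual function in the skill:

```python
import os
import subprocess
import sys


def run_step(script: str, workspace: str) -> None:
    """Chain into a bundled script, as the entrypoints do.

    A full copy of the environment is passed through so variables like
    ARK_API_KEY reach the child process without iterating over keys
    (i.e. no selective credential harvesting), and check=True makes a
    failing step abort the pipeline.
    """
    env = os.environ.copy()
    subprocess.run([sys.executable, script], check=True, cwd=workspace, env=env)
```

Because the child scripts are bundled with the skill and invoked with an argument list (no `shell=True`), there is no shell-injection surface; the gap is purely documentary.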
资源类型声明权限推断权限状态证据
文件系统 WRITE WRITE ✓ 一致 Scripts write to configurable workspace subdirectories (data/, reports/, state/,…
网络访问 READ READ ✓ 一致 RSS feed collection via urllib (collect_feeds.py:fetch_feed) and ARK API calls (…
命令执行 NONE WRITE ✓ 一致 Entrypoints use subprocess.run to chain scripts (run_capture_only.py:27, run_rep…
环境变量 NONE READ ✓ 一致 os.environ.copy() passed to subprocess; ARK_API_KEY, ARK_API_BASE, ARK_MODEL, AR…
技能调用 NONE NONE No cross-skill invocation detected.
剪贴板 NONE NONE No clipboard access.
浏览器 NONE NONE No browser automation.
数据库 NONE NONE No database access.
3 findings
🔗
Medium External URL
http://www.w3.org/2005/Atom
scripts/collect_feeds.py:87
🔗
Medium External URL
http://purl.org/rss/1.0/modules/content/
scripts/collect_feeds.py:89
🔗
Medium External URL
https://ark.cn-beijing.volces.com/api/v3
scripts/generate_company_report.py:36

Directory structure

12 files · 86.5 KB · 2575 lines
Python 8f · 2255L Markdown 2f · 314L YAML 1f · 4L Text 1f · 2L
├─ 📁 agents
│ └─ 📋 openai.yaml YAML 4L · 490 B
├─ 📁 references
│ └─ 📝 commands.md Markdown 179L · 5.4 KB
├─ 📁 scripts
│ ├─ 🐍 brief_writer.py Python 314L · 11.0 KB
│ ├─ 🐍 collect_feeds.py Python 532L · 17.2 KB
│ ├─ 🐍 generate_company_report.py Python 560L · 18.7 KB
│ ├─ 🐍 generate_international_report.py Python 543L · 20.0 KB
│ ├─ 📄 requirements.txt Text 2L · 22 B
│ ├─ 🐍 run_capture_only.py Python 36L · 921 B
│ ├─ 🐍 run_full_workflow.py Python 65L · 1.8 KB
│ ├─ 🐍 run_report_only.py Python 69L · 2.1 KB
│ └─ 🐍 time_window.py Python 136L · 4.2 KB
└─ 📝 SKILL.md Markdown 135L · 4.5 KB

Dependency analysis (2)

Package Version Source Known vulnerabilities Notes
openpyxl * requirements.txt Version not pinned — minor supply chain risk; recommend pinning to a specific version
python-docx * requirements.txt Version not pinned — minor supply chain risk; recommend pinning to a specific version
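Pinning the two entries above addresses the supply-chain note; a sketch of a pinned requirements.txt, where the version numbers are placeholders and have not been verified against this project:

```
openpyxl==3.1.2
python-docx==1.1.2
```

Exact pins (`==`) make installs reproducible; hash-pinning via `pip install --require-hashes` would tighten this further if desired.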

Security highlights

✓ All network I/O is to documented, well-known endpoints (RSS feeds and the Volcengine ARK API)
✓ No sensitive paths accessed (no ~/.ssh, ~/.aws, .env, /etc/passwd, or similar)
✓ No credential harvesting — ARK_API_KEY is only used to authenticate to the Volcengine API for report generation
✓ No obfuscation, reverse shells, C2 communication, or data exfiltration
✓ No base64-encoded payloads executed in shell
✓ Dependencies (openpyxl, python-docx) are standard, trusted packages, though versions are not pinned (see Dependency analysis)
✓ No hidden instructions in comments or HTML
✓ Full environment variable set is passed to subprocess, not iterated for key harvesting
✓ Code is clean, readable, and purpose-built for the documented use case
✓ State deduplication prevents duplicate processing
✓ AI cache is cleared after each run (clear_summary_cache in finally block)
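The last highlight describes a try/finally cleanup pattern; a minimal sketch, where `generate_reports` and the `clear_summary_cache` signature are stand-ins for the real helpers in the skill:

```python
def clear_summary_cache(cache: dict) -> None:
    # Drop cached AI summaries so stale entries never leak into the next run.
    cache.clear()


def generate_reports(cache: dict) -> None:
    """Run report generation, guaranteeing cache cleanup.

    The finally block runs whether generation succeeds or raises,
    which is why the scan counts this as a hygiene positive.
    """
    try:
        pass  # ... produce AI-enriched reports, filling `cache` ...
    finally:
        clear_summary_cache(cache)
```

The guarantee matters on the failure path: even if a report step raises midway, no partially built summaries survive into the next scheduled run.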