Summary
ai-guard-plugins v1.3.29 was classified as CRITICAL RISK with a risk score of 129. Sigil detected 20 findings across 150 files, spanning phases including provenance, install hooks, and code patterns. Review the findings below before installing this package.
v1.3.29
10 April 2026, 17:05 UTC
by Sigil Bot
Risk Score
129
Findings
20
Files Scanned
150
Provenance
Findings by Phase
Phase Ordering
Phases are ordered by criticality, with the most dangerous at the top.
install-makefile-curl
HIGH: Makefile/script pipes remote content to shell
package/.cursor/hooks/ensure-rtk.sh:23
else
curl -fsSL https://raw.githubusercontent.com/rtk-ai/rtk/refs/heads/master/install.sh | sh >&2 2>&1
export PATH="$HOME/.local/bin:$PATH"
Why was this flagged?
A script or Makefile pipes content from a remote URL directly into a shell (curl | sh or wget | bash). This is inherently dangerous because the remote content can change at any time, and the command runs with the current user's permissions. Rated HIGH because it requires manual execution (unlike install hooks) but still executes arbitrary remote code.
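A safer pattern is to download the script to a file, verify it against a pinned checksum, and only then execute it. The sketch below is illustrative, not part of this package: the helper name is invented, the pinned digest must come from a trusted out-of-band source, and `sha256sum` (GNU coreutils) is assumed to be available.

```shell
# verify_and_run: fetch a script, compare its SHA-256 against a pinned
# digest, and execute it only on an exact match. Unlike `curl | sh`,
# the bytes that run are exactly the bytes that were verified.
verify_and_run() {
  url="$1"
  expected="$2"   # pinned digest, published out-of-band by the project
  tmp="$(mktemp)"
  curl -fsSL "$url" -o "$tmp" || { rm -f "$tmp"; return 1; }
  actual="$(sha256sum "$tmp" | awk '{print $1}')"
  if [ "$actual" != "$expected" ]; then
    echo "checksum mismatch, refusing to run $url" >&2
    rm -f "$tmp"
    return 1
  fi
  sh "$tmp"
  rc=$?
  rm -f "$tmp"
  return $rc
}
```

If the remote script changes, even legitimately, the check fails closed and nothing executes until the pinned digest is updated.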
install-npm-postinstall
CRITICAL: npm lifecycle script runs automatically on install
package/package.json:16
"test:docker": "docker build -f Dockerfile.verify -t ai-guard-plugins-verify:local .",
"postinstall": "bash install.sh || true",
"cursor-digest": "node --require ts-node/register .cursor/scripts/cursor-digest-learnings.ts",Why was this flagged?
npm lifecycle scripts like postinstall run automatically during package installation with no user interaction required. This is the #1 attack vector for malicious npm packages — attackers embed data theft or backdoor installation in these hooks. Rated CRITICAL because code executes before the developer can review it.
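One mitigation is to install with lifecycle scripts disabled (`npm install --ignore-scripts`) and review the hooks before opting back in. The helper below is an illustrative sketch, not part of Sigil or npm, that surfaces any install-time scripts declared in a downloaded package.json:

```shell
# list_install_hooks: print any preinstall/install/postinstall entries
# found in the given package.json so they can be reviewed before
# running `npm install` without --ignore-scripts.
list_install_hooks() {
  grep -E '"(preinstall|install|postinstall)"[[:space:]]*:' "$1" || true
}
```

A JSON-aware check (for example via `node -e` and `require()`) would be more robust than `grep` against unusual formatting, but the idea is the same: read the hooks before any of them can run.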
Badge
Markdown
[![Sigil Scan](https://sigilsec.ai/badge/npm/ai-guard-plugins)](https://sigilsec.ai/scans/C80A4B7B-A44D-4A0F-BA01-137047815DEA)
HTML
<a href="https://sigilsec.ai/scans/C80A4B7B-A44D-4A0F-BA01-137047815DEA"><img src="https://sigilsec.ai/badge/npm/ai-guard-plugins" alt="Sigil Scan"></a>
Run This Scan Yourself
Scan your own packages
Run Sigil locally to audit any package before it touches your codebase.
Believe this result is incorrect? Request a review or see our Terms of Service and Methodology.