TL;DR
AI coding assistants can hallucinate package names, creating phantom dependencies that don’t exist in official repositories. Attackers exploit this predictable behavior through slopsquatting: registering malicious packages under the names AI models commonly suggest. This emerging supply chain attack requires new detection approaches focused on behavioral analysis to complement existing security tools.
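As a concrete illustration of the phantom-dependency problem, here is a minimal sketch that checks each name in a `requirements.txt` file against PyPI's public JSON API and flags names that don't resolve, i.e. candidates for hallucinated (or already slopsquatted) packages. The file name, the simple name-parsing regex, and the function names are illustrative assumptions, not the detection approach described in the original post, and existence on PyPI alone does not prove a package is safe.

```python
# Sketch: flag requirement names that do not resolve on PyPI before installing.
# Assumes a plain requirements.txt with one spec per line; the regex below is a
# simplification, not a full PEP 508 parser.
import re
import urllib.error
import urllib.request

PYPI_URL = "https://pypi.org/pypi/{name}/json"


def package_exists(name: str) -> bool:
    """Return True if the package name resolves on PyPI's JSON API."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:  # unknown name: possible hallucinated dependency
            return False
        raise


def check_requirements(path: str = "requirements.txt") -> list[str]:
    """Return requirement names from the file that do not exist on PyPI."""
    missing = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Take the bare name before any extras or version specifier.
            match = re.match(r"^[A-Za-z0-9._-]+", line)
            if match and not package_exists(match.group(0)):
                missing.append(match.group(0))
    return missing


if __name__ == "__main__":
    for name in check_requirements():
        print(f"WARNING: '{name}' is not on PyPI -- possible phantom dependency")
```

A check like this only catches names that don't exist yet; once an attacker registers the hallucinated name, the package resolves normally, which is why the post argues for behavioral analysis on top of name-based checks.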