AI Hallucinated This Dependency — Here’s How to Catch It (2026 Guide)
You open a pull request generated by Cursor or another AI coding tool.
Everything looks clean.
Until you see this:
import { secureValidator } from 'enterprise-security-utils';
There’s just one problem.
The package doesn’t exist.
AI coding tools frequently hallucinate dependencies, helper functions, internal utilities, and even entire libraries. These hallucinations are not random; they are structural artifacts of how LLMs generate code.
In this guide, we'll explain:
- Why LLMs hallucinate dependencies
- The different types of hallucinated imports
- How they slip past tests
- How to systematically detect them in PR review
Why LLMs Hallucinate Dependencies
LLMs are trained to predict the most statistically plausible continuation of text.
When generating code, they:
- Infer common patterns
- Predict likely library names
- Generalize from similar ecosystems
If you ask:
Add input validation using a secure validation library.
The model may generate:
import { validateInput } from 'secure-validator-pro';
Even if no such library exists.
The name sounds plausible. The pattern is common. The syntax is correct.
But it’s fabricated.
Types of Hallucinated Dependencies
1. Completely Fake External Libraries
import authShield from 'auth-shield-pro';
- No NPM package exists
- No documentation
- No repository
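A quick sanity check for this case: ask the npm registry directly. Here is a minimal Node 18+ sketch (fetch is built in); a 404 means the package has never been published:

// check-registry.mjs: usage: node check-registry.mjs auth-shield-pro
const name = process.argv[2];
const res = await fetch(`https://registry.npmjs.org/${name}`);
console.log(res.status === 404 ? `${name}: not on npm, likely hallucinated` : `${name}: published`);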
2. Real Library, Wrong API Surface
import { advancedEncrypt } from 'bcrypt';
The library exists. The method does not.
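For contrast, bcrypt's real surface is tiny. A minimal sketch of what the package actually provides (hash, compare, genSalt, plus Sync variants), in an ES module:

import bcrypt from 'bcrypt';

const hash = await bcrypt.hash('hunter2', 10);         // hash with 10 salt rounds
const matches = await bcrypt.compare('hunter2', hash); // verify against the hash
// There is no advancedEncrypt, encrypt, or decrypt export to import.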
3. Internal Utilities That Don’t Exist
import { sanitizeUserInput } from '../utils/security';
The folder exists. The function does not.
4. Schema Assumptions
AI may reference:
- Database columns that don’t exist
- ORM relations never defined
- Environment variables not configured
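For illustration, here is a hypothetical Prisma sketch; securityProfile and isVerified are invented names standing in for whatever the model assumed:

import { PrismaClient } from '@prisma/client';
const prisma = new PrismaClient();

// Hypothetical: schema.prisma defines User with only id and email.
const user = await prisma.user.findUnique({
  where: { email: 'a@example.com' },
  include: { securityProfile: true }, // relation never declared in schema.prisma
});
if (user?.isVerified) { /* column that was never migrated */ }

The upside: with a generated client, tsc --noEmit rejects both fabrications, which is the cheapest way to catch this class of hallucination.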
Why Tests Often Don’t Catch This
Hallucinated dependencies often evade tests because they:
- Only execute in code paths the tests never reach
- Are mocked implicitly by the test setup
- Ship alongside AI-generated tests built on the same false assumptions
If the AI generates a matching virtual mock:
jest.mock('enterprise-security-utils', () => ({ secureValidator: () => true }), { virtual: true });
The tests will pass.
Production will fail.
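End to end, the trap looks like this minimal Jest sketch; secureValidator mirrors the fabricated import from the intro, and virtual: true is what lets Jest mock a module that exists nowhere on disk:

// validate.test.js
jest.mock(
  'enterprise-security-utils',
  () => ({ secureValidator: () => true }),
  { virtual: true }
);
const { secureValidator } = require('enterprise-security-utils');

test('input passes validation', () => {
  expect(secureValidator('payload')).toBe(true); // green in CI
});
// In production there is no mock, so this require throws at startup.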
How to Catch Hallucinated Dependencies in PR Review
Step 1: Verify Every New Import
For each new import:
- Check package.json
- Verify node_modules presence
- Confirm official documentation
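A minimal sketch of that check as a script, assuming Node 18+ and that you run it from the repo root:

// verify-import.mjs: usage: node verify-import.mjs enterprise-security-utils
import { createRequire } from 'node:module';
const require = createRequire(import.meta.url);

const pkg = process.argv[2];
try {
  require.resolve(pkg);
  console.log(`${pkg}: resolvable from this project`);
} catch {
  console.error(`${pkg}: not installed, hallucinated or missing from package.json`);
  process.exitCode = 1;
}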
Step 2: Validate API Surface
Even if the package exists:
- Confirm exported methods
- Check correct usage signatures
- Ensure version compatibility
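One low-tech check is to dump what the installed package actually exports before trusting the AI's usage of it. A sketch (the exact keys vary by package and version):

// list-exports.mjs: usage: node list-exports.mjs bcrypt
import { createRequire } from 'node:module';
const require = createRequire(import.meta.url);

console.log(Object.keys(require(process.argv[2])).sort());
// For bcrypt this prints compare, genSalt, hash, and friends, with no advancedEncrypt in sight.

If the project uses TypeScript, tsc --noEmit catches most named-import mismatches for free.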
Step 3: Cross-Check Internal Utilities
- Search for the function definition
- Confirm implementation is real
- Verify behavior matches name
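A sketch of that search using git grep, wrapped in Node so it slots into the same toolchain (the pattern is deliberately loose):

// find-def.mjs: usage: node find-def.mjs sanitizeUserInput
import { execSync } from 'node:child_process';

const name = process.argv[2];
const pattern = `(function ${name}|const ${name} *=|export .*${name})`;
try {
  console.log(execSync(`git grep -nE "${pattern}"`, { encoding: 'utf8' }));
} catch {
  console.error(`No definition of ${name} found: the import points at nothing.`);
  process.exitCode = 1;
}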
Step 4: Validate Environment Assumptions
- Are environment variables configured?
- Are feature flags present?
- Are secrets provisioned?
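A fail-fast sketch for the first question; the variable names here are placeholders for whatever the PR assumes:

// env-check.mjs: run at boot, before anything else reads configuration.
const required = ['DATABASE_URL', 'CORS_ALLOWED_ORIGINS']; // placeholders
const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
}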
Advanced Pattern: Plausible but Dangerous Defaults
Some hallucinations are not missing code at all; they are real dependencies wired up with dangerously permissive defaults.
Example:
import cors from 'cors';
app.use(cors());
Called with no options, cors() sets Access-Control-Allow-Origin: *, which means wide-open CORS in production.
The dependency exists. The configuration is unsafe.
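A safer sketch, assuming an Express app and an allowlist sourced from a CORS_ALLOWED_ORIGINS variable like the one checked above:

import express from 'express';
import cors from 'cors';

const app = express();
const allowedOrigins = (process.env.CORS_ALLOWED_ORIGINS ?? '').split(',').filter(Boolean);

app.use(cors({ origin: allowedOrigins })); // only listed origins are allowed, never *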
Automating Detection
Manual detection does not scale as AI-generated PR volume increases.
AI-specific PR diff analysis tools can:
- Flag imports not present in dependency tree
- Detect mismatched API usage
- Identify suspicious helper patterns
- Highlight schema mismatches
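As a taste of the approach, here is a minimal sketch of the first check: scan the diff for newly added imports and compare them against package.json (real tools go much further):

// scan-diff.mjs: flag added imports that are not declared dependencies.
import { execSync } from 'node:child_process';
import { readFileSync } from 'node:fs';

const pkg = JSON.parse(readFileSync('package.json', 'utf8'));
const declared = new Set(Object.keys({ ...pkg.dependencies, ...pkg.devDependencies }));

const diff = execSync('git diff origin/main...HEAD -- "*.js" "*.ts"', { encoding: 'utf8' });
const importRe = /^\+.*from ['"]([^.'"][^'"]*)['"]/gm; // added lines with bare-specifier imports

for (const match of diff.matchAll(importRe)) {
  const spec = match[1];
  if (spec.startsWith('node:')) continue; // built-ins are always available
  const base = spec.startsWith('@') ? spec.split('/').slice(0, 2).join('/') : spec.split('/')[0];
  if (!declared.has(base)) console.warn(`Import not in package.json: ${spec}`);
}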
Codebase X-Ray analyzes PR diffs specifically for AI-introduced hallucinated dependencies and generates a fix branch automatically.
Run 3 free PR scans at prodmoh.com.
The Core Principle
Hallucinated dependencies are not stupidity.
They are statistical prediction artifacts.
The model generates what looks plausible — not what is verified.
Your review process must assume:
If it looks new, verify it exists.