AI System Inventory: Govern What You Can Find
Here's a question we ask every organization that claims to have an AI governance program: how many AI systems are you running?
The answer is almost always wrong. Not slightly — wrong by a factor of two or three. A financial services firm told us "about fifteen." An internal audit found forty-three. A healthcare organization said "we only use AI in radiology." They had AI in claims processing, patient scheduling, clinical decision support, and fraud detection.
An AI system inventory is the foundational registry of every AI system your organization develops, deploys, or procures. It catalogs what each system does, where it's deployed, who owns it, what data it processes, what risk tier it carries, and what governance controls apply. Without it, every other governance activity — risk assessment, compliance tracking, audit preparation — is built on incomplete information.
The Shadow AI Problem
Shadow AI — systems deployed without organizational oversight — is the inventory problem at its most acute.
The numbers are stark. A 2025 IBM survey found only 37% of organizations have AI governance policies in place. Harmonic Security's analysis of 22.4 million enterprise AI prompts identified 665 distinct generative AI tools across enterprise environments, while only 40% of those companies had official AI subscriptions. A survey of 12,000 white-collar employees found 60% had used AI tools at work, but only 18.5% knew of any company AI policy.
Shadow AI isn't malicious. It's a natural consequence of accessible AI. A marketing team using an AI writing tool, a finance analyst building a Jupyter notebook forecasting model, a customer service manager plugging in a sentiment API — none think they're creating governance risk. But each creates exposure: privacy violations, unvalidated decision-making, bias in customer treatment.
The 97% finding from a 2025 security analysis is particularly telling: of organizations suffering AI-related security incidents, 97% lacked proper AI access controls. You can't control access to systems you don't know exist.
EU AI Act Article 49: Registration Requirements
The EU AI Act makes inventories a legal obligation. Article 49 requires that before placing a high-risk AI system on the market, providers must register themselves and their systems in the EU database under Article 71.
High-risk systems under Annex III must be registered before market placement, including intended purpose, provider identity, and conformity assessment results. Public authority deployers must also register.
Non-high-risk determinations still trigger registration. If a provider concludes an Annex III system isn't actually high-risk under Article 6(3), they must register the system and document their reasoning — preventing self-classification out of obligations without accountability.
Sensitive domains receive special treatment. Law enforcement, migration, and border control systems go into a secure non-public database section. Critical infrastructure systems register nationally.
These requirements take effect August 2, 2026. The EU AI Act requirements guide covers the full timeline.
What Belongs in an AI System Inventory
A useful inventory goes beyond a list of names. The minimum viable record includes:
System identification: Name, unique ID, version, provider (internal or vendor), deployment date.
Purpose and scope: Business process supported, decisions influenced or automated, affected populations.
Risk classification: Risk tier under applicable frameworks — EU AI Act level, internal tier, sector-specific classification. This drives governance control intensity.
Data profile: Ingested data, sources, presence of personal data or protected characteristics, retention policies.
Ownership: Named system owner (a person, not a team), technical lead, and business sponsor.
Governance status: Assessment status, last validation date, next review date, known issues, compliance framework mappings.
Dependencies: Upstream data sources, downstream consumers, third-party components (APIs, foundation models, vendor services).
ISO 42001 reinforces these elements, requiring organizations to identify and document all AI systems within their management system scope along with risk and impact profiles.
What Doesn't Belong
Inventory scope is a governance decision. Cast too narrowly and you miss risky systems. Cast too wide and it becomes unmanageable.
Basic automation isn't AI. Rules-based systems with explicit if-then logic don't belong. The EU AI Act targets systems that use machine learning, logic- and knowledge-based approaches, or statistical methods to generate outputs such as predictions, recommendations, or decisions.
Embedded vendor AI needs judgment. If your CRM embeds sentiment analysis, does it go in? Under the EU AI Act, if you deploy a high-risk system, yes — obligations exist regardless of who built the AI. For lower-risk vendor-embedded AI, the answer depends on risk appetite.
Research prototypes occupy a gray area. A sandbox model isn't deployed AI. But if it starts informing real decisions — even informally — it needs inventorying. The threshold is influence on consequential decisions.
Building and Maintaining the Inventory
Discovery is the hardest part. Use multiple approaches simultaneously.
Top-down policy. Mandatory registration before deployment. Tie registration to procurement approval and deployment gates so teams can't skip it.
Bottom-up discovery. Survey business units, review procurement records, audit API integrations, analyze cloud service usage for AI-specific services.
Technical detection. Network monitoring and SaaS management tools identify AI service API calls — critical for catching shadow AI that nobody registered.
The inventory isn't a project — it's a process. Build review cadences (quarterly for high-risk, annually for lower-risk) and assign responsibility for currency. Model governance extends the inventory into deeper lifecycle tracking — not just knowing what exists, but tracking validation status, performance, and governance posture over time.
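The review cadence can be made mechanical rather than aspirational. A minimal sketch, assuming the quarterly/annual cadences above (the tier names and day counts are illustrative policy choices, not a standard):

```python
from datetime import date, timedelta

# Illustrative cadences by risk tier, in days; tune to your policy.
REVIEW_CADENCE_DAYS = {
    "high": 90,      # quarterly review for high-risk systems
    "limited": 365,  # annual review for lower-risk systems
    "minimal": 365,
}


def next_review_date(last_review: date, risk_tier: str) -> date:
    """Compute the next scheduled review from the last completed one."""
    return last_review + timedelta(days=REVIEW_CADENCE_DAYS[risk_tier])


def overdue_for_review(last_review: date, risk_tier: str, today: date) -> bool:
    """True if the system has blown past its review window."""
    return today > next_review_date(last_review, risk_tier)
```

Wiring a check like this into a dashboard or ticketing system turns "the inventory is a process" into something with named owners and due dates.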
Every governance capability depends on the inventory. Risk assessments require knowing which systems to assess. Compliance tracking requires knowing which regulations apply. Audit preparation requires knowing what documentation should exist. Start with the inventory. Everything else builds on it.
Get visibility into your full AI portfolio. Start with Starkguard or request a demo to see how centralized inventory management makes governance possible.