A Sensitive Compartmented Information Facility (SCIF) is a secure room or area accredited to process classified intelligence at a specified level. SCIF accreditation imposes strict physical security controls, emanation (TEMPEST) shielding, and — critically for AI procurement — mandatory network isolation. Any software deployed in a SCIF must operate with no external network connectivity whatsoever. This is not a preference. It is a requirement.
What SCIF Means for AI System Architecture
Most commercial AI systems — including many marketed to defence and government — are not SCIF-compatible. The reason is architectural: they rely on external API calls, cloud-hosted model inference, or remote telemetry. Even a single outbound HTTP request to a cloud server disqualifies a system from SCIF deployment.
Several categories of external connectivity typically disqualify AI systems:
- Cloud model inference APIs: the model runs on a remote server rather than locally
- License validation servers: periodic pings to check license validity
- Telemetry and usage reporting: common in enterprise software
- Automatic update services
- Any outbound logging or monitoring calls
A system that passes all other security requirements but sends a single analytics ping on startup cannot be deployed in a SCIF. Procurement teams must verify zero outbound connectivity — not just reduced connectivity.
What On-Premise AI Deployment Actually Requires
Genuine on-premise AI deployment means the model weights, inference engine, and all processing logic reside on hardware within the deployment environment. For SCIF deployment specifically, this means:
- Self-contained model files: all model weights and dependencies must be transferred to the deployment environment through approved secure channels (typically removable media) before deployment
- Local inference only: no API calls to external model hosting services — all inference runs on local CPU or GPU
- No telemetry: the system produces no outbound data of any kind during operation — no usage logging, no performance telemetry, no error reporting
- Air-gap verification: the system can be verified to produce zero network traffic using standard network monitoring tools
- Containerisation: Docker or equivalent containerisation enables consistent, verifiable deployment without internet access — dependencies are bundled in the container image
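The air-gap verification step above is normally done with dedicated network monitoring (tcpdump on a span port, for instance), but a quick host-side spot check is also possible. The sketch below is illustrative only — it is not an EchoDepth tool — and assumes a Linux host, where the kernel exposes TCP connection state in `/proc/net/tcp`:

```python
import os

# Sketch: list established TCP connections whose remote endpoint is not
# loopback. On a correctly isolated host running an air-gapped workload,
# this list should be empty. A host-side snapshot like this supplements,
# but does not replace, proper network-level traffic capture.

def external_connections(proc_file="/proc/net/tcp"):
    """Return (local, remote) hex-address pairs for ESTABLISHED
    connections to non-loopback remote hosts."""
    found = []
    with open(proc_file) as f:
        next(f)  # skip the header row
        for line in f:
            fields = line.split()
            local, remote, state = fields[1], fields[2], fields[3]
            if state != "01":  # 01 == TCP_ESTABLISHED
                continue
            rem_ip_hex = remote.split(":")[0]
            # /proc/net/tcp stores IPv4 addresses as little-endian hex,
            # so loopback 127.0.0.1 appears as 0100007F.
            if rem_ip_hex != "0100007F":
                found.append((local, remote))
    return found

if __name__ == "__main__" and os.path.exists("/proc/net/tcp"):
    conns = external_connections()
    print(f"{len(conns)} non-loopback established connection(s)")
```

Running this repeatedly during system operation, alongside packet capture, gives procurement teams a second line of evidence that the "zero outbound traffic" claim holds.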
EchoDepth meets all of these requirements. It deploys via Docker containers, runs all inference locally on standard server hardware (no GPU required for standard resolution), and produces zero outbound network traffic. The deployment image can be transferred to an isolated network via approved removable media and verified against a published cryptographic hash.
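Verifying a transferred image against a published hash needs no special tooling. A minimal sketch — the file name and the idea of a vendor-published digest are placeholders here, not real EchoDepth artefacts:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks, so large
    container images never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_image(path, published_hash):
    """Compare the computed digest to the vendor-published value
    before loading the image onto the isolated network."""
    return sha256_of(path) == published_hash.lower()

# Example usage with placeholder names:
# ok = verify_image("echodepth-image.tar", "3a7bd3...")
```

The same check can be done with `sha256sum -c` on the command line; the point is that verification happens on the isolated side, after transfer, against a hash obtained through a separate trusted channel.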
"SCIF-compatible" must mean zero external data transmission — not reduced external transmission. Any system that produces outbound network traffic, regardless of encryption, cannot be deployed in a classified compartmented environment.
Hardware Requirements for On-Premise AI in Classified Environments
A common concern in SCIF AI procurement is hardware: do classified environments need specialist servers to run AI models locally? For EchoDepth, the answer is no. The system runs on standard server hardware — no GPU required for standard 720p camera inputs. A contemporary server-class machine with 16GB+ RAM and a multi-core CPU is sufficient for single-camera deployment.
For multi-camera deployments — for example, monitoring multiple interview rooms simultaneously or monitoring an entire SOC analyst floor — GPU acceleration reduces latency and enables higher concurrent stream counts. Standard data centre GPU hardware (NVIDIA RTX class or equivalent) is sufficient; specialist defence-grade hardware is not required.
Camera requirements are similarly minimal: a standard RGB camera at 720p minimum. Existing CCTV, interview room cameras, and laptop webcams all qualify. No infrared, no thermal imaging, and no specialist optics are required.
UK Data Residency and Compliance in Classified Contexts
For UK government and MoD procurement, data residency requirements add another layer of constraint. Even where cloud deployment is theoretically permitted, processing classified or sensitive biometric data on US-based or European cloud infrastructure raises data sovereignty questions that many procurement frameworks cannot accommodate.
EchoDepth processes all data within UK borders by default. For SCIF deployment, all processing occurs within the facility itself — the most stringent possible data residency posture. The system is aligned with NCSC principles for on-premise deployment and produces DSAT-compatible audit records.
A full Data Processing Agreement covering biometric data under UK GDPR and the Data Protection Act 2018 is available under NDA for procurement teams requiring detailed compliance documentation.
Verifying SCIF Compatibility in Procurement
Procurement teams evaluating AI systems for classified environment deployment should request the following from vendors:
- Network traffic analysis confirming zero outbound connections during operation
- Deployment architecture documentation confirming all inference is local
- License management approach that does not require network connectivity
- Container image hash or equivalent verification for supply chain integrity
- Air-gapped demo environment for hands-on evaluation
EchoDepth provides all of these. An air-gapped demo environment is available on request for vetted procurement teams, allowing hands-on evaluation in a network-isolated environment before any classified facility deployment.
EchoDepth deploys fully on-premise in SCIF and classified environments. Zero outbound transmission. No cloud dependency. UK data residency as default. NDA available.