Master Architecture — Constitution v8.0
Document Status: RATIFIED (The Operational Resilience Update)
Classification: INTERNAL // COMMAND EYES ONLY
Effective Date: 2026-03-30
Preamble: The Doctrine of Sovereign Computation
We, the People of Opplet, in order to secure our digital sovereignty, ensure the integrity of our data, and cultivate a meritocratic forge for talent, do ordain and establish this Infrastructure Constitution.
This enclave exists to enforce a separation of powers between The Sovereign (who owns the infrastructure) and The Talent (who utilizes it). We reject the fragility of monolithic chaos and the tyranny of third-party dependence.
Our Four Pillars of Operation:
- Identity is Sovereign: Control of the root credential is the only true ownership. We delegate access, never authority.
- Code is Law: Policy is not written in memos; it is enforced by firewalls, pipelines, and automation.
- Automation is the Manager: Human intervention is a failure of design. The machine must govern the routine; the human governs the exception.
- Observation is Truth: Trust is a vulnerability. We do not trust; we verify through logs, metrics, and immutable audit trails.
Therefore, let this document serve as the supreme law for all hardware, software, and network protocols within the Opplet, KenyaX, and WiseNxt domains. Any configuration found in violation of these articles is null and void.
Changelog: v7.9 → v8.0
- Section 6D – Backup Bridge: Increased Annex backup frequency from nightly to every four hours. Added mandatory backup arrival verification via an n8n-Alpha canary workflow.
- Section 7 – Kill Switch Matrix: Added L0 (Backup Failure) escalation level for proactive alerting when scheduled backups do not arrive within their expected window.
- Section 8 – Intelligence Layer: Added Section 8D (Active Alerting Baseline) defining mandatory alert rules for Wazuh, Grafana, and ZFS health monitoring across all zones.
- Section 9A – OPNsense Resilience: New section mandating Proxmox HA priority for the OPNsense VM and automated configuration backup export.
- Section 10 – Disaster Recovery Protocol: New section establishing Recovery Time Objectives, Recovery Point Objectives, rebuild priority order, and tabletop exercise cadence.
- Section 11 – RAM Headroom Audit: New section mandating quarterly memory utilization reviews and establishing a 75% sustained utilization ceiling per node.
- Section 12 – Documentation Structure: Relocated from Section 9 in v7.9 and expanded to include DR runbook storage requirements.
- Terminology correction: Replaced all references to The Outpost as “air-gapped” with “network-isolated” to accurately reflect its restricted-routing topology.
- Zone renames: Zone 1 renamed from “The Vault” to “The Den” (reflects its operational role, not just storage). Zone 5 renamed from “The Sandbox” to “The Range” (reflects adversarial nature, not safe containment).
1. The Hemispheric Strategy (Physical Topology)
The enclave is strictly partitioned into three physical nodes to isolate high-risk Talent workloads from the Sovereign control plane.
- The Manor (Sovereign Core): High-Availability, Low-Power, Sovereign. Role: The Structure and Brain. Identity (Alpha), Internal Automation, Capital Preservation, Observability.
- The Annex (Delivery Edge): High-Performance, High-Bandwidth. Role: The Forge & Front Door. Source Code, Talent Lifecycle, CI/CD Compilation, Public Traffic Proxies.
- The Outpost (Live Fire Range): High-Density, Volatile. Role: The Muscle. Network-isolated range existing strictly for exploitation and target practice.
2. Infrastructure Zoning Strategy (The Dwelling Analogy)
| Zone | Dwelling Designation | Identity Provider | Location | Risk Profile |
|---|---|---|---|---|
| Zone 0 | The Basement | LDAP-Alpha | The Manor | Critical |
| Zone 1 | The Den | Admin Only | The Manor | Sovereign |
| Zone 2 | The Office | LDAP-Alpha | The Manor | High |
| Zone 3 | The Kitchen | LDAP-Beta / CI | The Annex | Medium |
| Zone 4 | The Lounge | Mixed (OIDC) | The Annex | Low |
| Zone 5 | The Range | LDAP-Beta | The Outpost | Extreme |
The Alpha-Override Rule: Zone 4/5 apps use LDAP-Beta for talents but MUST map Administrative privileges to LDAP-Alpha to prevent Sovereign lockout during a Talent Wipe.
3. Hardware Allocation (The Metal)
3A. The Manor: The Command Post (Zones 0–2)
Nodes: 3x Xeon E3-1275v5 (64 GB RAM). Storage: Local ZFS Replication (15-minute interval). Mission: Run the Sovereign Core without public resource contention.
| Component | Role | Zone | RAM |
|---|---|---|---|
| Authentik | Gatekeeper. OIDC/SAML Hub. | 0 (Basement) | 4 GB |
| LDAP-Alpha | Identity A. Sovereign Directory. | 0 (Basement) | 2 GB |
| Watchtower | Wazuh / Loki / Grafana / Matomo. | 0 (Basement) | 8 GB |
| n8n-Alpha | The Butler. Internal Ops. | 1 (Den) | 4 GB |
| BookStack/Vaultwarden | The Grimoire / Private Data. | 1 (Den) | 6 GB |
| ERPNext | The Bursar. Finance/Inventory. | 2 (Office) | 14 GB |
Note: BookStack/Vaultwarden allocation reduced from 8 GB to 6 GB and ERPNext from 16 GB to 14 GB following the v8.0 RAM Headroom Audit (Section 11), freeing 4 GB of headroom per node for Watchtower spikes.
3B. The Annex: The Delivery Edge (Zones 3–4)
Node: 1x AMD Ryzen 9 7950X3D, 128 GB DDR5 ECC RAM. Storage: 2x 1.92 TB Gen4 Datacenter NVMe SSDs (Local ZFS Mirror). Mission: Handle heavy I/O, CI/CD compilation, source code management, and proxy all public/talent web traffic.
| Component | Role | Zone | RAM |
|---|---|---|---|
| GitLab Core | The Factory. Source Code & Registry. | 3 (Kitchen) | 24 GB |
| LDAP-Beta | Identity B. Talent Directory. | 3 (Kitchen) | 4 GB |
| Build Farm | Runners. CI/CD Compilers. | 3 (Kitchen) | 40 GB |
| Moodle | Ledger. Talent Database & Web UI. | 4 (Lounge) | 16 GB |
| Arena Comms | Jitsi / Discourse / HumHub. | 4 (Lounge) | 32 GB |
| Traefik/Proxies | Ingress, Auth Outpost, Guacamole. | 4 (Lounge) | 12 GB |
3C. The Outpost: The Live Fire Range (Zone 5)
Node: 1x AMD Ryzen 9 3900 (Auction), 128 GB DDR4 ECC RAM. Storage: 2x 1+ TB U.2 Datacenter NVMe SSDs (Local ZFS Mirror). Mission: Host defensible VMs and vulnerable targets in a network-isolated environment.
| Component | Role | Zone | RAM |
|---|---|---|---|
| Range Targets | Defensible VMs & Payloads. | 5 (Range) | 120 GB |
| Telemetry | Local Wazuh Forwarders. | 5 (Range) | 8 GB |
4. Software Matrix (The Weapons Locker)
4A. The Sovereign Core (The Manor)
| App | Product Role | Zone | Tech | Identity Source |
|---|---|---|---|---|
| OpenLDAP-A | Root Identity Provider | 0 | Native | Self (Alpha Root) |
| WireGuard | Secure Sovereign Tunnel | 0 | Kernel | Key Pairs |
| Wazuh | SIEM / Security Monitoring | 0 | C++ | Local Admin |
| Matomo | Privacy-First Analytics | 0 | PHP | Local Admin |
| Grafana | Observability Dashboards | 0 | Go | LDAP-Alpha |
| n8n-Alpha | The Butler (Internal Auto) | 1 | Node | LDAP-Alpha |
| BookStack-A | The Grimoire (Private SOPs) | 1 | PHP | LDAP-Alpha |
| Vaultwarden | Credential Storage | 1 | Mixed | Local Admin |
| ERPNext | The Bursar (Finance/Inv) | 2 | Python | LDAP-Alpha |
4B. The Factory & Edge (The Annex & The Outpost)
| App | Product Role | Location | Identity Source |
|---|---|---|---|
| GitLab CE | The Forge (Opplet/KenyaX) | Kitchen (Z3) | Mixed (Alpha/Beta) |
| OpenLDAP-B | Talent Directory | Kitchen (Z3) | Self (Beta Root) |
| GitLab Runner | CI/CD Compiler (The Muscle) | Kitchen (Z3) | Reg. Token |
| Moodle | Talent Ledger & LMS | Lounge (Z4) | LDAP-Beta |
| Jitsi/HumHub | Arena Comms (Community) | Lounge (Z4) | Authentik (OIDC) |
| Traefik | The Gatekeeper (Ingress) | Lounge (Z4) | Local Config |
| Guacamole | The Air-Lock (Remote Access) | Lounge (Z4) | Authentik (OIDC) |
| BookStack-B | Common Library (Public Docs) | Lounge (Z4) | LDAP-Beta |
| Target VMs | Exploitation Targets | Range (Z5) | Local Accounts |
4C. Infrastructure OS & Dependencies
- Hypervisor: Proxmox VE (Debian) across all physical nodes.
- Edge Router: OPNsense (Hardened BSD) virtualized on The Manor (Basement). See Section 9A for resilience mandate.
- Storage: OpenZFS (Strictly local to each node).
- SMTP Relay: Mailgun / SES for trusted outbound email.
- Watchdog: Uptime Kuma on an external micro-VPS provides independent uptime monitoring. See Section 8C for expanded health checks.
5. The CMS/Static Triad (Public Fronts)
Hosted on The Annex (Zone 4 – The Lounge) behind Traefik.
- Opplet.com: Hugo (Static). Commercial / Infrastructure Brand.
- KenyaX.com: Grav (Flat-File). Logistics & Impact.
- WiseNxt.com: MkDocs (Static). Recruitment & Training.
6. Network Protocol (The Sovereign Gap)
6A. The Janitor Rule (Traffic Flow)
- The Manor → Annex/Outpost: ALLOWED. (Telemetry Pull, Admin Management).
- Annex/Outpost → The Manor: DENIED.
Exceptions:
- Exception 1: OIDC calls to Authentik (HTTPS 443).
- Exception 2: n8n-Alpha triggers (encrypted internal webhooks via the X-Internal-Token header).
- Exception 3: The Backup Bridge. The Annex pushes state to PBS (Zone 1). See Section 6D.
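The token check behind Exception 2 can be sketched as follows — a minimal receiver-side guard, assuming the shared secret is distributed out-of-band. The function name and header handling are illustrative, not the actual n8n-Alpha configuration:

```python
import hmac

def is_authorized(headers: dict, secret: str) -> bool:
    """Accept a Manor-bound webhook only when the X-Internal-Token
    header matches the shared secret. Uses a constant-time comparison,
    and an empty secret never authorizes anything."""
    presented = headers.get("X-Internal-Token", "")
    return bool(secret) and hmac.compare_digest(presented, secret)
```

The constant-time comparison matters here: a naive `==` check leaks timing information that an Annex-side attacker could use to recover the token byte by byte.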
6B. The Storage Isolation Mandate
Distributed storage protocols (including Ceph, GlusterFS, vSAN) are EXPLICITLY BANNED from spanning across physical nodes. Storage must remain strictly local (ZFS) to each hypervisor to preserve NVMe IOPS capabilities and enforce the Sovereign Gap. State transfer shall occur exclusively via the encrypted Backup Bridge.
6C. The Talent Proxy
- Direct Access: Talents log into access.wisenxt.com (Zone 4) on The Annex.
- Isolation: Apache Guacamole on The Annex proxies the VNC/SSH connection directly to The Outpost via Hetzner’s vSwitch. The talent’s local hardware never touches the execution network layer.
6D. The Backup Bridge (Revised in v8.0)
The Annex pushes State (GitLab Artifacts, Moodle DB) to The Manor via Proxmox Backup Server. This is a Drop-Only permission: The Annex cannot read or delete existing backups on The Manor.
v8.0 Changes:
- Frequency: Increased from nightly to every four hours (00:00, 04:00, 08:00, 12:00, 16:00, 20:00 UTC).
- Canary Verification: n8n-Alpha runs a scheduled workflow every four hours (offset by 30 minutes from the backup window) that checks the latest PBS backup timestamp. If the most recent backup is older than 5 hours, n8n-Alpha fires a high-priority Pushover alert to the Owner.
- GitLab Backup Integrity: Once per week (Sunday 03:00 UTC), n8n-Alpha triggers a test restore of the latest GitLab backup archive into a throwaway container on The Manor. If the integrity check fails, a critical alert is dispatched immediately.
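The staleness test at the heart of the canary can be sketched as a single predicate — a simplified stand-in for the timestamp comparison the n8n-Alpha workflow performs, not the workflow itself:

```python
from datetime import datetime, timedelta, timezone

# Section 6D: with pushes every 4 hours and the canary offset by
# 30 minutes, a newest backup older than 5 hours means at least
# one push was missed.
STALE_AFTER = timedelta(hours=5)

def backup_is_stale(latest_backup: datetime, now: datetime) -> bool:
    """True when the most recent PBS backup timestamp falls outside
    the 5-hour canary window mandated by Section 6D."""
    return now - latest_backup > STALE_AFTER
```

The 5-hour threshold deliberately exceeds the 4-hour cadence plus the 30-minute canary offset, so a single slow-but-successful push does not trigger a false L0 alert.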
7. The Kill Switch Matrix
| Level | Trigger | Action | Mechanism |
|---|---|---|---|
| L0 | Backup Failure | Alert Owner | n8n-Alpha detects missing PBS backup; Pushover critical notification. (NEW in v8.0) |
| L1 | Moodle Inactivity | Suspend User | n8n-Alpha locks LDAP-Beta account. |
| L2 | Range Breach | Isolate Zone 5 | OPNsense cuts VLAN 5 WAN access. |
| L3 | Annex Compromise | Physical Sever | OPNsense on The Manor disables the uplink port to The Annex. |
8. The Intelligence Layer (Observability & Meritocracy)
8A. The Split-Brain Protocol
- Sovereign Data: (Internal ops) Stays on The Manor ZFS. No offsite transit except via encrypted PBS.
- Liability Data: (Talent logs) Wazuh agents on The Outpost and Annex forward immutable telemetry over the Janitor Rule exception directly to Watchtower (Zone 0) for non-repudiation.
8B. The Meritocracy Loop
- Event: Talent passes the foundational exam in Moodle (The Lounge).
- Signal: Moodle sends an encrypted webhook over the Janitor Rule exception to
n8n-Alpha(The Den). - Action:
n8n-Alphaconnects to LDAP-Beta (The Kitchen) and elevates the user’s group status. - Result: The user instantly gains Single Sign-On (OIDC) access to HumHub and BookStack-Beta, establishing residency in The Lounge.
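The elevation step can be sketched as the directory change n8n-Alpha would submit to LDAP-Beta. All names below — the directory suffix, the organizational units, and the residents group — are illustrative assumptions, and the payload shape merely mimics what a typical LDAP client's modify call expects:

```python
BASE_DN = "dc=wisenxt,dc=com"                      # assumed directory suffix
RESIDENT_GROUP = f"cn=residents,ou=groups,{BASE_DN}"  # assumed group name

def elevation_change(uid: str) -> tuple:
    """Return (group_dn, modify_payload): add the talent's DN to the
    residents group, which unlocks OIDC access via Authentik."""
    member_dn = f"uid={uid},ou=talents,{BASE_DN}"
    return RESIDENT_GROUP, {"member": [("MODIFY_ADD", [member_dn])]}
```

Keeping the change as a pure group-membership addition means a Talent Wipe can reverse it with a single MODIFY_DELETE, without touching the underlying account entry.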
8C. The External Pulse (Expanded in v8.0)
- Uptime Kuma (VPS): Pings The Annex and Manor public IPs every 60 seconds.
- Service-Level Health Checks (NEW): Uptime Kuma additionally monitors the GitLab health endpoint, Moodle login page, Traefik dashboard, and Authentik OIDC discovery URL. Each check runs on a 120-second interval with independent alerting.
- Dead Man’s Switch: If The Manor goes dark, a high-priority Pushover notification is sent to the Owner via 5G/LTE.
8D. Active Alerting Baseline (NEW in v8.0)
Passive log collection without active alerting is an incomplete defense posture. The following alert rules are mandatory across all zones and must be configured in Wazuh and/or Grafana:
| Alert | Source | Threshold | Destination |
|---|---|---|---|
| Failed SSH Attempts | Wazuh (All Zones) | >5 within 10 minutes per source IP | Pushover (High) |
| LDAP Bind Failures | Wazuh (Zones 0, 3) | >10 within 5 minutes | Pushover (High) |
| ZFS Pool Degradation | Grafana (All Nodes) | Pool status != ONLINE | Pushover (Critical) |
| Unusual Outbound Traffic | Wazuh (Zone 5) | Any egress to non-whitelisted IPs | Pushover (Critical) |
| Disk Usage | Grafana (All Nodes) | >85% on any ZFS dataset | Pushover (Medium) |
| Container Restart Loop | Grafana (All Zones) | >3 restarts in 15 minutes | Pushover (High) |
| Backup Canary Failure | n8n-Alpha | PBS timestamp > 5 hours old | Pushover (Critical) |
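The windowed thresholds above share one mechanic; the first rule (Failed SSH Attempts, per source IP) can be sketched as a sliding-window counter. This is a conceptual model of the rule, not Wazuh's actual correlation engine, and the class name is an invention for illustration:

```python
from collections import defaultdict, deque

WINDOW = 600   # seconds: "within 10 minutes"
THRESHOLD = 5  # failures: alert fires on the failure that exceeds 5

class SshFailureMonitor:
    """Sliding-window counter mirroring the 'Failed SSH Attempts' rule:
    record() returns True (dispatch Pushover High) once a single source
    IP exceeds 5 failures inside any 10-minute window."""

    def __init__(self):
        self._events = defaultdict(deque)  # src_ip -> failure timestamps

    def record(self, src_ip: str, ts: float) -> bool:
        q = self._events[src_ip]
        q.append(ts)
        # Drop failures that have aged out of the 10-minute window.
        while q and ts - q[0] > WINDOW:
            q.popleft()
        return len(q) > THRESHOLD
```

Tracking per source IP, as the rule specifies, keeps a distributed scan from hiding below the threshold while a single brute-forcer trips it on the sixth attempt.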
9. Edge Router Resilience (NEW in v8.0)
9A. OPNsense High-Availability Mandate
OPNsense is virtualized on The Manor (Basement), which couples edge routing availability to the Manor hypervisor. Full redundancy would require a second physical router appliance (outside the scope of this update). The following low-cost mitigations are mandated:
- Proxmox HA Priority: The OPNsense VM must be assigned the highest HA restart priority on The Manor cluster, ensuring it is the first VM restored after any hypervisor interruption.
- Automated Config Export: n8n-Alpha exports the OPNsense configuration backup (XML) to BookStack-Alpha (The Grimoire) every 24 hours. This ensures a known-good configuration is always available for rapid rebuild on any node.
- Recovery Target: With a current config export and a Proxmox template, OPNsense must be rebuildable from scratch within 30 minutes.
10. Disaster Recovery Protocol (NEW in v8.0)
The v7.9 architecture referenced disaster recovery keys in BookStack-Alpha but did not define a formal recovery protocol. This section establishes binding recovery objectives and procedures.
10A. Recovery Objectives
| Node | RTO (Recovery Time) | RPO (Recovery Point) | Rationale |
|---|---|---|---|
| The Manor | 4 hours | 15 minutes (ZFS snap) | Sovereign services; local replication provides tight RPO. |
| The Annex | 8 hours | 4 hours (PBS push) | Dependent on backup restore from Manor PBS; larger data volume. |
| The Outpost | 24 hours | Last known good snapshot | Volatile by design; targets are rebuildable from templates. |
10B. Rebuild Priority Order
In a multi-node failure scenario, nodes are rebuilt in the following strict order:
- Priority 1: OPNsense (The Manor). Without edge routing, no other service is reachable.
- Priority 2: LDAP-Alpha and Authentik (The Manor, Zone 0). Identity must be online before any dependent service.
- Priority 3:
n8n-Alphaand Watchtower (The Manor, Zones 0–1). Automation and observability restore operational awareness. - Priority 4: GitLab and Traefik (The Annex, Zones 3–4). Restores development and public-facing services.
- Priority 5: Moodle, Arena Comms, Guacamole (The Annex, Zone 4). Talent-facing services restored last.
- Priority 6: The Outpost (Zone 5). Range is rebuilt from VM templates; no backup dependency.
10C. DR Runbook Location
The complete step-by-step disaster recovery runbook, including credential bootstrap procedures, is stored in BookStack-Alpha (The Grimoire, Zone 1). A printed hard copy of the credential bootstrap section is maintained in the Owner’s physical safe.
10D. Tabletop Exercise Cadence
A tabletop DR walkthrough must be conducted quarterly. The exercise simulates the total loss of one node and walks through the rebuild sequence from bare metal to service restoration. Findings are documented in BookStack-Alpha and any identified gaps must be resolved within 14 days.
11. RAM Headroom Audit (NEW in v8.0)
Memory contention is a silent availability risk. The following governance applies to all nodes:
- Quarterly Review: Actual memory utilization per container/VM is reviewed against allocated RAM every quarter. Results are logged in BookStack-Alpha.
- 75% Ceiling: No node may sustain average memory utilization above 75% of total physical RAM over a rolling 7-day window. Breaching this threshold triggers a mandatory rebalancing review.
- Initial Adjustments (v8.0): The Manor’s BookStack/Vaultwarden allocation has been reduced from 8 GB to 6 GB and ERPNext from 16 GB to 14 GB, reclaiming 4 GB of headroom for Watchtower (Wazuh/Grafana/Loki) burst capacity.
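The 75% ceiling check can be sketched as a predicate over the sampled window — a minimal sketch assuming utilization samples in GB (the sampling cadence, e.g. hourly Grafana exports over 7 days, is an assumption):

```python
CEILING = 0.75  # Section 11: max sustained utilization per node

def breaches_ceiling(used_gb_samples: list, total_gb: float) -> bool:
    """True when average memory utilization across the sampled window
    (e.g. one sample per hour over a rolling 7 days) exceeds the 75%
    ceiling, triggering a mandatory rebalancing review."""
    if not used_gb_samples:
        return False  # no data: defer rather than alert
    mean_used = sum(used_gb_samples) / len(used_gb_samples)
    return mean_used / total_gb > CEILING
```

Averaging over the rolling window, rather than alerting on instantaneous peaks, is what distinguishes this governance rule from the real-time Grafana alerts in Section 8D.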
12. Documentation Structure
With GitLab on The Annex, the physical location of documentation is split to enforce the Sovereign Gap.
- GitLab (The Annex – Kitchen): Technical Source of Truth. Infrastructure as Code, CI/CD pipelines, Opplet/KenyaX source code.
- BookStack-Alpha (The Manor – Den): The Sovereign’s Grimoire. Owner’s private SOPs, architecture blueprints, disaster recovery runbook (Section 10C), OPNsense config backups (Section 9A), and disaster recovery keys.
- BookStack-Beta (The Annex – Lounge): The Common Library. Public knowledge base, community guides, and Talent onboarding instructions.
END OF DOCUMENT