The Enterprise SOC Architecture relies on seamless integrations between multiple security and monitoring platforms. This page documents the integration mechanisms, protocols, and data formats used throughout the system.

Integration Architecture

All integrations are designed to be loosely coupled and protocol-based to ensure scalability and maintainability.

Core Integration Points

IDS to Log Pipeline

Purpose: Stream IDS alerts and network events to the central log processing pipeline
Protocol: Syslog, file-based, EVE JSON
Data Format:
  • Suricata: EVE JSON (Extensible Event Format)
  • Snort: Unified2 binary format or syslog
Configuration:
# Suricata EVE JSON output
- eve-log:
    enabled: yes
    filetype: regular
    filename: eve.json
    types:
      - alert
      - http
      - dns
      - tls
Logstash Input:
input {
  file {
    path => "/var/log/suricata/eve.json"
    codec => json
    type => "suricata"
  }
}
Key Fields: timestamp, src_ip, dest_ip, alert_signature, severity, protocol
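As a complement to the Logstash input above, a minimal Python sketch of how a consumer could flatten these key fields out of an EVE JSON stream (the nesting of signature and severity under the `alert` object follows Suricata's EVE schema; the helper name is illustrative):

```python
import json

def extract_alert(line: str):
    """Parse one EVE JSON line; return the key fields for alert events, else None."""
    event = json.loads(line)
    if event.get("event_type") != "alert":
        return None
    alert = event.get("alert", {})
    return {
        "timestamp": event.get("timestamp"),
        "src_ip": event.get("src_ip"),
        "dest_ip": event.get("dest_ip"),
        "alert_signature": alert.get("signature"),
        "severity": alert.get("severity"),
        "protocol": event.get("proto"),
    }

# Example EVE alert line (values are illustrative):
sample = ('{"timestamp": "2026-03-04T10:30:00.000000+0000", "event_type": "alert", '
          '"src_ip": "192.168.1.100", "dest_ip": "203.0.113.50", "proto": "TCP", '
          '"alert": {"signature": "ET MALWARE Trojan Download", "severity": 1}}')
record = extract_alert(sample)
```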

Log Pipeline to Storage

Purpose: Store processed and normalized events in a searchable index
Protocol: HTTP/HTTPS (Elasticsearch REST API)
API Endpoint: POST /_bulk (bulk indexing)
Data Format: JSON documents with normalized schema
Index Strategy:
  • Time-based indices: soc-events-YYYY.MM.DD
  • Index templates for consistent field mapping
  • Lifecycle policies for data retention
Authentication: API key or username/password
Sample Document:
{
  "@timestamp": "2026-03-04T10:30:00.000Z",
  "event_type": "alert",
  "source_ip": "192.168.1.100",
  "destination_ip": "203.0.113.50",
  "severity": "high",
  "signature": "ET MALWARE Trojan Download",
  "protocol": "tcp",
  "source_system": "suricata"
}
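A sketch of how a client could assemble the `_bulk` request body for the daily-index strategy above: NDJSON pairs of an action line and a document line, routed by the document's `@timestamp` (the helper name is illustrative):

```python
import json

def build_bulk_body(events, index_prefix="soc-events"):
    """Build an Elasticsearch _bulk request body (NDJSON) from normalized events.

    Each document is routed to a daily index derived from its @timestamp,
    matching the soc-events-YYYY.MM.DD strategy.
    """
    lines = []
    for event in events:
        day = event["@timestamp"][:10].replace("-", ".")  # 2026-03-04 -> 2026.03.04
        lines.append(json.dumps({"index": {"_index": f"{index_prefix}-{day}"}}))
        lines.append(json.dumps(event))
    return "\n".join(lines) + "\n"  # the bulk API requires a trailing newline

body = build_bulk_body([{"@timestamp": "2026-03-04T10:30:00.000Z",
                         "event_type": "alert", "severity": "high"}])
```

The resulting body is POSTed to `/_bulk` with `Content-Type: application/x-ndjson`.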

Storage to SIEM Platform

Purpose: Query and analyze stored security events
Protocol: Elasticsearch Query DSL (REST API)
Integration Method:
  • Wazuh indexer (Elasticsearch fork) or
  • Direct Elasticsearch backend
Query Operations:
  • Real-time event streaming
  • Historical event search
  • Aggregation and statistics
  • Correlation queries
Wazuh Configuration:
<indexer>
  <host>https://elasticsearch:9200</host>
  <user>wazuh</user>
  <password>password</password>
</indexer>
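To illustrate the query operations listed above, a sketch of a Query DSL body for one correlation-style search: events of a given severity in a recent window, aggregated by source IP. Field names follow the normalized schema used elsewhere on this page; the helper itself is illustrative:

```python
def build_severity_query(severity, hours=24):
    """Query DSL body: events of one severity in a recent time window,
    aggregated by source IP (a common correlation starting point)."""
    return {
        "size": 0,  # aggregation only; skip returning individual hits
        "query": {
            "bool": {
                "filter": [
                    {"term": {"severity": severity}},
                    {"range": {"@timestamp": {"gte": f"now-{hours}h"}}},
                ]
            }
        },
        "aggs": {
            "by_source": {"terms": {"field": "source_ip", "size": 10}}
        },
    }

query = build_severity_query("high", hours=1)
```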

Metrics to Security Platform

Purpose: Correlate infrastructure metrics with security events
Protocol: Prometheus Remote Read API or webhook
Integration Approach:
  • Prometheus Alertmanager webhooks to Wazuh
  • Custom exporters for Wazuh metrics
  • Grafana as unified visualization layer
Alert Format:
{
  "status": "firing",
  "alerts": [{
    "labels": {
      "alertname": "HighCPUUsage",
      "severity": "warning",
      "instance": "server-01"
    },
    "annotations": {
      "summary": "CPU usage above 90%"
    }
  }]
}
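One way for a webhook receiver to consume this payload is to map each firing alert onto the normalized event schema used across the pipeline. A sketch, assuming the receiver sits between Alertmanager and the log pipeline (default values are assumptions):

```python
def alertmanager_to_events(payload):
    """Map an Alertmanager webhook payload (shape shown above) to normalized events."""
    events = []
    for alert in payload.get("alerts", []):
        labels = alert.get("labels", {})
        events.append({
            "event_type": "metric",
            "severity": labels.get("severity", "info"),
            "source_system": "prometheus",
            "message": alert.get("annotations", {}).get("summary", ""),
            "tags": ["metrics", labels.get("alertname", "unknown")],
            "metadata": {"instance": labels.get("instance", "")},
        })
    return events

payload = {
    "status": "firing",
    "alerts": [{
        "labels": {"alertname": "HighCPUUsage", "severity": "warning",
                   "instance": "server-01"},
        "annotations": {"summary": "CPU usage above 90%"},
    }],
}
events = alertmanager_to_events(payload)
```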

Infrastructure Monitoring to Metrics Platform

Purpose: Export Zabbix metrics to Prometheus for unified monitoring
Protocol: Prometheus exporter (pull-based)
Integration: Zabbix Prometheus Exporter
Exported Metrics:
  • Host availability
  • Item values
  • Trigger states
  • Problem counts
Prometheus Scrape Config:
scrape_configs:
  - job_name: 'zabbix'
    static_configs:
      - targets: ['zabbix-exporter:9091']

SIEM to Incident Management

Purpose: Automatically create incident cases from security alerts
Protocol: TheHive REST API (HTTP/HTTPS)
API Endpoint: POST /api/alert or POST /api/case
Trigger: Wazuh integration module or custom webhook
Wazuh Integration Config:
<integration>
  <name>thehive</name>
  <hook_url>https://thehive:9000/api/alert</hook_url>
  <api_key>YOUR_API_KEY</api_key>
  <alert_format>json</alert_format>
  <level>7</level>
</integration>
Alert Payload:
{
  "title": "Wazuh Alert: Malware Detected",
  "description": "Suspicious file detected on endpoint",
  "severity": 3,
  "tags": ["malware", "endpoint", "wazuh"],
  "source": "wazuh",
  "sourceRef": "alert-12345",
  "artifacts": [
    {"dataType": "ip", "data": "192.168.1.100"},
    {"dataType": "hash", "data": "abc123..."}
  ]
}
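A sketch of the mapping behind this payload, from a Wazuh alert (field names follow Wazuh's alerts.json layout: `rule.level`, `rule.description`, `rule.groups`, `full_log`) to the TheHive alert shape above. The level-to-severity mapping is an assumption, not a fixed convention:

```python
def wazuh_to_thehive_alert(wazuh_alert):
    """Build a TheHive alert payload from a Wazuh alert.

    Severity mapping (assumed): rule level >= 12 -> 3 (high),
    >= 7 -> 2 (medium), otherwise 1 (low).
    """
    rule = wazuh_alert.get("rule", {})
    level = rule.get("level", 0)
    severity = 3 if level >= 12 else 2 if level >= 7 else 1
    return {
        "title": f"Wazuh Alert: {rule.get('description', 'Unknown rule')}",
        "description": wazuh_alert.get("full_log", ""),
        "severity": severity,
        "tags": rule.get("groups", []) + ["wazuh"],
        "source": "wazuh",
        "sourceRef": str(wazuh_alert.get("id", "")),
    }

alert = wazuh_to_thehive_alert({
    "id": "alert-12345",
    "rule": {"level": 12, "description": "Malware Detected",
             "groups": ["malware", "endpoint"]},
    "full_log": "Suspicious file detected on endpoint",
})
```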

Incident Management to SOAR

Purpose: Automate incident analysis and response actions
Protocol: Cortex REST API
API Endpoints:
  • POST /api/analyzer/{analyzerId}/run - Run analysis
  • POST /api/responder/{responderId}/run - Execute response
Integration Type: Built-in TheHive-Cortex connector
Analyzers: Investigate observables (IPs, domains, hashes)
  • VirusTotal lookup
  • MaxMind GeoIP
  • MISP threat intelligence
  • Custom analyzers
Responders: Execute response actions
  • Block IP at firewall
  • Isolate endpoint
  • Send notifications
  • Update threat feeds
Configuration:
play.modules.enabled += org.thp.thehive.connector.cortex.CortexModule

cortex {
  servers = [
    {
      name = "Cortex-01"
      url = "http://cortex:9001"
      auth {
        type = "bearer"
        key = "API_KEY"
      }
    }
  ]
}
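For custom tooling outside the built-in connector, the analyzer endpoint above can also be called directly. A sketch of the job submission body (`data`/`dataType`/`tlp` fields follow Cortex's job API; the helper name is illustrative):

```python
def build_analyzer_job(observable, data_type, tlp=2):
    """Body for POST /api/analyzer/{analyzerId}/run: one observable to analyze.

    tlp defaults to 2 (TLP:AMBER). The caller supplies the Authorization
    header ("Bearer <API_KEY>") when submitting the request, then polls
    the returned job for its report.
    """
    return {"data": observable, "dataType": data_type, "tlp": tlp}

job = build_analyzer_job("203.0.113.50", "ip")
```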

Endpoint to Central Manager

Purpose: Collect endpoint security events and system logs
Protocol: Wazuh agent protocol (TCP/UDP 1514 for events, TCP 1515 for agent enrollment)
Communication:
  • Agent registration and authentication
  • Real-time event forwarding
  • Command execution (for active response)
  • File integrity monitoring
Agent Configuration:
<client>
  <server>
    <address>wazuh-manager.local</address>
    <port>1514</port>
    <protocol>tcp</protocol>
  </server>
</client>
Collected Data:
  • System logs
  • Security events
  • File integrity changes
  • Process information
  • Network connections

Future Integration Points

The following integrations are planned for long-term implementation.

Deception Technology to SIEM

Purpose: Feed honeypot interaction data into security analysis
Protocol: Syslog, JSON over HTTP
Data Types:
  • SSH login attempts
  • HTTP request logs
  • Malware samples
  • Attack signatures
Integration Approach:
  • Honeypot logs → Logstash → Elasticsearch → Wazuh
  • Direct Wazuh agent on honeypot VM

Firewall to Security Platform

Purpose: Centralize firewall logs and alerts
Protocol: Syslog, NetFlow
Log Types:
  • Connection logs
  • Blocked traffic
  • IPS alerts (Suricata on OPNsense)
  • VPN connections
OPNsense Configuration: Enable remote syslog to Logstash endpoint

VPN Access to Security Monitoring

Purpose: Monitor and audit VPN access patterns
Protocol: Tailscale API, webhook
Logged Events:
  • User authentication
  • Device connections
  • Access policy changes
  • Network activity
Integration: Tailscale audit logs → Logstash → Wazuh

Data Format Standards

Normalized Event Schema

All events are normalized to a common schema for correlation:
{
  "@timestamp": "ISO8601 timestamp",
  "event_type": "alert|log|metric",
  "severity": "critical|high|medium|low|info",
  "source_system": "component name",
  "source_ip": "IP address",
  "destination_ip": "IP address",
  "user": "username",
  "action": "blocked|allowed|detected",
  "message": "human-readable description",
  "tags": ["array", "of", "tags"],
  "metadata": {"additional": "context"}
}
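A sketch of a normalization helper that coerces a raw event into this schema, filling defaults for fields a source does not supply (the chosen defaults are assumptions, not mandated by the schema):

```python
from datetime import datetime, timezone

REQUIRED_FIELDS = {"@timestamp", "event_type", "severity", "source_system", "message"}

def normalize(event, source_system):
    """Coerce a raw event into the common schema, filling assumed defaults."""
    normalized = {
        "@timestamp": event.get("@timestamp")
            or datetime.now(timezone.utc).isoformat(),
        "event_type": event.get("event_type", "log"),
        "severity": event.get("severity", "info"),
        "source_system": source_system,
        "source_ip": event.get("source_ip", ""),
        "destination_ip": event.get("destination_ip", ""),
        "message": event.get("message", ""),
        "tags": event.get("tags", []),
        "metadata": event.get("metadata", {}),
    }
    missing = REQUIRED_FIELDS - normalized.keys()
    assert not missing, f"schema violation, missing fields: {missing}"
    return normalized

out = normalize({"severity": "high", "message": "test event"}, "suricata")
```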

Authentication & Security

API Authentication

  • API Keys: TheHive, Cortex, Elasticsearch
  • Bearer Tokens: Wazuh API
  • Client Certificates: Wazuh Agents
  • Username/Password: Zabbix, Grafana

Transport Security

  • TLS/SSL: All HTTP-based integrations
  • Encrypted Channels: Wazuh agent communication
  • VPN Tunnels: Inter-site communication
  • Network Segmentation: Isolate SOC components

Integration Testing

Recommended Testing Approach:
  1. Unit test each integration endpoint
  2. Validate data format transformations
  3. Test authentication mechanisms
  4. Verify end-to-end data flow
  5. Load test with realistic event volumes
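Step 4 above can be sketched as a smoke test that injects a uniquely tagged event at the pipeline's ingest point and polls storage until it becomes searchable. The `send` and `query` hooks are injectable, so the same test runs against the real Logstash/Elasticsearch path or an in-memory fake:

```python
import time
import uuid

def check_pipeline(sample_event, send, query, timeout_s=30, poll_s=1):
    """Return True if a tagged copy of sample_event becomes queryable in time.

    send(event): deliver an event to the pipeline's ingest point.
    query(marker): return events currently searchable for that marker tag.
    """
    marker = f"integration-test-{uuid.uuid4()}"
    tagged = {**sample_event, "tags": sample_event.get("tags", []) + [marker]}
    send(tagged)
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if any(marker in e.get("tags", []) for e in query(marker)):
            return True
        time.sleep(poll_s)
    return False

# In-memory fake for demonstration: the "pipeline" is just a list.
store = []
ok = check_pipeline({"event_type": "alert"}, store.append,
                    lambda marker: store, timeout_s=5)
```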

Troubleshooting Common Integration Issues

Issue                  Component                  Solution
Events not appearing   Logstash → Elasticsearch   Check index patterns, verify bulk API responses
Alerts not triggering  Wazuh → TheHive            Validate API key, check alert level threshold
Missing metrics        Prometheus scrape          Verify network connectivity, check exporter status
Agent disconnected     Wazuh Agent → Manager      Check ports 1514/1515, verify certificates
