CrowdStrike Falcon Integration with Energy Logserver
There are two main approaches to integrating CrowdStrike Falcon with Energy Logserver; the SIEM Connector method is the more reliable and more commonly used of the two.
Recommended Method: CrowdStrike SIEM Connector + Filebeat
This is a two-component setup where CrowdStrike's SIEM Connector streams events to local files, and Filebeat forwards them to Energy Logserver.
Architecture:
CrowdStrike Cloud → SIEM Connector (Linux VM) → Log Files → Filebeat → Logstash → Elasticsearch
Required Access from CrowdStrike
1. Enable Event Streaming API
Contact CrowdStrike support to enable streaming APIs for your tenant. This is not enabled by default.
2. Create API Client in CrowdStrike Console
Login to https://falcon.crowdstrike.com/ (or your region-specific URL):
- Navigate to Support → API Clients and Keys
- Click Add new API client
- Configure permissions:
  - Event Streams: Read
  - Alerts: Read (optional, for alert data)
  - Hosts: Read (optional, for endpoint inventory)
- Save the Client ID and Client Secret - you'll need these for configuration
Important: The secret is only shown once - save it securely.
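Before installing the connector, you can sanity-check the new credentials with a direct OAuth2 token request. This is a hedged sketch: the base URL assumes the US-1 cloud, so substitute your region's API host (e.g. api.us-2.crowdstrike.com or api.eu-1.crowdstrike.com) as needed.

```shell
# Hypothetical smoke test for the API client credentials.
# Replace the placeholders with the values saved from the console.
CLIENT_ID="YOUR_CLIENT_ID_HERE"
CLIENT_SECRET="YOUR_CLIENT_SECRET_HERE"
API_BASE="https://api.crowdstrike.com"   # US-1; adjust for your cloud region

# A successful response contains an access_token, confirming the
# client ID, secret, and region are correct.
curl -s -X POST "${API_BASE}/oauth2/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "client_id=${CLIENT_ID}&client_secret=${CLIENT_SECRET}" || true
```

If this returns an error document instead of a token, fix the credentials or region before continuing; the connector will fail with the same authentication error otherwise.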
Setup Steps
1. Install CrowdStrike SIEM Connector
Download the connector package from the CrowdStrike documentation portal (requires login) and install it on a dedicated Linux server (Ubuntu/RHEL):
```bash
# Extract and install the package (Debian/Ubuntu)
sudo dpkg -i falcon-siem-connector_*.deb
# or for RHEL
sudo rpm -i falcon-siem-connector-*.rpm
```
2. Configure SIEM Connector
Edit /opt/crowdstrike/etc/cs.falconhoseclient.cfg:
```ini
[Connectivity]
falcon_cloud_region=autodiscover
client_id=YOUR_CLIENT_ID_HERE
client_secret=YOUR_CLIENT_SECRET_HERE

[Output]
output_format=json
output_path=/var/log/crowdstrike/falconhoseclient/output

[EventTypeCollection]
DetectionSummaryEvent=true
IncidentSummaryEvent=true
AuthActivityAuditEvent=true
UserActivityAuditEvent=true
# Enable other event types as needed
```
Start the connector:
```bash
sudo systemctl enable cs.falconhoseclient
sudo systemctl start cs.falconhoseclient
```
Verify it's running:
```bash
sudo systemctl status cs.falconhoseclient
tail -f /var/log/crowdstrike/falconhoseclient/cs.falconhoseclient.log
```
3. Configure Filebeat
On the same server, configure Filebeat to read CrowdStrike logs.
Enable the CrowdStrike module (`sudo filebeat modules enable crowdstrike`), then edit modules.d/crowdstrike.yml:
```yaml
- module: crowdstrike
  falcon:
    enabled: true
    var.paths:
      - /var/log/crowdstrike/falconhoseclient/output
    # Exclude the connector's own log files
    exclude_files: ['cs\.falconhoseclient.*\.log$']
```
Configure output to Logstash in filebeat.yml:
```yaml
output.logstash:
  hosts: ["your-els-server:5044"]
```
Start Filebeat:
```bash
sudo systemctl enable filebeat
sudo systemctl start filebeat
```
4. Verify Data Flow
Check that Filebeat can reach Logstash and watch its log:
```bash
sudo filebeat test output
tail -f /var/log/filebeat/filebeat
```
Check data in Elasticsearch:
```bash
curl "localhost:9200/filebeat-*/_search?q=event.module:crowdstrike&pretty"
```
(The URL must be quoted; an unquoted `&` would background the command.)
Data Format
CrowdStrike events arrive in JSON format with fields like:
```json
{
  "event.module": "crowdstrike",
  "event.dataset": "crowdstrike.falcon",
  "crowdstrike.event.DetectName": "...",
  "crowdstrike.event.Severity": "...",
  "host.hostname": "...",
  "process.name": "...",
  "@timestamp": "..."
}
```
The Filebeat module automatically maps fields to ECS (Elastic Common Schema) for better compatibility.
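As a quick sanity check on the mapped data, you can aggregate ingested events by severity. This sketch assumes the `crowdstrike.event.SeverityName` field (present in DetectionSummaryEvent payloads) exists in your index; adjust the field name to match your mapping.

```shell
# Build the aggregation request body, then run it against the filebeat indices.
QUERY='{"size": 0, "aggs": {"by_severity": {"terms": {"field": "crowdstrike.event.SeverityName"}}}}'

curl -s "localhost:9200/filebeat-*/_search?pretty" \
  -H 'Content-Type: application/json' \
  -d "${QUERY}" || true
```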
Alternative: CrowdStrike API Integration
You can also pull data directly from the CrowdStrike APIs, for example with the Logstash http_poller input, but this approach requires:
- More complex Logstash configuration
- Manual pagination handling
- Rate limit management
- Less real-time data (polling vs streaming)
The SIEM Connector method is more reliable and recommended by CrowdStrike.
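To illustrate the extra work the polling approach entails, here is a rough shell sketch of paging through detection IDs. The endpoint path, offset/limit scheme, and jq post-processing are assumptions to be verified against CrowdStrike's current API documentation; this is exactly the pagination and error handling the SIEM Connector does for you.

```shell
# Hypothetical polling loop: fetch a token, then page through detection IDs.
# CLIENT_ID / CLIENT_SECRET are the API client credentials created earlier.
API_BASE="https://api.crowdstrike.com"   # adjust for your cloud region
LIMIT=100
OFFSET=0

TOKEN=$(curl -s -X POST "${API_BASE}/oauth2/token" \
  -d "client_id=${CLIENT_ID}&client_secret=${CLIENT_SECRET}" \
  | jq -r '.access_token' 2>/dev/null)

while true; do
  PAGE=$(curl -sf "${API_BASE}/detects/queries/detects/v1?limit=${LIMIT}&offset=${OFFSET}" \
    -H "Authorization: Bearer ${TOKEN}") || break
  COUNT=$(printf '%s' "${PAGE}" | jq '.resources | length' 2>/dev/null) || break
  if [ -z "${COUNT}" ] || [ "${COUNT}" -eq 0 ]; then break; fi
  printf '%s\n' "${PAGE}" | jq -r '.resources[]'   # IDs for a follow-up details call
  OFFSET=$((OFFSET + LIMIT))
done
```

Even this sketch omits rate-limit backoff and token refresh, which a production poller would also need.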
Common Issues
No events appearing:
- Verify the SIEM Connector is running: `systemctl status cs.falconhoseclient`
- Check the connector log for API authentication errors
- Verify output files are being created in `/var/log/crowdstrike/falconhoseclient/output`
- Validate the Filebeat configuration: `filebeat test config`
API authentication failures:
- Double-check Client ID and Secret in config
- Verify API client has "Event Streams: Read" permission
- Confirm correct cloud region (US-1, US-2, EU-1, etc.)
High event volume:
- Adjust `EventTypeCollection` in the connector config to collect only needed event types
- Use Filebeat processors to filter events before sending to ELS
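For example, a Filebeat processor can drop low-severity events before they leave the collector. This is a sketch only: the `crowdstrike.event.Severity` numeric field and the cutoff value of 3 are assumptions to adapt to your data.

```yaml
# filebeat.yml — hypothetical filter: drop events below severity 3
processors:
  - drop_event:
      when:
        range:
          crowdstrike.event.Severity:
            lt: 3
```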
Energy Logserver Integration
If ELS has an Integrations plugin, check for a CrowdStrike integration package - it may provide:
- Pre-built Logstash pipelines
- Ready-made dashboards for CrowdStrike detection data
- Automated field mapping
Otherwise, the standard Filebeat → Logstash → Elasticsearch flow works with default ELS configuration.