The CrowdStrike FDR data stream contains a logfile input whose implied purpose is reading the logs written by https://github.com/CrowdStrike/FDR/blob/main/standalone/falcon_data_replicator.py. That script dumps .gz files from the S3 bucket onto the local filesystem. (The script can also replicate the files to your own S3 bucket; that, combined with SQS notifications on your own bucket, can be used with the s3 input part of the integration. A rough sketch of that flow follows.)
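For context, here is a minimal sketch of what that replication flow looks like: consume the SQS notifications, then download the referenced objects onto disk. This is not the actual `falcon_data_replicator.py`; it assumes `boto3`, and the queue URL, message layout (`bucket`/`files`/`path` keys), and output directory are placeholders.

```python
import json
import os

import boto3  # assumed dependency, as in the upstream script

sqs = boto3.client("sqs", region_name="us-west-1")
s3 = boto3.client("s3", region_name="us-west-1")

QUEUE_URL = "https://sqs.us-west-1.amazonaws.com/123456789012/fdr-queue"  # placeholder
TARGET_DIR = "/var/log/crowdstrike/fdr"  # placeholder output directory

while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, WaitTimeSeconds=10, MaxNumberOfMessages=1)
    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])
        for f in body.get("files", []):  # assumed: notification lists new objects
            local_path = os.path.join(TARGET_DIR, f["path"].replace("/", "_"))
            # Objects land on disk still gzip-compressed (.gz) -- the crux of this issue.
            s3.download_file(body["bucket"], f["path"], local_path)
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```

Note the files are written to disk still compressed, which is exactly what the logfile input would then be pointed at.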
The problem is that the logfile input does not support reading gzip files, so this cannot work until elastic/beats#637 is implemented. (The aws-s3 input can decompress gzip objects, so that path is not affected.)
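To make the incompatibility concrete, here is a small self-contained demonstration: a line-oriented text reader (which is effectively what the logfile input is) sees the gzip container rather than log lines, while gzip-aware handling (conceptually what the aws-s3 input does when it decompresses objects) recovers the events. The file name and event payload are throwaway examples.

```python
import gzip

# Write a tiny FDR-style file: newline-delimited JSON, gzip-compressed.
with gzip.open("events.gz", "wt") as f:
    f.write('{"event_simpleName": "ProcessRollup2"}\n')

# A plain reader sees the gzip container, not the log lines:
with open("events.gz", "rb") as f:
    print(f.read(2) == b"\x1f\x8b")  # True -> gzip magic bytes, not JSON text

# Decompressing first yields the actual events:
with gzip.open("events.gz", "rt") as f:
    for line in f:
        print(line, end="")  # {"event_simpleName": "ProcessRollup2"}
```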
Either we need to enhance the logfile input to support gzip or we need to remove the input from this integration since it cannot work.
@andrewkroh I lean towards removing the logfile input for FDR in the short term; if/when gzip support is added to the logfile input, we can look at updating the FDR integration accordingly.
Hi! We just realized that we haven't looked into this issue in a while. We're sorry! We're labeling this issue as Stale to make it hit our filters and make sure we get back to it as soon as possible. In the meantime, it'd be extremely helpful if you could take a look at it as well and confirm its relevance. A simple comment with a nice emoji will be enough :+1:. Thank you for your contribution!