Merge branch 'master' into filebeat_http_endpoint_headersecret
P1llus authored Aug 5, 2020
2 parents 408600f + 945da26 commit a9c08e1
Showing 128 changed files with 477 additions and 83 deletions.
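The recurring change across the module pipelines below is the same three-line addition: a set processor at the top of each ingest pipeline that stamps documents with the ingest node's clock via {{_ingest.timestamp}}. A minimal sketch of its effect, exercising just that processor through the Simulate Pipeline API with the 7.x Python client (the localhost endpoint is an assumption for illustration):

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Simulate a pipeline containing only the processor this commit adds everywhere.
resp = es.ingest.simulate(body={
    "pipeline": {
        "processors": [
            {"set": {"field": "event.ingested", "value": "{{_ingest.timestamp}}"}}
        ]
    },
    "docs": [{"_source": {"message": "sample log line"}}],
})

# The simulated document gains a nested event.ingested field holding the
# ingest node's current time.
print(resp["docs"][0]["doc"]["_source"]["event"]["ingested"])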
1 change: 1 addition & 0 deletions CHANGELOG.next.asciidoc
@@ -506,6 +506,7 @@ https://github.com/elastic/beats/compare/v7.0.0-alpha2...master[Check the HEAD d
- Add support for additional fields and FirewallMatchEvent type events in CrowdStrike module {pull}20138[20138]
- Add event.ingested for Suricata module {pull}20220[20220]
- Add support for custom header and headersecret for filebeat http_endpoint input {pull}20435[20435]
- Add event.ingested to all Filebeat modules. {pull}20386[20386]

*Heartbeat*

3 changes: 3 additions & 0 deletions filebeat/module/apache/access/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: "Pipeline for parsing Apache HTTP Server access logs. Requires the geoip and user_agent plugins."

processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/apache/error/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing apache error logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/auditd/log/ingest/pipeline.yml
@@ -1,6 +1,9 @@
---
description: Pipeline for parsing Linux auditd logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
pattern_definitions:
3 changes: 3 additions & 0 deletions filebeat/module/elasticsearch/audit/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing elasticsearch audit logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/elasticsearch/deprecation/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing elasticsearch deprecation logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/elasticsearch/gc/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing Elasticsearch JVM garbage collection logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/elasticsearch/server/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing elasticsearch server logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/elasticsearch/slowlog/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing elasticsearch slow logs.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/haproxy/log/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: Pipeline for parsing HAProxy http, tcp and default logs. Requires the
geoip plugin.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/icinga/debug/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing icinga debug logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/icinga/main/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing icinga main logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/icinga/startup/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing icinga startup logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/iis/access/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: Pipeline for parsing IIS access logs. Requires the geoip and user_agent
plugins.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/iis/error/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing IIS error logs. Requires the geoip plugin.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/kafka/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing Kafka log messages
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
trace_match: true
3 changes: 3 additions & 0 deletions filebeat/module/kibana/log/ingest/pipeline.yml
@@ -4,6 +4,9 @@ on_failure:
field: error.message
value: '{{ _ingest.on_failure_message }}'
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/logstash/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing logstash node logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/logstash/slowlog/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing logstash slow logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- rename:
field: '@timestamp'
target_field: event.created
3 changes: 3 additions & 0 deletions filebeat/module/mongodb/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing MongoDB logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/mysql/error/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing MySQL error logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
5 changes: 5 additions & 0 deletions filebeat/module/mysql/slowlog/ingest/pipeline.json
@@ -1,6 +1,11 @@
{
"description": "Pipeline for parsing MySQL slow logs.",
"processors": [{
"set": {
"field": "event.ingested",
"value": "{{_ingest.timestamp}}"
}
}, {
"grok": {
"field": "message",
"patterns":[
3 changes: 3 additions & 0 deletions filebeat/module/nats/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing nats log logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
5 changes: 4 additions & 1 deletion filebeat/module/nginx/access/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: Pipeline for parsing Nginx access logs. Requires the geoip and user_agent
plugins.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
@@ -145,7 +148,7 @@ processors:
- set:
field: event.outcome
value: failure
    if: "ctx?.http?.response?.status_code != null && ctx.http.response.status_code >= 400"
- append:
field: related.ip
value: "{{source.ip}}"
3 changes: 3 additions & 0 deletions filebeat/module/nginx/error/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing the Nginx error logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/nginx/ingress_controller/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: Pipeline for parsing Nginx ingress controller access logs. Requires the
geoip and user_agent plugins.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
5 changes: 5 additions & 0 deletions filebeat/module/osquery/result/ingest/pipeline.json
@@ -2,6 +2,11 @@
"description": "Pipeline for parsing osquery result logs",
"processors": [
{
"set":{
"field": "event.ingested",
"value": "{{_ingest.timestamp}}"
}
}, {
"rename": {
"field": "@timestamp",
"target_field": "event.created"
3 changes: 3 additions & 0 deletions filebeat/module/postgresql/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing PostgreSQL logs.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
ignore_missing: true
3 changes: 3 additions & 0 deletions filebeat/module/redis/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing redis logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/santa/log/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing Google Santa logs.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/system/auth/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing system authorisation/secure logs
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
ignore_missing: true
3 changes: 3 additions & 0 deletions filebeat/module/system/syslog/ingest/pipeline.yml
@@ -1,5 +1,8 @@
description: Pipeline for parsing Syslog messages.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- grok:
field: message
patterns:
3 changes: 3 additions & 0 deletions filebeat/module/traefik/access/ingest/pipeline.yml
@@ -1,6 +1,9 @@
description: Pipeline for parsing Traefik access logs. Requires the geoip and user_agent
plugins.
processors:
- set:
field: event.ingested
value: '{{_ingest.timestamp}}'
- dissect:
field: message
pattern: '%{source.address} %{traefik.access.user_identifier} %{user.name} [%{traefik.access.time}]
4 changes: 4 additions & 0 deletions filebeat/tests/system/test_modules.py
@@ -161,6 +161,10 @@ def run_on_file(self, module, fileset, test_file, cfgfile):
assert obj["event"]["module"] == module, "expected event.module={} but got {}".format(
module, obj["event"]["module"])

# All modules must include a set processor that adds the time that
# the event was ingested to Elasticsearch
assert "ingested" in obj["event"], "missing event.ingested timestamp"

assert "error" not in obj, "not error expected but got: {}".format(
obj)

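The new assertion in run_on_file makes the contract explicit: every document a module emits must carry event.ingested. A quick, hedged way to eyeball the field on live data with the Python client (the index pattern and endpoint are assumptions for illustration):

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Fetch the most recent Filebeat document and compare when the event happened
# (@timestamp) with when Elasticsearch received it (event.ingested).
hit = es.search(index="filebeat-*", size=1, sort="@timestamp:desc")["hits"]["hits"][0]
src = hit["_source"]
print(src["@timestamp"], "->", src["event"]["ingested"])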
25 changes: 21 additions & 4 deletions libbeat/tests/system/beat/compose.py
@@ -1,8 +1,11 @@
+import io
+import logging
import os
import sys
import tarfile
import time
-import io

from contextlib import contextmanager


INTEGRATION_TESTS = os.environ.get('INTEGRATION_TESTS', False)
@@ -54,9 +57,12 @@ def is_healthy(container):
return container.inspect()['State']['Health']['Status'] == 'healthy'

project = cls.compose_project()
-        project.pull(
-            ignore_pull_failures=True,
-            service_names=cls.COMPOSE_SERVICES)
+        with disabled_logger('compose.service'):
+            project.pull(
+                ignore_pull_failures=True,
+                service_names=cls.COMPOSE_SERVICES)

project.up(
strategy=ConvergenceStrategy.always,
service_names=cls.COMPOSE_SERVICES,
@@ -231,3 +237,14 @@ def service_log_contains(cls, service, msg):
if line.find(msg.encode("utf-8")) >= 0:
counter += 1
return counter > 0


@contextmanager
def disabled_logger(name):
logger = logging.getLogger(name)
old_level = logger.getEffectiveLevel()
logger.setLevel(logging.CRITICAL)
try:
yield logger
finally:
logger.setLevel(old_level)
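disabled_logger is a general-purpose helper: it raises a named logger to CRITICAL for the duration of a with block and restores the previous level afterwards, which is how the pull call above mutes docker-compose's chatty compose.service logger. A small usage sketch (the logger name matches the call above; the surrounding setup is illustrative):

import logging

logging.basicConfig(level=logging.INFO)

with disabled_logger('compose.service'):
    # Suppressed: the logger sits at CRITICAL inside the block.
    logging.getLogger('compose.service').warning("pull noise")

# Restored: emitted at the previous effective level.
logging.getLogger('compose.service').warning("visible again")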
3 changes: 1 addition & 2 deletions libbeat/tests/system/requirements.txt
@@ -8,7 +8,7 @@ docker-compose==1.25.3
docker-pycreds==0.4.0
dockerpty==0.4.1
docopt==0.6.2
-elasticsearch==7.1.0
+elasticsearch==7.8.1
enum34==1.1.6
idna==2.6
ipaddress==1.0.19
@@ -19,7 +19,6 @@ nose==1.3.7
nose-timer==0.7.1
pycodestyle==2.4.0
PyYAML==4.2b1
-Pillow>=7.1.0
redis==2.10.6
requests==2.20.0
six==1.11.0
2 changes: 1 addition & 1 deletion metricbeat/module/elasticsearch/test_elasticsearch.py
@@ -24,7 +24,7 @@ class Test(metricbeat.BaseTest):
def setUp(self):
super(Test, self).setUp()
self.es = Elasticsearch(self.get_hosts())
-        self.ml_es = client.xpack.ml.MlClient(self.es)
+        self.ml_es = client.ml.MlClient(self.es)

es_version = self.get_version()
if es_version["major"] < 7: