Merge Input and Output plugins into Integration plugin
yaauie committed Oct 14, 2019
2 parents 32b039b + b99d69d commit 17acfcb
Showing 23 changed files with 1,968 additions and 800 deletions.
5 changes: 4 additions & 1 deletion .gitignore
@@ -7,4 +7,7 @@ lib/log4j/
lib/net/
lib/org/
vendor/
build/
build/
.idea/
vendor
*.jar
3 changes: 0 additions & 3 deletions .travis.yml
@@ -1,4 +1,3 @@
---
sudo: false
language: ruby
cache: bundler
@@ -12,8 +11,6 @@ matrix:
env: LOGSTASH_BRANCH=6.7
- rvm: jruby-9.1.13.0
env: LOGSTASH_BRANCH=6.6
- rvm: jruby-1.7.27
env: LOGSTASH_BRANCH=5.6
fast_finish: true
install: true
before_script: ci/build.sh
176 changes: 6 additions & 170 deletions CHANGELOG.md
@@ -1,170 +1,6 @@
## 9.1.0
- Updated Kafka client version to 2.3.0

## 9.0.1
- Added support for `sasl_jaas_config` setting to allow JAAS config per plugin, rather than per JVM [#313](https://github.com/logstash-plugins/logstash-input-kafka/pull/313)

## 9.0.0
- Removed obsolete `ssl` option

## 8.3.1
- Added support for kafka property `ssl.endpoint.identification.algorithm` [#302](https://github.com/logstash-plugins/logstash-input-kafka/pull/302)

## 8.3.0
- Changed Kafka client version to 2.1.0

## 8.2.1
- Changed Kafka client version to 2.0.1 [#295](https://github.com/logstash-plugins/logstash-input-kafka/pull/295)

## 8.2.0
- Upgrade Kafka client to version 2.0.0

## 8.1.2
- Docs: Correct list formatting for `decorate_events`
- Docs: Add kafka default to `partition_assignment_strategy`

## 8.1.1
- Fix race-condition where shutting down a Kafka Input before it has finished starting could cause Logstash to crash

## 8.1.0
- Internal: Update build to gradle
- Upgrade Kafka client to version 1.1.0

## 8.0.6
- Fix broken 8.0.5 release

## 8.0.5
- Docs: Set the default_codec doc attribute.

## 8.0.4
- Upgrade Kafka client to version 1.0.0

## 8.0.3
- Update gemspec summary

## 8.0.2
- Fix some documentation issues

## 8.0.1
- Fixed an issue that prevented setting a custom `metadata_max_age_ms` value

## 8.0.0
- Breaking: mark deprecated `ssl` option as obsolete

## 7.0.0
- Breaking: Nest the decorated fields under `@metadata` field to avoid mapping conflicts with beats.
Fixes #198, #180

## 6.3.4
- Fix an issue that led to random failures in decoding messages when using more than one input thread

## 6.3.3
- Upgrade Kafka client to version 0.11.0.0

## 6.3.1
- fix: Added record timestamp in event decoration

## 6.3.0
- Upgrade Kafka client to version 0.10.2.1

## 6.2.7
- Fix NPE when SASL_SSL+PLAIN (no Kerberos) is specified.

## 6.2.6
- fix: Client ID is no longer reused across multiple Kafka consumer instances

## 6.2.5
- Fix a bug where consumer was not correctly setup when `SASL_SSL` option was specified.

## 6.2.4
- Make error reporting more clear when connection fails

## 6.2.3
- Docs: Update Kafka compatibility matrix

## 6.2.2
- update kafka-clients dependency to 0.10.1.1

## 6.2.1
- Docs: Clarify compatibility matrix and remove it from the changelog to avoid duplication.

## 6.2.0
- Expose config `max_poll_interval_ms` to allow consumer to send heartbeats from a background thread
- Expose config `fetch_max_bytes` to control client's fetch response size limit

## 6.1.0
- Add Kerberos authentication support.

## 6.0.1
- default `poll_timeout_ms` to 100ms

## 6.0.0
- Breaking: Support for Kafka 0.10.1.0. Only supports brokers 0.10.1.x or later.

## 5.0.5
- place setup_log4j for logging registration behind version check

## 5.0.4
- Update to Kafka version 0.10.0.1 for bug fixes

## 5.0.3
- Internal: gem cleanup

## 5.0.2
- Release a new version of the gem that includes jars

## 5.0.1
- Relax constraint on logstash-core-plugin-api to >= 1.60 <= 2.99

## 5.0.0
- Support for Kafka 0.10 which is not backward compatible with 0.9 broker.

## 4.0.0
- Republish all the gems under jruby.
- Update the plugin to version 2.0 of the plugin API; this change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
- Support for Kafka 0.9 for LS 5.x

## 3.0.0.beta7
- Fix Log4j warnings by setting up the logger

## 3.0.0.beta5 and 3.0.0.beta6
- Internal: Use jar dependency
- Fixed issue with snappy compression

## 3.0.0.beta3 and 3.0.0.beta4
- Internal: Update gemspec dependency

## 3.0.0.beta2
- internal: Use jar dependencies library instead of manually downloading jars
- Fixes "java.lang.ClassNotFoundException: org.xerial.snappy.SnappyOutputStream" issue (#50)

## 3.0.0.beta2
- Added SSL/TLS connection support to Kafka
- Breaking: Changed default codec to plain instead of json. The json codec is really slow when used
  with inputs, because inputs are single threaded by default; this makes for a bad
  first user experience. The plain codec is a much better default.

## 3.0.0.beta1
- Refactor to use new Java based consumer, bypassing jruby-kafka
- Breaking: Change configuration to match Kafka's configuration. This version is not backward compatible

## 2.0.7
- Update to jruby-kafka 1.6 which includes Kafka 0.8.2.2 enabling LZ4 decompression.

## 2.0.6
- Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of logstash

## 2.0.5
- New dependency requirements for logstash-core for the 5.0 release

## 2.0.4
- Fix safe shutdown while plugin waits on Kafka for new events
- Expose auto_commit_interval_ms to control offset commit frequency

## 2.0.3
- Fix infinite loop when no new messages are found in Kafka

## 2.0.0
- Plugins were updated to follow the new shutdown semantics; this mainly allows Logstash to instruct input plugins to terminate gracefully,
  instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
- Dependency on logstash-core update to 2.0
## 10.0.0
- Initial release of the Kafka Integration Plugin, which combines
previously-separate Kafka plugins and shared dependencies into a single
codebase; independent changelogs for previous versions can be found:
  - [Kafka Input Plugin @9.1.0](https://github.com/logstash-plugins/logstash-input-kafka/blob/v9.1.0/CHANGELOG.md)
  - [Kafka Output Plugin @8.1.0](https://github.com/logstash-plugins/logstash-output-kafka/blob/v8.1.0/CHANGELOG.md)
3 changes: 3 additions & 0 deletions CONTRIBUTORS
@@ -8,6 +8,9 @@ Contributors:
* Richard Pijnenburg (electrical)
* Suyog Rao (suyograo)
* Tal Levy (talevy)
* João Duarte (jsvd)
* Kurt Hurtado (kurtado)
* Ry Biesemeyer (yaauie)

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
70 changes: 62 additions & 8 deletions DEVELOPER.md
@@ -1,14 +1,24 @@
logstash-input-kafka
====================
# logstash-integration-kafka

Apache Kafka integration for Logstash, including Input and Output plugins.
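
For orientation, here is a minimal sketch of a pipeline wiring the two plugins together. The broker address and topic names are placeholders, and the option names assume the current input and output plugins rather than anything specified in this commit:

    input {
      kafka {
        bootstrap_servers => "localhost:9092"  # assumption: a local broker
        topics => ["source-topic"]             # placeholder topic name
      }
    }
    output {
      kafka {
        topic_id => "destination-topic"        # placeholder topic name
      }
    }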

# Dependencies

* Apache Kafka version 0.8.1.1
* jruby-kafka library

# Plugins


## logstash-input-kafka

Apache Kafka input for Logstash. This input will consume messages from a Kafka topic using the high level consumer API exposed by Kafka.

For more information about Kafka, refer to this [documentation](http://kafka.apache.org/documentation.html)

Information about high level consumer API can be found [here](http://kafka.apache.org/documentation.html#highlevelconsumerapi)

Logstash Configuration
====================
### Logstash Configuration

See http://kafka.apache.org/documentation.html#consumerconfigs for details about the Kafka consumer options.
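
The input's own configuration listing sits in the collapsed portion of this diff. Purely as an illustration — the option names below follow the modern Kafka input plugin and are assumptions, not a reconstruction of the hidden lines — a minimal consumer setup looks something like:

    input {
      kafka {
        bootstrap_servers => "localhost:9092"  # assumption: a local broker
        topics => ["logstash"]                 # placeholder topic name
        group_id => "logstash"                 # consumer group to join
      }
    }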

@@ -36,8 +46,52 @@ See http://kafka.apache.org/documentation.html#consumerconfigs for details about

The default codec is json

Dependencies
====================
## logstash-output-kafka

* Apache Kafka version 0.8.1.1
* jruby-kafka library
Apache Kafka output for Logstash. This output will produce messages to a Kafka topic using the producer API exposed by Kafka.

For more information about Kafka, refer to this [documentation](http://kafka.apache.org/documentation.html)

Information about producer API can be found [here](http://kafka.apache.org/documentation.html#apidesign)

### Logstash Configuration

See http://kafka.apache.org/documentation.html#producerconfigs for details about the Kafka producer options.

    output {
      kafka {
        topic_id => ... # string (required), The topic to produce the messages to
        broker_list => ... # string (optional), default: "localhost:9092", This is for bootstrapping and the producer will only use it for getting metadata
        compression_codec => ... # string (optional), one of ["none", "gzip", "snappy"], default: "none"
        compressed_topics => ... # string (optional), default: "", This parameter allows you to set whether compression should be turned on for particular topics
        request_required_acks => ... # number (optional), one of [-1, 0, 1], default: 0, This value controls when a produce request is considered completed
        serializer_class => ... # string (optional), default: "kafka.serializer.StringEncoder", The serializer class for messages. The default encoder takes a byte[] and returns the same byte[]
        partitioner_class => ... # string (optional), default: "kafka.producer.DefaultPartitioner"
        request_timeout_ms => ... # number (optional), default: 10000
        producer_type => ... # string (optional), one of ["sync", "async"], default: "sync"
        key_serializer_class => ... # string (optional), default: "kafka.serializer.StringEncoder"
        message_send_max_retries => ... # number (optional), default: 3
        retry_backoff_ms => ... # number (optional), default: 100
        topic_metadata_refresh_interval_ms => ... # number (optional), default: 600 * 1000
        queue_buffering_max_ms => ... # number (optional), default: 5000
        queue_buffering_max_messages => ... # number (optional), default: 10000
        queue_enqueue_timeout_ms => ... # number (optional), default: -1
        batch_num_messages => ... # number (optional), default: 200
        send_buffer_bytes => ... # number (optional), default: 100 * 1024
        client_id => ... # string (optional), default: ""
        partition_key_format => ... # string (optional), default: nil, Provides a way to specify a partition key as a string
      }
    }

The default codec for outputs is json. If you select the plain codec, Logstash will encode your events with not only the message
but also a timestamp and hostname. If you do not want anything but your message passing through, configure
the output something like this:

    output {
      kafka {
        codec => plain {
          format => "%{message}"
        }
        topic_id => "my_topic_id"
      }
    }
2 changes: 1 addition & 1 deletion NOTICE.TXT
@@ -1,5 +1,5 @@
Elasticsearch
Copyright 2012-2015 Elasticsearch
Copyright 2012-2019 Elastic NV

This product includes software developed by The Apache Software
Foundation (http://www.apache.org/).
29 changes: 18 additions & 11 deletions README.md
@@ -1,6 +1,6 @@
# Logstash Plugin

[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-input-kafka.svg)](https://travis-ci.org/logstash-plugins/logstash-input-kafka)
[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-integration-kafka.svg)](https://travis-ci.org/logstash-plugins/logstash-integration-kafka)

This is a plugin for [Logstash](https://github.com/elastic/logstash).

@@ -38,6 +38,7 @@ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/log
- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies

```sh
bundle install
rake install_jars
@@ -52,19 +53,30 @@ bundle install
rake install_jars
```

- Run tests
- Run unit tests

```sh
bundle exec rspec
```

- Run integration tests

You'll need to have Docker available within your test environment before
running the integration tests. The tests depend on a specific Kafka image
on Docker Hub called `spotify/kafka`; you will need internet connectivity
to pull in this image if it does not already exist locally.

```sh
bundle exec rspec --tag integration
```
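
If the image is not present locally, it can be pulled ahead of time (assuming a standard Docker CLI is installed) so the download does not happen mid-test-run:

```sh
# one-time fetch of the Kafka image the integration tests rely on
docker pull spotify/kafka
```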

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
gem "logstash-output-kafka", :path => "/your/local/logstash-output-kafka"
```
- Install plugin
```sh
@@ -77,7 +89,7 @@ bin/plugin install --no-verify
```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
bin/logstash -e 'output { kafka { topic_id => "kafka_topic" }}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

@@ -87,16 +99,11 @@ You can use the same **2.1** method to run your plugin in an installed Logstash

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
gem build logstash-output-kafka.gemspec
```
- Install the plugin from the Logstash home
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify

bin/plugin install /your/local/plugin/logstash-output-kafka.gem
```
- Start Logstash and proceed to test the plugin

2 changes: 0 additions & 2 deletions Rakefile
@@ -1,4 +1,3 @@

# encoding: utf-8
require "logstash/devutils/rake"
require "jars/installer"
@@ -17,4 +16,3 @@ task :clean do
FileUtils.rm_rf(p)
end
end
