
Releases: neo4j-contrib/neo4j-streams

Minor release 3.5.6

04 Mar 20:48
82678dd

This minor release contains the following fixes:

  • fixes #284: Core server will not rejoin cluster if Kafka plugin is enabled.

3.5.5

23 Jan 16:00

Hello Neo4j Community!
We're very happy to present the new 3.5.5 release of the Neo4j Kafka integrations, which is the result of great community feedback!

Thanks a lot to everyone who contributed to this: @moxious, @jexp, @conker84 and @F-Guardian

The following is a list of improvements:

  • with fixes #252: Source with non-existant topic hangs indefinitely (#257) we improved the behaviour when the BROKER config auto.create.topics.enable is false and you're sending events to a topic that has not been created in the Kafka cluster. If you publish events to a topic that does not exist, Producer#send can block until the timeout defined by max.block.ms, so we added a check that compares every topic defined in your neo4j.conf file against the topics that exist in the Kafka cluster and logs a warning if a topic is missing (see the configuration sketch after this list).

  • with fixes #247: Schema Ingestion Strategy: double node creation (#248) we fixed a bug that, under particular circumstances, could lead to double node creation when ingesting data with the Schema ingestion strategy.

  • with Fixes an issue with wrong ids value in cdc when updating relationships we fixed a bug that could lead to streaming wrong ids for the target node while updating relationships.
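
As a rough illustration of the topic-existence scenario above, here is a sketch of the broker setting and the producer timeout involved, together with a source topic mapping. The kafka.-prefixed forwarding of producer properties and the streams.source.topic.nodes property name are assumptions for illustration; check the documentation for the exact keys.

# Broker side (server.properties): topics are not created automatically
auto.create.topics.enable=false

# neo4j.conf: the topic the source publishes to must already exist in the cluster,
# otherwise Producer#send may block for up to max.block.ms (shown here forwarded via
# the kafka. prefix, an assumption for illustration)
streams.source.topic.nodes.my-topic=Person{*}
kafka.max.block.ms=5000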

Other fixes:

  • fix typo in documentation section 4.2 (#253)
  • fixes #249: Correct typo in DLQ property (#262)
  • fixes #259: Fix Kafka plugin version (#260)
  • fixes #264: Neo4j Sink Connector: CDC ingestion with schema strategy

To get more info about it, please look at the documentation.

3.5.4

20 Sep 09:56

Hello Neo4j Community!
We're very happy to present the new 3.5.4 release of the Neo4j Kafka integrations, which is the result of great community feedback!

  • We improved the management of the DLQ (Dead Letter Queue). Please look at the documentation.

  • We introduced support for transient errors in the Kafka Connect plugin.

  • With #211 we introduced AVRO support for the Neo4j Sink plugin; the streams.consume procedure supports it as well! Please look at the Supported Kafka deserializers documentation.

  • With #188 we introduced support for a new interchange file format, called the CUD File Format: a JSON format that represents graph entities (nodes/relationships) and how to manage them in terms of Create/Update/Delete operations. For example:

{
  "op": "merge",
  "properties": {
    "foo": "value",
    "key": 1
  },
  "ids": {"key": 1, "otherKey":  "foo"},
  "labels": ["Foo","Bar"],
  "type": "node",
  "detach": true
}

will be transformed into the following Cypher query:

UNWIND [..., {
  "op": "merge",
  "properties": {
    "foo": "value",
    "key": 1
  },
  "ids": {"key": 1, "otherKey":  "foo"},
  "labels": ["Foo","Bar"],
  "type": "node",
  "detach": true
}, ...] AS event
MERGE (n:Foo:Bar {key: event.ids.key, otherKey: event.ids.otherKey})
SET n += event.properties

For more details, please look at the documentation.

  • With #181 we validate a minimum set of configuration params in order to make the streams plugin work.
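
A minimal neo4j.conf sketch of the kind of configuration this validation targets, assuming the Kafka connection parameters below are among the validated ones (the broker/zookeeper addresses are placeholders):

# Kafka connection settings the plugin needs in order to start
kafka.zookeeper.connect=localhost:2181
kafka.bootstrap.servers=localhost:9092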

To get more info about it, please look at the documentation.

3.5.3

02 Jul 08:21

We're very happy to present the new 3.5.3 release of the Neo4j Kafka integrations.

Our focus since the last release was on making our integration easier to use and fixing some issues.

We introduced several new features in the Sink:

With fixes #177: Create a Dead Letter Queue you can manage the Dead Letter Queue in case of bad data. This feature is initially available only in the Streams plugin; we plan to also release it for the Kafka Connect Sink plugin in the near future.
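
A hedged sketch of what the DLQ settings could look like in neo4j.conf; the property names below mirror Kafka Connect's errors.* settings under a streams.sink prefix and the topic name is a placeholder, so verify the exact keys in the documentation:

# Tolerate bad records instead of stopping the sink (assumption: Kafka Connect-style semantics)
streams.sink.errors.tolerance=all
# Route bad records to a dead letter topic (placeholder topic name)
streams.sink.errors.deadletterqueue.topic.name=neo4j-dlq
# Also log the failures
streams.sink.errors.log.enable=true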

With fixes #198: Tombstone record Management both the Streams Sink and the Kafka Connect Sink plugin have been improved; in particular, the pattern strategy now supports Tombstone Records.
To leverage this, the key of your event should identify the record you want to delete and the value should be null.
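
As a rough illustration of the tombstone handling, reusing the pattern strategy mapping shown later in these notes (the delete semantics described in the comments are an interpretation; check the documentation for the exact behaviour):

# Pattern strategy mapping for the 'users' topic
streams.sink.topic.pattern.node.users=User{!userId}
# A tombstone record on this topic, i.e. key = {"userId": 1} and value = null,
# is then treated as a request to delete the matching User node.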

To get more info about it, please look at the documentation.

3.4.4

02 Jul 08:24

We're very happy to present the new 3.4.4 release of the Neo4j Kafka integrations.

Our focus since the last release was on making our integration easier to use and fixing some issues.

We introduced several new features in the Sink:

With fixes #177: Create a Dead Letter Queue you can manage the Dead Letter Queue in case of bad data. This feature is initially available only in the Streams plugin; we plan to also release it for the Kafka Connect Sink plugin in the near future.

With fixes #198: Tombstone record Management both the Streams Sink and the Kafka Connect Sink plugin have been improved; in particular, the pattern strategy now supports Tombstone Records.
To leverage this, the key of your event should identify the record you want to delete and the value should be null.

To get more info about it, please look at the documentation.

3.5.2

17 Jun 13:30
New release

3.4.3

17 Jun 15:08
New release

Release 3.5.1 of the Neo4j Kafka integration

06 Jun 12:55

We're very happy to present the new 3.5.1 release of the Neo4j Kafka integrations.

Our focus since the last release was on making our integration easier to use and fixing some issues.

We introduced several new features in the Sink:

The fixes #99: Provide a roundtrip-sink-config allows you to ingest data that comes from another Neo4j instance as CDC events.
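
A hedged neo4j.conf sketch of such a roundtrip setup, mapping a topic fed by another Neo4j instance's CDC events to one of the CDC ingestion strategies; the property names (streams.sink.topic.cdc.sourceId / streams.sink.topic.cdc.schema) and the topic names are assumptions to verify against the documentation:

# Ingest CDC events using the SourceId strategy
streams.sink.topic.cdc.sourceId=my-cdc-topic
# Or ingest CDC events using the Schema strategy
streams.sink.topic.cdc.schema=my-other-cdc-topic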

In a similar way, fixes #154: provide a common pattern for ingestion allows you to define simple expressions in order to extract data from any nested event structure and transform that data into nodes and/or relationships.

For example, to create a user and their purchases from the users and orders topics:

streams.sink.topic.pattern.node.users=User{!userId}
streams.sink.topic.pattern.relationship.orders=User{!userId} BOUGHT{purchase.price, purchase.currency} Product{!productId}

The fixes #102: Manual commit behavior for handling errors and retrievals allows you to use a manually committing consumer; moreover, it improves the streams.consume procedure, allowing data to be read starting from a specific partition/offset.
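
A minimal sketch of what switching off auto-commit could look like in neo4j.conf, assuming (as for other Kafka settings) that kafka.-prefixed properties are forwarded to the underlying consumer; enable.auto.commit is the standard Kafka consumer setting:

# Use a manually committing consumer instead of Kafka's automatic offset commits
kafka.enable.auto.commit=false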

Breaking changes

There is a small change to the Sink management: with fixes 160: change the streams.sink.enabled to false, the default value of the property streams.sink.enabled is now false, so you need to explicitly set it to true; otherwise, if you only specify the topic mapping, you'll see a WARN message in neo4j.log.
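
For example, a sink configuration in neo4j.conf now needs the explicit enable flag alongside the topic mapping (mapping reused from the example above):

# Sink is disabled by default from this release on
streams.sink.enabled=true
streams.sink.topic.pattern.node.users=User{!userId}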

We also fixed several other issues.

Release 3.4.2 of the Neo4j Kafka integration

06 Jun 12:57

We're very happy to present the new 3.4.2 release of the Neo4j Kafka integrations.

Our focus since the last release was on making our integration easier to use and fixing some issues.

We introduced several new features in the Sink:

The fixes #99: Provide a roundtrip-sink-config allows you to ingest data that comes from another Neo4j instance as CDC events.

In a similar way, fixes #154: provide a common pattern for ingestion allows you to define simple expressions in order to extract data from any nested event structure and transform that data into nodes and/or relationships.

For example, to create a user and their purchases from the users and orders topics:

streams.sink.topic.pattern.node.users=User{!userId}
streams.sink.topic.pattern.relationship.orders=User{!userId} BOUGHT{purchase.price, purchase.currency} Product{!productId}

The fixes #102: Manual commit behavior for handling errors and retrievals allows you to use a manually committing consumer; moreover, it improves the streams.consume procedure, allowing data to be read starting from a specific partition/offset.

Breaking changes

There is a small change to the Sink management: with fixes 160: change the streams.sink.enabled to false, the default value of the property streams.sink.enabled is now false, so you need to explicitly set it to true; otherwise, if you only specify the topic mapping, you'll see a WARN message in neo4j.log.

We also fixed several other issues.

Neo4j-Streams Release 3.5.0 and Kafka Connect Plugin Release 1.0.0

23 Jan 13:22
98d06d1

We're excited about the new release. Big thanks to @conker84 from our partner Larus IT for all the hard work of building this integration.

Thanks to your feedback, the Neo4j extension saw a number of fixed issues and more testing in the field. Please continue to try it for different use-cases and let us know how well it works for you. Special thanks to @lju-lazarevic and @sarmbruster.

Kafka Connect Plugin Release 1.0.0

Finally, the Neo4j Kafka integration is available as a Connect plugin. We provide the sink functionality, which will also be available on Confluent Hub.
You can test the plugin locally with the Docker Compose setup we provide.

For more details, see the docs.

New Procedure

We added a new procedure to receive events from a topic and use them in your Cypher statement. That's useful both for testing and for consuming events directly as part of another workflow.

We describe how to use it in the procedure documentation.
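
A minimal sketch of such a call; the topic name is a placeholder and the timeout config key is an assumption, so check the procedure documentation for the exact signature:

// Consume events from a topic and use them directly in Cypher
CALL streams.consume('my-topic', {timeout: 5000}) YIELD event
RETURN event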

Batching

To allow better control of batching, we added a configuration parameter, batch.size, which together with the Kafka setting max.poll.records allows consuming events in bulk (batches).
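
A rough neo4j.conf sketch; the exact property keys, in particular the prefix under which batch.size and max.poll.records are exposed, are assumptions to verify against the documentation:

# Standard Kafka consumer setting, assuming kafka.-prefixed properties are forwarded to the consumer
kafka.max.poll.records=1000
# Plugin-side batch size (hypothetical key; verify the exact name in the docs)
streams.sink.batch.size=1000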

Bugfixes & Enhancements