Hello,

We make heavy use of Avro union schemas in our environment and services. When running Kafka-based load tests with xk6-kafka, we get errors while serializing objects. I have seen issue #220, but the problem we are experiencing seems slightly different.
When we try to serialize objects by calling SchemaRegistry.serialize() and reference a union schema, we essentially get the following error:

Union item 1 ought to be valid Avro type: unknown type name: "test.ProductCreated"
Debugging through the xk6-kafka and goavro code, it appears that the union is correctly identified but the inner record type is not. It seems that xk6-kafka does not recursively resolve union members that are records, so a nil codec is eventually returned, which results in the panic above.
You can reproduce the error by registering two schemas in the schema registry, under the subjects test.ProductCreated and products.default-value, and executing an xk6-kafka test that serializes a value against the union subject.
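A minimal pair of schemas matching that description would be a named record, plus a second subject whose schema is a union referencing that record by its full name; the field names here are invented for illustration:

```json
{
  "type": "record",
  "name": "ProductCreated",
  "namespace": "test",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "name", "type": "string"}
  ]
}
```

and, under products.default-value:

```json
["null", "test.ProductCreated"]
```

Compiling the union schema on its own fails, because "test.ProductCreated" is a reference that is only resolvable if the record schema has already been loaded into the same codec.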
I think xk6-kafka should probably load the schemas associated with union members recursively and create Codec objects from them. I'm not entirely sure where the changes should go, and this may not even be the right solution. I'd be happy to submit a PR if I can get a couple of pointers.
Cheers
This seems to be a limitation of the goavro library, as raised here. However, I suppose you can make it work using a combination of the schema subject and the data, roughly like this, or something along the same lines:
Thanks for the quick reply. I believe I had tried that approach; in any case, I just tried it again, and it fails for the same reason (a panic due to the nil pointer). Debugging through the code, the error message I get is the following (same as before):
Union item 1 ought to be valid Avro type: unknown type name: "test.ProductCreated"
Given that the goavro question you referred to is more than five years old and still hasn't been addressed, I guess the chances of it being fixed are not high. In any case, we have worked around the problem, but wanted to see whether there was a proper way to fix it.