Remove confluent schema registry hard dependency #6552
Conversation
@ComponentScan(basePackages = {"com.linkedin.metadata.kafka", "com.linkedin.gms.factory.config",
    "com.linkedin.gms.factory.common"},
    excludeFilters = {@ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = ScheduledAnalyticsFactory.class),
        @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = SiblingGraphServiceFactory.class)}
not sure why this needs to be excluded as part of this PR
They are being injected through other classes, which means that when running this in standalone mode you would have multiple sources for the same beans.
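For context, here is a minimal sketch of the Spring pattern being discussed (package and class names are hypothetical, not DataHub's): an `excludeFilters` entry on `@ComponentScan` keeps a factory out of the scan when its beans already reach the application context through another configuration, which is exactly the duplicate-bean situation described above for standalone mode.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.FilterType;

// Hypothetical factory that some other configuration already imports explicitly.
@Configuration
class SharedServiceFactory {
  @Bean
  String sharedService() {
    return "shared";
  }
}

// Standalone entry point: scan the package but skip SharedServiceFactory,
// since its beans are already registered through the importing configuration.
// Without the exclude filter, the same bean would be defined from two sources.
@Configuration
@ComponentScan(
    basePackages = "com.example.app",
    excludeFilters = @ComponentScan.Filter(
        type = FilterType.ASSIGNABLE_TYPE,
        classes = SharedServiceFactory.class))
class StandaloneAppConfig {}
```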
Checklist
The purpose of this PR is to remove the Confluent schema registry as a hard dependency of DataHub.
To do that, the PR creates a new module, `kafka-avro-serde`, which implements Kafka-compatible `Serializer` and `Deserializer` classes. Also included is a static, internal lightweight schema registry implementation that knows how to serde DataHub's Avro records (MCP, MCL and their variants).
Based on #5232, kudos to @tmnd1991!
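As a rough illustration of the serde approach (a sketch, not the actual classes added by this PR; the class name and the plain-Avro wire format are assumptions), a Kafka `Serializer` that writes Avro records without calling out to a Confluent schema registry could look like the following. The real `kafka-avro-serde` module would additionally consult its internal static schema registry to resolve the schemas for MCP/MCL records on the deserializer side.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Map;

import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical serializer sketch: writes an Avro GenericRecord as plain Avro
// binary using the record's own schema, with no call to an external registry.
public class InternalAvroSerializer implements Serializer<GenericRecord> {

  @Override
  public void configure(Map<String, ?> configs, boolean isKey) {
    // No external registry to configure.
  }

  @Override
  public byte[] serialize(String topic, GenericRecord record) {
    if (record == null) {
      return null;
    }
    try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
      BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
      new GenericDatumWriter<GenericRecord>(record.getSchema()).write(record, encoder);
      encoder.flush();
      return out.toByteArray();
    } catch (IOException e) {
      throw new RuntimeException("Failed to serialize Avro record for topic " + topic, e);
    }
  }

  @Override
  public void close() {}
}
```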