
Best practice to work with Django and confluent_kafka consumer/producer #1215

itamar27 opened this issue Oct 5, 2021 · 1 comment

itamar27 commented Oct 5, 2021

Description

Hello,
I am trying to integrate confluent_kafka into my Django webapp and it is giving me quite a hard time.
I have several questions; hopefully you can help me with one of them.

My webapp is supposed to serve as an API endpoint for another large service.
I am running Kafka in a Docker container, orchestrated with docker-compose.
For every message I produce (each HTTP request is turned into a Kafka message) I want to read the corresponding response from a different topic with my consumer.
To do that, every message carries a unique ID and I match requests to responses by that ID.
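Roughly, the flow I have in mind looks like this (a minimal sketch with hypothetical topic names and a hard-coded broker address, not my actual code):

```python
import json
import uuid

from confluent_kafka import Consumer, Producer

REQUEST_TOPIC = 'service.requests'   # hypothetical topic names
REPLY_TOPIC = 'service.replies'

producer = Producer({'bootstrap.servers': 'localhost:9092'})
consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': '1',
    'auto.offset.reset': 'largest',
})
consumer.subscribe([REPLY_TOPIC])


def handle_request(payload):
    """Produce a request and wait for the reply carrying the same ID."""
    correlation_id = str(uuid.uuid4())
    producer.produce(REQUEST_TOPIC, key=correlation_id, value=json.dumps(payload))
    producer.flush()

    # Poll the reply topic until a message with the matching ID shows up
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        if msg.key() and msg.key().decode() == correlation_id:
            return json.loads(msg.value())
```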

#1 - The first question: is there a 'best practice' for using the consumer in this setup?
Is it recommended to use web sockets, or should I just call the consumer every time after I produce a message to the broker?

#2 - After setting up the consumer and integrating it in the app's 'views.py', the consumer keeps crashing (and therefore crashes the Django server). Sometimes it happens during the subscription to the topic (consumer.subscribe(topics)), and sometimes while the consumer tries to read a message. Both fail with an error along the lines of 'No topic was found with that name'.
The consumer config looks like this:

conf = {
    'bootstrap.servers': '{}:{}'.format(env('KAFKA_SERVER'), env('KAFKA_PORT')),
    'group.id': "1",
    'auto.offset.reset': 'largest',
    'allow.auto.create.topics': True
}
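The call that fails looks roughly like this (a trimmed sketch, hypothetical topic name), and I can enable the 'debug' setting to collect the client logs asked for in the checklist:

```python
from confluent_kafka import Consumer, KafkaException

conf = {
    'bootstrap.servers': '{}:{}'.format(env('KAFKA_SERVER'), env('KAFKA_PORT')),
    'group.id': '1',
    'auto.offset.reset': 'largest',
    'allow.auto.create.topics': True,
    'debug': 'consumer,cgrp,topic',   # for capturing client logs
}

consumer = Consumer(conf)
consumer.subscribe(['responses'])     # hypothetical topic name
msg = consumer.poll(5.0)
if msg is not None and msg.error():
    # this is where the 'unknown topic' style error surfaces
    raise KafkaException(msg.error())
```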

How to reproduce

Checklist

Please provide the following information:

  • confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()): 1.7.0
  • Apache Kafka broker version: 6.2.0
  • Client configuration: {'bootstrap.servers': '{}:{}'.format(env('KAFKA_SERVER'), env('KAFKA_PORT')), 'group.id': "1", 'auto.offset.reset': 'largest', 'allow.auto.create.topics': True}
  • Operating system: ubuntu
  • Provide client logs (with 'debug': '..' as necessary)
  • Provide broker log excerpts
  • Critical issue
@OneCricketeer

Not sure "best practice", but you ideally should have a separate thread for the consumer that doesn't interfere with your server's lifecycle
