rate-limiter-flexible doesn't work since 1.11 #2034
@yovanoc is it possible to get a minimal reproducible example?
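A minimal reproduction along the lines requested might look like the following sketch (assumptions: the npm packages `ioredis` and `rate-limiter-flexible` are installed, a Dragonfly instance is reachable at a `DRAGONFLY_URL` address, and the limits below are placeholders, not the reporter's actual configuration):

```javascript
// Sketch of a minimal reproduction. Nothing runs unless DRAGONFLY_URL is set,
// so the script is safe to load without a server.
async function main() {
  const { default: Redis } = await import("ioredis");
  const { RateLimiterRedis } = await import("rate-limiter-flexible");

  const limiter = new RateLimiterRedis({
    storeClient: new Redis(process.env.DRAGONFLY_URL),
    keyPrefix: "repro",
    points: 5,   // placeholder: 5 requests...
    duration: 1, // ...per 1-second window
  });

  const res = await limiter.consume("some-key");
  // With a 1-second window, msBeforeNext should never exceed roughly 1000 ms;
  // the report in this issue is that it comes back far larger than that.
  console.log("msBeforeNext:", res.msBeforeNext);
}

// Only attempt a connection when a server address is provided.
if (process.env.DRAGONFLY_URL) main().catch(console.error);
```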
I can't right now, but I'll try to do that later.
Small friendly ping/reminder @yovanoc :)
Sorry for the delay.
Interesting!
We used to set `time_now_ms_` only in the non-squashed execution path. Fixes #2034
Oh nice! Glad to help where I can.
Happy to see this merged. But where can we see your push-to-prod routine? I see you have dragonfly, helm-dragonfly, and dragonfly-weekly; where is your release process explained? Thanks.
We release new versions every few weeks; you can follow our announcements on GitHub / Discord.
Yeah, but if I understand correctly, the k8s operator package is still on 1.10, so I just wanted to know the pattern you follow between your packages.
Indeed, the operator still uses 1.10. We'll update it this week. |
We do not bundle k8s operator and Dragonfly releases. But you can always override the
This issue is still there in 1.12.1.
@yovanoc I just tried locally by downloading the latest version, and I cannot reproduce this issue.
My bad, I had to delete all the old keys, because they were still there with these long expirations.
Ah, that makes sense :) |
Probably because of this: https://github.com/animir/node-rate-limiter-flexible/blob/846b5a28987f28e0e13b5ec7965def4aa39a22ab/lib/RateLimiterRedis.js#L4
When I log the `.consume` response, it gives me a big `msBeforeNext`.
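For context on why a server-side TTL bug surfaces as a big `msBeforeNext`, here is a simplified model of how a Redis-backed limiter could turn the server reply into the fields seen on `.consume()` (a sketch, not the library's actual code; the reply shape `[consumedPoints, pttlMs]` and the `toLimiterResult` helper are assumptions for illustration — see the linked RateLimiterRedis.js for the real logic):

```javascript
// Simplified model: msBeforeNext is taken directly from the server's PTTL,
// so a mis-reported PTTL inflates msBeforeNext with it. The "suspicious"
// flag is a client-side sanity check added here for illustration.
function toLimiterResult([consumedPoints, pttlMs], { points, duration }) {
  return {
    consumedPoints,
    remainingPoints: Math.max(0, points - consumedPoints),
    msBeforeNext: pttlMs,                 // forwarded from PTTL unchanged
    suspicious: pttlMs > duration * 1000, // TTL exceeds the configured window
  };
}

const opts = { points: 5, duration: 1 }; // 5 requests per 1-second window
console.log(toLimiterResult([1, 950], opts));
// With a server that mis-reports PTTL, msBeforeNext becomes huge:
console.log(toLimiterResult([1, 1737000000000], opts).suspicious); // true
```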
Is this a Dragonfly regression, or was it only working before despite an issue in the rate-limiter package?