Rate limiting/throttling/circuit-breaking middleware for ASP.NET Core and Azure Functions.
Install from NuGet:
| ASP.NET Core | Azure Functions | Azure Functions with ASP.NET Core Integration |
|---|---|---|
| `dotnet add package ThrottlingTroll` | `dotnet add package ThrottlingTroll.AzureFunctions` | `dotnet add package ThrottlingTroll.AzureFunctionsAspNet` |
- Supports ASP.NET Core, Azure Functions (.NET Isolated) and Azure Functions with ASP.NET Core Integration.
- Ingress throttling, aka let your service automatically respond with `429 TooManyRequests` to some obtrusive clients:

  ```mermaid
  sequenceDiagram
      Client->>+YourService: 🌐HTTP
      alt limit exceeded?
          YourService-->>Client: ❌ 429 TooManyRequests
      else
          YourService-->>-Client: ✅ 200 OK
      end
  ```

  Implemented as an ASP.NET Core Middleware (for ASP.NET Core) and as an Azure Functions Middleware (for Azure Functions).
- Egress throttling, aka limit the number of calls your code is making against some external endpoint:

  ```mermaid
  sequenceDiagram
      YourService->>+HttpClient: SendAsync()
      alt limit exceeded?
          HttpClient-->>YourService: ❌ 429 TooManyRequests
      else
          HttpClient->>+TheirService: 🌐HTTP
          TheirService-->>-HttpClient: ✅ 200 OK
          HttpClient-->>-YourService: ✅ 200 OK
      end
  ```

  Implemented as an HttpClient DelegatingHandler, which produces a `429 TooManyRequests` response (without making the actual call) when a limit is exceeded.
- Propagating `429 TooManyRequests` from egress to ingress, aka when your service internally makes an HTTP request that results in `429 TooManyRequests`, your service can automatically respond with the same `429 TooManyRequests` to its calling client:

  ```mermaid
  sequenceDiagram
      Client->>+YourService: 🌐HTTP
      YourService->>+TheirService: 🌐HTTP
      TheirService-->>-YourService: ❌ 429 TooManyRequests
      YourService-->>-Client: ❌ 429 TooManyRequests
  ```
- Custom response fabrics. For ingress this gives you full control over what to return when a request is being throttled, and also allows you to implement delayed responses (instead of just returning `429 TooManyRequests`):

  ```mermaid
  sequenceDiagram
      Client->>+YourService: 🌐HTTP
      alt limit exceeded?
          YourService-->>YourService: await Task.Delay(RetryAfter)
          YourService-->>Client: ✅ 200 OK
      else
          YourService-->>-Client: ✅ 200 OK
      end
  ```

  For egress it also allows ThrottlingTroll to do automatic retries for you:

  ```mermaid
  sequenceDiagram
      YourService->>+HttpClient: SendAsync()
      loop while 429 TooManyRequests
          HttpClient->>+TheirService: 🌐HTTP
          TheirService-->>-HttpClient: ❌ 429 TooManyRequests
          HttpClient-->>HttpClient: await Task.Delay(RetryAfter)
      end
      HttpClient->>+TheirService: 🌐HTTP
      TheirService-->>-HttpClient: ✅ 200 OK
      HttpClient-->>-YourService: ✅ 200 OK
  ```
- Request deduplication, aka allowing only one request with a given ID per a certain period of time (and rejecting other requests with the same ID):

  ```mermaid
  sequenceDiagram
      par multiple requests with same id
          Client->>+YourService: 🌐/process-shopping-cart?id=123
          YourService-->>Client: ✅ 200 OK
          Client->>YourService: 🌐/process-shopping-cart?id=123
          YourService-->>Client: ❌ 429 TooManyRequests
          Client->>YourService: 🌐/process-shopping-cart?id=123
          YourService-->>-Client: ❌ 429 TooManyRequests
      end
  ```

  A budget way of ensuring Exactly-Once Processing.
- Storing rate counters in a distributed cache, making your rate limiting policy consistent across all your computing instances. Supported distributed counter stores include RedisCounterStore; see the relevant READMEs below for the full list.
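For illustration, here is a minimal sketch of plugging a Redis-backed counter store into the ASP.NET Core setup shown further below. Only the RedisCounterStore name comes from this README; the `CounterStore` option and the constructor taking a StackExchange.Redis `ConnectionMultiplexer` are assumptions, so verify them against the relevant README:

```csharp
using StackExchange.Redis;
using ThrottlingTroll;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/", () => "Hello ThrottlingTroll!");

app.UseThrottlingTroll(options =>
{
    // ASSUMPTION: the options object exposes a CounterStore property and
    // RedisCounterStore wraps a StackExchange.Redis connection
    options.CounterStore = new RedisCounterStore(
        ConnectionMultiplexer.Connect(builder.Configuration["RedisConnectionString"]));

    options.Config = new ThrottlingTrollConfig
    {
        Rules =
        [
            new ThrottlingTrollRule
            {
                LimitMethod = new FixedWindowRateLimitMethod
                {
                    PermitLimit = 100,
                    IntervalInSeconds = 60
                }
            }
        ]
    };
});

app.Run();
```

With counters kept in Redis instead of local memory, the 100-requests-per-minute budget is shared by all instances of the service.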
- Rules and limits can be configured in multiple ways:
  - Declaratively, aka using ThrottlingTrollAttribute. Aims for best readability.
  - Statically, aka via `appsettings.json`/`host.json`. Simplest.
  - Programmatically, at startup. In case you want to parametrize something.
  - Reactively. You provide a routine that fetches limits from wherever, plus an IntervalToReloadConfigInSeconds for that routine to be called periodically. Allows you to reconfigure rules and limits on the fly, without restarting your service (see the sketch after this list).

  And you can combine all four approaches in the same solution.
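As a rough illustration of the reactive option, here is a sketch that re-fetches the rules every 10 seconds. Only IntervalToReloadConfigInSeconds is named in this README; the name of the config-fetching delegate (`GetConfigFunc` below) is an assumption, so check the exact property name in the relevant README:

```csharp
using ThrottlingTroll;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/", () => "Hello ThrottlingTroll!");

app.UseThrottlingTroll(options =>
{
    // ASSUMPTION: the routine that fetches limits is assigned via GetConfigFunc
    options.GetConfigFunc = async () =>
    {
        // Fetch the current limit from wherever you keep it (database, config service, etc.).
        // Task.FromResult is just a stand-in to keep this sketch self-contained.
        int permitLimit = await Task.FromResult(10);

        return new ThrottlingTrollConfig
        {
            Rules =
            [
                new ThrottlingTrollRule
                {
                    LimitMethod = new FixedWindowRateLimitMethod
                    {
                        PermitLimit = permitLimit,
                        IntervalInSeconds = 60
                    }
                }
            ]
        };
    };

    // The routine above is re-run this often, so limits can change without a restart
    options.IntervalToReloadConfigInSeconds = 10;
});

app.Run();
```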
- IdentityIdExtractors, which allow you to limit clients individually, based on their IP addresses, API keys, tokens, headers, query strings, claims etc.
- CostExtractors, which you can use to assign custom costs to different requests. The default cost is 1, but if some of your requests are heavier than others, you can assign them higher costs. Another typical use case is to arrange different pricing tiers for your service: you set the rate limit to something high, and then "charge" clients differently, based on their pricing tier. Both extractors are sketched below.
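Here is a rough sketch of what per-client identities and per-request costs could look like on a rule. IdentityIdExtractor and CostExtractor are used as per-rule delegate properties (singular forms of the names above); the exact delegate signatures and the request object they receive are platform-specific, so treat the lambdas as illustrative and the ExtractApiKey()/IsHeavyRequest() helpers as hypothetical stand-ins for your own logic:

```csharp
using ThrottlingTroll;

// Hypothetical helpers: in a real app you would inspect the platform-specific
// request object here instead of returning constants
static string ExtractApiKey(object request) => "anonymous";
static bool IsHeavyRequest(object request) => false;

var rule = new ThrottlingTrollRule
{
    // Give each API key its own counter, so clients are limited individually
    IdentityIdExtractor = request => ExtractApiKey(request),

    // Heavy requests consume 10 units of the limit, everything else consumes 1.
    // Combined with a high PermitLimit, this is also how pricing tiers can be arranged.
    CostExtractor = request => IsHeavyRequest(request) ? 10 : 1,

    LimitMethod = new FixedWindowRateLimitMethod
    {
        PermitLimit = 100,
        IntervalInSeconds = 60
    }
};
```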
- FixedWindow. No more than PermitLimit requests are allowed in IntervalInSeconds. For example: no more than 2 requests per each 8 seconds. The typical drawback of the FixedWindow algorithm is that you'd get request rate bursts at the end of each window. So, specifically to cope with that, we have
- SlidingWindow. No more than PermitLimit requests are allowed in IntervalInSeconds, but that interval is split into NumOfBuckets. The main benefit of this algorithm over FixedWindow is that if a client constantly exceeds PermitLimit, it will never get any valid response and will always get `429 TooManyRequests`. For example: no more than 2 requests per each 8 seconds with 2 buckets. In other words, with SlidingWindow your service gets a smoother request rate.
- Semaphore, aka Concurrency Limiter. No more than PermitLimit requests are allowed to be executed concurrently. For example: no more than 3 concurrent requests. If you set Semaphore's PermitLimit to 1 and use RedisCounterStore, ThrottlingTroll will act as a distributed lock. If you add an IdentityIdExtractor (identifying requests by e.g. a query string parameter), it will turn into named distributed locks.
- CircuitBreaker. No more than PermitLimit failures are allowed in IntervalInSeconds. Once the failure limit is exceeded, the rule goes into Trial mode. In Trial mode one request per TrialIntervalInSeconds is allowed to pass through. Once that request succeeds, the rule goes back to normal.

  [video is coming]
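To make the algorithm settings above concrete, here is a sketch with one example rule per algorithm. Only FixedWindowRateLimitMethod appears verbatim in the sample further below; the other three class names are assumed by analogy with it (the property names come from the descriptions above), so verify them against the relevant README:

```csharp
using ThrottlingTroll;

// NOTE: apart from FixedWindowRateLimitMethod, the class names below are assumptions
var fixedWindowRule = new ThrottlingTrollRule
{
    // At most 2 requests per each 8-second window
    LimitMethod = new FixedWindowRateLimitMethod
    {
        PermitLimit = 2,
        IntervalInSeconds = 8
    }
};

var slidingWindowRule = new ThrottlingTrollRule
{
    // Same limit, but the window is split into 2 buckets, which smooths out bursts
    LimitMethod = new SlidingWindowRateLimitMethod
    {
        PermitLimit = 2,
        IntervalInSeconds = 8,
        NumOfBuckets = 2
    }
};

var semaphoreRule = new ThrottlingTrollRule
{
    // At most 3 requests executed concurrently
    LimitMethod = new SemaphoreRateLimitMethod
    {
        PermitLimit = 3
    }
};

var circuitBreakerRule = new ThrottlingTrollRule
{
    // After 5 failures within 10 seconds, switch to Trial mode and let one request
    // through per 30 seconds, until a request succeeds again
    LimitMethod = new CircuitBreakerRateLimitMethod
    {
        PermitLimit = 5,
        IntervalInSeconds = 10,
        TrialIntervalInSeconds = 30
    }
};
```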
Most concepts and features are the same for all supported platforms. Things that are specific to each platform are highlighted in the relevant READMEs:
| ASP.NET Core | Azure Functions | Azure Functions with ASP.NET Core Integration |
|---|---|---|
| How to use with ASP.NET Core | How to use with Azure Functions | How to use with Azure Functions ASP.NET Core Integration |
Full minimalistic sample using ASP.NET Core Minimal API:
```csharp
using ThrottlingTroll;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/", () => "Hello ThrottlingTroll!");

// Limiting to 1 request per 2 seconds
app.UseThrottlingTroll(options =>
{
    options.Config = new ThrottlingTrollConfig
    {
        Rules =
        [
            new ThrottlingTrollRule
            {
                LimitMethod = new FixedWindowRateLimitMethod
                {
                    PermitLimit = 1,
                    IntervalInSeconds = 2
                }
            }
        ]
    };
});

app.Run();
```
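To see the limit in action, you can hit the endpoint from a small console client. The sketch below assumes the app above listens on http://localhost:5000 (adjust the URL to whatever your launch profile actually uses); with 1 request allowed per 2 seconds, the second and third calls made in quick succession should come back as 429:

```csharp
using System;
using System.Net.Http;

// Plain HttpClient, no ThrottlingTroll involved on the client side
using var client = new HttpClient();

for (int i = 1; i <= 3; i++)
{
    // NOTE: the port is an assumption, take it from your launchSettings.json
    var response = await client.GetAsync("http://localhost:5000/");

    Console.WriteLine($"Attempt {i}: {(int)response.StatusCode} {response.StatusCode}");
}
```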
Comprehensive sample projects that demonstrate all the above concepts are located in separate repos:
| ASP.NET Core | Azure Functions |
|---|---|
| ThrottlingTroll-AspDotNetCore-Samples | ThrottlingTroll-AzureFunctions-Samples |
Contributing is very much welcomed.