[Docs] Document Retry's MaxDelay property (#1631)
martintmk authored Sep 26, 2023
1 parent e51e05e commit 5aae083
Showing 2 changed files with 72 additions and 0 deletions.
36 changes: 36 additions & 0 deletions docs/strategies/retry.md
@@ -105,6 +105,7 @@ new ResiliencePipelineBuilder().AddRetry(new RetryStrategyOptions
| `UseJitter` | False | Allows adding jitter to retry delays. |
| `DelayGenerator` | `null` | Used for generating custom delays for retries. |
| `OnRetry` | `null` | Action executed when retry occurs. |
| `MaxDelay` | `null` | Caps the calculated retry delay to a specified maximum duration. |

## Patterns and anti-patterns

@@ -481,3 +482,38 @@ var retry = new ResiliencePipelineBuilder()
**Reasoning**:

As previously mentioned, always use the designated area to define retry conditions. Reframe your original exit conditions to specify when a retry should be initiated.

### Limiting the maximum delay

In some cases, you might want to set a limit on the calculated delay. This is beneficial when many retries are anticipated and you wish to prevent excessive wait times between them.
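For intuition, consider how quickly exponential backoff grows from a 2-second base delay: without a cap, the 10th retry would already wait over 17 minutes. The sketch below (a standalone illustration, not part of the Polly API; jitter omitted for clarity) shows how clamping to a maximum bounds that growth:

```csharp
using System;

// Illustrative only: exponential backoff delay(n) = baseDelay * 2^n,
// clamped to maxDelay. Polly applies an equivalent cap via MaxDelay.
TimeSpan baseDelay = TimeSpan.FromSeconds(2);
TimeSpan maxDelay = TimeSpan.FromMinutes(15);

for (int attempt = 0; attempt < 12; attempt++)
{
    TimeSpan computed = TimeSpan.FromSeconds(baseDelay.TotalSeconds * Math.Pow(2, attempt));
    TimeSpan actual = computed > maxDelay ? maxDelay : computed;
    Console.WriteLine($"Attempt {attempt}: waiting {actual}");
}
```

From the 9th retry onward (2 s × 2⁹ ≈ 17 min), every computed delay exceeds the cap and the wait settles at 15 minutes.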

Consider the following example of a long-running background job:

<!-- snippet: retry-pattern-max-delay -->
```cs
ResiliencePipeline pipeline = new ResiliencePipelineBuilder()
.AddRetry(new()
{
Delay = TimeSpan.FromSeconds(2),
MaxRetryAttempts = int.MaxValue,

    // Initially, we aim for an exponential backoff, but after a certain number of retries, we set a maximum delay of 15 minutes.
MaxDelay = TimeSpan.FromMinutes(15),
UseJitter = true
})
.Build();

// Background processing
while (!cancellationToken.IsCancellationRequested)
{
await pipeline.ExecuteAsync(async token =>
{
        // In the event of a prolonged service outage, we can afford to wait for a successful retry since this is a background task.
await SynchronizeDataAsync(token);
},
cancellationToken);

    await Task.Delay(TimeSpan.FromMinutes(30)); // The sync runs every 30 minutes.
}
```
<!-- endSnippet -->
36 changes: 36 additions & 0 deletions src/Snippets/Docs/Retry.cs
@@ -109,6 +109,42 @@ private static bool TryGetDelay(HttpResponseMessage response, out TimeSpan delay
return false;
}

public static async Task MaxDelay()
{
var cancellationToken = CancellationToken.None;

#region retry-pattern-max-delay

ResiliencePipeline pipeline = new ResiliencePipelineBuilder()
.AddRetry(new()
{
Delay = TimeSpan.FromSeconds(2),
MaxRetryAttempts = int.MaxValue,

// Initially, we aim for an exponential backoff, but after a certain number of retries, we set a maximum delay of 15 minutes.
MaxDelay = TimeSpan.FromMinutes(15),
UseJitter = true
})
.Build();

// Background processing
while (!cancellationToken.IsCancellationRequested)
{
await pipeline.ExecuteAsync(async token =>
{
// In the event of a prolonged service outage, we can afford to wait for a successful retry since this is a background task.
await SynchronizeDataAsync(token);
},
cancellationToken);

await Task.Delay(TimeSpan.FromMinutes(30)); // The sync runs every 30 minutes.
}

#endregion

static ValueTask SynchronizeDataAsync(CancellationToken cancellationToken) => default;
}

public static void AntiPattern_1()
{
#region retry-anti-pattern-1
