This repository was archived by the owner on Feb 23, 2025. It is now read-only.

Race condition in RefreshTokenDelegatingHandler #266

Closed
Nickolas- opened this issue Nov 30, 2020 · 1 comment

Nickolas- commented Nov 30, 2020

Hi, I'm not sure whether this is expected behaviour or whether I'm using the library incorrectly, but if you add RefreshTokenDelegatingHandler to the HttpClient pipeline, you can get an Unauthorized HTTP response even though you hold a valid refresh token. This is caused by

private async Task<bool> RefreshTokensAsync(CancellationToken cancellationToken)

When you run two HTTP requests in parallel with an invalid access token, this logic has a race condition:

        var refreshToken = RefreshToken;
        if (refreshToken.IsMissing())
        {
            return false;
        }

        if (await _lock.WaitAsync(Timeout, cancellationToken).ConfigureAwait(false))

The issue occurs when one request finishes earlier than the other. For example, say the refresh token starts out as "FIRST". The first call to RefreshTokensAsync rotates it to "TWO", but the other in-flight request still tries to refresh with "FIRST" and gets an error response, because that token no longer exists in IdentityServer. Can anyone give me some thoughts on this?
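A minimal model of the failure mode (hypothetical names, sketched in Java rather than the library's C#): both callers snapshot the refresh token before entering the lock, so the second redeem presents a value the server has already rotated away.

```java
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical model of the race: the token snapshot is taken BEFORE the
// lock is acquired, so the second caller redeems a stale value.
class RaceDemo {
    private final ReentrantLock lock = new ReentrantLock();
    private String serverToken = "FIRST"; // the only token the server accepts
    private String clientToken = "FIRST"; // the handler's cached copy
    private int counter = 0;

    // Simulated token endpoint: rotates the token on each successful redeem.
    private String redeem(String token) {
        if (!token.equals(serverToken)) return null; // grant no longer exists
        serverToken = "T" + (++counter);
        return serverToken;
    }

    // Buggy shape: the snapshot was captured outside the critical section.
    boolean refreshBuggy(String snapshot) {
        lock.lock();
        try {
            String rotated = redeem(snapshot);
            if (rotated == null) return false;
            clientToken = rotated;
            return true;
        } finally {
            lock.unlock();
        }
    }

    String snapshot() { return clientToken; }
}
```

With two requests that both captured "FIRST" before either entered the lock, the first refresh succeeds and rotates the token, and the second fails with the equivalent of invalid_grant.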


KennyDM commented Apr 14, 2021

I can confirm this is a real issue; we are hitting the same problem on our side.

We believe the issue lies in the fact that the retrieval of the refresh token is not contained within the critical section of the lock where the actual refresh happens (see the code sample from Nickolas).

This allows a second call, that was awaiting the lock, to use an old value of the refresh token, even after a first call successfully refreshed the tokens. The result is as @Nickolas- described, that the second call fails to refresh the tokens, because it cannot find the handle anymore in the PersistedGrants (because the first call successfully refreshed both access and refresh tokens).

We just discovered this issue in our mobile app. We will try to work around it by implementing the RefreshTokenDelegatingHandler ourselves and refactoring the RefreshTokensAsync call a bit, so that the retrieval of the refresh token happens within the critical section of the lock.

Something like this:

	private async Task<bool> RefreshTokensAsync(CancellationToken cancellationToken)
	{
		// code removed here

		if (await _lock.WaitAsync(Timeout, cancellationToken).ConfigureAwait(continueOnCapturedContext: false))
		{
			try
			{
				// code moved here: read the refresh token inside the critical
				// section, so a second caller sees the value the first refresh wrote
				if (string.IsNullOrEmpty(_refreshToken)) return false; // private field, since we are in the lock

				RefreshTokenResult response = await _oidcClient.RefreshTokenAsync(_refreshToken, null, cancellationToken).ConfigureAwait(continueOnCapturedContext: false);

				// sketch of the remainder: store the rotated tokens
				if (response.IsError) return false;
				_accessToken = response.AccessToken;
				if (!string.IsNullOrEmpty(response.RefreshToken))
				{
					_refreshToken = response.RefreshToken;
				}
				return true;
			}
			finally
			{
				_lock.Release();
			}
		}

		return false;
	}
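As a self-contained illustration of why this helps (hypothetical names, sketched in Java rather than the library's C#): when each caller re-reads the current refresh token inside the critical section, a second refresh presents the rotated value and succeeds instead of failing with a stale one.

```java
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical sketch of the fixed shape: the refresh token is read
// INSIDE the lock, so each caller sees the latest rotated value.
class TokenStore {
    private final ReentrantLock lock = new ReentrantLock();
    private String serverToken = "FIRST";  // the only token the server accepts
    private String refreshToken = "FIRST"; // the handler's cached copy
    private int counter = 0;

    // Simulated token endpoint: rotates the token on each successful redeem.
    private String redeem(String token) {
        if (!token.equals(serverToken)) return null; // grant no longer exists
        serverToken = "TOKEN-" + (++counter);
        return serverToken;
    }

    boolean refreshTokens() {
        lock.lock();
        try {
            String current = refreshToken; // read inside the critical section
            String rotated = redeem(current);
            if (rotated == null) return false;
            refreshToken = rotated;
            return true;
        } finally {
            lock.unlock();
        }
    }

    String currentRefreshToken() { return refreshToken; }
}
```

Here two back-to-back refreshes both succeed, because the second one reads the token the first one wrote rather than a snapshot taken before the lock.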

jochenz pushed a commit to jochenz/IdentityModel.OidcClient that referenced this issue Apr 17, 2021
This reverts commit a773a6c.
Consistently green on the local machine.
leastprivilege pushed a commit that referenced this issue May 2, 2021
…#303)

* initial test

* tests which demonstrate the race condition (and the need to run the handler as a singleton)

* remove singleton test

* fixed race condition on RefreshTokenDelegatingHandler (only happens when using refresh tokens with sliding expirations)

* fix overly strict test

* Revert "fixed race condition on RefreshTokenDelegatingHandler (only happens when using refresh tokens with sliding expirations)"

This reverts commit 21c8f1c.

* updated test fails regularly on my local machine

* Reintroduced the fix for issue #266 again

This reverts commit a773a6c.
Consistently green on the local machine.

Co-authored-by: Jochen Zeischka <[email protected]>