Reduce cost of responding to unary calls. #932

Merged
merged 1 commit into grpc:main on Aug 10, 2020

Conversation

@Lukasa Lukasa commented Aug 10, 2020

Motivation:

When looking at the unary response path I noticed that there was quite a
lot of flushing in the callback chain for sending unary responses. In
particular, as this callback chain performed only synchronous operations
(`.map` and `.recover`) there was no need for multiple flushes: the
final `.whenSuccess` covered all code paths and had an unconditional
`flush` in it, so that flush would cover the other operations.

However, on further inspection I realised that I could refactor the
entire code path to use a single Future callback. This ends up being a
substantial win. Unnecessary future callbacks put pressure on the
allocator, as they tend to have to allocate closure contexts (if the
closure captures, as all of these did) as well as additional Future
objects for return values. Reducing unnecessary chaining can therefore
be a tidy win when it happens in hot paths.

Modifications:

1. Replace a `writeAndFlush` with a `write(..., promise: nil)`, removing
   both an unnecessary flush and a future allocation.
2. Condense a long callback chain into a single function using
   `.whenComplete`.

Results:

The previous code allocated 4 closure contexts and 4 futures, and
flushed twice. The new code allocates 1 closure context, zero futures,
and flushes once. A tidy little profit: 4.7% improvement in unary call
microbenchmarks.
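
For illustration, here is a minimal, self-contained sketch of the before/after shape described above. The `ResponsePart` enum, the `UnaryResponder` class, and the method names are invented for this example and are not the actual grpc-swift types; only the callback-chain shape mirrors the change.

```swift
import NIO

// Illustrative stand-ins for the real response parts (not grpc-swift types).
enum ResponsePart {
    case message(String)
    case status(String)
}

final class UnaryResponder {
    let channel: Channel
    init(channel: Channel) { self.channel = channel }

    // Before: each step allocates a closure context, and `map`/`recover`
    // allocate fresh futures for their return values. Flushing happens twice:
    // once inside `writeAndFlush` and once more in the final `whenSuccess`.
    func respondChained(to response: EventLoopFuture<String>) {
        response
            .map { message -> ResponsePart in
                // The future returned by `writeAndFlush` is never awaited or used.
                _ = self.channel.writeAndFlush(NIOAny(ResponsePart.message(message)))
                return .status("OK")
            }
            .recover { _ in .status("Internal error") }
            .whenSuccess { status in
                self.channel.writeAndFlush(NIOAny(status), promise: nil)
            }
    }

    // After: one `whenComplete` callback, no intermediate futures, plain
    // `write(..., promise: nil)` calls, and a single flush at the end.
    func respondFlattened(to response: EventLoopFuture<String>) {
        response.whenComplete { result in
            switch result {
            case .success(let message):
                self.channel.write(NIOAny(ResponsePart.message(message)), promise: nil)
                self.channel.write(NIOAny(ResponsePart.status("OK")), promise: nil)
            case .failure:
                self.channel.write(NIOAny(ResponsePart.status("Internal error")), promise: nil)
            }
            self.channel.flush()
        }
    }
}
```

This mirrors why collapsing the chain reduces allocations: each `map`/`recover` step costs a closure context plus a future for its result, while the flattened version needs only the single `whenComplete` closure and no intermediate futures.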

@Lukasa Lukasa requested a review from glbrntt August 10, 2020 14:15
```
@@ -69,29 +69,25 @@ open class UnaryResponseCallContextImpl<ResponsePayload>: UnaryResponseCallConte
    super.init(eventLoop: channel.eventLoop, request: request, logger: logger)

    responsePromise.futureResult
        // Send the response provided to the promise.
        .map { responseMessage -> EventLoopFuture<Void> in
            return self.channel.writeAndFlush(NIOAny(WrappedResponse.message(.init(responseMessage, compressed: self.compressionEnabled))))
```
@Lukasa (Collaborator, Author) commented on this change:

This change here may be non-obvious: I actually missed it the first time I looked at this code.

While the `Future` returned from this `writeAndFlush` looks important (it's even called out in the type signature of the closure), it actually isn't. This is a `map` block, not a `flatMap`, so the rest of the callback chain does not wait for the future to complete before proceeding. In fact, the very next `map` block discards the `Future` that this code went to so much effort to construct.

This is why I was able to flatten all of this code down: in the end, everything was secretly just synchronous code, hidden in an async chain.
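
To make the distinction concrete, here is a small self-contained sketch (not grpc-swift code; the names and values are illustrative) of why a future returned from inside a `map` closure is effectively dropped, while `flatMap` actually chains on it:

```swift
import NIO

let loop = EmbeddedEventLoop()
let promise = loop.makePromise(of: Int.self)

// With `map`, the closure's return value is wrapped as-is. Returning a future
// produces a nested EventLoopFuture<EventLoopFuture<String>>, and the outer
// chain never waits for (or even looks at) the inner future.
let nested: EventLoopFuture<EventLoopFuture<String>> = promise.futureResult.map { value in
    loop.makeSucceededFuture("wrote \(value)")
}

// With `flatMap`, the returned future is spliced into the chain: `chained`
// only completes once the inner future does.
let chained: EventLoopFuture<String> = promise.futureResult.flatMap { value in
    loop.makeSucceededFuture("wrote \(value)")
}

chained.whenSuccess { print($0) }  // prints "wrote 42" once the promise is completed
promise.succeed(42)
try loop.syncShutdownGracefully()
```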

@glbrntt glbrntt left a comment

Nice!

@glbrntt glbrntt added the 🔨 semver/patch No public API change. label Aug 10, 2020
@glbrntt glbrntt merged commit 3790c87 into grpc:main Aug 10, 2020