
Consult cached contextual types only when no contextFlags #52611

Merged
ahejlsberg merged 2 commits into main from fix52575 on Feb 10, 2023

Conversation

ahejlsberg
Member

Fixes #52575.

typescript-bot added the Author: Team and For Milestone Bug (PRs that fix a bug with a specific milestone) labels on Feb 4, 2023
@ahejlsberg
Member Author

@typescript-bot test this
@typescript-bot user test this inline
@typescript-bot run dt
@typescript-bot perf test faster
@typescript-bot test top100

@typescript-bot
Collaborator

typescript-bot commented Feb 4, 2023

Heya @ahejlsberg, I've started to run the extended test suite on this PR at 7c22dcb. You can monitor the build here.

@typescript-bot
Collaborator

typescript-bot commented Feb 4, 2023

Heya @ahejlsberg, I've started to run the abridged perf test suite on this PR at 7c22dcb. You can monitor the build here.

Update: The results are in!

@typescript-bot
Collaborator

typescript-bot commented Feb 4, 2023

Heya @ahejlsberg, I've started to run the diff-based top-repos suite on this PR at 7c22dcb. You can monitor the build here.

Update: The results are in!

@typescript-bot
Collaborator

typescript-bot commented Feb 4, 2023

Heya @ahejlsberg, I've started to run the diff-based user code test suite on this PR at 7c22dcb. You can monitor the build here.

Update: The results are in!

@typescript-bot
Collaborator

typescript-bot commented Feb 4, 2023

Heya @ahejlsberg, I've started to run the parallelized Definitely Typed test suite on this PR at 7c22dcb. You can monitor the build here.

@typescript-bot
Collaborator

@ahejlsberg Here are the results of running the user test suite comparing main and refs/pull/52611/merge:

Everything looks good!

@typescript-bot
Collaborator

Heya @ahejlsberg, I've run the RWC suite on this PR - assuming you're on the TS core team, you can view the resulting diff here.

@typescript-bot
Collaborator

@ahejlsberg
The results of the perf run you requested are in!

Here they are:

Comparison Report - main..52611

Metric main 52611 Delta Best Worst p-value
Angular - node (v16.17.1, x64)
Memory used 359,220k (± 0.01%) 359,201k (± 0.01%) ~ 359,151k 359,234k p=0.298 n=6
Parse Time 4.18s (± 0.38%) 4.19s (± 0.58%) ~ 4.16s 4.22s p=0.934 n=6
Bind Time 1.23s (± 0.00%) 1.22s (± 0.80%) ~ 1.21s 1.23s p=0.073 n=6
Check Time 9.47s (± 0.31%) 9.45s (± 0.26%) ~ 9.42s 9.48s p=0.197 n=6
Emit Time 8.05s (± 0.16%) 8.10s (± 0.84%) +0.05s (+ 0.64%) 8.06s 8.24s p=0.014 n=6
Total Time 22.93s (± 0.09%) 22.96s (± 0.36%) ~ 22.87s 23.10s p=0.809 n=6
Compiler-Unions - node (v16.17.1, x64)
Memory used 193,990k (± 0.03%) 195,121k (± 0.89%) ~ 193,939k 197,476k p=0.378 n=6
Parse Time 1.80s (± 0.46%) 1.81s (± 0.45%) ~ 1.80s 1.82s p=0.718 n=6
Bind Time 0.85s (± 0.48%) 0.85s (± 0.96%) ~ 0.84s 0.86s p=0.584 n=6
Check Time 10.45s (± 0.81%) 10.46s (± 0.55%) ~ 10.36s 10.54s p=0.419 n=6
Emit Time 3.06s (± 0.96%) 3.05s (± 0.53%) ~ 3.03s 3.07s p=0.460 n=6
Total Time 16.17s (± 0.54%) 16.16s (± 0.36%) ~ 16.08s 16.26s p=0.873 n=6
Monaco - node (v16.17.1, x64)
Memory used 343,278k (± 0.01%) 343,268k (± 0.01%) ~ 343,234k 343,284k p=0.810 n=6
Parse Time 3.15s (± 1.13%) 3.18s (± 1.44%) ~ 3.13s 3.24s p=0.295 n=6
Bind Time 1.12s (± 0.56%) 1.11s (± 0.46%) ~ 1.11s 1.12s p=0.091 n=6
Check Time 7.72s (± 0.31%) 7.76s (± 0.45%) ~ 7.73s 7.82s p=0.053 n=6
Emit Time 4.53s (± 0.61%) 4.55s (± 0.69%) ~ 4.50s 4.59s p=0.226 n=6
Total Time 16.52s (± 0.30%) 16.61s (± 0.51%) ~ 16.51s 16.73s p=0.106 n=6
TFS - node (v16.17.1, x64)
Memory used 299,769k (± 0.01%) 299,776k (± 0.01%) ~ 299,748k 299,798k p=0.470 n=6
Parse Time 2.49s (± 2.15%) 2.49s (± 1.67%) ~ 2.45s 2.56s p=1.000 n=6
Bind Time 1.26s (± 0.41%) 1.25s (± 0.44%) ~ 1.25s 1.26s p=0.640 n=6
Check Time 7.21s (± 0.47%) 7.20s (± 0.44%) ~ 7.17s 7.26s p=0.935 n=6
Emit Time 4.23s (± 0.86%) 4.25s (± 0.44%) ~ 4.22s 4.27s p=0.369 n=6
Total Time 15.18s (± 0.75%) 15.19s (± 0.47%) ~ 15.12s 15.29s p=0.936 n=6
material-ui - node (v16.17.1, x64)
Memory used 476,134k (± 0.01%) 476,130k (± 0.00%) ~ 476,119k 476,138k p=0.688 n=6
Parse Time 3.71s (± 0.24%) 3.73s (± 0.39%) +0.02s (+ 0.58%) 3.71s 3.75s p=0.028 n=6
Bind Time 1.02s (± 0.00%) 1.02s (± 0.50%) ~ 1.02s 1.03s p=0.174 n=6
Check Time 18.13s (± 0.23%) 18.21s (± 0.75%) ~ 18.07s 18.45s p=0.423 n=6
Emit Time 0.00s (± 0.00%) 0.00s (± 0.00%) ~ 0.00s 0.00s p=1.000 n=6
Total Time 22.86s (± 0.17%) 22.96s (± 0.63%) ~ 22.82s 23.22s p=0.147 n=6
xstate - node (v16.17.1, x64)
Memory used 546,762k (± 0.02%) 546,745k (± 0.02%) ~ 546,624k 546,872k p=0.936 n=6
Parse Time 4.75s (± 0.35%) 4.78s (± 0.43%) ~ 4.75s 4.81s p=0.088 n=6
Bind Time 1.81s (± 3.98%) 1.81s (± 3.56%) ~ 1.68s 1.85s p=0.743 n=6
Check Time 3.08s (± 3.26%) 3.11s (± 3.09%) ~ 3.04s 3.30s p=0.332 n=6
Emit Time 0.09s (± 4.45%) 0.09s (± 4.45%) ~ 0.09s 0.10s p=1.000 n=6
Total Time 9.74s (± 0.56%) 9.79s (± 0.44%) ~ 9.74s 9.87s p=0.173 n=6
System
Machine Name: ts-ci-ubuntu
Platform: linux 5.4.0-135-generic
Architecture: x64
Available Memory: 16 GB
Available Memory: 15 GB
CPUs: 4 × Intel(R) Core(TM) i7-4770 CPU @ 3.40GHz
Hosts
  • node (v16.17.1, x64)
Scenarios
  • Angular - node (v16.17.1, x64)
  • Compiler-Unions - node (v16.17.1, x64)
  • Monaco - node (v16.17.1, x64)
  • TFS - node (v16.17.1, x64)
  • material-ui - node (v16.17.1, x64)
  • xstate - node (v16.17.1, x64)
Benchmark Name Iterations
Current 52611 6
Baseline main 6

Developer Information:

Download Benchmark

@typescript-bot
Collaborator

@ahejlsberg Here are the results of running the top-repos suite comparing main and refs/pull/52611/merge:

Everything looks good!

Comment on lines 29122 to +29123
contextualTypes[contextualTypeCount] = type;
contextualIsCache[contextualTypeCount] = isCache;
Contributor

I'm trying to grok what happens here - with my limited knowledge of the compiler internals - and I'm getting a little bit lost. Would you mind giving some more context behind the change?

I wonder when we want to consult the cache and when we don't, and why we even "push" things here with isCache === false 🤔 - especially the second part is interesting.

Member Author

The process of obtaining a contextual type can sometimes be expensive. For example, given a deeply nested array literal on the source side and a correspondingly nested type on the target side, getContextualType recurses all the way to the top to obtain the target type and then breaks down the type as it returns out of the recursion. This all adds up, particularly when an array literal has thousands of entries. By pushing a cache entry on the contextual type stack during the processing of an array or object literal we can improve performance by short-circuiting a lot of this recursion.
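
For intuition, here is a minimal, hypothetical sketch (not taken from the linked issue) of the kind of input where that repeated recursion adds up:

interface Point { x: number; y: number }

// Every element and every property below asks for a contextual type. Without
// a cache entry pushed for the enclosing literal, each of those requests would
// walk the parent chain back up to `rows` and re-derive Point[][] piece by piece.
const rows: Point[][] = [
    [{ x: 1, y: 2 }, { x: 3, y: 4 }],
    [{ x: 5, y: 6 }, { x: 7, y: 8 }],
    // ...picture thousands more rows here
];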

However... During the type inference phase of processing a generic function call, we sometimes need to instantiate the contextual type using the inferences collected so far (that's how we compute the contextual type to assign to the r parameters in the example). We trigger this eager instantiation by including ContextFlags.Signature in a call to getContextualType. In those cases we can't trust the cached contextual types because they may have resulted from breaking down the constraint of a generic type instead of breaking down an instantiation of that generic type with the current inferences.
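
As a hypothetical illustration of that instantiation step (the types below are made up, not taken from the PR or the issue): with T already inferred as string from the first argument, the contextual type for the callback has to be built from Box<string> rather than from T's constraint for the parameter to come out as string.

interface Box<T> {
    value: T;
    callback: (r: T) => void;
}
declare function useBox<T>(value: T, box: Box<T>): void;

useBox("hello", {
    value: "world",
    // r is typed as string only because the contextual type Box<T> gets
    // instantiated with the inference T = string collected so far.
    callback: r => r.toUpperCase(),
});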

Hope this clarifies it a bit.

Member

I'm still trying to get this.

It sounds like these "cache" contextual types can only be trusted when there is no special type-checking context (i.e. no context flags) - that seems counter-intuitive.

Member Author

On entry to checkObjectLiteral and checkArrayLiteral we obtain the contextual type for the object/array literal with no context flags and push that on the contextual type stack as a cache entry. Then, any requests for the contextual type of a child node (as occurs when we check the elements of the literal) will hit that cache entry and shorten the recursive walk up the parent chain. As long as the child requests are for contextual types with no context flags, we can trust the cached type. But for other requests we have to continue walking up to the first non-cache entry and then break that down according to the specified context flags.
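
Very roughly, the shape described above looks something like the sketch below. This is not the checker's actual source; the types and helper names are placeholders standing in for the internals.

// Placeholder types and helpers - stand-ins for checker internals, not real APIs.
interface SketchNode { elements: SketchNode[] }
interface SketchType { id: number }
declare function getContextualTypeNoFlags(node: SketchNode): SketchType | undefined;
declare function pushContextualType(node: SketchNode, type: SketchType | undefined, isCache: boolean): void;
declare function popContextualType(): void;
declare function checkExpression(node: SketchNode): void;

function checkArrayLiteralSketch(node: SketchNode) {
    // On entry: obtain the literal's contextual type with *no* context flags...
    const type = getContextualTypeNoFlags(node);
    // ...and push it on the contextual type stack as a cache entry.
    pushContextualType(node, type, /*isCache*/ true);
    for (const element of node.elements) {
        // Child requests with no context flags can stop at the cache entry;
        // requests carrying context flags skip it and keep walking up.
        checkExpression(element);
    }
    popContextualType();
}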

Member
DanielRosenwasser commented Feb 9, 2023

Let me try to rephrase - it seems like when there are no context flags, those contextual types are effectively a "source of truth" - there's no pass where we're trying to infer type parameters, skip context-sensitive expressions, etc. So when you push on a contextual type with no context flags (which in this PR is now called a "cache" contextual type), why is that not "better" than any other contextual type that's available?

It might be the case that my assumption around "source of truth" is incorrect. Maybe understanding what went wrong in the test case would be a good explainer.

Member Author

No, that's not how it works. When isCache is false, contextual type stack entries are a "source of truth" for all requests regardless of context flags. But when isCache is true, contextual type stack entries can be used only for requests with no context flags. The effect of not using a cache entry is that the search for a contextual type will continue further up the parent chain until possibly an entry that is the "source of truth" is found. Then, as recursion returns out of the nested getContextualType calls, the type is picked apart using the specified context flags which, for example, might specify that generic contextual types should be instantiated with the inferences made so far in order to contextually type parameters of an arrow function or function expression.
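
A self-contained toy model of that lookup rule (invented names and data structures, not the compiler's):

interface StackEntry {
    node: object;        // the node the contextual type was recorded for
    type: string;        // stand-in for a Type
    isCache: boolean;    // true: cache entry; false: "source of truth" entry
}

const contextualTypeStack: StackEntry[] = [];

// Look for an entry for `node`, trusting cache entries only when the request
// carries no context flags. Returning undefined means the search keeps walking
// up the parent chain, and whatever it finds is then broken down according to
// the requested context flags on the way back out.
function findContextualEntry(node: object, contextFlags: number): StackEntry | undefined {
    for (let i = contextualTypeStack.length - 1; i >= 0; i--) {
        const entry = contextualTypeStack[i];
        if (entry.node === node && (contextFlags === 0 || !entry.isCache)) {
            return entry;
        }
    }
    return undefined;
}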

Contributor

When it comes to "what went wrong", you have a breakdown here. It doesn't exactly mention the interaction with this cache, though, as I didn't quite understand how this part worked when I was debugging it. Maybe you will find it a little bit helpful.

Member

I've been asking others on the team to help me understand this - our understanding is that the presence of ContextFlags indicates a non-standard checking mode, so any types calculated under those flags can't be trusted. So I'm really struggling with the intuition here.

I can understand the rule that contextual types calculated with no context flags can only be used by other computations with no context flags. But I cannot understand why a contextual type calculated with ContextFlags.Completions would ever be reusable during type-checking with any other ContextFlags.

Member

Okay, I spoke with @ahejlsberg yesterday evening, and I see where the confusion came from.

First of all, I previously said that when we push a contextual type, we determine whether it's a "cache" based on whether there are context flags. That's not true; that came from me misreading the code. If I understand correctly, there will be instances of contextual types requested with no context flags that are still not caches - but it is definitely supposed to be the case that any cache is calculated with no context flags.

Second, there was another thing I mentioned about ContextFlags meaning types are not "trustworthy". I'm not going to say that's entirely untrue, but it comes more from a confusion between ContextFlags and CheckFlags. Anders explained that the intent of ContextFlags is more of a strategy around how we "crack into" contextual types, and what we do with them after receiving them.

And that explained exactly what happened in the test case. Something more minimal than the original is this:

export interface Event<T> {
    nested: {
        nestedCallback: (response: T) => void;
    }
}

export type CustomEvents = {
    a: Event<string>
    b: Event<number>
};

declare function emit<T extends keyof CustomEvents>(type: T, data: CustomEvents[T]): void

emit('a', /*1*/ {
    nested: /*2*/ {
        nestedCallback: /*3*/ (r) => { },
    },
});

So in this example, as of #51865, the way type-checking will work is that as we walk the argument list and we hit the object literal (/*1*/), we now proactively push a contextual type, which is the uninferred/uninstantiated CustomEvents[T].

That's fine, but when we hit the inner object literal (/*2*/), we need to do the same - and to do that, we need to "dig in" to get the type for the property nested. This ends up requesting the apparent type of the constraint - because again, CustomEvents[T] is uninferred and uninstantiated. So we use the apparent members of CustomEvents[keyof CustomEvents], which is just Event<string> | Event<number> - but notice that we've lost information about T at this point. Regardless, we proactively push that type.

Now when we get to the function expression, we have this contextual type readily available - but it's not the right one! When contextually typing a signature, we use ContextFlags.Signature which does a little extra - it performs some instantiation. But the contextual type it retrieved was already instantiated incorrectly with its constraint. I'm going to hand wave the details here, but the point is that that inner contextual type wasn't really "trustworthy" because it lost information. So we end up with a union of signatures, but we don't assign contextual types to parameters from unions of signatures.
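
To make the lost information concrete, the same degradation can be written out as plain types (these aliases are purely illustrative, built on the Event/CustomEvents example above; they appear nowhere in the compiler or the repro):

// What /*2*/ should see for emit('a', ...) once T is inferred as 'a':
type InnerWanted = Event<string>["nested"];
//   { nestedCallback: (response: string) => void }

// What the proactively pushed cache entry actually held - T is gone:
type InnerFromConstraint = CustomEvents[keyof CustomEvents]["nested"];
//   { nestedCallback: (response: string) => void }
// | { nestedCallback: (response: number) => void }
// A union of signatures like this does not contextually type `r` in (r) => { }.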

There was no issue with the "outer" property in the example at #52575 because the outer contextual type was still CustomEvents[T] which was instantiable.

So this PR makes it so that requests using a custom strategy (i.e. any context flags) do not reuse these "proactive/naive" contextual types.

I realize a lot of this is a rehash of what @Andarist said in #52575 (comment) - but I think explicitly calling out that the object literals got these proactive contextual types is part of what made it clearer to me.

Contributor

I realize a lot of this is a rehash of what @Andarist said in #52575 (comment) - but I think explicitly calling out that the object literals got these proactive contextual types is part of what made it clearer to me.

It might be "just" a rehash of what I said - but it's a very good rehash :P and it helped me to understand the true origin of the issue much better (I merely described what happened, without exactly understanding why)

    nested: {
        nestedCallback: (r) => {},
    },
});
Contributor

This is a weird one. It currently works OK on the main branch - so it isn't fixed by this PR per se. However, one of my PRs regressed this case - it was caught by the bot here. I've re-checked that your PR here fixes it (see here) and that this "regression" was introduced with the const type parameters PR. Somehow my PR changed things in a way that makes this case manifest only in combination with it 🤷‍♂️

I figured it might be worthwhile to include this test case here even if it isn't exactly fixed by this PR right now.

Suggested change
});
});
// repro from 52589#issuecomment-1416180638
declare class MyCompiler {
    compile(): void;
}
interface WebpackPluginInstance {
    apply: (compiler: MyCompiler) => void;
}
type WebpackPluginFunction = (this: MyCompiler, compiler: MyCompiler) => void;
interface Optimization {
    minimizer?: (WebpackPluginInstance | WebpackPluginFunction)[];
}
declare const A: <T, P extends keyof T>(
    obj: T,
    prop: P,
    factory: () => T[P]
) => void;
const applyOptimizationDefaults = (optimization: Optimization) => {
    A(optimization, "minimizer", () => [
        {
            apply: (compiler) => {},
        },
    ]);
};

Member

Would be nice to get more code coverage in these scenarios.

@@ -2110,6 +2110,7 @@ export function createTypeChecker(host: TypeCheckerHost): TypeChecker {

const contextualTypeNodes: Node[] = [];
const contextualTypes: (Type | undefined)[] = [];
const contextualIsCache: boolean[] = [];
Member

Should this be contextualIsCached (or preferably contextualTypeIsCached)?

Member

Or is it more like "cacheable"?

Member Author

The flag is intended to indicate whether this entry is a cache (to be consulted only for requests with no context flags) or a base entry (to be consulted always). It refers to the combined node/type entry, not just the type. Not loving the name I gave it, but not sure I like your suggestions any more.

Member
DanielRosenwasser left a comment

We've gotta think of a better name. But otherwise makes sense.

ahejlsberg merged commit c838b0c into main Feb 10, 2023
ahejlsberg deleted the fix52575 branch February 10, 2023 19:18
Andarist mentioned this pull request Mar 16, 2023
Labels
Author: Team · For Milestone Bug (PRs that fix a bug with a specific milestone)

Successfully merging this pull request may close these issues:

Nested callbacks are not contextually typed with a type parameter dependent type query