Never merge fields in cache unless objects have same identity. (#5603)
These changes realize the consequences of the following extremely
important principles:

1. In order to accumulate GraphQL data over time, the InMemoryCache must
   be able to merge the fields of different response objects together,
   as those objects are written to the cache.

2. However, merging the fields of two objects is allowable only when those
   objects have the same identity.

   If we cannot determine the identities of both objects (using __typename
   and id, or keyFields), or their identities are different, we should
   never merge their fields together, because doing so risks combining
   unrelated fields into the same logical entity object, resulting in an
   object that could never have been returned by the GraphQL server.

3. Even if two objects have the same identity, which allows us to merge
   their top-level fields, we should not assume we can recursively merge
   the values of fields that have the same name, unless those values are
   also identifiable objects with matching identities. Otherwise, we must
   replace the existing value with the incoming value, without attempting
   to combine them.

   Exception A: if the existing value and the incoming value are deeply
   equal to each other, then we can safely reuse the existing value to
   avoid needlessly altering references in the cache.

   Exception B: The application developer can provide custom field merge
   functions, e.g. for paginating array-valued fields. Since this level of
   customization was not possible in Apollo Client 2.x, some amount of
   automatic merging was necessary, but we can do the right thing now that
   the developer has more control.

I am happy that I did not have to update very many tests as a result of these
changes, but the principles above are important enough that I would have
corrected as many tests as necessary to accommodate them.
benjamn authored Nov 21, 2019
1 parent 7283897 commit 7dad1cf
Showing 8 changed files with 80 additions and 108 deletions.
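
As an illustration of principles 1 and 2 from the commit message, here is a minimal sketch (not part of this commit) of how two cache writes interact under the new behavior, assuming the Apollo Client 3 `InMemoryCache` API and a hypothetical `User` type: fields accumulate on an entity only when both writes identify that same entity.

```ts
import { InMemoryCache, gql } from '@apollo/client';

const cache = new InMemoryCache();

// First write: an identifiable User (it has both __typename and id).
cache.writeQuery({
  query: gql`
    query {
      currentUser {
        __typename
        id
        name
      }
    }
  `,
  data: {
    currentUser: { __typename: 'User', id: 1, name: 'Alice' },
  },
});

// Second write: the same identity (User with id 1), so its fields are
// merged into the entity written above.
cache.writeQuery({
  query: gql`
    query {
      currentUser {
        __typename
        id
        email
      }
    }
  `,
  data: {
    currentUser: { __typename: 'User', id: 1, email: 'alice@example.com' },
  },
});

// The normalized User entity now holds both name and email, because both
// writes referred to the same identity. Had the second write used a
// different id, the cache would have created a separate entity rather
// than merging unrelated fields into the first one.
console.log(cache.extract());
```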
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -73,6 +73,9 @@

- The `QueryOptions`, `MutationOptions`, and `SubscriptionOptions` React Apollo interfaces have been renamed to `QueryDataOptions`, `MutationDataOptions`, and `SubscriptionDataOptions` (to avoid conflicting with similarly named and exported Apollo Client interfaces).

- `InMemoryCache` will no longer merge the fields of written objects unless the objects are known to have the same identity, and the values of fields with the same name will not be recursively merged unless a custom `merge` function is defined by a field policy for that field, within a type policy associated with the `__typename` of the parent object. <br/>
[@benjamn](https://github.com/benjamn) in [#5603](https://github.com/apollographql/apollo-client/pull/5603)

## Apollo Client (2.6.4)

### Apollo Client (2.6.4)
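
The changelog entry above points to custom `merge` functions as the replacement for the automatic merging that Apollo Client 2.x performed. A hedged sketch of such a field policy, using a hypothetical paginated `Query.feed` field (the field name and element type are illustrative, not taken from this commit):

```ts
import { InMemoryCache } from '@apollo/client';

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        feed: {
          // Concatenate each incoming page onto whatever is already
          // cached, instead of replacing the array wholesale.
          merge(existing: any[] = [], incoming: any[]) {
            return [...existing, ...incoming];
          },
        },
      },
    },
  },
});
```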
1 change: 0 additions & 1 deletion src/__tests__/__snapshots__/ApolloClient.ts.snap
@@ -344,7 +344,6 @@ Object {
"a": 1,
"d": Object {
"__typename": "D",
"e": 4,
"h": Object {
"__typename": "H",
"i": 7,
11 changes: 6 additions & 5 deletions src/__tests__/local-state/export.ts
@@ -2,6 +2,7 @@ import gql from 'graphql-tag';
import { print } from 'graphql/language/printer';

import { Observable } from '../../utilities/observables/Observable';
import { itAsync } from '../../utilities/testing/itAsync';
import { ApolloLink } from '../../link/core/ApolloLink';
import { ApolloClient } from '../..';
import { InMemoryCache } from '../../cache/inmemory/inMemoryCache';
@@ -325,10 +326,10 @@ describe('@client @export tests', () => {
});
});

it(
itAsync(
'should support setting an @client @export variable, loaded from the ' +
'cache, on a virtual field that is combined into a remote query.',
done => {
(resolve, reject) => {
const query = gql`
query postRequiringReview($reviewerId: Int!) {
postRequiringReview {
@@ -361,7 +362,7 @@ describe('@client @export tests', () => {
reviewerDetails,
},
});
});
}).setOnError(reject);

const cache = new InMemoryCache();
const client = new ApolloClient({
@@ -375,6 +376,7 @@
postRequiringReview: {
loggedInReviewerId,
__typename: 'Post',
id: 10,
},
},
});
@@ -388,8 +390,7 @@
},
reviewerDetails,
});
done();
});
}).then(resolve, reject);
},
);

17 changes: 14 additions & 3 deletions src/__tests__/local-state/general.ts
@@ -864,7 +864,7 @@ describe('Combining client and server state/operations', () => {
});
});

itAsync('should support nested quering of both server and client fields', (resolve, reject) => {
itAsync('should support nested querying of both server and client fields', (resolve, reject) => {
const query = gql`
query GetUser {
user {
@@ -878,7 +878,17 @@
const link = new ApolloLink(operation => {
expect(operation.operationName).toBe('GetUser');
return Observable.of({
data: { user: { lastName: 'Doe', __typename: 'User' } },
data: {
user: {
__typename: 'User',
// We need an id (or a keyFields policy) because, if the User
// object is not identifiable, the call to cache.writeData
// below will simply replace the existing data rather than
// merging the new data with the existing data.
id: 123,
lastName: 'Doe',
},
},
});
});

@@ -892,13 +902,14 @@
data: {
user: {
__typename: 'User',
id: 123,
firstName: 'John',
},
},
});

client.watchQuery({ query }).subscribe({
next: ({ data }: any) => {
next({ data }: any) {
const { user } = data;
try {
expect(user).toMatchObject({
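
The comment added to this test mentions a `keyFields` policy as an alternative to adding an `id` field. A minimal sketch of that alternative (the chosen key field is purely illustrative, and is not what the test above actually does):

```ts
import { InMemoryCache } from '@apollo/client';

// Give the cache a way to compute a stable identity for User objects
// from fields other than id, so separate writes can still be merged
// into the same normalized entity.
const cache = new InMemoryCache({
  typePolicies: {
    User: {
      keyFields: ['email'], // hypothetical: any uniquely identifying field(s) work
    },
  },
});
```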
3 changes: 2 additions & 1 deletion src/cache/inmemory/__tests__/cache.ts
@@ -701,8 +701,9 @@ describe('Cache', () => {
ROOT_QUERY: {
__typename: "Query",
a: 1,
// The new value for d overwrites the old value, since there
// is no custom merge function defined for Query.d.
d: {
e: 4,
h: {
i: 7,
},
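
The new comment above explains that `d` is overwritten because no custom merge function is defined for `Query.d`. If preserving previously written subfields (such as `e`) were the desired behavior, a field policy along these lines could be configured instead (a sketch under that assumption, not part of this commit):

```ts
import { InMemoryCache } from '@apollo/client';

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        d: {
          // Shallowly combine the existing and incoming objects so that
          // fields missing from the incoming result are retained.
          merge(existing: Record<string, any> = {}, incoming: Record<string, any>) {
            return { ...existing, ...incoming };
          },
        },
      },
    },
  },
});
```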
13 changes: 7 additions & 6 deletions src/cache/inmemory/__tests__/writeToStore.ts
@@ -1307,7 +1307,7 @@ describe('writing to the store', () => {
});
});

it('should merge objects when overwriting a generated id with a real id', () => {
it('should not merge unidentified data when replacing with ID reference', () => {
const dataWithoutId = {
author: {
firstName: 'John',
@@ -1342,7 +1342,7 @@
}
}
`;
const expStoreWithoutId = defaultNormalizedCacheFactory({
const expectedStoreWithoutId = defaultNormalizedCacheFactory({
ROOT_QUERY: {
__typename: 'Query',
author: {
@@ -1352,10 +1352,9 @@
},
},
});
const expStoreWithId = defaultNormalizedCacheFactory({
const expectedStoreWithId = defaultNormalizedCacheFactory({
Author__129: {
firstName: 'John',
lastName: 'Smith',
id: '129',
__typename: 'Author',
},
@@ -1364,17 +1363,19 @@
author: makeReference('Author__129'),
},
});

const storeWithoutId = writer.writeQueryToStore({
result: dataWithoutId,
query: queryWithoutId,
});
expect(storeWithoutId.toObject()).toEqual(expStoreWithoutId.toObject());
expect(storeWithoutId.toObject()).toEqual(expectedStoreWithoutId.toObject());

const storeWithId = writer.writeQueryToStore({
result: dataWithId,
query: queryWithId,
store: storeWithoutId,
});
expect(storeWithId.toObject()).toEqual(expStoreWithId.toObject());
expect(storeWithId.toObject()).toEqual(expectedStoreWithId.toObject());
});

it('should allow a union of objects of a different type, when overwriting a generated id with a real id', () => {
87 changes: 24 additions & 63 deletions src/cache/inmemory/writeToStore.ts
@@ -1,5 +1,5 @@
import { SelectionSetNode, FieldNode, DocumentNode } from 'graphql';
import { invariant, InvariantError } from 'ts-invariant';
import { invariant } from 'ts-invariant';

import {
createFragmentMap,
@@ -29,6 +29,7 @@ import {

import { shouldInclude } from '../../utilities/graphql/directives';
import { cloneDeep } from '../../utilities/common/cloneDeep';
import { isEqual } from '../../utilities/common/isEqual';

import { defaultNormalizedCacheFactory } from './entityCache';
import { NormalizedCache, StoreObject } from './types';
@@ -49,7 +50,6 @@ export type WriteContext = {
type StoreObjectMergeFunction = (
existing: StoreObject,
incoming: StoreObject,
overrides?: MergeOverrides,
) => StoreObject;

type MergeOverrides = Record<string | number, {
@@ -105,9 +105,12 @@ export class StoreWriter {
// fields when processing a single selection set.
const simpleFieldsMerger = new DeepMerger;

// A DeepMerger used for updating normalized StoreObjects in the store,
// with special awareness of { __ref } objects, arrays, and custom logic
// for reading and writing field values.
// A DeepMerger used for merging the top-level fields of entity
// objects written by writeSelectionSetToStore. Slightly fancier than
// { ...existing, ...incoming }, in that (1) the resulting object will
// be === existing if the incoming data does not alter any of the
// existing values, and (2) a helpful error will be thrown if
// unidentified data replaces a normalized ID reference.
const storeObjectMerger = new DeepMerger(storeObjectReconciler);

return this.writeSelectionSetToStore({
@@ -120,8 +123,8 @@
mergeFields(existing, incoming) {
return simpleFieldsMerger.merge(existing, incoming);
},
mergeStoreObjects(existing, incoming, overrides) {
return storeObjectMerger.merge(existing, incoming, store, overrides);
mergeStoreObjects(existing, incoming) {
return storeObjectMerger.merge(existing, incoming, store);
},
variables: {
...getDefaultValues(operationDefinition),
@@ -181,10 +184,6 @@
context.mergeStoreObjects(
existing,
processed.result,
// To avoid re-merging values that have already been merged by
// custom merge functions, give context.mergeStoreObjects access
// to the mergeOverrides information that was used above.
processed.mergeOverrides,
),
);

@@ -377,31 +376,19 @@ function walkWithMergeOverrides(
});
}

const storeObjectReconciler: ReconcilerFunction<[
NormalizedCache,
MergeOverrides | undefined,
]> = function (
const storeObjectReconciler: ReconcilerFunction<[NormalizedCache]> = function (
existingObject,
incomingObject,
property,
// This parameter comes from the additional argument we pass to the
// merge method in context.mergeStoreObjects (see writeQueryToStore).
store,
mergeOverrides,
) {
// In the future, reconciliation logic may depend on the type of the parent
// StoreObject, not just the values of the given property.
const existing = existingObject[property];
const incoming = incomingObject[property];

const mergeChildObj = mergeOverrides && mergeOverrides[property];
if (mergeChildObj && typeof mergeChildObj.merge === "function") {
// If this property was overridden by a custom merge function, then
// its merged value has already been determined, so we should return
// incoming without recursively merging it into existing.
return incoming;
}

if (
existing !== incoming &&
// The DeepMerger class has various helpful utilities that we might as
@@ -422,48 +409,22 @@
return incoming;
}

const childMergeOverrides = mergeChildObj && mergeChildObj.child;

if (isReference(incoming)) {
if (isReference(existing)) {
// Incoming references always replace existing references, but we can
// avoid changing the object identity when the __ref is the same.
return incoming.__ref === existing.__ref ? existing : incoming;
}
// Incoming references can be merged with existing non-reference data
// if the existing data appears to be of a compatible type.
store.set(
incoming.__ref,
this.merge(
existing,
store.get(incoming.__ref),
store,
childMergeOverrides,
),
);
return incoming;
} else if (isReference(existing)) {
throw new InvariantError(
`Store error: the application attempted to write an object with no provided id but the store already contains an id of ${existing.__ref} for this object.`,
);
}
invariant(
!isReference(existing) || isReference(incoming),
`Store error: the application attempted to write an object with no provided id but the store already contains an id of ${existing.__ref} for this object.`,
);

if (Array.isArray(incoming)) {
if (!Array.isArray(existing)) return incoming;
if (existing.length > incoming.length) {
// Allow the incoming array to truncate the existing array, if the
// incoming array is shorter.
return this.merge(
existing.slice(0, incoming.length),
incoming,
store,
childMergeOverrides,
);
}
// It's worth checking deep equality here (even though blindly
// returning incoming would be logically correct) because preserving
// the referential identity of existing data can prevent needless
// rereading and rerendering.
if (isEqual(existing, incoming)) {
return existing;
}

return this.merge(existing, incoming, store, childMergeOverrides);
}

// In all other cases, incoming replaces existing without any effort to
// merge them deeply, since custom merge functions have already been
// applied to the incoming data by walkWithMergeOverrides.
return incoming;
}
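
The `isEqual` check above implements Exception A from the commit message: when incoming data is deeply equal to what is already stored, the existing value is reused so that references in the cache do not change. A standalone sketch of that idea (not Apollo's actual `isEqual` implementation):

```ts
// Structural equality over plain JSON-like data: objects and arrays are
// compared key by key, everything else by reference/primitive equality.
function isDeeplyEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (
    typeof a !== 'object' || a === null ||
    typeof b !== 'object' || b === null ||
    Array.isArray(a) !== Array.isArray(b)
  ) {
    return false;
  }
  const aKeys = Object.keys(a);
  const bKeys = Object.keys(b);
  return (
    aKeys.length === bKeys.length &&
    aKeys.every(key => isDeeplyEqual((a as any)[key], (b as any)[key]))
  );
}

// Exception A: if nothing actually changed, hand back the existing value
// so cache consumers keep the same reference and can skip needless
// re-reads and re-renders.
function reuseIfEqual<T>(existing: T, incoming: T): T {
  return isDeeplyEqual(existing, incoming) ? existing : incoming;
}
```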