
Third-party Google Assets don't follow their own audit rules, causing report errors #6140

Closed
apotek opened this issue Sep 28, 2018 · 8 comments
apotek commented Sep 28, 2018

When I run Lighthouse reports, the only errors and warnings come from assets delivered by the creator of Lighthouse: Google.

Specifically, this warning/error shows up:

Uses inefficient cache policy on static assets

All the items in this report come from

  • securepubads.g.doubleclick.net (Google property)
  • googletagmanager.com (Google property)
  • pagead2.googlesyndication.com (Google property)
  • www.google-analytics.com (Google property)
  • tpc.googlesyndication.com (Google property)

Why does Google serve assets that fail its own Lighthouse recommendations? If Google recommends that static assets be sent with proper caching headers, why doesn't it follow through? Some of its assets have a TTL of 0 or 15 minutes!

If Google isn't going to follow its own recommendations, then it seems Lighthouse should exclude Google's domains from this analysis, since they create a bunch of false positives that I can do nothing about.
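For reference, the "inefficient cache policy" check boils down to reading the TTL off each response's `Cache-Control` header. A minimal sketch (illustrative only, not Lighthouse's actual source; the function name is invented) of how a short TTL like the 15-minute one mentioned above would be detected:

```javascript
// Simplified sketch of a cache-TTL check: parse max-age out of a
// Cache-Control header and treat a missing max-age as a TTL of 0.
function cacheTtlSeconds(cacheControl) {
  const match = /max-age=(\d+)/.exec(cacheControl || '');
  return match ? Number(match[1]) : 0;
}

console.log(cacheTtlSeconds('public, max-age=900')); // 900 (the 15-minute TTL)
console.log(cacheTtlSeconds('private'));             // 0 (no max-age at all)
```

A real audit also has to account for `Expires` headers and directives like `no-store`, but the short-TTL complaint in this thread is about exactly this `max-age` value.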

@patrickhulce
Collaborator

Thanks for filing, @apotek! Our thoughts on how to approach this third-party problem are mostly outlined in #4516. Basically: we want you to be able to filter these out too :)

@apotek
Author

apotek commented Sep 28, 2018

Yes, allowing us to filter them out would be helpful. I'm wondering, though, if there is a way to open an issue with Google regarding how they cache their JS. I've never understood why some of their libraries are served with a TTL of 0 or 15 minutes; I doubt they are releasing code updates every fifteen minutes :). Since Lighthouse is a Google project, it seems the team might be able to nudge Google to observe its own rules and recommendations.

Thank you for your response, @patrickhulce. I suppose filtering will not affect the score, which will remain dragged down by Google's caching choices, but at least it would reduce noise during our weekly testing.

@connorjclark
Collaborator

connorjclark commented Oct 23, 2018

Hi @apotek!

I can't speak on behalf of most of the other products you listed, but I have used Google Tag Manager (GTM) quite a bit. GTM enables webmasters to inject their own scripts into the page based on a number of different triggers. GTM users value being able to update these scripts independently from the release cycles of their main production code. Because of this use case, assets from GTM can't be given a high TTL: if they were, webmasters would have to wait longer for their changes to take effect. Although the warning message LH provides labels these as static assets, they're really quite dynamic. I'd imagine the same is true of the other products you listed.

Anyhow, #6351 was opened a couple of days ago, so you'll be able to filter out third-party assets soon.

@patrickhulce
Collaborator

@hoten great insight here! I'm curious if you might be able to shed light on why an asset that is so dynamic in nature shouldn't have a no-cache/must-revalidate cache policy?

Such a resource would still be able to benefit from 304 responses, and the need for fresh assets is clearly communicated. @paulirish and I had a hard time coming up with a rationale for doing anything in between immutable and must-revalidate 😄
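The 304 flow mentioned above is worth spelling out: with `no-cache`/`must-revalidate`, the browser re-sends the request with a validator such as `If-None-Match`, and an unchanged asset costs only a small 304 response rather than a full re-download. A hypothetical sketch of that server-side decision (names and shapes are illustrative, not any real library's API):

```javascript
// Conditional-request handling: compare the client's cached ETag against
// the current one; return 304 with no body when the asset is unchanged.
function revalidate(ifNoneMatch, currentEtag, body) {
  if (ifNoneMatch === currentEtag) {
    return { status: 304, body: null };             // cache is still fresh
  }
  return { status: 200, etag: currentEtag, body };  // send the new asset
}

console.log(revalidate('"v1"', '"v1"', 'gtm.js').status); // 304
console.log(revalidate('"v1"', '"v2"', 'gtm.js').status); // 200
```

This is the trade-off being debated: the revalidation round-trip is cheap but not free, which is presumably why GTM prefers a short `max-age` over `no-cache`.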

@connorjclark
Collaborator

I could only guess, but my first assumption is that GTM attempts to be as lightweight as possible, so they really don't want no-cache or must-revalidate due to the extra time spent fetching/validating.

This SO post corroborates the discussion so far.

@connorjclark
Collaborator

@patrickhulce wdyt about suppressing this warning if the cache control is private? Private's only use case AFAIK is to serve user-specific content, which implies the asset is not static.

@patrickhulce
Collaborator

@hoten good idea!

While it's certainly possible for user-specific content to be long-lived, private is a good enough indicator for us not to complain. Any usage of no-cache, must-revalidate, or private sounds like a pretty reasonable set of signals that an asset isn't meant to be cached forever.

@paulirish
Member

Let's add private to our list.
And land the 3P filter.
