Third-party Google Assets don't follow their own audit rules, causing report errors #6140
When I run Lighthouse reports, my only errors and warnings come from assets delivered by the creator of Lighthouse: Google.

Specifically, this warning shows up:

Uses inefficient cache policy on static assets

All the items in this report come from:

- securepubads.g.doubleclick.net (Google property)
- googletagmanager.com (Google property)
- pagead2.googlesyndication.com (Google property)
- www.google-analytics.com (Google property)
- tpc.googlesyndication.com (Google property)

Why does Google serve assets that fail to follow its own recommendations, as measured by Lighthouse? If Lighthouse recommends that static assets be sent with proper caching headers, why doesn't Google follow through? Some of these assets have a TTL of 0 or 15 minutes!

If Google isn't going to follow its own recommendations, then Lighthouse should exclude Google's domains from this analysis, since they create a bunch of false positives that I can do nothing about.

Comments
Yes, allowing us to filter them out would be helpful. I'm wondering, though, if there is a way to open an issue with Google regarding how they cache their JS. I've never understood why some of their libraries are set to a TTL of 0, or 15 minutes, etc. I doubt they are releasing code updates every fifteen minutes :). Since Lighthouse is affiliated with Google, it just seems that maybe its team could nudge Google to observe its own rules and recommendations. Thank you for your response @patrickhulce. I suppose filtering will not affect the score, which will remain affected by Google's caching errors, but at least we could then reduce noise during our weekly testing.
Hi @apotek! I can't speak on behalf of most of the other products you listed, but I have used Google Tag Manager (GTM) quite a bit. GTM enables webmasters to inject their own scripts into the page based on a number of different triggers. GTM users value being able to update these scripts independently of the release cycles of their main production code. Because of this use case, assets from GTM can't be given a high TTL; if they had one, webmasters would have to wait longer for their changes to take effect. Although the warning message LH provides labels these as static assets, they're really quite dynamic. I'd imagine the same is true of the other products you listed. Anyhow, #6351 was opened a couple of days ago, so you'll be able to filter out 3P assets soon.
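To see the short TTLs the thread is describing first-hand, here is a minimal sketch that fetches a GTM container script and prints its caching headers. It assumes Node 18+ for the built-in fetch, and the container ID GTM-XXXXXXX is a placeholder, not a real container:

```ts
// Inspect the cache policy served with a Google Tag Manager script.
// Assumes Node 18+ (global fetch); GTM-XXXXXXX is a placeholder ID.
const url = "https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX";

const res = await fetch(url);
console.log("status:       ", res.status);
console.log("cache-control:", res.headers.get("cache-control"));
console.log("expires:      ", res.headers.get("expires"));
```

A typical response carries something like max-age=900, which is the 15-minute TTL the audit complains about.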
@hoten great insight here! I'm curious if you might be able to shed light on why an asset that is so dynamic in nature shouldn't have a `must-revalidate` cache policy. Such a resource would still be able to benefit from 304 responses, and the need for fresh assets is clearly communicated. @paulirish and I had a hard time coming up with the rationale for doing anything in between `immutable` and `must-revalidate` 😄
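To make the two poles of that trade-off concrete, here is a hedged sketch of an Express-style server; the routes and version token are invented for illustration. One route serves fingerprinted bundles as immutable, the other serves a frequently-updated tag script with must-revalidate, answering an unchanged request with a 304 rather than a full re-download:

```ts
import express from "express";

const app = express();

// Fingerprinted bundles never change in place, so they can be cached "forever".
app.use("/static", express.static("dist", { immutable: true, maxAge: "1y" }));

// A frequently-updated script: cacheable, but revalidated on every use.
// If the asset is unchanged, the client pays only for a 304, not a re-download.
app.get("/tag.js", (req, res) => {
  const etag = '"v42"'; // hypothetical version token
  res.set("Cache-Control", "max-age=0, must-revalidate");
  res.set("ETag", etag);
  if (req.headers["if-none-match"] === etag) {
    res.status(304).end();
    return;
  }
  res.type("application/javascript").send("/* tag script */");
});

app.listen(3000);
```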
I could only guess, but my first assumption is that GTM attempts to be as lightweight as possible, so they really don't want the extra revalidation round-trips a `must-revalidate` policy would cost on every use. This SO post corroborates the discussion so far.
@patrickhulce wdyt about suppressing this warning if the cache control is `private`?
@hoten good idea! While it's certainly possible for user-specific content to be long-lived, it's a good enough indicator for us to not complain. Any usage of `private` signals a response that isn't meant to be treated as a shared static asset.
Let's add `private` to the set of cache-control values that suppress this warning.
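For readers following along, here is an illustrative sketch of the suppression rule the thread converges on; it is not the actual Lighthouse source, just a sketch of the behavior. The idea: skip the long-cache-TTL complaint whenever Cache-Control carries a directive marking the response as deliberately short-lived or user-specific.

```ts
// Directives that signal a deliberately uncacheable or user-specific
// response; an audit like this one could skip resources carrying any of them.
// Illustrative only; not the actual Lighthouse implementation.
const OPT_OUT_DIRECTIVES = new Set(["private", "no-cache", "no-store"]);

function shouldFlagForLongCacheTtl(cacheControl: string | null): boolean {
  if (!cacheControl) return true; // no policy at all is still worth flagging
  const directives = cacheControl
    .toLowerCase()
    .split(",")
    .map((directive) => directive.trim().split("=")[0]);
  return !directives.some((d) => OPT_OUT_DIRECTIVES.has(d));
}

console.log(shouldFlagForLongCacheTtl("private, max-age=900")); // false: suppressed
console.log(shouldFlagForLongCacheTtl("max-age=900"));          // true: still flagged
```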
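Until the audit itself learns to skip these, the filtering requested above can be done after the fact. A hedged sketch: it assumes a Lighthouse JSON report saved as report.json (e.g. from `lighthouse <url> --output=json`), and that the "uses-long-cache-ttl" audit lists offending resources under details.items with a url field. The domain list is the one from this issue; adjust to taste.

```ts
import { readFileSync } from "node:fs";

// Hostnames the reporter considers outside their control (from this issue).
const THIRD_PARTY = [
  "doubleclick.net",
  "googletagmanager.com",
  "googlesyndication.com",
  "google-analytics.com",
];

const isThirdParty = (host: string): boolean =>
  THIRD_PARTY.some((domain) => host === domain || host.endsWith("." + domain));

const report = JSON.parse(readFileSync("report.json", "utf8"));
const items: Array<{ url: string }> =
  report.audits["uses-long-cache-ttl"]?.details?.items ?? [];

const firstParty = items.filter(
  (item) => !isThirdParty(new URL(item.url).hostname),
);

console.log(`${items.length - firstParty.length} third-party rows filtered out`);
console.table(firstParty.map(({ url }) => url));
```

The audit's score is unaffected, as noted above, but the remaining rows are ones the site owner can actually act on.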