report: audit warnings are not top-level #5270
Conversation
much less noisy 👍 thanks for picking this up! :D
if (!warnings || warnings.length === 0) return auditEl;

// Add list of warnings or singular warning
const warningsEl = this.dom.createChildOf(titleEl, 'div', 'lh-debug lh-debug--warnings');
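As a rough sketch of what the "list of warnings or singular warning" behavior hinted at above could look like (illustrative only; dom, titleEl, and warnings are assumed to be in scope, and the PR's actual markup may differ):

const warningsEl = dom.createChildOf(titleEl, 'div', 'lh-debug lh-debug--warnings');
if (warnings.length === 1) {
  // A single warning is rendered as plain text.
  warningsEl.textContent = `Warning: ${warnings[0]}`;
} else {
  // Multiple warnings become a bulleted list, one item per warning.
  const warningsUl = dom.createChildOf(warningsEl, 'ul');
  for (const warning of warnings) {
    dom.createChildOf(warningsUl, 'li').textContent = warning;
  }
}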
do we have any cases of warning + explanation? if so, does it still look okish :)
@@ -114,23 +125,6 @@ describe('CategoryRenderer', () => {
    category.description = prevDesc;
  });

  // TODO(phulce): revisit if top-level warnings approach is too noisy
  it.skip('renders audits with warnings as failed', () => {
do we want to render an audit with warnings as failed? or keep it in passed?
good policy question!
Let's see. Here are the possible warnings we have right now (the ones that are placed into Passed).
I see the argument for these being visible by default; however, in the case of timing/decode, there's not much a user can do about them. Either way, to be visible by default they'd have to be placed in "Failed", and then either we keep the green checkmark or we fail them. Both are pretty weird.
So I'm comfy with saying that warnings are orthogonal to passing/failing and default report visibility. wdyt?
Yeah, I don't think there's much anyone can do about them either. I'm fine hiding them, especially now that we've got Sentry tracking 'em for HTTP Archive runs 👍
oh right, I was good with warnings and failures being independent of each other, but I forgot that means that passing audits with warnings would then be collapsed by default. That definitely takes away from the usefulness of warnings since that means they'll essentially never be seen if the audit is marked as passing.
I can kind of see an argument for hiding some warnings normally, since there's not much the user can do about them. But things like "your manifest technically works, but you have weird stuff in there that's getting fallbacks" seem worth bringing up even though the audit passed. Even the warnings about Lighthouse's own shortcomings might be relevant, which is why we made them warnings in the first place, e.g. "we can't actually verify all your external links" or "your images are encoded well except for this list of 20 images"; while you can't make Lighthouse fix those, you can verify them yourself.
So while I would say that the existence of warnings should be independent of failure state, it's definitely not independent of whether or not to show them, and marking the audit as failed doesn't seem like a good way to handle that.
My suggestion in #5273 (comment) was to use warnings as a way to get attention to an issue without penalizing the score. So I would definitely prefer for the warning to be visible by default (not collapsed). I understand that warnings may also be used for things outside of the developer's control so collapsing may make more sense. Is this something that can be configured by the audit at runtime, rather than forcing all warnings to behave the same?
I'd also love to see a different UI treatment for warnings: not a green check and not a red exclamation. Could it be a yellow/orange exclamation to signal that something's amiss?
SG, broke this into its own issue: #5327 ("make sure warnings within Passed Audits are visible")
assert.ok(/Links.*unsafe/.test(warningEls[0].textContent), 'did not add warning text');
assert.strictEqual(warningEls.length, sampleResults.runWarnings.length);
for (const warningEl of warningEls) {
  assert.ok(/Links.*unsafe/.test(warningEl.textContent), 'did not add warning text');
shouldn't this just be deleted now that warningEls should be length 0?
good call.
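For reference, with the per-warning regex loop deleted, what remains of the snippet above would presumably be just the count check (which, per the comment above, compares two empty collections for sampleResults):

// Sketch of the test after removing the loop over warningEls.
const warningEls = output.querySelectorAll('.lh-run-warnings > ul > li');
assert.strictEqual(warningEls.length, sampleResults.runWarnings.length);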
@@ -33,7 +33,9 @@ class PerformanceCategoryRenderer extends CategoryRenderer {
    valueEl.textContent = Util.formatDisplayValue(audit.result.displayValue);

    const descriptionEl = this.dom.find('.lh-metric__description', tmpl);
    descriptionEl.appendChild(this.dom.convertMarkdownLinkSnippets(audit.result.description));
    if (audit.result.description) {
a little confused by removing the ? from description but making this conditional... what's up with that?
good catch. it's assumed to be there now.
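So presumably the guard comes back out and the two lines from the diff above stand on their own, with description treated as always present:

// description is assumed to always be set, so no `if (audit.result.description)` check is needed.
const descriptionEl = this.dom.find('.lh-metric__description', tmpl);
descriptionEl.appendChild(this.dom.convertMarkdownLinkSnippets(audit.result.description));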
}
if (Object.keys(parsedProps.invalidValues).length) {
  warnings.push(`Invalid values found: ${JSON.stringify(parsedProps.invalidValues)}.`);
  warnings.push(`Invalid values found: ${JSON.stringify(parsedProps.invalidValues)}`);
tests for these will need to be updated (they assert a '.' in there :)
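For illustration, an affected assertion might change roughly like this (the surrounding test and the invalidValues variable are assumptions, not the real test code):

// Hypothetical before/after for a manifest-warning assertion once the trailing period is gone.
// before: assert.ok(warnings.includes(`Invalid values found: ${JSON.stringify(invalidValues)}.`));
assert.ok(warnings.includes(`Invalid values found: ${JSON.stringify(invalidValues)}`));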
  score: 1,
};
const auditDOM = renderer.renderAudit({id: 'foo', score: 1, result: auditResult});
assert.ok(auditDOM.querySelector('.lh-debug--warnings'), 'did not render debug message');
want to test at least that the warning text is in there as well? And probably want a plural/bulleted test in addition
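A possible shape for those two checks, reusing the renderAudit call and the .lh-debug--warnings selector from the snippet above (the warning strings, and the assumption that multiple warnings render as list items, are illustrative):

// Sketch: a single warning should surface its text.
const singleResult = Object.assign({}, auditResult, {warnings: ['first warning']});
const singleDOM = renderer.renderAudit({id: 'foo', score: 1, result: singleResult});
assert.ok(singleDOM.querySelector('.lh-debug--warnings').textContent.includes('first warning'));

// Sketch: multiple warnings should render one entry per warning.
const pluralResult = Object.assign({}, auditResult, {warnings: ['first warning', 'second warning']});
const pluralDOM = renderer.renderAudit({id: 'foo', score: 1, result: pluralResult});
assert.strictEqual(pluralDOM.querySelectorAll('.lh-debug--warnings li').length, 2);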
@@ -135,8 +135,7 @@ describe('ReportRenderer', () => {
    const output = renderer.renderReport(sampleResults, container);

    const warningEls = output.querySelectorAll('.lh-run-warnings > ul > li');
    assert.strictEqual(warningEls.length, 1);
    assert.ok(/Links.*unsafe/.test(warningEls[0].textContent), 'did not add warning text');
    assert.strictEqual(warningEls.length, sampleResults.runWarnings.length);
there's already a "no lighthouseRunWarnings" test above this, so probably want to make a const warningResults = Object.assign({}, sampleResults, {runWarnings: ['Warnings are great!']}); or whatever and pass that in
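Putting that suggestion together, the test could look roughly like this (the warning string comes from the comment above; the rest mirrors the existing assertions, so treat it as a sketch rather than the final test):

// Sketch: feed the renderer a copy of sampleResults that definitely has run warnings.
const warningResults = Object.assign({}, sampleResults, {runWarnings: ['Warnings are great!']});
const output = renderer.renderReport(warningResults, container);
const warningEls = output.querySelectorAll('.lh-run-warnings > ul > li');
assert.strictEqual(warningEls.length, warningResults.runWarnings.length);
assert.ok(warningEls[0].textContent.includes('Warnings are great!'));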
/**
 * @typedef {{
  id: string,
  score: (number|null),
  weight: number,
  group?: string,
  result: AuditResultJSON
  result: LH.Audit.Result
👍
@brendankenny PTAL, here's a report with a bunch of errors (open up Perf > Passed to see more warnings): http://lh-warningsredux0.surge.sh/errors.html
FYI, explanations and warnings are now the same font size as the audit title text.
LGTM
this is a little hard for me to read now, but I'm fine with whatever is decided. It looks like in the earlier screenshots there were still bullets and indentation, though?
oh, that one was a single warning, not a multi-warning
Audit warnings are now reported along with the audit they belong to.
Examples:
You can see one example of this in the deployment. (The other warning cases are pretty esoteric, so I've created an artifact bundle that reproduces them.)
fixes #5153
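As a rough illustration of where warnings now live in the result object (the runWarnings and per-audit warnings field names come from the code in this thread; the audit id and message are made up):

// Illustrative shape only; audit id and message are invented for the example.
const lhr = {
  runWarnings: [],  // run-level warnings stay here, but audit-specific ones no longer do
  audits: {
    'some-manifest-audit': {
      score: 1,
      warnings: ['Invalid values found: {"name":42}'],  // now attached to the audit itself
    },
  },
};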