
proto: update processForProto method signature, string -> LH.Result #9016

Merged
merged 8 commits on Jun 11, 2019

Changes from 7 commits
26 changes: 11 additions & 15 deletions clients/lightrider/lightrider-entry.js
@@ -24,9 +24,9 @@ const LR_PRESETS = {
* If configOverride is provided, lrDevice and categoryIDs are ignored.
* @param {Connection} connection
* @param {string} url
* @param {LH.Flags} flags Lighthouse flags, including `output`
* @param {LH.Flags} flags Lighthouse flags
Member:

make sure you aren't accidentally still passing in output: 'html' and doing all that extra work :)

Member Author:

Added a comment. I don't think we need a stronger check.

Member:

> Added a comment. I don't think we need a stronger check.

yeah, I meant just double check existing code, definitely don't need a check in the code

* @param {{lrDevice?: 'desktop'|'mobile', categoryIDs?: Array<string>, logAssets: boolean, configOverride?: LH.Config.Json}} lrOpts Options coming from Lightrider
* @return {Promise<string|Array<string>|void>}
* @return {Promise<string>}
*/
async function runLighthouseInLR(connection, url, flags, lrOpts) {
const {lrDevice, categoryIDs, logAssets, configOverride} = lrOpts;
brendankenny marked this conversation as resolved.
@@ -38,6 +38,7 @@ async function runLighthouseInLR(connection, url, flags, lrOpts) {
flags.disableStorageReset = true;
flags.logLevel = flags.logLevel || 'info';
flags.channel = 'lr';
// flags.output assumed to be unset by Lightrider
Member:

yeah, probably just drop this line. At worst it's just extra work, and is just as assumed as every other property on flags could be (and we don't make a comment for any of those)

Collaborator:

should it instead throw if output is set to anything other than undefined or json? seems like it could be frustrating to figure that out on your own

but you both seem to be on the same page that we "definitely don't need a check in the code" so I feel like I'm missing something

Member:

> should it instead throw if output is set to anything other than undefined or json?

yeah, that's why I was suggesting removing the comment, because from this comment it definitely sounds like it should be an assertion, not a comment :)

AIUI, the runnerResult.report is ignored completely on the LR side now, it uses the LHR directly (with its own HTML reportGenerator bundle), so at worst a defined output is just extra work.

...but there are several possible things on flags that could add extra work and we're just assuming those are correct too, so I really don't think it's worth thinking about. Really it's something for the caller to worry about, either way anyways

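If a hard check were wanted (the reviewers above lean toward leaving it out), a minimal sketch of the guard under discussion could look like the following; the condition and error message are illustrative, not part of this PR:

// Hypothetical guard near the top of runLighthouseInLR, before lighthouse() is called.
// LR consumes runnerResult.lhr directly, so any other output format would only add
// wasted report-generation work.
if (flags.output !== undefined && flags.output !== 'json') {
  throw new Error(`Unsupported output mode for Lightrider: ${flags.output}`);
}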

let config;
if (configOverride) {
@@ -52,22 +53,17 @@ async function runLighthouseInLR(connection, url, flags, lrOpts) {

try {
const runnerResult = await lighthouse(url, flags, config, connection);
if (!runnerResult) return;
if (!runnerResult) throw new Error('Lighthouse finished without a runnerResult');

// pre process the LHR for proto
if (flags.output === 'json' && typeof runnerResult.report === 'string') {
// When LR is called with |internal: {keep_raw_response: true, save_lighthouse_assets: true}|,
// this code will log artifacts to raw_response.artifacts.
if (logAssets) {
// @ts-ignore - Regenerate the report, but tack on the artifacts.
runnerResult.lhr.artifacts = runnerResult.artifacts;
runnerResult.report = JSON.stringify(runnerResult.lhr);
}

return preprocessor.processForProto(runnerResult.report);
// When LR is called with |internal: {keep_raw_response: true, save_lighthouse_assets: true}|,
// this code will log artifacts to raw_response.artifacts.
Member (@brendankenny, Jun 6, 2019):

this comment is inherited but is pretty confusing here since it appears to refer entirely to stuff outside of LH? e.g. keep_raw_response and/or save_lighthouse_assets somehow become lrOpts.logAssets, and lhr becomes raw_response?

If it's helpful for users of lr, though, seems ok to keep

if (logAssets) {
// @ts-ignore - Regenerate the report, but tack on the artifacts.
Member:

maybe regenerate was referring to what it used to be doing? what about just // @ts-ignore - piggyback the artifacts on the LHR.

runnerResult.lhr.artifacts = runnerResult.artifacts;
}

return runnerResult.report;
// pre process the LHR for proto
return JSON.stringify(preprocessor.processForProto(runnerResult.lhr));
} catch (err) {
// If an error ruined the entire lighthouse run, attempt to return a meaningful error.
let runtimeError;
15 changes: 8 additions & 7 deletions lighthouse-core/lib/proto-preprocessor.js
@@ -16,11 +16,12 @@ const fs = require('fs');
*/

/**
* @param {string} result
* Transform an LHR into a proto-friendly, mostly-compatible LHR.
* @param {LH.Result} lhr
* @return {LH.Result}
*/
function processForProto(result) {
/** @type {LH.Result} */
const reportJson = JSON.parse(result);
function processForProto(lhr) {
const reportJson = JSON.parse(JSON.stringify(lhr));
Collaborator:

could we skip the cloning? Only the lr entry uses this fn, and the original lhr is not needed there. if some other client comes along with different requirements they can clone the input themselves.

assuming cloning a lhr takes a measurable amount of time.

I clocked 4ms for example.com and 8ms for nyt.com:

console.time('clone'); JSON.parse(JSON.stringify(lhr)); console.timeEnd('clone');

Member Author:

I find it unexpected when methods mutate the passed in object, so this is my preferred pattern. It's also what we do for Util.prepareReportResult, swapLocale, our tests, etc.

We adopted this pattern in our tests, as previously we had some unexpected mutations that caused failures that were tricky to track down.

Collaborator:

yeah I prefer avoiding mutation too. if the few ms is worth the change, I'd suggest renaming the method to modifyForProto to properly set expectations.

Member:

the type error (failure in travis) is because this whole function is operating on an any reportJson (due to the return type of JSON.parse). Luckily there was a forEach in there or we wouldn't have noticed the whole thing wasn't being checked :)

Need to keep the /** @type {LH.Result} */ above this line.

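A minimal sketch of the suggested fix: JSON.parse returns `any`, so re-asserting the type is what keeps the rest of the function checked by tsc.

function processForProto(lhr) {
  // Clone so the caller's LHR is not mutated, and re-assert the type lost through JSON.parse.
  /** @type {LH.Result} */
  const reportJson = JSON.parse(JSON.stringify(lhr));
  // ...the rest of the transformations operate on reportJson...
  return reportJson;
}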

// Clean up the configSettings
// Note: This is not strictly required for conversion if protobuf parsing is set to
@@ -95,7 +96,7 @@

removeStrings(reportJson);

return JSON.stringify(reportJson);
return reportJson;
}

// @ts-ignore claims always false, but this checks if cli or module
@@ -113,9 +114,9 @@ if (require.main === module) {

if (input && output) {
// process the file
const report = processForProto(fs.readFileSync(input, 'utf-8'));
const report = processForProto(JSON.parse(fs.readFileSync(input, 'utf-8')));
// write to output from argv
fs.writeFileSync(output, report, 'utf-8');
fs.writeFileSync(output, JSON.stringify(report), 'utf-8');
}
} else {
module.exports = {
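
For reference, a minimal usage sketch of the new contract (the file paths and require path here are illustrative): callers now hand processForProto a parsed LHR and serialize the result themselves.

const fs = require('fs');
const {processForProto} = require('./lighthouse-core/lib/proto-preprocessor.js');

// Parse once at the boundary; processForProto now takes and returns an LH.Result.
const lhr = JSON.parse(fs.readFileSync('lhr.json', 'utf-8'));
const protoFriendly = processForProto(lhr);
fs.writeFileSync('lhr_for_proto.json', JSON.stringify(protoFriendly), 'utf-8');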
33 changes: 20 additions & 13 deletions lighthouse-core/test/lib/proto-preprocessor-test.js
@@ -5,10 +5,17 @@
*/
'use strict';

const processForProto = require('../../lib/proto-preprocessor.js').processForProto;
const {processForProto} = require('../../lib/proto-preprocessor.js');
const sampleJson = require('../results/sample_v2.json');

/* eslint-env jest */
describe('processing for proto', () => {
it('doesn\'t modify the input object', () => {
const input = JSON.parse(JSON.stringify(sampleJson));
processForProto(input);
expect(input).toEqual(sampleJson);
});

it('keeps only necessary configSettings', () => {
const input = {
'configSettings': {
@@ -44,9 +51,9 @@ describe('processing for proto', () => {
'onlyCategories': null,
},
};
const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).toMatchObject(expectation);
expect(output).toMatchObject(expectation);
});

it('cleans up default runtimeErrors', () => {
@@ -56,9 +63,9 @@ describe('processing for proto', () => {
},
};

const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).not.toHaveProperty('runtimeError');
expect(output).not.toHaveProperty('runtimeError');
});

it('non-default runtimeErrors are untouched', () => {
@@ -68,9 +75,9 @@ describe('processing for proto', () => {
},
};

const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).toMatchObject(input);
expect(output).toMatchObject(input);
});

it('cleans up audits', () => {
@@ -91,9 +98,9 @@ describe('processing for proto', () => {
},
},
};
const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).toMatchObject(expectation);
expect(output).toMatchObject(expectation);
});


@@ -108,9 +115,9 @@ describe('processing for proto', () => {
const expectation = {
'i18n': {},
};
const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).toMatchObject(expectation);
expect(output).toMatchObject(expectation);
});

it('removes empty strings', () => {
@@ -151,8 +158,8 @@ describe('processing for proto', () => {
],
},
};
const output = processForProto(JSON.stringify(input));
const output = processForProto(input);

expect(JSON.parse(output)).toMatchObject(expectation);
expect(output).toMatchObject(expectation);
});
});
35 changes: 16 additions & 19 deletions lighthouse-core/test/report/proto-test.js
@@ -5,60 +5,57 @@
*/
'use strict';

const path = require('path');
const fs = require('fs');

const sample = fs.readFileSync(path.resolve(__dirname, '../results/sample_v2.json'));
const sampleJson = require('../results/sample_v2.json');
const roundTripJson = require('../../../proto/sample_v2_round_trip.json');
const preprocessor = require('../../lib/proto-preprocessor.js');

/* eslint-env jest */

describe('round trip JSON comparison subsets', () => {
let sampleJson;
let processedLHR;

beforeEach(() => {
sampleJson = JSON.parse(preprocessor.processForProto(sample));
processedLHR = preprocessor.processForProto(sampleJson);
});

it('has the same audit results and details (if applicable)', () => {
for (const auditId of Object.keys(sampleJson.audits)) {
expect(roundTripJson.audits[auditId]).toEqual(sampleJson.audits[auditId]);
for (const auditId of Object.keys(processedLHR.audits)) {
expect(roundTripJson.audits[auditId]).toEqual(processedLHR.audits[auditId]);
}
});

it('has the same i18n rendererFormattedStrings', () => {
expect(roundTripJson.i18n).toMatchObject(sampleJson.i18n);
expect(roundTripJson.i18n).toMatchObject(processedLHR.i18n);
});

it('has the same top level values', () => {
// Don't test all top level properties that are objects.
Object.keys(sampleJson).forEach(audit => {
if (typeof sampleJson[audit] === 'object' && !Array.isArray(sampleJson[audit])) {
delete sampleJson[audit];
Object.keys(processedLHR).forEach(audit => {
if (typeof processedLHR[audit] === 'object' && !Array.isArray(processedLHR[audit])) {
delete processedLHR[audit];
}
});

// Properties set to their type's default value will be omitted in the roundTripJson.
// For an explicit list of properties, remove sampleJson values if set to a default.
if (Array.isArray(sampleJson.stackPacks) && sampleJson.stackPacks.length === 0) {
delete sampleJson.stackPacks;
if (Array.isArray(processedLHR.stackPacks) && processedLHR.stackPacks.length === 0) {
delete processedLHR.stackPacks;
}

expect(roundTripJson).toMatchObject(sampleJson);
expect(roundTripJson).toMatchObject(processedLHR);
});

it('has the same config values', () => {
// Config settings from proto round trip should be a subset of the actual settings.
expect(sampleJson.configSettings).toMatchObject(roundTripJson.configSettings);
expect(processedLHR.configSettings).toMatchObject(roundTripJson.configSettings);
});
});

describe('round trip JSON comparison to everything', () => {
let sampleJson;
let processedLHR;

beforeEach(() => {
sampleJson = JSON.parse(preprocessor.processForProto(sample));
processedLHR = preprocessor.processForProto(sampleJson);

// Proto conversion turns empty summaries into null. This is OK,
// and is handled in the PSI roundtrip just fine, but messes up the easy
@@ -71,6 +68,6 @@ describe('round trip JSON comparison to everything', () => {
});

it('has the same JSON overall', () => {
expect(sampleJson).toMatchObject(roundTripJson);
expect(processedLHR).toMatchObject(roundTripJson);
});
});