Deprecated
In favour of better support and many cool features of:
- Lighthouse CI - a suite of tools that makes continuously running, saving, retrieving, and asserting against Lighthouse results as easy as possible.
- Lighthouse CI Action - integrates Lighthouse CI with the GitHub Actions environment, making it simple to see failed tests, upload results, run jobs in parallel, store secrets, and interpolate env variables.
- Treo.sh - Page speed monitoring made simple.
Documentation on these metrics is in the works. If you hit bugs in the metrics collection, report them at Lighthouse issues. See also the How to use article.
Install
$ yarn global add pwmetrics
# or
$ yarn add --dev pwmetrics
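When installed as a dev dependency rather than globally, pwmetrics is usually wired into a package.json script so the locally installed binary is picked up; a minimal sketch (the script name and config path are illustrative):

"scripts": {
  "pwmetrics": "pwmetrics --config=pwmetrics-config.js"
}

It can then be run with `yarn pwmetrics` (or `npm run pwmetrics`).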
CLI Usage
$ pwmetrics <url> <flags>
pwmetrics http://example.com/
# --runs=n Does n runs (e.g. 3, 5) and reports the median run's numbers.
# The median run is the one with the median TTI.
pwmetrics http://example.com/ --runs=3
# --json Reports json details to stdout.
pwmetrics http://example.com/ --json
# returns...
# {runs: [{
# "timings": [
# {
# "name": "First Contentful Paint",
# "value": 289.642
# },
# {
# "name": "Largest Contentful Paint",
# "value": 292
# },
# ...
# --output-path File path to save results.
pwmetrics http://example.com/ --output-path='pathToFile/file.json'
# --config Provide configuration (defaults to `package.json`). See _Defining config_ below.
pwmetrics --config=pwmetrics-config.js
# --submit Submit results to Google Sheets. See _Defining submit_ below.
pwmetrics --submit
# --upload Upload Lighthouse traces to Google Drive. See _Defining upload_ below.
pwmetrics --upload
# --view View Lighthouse traces, which were uploaded to Google Drive, in DevTools. See _Defining view_ below.
pwmetrics --view
##
## CLI options useful for CI
##
# --expectations Assert metrics results against provided values. See _Defining expectations_ below.
pwmetrics --expectations
# --fail-on-error Exit PWMetrics with an error status code after the first unmet expectation.
pwmetrics --fail-on-error
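In a CI job the two flags above are typically combined, so that the first missed error-level expectation makes pwmetrics exit with a non-zero status and fails the build; a minimal sketch, assuming a pwmetrics-config.js that defines an expectations block (see _Defining expectations_ below):

pwmetrics --expectations --fail-on-error --config=pwmetrics-config.js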
Defining config
# run pwmetrics with config in package.json
pwmetrics --config
package.json
...
"pwmetrics": {
"url": "http://example.com/",
// other configuration options
}
...
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --config=pwmetrics-config.js
pwmetrics-config.js
module.exports = {
url: 'http://example.com/',
// other configuration options. Read _All available configuration options_
}
All available configuration options
pwmetrics-config.js
const METRICS = require('pwmetrics/lib/metrics');
module.exports = {
url: 'http://example.com/',
flags: { // AKA feature flags
runs: 3, // number of runs
submit: true, // turn on submitting to Google Sheets
upload: true, // turn on uploading to Google Drive
view: true, // open traces uploaded to Google Drive in DevTools
expectations: true, // turn on asserting metrics results against provided values
json: true, // not required, set to true if you want json output
outputPath: 'stdout', // not required, only needed if you have specified json output, can be "stdout" or a path
chromePath: '/Applications/Google\ Chrome\ Canary.app/Contents/MacOS/Google\ Chrome\ Canary', //optional path to specific Chrome location
chromeFlags: '', // custom flags to pass to Chrome. For a full list of flags, see http://peter.sh/experiments/chromium-command-line-switches/.
// Note: pwmetrics supports all flags from Lighthouse
showOutput: true, // not required, set to false to stop pwmetrics from printing any console.log messages
failOnError: false // not required, set to true if you want to fail the process on expectations errors
},
expectations: {
// these expectations values are examples, for your cases set your own
// it's not required to use all metrics, you can use just a few of them
// Read _Available metrics_ where all keys are defined
[METRICS.TTFCP]: {
warn: '>=1500',
error: '>=2000'
},
[METRICS.TTLCP]: {
warn: '>=2000',
error: '>=3000'
},
[METRICS.TTI]: {
...
},
[METRICS.TBT]: {
...
},
[METRICS.SI]: {
...
},
},
sheets: {
type: 'GOOGLE_SHEETS', // sheets service type. Available types: GOOGLE_SHEETS
options: {
spreadsheetId: 'sheet_id',
tableName: 'data',
uploadMedian: false // not required, set to true if you want to upload only the median run
}
},
clientSecret: {
// Data object. Can be obtained
// either
// by [using everything in step 1 here](https://developers.google.com/sheets/api/quickstart/nodejs#step_1_turn_on_the_api_name)
//
// example format:
//
// installed: {
// client_id: "sample_client_id",
// project_id: "sample_project_id",
// auth_uri: "https://accounts.google.com/o/oauth2/auth",
// token_uri: "https://accounts.google.com/o/oauth2/token",
// auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs",
// client_secret: "sample_client_secret",
// redirect_uris: [
// "url",
// "http://localhost"
// ]
// }
//
// or
// by [using everything in step 1 here](https://developers.google.com/drive/v3/web/quickstart/nodejs)
}
}
Defining expectations
Recipes for using with CI
# run pwmetrics with config in package.json
pwmetrics --expectations
package.json
...
"pwmetrics": {
"url": "http://example.com/",
"expectations": {
...
}
}
...
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --expectations --config=pwmetrics-config.js
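Concretely, the expectations block follows the shape shown in _All available configuration options_ above; a minimal pwmetrics-config.js sketch with illustrative thresholds (values are in milliseconds and should be tuned for your own pages):

const METRICS = require('pwmetrics/lib/metrics');

module.exports = {
  url: 'http://example.com/',
  flags: {
    expectations: true,
    failOnError: true // optional: exit with an error status when an error-level expectation is missed
  },
  expectations: {
    // warn when Time to Interactive reaches 3s, error at 5s (example values only)
    [METRICS.TTI]: {
      warn: '>=3000',
      error: '>=5000'
    }
  }
};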
Defining submit
Submit results to Google Sheets
Instructions:
- Copy this spreadsheet.
- Copy the ID of the spreadsheet into the config as the value of the `sheets.options.spreadsheetId` property.
- Set up a Google Developer project and get credentials (everything in step 1 here).
- Take the `client_secret` and put it into the config as the value of the `clientSecret` property.
# run pwmetrics with config in package.json
pwmetrics --submit
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --submit --config=pwmetrics-config.js
pwmetrics-config.js
module.exports = {
'url': 'http://example.com/',
'sheets': {
...
},
'clientSecret': {
...
}
}
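With the elided blocks filled in, a submit configuration might look like the following sketch (the spreadsheet ID is a placeholder, and clientSecret holds the contents of the client_secret JSON from step 1):

module.exports = {
  url: 'http://example.com/',
  flags: {
    submit: true // same effect as passing --submit on the CLI
  },
  sheets: {
    type: 'GOOGLE_SHEETS',
    options: {
      spreadsheetId: 'sheet_id', // ID taken from the copied spreadsheet's URL
      tableName: 'data'
    }
  },
  clientSecret: {
    // contents of the client_secret JSON generated in step 1
  }
};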
Defining upload
Upload Lighthouse traces to Google Drive
Instructions:
- Set up a Google Developer project and get credentials (everything in step 1 here).
- Take the `client_secret` and put it into the config as the value of the `clientSecret` property.
# run pwmetrics with config in package.json
pwmetrics --upload
# run pwmetrics with config in pwmetrics-config.js
pwmetrics --upload --config=pwmetrics-config.js
pwmetrics-config.js
module.exports = {
'url': 'http://example.com/',
'clientSecret': {
...
}
}
View Lighthouse traces in timeline-viewer
Show Lighthouse traces in timeline-viewer.
Requires the `upload` flag.
timeline-viewer - Shareable URLs for your Chrome DevTools Timeline traces.
# run pwmetrics with config in package.json
pwmetrics --upload --view
# run pwmetrics with config in your-own-file.js
pwmetrics --upload --view --config=your-own-file.js
pwmetrics-config.js
module.exports = {
'url': 'http://example.com/',
'clientSecret': {
...
}
}
Available metrics:
All metrics are now stored in a separate constant object located in `pwmetrics/lib/metrics/metrics`:
// lib/metrics/metrics.ts
{
METRICS: {
TTFCP: 'firstContentfulPaint',
TTLCP: 'largestContentfulPaint',
TBT: 'totalBlockingTime',
TTI: 'interactive',
SI: 'speedIndex'
}
}
Read the article Performance metrics. What’s this all about?, which explains these metrics.
API
const PWMetrics = require('pwmetrics');
const options = {
flags: {
runs: 3, // number of runs
submit: true, // turn on submitting to Google Sheets
upload: true, // turn on uploading to Google Drive
view: true, // open traces uploaded to Google Drive in DevTools
expectations: true, // turn on asserting metrics results against provided values
chromeFlags: '--headless' // run in headless Chrome
}
};
const pwMetrics = new PWMetrics('http://example.com/', options); // _All available configuration options_ can be used as `options`
pwMetrics.start(); // returns Promise
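Because start() returns a Promise, the last call above would typically be chained or awaited so that failures surface in the calling process; a minimal sketch (the exact shape of the resolved value is not documented here, so logging it is illustrative):

pwMetrics.start()
  .then(results => {
    console.log(JSON.stringify(results, null, 2)); // inspect the collected metrics
  })
  .catch(error => {
    console.error(error);
    process.exit(1); // propagate the failure, e.g. to a CI runner
  });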
Options
<table class="table" width="100%"> <thead> <tr> <th width="10%">Option</th> <th width="15%">Type</th> <th width="40%">Default</th> <th width="25%">Description</th> </tr> </thead> <tbody> <tr> <td style="text-align: center;">flags<sup><b>*</b></sup></td> <td style="text-align: center;">Object</td> <td> <pre> { runs: 1, submit: false, upload: false, view: false, expectations: false, disableCpuThrottling: false, chromeFlags: '' } </pre> </td> <td>Feature flags</td> </tr> <tr> <td style="text-align: center;">expectations</td> <td style="text-align: center;">Object</td> <td style="text-align: center;">{}</td> <td>See <a href="#defining-expectations">Defining expectations</a> above.</td> </tr> <tr> <td style="text-align: center;">sheets</td> <td style="text-align: center;">Object</td> <td style="text-align: center;">{}</td> <td>See <a href="#defining-submit">Defining submit</a> above.</td> </tr> <tr> <td style="text-align: center;">clientSecret</td> <td style="text-align: center;">Object</td> <td style="text-align: center;">{}</td> <td> Client secrete data generated by Google API console. To setup Google Developer project and get credentials apply <a href="https://developers.google.com/drive/v3/web/quickstart/nodejs">everything in step 1 here</a>. </td> </tr> </tbody> </table><sup>*</sup>pwmetrics supports all flags from Lighthouse. See here for the complete list.
Recipes
License
Apache 2.0. Google Inc.