There are tools to help monitor page performance and assess your improvements. One of the best is PageSpeed Insights. It’s available as a web application and as the Lighthouse tab in Chrome’s DevTools (the same DevTools are also available in Edge, Opera, Brave, and Vivaldi).

Web page performance is more important than ever. Users expect a slick, responsive experience that rivals desktop applications. In addition, Google’s Core Web Vitals measure page performance and influence search rankings, so they matter for your Search Engine Optimization efforts.

WordPress runs more than a third of all websites, but performance can suffer from ineffective hosting, slow themes, and an over-reliance on plugins. You can fix most problems by switching to a good web host and applying best-practice performance techniques.

Accessing Lighthouse

Start Lighthouse by opening the page you want to examine and pressing Ctrl/Cmd + Shift + I (or choosing Developer Tools from More tools in the browser menu). Switch to the Lighthouse tab and click the Analyze page load button. Results are shown after a few seconds:

Example Lighthouse report

You can drill down into the top-level percentages to discover further information and hints that address known problems. The tool is invaluable, but there are downsides:

  • You must manually start a run for every page you’re testing.
  • It is not easy to record how factors have improved or worsened over time.
  • There is a lot of data to check and it’s easy to get something wrong.
  • The technical details are aimed at developers, which can overwhelm clients and managers who want a quick overview of progress.
  • Lighthouse runs are influenced by local device and network speeds, which can lead to false assumptions.

The PageSpeed Insights API provides a way to solve these issues so tests can be automated, recorded, and compared.

What Is the PageSpeed Insights API?

Google provides a free PageSpeed Insights REST API which returns data in JSON format containing all the Lighthouse metrics and more. It allows you to automate page runs, store the resulting data, review changes over time, and display the exact information you need.

The PageSpeed Insights API emulates how Google sees your site. You could run a report every few days or whenever you release a performance update.

The results are helpful but not necessarily indicative of actual user experience. The browser Performance API is a better option when you want to monitor real-world performance across all your users’ devices and networks.

PageSpeed Insights API Quickstart

Copy the following address into your web browser, changing the url parameter to the page you want to assess:

https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com/

Firefox is ideal because it has a built-in JSON viewer, although Chrome extensions can provide the same functionality. The overall Lighthouse Performance score is highlighted below:

PageSpeed Insights API result JSON (Firefox)

You can change the API URL query string for your own pages and preferences. The only required parameter is url, e.g.

?url=https://example.com/

A desktop test is run by default, but you can explicitly request it with:

&strategy=desktop

or switch to mobile with:

&strategy=mobile

Only performance tests are run unless you specify one or more categories of interest:

&category=performance&category=accessibility&category=best-practices&category=seo&category=pwa

A specific language can be defined by setting a locale – such as French:

&locale=fr

and Google Analytics campaign details can be set with:

&utm_source=my-campaign-source&utm_campaign=my-campaign-name

The service is free for infrequent requests, but you will need to sign up for a Google API key if you intend to run many tests from the same IP address in a short period. The key is added to the URL with:

&key=YOUR_API_KEY

You can build the URL’s query string by joining your chosen parameters with ampersand (&) characters. The following API URL tests https://example.com/ on a mobile device, assessing performance and accessibility criteria:

https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com/&strategy=mobile&category=performance&category=accessibility

You can construct your own URLs or use the Google PageSpeed API URL builder tool should you require further assistance.
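If you’re scripting tests, the query string above can also be assembled with URLSearchParams. The sketch below assumes the v5 endpoint and the parameters described in this section; buildPsiUrl is a hypothetical helper name:

```javascript
// build a PageSpeed Insights API URL (v5 endpoint)
function buildPsiUrl(pageUrl, { strategy = 'mobile', categories = ['performance'], key } = {}) {

  // url is the only required parameter; strategy defaults to mobile here
  const params = new URLSearchParams({ url: pageUrl, strategy });

  // the category parameter may be repeated for each Lighthouse category
  categories.forEach(c => params.append('category', c));

  // an API key is only necessary for frequent requests
  if (key) params.append('key', key);

  return 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed?' + params.toString();
}

// example
console.log(buildPsiUrl('https://example.com/', {
  strategy: 'mobile',
  categories: ['performance', 'accessibility']
}));
```

The returned string can be passed straight to fetch() or pasted into a browser address bar.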

PageSpeed Insights API JSON Results

Tests typically return around 600KB of JSON data, depending on your chosen categories, the number of assets in the page, and the complexity of screenshots (embedded in base64 format).

The quantity of data is daunting, there is some duplication, and the results documentation isn’t always clear. The JSON is split into four sections as described below.


loadingExperience

These metrics are calculated from real end users’ page loading experiences. They include the Core Web Vitals CUMULATIVE_LAYOUT_SHIFT_SCORE, FIRST_CONTENTFUL_PAINT_MS, and FIRST_INPUT_DELAY_MS. Each metric provides distribution details and a “category” value of FAST, AVERAGE, or SLOW (or NONE if no measurement was taken). Example:

"loadingExperience": {
  "metrics": {
    "CUMULATIVE_LAYOUT_SHIFT_SCORE": {
      "percentile": 0,
      "distributions": [
        { "min": 0, "max": 10, "proportion": 0.970 },
        { "min": 10, "max": 25, "proportion": 0.017 },
        { "min": 25, "proportion": 0.012 }
      ],
      "category": "FAST"
    }
  }
}


originLoadingExperience

These are aggregated metrics calculated across all pages served from the origin. The sections are identical to loadingExperience above; sites with less traffic are unlikely to show any difference between the two sets of figures.
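Since both sections share the same shape, a small helper can fall back to the origin-wide figures when page-level field data is missing. fieldMetrics is a hypothetical name, and the result object is assumed to be the parsed API JSON:

```javascript
// prefer page-level field data; fall back to origin-wide figures
function fieldMetrics(result) {
  const page = result.loadingExperience;
  const origin = result.originLoadingExperience;
  // loadingExperience carries a metrics object when page data exists
  return (page && page.metrics) ? page.metrics : (origin ? origin.metrics : null);
}

// example with a mock API result that has no page-level data
const mock = {
  originLoadingExperience: {
    metrics: { FIRST_CONTENTFUL_PAINT_MS: { category: 'FAST' } }
  }
};
console.log(fieldMetrics(mock).FIRST_CONTENTFUL_PAINT_MS.category); // "FAST"
```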


lighthouseResult

This is the largest section and contains all the Lighthouse metrics. It provides information about the test:

  • requestedUrl – the URL you requested
  • finalUrl – the actual page tested after following all redirects
  • lighthouseVersion – the software version
  • fetchTime – the time the test was run
  • userAgent – the user agent string of the browser used for the test
  • environment – extended user agent information
  • configSettings – the settings passed to the API

This is followed by an “audits” section with many sections including unused-javascript, unused-css-rules, total-byte-weight, redirects, dom-size, largest-contentful-paint-element, server-response-time, network-requests, cumulative-layout-shift, first-meaningful-paint, screenshot-thumbnails, and full-page-screenshot.

Most audit metrics provide a “details” section containing factors such as “overallSavingsBytes” and “overallSavingsMs”, which estimate the benefits of implementing a performance improvement.
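You could, for instance, total those estimates across every audit to gauge the remaining headroom. A sketch using mock audit data (totalSavings is a hypothetical helper):

```javascript
// sum the estimated byte and millisecond savings across all Lighthouse audits
function totalSavings(audits) {
  let bytes = 0, ms = 0;
  for (const audit of Object.values(audits)) {
    const d = audit.details;
    if (d) {
      // not every audit provides savings estimates, so default to zero
      bytes += d.overallSavingsBytes || 0;
      ms += d.overallSavingsMs || 0;
    }
  }
  return { bytes, ms };
}

// example with mock audit data
const savings = totalSavings({
  'unused-javascript': { details: { overallSavingsBytes: 48000, overallSavingsMs: 150 } },
  'unused-css-rules':  { details: { overallSavingsBytes: 12000, overallSavingsMs: 80 } },
  'dom-size':          { details: {} }
});
console.log(savings); // { bytes: 60000, ms: 230 }
```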

The full page and thumbnail “screenshot” sections contain embedded base64 image data.

A “metrics” section provides a summary of all metrics in an “items” array, e.g.

"metrics": {
  "id": "metrics",
  "title": "Metrics",
  "description": "Collects all available metrics.",
  "score": null,
  "scoreDisplayMode": "informative",
  "details": {
    "type": "debugdata",
    "items": [{
      "observedFirstVisualChange": 234,
      "observedFirstContentfulPaint": 284,
      "interactive": 278,
      "observedFirstPaintTs": 1579728174422,
      "observedDomContentLoaded": 314,
      // ... etc ...
    }]
  },
  "numericValue": 278,
  "numericUnit": "millisecond"
}

The “audits” section is followed by “categories” which provides overall Lighthouse scores for the chosen categories passed on the API URL:

"categories": {
  "performance": {
    "id": "performance",
    "title": "Performance",
    "score": 0.97,
    "auditRefs": [
      // ... etc ...

The “score” is a number between 0 and 1 which is normally shown as a percentage on Lighthouse reports. In general, a score of:

  • 0.9 to 1.0 is good
  • 0.5 to under 0.9 indicates improvement is necessary
  • under 0.5 is poor and requires more urgent attention

The “auditRefs” section provides a list of all metrics and the weightings used to calculate each score.
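Those score bands are easy to apply when building your own reports – for example, with a small (hypothetical) helper:

```javascript
// convert a 0–1 Lighthouse score into the usual traffic-light rating
function rating(score) {
  if (score >= 0.9) return 'good';
  if (score >= 0.5) return 'needs improvement';
  return 'poor';
}

console.log(rating(0.97)); // "good"
console.log(rating(0.72)); // "needs improvement"
console.log(rating(0.31)); // "poor"
```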


analysisUTCTimestamp

Finally, the analysis time is reported. This should be identical to the time shown in lighthouseResult.fetchTime.

Useful JSON Result Metrics

I recommend you save and examine the JSON result in a text editor. Some editors have JSON formatters built in or available as plugins; alternatively, you can use one of the many free online JSON viewers.

The following metrics are likely to be useful. Remember to set the associated category options on the URL as necessary.

Summary Metrics

Overall scores from 0 to 1:

  • Performance – lighthouseResult.categories.performance.score
  • Accessibility – lighthouseResult.categories.accessibility.score
  • SEO – lighthouseResult.categories.seo.score
  • Progressive Web App (PWA) – lighthouseResult.categories.pwa.score

Performance Metrics

These include Core Web Vitals scores from 0 to 1:

  • First Contentful Paint – lighthouseResult.audits.first-contentful-paint.score
  • First Meaningful Paint – lighthouseResult.audits.first-meaningful-paint.score
  • Largest Contentful Paint – lighthouseResult.audits.largest-contentful-paint.score
  • Speed Index – lighthouseResult.audits.speed-index.score
  • Cumulative Layout Shift – lighthouseResult.audits.cumulative-layout-shift.score

Other useful performance scores include:

  • Server response time – lighthouseResult.audits.server-response-time.score
  • Is crawlable – lighthouseResult.audits.is-crawlable.score
  • Console errors – lighthouseResult.audits.errors-in-console.score
  • Total byte weight – lighthouseResult.audits.total-byte-weight.score
  • DOM size – lighthouseResult.audits.dom-size.score

You can usually obtain actual figures and units, such as:

  • lighthouseResult.audits.total-byte-weight.numericValue –
    the total page size, e.g. 450123
  • lighthouseResult.audits.total-byte-weight.numericUnit –
    the units used for the total page size, e.g. “byte”
Alternatively, “displayValue” usually contains a readable message with both the figure and unit:

  • lighthouseResult.audits.server-response-time.displayValue –
    a message about the response time, e.g. “Root document took 170 ms”
  • lighthouseResult.audits.dom-size.displayValue –
    a message about number of elements in the DOM, e.g. “543 elements”

Create a No-Code Performance Dashboard

Live API feeds can be read and processed in many systems, including Microsoft Excel. (Somewhat bizarrely, Google Sheets does not support JSON feeds without plugins or macro code, although it does support XML.)

To import the live overall performance score into Excel, start a new spreadsheet, switch to the Data tab, and click From Web. Enter your PageSpeed Insights API URL and hit OK:

Excel From Web data import

Click Connect in the next dialog and keep the default (Anonymous) setting. You will proceed to the Query Settings tool:

Excel Query Settings Tool

Click Record to the right of the lighthouseResult metric, then click the same on categories and performance to drill down the JSON object hierarchy:

Excel JSON object drill-down

Convert the result to a table by clicking the Into Table icon at the top (also available from the right-click menu).

You can then click the filter arrow in the table heading to remove everything other than the score before clicking OK:

Excel imported table filtering

Finally, click Close & Load to show the live performance score in your spreadsheet:

Excel live data

You can follow the same process for other metrics of interest.

Create a Web Performance Dashboard

This Codepen demonstration provides a form where you can enter a URL and choose desktop or mobile analysis to obtain results.

The code creates a PageSpeed Insights URL, calls the API, then renders various results into a summary table which is quicker to view than a standard Lighthouse report:

Example test result from PageSpeed API

The asynchronous startCheck() function is called when the form is submitted. It cancels the submit event and hides previous results:

// make API request
async function startCheck(e) {

  // cancel form submission
  e.preventDefault();

  // hide previous results
  show(resultTable, false);
  show(error, false);

It then constructs apiURL from the form data and disables the fields:

  const
    form = e.target,
    fields = form.firstElementChild,
    data = new FormData(form),
    qs = decodeURIComponent( new URLSearchParams(data).toString() ),
    apiURL = form.action + '?' + qs;

  fields.disabled = true;

The Fetch API is used to call the PageSpeed URL, get the response, and parse the JSON string into a usable JavaScript object. A try/catch block ensures failures are captured:

  try {

    // call API and get result
    const
      response = await fetch(apiURL),
      result = await response.json();

The result object is passed to a showResult() function. This extracts properties and places them into the result table or any other HTML element which has data-point attribute set to a PageSpeed API property, e.g.

<td data-point="lighthouseResult.categories.performance.score"></td>
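One way a function like showResult() could resolve those dotted paths is with a reducer. A sketch of the idea using a mock result (getDataPoint is a hypothetical name):

```javascript
// resolve a dotted path such as "lighthouseResult.categories.performance.score"
// against a parsed API result object
function getDataPoint(obj, path) {
  return path.split('.').reduce((o, p) => (o ? o[p] : undefined), obj);
}

// example with a mock API result
const result = {
  lighthouseResult: { categories: { performance: { score: 0.97 } } }
};
console.log(getDataPoint(result, 'lighthouseResult.categories.performance.score')); // 0.97
```

Missing intermediate properties simply return undefined rather than throwing, so unpopulated table cells can be left blank.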

The end of the try block:

    // output result
    showResult(result);
    show(status, false);

Finally, a catch block handles errors and the form fields are re-enabled so further tests can be run:

  catch(err) {

    // API error
    show(error, true);
    show(status, false);

  }

  // re-enable form fields
  fields.disabled = false;

Further Development Options

The example code above fetches a result from the PageSpeed Insights API when you request it. The report is more configurable than Lighthouse but execution remains a manual process.

If you intend to develop your own dashboard, it may be practical to create a small application that calls the PageSpeed Insights API and stores the resulting JSON in a new database record against the tested URL and the current date/time. Most databases have JSON support although MongoDB is ideally suited to the task. A cron job can call your application on a timed basis – perhaps once per day during the early hours of the morning.

A server-side application can then implement its own REST API to your reporting requirements, e.g. return the changes in specific performance metrics between two dates. It can be called by client-side JavaScript to display tables or charts that illustrate performance improvements over time.
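The date-comparison endpoint could be as simple as subtracting two stored snapshots. A sketch with mock records, assuming each holds the test date and a set of parsed scores (metricChanges is a hypothetical name):

```javascript
// records as stored by the daily cron job: one parsed score set per date
const history = [
  { date: '2023-06-01', performance: 0.82, accessibility: 0.90 },
  { date: '2023-06-15', performance: 0.91, accessibility: 0.94 }
];

// return the change in each metric between two recorded dates
// (assumes both dates exist in the stored records)
function metricChanges(records, from, to) {
  const a = records.find(r => r.date === from);
  const b = records.find(r => r.date === to);
  const changes = {};
  for (const key of Object.keys(a)) {
    if (key !== 'date') changes[key] = +(b[key] - a[key]).toFixed(2);
  }
  return changes;
}

console.log(metricChanges(history, '2023-06-01', '2023-06-15'));
// { performance: 0.09, accessibility: 0.04 }
```

Client-side JavaScript could then chart these deltas over any period.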

If you want to create complex reports with significant quantities of data, it’s preferable to pre-calculate the figures once per day at the point new PageSpeed data is available. After all, you don’t want to show how performance is improving on a report which takes minutes to generate!


Conclusion

Chrome’s Lighthouse tool is fabulous, but it’s a chore to frequently evaluate many pages. The PageSpeed Insights API allows you to assess site performance using programmatic techniques. The benefits:

  • Performance tests can be automated. You cannot forget to run a test.
  • Results are gathered on Google servers so local device and network speed factors have less influence.
  • Metrics include information that is not usually available in Lighthouse.
  • Important metrics can be recorded and monitored over time to ensure page performance has improved with each update.
  • Performance, accessibility, and SEO information can be shown on simplified reports so it’s available at a glance to developers, managers, and clients.
Craig Buckler

Freelance UK web developer, writer, and speaker. Has been around a long time and rants about standards and performance.