Measuring your website's carbon footprint is easier than most developers expect. The hard part is knowing which method to use, understanding what the numbers actually mean, and — crucially — not stopping at a single measurement. This guide walks through three practical approaches, from a sixty-second online scan to a manual calculation you can run with Chrome DevTools and a spreadsheet. Pick the one that fits your situation.

Before we get into the methods, a quick framing note. Website carbon emissions are driven almost entirely by two variables: page weight (how many bytes transfer from server to screen) and hosting energy source (whether your server runs on renewable electricity). Every measurement method is essentially quantifying these two things and converting them to grams of CO₂ using a consistent formula. The primer on website carbon footprints covers the physics in depth if you want the full picture first.

Method 1: Use an Online Scanner (Fastest)

If you want a result in under a minute with no technical setup, this is your path. The Carbon Badge scanner takes a URL, fetches the page the way a real browser would, measures the data transferred, checks whether your hosting provider appears in the Green Web Foundation's database of verified renewable-powered hosts, and applies the SWDM v4 formula to produce a gram-per-pageview figure and a letter grade.

Here is the exact process:

  1. Go to carbon-badge.com/en/.
  2. Paste your page URL into the input field. Use the specific URL you want to measure — your homepage, your most-visited landing page, your heaviest product page. The score varies significantly by page.
  3. Hit Scan. The tool fetches your page, times the transfer, and calculates the result. This usually takes 5–15 seconds depending on page weight.
  4. Read the result: you will see grams of CO₂ per pageview, a letter grade from A to F, your page weight in kilobytes, and whether your host is green-certified.

That is it. No account required for a one-off scan. The result tells you where you stand right now.

One practical note: run the scan a second time if the first result seems unexpectedly low. The first fetch sometimes benefits from CDN warm-up or cached assets from a recent real visitor. A second run gives a more representative uncached result.

For monitoring across multiple pages and catching regressions after deployments, the pro plans allow scheduled scans with automatic alerts — more on that in the monitoring section below.

Method 2: Manual Calculation Using the SWDM v4 Formula

Manual calculation is slower but gives you complete transparency into every input. It is particularly useful when you want to estimate emissions before building a page, calculate projections across different scenarios, or verify that an automated tool's result makes sense. The methodology comes from the Sustainable Web Design Model version 4 (SWDM v4), published by Wholegrain Digital and used by Carbon Badge and most other credible carbon tools. The SWDM v4 guide covers the formula's origins and assumptions if you want the academic context.

Step 1: Measure Page Weight

Page weight is the total bytes transferred when a visitor loads your page for the first time with an empty cache. You want the transferred size — what actually crossed the wire after compression — not the uncompressed file size.

Using Chrome DevTools (recommended):

Open Chrome and navigate to the page you want to measure. Hit Cmd+Option+I on macOS or Ctrl+Shift+I on Windows/Linux to open DevTools. Click the Network tab. Now hit Cmd+Shift+R (macOS) or Ctrl+Shift+R (Windows/Linux) to do a hard reload — this bypasses the cache and simulates a first-time visit. Wait until the page finishes loading, then look at the status bar at the bottom of the Network panel. You will see something like:

47 requests  |  1.2 MB transferred  |  3.8 MB resources  |  Finish: 4.2s

The figure you want is the transferred number — in this case 1.2 MB. That is the compressed, over-the-wire page weight. The "resources" figure (3.8 MB here) is the uncompressed size and is not what you use for carbon calculations.
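If you prefer to read the same total programmatically, the Resource Timing API exposes per-request transfer sizes. A minimal sketch — the helper name is mine, and in a real page you would paste the usage lines into the DevTools Console after a hard reload:

```javascript
// Sum transferSize across PerformanceEntry-like objects.
// Uses decimal megabytes (1 MB = 1,000,000 bytes), as DevTools does.
function totalTransferredMB(entries) {
  const bytes = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0);
  return bytes / 1_000_000;
}

// In a browser console, after a hard reload:
// const entries = [
//   ...performance.getEntriesByType('navigation'),
//   ...performance.getEntriesByType('resource'),
// ];
// console.log(totalTransferredMB(entries).toFixed(2), 'MB transferred');
```

Note this only counts requests completed by the time you run it, so let the page settle first.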

Using WebPageTest (for more detail):

Go to webpagetest.org, enter your URL, choose a location close to your primary audience, and run the test. The results page shows "Bytes In" under the Summary tab — this is your transferred page weight. WebPageTest also breaks down weight by resource type (images, JS, CSS, fonts, HTML), which helps you identify what is driving the number.

Convert your page weight to gigabytes for the formula. 1.2 MB = 0.0012 GB. A 3.5 MB page = 0.0035 GB.

Step 2: Check Whether Your Host Is Green

The Green Web Foundation maintains a database of hosting providers that have provided documented evidence of running on renewable electricity. You can check any domain directly via their API:

https://api.thegreenwebfoundation.org/greencheck/yourdomain.com

Replace yourdomain.com with your actual domain and open that URL in a browser or hit it with curl. The JSON response contains a green field:

{
  "green": true,
  "url": "yourdomain.com",
  "hostedby": "Infomaniak",
  "hostedbywebsite": "https://www.infomaniak.com"
}

If green: true, your green_factor for the formula is 0.243. If green: false or your domain is not in the database, use 0.

A practical note: the database covers the major providers well (Google Cloud, AWS, Hetzner, Infomaniak, GreenGeeks, and many others), but some smaller or regional hosts may use renewable energy without being listed. The Green Web Foundation accepts submissions from hosting providers — so if your host is green but not listed, that is something worth pursuing with them.
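If you run this check often, it is easy to script. A sketch against the greencheck endpoint shown above — `isGreenHost` and `greenFactor` are hypothetical helper names, and the fetch call assumes Node 18+ (built-in fetch):

```javascript
// Query the Green Web Foundation greencheck API for a domain.
async function isGreenHost(domain) {
  const res = await fetch(
    `https://api.thegreenwebfoundation.org/greencheck/${domain}`
  );
  const data = await res.json();
  return data.green === true;
}

// Map the result to the SWDM v4 green_factor described above.
function greenFactor(green) {
  return green ? 0.243 : 0;
}
```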

Step 3: Apply the SWDM v4 Formula

With page weight in GB and your green_factor determined, the calculation is straightforward:

CO₂ (grams) = page_weight_GB × 0.194 × 494 × (1 − green_factor)

Breaking down the constants: 0.194 kWh/GB is the combined energy intensity of data centres (0.055), network transmission (0.059), and end-user devices (0.080). 494 gCO₂/kWh is the world average grid carbon intensity from Ember Climate's 2023 data.

Three worked examples:

-- 1.2 MB page, non-green host --
0.0012 × 0.194 × 494 × (1 − 0) = 0.115 gCO₂  →  Grade A

-- 2.5 MB page, non-green host --
0.0025 × 0.194 × 494 × (1 − 0) = 0.240 gCO₂  →  Grade B

-- 5 MB page, non-green host --
0.005 × 0.194 × 494 × (1 − 0) = 0.480 gCO₂  →  Grade C

-- 5 MB page, green host --
0.005 × 0.194 × 494 × (1 − 0.243) = 0.363 gCO₂  →  Grade C (lower end)

You can run this in a spreadsheet cell, a Python REPL, or a calculator. There is nothing opaque about it — every number is published and the methodology is open source.
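Or as a few lines of JavaScript, matching the worked examples above (the function name is mine, not part of any library):

```javascript
// SWDM v4 estimate: grams of CO₂ per pageview.
function co2Grams(pageWeightGB, greenFactor = 0) {
  const ENERGY_KWH_PER_GB = 0.194; // data centre + network + device
  const GRID_G_PER_KWH = 494;      // world average grid intensity
  return pageWeightGB * ENERGY_KWH_PER_GB * GRID_G_PER_KWH * (1 - greenFactor);
}

console.log(co2Grams(0.0012).toFixed(3));       // → 0.115 (1.2 MB, non-green)
console.log(co2Grams(0.005, 0.243).toFixed(3)); // → 0.363 (5 MB, green host)
```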

Step 4: Interpret the Grade

The grade scale Carbon Badge uses is the same one applied across SWDM v4 tools:

  • A (under 0.15g CO₂) — Excellent. Your page is among the cleanest 10% on the web. A first-time visitor's pageview causes less CO₂ than burning half a match.
  • B (0.15–0.30g) — Good. Meaningfully better than average. Optimisation effort has clearly been made.
  • C (0.30–0.50g) — Average. The global median sits around 0.5g. You are in the crowd, not the vanguard.
  • D (0.50–0.75g) — Poor. Your page is heavier than most. Significant optimisation opportunity exists.
  • E (0.75–1.00g) — Bad. This page is in the worst 20% by emissions. Likely driven by unoptimised images or heavy JavaScript.
  • F (over 1.00g) — Very bad. Immediate attention needed. Pages at this level often carry 5 MB or more of unoptimised assets, or autoplay video on load.

A useful benchmark: the HTTP Archive Web Almanac records median desktop page weight at 2.3 MB in 2024. At that weight with a non-green host, the formula gives 0.220g — comfortably inside the B band. So the median page actually lands at B by the formula, even though it feels average. Most people who scan their site for the first time discover they are at C or D, because developer and marketing sites accumulate weight over time without anyone watching the total.
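The thresholds are easy to encode if you want to grade a whole spreadsheet of pages (a hypothetical helper, not part of any published tool):

```javascript
// Letter grade from grams CO₂ per pageview, per the scale above.
function grade(grams) {
  if (grams < 0.15) return 'A';
  if (grams < 0.30) return 'B';
  if (grams < 0.50) return 'C';
  if (grams < 0.75) return 'D';
  if (grams <= 1.00) return 'E';
  return 'F';
}

console.log(grade(0.115)); // → A
console.log(grade(0.480)); // → C
```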

Method 3: CO2.js for Developers

If you are building carbon measurement into a CI/CD pipeline, a performance monitoring dashboard, or an application that needs to report emissions programmatically, CO2.js is the right tool. It is an open-source JavaScript library published by the Green Web Foundation that implements SWDM v4 (and alternative models) directly in code.

Install it:

npm install @tgwf/co2

Basic usage — calculating CO₂ per byte transferred:

import { co2 } from '@tgwf/co2';

const emissions = new co2();

// Bytes transferred (e.g. from a performance API measurement)
const bytesTransferred = 1_258_000; // 1.2 MB in bytes
const isGreenHost = true;

const gramsPerPageview = emissions.perByte(bytesTransferred, isGreenHost);
console.log(`${gramsPerPageview.toFixed(3)}g CO₂ per pageview`);
// → 0.092g CO₂ per pageview (green host)
// → 0.121g CO₂ per pageview (non-green host)

You can pair CO2.js with the Navigation Timing API to measure real user page weights in production:

// After the load event, total the document plus every subresource —
// navigation transferSize alone covers only the HTML document itself
const entries = [
  ...performance.getEntriesByType('navigation'),
  ...performance.getEntriesByType('resource'),
];
const transferSize = entries.reduce((sum, e) => sum + e.transferSize, 0);
const emissions = new co2();
const gramsPerPageview = emissions.perByte(transferSize, false);

// Send to your analytics or monitoring system
fetch('/api/metrics', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ co2_grams: gramsPerPageview, path: location.pathname })
});

This approach gives you real user measurement (RUM) for carbon — actual transfer sizes observed in production across real devices and networks — rather than lab-condition estimates. For high-traffic sites, this is the gold standard: you can see how emissions vary by page, by device type, by traffic source, and over time.

CO2.js also exposes the Green Web Foundation API for hosting checks, so you can automate the full SWDM v4 calculation without any manual steps. The library is actively maintained and aligns with methodology updates as SWDM evolves.

Interpreting Your Results: What Is Good, What Is Bad

Raw numbers matter less than context. Here is how to read what you find.

If you scored A: Your page is genuinely well-optimised. The meaningful next step is not chasing a lower number — it is making sure you stay at A as the site evolves. Deployments that add images, embed videos, or pull in new third-party scripts can push a grade-A page to B or C without anyone noticing. That is what monitoring is for.

If you scored B: Good, but meaningful improvement is available. For most B-grade sites, the path to A involves converting images to AVIF/WebP and auditing JavaScript for unused code. Both are solvable in a day of work on a typical site. The full reduction guide covers the techniques with real numbers.

If you scored C: This is the median. Being average is not a disaster, but it means you have not specifically optimised for weight. Realistically, a C-grade site can reach B with image optimisation alone — that one intervention typically cuts 30–50% of page weight. A move to a green host simultaneously brings another 24.3% reduction on top.

If you scored D, E, or F: Something specific is driving the weight up significantly — almost certainly unoptimised images, heavy JavaScript, or both. The Chrome DevTools Network panel is your first diagnostic tool here: filter by resource type and look for the largest individual files. A single 4 MB hero image will put you in F territory by itself.

One thing worth understanding about the grade scale: the jumps are not linear. Going from F to E requires roughly as much weight reduction as going from B to A, but the F-to-E journey has vastly more impact because F-grade pages are usually 10× heavier than A-grade ones. Optimising a bad page delivers much more total emissions reduction than fine-tuning a good one.

What to Do After Measuring: Optimisation Priorities

Measurement without action is just a number on a screen. Here is the priority order I have found works in practice, ranked by impact per hour of effort:

  1. Images first. Convert JPEGs and PNGs to WebP (or AVIF for maximum compression). Add loading="lazy" to every image below the fold. Use srcset to avoid serving desktop-resolution images to mobile viewports. On most sites, this alone cuts page weight by 30–50%. Target under 100 KB per photograph, under 20 KB per icon or illustration.
  2. Migrate to a green host. One configuration change, no code required, 24.3% automatic CO₂ reduction on the SWDM formula. Verified providers include Infomaniak, Hetzner, GreenGeeks, and major cloud platforms. Check the Green Web Foundation directory for the full list.
  3. Audit JavaScript. Open Chrome DevTools, hit Cmd+Shift+P (macOS) or Ctrl+Shift+P (Windows/Linux), type Coverage, and open the Coverage panel. It shows you exactly which bytes of each loaded JS file are actually executed on that page. Red sections are dead code shipped to every user. Unused JavaScript is the easiest weight to remove — it costs nothing to the user experience and often nothing to the feature set.
  4. Enable Brotli compression. Check whether your server uses it: curl -sI -H "Accept-Encoding: br" https://yourdomain.com | grep -i content-encoding. If you see gzip instead of br, you are leaving 15–25% of text compression efficiency on the table. Brotli is supported by all modern browsers and available on Apache (mod_brotli) and Nginx (ngx_brotli).
  5. Set aggressive cache headers. Static assets with content-hashed filenames should carry Cache-Control: public, max-age=31536000, immutable. A repeat visitor who loads cached assets transfers near-zero bytes — and emits near-zero CO₂ for those resources.

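As a concrete illustration of point 5, here is one way to pick the policy per asset. The function and the hash pattern are illustrative assumptions — match the pattern to your own build's filename convention:

```javascript
// Content-hashed filenames (e.g. main.3f9c2d1a.js) never change once
// deployed, so they can be cached immutably; HTML must revalidate so
// visitors pick up new deployments.
function cacheControl(path) {
  const hashed = /\.[0-9a-f]{8,}\./.test(path);
  return hashed
    ? 'public, max-age=31536000, immutable'
    : 'no-cache';
}

console.log(cacheControl('main.3f9c2d1a.js')); // → public, max-age=31536000, immutable
console.log(cacheControl('index.html'));       // → no-cache
```
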
A realistic trajectory: a site starting at 1.2g CO₂ per pageview (D grade) can reach 0.25–0.30g (B grade) through steps 1–5 without any design or functionality changes. That is a 75–80% emissions reduction from pure technical hygiene. The SWDM v4 deep-dive explains how those reductions compound in the formula.

Monitoring Over Time: Why One-Time Measurement Is Not Enough

Pages change. That is the part most sustainability guides skip, and it is where most carbon improvements get quietly undone.

A marketing team adds a full-resolution hero image in a content update. A developer adds a new analytics script to a landing page. A plugin auto-update pulls in additional dependencies. A video gets embedded with autoplay. None of these feel like carbon decisions — they feel like normal day-to-day website management. But each one can push a grade-A page to B or C without triggering any alerts, because nobody is watching the carbon score the way they watch uptime or conversion rate.

Monitoring is the mechanism that makes improvement durable rather than a one-time exercise. The approach that works:

  • Set a target grade for each key page and treat dropping below it as a regression, exactly as you would treat a broken link or a failed test.
  • Scan on a schedule. Monthly is the minimum; weekly is better for actively developed sites. The pro tier on Carbon Badge allows scheduled scans with automatic email alerts when a page's grade drops. You can set a threshold — "alert me if any monitored page falls below B" — and receive a notification only when action is needed, without manually checking every month.
  • Integrate into your deployment pipeline. Using CO2.js with your CI/CD system, you can automatically calculate the expected CO₂ of a build and fail the pipeline (or flag a warning) if the page weight exceeds a defined threshold. This catches regressions before they reach production rather than after.
  • Make the score visible. Embedding a Carbon Badge widget on your site creates accountability. When stakeholders can see the current grade without going out of their way to check it, weight creep becomes a conversation rather than a silent accumulation.
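To make the pipeline idea concrete, here is a minimal budget check using the SWDM v4 formula inline. The threshold and function name are illustrative; in a real pipeline you could swap the inline formula for CO2.js and feed it byte counts from a Lighthouse or WebPageTest run:

```javascript
// Fail the build when the estimated per-pageview CO₂ exceeds the budget.
const CO2_BUDGET_GRAMS = 0.30; // illustrative ceiling: stay at grade B or better

function checkCarbonBudget(bytesTransferred, greenHost = false) {
  const gb = bytesTransferred / 1e9;
  const grams = gb * 0.194 * 494 * (1 - (greenHost ? 0.243 : 0));
  if (grams > CO2_BUDGET_GRAMS) {
    throw new Error(`Carbon budget exceeded: ${grams.toFixed(3)}g > ${CO2_BUDGET_GRAMS}g`);
  }
  return grams;
}

console.log(checkCarbonBudget(1_200_000).toFixed(3)); // 1.2 MB passes the budget
// checkCarbonBudget(5_000_000) would throw — a 5 MB page estimates 0.479g
```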

The emission reductions that stick are the ones tied to an ongoing process — not a one-off audit that sits in a Notion doc and gets reviewed once a year. Carbon monitoring is infrastructure, not a project.