To reduce your website's carbon footprint, focus on three levers: cut page weight (images and JavaScript account for roughly 75% of it), switch to a green-certified host, and serve repeat visitors from cache. A typical e-commerce site emitting 1.2g CO₂ per pageview can drop to 0.3g — a 75% reduction — with no functional changes visible to users. That is not theoretical. I have seen it happen in practice, multiple times.

But the details matter. So let's get into them properly.

[Infographic: Website Carbon Reduction Waterfall]

Headline figures (sources: HTTP Archive Web Almanac 2024, IEA 2023, SWDM v4, real optimisation audits):

  • 416 TWh — internet electricity consumption in 2023, more than the UK
  • 2.3 MB — median desktop page weight in 2024, up from 1.9 MB in 2021
  • ~50% — share of page weight that is images
  • 2–5× — JS energy cost relative to images
  • 50–80% — realistic total reduction

Optimisation waterfall, 1.2g → 0.3g CO₂ per pageview:

  • Starting: 1.20g — typical unoptimized site
  • Images: 0.78g — WebP + lazy load (−35%)
  • JavaScript: 0.55g — remove unused JS (−30%)
  • Compression: 0.47g — Brotli (−15%)
  • Caching: 0.38g — CDN + headers
  • Green host: 0.29g — −24.3% (SWDM)

Why Website Carbon Emissions Actually Matter

The internet consumed an estimated 416 TWh of electricity in 2023 — comparable to the electricity usage of the entire United Kingdom. Data centres, network infrastructure, and the billions of end-user devices loading pages every second all burn energy. Most of that energy still comes from fossil fuels.

HTTP Archive's annual Web Almanac records the median desktop page weight at 2.3 MB in 2024, up from 1.9 MB in 2021. We are going in the wrong direction. And the hidden cost is substantial: a site with 10,000 monthly pageviews at 1g CO₂ each emits the equivalent of driving roughly 40 km every month — just from people loading your homepage. That is why knowing how to reduce your website carbon footprint is becoming a core professional skill, not an optional extra.
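
The driving figure is easy to sanity-check. A minimal sketch, assuming (my figure, not from the article) an average petrol car at roughly 250 gCO₂/km:

```javascript
// Back-of-envelope check on the driving comparison.
// Assumption: ~250 gCO₂/km for an average petrol car.
const G_CO2_PER_KM = 250;

function drivingKmEquivalent(monthlyPageviews, gramsPerView) {
  // Total monthly grams of CO₂, expressed as kilometres driven.
  return (monthlyPageviews * gramsPerView) / G_CO2_PER_KM;
}
```

10,000 pageviews at 1g each is 10 kg of CO₂ per month, which at that rate works out to 40 km of driving.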

The methodology that makes this measurable is the Sustainable Web Design Model v4 (SWDM v4), published by Wholegrain Digital and adopted by tools including Carbon Badge. If you are new to the concept, the introduction to website carbon footprints explains the fundamentals. For the formula in depth, the SWDM v4 measurement guide covers it in detail. The short version: every gigabyte transferred costs roughly 0.194 kWh of electricity and emits about 494 grams of CO₂ on the average grid.

The formula:

CO₂ (g) = page_weight_GB × 0.194 kWh/GB × 494 gCO₂/kWh × (1 − green_factor)

Where green_factor = 0.243 if your host is verified renewable. That 24.3% automatic reduction is the cheapest gain available — often achievable by migrating to a greener host with zero design changes.
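
The headline formula translates directly into code. This is a simplified sketch of the article's equation only — the full SWDM v4 model also splits energy across data centre, network, and device segments:

```javascript
// SWDM v4 headline formula: grams of CO₂ per pageview from page weight.
const KWH_PER_GB = 0.194;       // electricity per gigabyte transferred
const GRID_G_PER_KWH = 494;     // average grid carbon intensity
const GREEN_FACTOR = 0.243;     // discount for verified renewable hosting

function co2PerPageviewGrams(pageWeightBytes, greenHost = false) {
  const gb = pageWeightBytes / 1e9; // decimal gigabytes, matching kWh/GB units
  return gb * KWH_PER_GB * GRID_G_PER_KWH * (greenHost ? 1 - GREEN_FACTOR : 1);
}
```

Halving page weight halves the result; a verified green host multiplies it by 0.757.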

Use the Carbon Badge scanner to get your current emissions score before you change anything. You need a baseline.

The Image Problem Nobody Wants to Admit

Images represent around 50% of page weight on median websites. That percentage has barely moved in a decade despite better compression standards being available. The reason is workflow: developers reach for whatever format their design tool exports, upload it, and move on.

Here is what a real optimisation audit looks like:

Format: Always WebP, Sometimes AVIF

WebP delivers 25–35% smaller files than JPEG at equivalent visual quality. AVIF goes further — up to 50% smaller than JPEG — but browser support, while now at ~96% globally, still has edge cases with older Android WebViews in enterprise environments. My pragmatic approach: serve AVIF to browsers that declare it, WebP as the fallback, JPEG as the last resort.

<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="..." width="1200" height="630">
</picture>

The width and height attributes are not optional here — they prevent layout shift and eliminate the browser's need to re-layout the page after image load, which is a separate performance and energy cost.

Sizing: Serve What the Viewport Actually Needs

A 1200px wide hero image on a 375px mobile screen is sending ten times the pixels needed. The srcset and sizes attributes exist precisely for this:

<img
  srcset="hero-400.webp 400w, hero-800.webp 800w, hero-1200.webp 1200w"
  sizes="(max-width: 600px) 400px, (max-width: 1024px) 800px, 1200px"
  src="hero-1200.webp"
  alt="..."
>

Done correctly, this alone cuts image payload by 40–60% on mobile — where a growing majority of traffic now lives.
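
To see why the payload drops, the browser's candidate selection for the markup above can be approximated as follows. This is a simplified sketch — real browsers also weigh device pixel ratio rounding, cache contents, and network conditions:

```javascript
// Approximate srcset/sizes selection for the hero image example above.
const candidates = [
  { url: 'hero-400.webp', width: 400 },
  { url: 'hero-800.webp', width: 800 },
  { url: 'hero-1200.webp', width: 1200 },
];

// Mirrors the sizes attribute: slot width in CSS pixels per viewport range.
function slotPx(viewportPx) {
  if (viewportPx <= 600) return 400;
  if (viewportPx <= 1024) return 800;
  return 1200;
}

// Pick the smallest candidate that covers the slot at the device pixel ratio.
function pickSource(viewportPx, dpr) {
  const needed = slotPx(viewportPx) * dpr;
  const fit = candidates.find(c => c.width >= needed);
  return (fit || candidates[candidates.length - 1]).url;
}
```

A 375px phone at 1× DPR downloads the 400w asset instead of the 1200w original; the same phone at 2× DPR gets the 800w version.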

Compression Targets That Actually Work

My working targets after years of testing: photographs under 100 KB, icons and illustrations under 20 KB, hero images under 150 KB. These are not arbitrary — they are what allows a page to stay under 500 KB total weight and land an A or B grade on SWDM v4.

Tools: Squoosh (browser-based, excellent for manual control), sharp (Node.js, automatable), ImageMagick with quality settings tuned per format.

Lazy Loading Is Table Stakes

Every image below the fold should carry loading="lazy". That is one attribute. No JavaScript required since 2019. And yet HTTP Archive data shows that as of 2024, only 46% of images on the web use it. Many of the remaining 54% are transferring bytes that most users will never scroll to.

JavaScript: The Weight You Cannot See Transferring

JavaScript is the second biggest contributor to page weight — and the most energy-intensive to process. Transferring 300 KB of JS costs roughly 0.03g CO₂ in transmission alone under the SWDM formula above. But the CPU cycles to parse, compile, and execute that JS on the user's device multiply the energy cost further. A 300 KB JS bundle can consume 2–5× more energy in execution than in transfer, depending on the device.

Audit Before You Cut

Chrome DevTools' Coverage tab (Ctrl+Shift+P → Coverage) shows you exactly which bytes of loaded JS are executed on a given page. I have run this on client sites and routinely found 40–60% of loaded JavaScript is dead code — shipped but never run. That is not a fringe case. That is the median.

The Lighthouse report surfaces this under "Remove unused JavaScript" — pay attention to the estimated savings in kilobytes, not just the audit status.
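
The Coverage tab lets you export its results as JSON, which makes the dead-code percentage scriptable. A hedged sketch — it assumes the export is an array of { url, ranges, text } entries where ranges lists the byte spans that actually executed, so verify the shape against your own export:

```javascript
// Summarize a DevTools Coverage export: unused-byte percentage per file.
// Assumed entry shape: { url, ranges: [{ start, end }], text }.
function coverageSummary(entries) {
  return entries.map(({ url, ranges, text }) => {
    const used = ranges.reduce((sum, r) => sum + (r.end - r.start), 0);
    const total = text.length;
    return { url, total, used, unusedPct: Math.round(100 * (1 - used / total)) };
  });
}
```

Running this across a site's bundles gives you a ranked hit list for the 40–60% of dead code described above.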

Code Splitting and Dynamic Imports

If you are using a framework, code splitting should already be configured — but verify it. Dynamic imports let you defer non-critical code until it is needed:

// Load the analytics module only after user interaction
document.addEventListener('click', () => {
  import('./analytics.js').then(mod => mod.init());
}, { once: true });

This approach can defer 20–40% of JS payload on content-heavy pages. The user experience does not degrade. The carbon cost does.

Third-Party Scripts: The Hidden Killers

Chat widgets, marketing pixels, A/B testing scripts, social embeds — each one adds weight and energy cost that never appears in your own bundle analysis. A single Intercom widget can add 200 KB. Facebook Pixel adds another 60 KB. Google Tag Manager, if left undisciplined, becomes a vehicle for unlimited third-party bloat.

The question to ask for every third-party script: what is the measurable business value, and does it outweigh the carbon cost plus the performance impact? For many scripts, the honest answer is no.

Green Hosting: The Easiest Single Change

I keep coming back to this because it is genuinely underrated. The SWDM v4 formula gives a 24.3% automatic reduction in calculated CO₂ when your host is verified renewable by the Green Web Foundation. No design changes. No code refactoring. Just migrate.

Verified green hosts include Infomaniak (Swiss, 100% renewable), Hetzner (German, renewables + efficiency-optimised data centres), GreenGeeks, and large cloud providers including Google Cloud (carbon-neutral since 2007) and AWS (50%+ renewable as of 2023).

There is a nuance worth understanding: "green" hosting does not mean the data centre's electricity physically comes from wind turbines connected by a cable. It means the provider purchases Renewable Energy Certificates (RECs) or Power Purchase Agreements (PPAs) equivalent to their consumption. The Green Web Foundation's database verifies these claims. Not all greenwashing is caught, but it is the best independent verification available at scale.

For sites on shared hosting with no green option, the migration cost is usually a few hours of work. The carbon reduction is permanent and immediate. If you are thinking about energy efficiency more broadly — beyond the server — tools like Energy Rebate Calculator can help you quantify the energy cost of physical infrastructure, including home offices and on-premise hardware, which is relevant if you self-host or run your own build servers.

Caching: Reducing Emissions per Repeat Visit

The first visit to a page transfers all assets from server to browser. The second visit — if properly cached — transfers almost nothing. Caching is therefore one of the highest-leverage tools for high-traffic sites, because it multiplies emission reductions across every repeat visitor.

HTTP Cache Headers That Actually Work

Static assets (images, fonts, JS, CSS with content-hashed filenames) should be cached aggressively:

Cache-Control: public, max-age=31536000, immutable

The immutable directive tells browsers not to revalidate the asset even on forced refresh, as long as it has not expired. For content-hashed assets (where the filename changes when content changes), this is safe and eliminates unnecessary revalidation requests.

HTML documents themselves need more care — typically:

Cache-Control: public, max-age=0, must-revalidate

Or with a short TTL if your content is stable:

Cache-Control: public, max-age=600, stale-while-revalidate=3600

The stale-while-revalidate directive is particularly useful: users get cached content instantly while the browser revalidates in the background. No round-trip delay, no wasted energy on synchronous revalidation.
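
The policies above reduce to a small routing decision by asset class. A minimal sketch (the hash regex and max-age values are illustrative assumptions — tune them to your build output):

```javascript
// Choose a Cache-Control header by asset class, mirroring the policies above.
function cacheControlFor(assetPath) {
  // Content-hashed static assets (e.g. app.3f2a9c1b.js) never change in place,
  // so cache them for a year and mark them immutable.
  if (/\.[0-9a-f]{6,}\.(js|css|woff2|webp|avif|png|jpg|svg)$/.test(assetPath)) {
    return 'public, max-age=31536000, immutable';
  }
  // HTML documents: serve cached copies briefly, revalidate in the background.
  if (assetPath.endsWith('.html') || assetPath.endsWith('/')) {
    return 'public, max-age=600, stale-while-revalidate=3600';
  }
  // Everything else: conservative default that forces revalidation.
  return 'public, max-age=0, must-revalidate';
}
```

The key design choice is that aggressive caching is only safe where the filename changes with the content; everything else falls back to revalidation.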

CDN Edge Caching

A CDN does two things for emissions: it caches assets closer to users (reducing transmission distance and therefore energy), and it consolidates high-traffic delivery through infrastructure with better PUE (Power Usage Effectiveness) than most origin servers. Cloudflare's network operates at approximately 1.1 PUE — compared to a typical on-premise server room at 1.5–2.0. That difference compounds at scale.

Server-Side Rendering and Static Generation

If you are serving dynamically generated pages on every request, consider whether static generation is viable. A statically generated HTML page costs essentially zero compute per request after the CDN caches it. The same page served by a PHP or Node.js runtime burns CPU cycles on every request.

For the right content types — blogs, documentation, marketing pages — static generation (via Next.js, Astro, Hugo, 11ty, or even a PHP-generated cache) eliminates a substantial slice of server-side energy consumption.

Fonts: The Carbon Cost of Typography

Web fonts are not as heavy as images or JavaScript, but they are often loaded inefficiently. A typical Google Fonts embed loads 2–4 font files totalling 100–400 KB, with potential render-blocking behaviour that forces the browser to wait before painting text.

The sustainable approach:

  • Self-host fonts rather than loading them from Google's CDN. One fewer DNS lookup, one fewer cross-origin connection, full control over caching headers.
  • Subset aggressively. If your site is English-only, you do not need the full Latin Extended character set. Tools like pyftsubset or Fonttools can strip unused glyphs, reducing a 150 KB font to 30 KB.
  • Use font-display: swap to show system fonts immediately while the web font loads, eliminating render-blocking.
  • Question whether you need a web font at all. The modern system font stack — -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif — renders beautifully on every platform and costs exactly zero bytes to transfer.

Compression: The Quick Win You Might Be Missing

Text-based assets (HTML, CSS, JS, SVG, JSON) compress extremely well. Gzip reduces them by roughly 70%. Brotli, the newer algorithm supported by all modern browsers, achieves 15–25% better compression than Gzip on typical web assets.

Verify your server is using Brotli:

# Use a full GET: some servers omit Content-Encoding on HEAD responses
curl -s -H "Accept-Encoding: br" -o /dev/null -D - https://yourdomain.com/ | grep -i content-encoding
# Should return: content-encoding: br

If it returns gzip or nothing, you are leaving efficiency on the table. On Apache, Brotli requires mod_brotli. On Nginx, ngx_brotli. Both are available on most modern hosting environments.

Pre-compressing static assets at build time (so the server does not compress on every request) is the most efficient approach — generate .br and .gz versions of your CSS and JS files and configure your server to serve them directly.

Measuring Progress: Tools and Metrics

You need to measure before and after every significant change. Gut-feel optimisation without data is just hoping. The tools I use in practice:

Carbon Badge Scanner

The Carbon Badge scanner applies SWDM v4 directly, checks Green Web Foundation for your hosting status, and produces a letter grade (A through F) with a gram-per-pageview figure. Run it on your key pages — homepage, most-trafficked landing pages, heaviest product pages. The pro tier allows scheduled monitoring so you catch regressions automatically when a new deploy ships.

Lighthouse

Google Lighthouse does not measure carbon directly, but its Performance score correlates strongly with emissions. Every millisecond you save in page load is energy not consumed by the user's device. Pay particular attention to Total Blocking Time (TBT) and Largest Contentful Paint (LCP) — both are sensitive indicators of JavaScript and image bloat.

HTTP Archive and Web Almanac

HTTP Archive crawls the top 8 million websites monthly and publishes aggregate data. The annual Web Almanac is indispensable for benchmarking — it tells you whether your page weight is better or worse than the median for your category. Knowing you are at the 30th percentile for image weight gives you a clearer target than a raw kilobyte number.

Chrome DevTools Network Panel

The Network panel, filtered by resource type, shows you exactly what is transferring and how large each resource is — both raw and compressed. The "Transferred" column (what actually crossed the wire) versus the "Size" column (decompressed) reveals whether your server compression is working. A 300 KB JS file showing 90 KB transferred means Brotli is working well. The same file showing 280 KB transferred means compression is broken or missing.

Advanced: Dark Mode, Efficient Color, and OLED Screens

This is not as impactful as image optimisation, but it is worth understanding. OLED and AMOLED screens — which dominate modern mobile — draw significantly less power rendering dark pixels than light ones. A white pixel draws several times more power than a dark one, and a fully black OLED pixel is simply switched off.

This means dark-mode interfaces have measurable energy benefits for the substantial fraction of users on OLED devices. The prefers-color-scheme media query lets you serve a dark theme to users who have opted into it at the OS level:

@media (prefers-color-scheme: dark) {
  :root {
    --bg: #0a0a0a;
    --text: #e8e8e8;
  }
}

Carbon Badge itself uses a dark theme by default — partly aesthetic, partly intentional. If your brand allows it, it is a low-effort environmental signal with real (if modest) energy impact.

Putting It Together: A Realistic Reduction Plan

The optimisations above are not all equally impactful. If I had to prioritise for a typical site starting from scratch:

  1. Baseline measurement. Run the Carbon Badge scanner. Note the grade and grams per pageview.
  2. Image audit. Convert to WebP/AVIF, add srcset, add loading="lazy" below the fold. Expected impact: 30–50% weight reduction.
  3. JavaScript audit. Remove unused code, defer non-critical scripts, evaluate each third-party. Expected impact: 10–30% weight reduction.
  4. Enable Brotli compression. Expected impact: 5–15% weight reduction (if not already active).
  5. Set aggressive cache headers. Expected impact: near-zero transfer cost on repeat visits.
  6. Migrate to a green host. Expected impact: 24.3% CO₂ reduction on the SWDM formula.
  7. Re-measure. Compare to baseline. Document what changed.

A site starting at 1.2g CO₂ per pageview (a D grade) can realistically reach 0.25–0.35g (a B or low-C grade) through steps 2–6 without any changes to design, content, or functionality. That is a 70–80% emissions reduction from pure technical hygiene.

For further context on why these numbers matter at the internet scale, the internet carbon emissions statistics overview puts individual site choices into the aggregate picture. And if you want to see how SWDM v4 compares methodologically to alternatives, the SWDM v4 deep-dive covers the academic underpinnings.

One more thing worth mentioning: energy efficiency improvements on the web compound. A 50 KB savings per pageview on a site with 100,000 monthly visits is 5 GB of avoided transfer per month — and that recurs every month, indefinitely, without any further work. The return on investment for sustainable web development is fundamentally different from most engineering optimisations.

If you are curious how Carbon Badge compares to other measurement tools in terms of methodology and accuracy, this comparison covers the major options and where they diverge.

What is your current grade? There is only one way to find out.