September 7th, 2024

WebP: The WebPage Compression Format

The blog post addresses website performance optimization through image and data compression, emphasizing Brotli's efficiency over gzip, while noting GitHub Pages' lack of Brotli support and proposing workarounds.


The blog post discusses the challenges of optimizing website performance, particularly focusing on image and data compression. The author emphasizes the importance of page load time and accessibility, noting that while minifying HTML helps, the real gains come from data compression. They highlight the use of gzip and Brotli compression methods, with Brotli being more efficient but slower than gzip. The author expresses frustration with GitHub Pages, which does not support Brotli compression, resulting in larger file sizes and increased load times for their blog. They propose that GitHub could allow users to upload pre-compressed files, although this feature is currently unavailable. As a workaround, the author suggests using a JavaScript-based Brotli decompressor, which can reduce the effective size of data transferred. They compare the sizes of gzip and Brotli compressed data, illustrating the potential benefits of using Brotli despite the additional overhead of decompression on the client side.

- The author focuses on improving website performance through compression techniques.

- Brotli compression is more efficient than gzip but is not supported by GitHub Pages.

- The author suggests a workaround using JavaScript to decompress Brotli data on the client side.

- Pre-compressed data uploads could enhance performance if supported by GitHub.

- The post highlights the trade-offs between compression methods and their impact on load times.

Related

Optimizing JavaScript for Fun and for Profit

Optimizing JavaScript code for performance involves benchmarking, avoiding unnecessary work, string comparisons, and diverse object shapes. JavaScript engines optimize based on object shapes, impacting array/object methods and indirection. Creating objects with the same shape improves optimization, cautioning against slower functional programming methods. Costs of indirection like proxy objects and function calls affect performance. Code examples and benchmarks demonstrate optimization variances.

Writing HTML by hand is easier than debugging your static site generator

The blog author discusses challenges of static site generators versus manual HTML coding, citing setup complexities and advocating for simplicity, stability, and control in website management. Emphasizes static data benefits.

Simple GitHub Actions Techniques

Denis Palnitsky's Medium article explores advanced GitHub Actions techniques like caching, reusable workflows, self-hosted runners, third-party managed runners, GitHub Container Registry, and local workflow debugging with Act. These strategies aim to enhance workflow efficiency and cost-effectiveness.

The Cost of JavaScript

JavaScript significantly affects website performance due to download, execution, and parsing costs. Optimizing with strategies like code-splitting, minification, and caching is crucial for faster loading and interactivity, especially on mobile devices. Various techniques enhance JavaScript delivery and page responsiveness.

"GitHub" Is Starting to Feel Like Legacy Software

GitHub faces criticism for declining performance and feature issues, such as the blame view struggling to render large files. Users find navigation challenging and core features neglected despite modernization efforts, and some are considering alternative platforms.

AI: What people are saying
The comments reflect a diverse range of opinions on image and data compression techniques discussed in the blog post.
  • Concerns about WebP format compatibility and the extra steps required for editing images.
  • Debate over the effectiveness of Brotli versus gzip, with some arguing that Brotli's benefits may not justify its complexity.
  • Discussion on the implementation of Brotli in web environments, particularly on platforms like GitHub Pages.
  • Experiences shared regarding the use of WebP and its growing support across tools and browsers.
  • Critiques of the article's practical implications, including the lack of performance measurements in the proposed techniques.
39 comments
By @BugsJustFindMe - 3 months
> the longest post on my site, takes 92 KiB instead of 37 KiB. This amounts to an unnecessary 2.5x increase in load time

Sure, if you ignore latency. In reality it's an unnecessary 0.001% increase in load time because that size increase isn't enough to matter vs the round trip time. And the time you save transmitting 55 fewer KiB is probably less than the time lost to decompression. :p

While fun, I would expect this specific scenario to actually be worse for the user experience not better. Speed will be a complete wash and compatibility will be worse.

By @gkbrk - 3 months
> Why readPixels is not subject to anti-fingerprinting is beyond me. It does not sprinkle hardly visible typos all over the page, so that works for me.

> keep the styling and the top of the page (about 8 KiB uncompressed) in the gzipped HTML and only compress the content below the viewport with WebP

Ah, that explains why the article suddenly cut off after a random sentence, with an empty page that follows. I'm using LibreWolf which disables WebGL, and I use Chromium for random web games that need WebGL. The article worked just fine with WebGL enabled, neat technique to be honest.

By @lifthrasiir - 3 months
It is actually possible to use Brotli directly in the web browser... with caveats, of course. I believe my 2022 submission to JS1024 [1] is the first demonstration of this concept, and I also have proof-of-concept code for arbitrary compression [2] (which sadly didn't work for the original size-coding purpose). The main caveats are that you are effectively limited to ASCII characters, and that it is highly sensitive to the rendering stack for the obvious reason; it no longer seems to function in Firefox right now.

[1] https://js1024.fun/demos/2022/18/readme

[2] https://gist.github.com/lifthrasiir/1c7f9c5a421ad39c1af19a9c...

By @raggi - 3 months
Chromies got in the way of it for a very long time, but zstd is now coming to the web too, as it's finally landed in Chrome. Now we've gotta get Safari on board.

By @jfoster - 3 months
I work on Batch Compress (https://batchcompress.com/en) and recently added WebP support, then made it the default soon after.

As far as I know, it was already making the smallest JPEGs out of any of the web compression tools, but WebP was coming out only ~50% of the size of the JPEGs. It was an easy decision to make WebP the default not too long after adding support for it.

Quite a lot of people use the site, so I was anticipating some complaints after making WebP the default, but it's been about a month and so far there has been only one complaint/enquiry about WebP. It seems that almost all tools & browsers now support WebP. I've only encountered one website recently where uploading a WebP image wasn't handled correctly and blocked the next step. Almost everything supports it well these days.

By @98469056 - 3 months
While peeking at the source, I noticed that the doctype declaration is missing a space. It currently reads <!doctypehtml>, but it should be <!doctype html>
By @Retr0id - 3 months
I've used this trick before! Oddly enough I can't remember what I used it for (perhaps just to see if I could), and I commented on it here: https://gist.github.com/gasman/2560551?permalink_comment_id=...

Edit: I found my prototype from way back, I guess I was just testing heh: https://retr0.id/stuff/bee_movie.webp.html

By @butz - 3 months
Dropping Google Fonts should improve page load time a bit too, considering those are loaded from a remote server that requires an additional handshake.
By @niutech - 3 months
This page is broken, at least in the Sailfish OS browser; there is a long empty space after the paragraph:

> Alright, so we’re dealing with 92 KiB for gzip vs 37 + 71 KiB for Brotli. Umm…

That said, the overhead of gzip vs Brotli HTML compression is nothing compared with the amount of JS/images/video current websites use.

By @michaelbrave - 3 months
I personally don't much care for the format. If I save an image and it ends up WebP, I have to convert it before I can edit it or use it in any meaningful way, since it's not supported by much beyond web browsers. It just gives me extra steps.
By @gildas - 3 months
In the same vein, you can package HTML pages as self-extracting ZIP files with SingleFile [1]. You can even include a PNG image to produce files compatible with HTML, ZIP and PNG [2], and for example display the PNG image in the HTML page [3].

[1] https://github.com/gildas-lormeau/SingleFile?tab=readme-ov-f...

[2] https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG

[3] https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG/raw/...

By @divbzero - 3 months
> Typically, Brotli is better than gzip, and gzip is better than nothing. gzip is so cheap everyone enables it by default, but Brotli is way slower.

Note that way slower applies to speed of compression, not decompression. So Brotli is a good bet if you can precompress.

> Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.

If your users all use modern browsers and you host static pages through a service like Cloudflare or CloudFront that supports custom HTTP headers, you can implement your own Brotli support by precompressing the static files with Brotli and adding a Content-Encoding: br HTTP header. This is kind of cheating because you are ignoring proper content negotiation with Accept-Encoding, but I’ve done it successfully for sites with targeted user bases.

By @phh - 3 months
> A real-world web page compressed with WebP? Oh, how about the one you’re reading right now? Unless you use an old browser or have JavaScript turned off, WebP compresses this page starting from the “Fool me twice” section. If you haven’t noticed this, I’m happy the trick is working :-)

Well, it didn't work in Materialistic (I guess their webview disables JS), and the failure mode is really not comfortable.

By @next_xibalba - 3 months
If only we hadn't lost Jan Sloot's Digital Coding System [1], we'd be able to transmit GB in milliseconds across the web!

[1] https://en.wikipedia.org/wiki/Sloot_Digital_Coding_System

By @rrrix1 - 3 months
I very much enjoyed reading this. Quite clever!

But...

> Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.

Is the glaringly obvious solution to this not as obvious as I think it is?

TFA went through a lot of roundabout work to get (some) Brotli compression. Very impressive yak shave!

If you're married to the idea of a Git-based automatically published website, you could at least replicate your code and site to GitLab Pages, which has supported precompressed Brotli since 2019. Or use one of Cloudflare's free-tier services. There's a variety of ways to solve this problem before the first byte is sent to the client.

Far too much of the world's source code already depends exclusively on Github. I find it distasteful to also have the small web do the same while blindly accepting an inferior experience and worse technology.

By @somishere - 3 months
Lots of nice tricks in here, definitely fun! Only minor nitpick is that it departs fairly rapidly from the lede ... which espouses the dual virtues of an accessible and js-optional reading experience ;)
By @lxgr - 3 months
From the linked Github issue giving the rationale why Brotli is not available in the CompressionStream API:

> As far as I know, browsers are only shipping the decompression dictionary. Brotli has a separate dictionary needed for compression, which would significantly increase the size of the browser.

How can the decompression dictionary be smaller than the compression one? Does the latter contain something like a space-time tradeoff in the form of precalculated most efficient representations of given input substrings or something similar?

By @Dibby053 - 3 months
I didn't know canvas anti-fingerprinting was so rudimentary. I don't think it increases uniqueness (the noise is different every run) but bypassing it seems trivial: run the thing n times and take the mode. With so little noise, 4 or 5 times should be more than enough.
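The majority-vote idea above can be sketched as follows (the noise is simulated here; a real bypass would read the same canvas repeatedly via getImageData or readPixels in a browser):

```javascript
// Sketch of "run it n times, take the mode": given several noisy
// readbacks of the same byte buffer, recover each byte by majority vote.
function modeOfRuns(runs) {
  const len = runs[0].length;
  const out = new Uint8Array(len);
  for (let i = 0; i < len; i++) {
    const counts = new Map();
    for (const run of runs) {
      counts.set(run[i], (counts.get(run[i]) || 0) + 1);
    }
    let best = runs[0][i];
    let bestCount = 0;
    for (const [value, count] of counts) {
      if (count > bestCount) { best = value; bestCount = count; }
    }
    out[i] = best;
  }
  return out;
}

// Simulated: three reads of [10, 20, 30], each with one byte perturbed.
const runs = [
  Uint8Array.from([11, 20, 30]),
  Uint8Array.from([10, 21, 30]),
  Uint8Array.from([10, 20, 31]),
];
console.log(modeOfRuns(runs)); // Uint8Array [10, 20, 30]
```

Because the anti-fingerprinting noise is small and re-randomized on every read, the true value wins the vote at each position with only a handful of runs.
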
By @Pesthuf - 3 months
It's impressive how close this is to Brotli even though Brotli has that massive pre-shared dictionary. Is the actual compression algorithm it uses just worse, or does the dictionary matter much less than I think?
By @TacticalCoder - 3 months
I loved that (encoding stuff in webp) but my takeaway from the figures in the article is this: brotli is so good I'll host from somewhere where I can serve brotli (when and if the client supports brotli ofc).
By @kardos - 3 months
On the fingerprinting noise: this sounds like a job for FEC [1]. It would increase the size but allow using the Canvas API. I don't know if this would solve the flicker though (not a front end expert here)

Also, it's a long shot, but could the combo of FEC (+size) and lossy compression (-size) be a net win?

[1] https://en.m.wikipedia.org/wiki/Error_correction_code

By @astrostl - 3 months
Things I seek in an image format:

(1) compatibility

(2) features

WebP still seems far behind on (1) to me so I don't care about the rest. I hope it gets there, though, because folks like this seem pretty enthusiastic about (2).

By @bogzz - 3 months
Still waiting on a webp encoder to be added to the Go stdlib...
By @ajsnigrutin - 3 months
So... 60 kB less transfer... and how much slower is it on, e.g., the Galaxy S8 my mom has, because of all the shenanigans done to save those 60 kB?
By @butz - 3 months
I wonder what is the difference in CPU usage on client side for WebP variant vs standard HTML? Are you causing more battery drain on visitor devices?
By @csjh - 3 months
I think the most surprising part here is that gzipping the base64'd compressed data almost entirely removes the base64 overhead.
By @ranger_danger - 3 months
>ensure it works without JavaScript enabled

>manually decompress it in JavaScript

>Brotli decompressor in WASM

the irony seems lost

By @Jamie9912 - 3 months
Why don't they make zstd images? Surely that would beat WebP.
By @galaxyLogic - 3 months
Is there a tool or some other way to easily encode a JPG image so it can be embedded into HTML? I know there is something like that, but is it easy? Could it be made easier?
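The standard approach here is a data: URI. A minimal sketch (the placeholder bytes stand in for a real JPEG read from disk with fs.readFileSync):

```javascript
// Sketch: embed an image directly in HTML via a data: URI.
// The bytes below are just the start of a JPEG header, standing in
// for fs.readFileSync("photo.jpg") on a real file.
const jpegBytes = Buffer.from([0xff, 0xd8, 0xff, 0xe0]);
const dataUri = `data:image/jpeg;base64,${jpegBytes.toString("base64")}`;
const imgTag = `<img src="${dataUri}" alt="embedded image">`;
console.log(imgTag);
```

The catch is the same base64 overhead discussed elsewhere in the thread: about 33% extra, some of which transfer compression claws back.
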
By @DaleCurtis - 3 months
What a fun excursion :) You can also use the ImageDecoder API: https://developer.mozilla.org/en-US/docs/Web/API/ImageDecode... and VideoFrame.copyTo: https://developer.mozilla.org/en-US/docs/Web/API/VideoFrame/... to skip canvas entirely.
By @toddmorey - 3 months
"I hope you see where I’m going with this and are yelling 'Oh why the fuck' right now."

I love reading blog posts like these.

By @cobbal - 3 months
I would love to try reading the lossy version.
By @bawolff - 3 months
They did all this and didn't even measure time to first paint?

What is the point of doing this sort of thing if you don't even test how much faster or slower it made the page load?

By @kopirgan - 3 months
A 19-year-old, and look at the list of stuff she's done! Perhaps she started coding in the womb?! Amazing.
By @niceguy4 - 3 months
Not to sidetrack the conversation but to sidetrack the conversation: have there been many other major WebP exploits like the serious one in the past?