WebP: The WebPage Compression Format
The blog post addresses website performance optimization through image and data compression, emphasizing Brotli's efficiency over gzip, while noting GitHub Pages' lack of Brotli support and proposing workarounds.
The blog post discusses the challenges of optimizing website performance, particularly focusing on image and data compression. The author emphasizes the importance of page load time and accessibility, noting that while minifying HTML helps, the real gains come from data compression. They highlight the use of gzip and Brotli compression methods, with Brotli being more efficient but slower than gzip. The author expresses frustration with GitHub Pages, which does not support Brotli compression, resulting in larger file sizes and increased load times for their blog. They propose that GitHub could allow users to upload pre-compressed files, although this feature is currently unavailable. As a workaround, the author suggests using a JavaScript-based Brotli decompressor, which can reduce the effective size of data transferred. They compare the sizes of gzip and Brotli compressed data, illustrating the potential benefits of using Brotli despite the additional overhead of decompression on the client side.
- The author focuses on improving website performance through compression techniques.
- Brotli compression is more efficient than gzip but is not supported by GitHub Pages.
- The author suggests a workaround using JavaScript to decompress Brotli data on the client side.
- Pre-compressed data uploads could enhance performance if supported by GitHub.
- The post highlights the trade-offs between compression methods and their impact on load times.
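As a rough illustration of the JavaScript workaround summarized above, here is a minimal sketch of fetching a pre-compressed asset and decompressing it in the browser. It assumes a WASM Brotli decoder along the lines of the brotli-wasm package (a default export that resolves to a module exposing decompress(Uint8Array)); the file name and the way the markup is injected are made up for illustration, not taken from the article.

    // Minimal sketch, assuming the brotli-wasm package; file names and element IDs are illustrative.
    import brotliPromise from 'brotli-wasm';

    async function loadBrotliSection(url) {
      const brotli = await brotliPromise;                      // WASM module loads asynchronously
      const compressed = new Uint8Array(await (await fetch(url)).arrayBuffer());
      const bytes = brotli.decompress(compressed);             // raw UTF-8 bytes of the HTML fragment
      return new TextDecoder().decode(bytes);
    }

    // Hypothetical usage: inject the decompressed fragment below the fold.
    loadBrotliSection('/content.html.br').then(html => {
      document.getElementById('below-the-fold').innerHTML = html;
    });

The trade-off the summary alludes to is visible here: the payload shrinks, but the page now pays for a decoder download and client-side decompression before the content can render.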
Related
Optimizing JavaScript for Fun and for Profit
Optimizing JavaScript code for performance involves benchmarking, avoiding unnecessary work, string comparisons, and diverse object shapes. JavaScript engines optimize based on object shapes, impacting array/object methods and indirection. Creating objects with the same shape improves optimization, cautioning against slower functional programming methods. Costs of indirection like proxy objects and function calls affect performance. Code examples and benchmarks demonstrate optimization variances.
Writing HTML by hand is easier than debugging your static site generator
The blog author discusses challenges of static site generators versus manual HTML coding, citing setup complexities and advocating for simplicity, stability, and control in website management. Emphasizes static data benefits.
Simple GitHub Actions Techniques
Denis Palnitsky's Medium article explores advanced GitHub Actions techniques like caching, reusable workflows, self-hosted runners, third-party managed runners, GitHub Container Registry, and local workflow debugging with Act. These strategies aim to enhance workflow efficiency and cost-effectiveness.
The Cost of JavaScript
JavaScript significantly affects website performance due to download, execution, and parsing costs. Optimizing with strategies like code-splitting, minification, and caching is crucial for faster loading and interactivity, especially on mobile devices. Various techniques enhance JavaScript delivery and page responsiveness.
"GitHub" Is Starting to Feel Like Legacy Software
GitHub faces criticism for performance decline and feature issues like blame view rendering large files. Users find navigation challenging and core features neglected despite modernization efforts. Users consider exploring alternative platforms.
- Concerns about WebP format compatibility and the extra steps required for editing images.
- Debate over the effectiveness of Brotli versus gzip, with some arguing that Brotli's benefits may not justify its complexity.
- Discussion on the implementation of Brotli in web environments, particularly on platforms like GitHub Pages.
- Experiences shared regarding the use of WebP and its growing support across tools and browsers.
- Critiques of the article's practical implications, including the lack of performance measurements in the proposed techniques.
Sure, if you ignore latency. In reality it's an unnecessary 0.001% increase in load time because that size increase isn't enough to matter vs the round trip time. And the time you save transmitting 55 fewer KiB is probably less than the time lost to decompression. :p
While fun, I would expect this specific scenario to actually be worse for the user experience not better. Speed will be a complete wash and compatibility will be worse.
> keep the styling and the top of the page (about 8 KiB uncompressed) in the gzipped HTML and only compress the content below the viewport with WebP
Ah, that explains why the article suddenly cut off after a random sentence, with an empty page that follows. I'm using LibreWolf which disables WebGL, and I use Chromium for random web games that need WebGL. The article worked just fine with WebGL enabled, neat technique to be honest.
[1] https://js1024.fun/demos/2022/18/readme
[2] https://gist.github.com/lifthrasiir/1c7f9c5a421ad39c1af19a9c...
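For readers wondering what decoding "content compressed with WebP" can look like, here is a rough sketch of the reading side. It assumes one payload byte per pixel in the red channel of a lossless WebP and a known payload length; both are illustrative choices rather than the article's exact encoding, and where the article reportedly reads pixels back through WebGL to avoid colour management (hence the LibreWolf failure described above), this sketch uses a 2D canvas with colour-space conversion disabled.

    // Rough sketch: recover text packed into a lossless WebP, one byte per pixel (red channel).
    // The packing layout and byteLength handling are assumptions, not the article's exact scheme.
    async function decodeWebpPayload(url, byteLength) {
      const blob = await (await fetch(url)).blob();
      // Ask the browser not to touch pixel values while decoding the image.
      const bitmap = await createImageBitmap(blob, { colorSpaceConversion: 'none' });
      const canvas = document.createElement('canvas');
      canvas.width = bitmap.width;
      canvas.height = bitmap.height;
      const ctx = canvas.getContext('2d', { willReadFrequently: true });
      ctx.drawImage(bitmap, 0, 0);
      const rgba = ctx.getImageData(0, 0, bitmap.width, bitmap.height).data;
      const bytes = new Uint8Array(byteLength);
      for (let i = 0; i < byteLength; i++) bytes[i] = rgba[i * 4]; // red channel only
      return new TextDecoder().decode(bytes);
    }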
As far as I know, it was already making the smallest JPEGs of any of the web compression tools, but WebP was coming out at only ~50% of the size of the JPEGs. It was an easy decision to make WebP the default not too long after adding support for it.
Quite a lot of people use the site, so I was anticipating some complaints after making WebP the default, but it's been about a month and so far there has been only one complaint/enquiry about WebP. Almost all tools and browsers seem to support it well these days; I've only encountered one website recently where uploading a WebP image wasn't handled correctly and blocked the next step.
Edit: I found my prototype from way back, I guess I was just testing heh: https://retr0.id/stuff/bee_movie.webp.html
> Alright, so we’re dealing with 92 KiB for gzip vs 37 + 71 KiB for Brotli. Umm…
That said, the overhead of gzip vs Brotli HTML compression is nothing compared with the amount of JS/images/video current websites use.
[1] https://github.com/gildas-lormeau/SingleFile?tab=readme-ov-f...
[2] https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG
[3] https://github.com/gildas-lormeau/Polyglot-HTML-ZIP-PNG/raw/...
Note that "way slower" applies to compression speed, not decompression. So Brotli is a good bet if you can precompress.
> Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.
If your users all use modern browsers and you host static pages through a service like Cloudflare or CloudFront that supports custom HTTP headers, you can implement your own Brotli support by precompressing the static files with Brotli and adding a Content-Encoding: br HTTP header. This is kind of cheating because you are ignoring proper content negotiation with Accept-Encoding, but I’ve done it successfully for sites with targeted user bases.
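One possible shape of that setup (a sketch under the comment's assumptions, with placeholder file names): precompress the static files with Node's built-in Brotli support at build time, then configure the CDN or host to serve the .br variants with a Content-Encoding: br header.

    // Build-time sketch: write .br variants of the static files.
    // The host/CDN must then be configured to send Content-Encoding: br for them.
    const fs = require('node:fs');
    const zlib = require('node:zlib');

    for (const file of ['index.html', 'style.css', 'app.js']) {   // placeholder file list
      const input = fs.readFileSync(file);
      const compressed = zlib.brotliCompressSync(input, {
        params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },    // max quality is fine offline
      });
      fs.writeFileSync(`${file}.br`, compressed);
      console.log(`${file}: ${input.length} -> ${compressed.length} bytes`);
    }

Since these files bypass Accept-Encoding negotiation entirely, clients that cannot decode Brotli will just see garbage, which is the "kind of cheating" caveat above.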
Well, it didn't work in Materialistic (I guess their webview disables JS), and the failure mode is really ungraceful.
[1] https://en.wikipedia.org/wiki/Sloot_Digital_Coding_System
But...
> Annoyingly, I host my blog on GitHub pages, which doesn’t support Brotli.
Is the glaringly obvious solution to this not as obvious as I think it is?
TFA went through a lot of roundabout work to get (some) Brotli compression. Very impressive Yak Shave!
If you're married to the idea of a Git-based, automatically published website, you could at least replicate your code and site to GitLab Pages, which has supported precompressed Brotli since 2019. Or use one of Cloudflare's free-tier services. There's a variety of ways to solve this problem before the first byte is sent to the client.
Far too much of the world's source code already depends exclusively on GitHub. I find it distasteful to also have the small web do the same while blindly accepting an inferior experience and worse technology.
> As far as I know, browsers are only shipping the decompression dictionary. Brotli has a separate dictionary needed for compression, which would significantly increase the size of the browser.
How can the decompression dictionary be smaller than the compression one? Does the latter contain something like a space-time tradeoff in the form of precalculated most efficient representations of given input substrings or something similar?
Also, it's a long shot, but could the combo of FEC (+size) and lossy compression (-size) be a net win?
(1) compatibility
(2) features
WebP still seems far behind on (1) to me so I don't care about the rest. I hope it gets there, though, because folks like this seem pretty enthusiastic about (2).
>manually decompress it in JavaScript
>Brotli decompressor in WASM
the irony seems lost
I love reading blog posts like these.
What is the point of doing this sort of thing if you don't even test how much faster or slower it made the page load?