(comments)

Original link: https://news.ycombinator.com/item?id=38653110

Overall, while WebP can fall short in some situations at preserving fine detail while minimizing bandwidth use, its strength is in reducing image file sizes, which can significantly improve accessibility on slower networks. In addition, WebP handles animation and video better, enabling faster streaming and fewer buffering events. As noted in the discussion, however, output quality depends heavily on the specific settings chosen during compression, so they deserve careful consideration for each use case. Ultimately, the choice between WebP and traditional formats comes down to one's priorities around quality, size, and compatibility with older systems; and although setting up WebP may initially seem daunting, proper configuration and testing can pay off over the long run in resource usage and maintenance effort. The decision to adopt WebP or another format should weigh trade-offs such as the target audience, the kinds of images involved, and budget constraints.


Original thread
WebP is so great except it's not (2021) (aurelienpierre.com)
274 points by enz 1 day ago | 386 comments

I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

I think the real problem is that, like many of the commenters here, most people can't tell the difference because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years. I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

It's bizarre. Smaller displays (Mobile phones) and larger displays (4k TVs) have fantastic pixel densities now considering their viewing distance. However any panel in the range of 20"-40" is stuck in the mid-2000s.

Also, I think the author would have done us a favor by using example photos with lighter backgrounds (or changing the background color of his post to black). The harshness of the black images on white doesn't allow the eye to adjust enough to see the issue. If you put those images on a dark background it's super easy to tell the difference.



I have no problem seeing the artefacts on both my 2012-era displays. One of them is a 30" 2560x1600 IPS monitor that was rather good at the time; the other is an entry-level 27" TN 1080p TV.

So I don't think display quality really is the problem here. Maybe the drivers, or post-processing filters. Or maybe everyone doesn't have an eye for this. I have an interest in image processing, and that's the kind of detail one tends to notice with experience. The author of the article is undoubtedly more experienced than me and noticing these details may even be part of his job. He most likely will be able to notice these problems on crappy monitors, as well as telling you in which way that monitor is crap.



Someone else noted the author is sending different images to different monitor types... so no wonder everyone is seeing different things.

Generally though I would expect wide gamut monitors to make a significant difference for these types of artifacts.



I have an extremely hard time perceiving any difference on a 27" 4K monitor. I am not even sure I really see them.

The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.



> enlarge it to show the artifacts.

One might argue that if you need to enlarge it to see the artifacts, then the artifacts aren't perceptible enough and the codec is already good enough for the use case.



But we are philistines not pro photographers


This seems to be highly subjective. I had absolutely no problem seeing those artifacts without any pixel peeping, they're that obvious.

WebP image gradients just looked broken (posterized) except the lossless one, which was (obviously) perfect.



It's hard to see in the first set of images, but the second set is much clearer. In the WebP example, look to the right of the subject, about 1/6th of the image's width from the right edge. There's a hard transition between shades of grey. The JPEG version directly above it also has banding but each band is narrower so the difference at the edges is more subtle.


He was talking about the background, not the foreground.

The color of the background around the edges of the picture changes noticeably, even on a non-fullscreen image on my Android 12 device.



> The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.

Yes! Where are the red underlines and diffs? I can see the background banding, but the foreground looks the same at a glance except that some of them look ambiguously "off" in ways that could just be placebo.

You'd think a visual artist would be more interested in visual communication and not just a wall of text with un-annotated photos.



I think he was complaining specifically about the background banding.


I downloaded the images and then compared them via Beyond Compare.

After that it was pretty obvious what the author is talking about.



The article is about the background banding.


Laptop and desktop monitors have been advancing just fine over in the Apple world with high ppi, brightness and color accuracy being standard for nearly a decade... it's just expensive and so one of the first corners cut for PC as most folks simply don't care.


I see the rings easily on my few-years-old AOC 1440p monitor. PC users can have way better monitors: studio colour accuracy or high-refresh gaming.


I could see them, but only after turning my brightness up close to the max. I usually have it very low.


> I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

Wait... I agree for JPG but if you use lossless WEBP instead of PNG, isn't it simply the same pixels, just with a file about 30% smaller than the corresponding PNG file? (and 15% smaller compared to already heavily optimized PNG files like when using zopfli/optipng/etc.).

Isn't the "lossless" in "lossless WEBP" actually lossless when converting a PNG file to WEBP?

FWIW when you convert losslessly a PNG to WEBP, then decompress the WEBP back to a PNG file, then convert again that PNG back to a WEBP file, you get the exact same lossless WEBP file. It's also the same WEBP you get when you encode losslessly from either a PNG or that same PNG but "crushed" with a PNG optimizer.
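
For anyone who wants to verify that claim locally, here is a minimal round-trip check, assuming Pillow and NumPy are available (file names are hypothetical):

    from PIL import Image
    import numpy as np

    src = Image.open("icon.png").convert("RGBA")
    src.save("icon.webp", lossless=True)          # lossless WebP encode

    back = Image.open("icon.webp").convert("RGBA")
    # True if every pixel survived the PNG -> WebP -> decode round trip intact
    print(np.array_equal(np.asarray(src), np.asarray(back)))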



Yeah but I just don't fw webp and other weird formats. JPEG and PNG are tried and true, also it's nice how the extension indicates lossiness.

On the technical side, webp support still isn't like png. Tried dragging a webp into Google Slides just now, got "unsupported image type," which is ironic. I'll try again in like 10 years.



> On the technical side, webp support still isn't like png.

Oh that's a good point.

I see lossless WEBP mostly as a way to save bandwidth where PNG would have been used. If you've got a pipeline where, anyway, you already "crush" your PNG file, you may as well also generate a lossless WEBP file and serve that: all browsers support it. And you can fall back on the optimized PNG should the browser not support WEBP.

I mean: I use WEBP, but only lossless WEBP, as a replacement for PNG when I'd serve PNG files to browsers.

But for that one use case, showing a PNG file in a webpage, I don't see that many downsides to lossless WEBP. It saves bandwidth.
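
For what it's worth, server-side fallback usually hinges on the Accept header that browsers send; a minimal sketch, assuming Flask and hypothetical file paths:

    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/img/<name>")
    def image(name):
        # Browsers that can decode WebP advertise image/webp in Accept.
        if "image/webp" in request.headers.get("Accept", ""):
            resp = send_file(f"static/webp/{name}.webp", mimetype="image/webp")
        else:
            resp = send_file(f"static/png/{name}.png", mimetype="image/png")
        resp.headers["Vary"] = "Accept"   # keep caches from mixing the two variants
        return resp

On the client side, a <picture> element with a WebP <source> and a plain <img> PNG fallback gets the same behaviour without any server logic.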



Only if you can accurately detect browser support and serve the PNG instead, which means added complexity. And you have to store both.

Also, if users download your images and use them elsewhere, webp will still be more annoying for them. Though it's not very common that you want them doing that anyway.



https://caniuse.com/webp

Any updated (modern) browser should be able to see webp just fine, I'd rather just serve it without a backup plan if I'm planning to have webp in my website.



The browser support for webp is fine, problem is everything else. If you only care about displaying the images (not letting people use them elsewhere), you only use lossless webp, and all your backend infra supports it, then sure.


At this point in my life, I just don't have time. I basically use either mp4 or PNG for all web "images/animation" when doing web pages. I don't detect browsers or the like. Unless there is some revolutionary new image/video tech, I'll stick with them for the foreseeable future. I only bother with JPEG when it's straight from the phone/camera and I don't want any reduction in quality from the original high rez.


I'm on a 27" 4K IPS screen here and have to squint/zoom in to see the difference the author is writing about. While it's nice some people really care for the best result I think most people aren't going to notice or care about it.


I'm guessing it's also true that HN is definitely the wrong audience for this post. As the author suggests, if you spend all day in VSCode/Vim, you're among the segment of computer users who look at images the least as a percentage of time spent on a computer.


Yes, but at least there are a decent amount of font 'connoisseurs' here ;)


I caught it on my Android 12 without full-screening. He's talking about the background, not the foreground. The background's color noticeably changes from shot to shot around the edges.


I have to zoom in to really notice that. But both the jpg and webp have distortion - webp slightly more. Both have difficulty with edges.


I think we're talking about two different things. You're missing the forest for the trees. I'm talking about big, huge macro effects that become more apparent when you zoom out, not less.

There is a difference in the gradients of color. One has the guy looking backlit and one doesn't.



At default zoom the image is 20% of the width of my monitor so it's hard to see artefacts. When zoomed in the posterization is noticeable but jpeg at 85% is about as bad as webp. I don't see any substantial difference in lighting.


It's like the audiophile equivalent of using $500 speaker wire. Nobody normal really cares about the difference, if there's really any difference at all.


>because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years.

That's a weird thing to say unless the pixel density is your one and only measure. Regardless of that, the posterization should be perfectly visible on a 2012 FullHD monitor, or even a 1366x768 TN screen of a decade-old laptop. Most commenters here are probably viewing the pictures on a scale different from 1:1.



> That's a weird thing to say unless the pixel density is your one and only measure.

Is it though? We now have OLED TVs and OLED smartphones.

Where's our OLED PC monitors?

On every measure, if you care about colors/contrast/black+white levels/resolution/density, the average computer monitor has fallen far behind.

You can't even buy a smartphone that has a panel half as bad as most PC monitors on the market. And, at least in my area, you'd actually have to go to a lot of effort to find a non-4k TV.



> Where's our OLED PC monitors?

https://computers.scorptec.com.au/computer/Oled-Monitor

They've been around for years.

PC monitors have been improving constantly with high refresh rates, local dimming HDR + 10 bit color, adaptive sync, OLED and more.



Only on the unusual high-end gaming monitors.


OLED is overwhelmingly reserved to high-end TVs and phones as well, so I think that point is moot.


My base iPhone 12 mini from years ago has OLED, so do a lot of cheaper Android phones. Gaming displays are far less common than these.


Phones have a smaller display, which makes them easier to manufacture.


Yeah, that would also explain why the iPads don't have OLED yet.


> Where's our OLED PC monitors?

https://www.displayninja.com/oled-monitor-list/

Mainly targeted towards the gaming market at the moment.



Some of those prices are insane. Why are they so much more expensive than OLED TVs of similar size? Frame rate?


I dunno about TV much since I don't use them, but I have some ideas why it might be:

- Framerate - Response time - Adaptive sync - (how prone to burn-in is OLED? Monitors often have way more static images than TVs)

I assume combining all of these might just make it more expensive than each feature individually.



> - Framerate - Response time - Adaptive sync - (how prone to burn-in is OLED? Monitors often have way more static images than TVs)

The much more complicated electronics, plus supply and demand. Demand for TVs should be way higher than for high-end monitors.



> I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

I just looked at the first two images of the post.

First, on two mid-range LCDs: one ASUS IPS from this year and one BenQ TN from 2012, both 24" 1920x1080 (~91 DPI). The difference between the images is clear on both.

And before posting, to make sure, I pulled out a 15" 1024x768 (~85 DPI: basically the same) NEC TN LCD from 2002. And a NEC CRT roughly 15" viewable 1024x768 from 1998. Both on VGA connectors (so there is the typical noise from that, which still doesn't cover up the posterization). The difference between the images is clear on both.

All monitors viewed from 3' away.

People are simply accustomed to poor image quality, including posterization. AAA FPS video games display it on static art backgrounds in the loading menu, and I can never tell if it's intentional. Show them a 240Hz monitor with 30ms input lag, 5 frames of overshoot artifacts and viewing angles worse than 1998, and they'll be wowed.



It’s quite noticeable on a 2011 MacBook Air, too. The issue is less pronounced if you don’t have a decent display but it’s more that people are not used to it. Like bad kerning, it’s something you’ll notice everywhere if you train your eye to look for it, but otherwise probably don’t notice except that some things feel less appealing.


Not true. Monitors now are 1440p or 4k. Even at work for me.

The "issue" is that monitors last a LONG time. And thats good. We dont touch them or fiddle with them. They tend to just work. Phones and shit we keep dropping and breaking, then the battery gets bad.

Also for gaming you may even want 1080p 200hz monitor for high refresh rate and FPS over pixel density.



You also can't write software bad enough that you're forced to upgrade your monitor due to poor performance.


You almost can. The Windows Terminal app has a performance issue on G-Sync monitors. I think it's being treated like a game, but the app only renders at 60 fps or something, maybe lower, which I guess forces the whole screen to refresh at that rate, which causes mouse stutter.


> They tend to just work

They really don't...



Pixel density isn't the issue. 2K-4K computer monitors are pretty common. But they tend to suck in other ways compared to a MacBook screen. And yes I can tell the difference between the images on my MBP.


Also, only a tiny fraction of PC monitors have color gamuts wider than sRGB, proper HDR support, or any kind of calibration.

Recently I’ve been dabbling in HDR video, but I realised that the exercise is futile because I can’t send the results to anyone — unless they’re using an Apple device.



I see the rings easily on my few-years-old AOC 1440p monitor.


I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.

EDIT: The last comparison is webp twice, he linked it wrong. Here is the jpg one, still no difference:

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...



I checked those images on a Macbook 16 M2 Max (standard P3-1600 nits preset), Chrome 120.0.6099.109. All of the WebP images had pretty bad posterization, while JPEG examples did not.

Edit: You have to actually click for a full size image to see the truth. Those inline images had pretty bad compression artefacts, even the supposed lossless versions.

So https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (full size lossless WebP image) looks fine, but inline version of the same image https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... looks terrible.

Edit 2: The difference between...

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... lossy-noise.jpg (216 kB JPEG)

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (150 kB WebP)

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (301 kB WebP)

... is pretty obvious. Both of the WebP examples, even that 301 kB version, show clearly visible posterization.

I wonder if there's some issue with the WebP encoder (or the settings) he is using?

Edit 3:

It should be noted that monitor gamma and color profile might affect gradient posterization visibility.



> I wonder if there's some issue with the WebP encoder (or the settings) he is using?

I played around with online optimizers and IrfanView, which I had locally. IrfanView got the same results they did, no matter what else I tuned: obvious degradation at 90. Online optimizers were not even comparable in how bad they were.

edit: I found Squoosh [0], which has WebP V2 compression marked as unstable. It's far better, half the size of JPEG 90, but it's still degraded in comparison. Also, it saves as a wp2 file, which neither Chrome nor FF support natively.

[0]: https://squoosh.app/editor



They ceased development on WebP2... I don't think they could've come up with anything better than AVIF or JXL already have anyway.


The first link in your Edit 2 section (the JPEG) one is broken, it should be https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...


Thanks! Unfortunately I can't change it anymore.


> I wonder if there's some issue with the WebP encoder (or the settings) he is using?

He's re-encoding the JPEG compressed images. That is a huge mistake.



From the article:

> It’s not 100 % clean either, but much better. Granted, this is WebP re-encoding of an already lossy compressed JPEG, so we stack 2 steps of destructive compression. But this is what Google Page Speed insights encourage you to do and what a shitload of plugins enable you to do, while pretending it’s completely safe. It’s not.



Addendum:

Tried it with a Windows laptop connected to a Samsung LS32A800 32" 4k display. Laptop has factory default settings. Chrome 120. The monitor is pretty low end for a 4k model.

Monitor's picture settings: Custom, brightness 81, contrast 75, sharpness 60, gamma mode1 and response time fastest.

Switched between those three "Edit 2" images blindly, yet the issues are obvious also on this combination.

The JPEG version looks better compared to WebP ones. (Also, this goes against my prior general assumptions about JPEG vs WebP quality.)



The second image and the third image are half the resolution of the other. Yeah, some posterization is visible in Shoot-Antoine-0044-_DSC0085-lossless-1200x675.webp, but it's half resolution, and he purposefully added high-frequency noise for his test and then averaged the noise out through resizing, and well, of course it's blurry.


> I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.

One easy difference to spot is the background in this pair is posterized (https://en.wikipedia.org/wiki/Posterization) in webp but not in jpg:

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...



For clarity if anyone is still confused, on Wikipedia's example image, look at the snake's shadow - that's what's happening to the background in the blog's image.

I didn't know the word "posterization", so I'd describe this (slightly?) more simply as a stepped gradient instead of a smooth gradient.



At 50 y/o my eyesight began to fail and yet the differences in the pictures are freaking obvious. As in: it's impossible to not see how huge the differences are.

And many people commented the same. These simply aren't small differences.

People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.



> People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.

That's one possible conclusion. Another is that some people are overstating how obvious it is. I don't mean this as an insult - there's plenty of cases where people's stated perceptions and preferences disappear when tested under strict conditions (hello Audiophiles).

So - it's not immediately obvious whether claims such as yours are trustworthy.

(for the record I can see the difference but it's fairly subtle on my screen)



It's definitely an objective phenomenon but there's two factors at play: first is the monitor quality. I have two monitors of the same model number but made in different years with obviously different panels (color reproduction is all over the place between them), and the banding is obvious in one monitor but not the other. I can drag the window between screens and it disappears. On my iPhone, it's very obvious.

Second is how much each person's brain interpolates. I got used to those visual artifacts on the web in the early 90s so my brain started doing its own interpolation. It took reading the entire article and flipping tabs back and forth to compare images before I noticed the difference. Now I can't unsee it in other images that I recently converted to webp for a project.



> There is zero difference.

There is a clear difference though, I can see it in all my monitors, from desktop to laptop and even mobile. It's especially visible in the top right quarter.

That being said if you're not into photography you might just not care enough to see it



The first picture is very hard to spot imo. I had to zoom in a bit to spot it initially. You'll see the "blockiness" is slightly worse in the webp version. (Left side of the image, head height)

For the second image, I opened the jpeg 90 [1] and webp 90 [2] versions. Comparing those two, there are clear banding issues to the right of the neck. Slightly less visible are the darker bands circling around the whole image, though still noticeable if you know where to look.

Comparing the jpeg 90 version with either webp lossless, jpeg 100 or jpeg 95, I can spot some very slight banding in the jpeg 90 version just to the right of the neck. Very difficult to spot though without zooming in.

[1] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...

[2] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...



I don't see any difference either on Windows on either of my monitors.

I wonder if the author's issue is due to the author using a Mac. Back when I was at Google working on VR images, my work machine was a Macbook and my home machine was a normal Windows desktop. I realized that images looked worse on my laptop's screen because the native resolution of the display hardware was something like 4000 (numbers made up because I don't remember the specs) but the display was set to 3000. So OSX would incorrectly rescale the image using the wrong gamma curves. Since I was trying to calibrate VR headsets, I spent way too much time looking at gamma test images like https://www.epaperpress.com/monitorcal/gamma.html where a high res pure black + pure white grid is shown next to a set of grays. That was how I realized that my Mac was incorrectly resizing the graphics without being properly gamma aware. I also realized that if I set the OS resolution to 2000, it would use nearest neighbor instead of bilinear filtering and the gamma issue would go away. My Windows desktop had the OS running at the native resolution of the display so this wasn't an issue there. This also wasn't an issue if I had an external monitor hooked up to the Mac and set to its native resolution.

Apple users tend to say "it just works" which is true 90% of the time. But there are cases like this where it doesn't "just work" and there was no easy way to force the OS to run at its native resolution on that specific laptop.

Edit: I tested with the second set of images (the upper body shot) and the problems with the gradient are visible there. But I still can't see a difference when quickly flipping through the first pair of images on my properly calibrated native-resolution monitor. I _can_ see some banding on one of my monitors that was intentionally miscalibrated so that I could read text better.
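
The gamma-aware rescaling point is easy to reproduce; a rough sketch with Pillow and NumPy (the 2.2 exponent is a stand-in for the real sRGB curve, and the test image name is hypothetical):

    from PIL import Image
    import numpy as np

    img = Image.open("gamma-grid.png").convert("RGB")   # e.g. a 1px black/white checker

    # Naive: average the gamma-encoded sRGB values directly (a non-aware resize).
    naive = img.resize((img.width // 2, img.height // 2), Image.BILINEAR)

    # Gamma-aware: decode to linear light, average there, re-encode.
    lin = (np.asarray(img, dtype=np.float64) / 255.0) ** 2.2
    h, w, c = lin.shape
    lin = lin[: h - h % 2, : w - w % 2].reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))
    aware = Image.fromarray((lin ** (1 / 2.2) * 255.0).round().astype(np.uint8))

    # A black/white checker should average to ~186 (mid grey in linear light);
    # the naive resize lands near 128 and looks too dark.
    naive.save("naive-half.png"); aware.save("aware-half.png")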



It could also be a browser issue implementing webp. There's a decade-old bug in Chrome, where they're using the wrong color profile for CSS, so colors are brighter than in other browsers. It's extreme enough that one of the designers I worked with spotted it in passing just glancing at my Firefox window, which led down a rabbit hole finding the bug report.

https://bugs.chromium.org/p/chromium/issues/detail?id=44872

Total aside, y'know how people do things like make their smartphones greyscale (or at least mute the colors a bit) to reduce smartphone addiction? It wouldn't surprise me if these over-saturated colors were part of why Chrome got so popular so fast...



> I wonder if the author's issue is due to the author using a Mac.

It is not, since I tested positive on Linux. What post processing would any OS even do on an image when you view it in a new tab as one is meant to do for this tutorial?



I did the same, and it took me a long time to spot it, but in the upper-right corner you see circles in the WebP version. It's outside the centre of attention, so it's not that obvious. Actually, it wasn't until I saw the second picture and knew what to look for that I spotted this in the first picture.

It's not so easy to see if the browser zooms the image, so make sure to open the image and set zoom to 100%. I also need to keep my face fairly close to my screen (12" 1920×1080, so not that large).



I always zoom in on pictures on the web to see if the compression is good or if there are artifacts.


I agree, it's not a good example to lead with.

That said, in the context of showing off your photography I can understand considering these kind of artifacts undesirable, even though they're perfectly fine for a lot of other uses. On my own website I spent quite some time downgrading my mugshot to be as small as possible without too many artifacts – it's now 4.9K in WebP, vs. 9.2K in JPEG before. Maybe that was a tad obsessive though...

I do think the author doesn't quite appreciate that most people are not photographers, and that for most images quality doesn't actually matter all that much.



The same image rendered with different os/hardware will almost always look different.

Different operating systems and monitors have different default gamma curves for rendering brightness and black levels. Monitors are most likely either uncalibrated, or _can't be calibrated_ to render a greyscale with just 64 brightness levels distinctly.

TFA is calling attention to "posterization" in their portrait backgrounds. They expected the grey background to have a smooth gradient, but, depending on your monitor, you should see visual jagged stair-steps between different grey levels.

When an image uses a color palette that's insufficiently variable to render the original image colors with high fidelity, that's "posterization."

(I paid for my college doing high-end prepress and digital image services, and got to work with a ton of really talented photographers who helped me see what they were seeing)



I can readily tell the difference on the guy's forehead. The webp version has less dynamic range and looks like a big white spot, while the jpeg has more shades.


The gradients in the webp look clearly terrible to me. I'm using a normal 1440p monitor, nothing fancy


I thought it was pretty clear. I'm not even running any special monitor/computer setup. The light behind her is clearly different; it almost looks like a photo with different lighting.

4k Dell monitor, Safari on a Mac.



If I view the full images of the first two in two Chrome tabs, two Firefox tabs, or download them and open then both in Preview on a 27" 5k iMac and flip back and forth between the two I see nothing changing.

There is definitely something changing though, because if I open each in Preview, switch Preview to full screen, set the view to be actual size, and take a full screen screenshot, the screenshot for the WebP image is 14% smaller than the one for the JPEG.

If I use screen zoom to go way in and then flip between the two images I can finally see some changes. The JPEG background has more small scale variation in shade. In the hair there are some white streaks that aren't quite as long in the WebP. Lots of small changes in the shirt, but it is about 50/50 whether or not any given difference there looks better in the JPEG or the WebP.



This whole thread feels like one of those "I can tell the difference between an MP3 encoded at 320 kbit/s and one encoded at 256 kbit/s!" audiophile threads. Yes, there are probably people out there with well-calibrated ears who can, but I am sure not one of them. FWIW I have a 27" 5k iMac and can't even remotely see any difference between the images.


Lots of replies here saying either: "I can't see the difference" or "Wow the difference is stark".

My takeaway as a non-photographer is: "different tools for different uses". If you're posting photography where image quality matters then use JPEG or another format that you think displays the image best. If you're writing a blog post with screenshots or other images where minute quality doesn't matter that much then use WebP.



No, in both cases, use something that is better than JPEG and Webp: JPEG XL.


JPEG XL is great except is has virtually no browser support[1]

[1]: https://caniuse.com/jpegxl



JPEG XL is clearly superior in almost all contexts, but Google killed it, and Apple is trying to support it now. Unless Google reverses its stance, though, it will stay dead.


The thing that I like the best about jxl is how consistent the reference encoder is. If I need to compress an entire directory of images, cjxl -d 1.0 will generate good-looking images at a pretty darn small size.

Using mozjpeg (JPEG), or openjpeg (JPEG 2000), or cwebp, if I want to get even close (in bpp) to what cjxl does on the defaults, I have to use different settings for b&w vs color and line-art vs photos.
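
For reference, the batch workflow described above is just a loop over the reference encoder; a sketch assuming cjxl is on PATH and a hypothetical photos/ directory:

    import pathlib
    import subprocess

    for src in pathlib.Path("photos").glob("*.png"):
        dst = src.with_suffix(".jxl")
        # -d 1.0 is the "distance" (quality) setting mentioned above.
        subprocess.run(["cjxl", "-d", "1.0", str(src), str(dst)], check=True)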



The last time I checked, it was not possible to re-encode a JXL image into a JPEG image. Is this now supported?


There's a clear difference between the JPEG and WEBP versions. Especially on the background on the right of the man.

There are clear bands of various shades of grey that circle out of the brighter areas behind the face and from the mid-right edge. They appear to join about two thirds of the way from the middle to the right edge. That artifacting is most notable at full size, but is still visible at the smaller size on the web page.



Here is the diff: https://imgur.com/a/QT8oNqj

>> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.

webp is a banding nightmare.
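
A diff image like that is simple to produce yourself; a sketch with Pillow and NumPy, assuming you've downloaded the two variants under hypothetical names (they must decode to the same dimensions):

    from PIL import Image
    import numpy as np

    a = np.asarray(Image.open("portrait-jpeg-q90.jpg").convert("L"), dtype=np.int16)
    b = np.asarray(Image.open("portrait-webp-q90.webp").convert("L"), dtype=np.int16)

    diff = np.abs(a - b)
    print("max error:", diff.max(), " mean error:", round(float(diff.mean()), 2))

    # Amplify the error so the banding pattern becomes plainly visible.
    Image.fromarray(np.clip(diff * 16, 0, 255).astype(np.uint8)).save("diff.png")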



I can see a difference in the gradients, but in practical use on the average website does that even matter?

Photography portfolios are the one use case where having gigantic JPEG 90 images might make sense I suppose. Although everyone is going to get annoyed at your loading times.



It's because the author is linking to the wrong images.

See my post lower in this thread.

https://news.ycombinator.com/item?id=38656046



The author is complaining about the consequences of recompressing images, which are also black and white and have a huge gradient background, and also, the post is full of flaws. I don’t know, Hacker News is better as less of a Hacker Rants.


> which are also black and white and have a huge gradient background

That's the entire point of this article. Rather than picking a dozen different kinds of images at random, it considers the problem within the very specific context of actual photographs, made by actual professional photographers, with specific (yet not uncommon) artistic/stylistic choices.

It's like showing why an audio codec sucks for cellos. Yes, there is going to be a hundred other things you may want to record (like a podcast, a rock band, etc), and most of them will not be cellos, but still that doesn't change the fact that the codec sucks for cellos.



The author just makes a ton of mistakes. Many photographers competently shoot and store RAW, and many know better than to mass-convert low-quality JPEGs to WebP. It's HIS work; he can choose to make as few or as many mistakes presenting it as he likes. So I don't think he's representative of most photographers. It's a technical discipline.

I guess the more technically interesting POV would be to suggest a solution. Probably he should use the black and white profile with HEIF and serve the WebP only to search engines, using the modern image tag.

Or, you could put Y information in the unused UV plane for WebP. I guess you could also decompress the original JPEGs better for the purpose of conversion. While not for him, it takes about 100 lines of JavaScript to author a Mobile Safari-compatible image bitstream, which is very little. The MediaCodecs API is great.

Anyway, the rant elevated my knowledge very little. It was more like anti knowledge. Like if you were to integrate the rant into an LLM, it would produce worse recommendations.



> [...] many [photographers] know better than to mass convert low quality JPEGs to WebP.

Correct, but this is the workflow that the engineers behind WebP recommend, so I think it's entirely fair to pick on it.

> Anyway, the rant elevated my knowledge very little. It was more like anti knowledge.

Then perhaps you weren't the target audience. I'm not a photographer, and the rant has offered me a little bit more perspective.



You either have a bad screen or limited eyesight, it's quite funny to me that this is the most upvoted comment.

There's definitely very ugly "banding" going on in the gradients on the WebP versions, I say as someone who's worked extensively with UX and interfaces.

I'm on a M2 Macbook Air.



I'm looking at an LG UltraFine, which as far as I know, is not a bad screen, but I can't really tell.

I've read all the comments, and zoomed way in. I can see it on one of the pairs if I pay attention, but on most of them, I still am not sure how to even look for the difference.

Last time I had a vision check, I got a 20/15, which is supposed to be better than "normal". It may have declined since then.

I don't think it's a monitor or eyesight thing. I think I don't know "how" to look for the effect I'm supposed to be seeing.



He also screwed up the 4th and 5th images - one of the ones labeled "85% jpeg lossy" links to the webp.


It's your screen. Maybe we found the ultimate image compression method here- we all just need to use the same screen as you.


It could be partially a placebo effect. It's not like he is doing a blinded test.


It's not, it's just that people who spend thousands of dollars and hours on photography are more likely to care. Same with music: most people are fine with $15 earphones, while musicians or music enthusiasts will find them disgusting.


Music is probably a bad example of your point, as that field is famous for audiophiles insisting they can hear a difference for various things only for them not being able to tell the difference in a double blind test.


Just because there are some 'extreme' weirdos in the audiophile space, doesn't mean that there is no difference between cheap and expensive equipment.

While people might not be able to tell the difference between $50 and $5000 speaker cables, anybody will be able to hear the difference between $50 and $5000 speakers.



It's more like 64kbps vs 128kbps than copper vs gold cables, if you want to keep the analogy.


In my opinion the worst and most distinguishable downside of webp is the forced 4:2:0 chroma subsampling. On many images with bright colors you can clearly see the color and brightness loss without an educated eye.

On comparison [1] you can clearly see that the top right balloon has lost its vibrant red color. On comparison [2] the bright blue neon art on the center has lost its brightness.

[1] https://storage.googleapis.com/demos.webmproject.org/webp/cm...

[2] https://storage.googleapis.com/demos.webmproject.org/webp/cm...
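
The 4:2:0 effect is easy to simulate outside WebP; a rough Pillow sketch that halves the chroma planes of a hypothetical test photo and puts them back (not the actual WebP pipeline, just the subsampling step in isolation):

    from PIL import Image

    img = Image.open("balloons.png").convert("YCbCr")
    y, cb, cr = img.split()

    # 4:2:0 keeps one chroma sample per 2x2 block of pixels.
    half = (cb.width // 2, cb.height // 2)
    cb420 = cb.resize(half, Image.BILINEAR).resize(cb.size, Image.BILINEAR)
    cr420 = cr.resize(half, Image.BILINEAR).resize(cr.size, Image.BILINEAR)

    Image.merge("YCbCr", (y, cb420, cr420)).convert("RGB").save("balloons-420.png")
    # Luma is untouched, but small saturated details (a thin red edge, a neon
    # tube) now borrow color from their neighbours and lose their punch.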



Thank you for that link - it is detectable, but in my eyes negligible for website use. What about saturation?

I have to ask, what could be the reason this gives me pale blue (other colors are okay-ish) jpg > webp:

cwebp -pass 10 -m 6 -nostrong -sharp_yuv -quiet -q 60 -sharpness 2 $1 -o



Not to stir up yet another debate, but yeah, definitely not able to perceive the difference in either of the examples you linked. It would be helpful if that site let you drag the vertical comparison bar at least. On an iPhone 14 display.


I can see it in the second link, setting webp to small, in the orange reflections above the rightmost outside needle tree. ... oh, you can't drag it? ...


> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.

There surely must be better examples to show "non-educated" plebs (to use the tone of the post) why webp is bad and to justify the post and the tone.

I'm on Android, maybe this is why all the pics look the same quality to me?

Also - yeah, if you are making pics for educated eyes: don't use tech that is not suitable for educated eyes? Or don't outsource that decision making to others?



The author's point is that if you are making this tech, you should have educated eyes.

And given all the confident comments in this thread claiming the author is full of shit and there's no difference, I think their frustration is justified. If you can't see the difference in the first images, that's fine, but you probably shouldn't be confidently claiming to know better than the author, let alone designing an image codec.



There's room for different opinions.

His font choice is terrible for my legibility. Maybe for others it's great. But it made the already difficult article that much harder to read. And I like this topic. I already seriously question his sense of what is reasonable and good and for what purpose. His purposes are so alien to mine that his opinion ends up being pretty irrelevant to mine. I wish him well with his.

I can't see the things he's pointing out in the images, and I tried and tried.

I use webp extensively, there have been zero complaints from users about the images. But I don't make art sites. I make software people use to get stuff done. I don't transfer images above maybe 50-80k. Art, aside from modest marketing, is most definitely not the point.



> His font choice is terrible for my legibility.

There may be a connection [1].

If we assume some of the people designing codecs, that he curses in this piece, end up reading it, he may simply have wanted to make sure they do remember. ;)

[1] https://hbr.org/2012/03/hard-to-read-fonts-promote-better-re...



If you tried and couldn't see it, it might be, like others say, that it's more visible on certain monitors and setups. But then again - if you are designing codecs or choosing them, you probably want a monitor that makes it easy to see these things. I can see them on my old iPhone screen.

It reminds me of how you sometimes see a huge billboard with hideously strong ten-foot-wide JPEG compression artifacts. It was someone's job to make those, too.



> But then, again - if you are designing codecs or choosing them, you probably want a monitor that makes it easy to see these things

You keep bringing this up. I don't really care. Someone designing a codec may have put this apparent problem case on the don't-care list as well. I would be in general agreement with the designer's priorities for a reasonable web codec.

I have, with some care, selected webp as a general codec for web use on most of my sites. Nobody is complaining, and my page weights and development speed are improved. I don't have to fret between png+transparency and jpg to minimize asset size while maintaining its usability. I just use webp, and most of the time it's a size/speed win with good enough quality.

Not every codec needs to be artist and photographer approved.



The author's point is deeply stupid. As he admits:

> WebP re-encoding of an already lossy compressed JPEG

So... all this shows nothing. Is webp worse than jpeg? Not addressed. He re-encoded jpeg to webp and it somehow didn't magically cure the compression artifacts he's seeing! Who coulda thunk!

Any comparison starts with taking the originals, encoding to jpeg and webp, and comparing that. Or he could repeatedly encode original -> jpeg -> jpeg and compare that to what he has, which is original -> jpeg -> webp



Most of the comparisons are encoded from source. The one that isn't is because re-encoding is a specific recommendation from the services that they are criticising. They are specifically showing that yes, that's a bad idea.


Still, the author could do more to highlight the differences using zooms and annotations. The banding in the background is particularly strong, and highlighting it visually for the reader would help their point.


I too am on Android.

I was able to see it without full screening.

Look at the man with his face screwed up. Look at the edges of his shirt near his shoulders.

In the pictures that had bad image quality, there is a sort of glow around his shoulders, as if they are backlit.

In the pictures that had a good image quality, The gradient was smooth. There was no backlit glow around his shoulders; it just looked like a smooth gradient background image.

To be clear, I'm not a photographer. I'm a DevOps engineer. The last time I professionally wrote a line of JavaScript was at least 11 years ago.

It's easy enough to see.



See the discussion here [1], you need to view it full size to be able to tell.

[1] https://news.ycombinator.com/item?id=38653224



…so essentially WebP is fine for mobile devices and the vast majority of desktop web cases. I’m fine with WebP not being a suitable format for permanent storage of photography.


A close-up of the same zone in each image would make them visible. I could hardly see the artefacts in the first place, as my attention was caught by the highly contrasted parts of the images.


No, I can see it on Android without zooming in. Not well for sure, but it is there towards the corners.


For starters, anyone that has ever worked with a codec will know that you don't compare them with ONE SINGLE IMAGE.

The whole basic idea of the blog post is just to generate more whining and clicks, not to actually make a comparison between formats that passes a basic smell test.



This cuts against WebP more: all of Google's marketing was "it's a third smaller!!!!" and then when you looked, they were comparing it to unoptimized libjpeg output and using computational metrics like SSIM, which only crudely approximate what humans notice about image quality.

I did the same comparison the author did when WebP came out but used an optimized JPEG encoder, and reached the same conclusion: when you produced subjectively equivalent images, the savings were more like -10% to +15%, and for web sites which didn't get Google-scale traffic the performance impact was negative, since it made caching less effective and you had to support an entire new toolchain.
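
To illustrate the metric problem, here is a synthetic check with scikit-image, comparing a smooth gradient against a crudely posterized copy (a made-up pair, not actual WebP output):

    import numpy as np
    from skimage.metrics import structural_similarity as ssim

    smooth = np.tile(np.linspace(0, 255, 1024), (256, 1)).astype(np.uint8)
    banded = (smooth // 32) * 32 + 16     # collapse 256 levels into 8 visible bands

    print("SSIM:", round(float(ssim(smooth, banded, data_range=255)), 3))
    # The banding is glaring to the eye, yet a single global score like this
    # tends to stay flattering, which is why subjective comparison matters.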



In what way does anything "cut against" anything when you do a cherry-picked, single-data-point comparison?

There isn't a codec pair in this world where you can't make a cherry picked comparison where one of them is worse (I've done plenty of those).



Criticism of cherry-picking cuts against WebP because the marketing campaign for that codec relied on cherry-picking both the least optimized JPEG codec and the most favorable metrics for comparison. If you had humans comparing images or enabled JPEG optimization you saw far less exciting numbers for WebP - usually under 10% savings, not uncommonly negative – and there were other formats which consistently outperformed it. You can see the mood around that time here:

https://calendar.perfplanet.com/2014/mozjpeg-3-0/

Even a decade later, however, Google repeats the 25-34% claim and their performance tools tell developers they should use a modern format, which by sheer coincidence means the one they invented rather than the best ones on the market.



Except the problem isn't in a single image; it is a pattern that is frequently there, and the image was only used to demonstrate it. WebP has had this problem way back, and it's one of the reasons others besides Google were hesitant to support it.


It is basically the same with all On2 Media marketing. From WebP, VP8, VP9 to AV1. And it has been going on for over a decade.


This article didn't go into the biggest problem with webp for me: the inconvenience of the format outside the browser compared to the small space saving. There are better formats (the video-codec-inspired ones like heif, avif, and whatever might come out of h266, or even jpeg-xl), and webp just seems like a compromise without enough upside.


I feel your pain. Right-click, save as, and ... awww-goddamn it, another WebP >:|


My favorite is the URL ends with jpg but when you save the image you get a fucking WebP. Thanks everyone for breaking the Internet in the name of Google. The best.


I always screenshot them lol


WebP is actually based on a video codec. It's just that VP8 pretty much never caught on with hardware encoders/decoders apparently.


VP8 was never competitive so most of the energy went into VP9, which did beat H264.


It beat H.264 in terms of quality/size but not in terms of hardware support. This is why Google Meet is the laggiest video conference software, they keep trying to make VP9 a thing while the others stuck with H.264. And now there's H.265.


Google marketed it that way but I could never reproduce a meaningful size savings without noticeable quality loss. You need to serve a LOT of video before even the top-end 10% savings was worth it, especially if your traffic was spread across many items so doubling your storage cost cancelled out a fair chunk of the total. I have no doubt that YouTube saw a savings but I don’t know how many other sites did, and I would be curious what the savings was relative to the extra power used by the millions of client devices which could’ve streamed H.264 at 10% CPU versus having the fan on high.


If users don't have hardware accelerated video decoding, it's so bad that it actually hurts the experience. I can't imagine that being worth the space savings. There doesn't have to be a good reason YouTube does it, it might just be someone wanting to insert their tech, which I'm pretty sure is the reason Meet uses it.


I remember doing bluray re-encodes back in the day. x264 was simply better as an encoder when compared to vp8, and you knew that, at least in terms of software, everyone had a compatible decoder in their preferred codec pack.


Oh yes, with uh websites where you download said re-encodes, there'd always be a few uploads with weird encoding and the author screaming in the comments that it's better and you gotta use the bleeding edge VLC before complaining that it doesn't work.


Even worse than the original blog post: because of this you may be dealing with a JPEG image, converted to WEBP, and then back to JPEG. And then maybe someone edited that JPEG and it got converted back to WEBP!

A large chunk of the HN commenters are debating banding they can or can't see in a best-case-scenario WEBP image. The reality is that the bulk of WEBP images look horrible, something I've started to really notice only recently. Of course, you can "clean" the images using different generative upscaling processes now; it's pretty ironic how much electricity we are using because someone wanted to save 45 kB.

Also, this reminds me a lot of GIFs being converted to JPEGs. ~25 years ago there were a lot of nice, clean GIF screenshots (256 colors was all you needed) that got destroyed by JPEG.

Google tells developers to use WEBP but has no problem serving petabytes of video ads no one wants to watch!



Now let's talk about HEIF, an inconvenience inside and outside of the browser on desktop.


A bit of context: Aurelien Pierre is known to be a major contributor to Darktable (an open source raw developer / catalog; in other words, an open source Adobe Lightroom), and is known to have strong opinions about the correct way to do stuff, to the point of abrasiveness and to the point where he has forked Darktable into his own project (Ansel; see the HN discussion some time ago https://news.ycombinator.com/item?id=38390914 ).


Thanks for the info, going to have to check out Ansel. Do you know if it's still compatible with the Darktable formats?


I’m not sure what you mean by formats. It should support all the old raw/jpeg formats, or at minimum it has for me


If I cared about archive image quality (and I do), I wouldn't re-compress older images in a new format unless I could do so from uncompressed originals. Re-encoding from a lossy compressed source will make quality worse. Storage is cheap and getting cheaper.

What would make sense is choosing safe settings for compressing new photos in the new format.



> Re-encoding from a lossy compressed source will make quality worse.

JPEG-XL is supposed to reencode old JPEG files into 20% smaller files without quality loss though. In context, Google has been holding JPEG-XL back by removing support for it from Chrome and refusing to reinstate it, claiming that it did not have good enough "incremental benefits compared to existing formats" such as webp.



Careful with the JPEG-XL re-compression, though--depending on how you're re-encoding, jxl may use SSIM to evaluate for visual losslessness, and the whole point of TFA is that SSIM is blind to posterization, but (some) humans aren't.

Disk space is cheap. It's most likely not worth the 20% compression to lose your original images (and possibly lose metadata as well--it's quite hard to robustly retain all vendor-specific MakerNotes, for example).



JXL has the Guetzli lossless JPEG compressor integrated into the standard, so it produces reversible and completely standard-compliant JXL images that are 15-20% smaller. Reversible in the sense that you can still convert the image back to the original JPEG, a bit-exact copy of the input JPEG file (it takes care of all the metadata too - it has to).

Also, if you decide to forgo the reversibility, you can get a bit more out of it, as JXL is actually a superset of JPEG: it can read the JPEG stream and convert it to JXL without complete recompression - it will just use the more efficient structure of JXL and much more efficient (ANS vs. Huffman) entropy encoding. The additional savings compared to the reversible mode aren't big, however.
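
The reversible mode described above can be exercised from the command line; a sketch assuming the reference cjxl/djxl tools are installed (the --lossless_jpeg flag is reportedly the default in recent versions, spelled out here for clarity, and the file names are hypothetical):

    import subprocess

    subprocess.run(["cjxl", "--lossless_jpeg=1", "photo.jpg", "photo.jxl"], check=True)
    subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)

    # The reconstruction should be byte-identical to the original JPEG.
    with open("photo.jpg", "rb") as a, open("photo_restored.jpg", "rb") as b:
        print("bit-exact round trip:", a.read() == b.read())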



Wow, I didn't know that. A top google result says:

> It is possible to losslessly transcode JPEG images into JPEG XL. Transcoding preserves the already-lossy compression data from the original JPEG image without any quality loss caused by re-encoding, while making the file size smaller than the original.

I wonder how it does that and why JPEG didn't notice it could. I would re-encode to JPEG-XL, when supported. So then the situation isn't that WebP is so great but rather Chrome's not so great.



> I wonder how it does that

It's trivial to do: JPEG's last stage is compression via a Huffman code - a really ancient, not particularly effective compressor. You simply decompress that stage and recompress with something more modern, yielding better savings. StuffIt did it in 2005. PackJPG in 2006. Brunsli (a Google project!) in 2019 - and it was one of the inputs to the JXL draft. Lepton did it in 2016.

> and why JPEG didn't notice it could.

Oh, that's the best part - they did, all the way back in 1991. The JPEG standard allows you to choose for the last stage between Huffman and arithmetic coding - which is way more effective. Unfortunately it was patent-encumbered and its support is low. It yielded 10%-ish space savings, which wasn't worth the compatibility headache (it has the same extension and MIME type as a Huffman-encoded JPEG, so a webserver won't know if your browser supports it). If it had only used a different file extension, it would probably be the dominant format today.



Okay, but that isn't really the point. You can start from a perfect gradient saved as a PNG and you will still see that WebP has visible banding at -q100 while JPEG is visually transparent at -q90.
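
That experiment takes a few lines to reproduce; a sketch with Pillow and NumPy (the exact outcome depends on which libwebp/libjpeg builds and settings your Pillow wraps, so treat it as a starting point):

    from PIL import Image
    import numpy as np

    # A smooth horizontal ramp: 1024 px wide, all 256 grey levels.
    ramp = np.tile(np.linspace(0, 255, 1024), (256, 1)).astype(np.uint8)
    img = Image.fromarray(ramp, mode="L").convert("RGB")

    img.save("ramp.webp", quality=100)    # lossy WebP at its highest quality
    img.save("ramp.jpg", quality=90)

    for name in ("ramp.webp", "ramp.jpg"):
        row = np.asarray(Image.open(name).convert("L"))[128]
        # A visually transparent encode keeps close to 256 distinct levels in a
        # row; banding shows up as the row collapsing into far fewer steps.
        print(name, "distinct levels:", len(np.unique(row)))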


I think the author is focusing on the wrong thing. They focused on the difference in format, when they should have focused on the compression. Different image processing programs will have different compression even when set to the same number (eg "80").

I think for a truly meaningful comparison you'd need to test a variety of images, including full-color ones with busy backgrounds as well as these b&w studio portraits on a smooth gradient-type background, and test a variety of programs like ImageMagick, GraphicsMagick, sharp, Photoshop, whatever cloud offerings, etc.

The other issue I see is use case. If you're a professional photographer trying to upload full size full quality photos, maybe just don't compress at all so you know your creative / editing work is completely preserved. That use case is not the average use case of a website displaying a reasonably sized image of reasonable quality. For many situations a significantly smaller image might be worth having a more compressed image, and for many images the compression won't be as noticeable as it is in a full resolution professional studio photo with a large gradient type background.



I clearly have "non-educated eyes" as I can't see any meaningful differences personally.


It depends greatly on your device. On my work windows machine I can see a bit of banding. On my phone, it's worse. On my macbook, it's atrocious.


Like most folks you were probably simply looking at the foreground. The background around the edges of the shirt and the edges of the picture (depending on the image) noticeably change color from shot to shot without full screening it on my small Android 12 device.

It's artifacts made in the background of the image that this poster is complaining about.



My sight's both poor and uneducated, but looking again after the defects are pointed out, they're pretty stark.


Good for you. Once you've noticed the banding issue, you're cursed to see it everywhere.


Same here. Especially considering the ones supposedly "look like shit".

The whole thing reads like a not-so-subtle brag about how his mighty photographer's eye can spot details that mere mortals can't.



Your viewing environment will matter a lot. In a dark room with a bright monitor, the banding in the background of the example images is pretty bad (if you are looking for it). But if you have a laptop in a bright sunny room in front of a window causing back lighting, you probably won't be able to see it.


It's there. It's very noticeable once pointed out. It drastically distorts the images' 'softness' because of the harsh steps through the gradients. It does not appear as the artist intended for it to, which is the biggest issue.


Very interesting, I could clearly see the difference - even before reading. And I'm using a 9-year-old MacBook Air 11"... not bad, but not exactly high-end stuff.

Fascinating how perception differs.



The gradients on webp frequently look like video stills. Chroma subsampling reduces the density of available luminance approximations and the more heavily it's applied, the worse gradients look. High contrast high frequency details aren't affected much, but gradients can really suffer.


Like video, webp uses limited ycbcr, as opposed to jpeg which uses full ycbcr. This leads to grayscale jpeg looking perfect on monitors that use full rgb values, as opposed to webp which will have slight banding issues when displaying grayscale content.


I was going to say, it's not uncommon to see pretty bad banding in dark gradients with WebM/VP9, so this makes some sense.


> Chroma subsampling reduces the density of available luminance approximations

Chroma means color, and color subsampling is used to avoid taking information out of luminance channels because they are more important, so it is actually the opposite of what you are saying here.



https://www.google.com/search?q=gradient+banding+4:2:0

There simply aren't enough bits of precision in the luma encoding for good gradient support most of the time, chroma fills the gaps, and chroma subsampling produces artifacts.

Webp lossy only does 4:2:0

https://groups.google.com/a/webmproject.org/g/webp-discuss/c...

These problems would go away with 10-bit AIUI. AVIF supports 10 bit but WebP does not.
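For what it's worth, JPEG encoders at least let you pick the chroma subsampling, while lossy WebP is locked to 4:2:0. A rough shell sketch of the two settings side by side, assuming ImageMagick's convert is installed (file names and quality are placeholders):

    convert photo.png -quality 90 -sampling-factor 4:4:4 photo-444.jpg   # chroma kept at full resolution
    convert photo.png -quality 90 -sampling-factor 4:2:0 photo-420.jpg   # quarter-resolution chroma, the only option lossy WebP offers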



I think you're conflating a few different things. Chroma doesn't fill gaps; low-resolution chroma channels introduce artifacts of their own.

Subsampling is about spatial resolution; 10-bit color channels are about the quantization resolution of the values. Everything contributes to banding artifacts, which are just noticeable steps in values that are meant to be perceptually smooth, but the luminance channel is the most important, which is why it isn't subsampled.

These are fundamentals of image and video compression and not unique to webp.



So... why are we still having problems with banding in image compression? If anything, gradients should be the easiest things to compress in these images, because the compression algorithms work entirely in the frequency domain. Whatever is introducing banding here is adding more frequency coefficients and making the image bigger and worse at the same time.

Did Google/On2 just not notice that they were crushing every gradient they encode, or are all the common WebP encoders doing some kind of preprocessing pass that crushes gradients and munges luma?



I would guess the problem is that on a slow gradient, each individual block is very close to a constant. The tiny AC coefficients tend to be quantized away, resulting in a visible transition along block boundaries.

I thought the loop filter was supposed to help with this though.



WebP is encoded using limited-range YCbCr values, as opposed to JPEG, which uses full-range YCbCr values. When converting JPEG to WebP, there will be banding. Grayscale limited-range YCbCr, when converted to full-range RGB during display, will also have banding.

WebP really doesn't have a banding issue unless you convert from JPEG or display purely grayscale content.



Snarks at Safari for often not being instantly up to date with every rushed “web standard” from Google, then gripes about “Google monkeys” and the issues with…their rushed “web standard”. Pick your poison.


Every time I've used webp, I've been disappointed. And when I'm disappointed, I try jxl for giggles and find much better photo quality (especially fine gradients), at a much better file size.

Let's cut our losses, ditch webp and move to jxl.



> Every time I've used webp, I've been disappointed.

In what way?



I don't get it.

The author seems to care deeply about image quality, but also wants to squeeze out as many bytes as possible?

Bandwidth is cheap. If we are talking about photography as art, why would you be trying to scrape a few kB off in the first place?



The author is also a web designer who primarily uses WordPress. WordPress site owners these days put their site into PageSpeed Insights, that tool advises that images be converted to WebP, and then they demand their web guy do it. I imagine the author got tired of seeing images on their sites ruined but couldn't do anything about it, because that's what the clients want in order to tick off a box in PageSpeed Insights.


It's more nuanced than that: the author compares two lossy compressions and gives their opinion about which one is better.

It is not honest to say "use my compression algorithm, it is better" and then, when people point out that it is actually worse, to say "well if you care about quality, you should not compress anyway". It doesn't make the algorithm any better.



The repeated callouts to PageSpeed imply that they're concerned about search placement, which is understandable for the profession. If your site is bumped off the first page because Google doesn't like that you're still using JPEG, that's lost income for you.

It can also be an issue if a client asks for WebP. Do you give in, deliver a lower-quality image, and allow your art to be displayed in a degraded manner, losing future clients who think your photos look bad? Or refuse out of dignity and lose the current client?



Because not all countries have cheap or unlimited bandwidth


You missed the point he's making: WebP requires 30% more data to achieve the same dynamics as JPEG, so there's no real use for it.


Did he make that point? The only time he thought they were equivalent was when using lossless mode, which is not a reasonable comparison. He never actually compared webp at 30% more quality than jpeg.


He did, about halfway through:

> WebP [lossy, 96] is actually 39 % heavier than JPEG 85 plus noise for a similar-ish look on this difficult picture, and still not totally as smooth as the JPEG (there is still a tiny bit of ringing). It’s also 30 % heavier than JPEG 90 with simple Floyd-Steinberg dithering.



> "WebP is actually 39 % heavier than JPEG 85 plus noise for a similar-ish look on this difficult picture, and still not totally as smooth as the JPEG (there is still a tiny bit of ringing). It’s also 30 % heavier than JPEG 90 with simple Floyd-Steinberg dithering."


Because it's a substantial amount of effort to upgrade to the "new" tech, and he's showing that the "new" tech is actually worse than the "old" tech of reliable old jpeg.

> Bandwidth is cheap.

Labour is not. Just leave your jpegs as-is!



Hard to take this seriously with that obnoxious font that draws curlicues connecting letters like s and t.


I did learn from it that there's a CSS property for ligatures, and the blog has set it to discretionary ligatures.

https://developer.mozilla.org/en-US/docs/Web/CSS/font-varian...



So here’s what I don’t get about this post:

> this is WebP re-encoding of an already lossy compressed JPEG

Author is clearly passionate about imagery and quality, so why are they not re-encoding using the original file rather than a lossy copy?



> So, I wondered how bad it was for actual raw photos encoded straight in darktable. Meaning just one step of encoding.


There's pretty bad posterization in the background. If you can't see it, kick up your contrast. You don't need HDR levels of contrast to notice it.


The banding is SUPER monitor-dependent: it's noticeable on my 4K monitor, super apparent on a different monitor with a terrible LCD panel, and not at all visible on my iPad.

I wonder if the author took that into consideration.



Back in the early 2010's I had a cheap Dell laptop with a 6-bit panel and an integrated Intel GPU. Video on that device had incredible banding, almost all the time, because as I understand it, the Linux drivers were relatively immature and did not do any dithering. A few years later a driver update enabled dithering and the bulk of the problem went away.

As a video codec developer I was a little sad about that, actually. I had to start looking closer to see problems.



> not at all visible on my iPad.

That is indeed surprising. Is it an iPad or an iPad Pro? It is technically possible that your monitors only support 8 bits per channel while your iPad Pro supports 10 bits per channel (alongside the P3 color space) and the WebP file has a smooth gradient only when viewed at 10 bits or more. But I can't really believe that, as the original JPEG file still looks like 8 bits per channel and doesn't have any further color profile attached.



That wouldn't make any sense unless there's something else going on.

It could simply be an effect of brightness -- do you have your 4K monitor set to bright, while your iPad is much dimmer? (Remember Apple devices have adaptive brightness enabled by default as well.)



From my own experience, JPEG quality and compression efficiency can differ a lot depending on the encoder implementation. It would make more sense to compare specific encoders rather than formats in general.

In 2014 (WebP was released in 2010) Mozilla claimed that the standard JPEG format was not being used to its full potential [1] and introduced the mozjpeg project, which is still being updated [2]. I wonder how it compares today with current WebP implementations.

[1] https://research.mozilla.org/2014/03/05/introducing-the-mozj... [2] https://github.com/mozilla/mozjpeg
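If anyone wants to run that comparison themselves, a rough sketch using mozjpeg's cjpeg and libwebp's cwebp (both assumed to be installed; cjpeg doesn't read PNG, so the source is converted to PPM first, and the file names are placeholders):

    convert source.png source.ppm                      # cjpeg wants PPM/BMP input
    cjpeg -quality 85 -outfile out-mozjpeg.jpg source.ppm
    cwebp -q 85 source.png -o out.webp
    ls -l out-mozjpeg.jpg out.webp                     # compare sizes, then inspect both visually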



> As a photographer, I care about robustness of the visual output. Which means, as a designer, designing for the worst possible image and taking numerical metrics with a grain of salt.

I think it's kind of silly how the author pooh-poohs averages and demands that whoever works on compression algorithms should focus on the worst possible image. If you know anything about information theory, you know it is literally mathematically impossible to make a compression algorithm that always performs well in the worst possible case.



You're taking the bare definition of "worst". He was not talking about compressing random noise.


The type of image shown here is a common use case. There's no arguing that it's a statistically insignificant case.


Just give me a good ol' JPG. Or a PNG. Not everything is compatible with WebP yet, so when I want to feed in an image from Google Images, it doesn't work.


Is webp still relevant these days?

You can use picture/source/srcset to provide different image formats depending on browser support. avif for modern browsers, jpg for maximum compatibility. Means people with old browsers will either get lower quality or a few more bytes, but that seems like an okay tradeoff.



JXL for modern browsers, JPG for the rest would be a much better solution, especially if the source is JPG.


I can see some banding on the one labeled webp lossless. What gives? Is the banding in the source material? Are we using a different definition of "lossless" than i am used to?

Edit: i think maybe my browser is scaling the photo which is adding artifacts.

Edit2: maybe the thumbnails are scaled at different quality levels???



> maybe the thumbnails are scaled at different quality levels???

Agreed, the WebP lossless version looks pretty bad when scaled by the browser. And since virtually no website/device shows images at their native resolution these days, that's something to consider.

On the other hand, most people these days view websites on their phones, so those artifacts will be harder to see.



I don't even think it's that - it seems like it was scaled badly by the author of the post, not the web browser, and that he is not actually displaying the lossless version. If you click on it, it goes to the lossless version, but the version displayed on the page is not that version.


It's even worse than what you said: the tag has a srcset attribute with many possible values so different people may see different images depending on their browser's resolution. The one displayed to me was Shoot-Antoine-0044-_DSC0085-lossless-800x450.webp, which shows clear posterization at its native size as well as when it is further scaled down by the browser to 550x309.


Damn, between that and some people having wide-gamut monitors, no wonder everyone is fighting.

This almost feels like a troll post.



You have to open the images in a new tab to get the full res version. Then the webp lossless looks perfect.


I never gave it much thought until I started posting my 3D renders online. I began to find serious issues, especially around posterized backgrounds as the article mentions, a problem exacerbated by the vignettes that renderers offer.


The author might be right about the gradient shifts in images after conversion, but at the same time, most websites are not using such color-accurate images everywhere. Some images are logos and some have alpha channels. It is a fact that WebP files are lightweight assets to load, which reduces bandwidth consumption for the user and your server. So use WebP where it's needed to save some loading time and bandwidth, and use your preferred format where you want to show images as-is.

If you're planning to convert your images to WebP in bulk, I wrote a shell script; here's the link:

https://medium.com/@siddheshgunjal82/bulk-convert-images-to-...
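For reference, the core idea is just a loop over libwebp's cwebp; a minimal sketch (assuming cwebp is on your PATH; the extensions and quality setting are illustrative, not necessarily what the linked script uses):

    for f in *.jpg *.png; do
        cwebp -q 80 "$f" -o "${f%.*}.webp"
    done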



Adding my vote: the banding is appallingly obvious to me. A couple of questions over images being mixed up aside, this stuff is important.

Perception is psychological. And image formats are political.

Perhaps some truly do experience zero banding or artifacts.

But to the rest of us... "There are four lights"

https://www.startrek.com/en-un/news/the-four-lights



I wish Slack supported WebP. I end up saving the image, having to run "convert image.webp image.jpg", and then uploading the JPEG.
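A tiny shell helper makes that less painful; a sketch assuming ImageMagick is installed (the function name is made up):

    webp2jpg() { convert "$1" "${1%.webp}.jpg"; }
    webp2jpg image.webp    # writes image.jpg next to the original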


I wish websites didn't have webps, or the browser could auto convert when downloading


Also: Telegram, GitHub, probably more.

(GitHub works if you rename it to a .png or .jpg file, but it's a hack).



Further, with JPEG there is progressive JPEG, allowing an image to show up ASAP on slow connections instead of trying to load the whole thing all at once. When I'm on a 2G connection, I absolutely appreciate progressive JPEGs, though they are pretty rare in the wild (and pagetest doesn't even recognize them).
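If you want to produce them yourself, one option is jpegtran, which can rewrite an existing baseline JPEG into progressive scans without re-encoding; a sketch assuming libjpeg's or mozjpeg's jpegtran is installed (file names are placeholders):

    jpegtran -progressive -copy all input.jpg > output-progressive.jpg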


My issue with webp is that when it's animated, it seems random whether it gets treated as an image file like a gif or a video file. Any webp I save I have to convert to a real image file to ensure I can view/use it outside of a browser.


Webp is like usb-c in a way, multiple different capabilities in one package. Might sound good on paper, but gets annoying.


I now hope more people understand why I am pushing for JPEG XL, practically before anyone else on HN (apart from its authors).

One thing I want to state is that nothing presented here about WebP is new. These problems have been there since the beginning (2010). The real problem is, quote:

>>So there is a real issue with the design priorities of image algos from tech guys who clearly lack historical and artistic background, and don’t talk to artists

And their marketing.



"See the posterized ring in the background ?"

Nope. I'm looking at this on a 2k 38" ultrawide monitor, comparing the two images at 190% zoom and I have no idea what I am looking at. I literally can't see a point of difference between them at all. I know my eyes aren't great, but is the difference really that noticeable? What am I missing?



I might be missing something because I never delved into it, but my problem with WebP is I can't save images this way from my browser. Well, I can save them, but they don't show up when I try to view them on my system (Ubuntu Mate 20.04 on RPi4).


The problem is not the format, but the software/OS you choose to use. There are OSes that have system-wide image format libraries, and once a codec is installed, ALL apps gain the ability to use it. This was first done in the '80s, so if your Ubuntu 20.04 doesn't support such data translations, maybe it's time to switch to something else.


Might be the OS indeed. Luckily I can take screenshots and save them as JPG or whatever. No need to ditch Linux for me.


That's pretty weird. I'm on Ubuntu 23 and WebP images work the same as JPGs or PNGs.

Browsers like Chrome like to associate themselves with WebP for some weird reason, but file explorers, image editors, album viewers, and everything else support WebP just fine.

I don't know what you use, but I use Nautilus, Gnome Image Viewer, and Pinta/GIMP. Perhaps the three years of improved software support make the difference?



Yeah same. Huge annoyance. I just want to stick to the same-old universally-compatible file formats I've always enjoyed everywhere.


They don't show up on older Windows versions either. The file explorer needs some sort of library to handle .webp thumbnails correctly. I'm pretty sure you can install something on Ubuntu to make them show. Maybe try a different file manager?
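On recent Ubuntu releases there is a gdk-pixbuf loader for WebP that makes thumbnails and most GTK image viewers work; the package name below is my assumption and it may not be available on 20.04:

    sudo apt install webp-pixbuf-loader    # assumed package name; restart the file manager afterwards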


In general I've found that this shift to .webp breaks all the nice interoperability and composability we used to have with audio, video, and image files, since there seems to be zero interest in making sure that simple, familiar features still work.


Outside of photographers, how many people are looking at super high-resolution images on the web? Even images that might have high-resolution versions are usually converted to a shrunken image 600px wide to fit inside the website's theme scaffolding.

Is that really even worth shaving 15% off the file size? If bandwidth matters, websites should look to reduce the volume of useless stock images littering their templates.

WebP seems like a gift to Cloudflare and the other companies that do the heavy lifting of caching and serving millions of images across multiple sites. For users, it's at best indistinguishable from JPEG, and at worst an obstruction to saving images from the web.



Honestly, I would have agreed wholly with you until I spent a month volunteering in Kiribati. 2G/3G is the norm there and even a few KBs make a difference. It reminded me a lot of my childhood with 28/56k modems :/

Additionally, I believe countries like India, Pakistan, Bangladesh, ... are in a similar situation infrastructure-wise (please correct me if I am wrong), and so 1-2B people would benefit from a slimmer web.



Why aren't the competing images presented side by side? Having to scroll to examine them makes comparison very difficult, especially for those of us not blessed with an experienced photographer's eye.


Isn't this like anything else? No one-size-fits-all solution works for everything. If you are a photographer/artist and true, close-to-perfect rendering matters to you, don't use WebP as the format to present your images.


The simple truth is that JPEG is more than good enough and has ubiquitous support. There is no reason to switch to a different format and risk degradation or reduced interoperability for slightly smaller file sizes.


I don't understand fanatically chasing smaller image sizes when JPEG was good enough for the web of the 90's. There must be a different reason to throw some of the highest paid engineers in the world at WebP and it ain't generosity.


Google spent a large amount of money purchasing On2. WebP and WebM were a way to show shareholders that they were seeing benefits from the acquisition, and if you look at Google’s traffic volume you could make an argument that even a modest size reduction would pay for the engineering time.

The problem was that this was basically only true for the largest sites. If you’re YouTube or Netflix, it pays to optimize your video encoding but for most other sites the volume just isn’t there and the performance costs for anyone who uses a CDN cancel it out because you need a lot of traffic for each format before a 10-20% byte size reduction saves more time than the cache misses take.



Images on the web of the 90s were also low-res and generally didn't look very good.


Comparing with Beyond Compare:

https://imgur.com/a/xatzZt7

--

Hoping the conversion doesn't add extra noise, I converted them (with ImageMagick: `convert image.webp image.png`) and compared them (Beyond Compare doesn't support WEBP).

Of course I have a non-educated eye, as the article puts it, but if even with machine help I cannot see a difference in the light dithering, there must be something off.

The second photo (of a man) is clearer in proving the point. This should probably have been used as the first example in the article.
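ImageMagick's own compare tool can also flag the differences if you don't have Beyond Compare handy; a sketch assuming both versions were first decoded to PNG as above:

    compare -metric PSNR jpeg-version.png webp-version.png diff.png   # PSNR goes to stderr, diff.png highlights changed pixels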



Wow, had no idea BC did images. I've been using it for years!


The uncompressed WEBP image looks terrible to me with a lot of banding on Safari mobile. Did the author accidentally switch images or is Safari doing some “optimization”?


Lossless WebP is a good alternative to PNG. Why compare a lossless WebP photo to lossy anything?

I used to use PNG everywhere in openetg, so WebP is a welcome improvement that's greatly reduced asset size.

Perhaps the article should be "In defense of JPEG" but that wouldn't get the clicks



Just use mozjpeg and throw away webp.


Unless the OP is using an 8K monitor with professional color grading, I don't understand how he can say that some of these pictures are "looking like shit". They all look exactly the same to me on my regular 27" 1080p, on my 27" 2K, or on my iPhone.


Probably if you’re working a lot with photography compression artifacts start to become a real eyesore. Especially the first lower quality webp image does look like shit to me but I also realize a lot of other people would not consciously notice.

The banding is just not supposed to be there.



Easily visible on my air M1, 1080p gaming monitor and pixel 3


For what it's worth, the website itself also isn't great. I had to turn off Enhanced Tracking Protection mode to not get text that scrolled off the screen, and then was met with weird fonts.


So true. I still have to find out how to avoid color bleaching when converting to WebP.


Is this blog a joke/prank?

The images don't link to the correct filetype stated.

- "JPEG, lossy, 85 : 184 kiB" → links actually to a WebP file (https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...)

- "JPEG, lossy, 85 : 211 KiB" → links actually to a WebP file (https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...)

etc...

So when the blog tells you that JPEG is so much better quality, the "jpeg" image that's actually being shown is a WebP image.



It seems I have an uneducated eye by their standards, because I barely see any difference, which I'm happy to admit, but I think the author misses the point of webp completely.

The format is intended to bring down the file size of graphics in general, not high-level photography which accounts for probably 0.5% of the images on the internet.

This is a case of the best daily driver car won't be good enough for a race car driver.



Yeah this article comes off as almost idiotic to me. It is entirely irrelevant unless you're supporting high-quality photography on your site, in which case, yeah obviously you're going to be careful about how you compress your images.

For the vast majority of web images, use webp if it's smaller. Minuscule artifacts and judgy designers aren't going to get in the way.



Boy that ct ligature is distracting though.


How does the quality compare at the same file size? It seems like all the comparisons have fairly significant file size differences.


I just finished dealing with a very complicated pipeline for an online media management database. WebP is great except when it's not, and when it's not, it really sucks.

I'm going to go with a technical argument here instead of a subjective one, so there's no room for argument: WebP is billed as a replacement for PNG and JPG, and advertised heavily as being usable in both lossy and lossless modes for either. This is blatantly false. Alpha channel aside, PNG is, effectivelyᵗ, 24 bits per pixel, 8 bits for each of RGB. JPG is notably not; to make good use of compression in the frequency domain possible, pixels are usually converted from RGB to YUV/YCbCr. But JPEG lets you customize how this is done, and you can choose to use the default chroma subsampling of 4:2:0, upgrade to 4:2:2, or forego subsampling altogether and use 4:4:4 directly.

WebP is, experiments aside, always 4:2:0 in default/lossy mode (regardless of the tuning profile chosen). Screenshots, vector graphics, text w/ anti-aliasing applied, etc. look absolutely horrendous to the trained eye if converted from RGB or RGBA to YUV 4:2:0. WebP is unusable for png transcodes at any quality except in lossless mode.

I'm not hating on WebP - PNGs converted to lossless WebP are still a good bit smaller, at least for large sizes. But I absolutely despise how pathetically low and biased Google's benchmarks touting WebP as the be-all, end-all have been. And the toolchain is severely compromised, because you have to manually remember to specify lossless mode when compressing a PNG to WebP and that gets harder when it's an automated toolchain and the export is several steps removed from the input. And this becomes completely Mission Impossible™ when you have a lossless WebP and you want to generate a thumbnail from it because the heuristic is no longer "source extension is png" to determine if the output should be generated in lossless mode. IMO, the WebP toolchain *and all other toolchains like ImageMagick and libvips* should pass through the "lossless" property of WebP by default, because unlike with other formats, it tries too hard to be everything for everyone at once and will fall over on its face otherwise.
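To make that footgun concrete: with the reference cwebp tool the whole distinction hangs on one flag you have to remember to pass (a sketch, assuming libwebp's cwebp; file names are placeholders):

    cwebp icon.png -o icon.webp               # default mode: lossy VP8, chroma forced to 4:2:0
    cwebp -lossless icon.png -o icon.webp     # what a PNG-style graphic actually needs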

I said I wasn't going to talk about the subjective side, but I just want to say that even for tiny thumbnails, we've found that their WebP versions need to be generated with at least quality 90 to ensure they will all (regardless of source image) be usable on non-mobile devices (hi-dpi ameliorates but does not resolve the situation, it's just the fact that you see the pixels physically larger); the smoothing effect for detailed real-world photos (think warzone photos with smoke and haze in the air, odd lighting, etc) is way too extreme at lower qualities. Again, the quality:size ratio is still better than JPEG, but not to the extent that Google advertised it to be, but more importantly, if you took Google at its word you would find WebP to be altogether unusable to begin with.

(None of this was about converting already lossily compressed content into WebP; this is straight from source (where "source" is a lossless format like SVG, PNG, RAW, or something like a 24MP JPEG@Q95 being shrunk orders of magnitude) to WebP.)

I played around some with AVIF, HEIC, and JPEG XL. AVIF has some severe color management issues that need to be ironed out in the various toolchains; HEIC is a lot better in that regard, but its lack of compatibility now and in the foreseeable future just makes it a dead end. JPEG XL appears to be a really solidly built image codec with great potential, kneecapped primarily by adoption.

ᵗ palettization can, but does not have to, affect this



This is yet another reason why the WebP format has been deprecated, at least in these parts.


On my 14in Macbook Pro I CANNOT TELL THE DIFFERENCE AT ALL


The images inline in the blog are heavily compressed and look about the same. Click through to the actual demo files and the difference becomes obvious.

I can see the difference on my LCD monitor from at least six years ago. WebP really struggles with gradients. I wouldn't use lossy WebPs for photography websites. AVIF does a lot better (-25% at no perceivable quality loss), but completely messes up the brightness on my PC for some reason; I think that's a Firefox bug.

That's not to say WebP is necessarily a bad format. There are tons of images where it easily beats JPEG without quality degradation, but these images clearly show cases where it isn't.

Personally, I use lossless WebP to replace PNGs on websites, thereby maintaining lossless quality without the PNG overhead. Lossy WebPs (and JPEGs) need to be hand-checked, though.





