https://github.com/victorvde/jpeg2png

There's this rather obscure tool on GitHub from a decade ago called "jpeg2png" that's designed to remove JPEG artifacts. I've been using it for a decade now and nothing else even comes close to it in terms of quality. Even the expensive neural-network-based ones are considerably less effective than this tool.

The only real flaw is that some images that genuinely contain significant noise and are saved at low quality get smoothed out slightly. Importantly, though, this is not smoothing in the sense of blurring, but a reduction of the JPEG's DCT coefficients; in general the tool does not introduce any blur. I believe it works by tweaking the DCT coefficients to minimize noise and maximize coherence, which usually produces something very close to the original image.
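If I remember the README right (treat this as my recollection, not a description of its actual code), the trick is that any coefficient within half a quantization step of the stored value would have quantized to the same file, so the tool searches those intervals for the reconstruction with the least total (generalized) variation. A toy per-block sketch in Python of that idea, with made-up names and plain TV instead of the real cost:

[code]
import numpy as np
from scipy.fft import dctn, idctn

def deblock_block(quantized, qtable, iters=200, step=0.05):
    # Toy per-block version of the idea. 'quantized' is the 8x8 matrix of
    # quantized DCT coefficients read from the JPEG file, 'qtable' the 8x8
    # quantization table. The function name, step size and the plain-TV cost
    # are invented for illustration; JPEG's DCT normalization details are
    # glossed over, and the real tool optimizes total generalized variation
    # over the whole image at once, not block by block.
    center = quantized * qtable
    lo, hi = center - qtable / 2.0, center + qtable / 2.0  # allowed intervals

    coeffs = center.astype(float).copy()
    for _ in range(iters):
        pixels = idctn(coeffs, norm='ortho')
        # Forward-difference image gradients.
        gx = np.diff(pixels, axis=1, append=pixels[:, -1:])
        gy = np.diff(pixels, axis=0, append=pixels[-1:, :])
        mag = np.sqrt(gx * gx + gy * gy) + 1e-8
        # The gradient of the total-variation cost is minus the divergence
        # of the normalized gradient field (cyclic boundaries for brevity).
        px, py = gx / mag, gy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        pixels = pixels + step * div  # descend the TV cost
        # Project back: coefficients must stay inside their intervals.
        coeffs = np.clip(dctn(pixels, norm='ortho'), lo, hi)
    return idctn(coeffs, norm='ortho')
[/code]

The box constraint is what keeps the output faithful: the optimizer can only move to values the original encoder could itself have produced.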

The reason the smoothing happens for noisy images is presumably that the cheapest solution the optimizer finds is just to make every DCT coefficient slightly smaller. I wonder if this could be fixed simply by detecting when most DCT coefficients decrease, and leaving them unchanged in that case. When coefficients are being decreased legitimately, the reduction in cost should be significantly larger and the changes should be spread roughly evenly between increases and decreases, since the errors were introduced by quantization/rounding, which is not biased.
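Purely as a sketch of that check (the helper and the 0.9 threshold are invented; this isn't anything jpeg2png actually does):

[code]
import numpy as np

def accept_optimized(original, optimized, shrink_threshold=0.9):
    # Hypothetical guard: 'original' and 'optimized' are arrays of DCT
    # coefficients before and after the optimization. If nearly all of the
    # changed coefficients simply got smaller in magnitude (the uniform
    # "shrink everything" case), keep the originals; otherwise accept the
    # optimized values. The 0.9 threshold is arbitrary.
    changed = optimized != original
    if not changed.any():
        return original
    shrank = np.abs(optimized) < np.abs(original)
    frac_shrank = shrank[changed].mean()
    return original if frac_shrank > shrink_threshold else optimized
[/code]

If the quantization error really is unbiased, the shrink fraction should sit near 0.5 when the corrections are legitimate, so a threshold well above that should only trip on the uniform-shrinking case.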
