Joachim_L Posted May 25, 2020

Attached you'll find a portion of an image where it seems that Unsharp Mask was applied too heavily (I cannot undo this; it is baked into the original image). Is there a simple technique to revert it?

------ Windows 10 | i5-8500 CPU | Intel UHD 630 Graphics | 32 GB RAM | Latest Retail and Beta versions of complete Affinity range installed
John Rostron Posted May 25, 2020

In the book Real World Image Sharpening by Bruce Fraser and Jeff Schewe (page 259), for Photoshop they recommend:

1. Add a blur layer at 100% opacity with a Gaussian blur of 0.2 radius.
2. Set the blend mode to Darken.
3. Repeat the Gaussian blur four more times.

I give this to you more-or-less verbatim. You will need to translate the Photoshop commands into Affinity ones.

John

Windows 10, Affinity Photo 1.10.5, Designer 1.10.5 and Publisher 1.10.5 (mainly Photo), now ex-Adobe CC. CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz. Graphics: 2047MB NVIDIA GeForce GT 630
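For anyone curious about the arithmetic behind those steps, here is a minimal numpy sketch of the blur-then-Darken loop for a grayscale float image. This is not Affinity or Photoshop code: `small_gaussian_blur` and `unsharpen_darken` are illustrative names, and the tiny 3x3 kernel is only a stand-in for the 0.2 px Gaussian.

```python
import numpy as np

def small_gaussian_blur(img):
    """Approximate a very small-radius Gaussian blur with a separable 3-tap kernel."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    padded = np.pad(img, 1, mode="edge")  # replicate edges so size is preserved
    # horizontal pass
    rows = padded[:, :-2] * k[0] + padded[:, 1:-1] * k[1] + padded[:, 2:] * k[2]
    # vertical pass
    return rows[:-2] * k[0] + rows[1:-1] * k[1] + rows[2:] * k[2]

def unsharpen_darken(img, rounds=5):
    """Repeatedly blur and composite with Darken (per-pixel minimum),
    which suppresses the bright halos left by oversharpening."""
    result = img.astype(float)
    for _ in range(rounds):
        blurred = small_gaussian_blur(result)
        result = np.minimum(result, blurred)  # Darken blend at 100% opacity
    return result
```

Because Darken takes the per-pixel minimum of the blurred and unblurred image, each pass can only pull bright halo pixels down, which is why this recipe targets light halos specifically.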
Gregory Chalenko Posted May 26, 2020

I would say that to revert the result of Unsharp Mask, you first need to know exactly how Unsharp Mask works. It subtracts a blurred copy of the image from the original and adds some amount of that difference back to the original:

Original - Blurred + Original = Unsharp Mask

From this we can reverse-engineer the operation to restore the original. I left the amount of Unsharp Mask out to avoid over-complicating the formula, so let's say it was applied at 100%. Solving the equation:

Original + Original = Unsharp Mask + Blurred
2 * Original = Unsharp Mask + Blurred
Original = (Unsharp Mask + Blurred) / 2

As you can see, you can't restore the original with 100% precision, especially if the Unsharp Mask used a small radius, but you should get a fairly good result if you blur the sharpened image you have and average the blurred copy with the sharpened image. The radius of the blur needs to be tuned individually in every case. You can also play with the opacity of the blurred layer as you average it with the sharpened image.
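This averaging can be tried in a few lines. A hedged numpy sketch with a 1D signal and a 3-tap box blur standing in for the Gaussian; note that we can only blur the sharpened image, not the original, so the recovery is an approximation, not an exact inverse.

```python
import numpy as np

def box_blur(x):
    # simple 3-tap box blur with edge replication, standing in for a Gaussian blur
    p = np.pad(x, 1, mode="edge")
    return (p[:-2] + p[1:-1] + p[2:]) / 3.0

rng = np.random.default_rng(0)
original = rng.random(64)

# Unsharp Mask at 100%: sharpened = original + (original - blurred)
sharpened = 2 * original - box_blur(original)

# Exact algebra would need blur(original); in practice we only have the
# sharpened image, so we blur that instead -- hence only an approximation.
recovered = (sharpened + box_blur(sharpened)) / 2.0

print(np.abs(recovered - original).max())  # residual error of the recovery
```

The recovered signal is measurably closer to the original than the sharpened one, but the residual error grows as the blur radius used here drifts from the radius of the original Unsharp Mask, which is why the radius has to be tuned per image.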
Jowday Posted May 26, 2020

12 hours ago, John Rostron said: "In the book Real World Image Sharpening by Bruce Fraser and Jeff Schewe [...] Set the blend mode to Darken."

Same technique explained here: https://shutterstoppers.com/fix-over-sharpened-images

The real tragedy is that when you need to UNSHARPEN an image and Google the term... D'oh!

"The user interface is supposed to work for me - I am not supposed to work for the user interface." Computer-, operating-system- and software-agnostic; I am a results-oriented professional. Look for a fanboy somewhere else. "When a wise man points at the moon the imbecile examines the finger." ― Confucius. Not an Affinity user or forum user anymore. The software continued to disappoint and not deliver.
Joachim_L Posted May 26, 2020 (Author)

Thanks to all for your good advice. I used the technique from Shutterstoppers, but added a bit of Clarity to the original layer, which in my opinion gave the best results.
John Rostron Posted May 26, 2020

7 hours ago, Gregory Chalenko said: "Original = (Unsharp Mask + Blurred) / 2 [...]"

The problem with your algebra is that 'Blurred + Original' is a single entity. You cannot disentangle the blur without using more complex deblurring (deconvolution) methods. The effect of oversharpening is to introduce halos, and that is why the other techniques suggested use a blending mode of Darken, since this will reduce or eliminate the bright halos.

John
John Rostron Posted May 26, 2020

Here is an example of oversharpening followed by unsharpening (from Fraser and Schewe). The image is of a Borage flower, which has hairy leaves. Here is the original: [image] An oversharpened version (Radius 9.2 px, Factor 4, Threshold 0): [image] And with five rounds of Gaussian Blur at 0.2 px with Darken blend mode: [image] Not perfect, but better than the oversharpened version.

John
Gregory Chalenko Posted May 26, 2020

7 hours ago, John Rostron said: "You cannot disentangle the blur without using more complex deblurring (deconvolution) methods."

I agree that without deconvolution it's impossible to do it properly, but with a larger radius in the source Unsharp Mask it can come close.

7 hours ago, John Rostron said: "[...] use a blending mode of Darken since this will reduce or eliminate the bright halos."

What about dark halos? Unsharp Mask produces both bright and dark edges, so both bright and dark ones need to be addressed.
John Rostron Posted May 26, 2020

2 hours ago, Gregory Chalenko said: "What about dark halos?"

True, but it is the light halos that tend to be the more intrusive.

John
Medical Officer Bones Posted May 26, 2020

I never really thought about reversing an unsharp mask, but I will play too. The only image editor I know of that can apply a filter "in reverse" is PhotoLine. So I opened the above image in it, applied an unsharp mask adjustment layer, and set the layer opacity to -100 (PhotoLine allows negative layer opacity values, which invert an adjustment layer's effect). On the left, the original. On the right, the oversharpened version. In the middle, the reversed unsharp mask filter (9.2 size). Unsharp mask destroys part of the original information through blurring, though; it can never be restored fully. The sharp highlight spikes remain and should be filtered out in post. That said, the inverted unsharp mask version does look improved.
John Rostron Posted May 27, 2020

11 hours ago, Medical Officer Bones said: "[...] applied an unsharp mask adjustment layer, and reversed the layer opacity to -100 [...]"

This would suggest that you are working on a live layer, so any effect is reversible. I'm sure that if you created a live unsharp mask layer in Affinity Photo, it would be similarly reversible. @Joachim_L's original problem was that the oversharpening was baked in (flattened) in his image.

John
Medical Officer Bones Posted May 27, 2020

8 hours ago, John Rostron said: "This would suggest that you are working on a live layer, so any effect is reversible. [...]"

I think you might have misunderstood my explanation: unlike Affinity Photo (or any other image editor I am aware of), PhotoLine's layer opacity setting ranges from -200% to +200%. It is the only image editor on the market with that option for layers: Photo, Photoshop, GIMP, Pixelmator, etc. all offer a range from 0% to 100%. A negative layer opacity setting allows any adjustment layer's effect or blend mode to be inverted. This novel concept of a -200% to +200% opacity range falls outside the usual sphere of experience of most users (whose familiarity with layer blending is generally limited to the 0% to 100% range), and outside the usual accepted paradigm of how layer-based image editors work.

I did indeed work with the flattened original image, and applied a live unsharp mask adjustment layer with the same (or similar) settings as mentioned by the OP. I then changed the layer opacity to -100%, which inverts (reverses) the baked-in unsharp mask effect (up to a point, of course, since it is a destructive effect which removes original data).
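Treating layer opacity as the blend factor in ordinary compositing makes the idea concrete. A small numpy sketch (not PhotoLine code; `composite` is an illustrative name) of why -100% opacity undoes an effect, using a toy additive "halo" effect for which the cancellation happens to be exact. For a real unsharp mask, which discards data through blurring, it can only ever be approximate.

```python
import numpy as np

def composite(base, layer, opacity):
    """Normal blend generalised: opacity outside [0, 1] extrapolates,
    so opacity = -1 pushes the image away from the layer's effect."""
    return base + opacity * (layer - base)

original = np.array([0.2, 0.5, 0.8])
halo = np.array([0.0, 0.1, -0.1])   # toy baked-in edge halos
baked = original + halo             # the flattened, oversharpened image

# Re-apply the *same* effect to the baked image and blend it at -100%:
reapplied = baked + halo
undone = composite(baked, reapplied, -1.0)
print(undone)  # -> [0.2 0.5 0.8], the original values
```

The cancellation is exact here only because this toy effect adds the same delta each time it is applied; a real unsharp mask re-applied to the baked image produces a slightly different delta, which matches the observation that the result looks improved but not fully restored.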
Gregory Chalenko Posted May 27, 2020

14 minutes ago, Medical Officer Bones said: "PhotoLine's layer opacity setting ranges from -200% to +200% [...]"

That's a cool approach! Exactly the same principles as in Blackmagic Design Fusion, the node-based compositing software.
Medical Officer Bones Posted May 27, 2020

1 minute ago, Gregory Chalenko said: "Exactly the same principles as in Blackmagic Design Fusion [...]"

Exactly! Node-based compositors generally have far more non-destructive and flexible options for image editing. When I explain this concept to users familiar with node-based editing, they generally understand immediately what I am talking about. It is a cool trick to have access to in your toolbox. I wish and hope that Affinity will consider implementing an expanded layer opacity range as well, because it is forward-thinking, rather than sticking with 25-year-old Photoshop concepts.
John Rostron Posted May 27, 2020

53 minutes ago, Medical Officer Bones said: "I did indeed work with the flattened original image, and applied a live unsharp mask adjustment layer [...]"

As far as I can tell from your explanation, you have a flattened layer (the original) and then, on top of it, a live unsharp mask layer. This differs from the OP in that you have separate layers for the background and the filter. I would guess that the opacity looks at the difference between the two layers and subtracts it from the combined effect twice. No wonder it can reconstitute the pre-sharpening state. However, this would not help the OP, who does not have this pre-sharpened state. How would your algorithm work on a sharpened and then flattened image? I'm not convinced that the difference calculated for the negative opacity has the necessary information to unsharpen an image.

John
Medical Officer Bones Posted May 27, 2020

In the simplest of terms: I used the OP's original image with the baked-in unsharp mask effect, and then applied an unsharp mask adjustment layer in reverse with negative layer opacity. That is exactly what the OP is trying to achieve: undoing/reversing an unsharp mask effect. But as I said, it cannot restore all the original information, because applying unsharp mask will always destroy some information.

*edit* I tested this method on other images with an obvious unsharp mask filter applied to them, and how successful it is depends on the particular image. In some instances going below -50% layer opacity became too fuzzy. In other images it did not really help much unless other adjustments were made afterwards, or some masking was added to protect details. To visually clarify the layer stack (notice the -100% layer opacity): [image]
Joachim_L Posted May 28, 2020 (Author)

Interesting comments on my simple question / problem. Thanks to all; I hope Serif will read this as well. Or perhaps the master of APhoto, @James Ritson, has another approach to my problem?