peanutLion

I am simply unable to get an unsharp mask Live filter layer to merge


Has anyone else had this weird problem?

 

Here are the four steps I'm taking:

1) I have an AP file open and it has only a background layer. In the tool bar I select Force pixel alignment, Move by whole pixels and Snapping (represented by the magnet symbol).  

2) I add an unsharp mask Live filter layer (not nested but it makes no difference here) and raise Radius to 5.0px, Factor to 4 and Threshold to 0%. The image sharpens in the way you would expect.  

3) Both layers are visible (a tick in each box). I right click on the Live filter layer's icon (or the Background layer's icon) and select Merge visible.  

4) The merged layer appears but it has lost the sharpening completely.
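For anyone following along, the standard unsharp-mask recipe is: blur a copy of the image, then push each pixel away from its blurred value by Factor, skipping differences smaller than Threshold. A minimal 1-D sketch of that recipe (my own toy code, using a box blur as a stand-in for the Gaussian an editor would actually use):

```python
def box_blur(signal, radius):
    # Box blur as a crude stand-in for a Gaussian of the given radius.
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, radius, factor, threshold=0.0):
    # sharpened = original + factor * (original - blurred), gated by threshold.
    blurred = box_blur(signal, radius)
    return [v + factor * (v - b) if abs(v - b) > threshold else v
            for v, b in zip(signal, blurred)]

edge = [0.0] * 5 + [1.0] * 5                      # a soft luminance step
sharpened = unsharp_mask(edge, radius=2, factor=4)
# The step's contrast is exaggerated (with overshoot a real editor would clamp).
```

With Radius 5px, Factor 4 and Threshold 0% the same per-pixel formula applies, just in 2-D.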

 

Am I correct in thinking that this is a major bug? I simply cannot sharpen.

 

 

 


I've tried on 10.11.6 with APh 1.6.7 and it works. I placed the unsharp mask Live filter layer above, as you did (usually I let it nest via the configurable Assistant settings), and tried Merge Visible with both layers visible. It works.


☛ Affinity Designer 1.6.1 ◆ Affinity Photo 1.6.7 ◆ OSX El Capitan


Many thanks for your help, v_kyr.

Here's what I've discovered:

The sharpening is being reduced on merge, i.e. it's not disappearing completely! In other words, when I merge the two layers, the merged layer is sharper than the background layer by itself, but not as sharp as it is supposed to be!

16 hours ago, peanutLion said:

The fact that no one else has reported this makes me wonder whether I am doing something silly:

I can't rule out that possibility, but I must be doing the same thing because I can reproduce the problem! It's not losing all the sharpening in the merged image, but it's losing enough to be easily noticed.

20 hours ago, peanutLion said:

In the tool bar I select Force pixel alignment, Move by whole pixels and Snapping (represented by the magnet symbol).  

This doesn't seem to have any effect - I get the same results with them switched on or off.  

This is the original, sharpened image: [Sharpened.jpg]

And this is the merged version. The difference is more apparent before they're resized and uploaded (but that might just be because I can blink-compare them).

[UnSharpened.jpg]


AP user, running Win10


I tried playing with Filters, rather than Live filter layers. I used a different image but set the same sharpening values, i.e. Radius 5px and Factor 4.

This is what it looked like before I pressed "Apply"

[Screenshot: Snap 2019-02-21 at 16:41:06]

 

And this is what it looked like afterwards.  This time the difference is much more apparent before the uploads!

[Screenshot: Snap 2019-02-21 at 16:42:59]


AP user, running Win10


Many thanks for showing your before/after pictures. I can definitely see the difference between them.

Having read other posts in this forum I have realised that Affinity Photo uses a low-res version of your image (something called a mipmap, whatever that is) when you view it at a scale of less than 100%. This is what causes the problem we are discussing here.

The problem only arises in the case of Live filter layers, or at least so I understand. I have tried using an HSL Adjustment layer instead of the unsharp mask Live filter layer and the problem did not occur at any level of zoom.

On the plus side, I have also tried going through the four steps I detailed above (with an unsharp mask layer), but this time with the zoom at 100%. The merged layer then appeared exactly like the combined result of the unmerged Background layer and the unmerged unsharp mask layer. On the minus side, I now have to work this new understanding into my workflow for editing portraits! In other words, how do I look at a large image at, say, 20% zoom (which it has to be at so that I can get all of it in the window) and still see the correct effect of the filter layer?
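To make the mipmap explanation concrete, here is a toy 1-D model (my own sketch, not Serif's actual rendering code). `full_then_down` is what the merged layer gives you: sharpen at full resolution, then downsample for display. `down_then_full` is roughly what a live filter previewed on a mipmap shows: the filter runs on an already-downsampled copy. The two disagree:

```python
def blur3(s):
    # 3-tap box blur with edge clamping (stand-in for a small Gaussian).
    return [(s[max(i - 1, 0)] + s[i] + s[min(i + 1, len(s) - 1)]) / 3
            for i in range(len(s))]

def sharpen(s, factor=4):
    # Unsharp mask: push each sample away from its blurred value.
    return [v + factor * (v - b) for v, b in zip(s, blur3(s))]

def half(s):
    # Average neighbouring pairs: one mipmap level / zoom-out step.
    return [(s[i] + s[i + 1]) / 2 for i in range(0, len(s) - 1, 2)]

edge = [0, 0, 0, 0, 1, 1, 1, 1]           # a hard luminance step
full_then_down = half(sharpen(edge))      # accurate: sharpen, then shrink
down_then_full = sharpen(half(edge))      # mipmap-style preview
```

In this toy model the mipmap-style preview overshoots more at the edge than the accurate result, so the zoomed-out preview can look sharper than the merged layer, matching what you're both seeing.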

8 minutes ago, peanutLion said:

Having read other posts in this forum I have realised that Affinity Photo uses a low-res version of your image (something called a mipmap, whatever that is) when you view it at a scale of less than 100%. This is what causes the problem we are discussing here.

<expletive deleted>!!  I'd forgotten about the mipmaps!  It looks like there's still a bug though - the first set of images I  posted were JPEG exports, not screen grabs.  I've just tried again, doing everything at 100%, and the merged image has lost some of the sharpening.


AP user, running Win10

2 hours ago, peanutLion said:

On the plus side I have also tried going through the fours steps I detailed above (with an unsharp mask layer) but this time with the zoom at 100%. The merged layer then appeared exactly like the product of the not merged Background layer and the not-merged unsharp mask layer. On the minus side I have now got to determine how to work my new understanding into my workflow for editing portraits! In other words how do I look at a large image that is at, say, a zoom of 20% (which it has to be at so that I can get all of it in the window) and still see the correct effect of the filter layer?

You are always recommended to assess sharpening at 100% zoom. You can move around the image with the Hand tool.

John


Windows 10, Affinity Photo 1.6.5.123 and Designer 1.6.5.123, (mainly Photo), now ex-Adobe CC

CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630


Thank you for the post, John, but I'm sensing a very big problem here.

I might be wrong (and please do say so, somebody, if I am), but if I zoom the image to 100%, set the sharpening layer's sliders as I think the image deserves, and then zoom out to 20% so that I can see the whole image, the image now looks very different. This is a problem when retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer. But with big files you can't do that and get an accurate representation of your editing (or so it seems to me).

Let's be generous and say that in the case of changes made in a sharpening live filter layer at a zoom of 100% (as you say is recommended) we don't actually mind using the hand tool to move around the image. What about the case of another live filter layer, maybe Gaussian blur? Sometimes we want to blur skin (perhaps to lessen certain types of imperfections in it) but we don't want to go too far. In a big file at 100% zoom we'll only see a part of the face. To ensure our blurring is still acceptable we want to zoom out to 20% to get a good idea of what the ultimate viewer will see. It seems to me that we can't do this in Affinity Photo because doing so will give us an incorrect representation of the blur.

But please tell me I'm wrong if I am – because I really hope I am.

8 minutes ago, peanutLion said:

if I zoom the image to 100% then set the sharpening layer's sliders as I think the image deserves, when I zoom out to 20% so that I can now see the whole of the image, now the image looks very different. This is a problem in the case of retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer. But with big files you can't do that and get an accurate representation of your editing (or so it seems to me).

I think you should try using the Navigator Studio and hit the hamburger menu for the Advanced features. You can set a location at 100% to  "set the sharpening layer's sliders as I think the image deserves" then zoom out to the 20% whole image view and snap back to the same 100% view at the same location. 'Tis great.


MacBook Pro (13-inch, Mid 2012) Mac OS 10.12.6 || Mac Pro (Late 2013) Mac OS 10.14.5

Affinity Designer 1.6.1 | Affinity Photo 1.6.7 | Affinity Publisher beta 1.7.0.337 | Affinity Photo beta 1.7.0.128 | Affinity Designer Beta 1.7.0.12


Hi peanutLion,
Welcome to Affinity Forums :)
There will always be some differences between viewing an image at 100% (where image and screen pixels match 1:1) and viewing it at 20%, because when you set the zoom to 20% to fit the image on screen it must be resampled for display. Small details generated by filters/adjustments (noise, sharpening, etc.) will suffer in the resampling process and lose definition.


Many thanks, Old Bruce, but the problem is not one of simply getting around the image easily; it's one of seeing different effects at different levels of zoom.

45 minutes ago, peanutLion said:

This is a problem in the case of retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer.

The problem with that is you can't be sure how the image will be rendered by whatever app the viewer will be using. For example, most browsers will scale images to fit within the web page by interpolating its actual pixels, much like resampling an image in Affinity does. It also may not use the same (or any) color management. Most browsers also can scale up or down web pages, & many can selectively scale images separately from text.


Affinity Photo 1.6.7 & Affinity Designer 1.6.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.6.11.85 & Affinity Designer 1.6.4.45 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.1.1


First of all I hope someone will tell me I'm talking nonsense if I am. I'm not an expert. I'm here to learn how to use AP, so I'll happily be told I'm far off track. However, at the moment AP seems to have a major and rather bizarre flaw in its design.

Thank you, R C-R, for your comment. One app/computer/screen will display the same image differently to another app/computer/screen. That problem is not what I'm on about here. The problem I'm trying to resolve is entirely different.

Thank you, also, MEB for a comment that I assume is from Serif. You have given me an appreciation of why visual differences occur when we zoom an image in an out. I just find it hard to believe that this is acceptable.

An image simply should not change in appearance as we zoom in and out in an image-editing app (except in the obvious way). I hope I am wrong when I say this, but the fact that AP causes this to happen is strange. I understand that the change is a side effect of a technique AP uses to render quickly. It's just a shame we can't disable that technique.

Our problem is that in AP we can only be certain that an edit will produce a visual effect to the degree that we like if the zoom is 100%. But with a large file, say a head & shoulders portrait, 100% zoom may show us only part of the face on screen. It's like looking through a letter box but from the inside. Mostly that is not how we want to edit an image.

The only workaround that I can think of is this (and, again, please tell me this is all wrong if it really is):
1) make a copy of a large file
2) change the copy's document size so that the whole picture fits on screen at 100% zoom
3) do all edits on the copy at 100% zoom
4) copy the filter/adjustment layers and paste them in the original large file
5) redo any brushwork.

 

1 hour ago, peanutLion said:

The only workaround that I can think of is this (and, again, please tell me this is all wrong if it really is):
1) make a copy of a large file
2) change the copy's document size so that the whole picture fits on screen at 100% zoom
3) do all edits on the copy at 100% zoom
4) copy the filter/adjustment layers and paste them in the original large file
5) redo any brushwork.

This would work for any adjustments that do not involve pixel-level manipulation, but sharpening does. Sharpening should be the last thing you do before publishing your file at its final size/resolution. If you are preparing a file for print, get it to the size/resolution required by the printer, apply sharpening, and then print. You would then (and only then) be able to see whether your sharpening is right for that printed image. This is not a foible of Affinity, but of any photo software. Most photo software allows you to soft-proof your image before printing, but there is no guarantee that it will give you the final result you want.

John


Windows 10, Affinity Photo 1.6.5.123 and Designer 1.6.5.123, (mainly Photo), now ex-Adobe CC

CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630

1 hour ago, peanutLion said:

An image simply should not change in appearance as we zoom in and out in an image-editing app (except in the obvious way).

How could all the pixels of an image larger than the screen or window it is displayed in be rendered without some kind of appearance-changing interpolation? There are only a certain, unvarying number of pixels available in the display itself, so if there are more pixels than that in the image, there is no way they can all be displayed  at the same time.

For example, my iMac has a screen resolution of 2560 x 1440 pixels. Say I am working on an image that is twice that resolution at 5120 x 2880 pixels. To fit the entire image onto the screen, even in Full Screen mode using every one of those screen pixels for the image, each screen pixel has to represent 4 image pixels. A pixel can have only one color, so any color variation smaller than that among those 4 image pixels will have to be averaged down to one color.
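That 4-into-1 averaging can be written out directly. A toy sketch (hypothetical pixel values, not AP's code): a 2×4 strip of image pixels reduced to 1×2 screen pixels, where a single bright pixel survives only as a dimmer average:

```python
def fit_to_screen(image):
    # Average each 2x2 block of image pixels into one screen pixel.
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4
             for x in range(0, len(image[0]), 2)]
            for y in range(0, len(image), 2)]

strip = [[0, 0, 1, 0],      # one bright pixel in a dark 2x4 strip
         [0, 0, 0, 0]]
print(fit_to_screen(strip))   # -> [[0.0, 0.25]]: the detail is diluted
```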


Affinity Photo 1.6.7 & Affinity Designer 1.6.1; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.6.11.85 & Affinity Designer 1.6.4.45 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.1.1


Hello R C-R. Many thanks for the comment. You ask an interesting question ("How could all the … interpolation?") but I think with a little thought we can answer it well.

Let's say we do this:

1) Open large file, zoom to 50%
2) Go to Filters menu > Gaussian blur (instead of using a live filter layer)
3) Move Radius slider to 10px to make blurring of image obvious, hit Apply
4) Zoom out repeatedly (keep pressing control/command and minus key)
5) Observe how the blurring effect seems not to change as we zoom out.

As we do step 4) we are repeatedly getting AP to represent a greater and greater number of image pixels in each screen pixel. Yet the blurring effect is not changing as we do so.

If AP can ensure that the effect does not change as we zoom out more and more when we follow steps 1-5 above, clearly AP could in principle do the same thing when we employ a live filter layer.
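There is a plausible reason the destructive blur looks stable while sharpening does not: a blur is a low-pass operation and roughly commutes with downsampling, whereas sharpening is high-pass and does not. A toy 1-D check (my own sketch, using a 3-tap box blur as a stand-in for a Gaussian):

```python
def blur3(s):
    # 3-tap box blur with edge clamping (stand-in for a small Gaussian).
    return [(s[max(i - 1, 0)] + s[i] + s[min(i + 1, len(s) - 1)]) / 3
            for i in range(len(s))]

def half(s):
    # Average neighbouring pairs: one zoom-out step.
    return [(s[i] + s[i + 1]) / 2 for i in range(0, len(s) - 1, 2)]

edge = [0, 0, 0, 0, 1, 1, 1, 1]
baked = half(blur3(edge))     # destructive blur at full size, then zoom out
preview = blur3(half(edge))   # blurring the zoomed-out copy instead
error = max(abs(a - b) for a, b in zip(baked, preview))
# error is small (1/6 here), so the two orders look nearly identical for a blur
```

Repeating the same experiment with an unsharp mask instead of a blur gives an error several times larger in this toy model, which would explain why the live sharpening preview misleads while a blur preview barely does.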

The fact that in the case of a live filter layer the effect DOES actually change as we zoom out/in is, I think, an error of judgement by the people who made AP, for whose skills I otherwise have nothing but admiration btw.

Again, if I'm not making sense I'd like to be told so.

 

Many thanks also to John Rostron for your comment. I think we are saying the same thing.

 

 

 

 


OK, I'll be the dissenter: I don't really get what you think you're seeing. Whether I apply a Gaussian Blur as a destructive filter or a Live filter, the effect for me is the same. The blur is the same; it doesn't maintain the level of blur when the image is larger, as the smaller image would blur to nothing.

I think the effect is akin to having a printed blurred image: the blur is static and cannot be changed. You are standing a foot away from it, but if you start to walk backwards, the image starts to look less blurred.


iMac 27" Late 2015 Fully Loaded, iMac 27" Mid 2011 both running High Sierra 10.13.6 - Affinity Designer/Photo, Publisher Beta 1.7.0.140, Illustrator CC, Inkscape, Blender, Sketchup, Pepakura Designer, MTC, Pixelmator & Pixelmator Pro + more... XP-Pen Artist-22E, - iPad Pro 12.9 B|  

Affinity Help - Affinity Designer (ADe) Tutorials - Affinity Photo (APh) Tutorials Instagram & Flickr

