peanutLion Posted February 20, 2019

Has anyone else had this weird problem? Here are the four steps I'm taking:

1) I have an AP file open and it has only a Background layer. In the toolbar I select Force pixel alignment, Move by whole pixels and Snapping (represented by the magnet symbol).
2) I add an Unsharp Mask live filter layer (not nested, but it makes no difference here) and raise Radius to 5.0 px, Factor to 4 and Threshold to 0%. The image sharpens in the way you would expect.
3) Both layers are visible (a tick in each box). I right-click on the live filter layer's icon (or the Background layer's icon) and select Merge Visible.
4) The merged layer appears but it has lost the sharpening completely.

Am I correct in thinking that this is a major bug? I simply cannot sharpen.
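For reference, the Radius/Factor/Threshold controls in step 2 correspond to the classic unsharp-mask operation. The sketch below is a simplified toy model in NumPy, not Affinity's actual implementation; the `blurred` input stands in for a Gaussian blur of the image at the chosen Radius:

```python
import numpy as np

def unsharp_mask(image, blurred, factor=4.0, threshold=0.0):
    """Sharpen by adding back scaled high-frequency detail.

    `factor` scales the added detail; differences whose magnitude
    falls below `threshold` are ignored. A toy model only.
    """
    detail = image - blurred
    detail[np.abs(detail) < threshold] = 0.0
    return image + factor * detail

# A soft edge: the mask boosts its contrast (with the typical
# overshoot halos on either side of the edge).
img = np.array([0.2, 0.3, 0.7, 0.8])
blur = np.array([0.25, 0.4, 0.6, 0.75])  # stand-in for a Gaussian blur
print(unsharp_mask(img, blur))
```

This is only meant to show what the three sliders do conceptually; the merge bug discussed in this thread is about how the result is displayed, not about the formula itself.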
v_kyr Posted February 20, 2019

Works here on OSX. Which version and OS?

☛ Affinity Designer 1.10.8 ◆ Affinity Photo 1.10.8 ◆ Affinity Publisher 1.10.8 ◆ OSX El Capitan ☛ Affinity V2.3 apps ◆ MacOS Sonoma 14.2 ◆ iPad OS 17.2
peanutLion (Author) Posted February 20, 2019

Many thanks for the reply, v_kyr. Mac OSX 10.14.3, Affinity Photo 1.6.7.
v_kyr Posted February 20, 2019

I've tried on 10.11.6 with APh 1.6.7 and it works. I placed the Unsharp Mask live filter layer above, as you did (usually I let the configurable Assistant settings nest it down), and tried Merge Visible with either layer; it works.
peanutLion (Author) Posted February 20, 2019

Many thanks for your help, v_kyr. Here's what I've discovered: the sharpening is being reduced on merge, i.e. it's not disappearing completely! In other words, when I merge the two layers the merged layer is sharper than the Background layer by itself, but not as sharp as it is supposed to be!
v_kyr Posted February 20, 2019

Honestly, I don't see any real difference there, at least not for the test image I chose.

demo.mp4
peanutLion (Author) Posted February 20, 2019

Many thanks for the video. I have attached my own below to illustrate the problem.

Screen Recording 2019-02-20 at 22.22.04.mov
v_kyr Posted February 20, 2019

Maybe it's a macOS Mojave-related problem (?). Have you tried the 1.7.x beta version to see whether it behaves the same way there?
peanutLion (Author) Posted February 20, 2019

Thank you so much, v_kyr. I will try the beta version and post the result here.
peanutLion (Author) Posted February 20, 2019

The problem is still present in the latest AP beta. The fact that no one else has reported this makes me wonder whether I am doing something silly:

Screen_Recording_2019-02-20_at_23_47_27.mov
IanSG Posted February 21, 2019

16 hours ago, peanutLion said:
"The fact that no one else has reported this makes me wonder whether I am doing something silly:"

I can't rule out that possibility, but I must be doing the same thing because I can reproduce the problem! It's not losing all the sharpening in the merged image, but it's losing enough to be easily noticed.

20 hours ago, peanutLion said:
"In the tool bar I select Force pixel alignment, Move by whole pixels and Snapping (represented by the magnet symbol)."

This doesn't seem to have any effect - I get the same results with them switched on or off.

This is the original, sharpened image. And this is the merged version. The difference is more apparent before they're resized and uploaded (but that might just be because I can blink-compare them).

AP, AD & APub user, running Win10
IanSG Posted February 21, 2019

I tried playing with Filters, rather than Live Filter Layers. I used a different image but set the same sharpening values, i.e. Radius 5px and Factor 4.

This is what it looked like before I pressed "Apply". And this is what it looked like afterwards. This time the difference is much more apparent before the uploads!
peanutLion (Author) Posted February 21, 2019

Many thanks for showing your before/after pictures. I can definitely see the difference between them.

Having read other posts in this forum I have realised that Affinity Photo uses a low-res version of your image (something called a mipmap, whatever that is) when you view it at a scale of less than 100%. This is what causes the problem we are discussing here. The problem only arises in the case of live filter layers, or at least so I understand. I have tried using an HSL Adjustment layer instead of the Unsharp Mask live filter layer and the problem did not occur at any level of zoom.

On the plus side, I have also tried going through the four steps I detailed above (with an Unsharp Mask layer), but this time with the zoom at 100%. The merged layer then appeared exactly like the product of the unmerged Background layer and the unmerged Unsharp Mask layer.

On the minus side, I now have to work my new understanding into my workflow for editing portraits! In other words, how do I look at a large image that is at, say, a zoom of 20% (which it has to be at so that I can get all of it in the window) and still see the correct effect of the filter layer?
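For anyone else wondering what a mipmap is: it's a pre-computed pyramid of progressively half-sized copies of an image, so a renderer can pick a suitably small level for display at low zoom instead of resampling the full image every frame. A toy version can be sketched like this (my own illustration, assuming a square power-of-two image and plain 2x2 box averaging; real renderers, Affinity included, use more sophisticated filtering):

```python
import numpy as np

def mipmap_chain(img):
    """Build a toy mipmap: repeatedly halve by 2x2 box averaging.

    Assumes a square float image whose side is a power of two.
    """
    levels = [img]
    while img.shape[0] > 1:
        h, w = img.shape
        img = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        levels.append(img)
    return levels

levels = mipmap_chain(np.arange(16, dtype=float).reshape(4, 4))
print([lv.shape for lv in levels])  # [(4, 4), (2, 2), (1, 1)]
```

Each averaging step throws away pixel-level detail, which is why an effect that lives at the single-pixel scale (such as sharpening) looks weaker when a low-zoom mipmap level is shown.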
IanSG Posted February 21, 2019

8 minutes ago, peanutLion said:
"Having read other posts in this forum I have realised that Affinity Photo uses a low-res version of your image (something called a mipmap, whatever that is) when you view it at a scale of less than 100%. This is what causes the problem we are discussing here."

<expletive deleted>!! I'd forgotten about the mipmaps! It looks like there's still a bug though - the first set of images I posted were JPEG exports, not screen grabs. I've just tried again, doing everything at 100%, and the merged image has lost some of the sharpening.
John Rostron Posted February 21, 2019

2 hours ago, peanutLion said:
"On the plus side I have also tried going through the four steps I detailed above (with an unsharp mask layer) but this time with the zoom at 100%. The merged layer then appeared exactly like the product of the unmerged Background layer and the unmerged unsharp mask layer. On the minus side I have now got to determine how to work my new understanding into my workflow for editing portraits! In other words how do I look at a large image that is at, say, a zoom of 20% (which it has to be at so that I can get all of it in the window) and still see the correct effect of the filter layer?"

You are always recommended to assess sharpening at 100% zoom. You can move around the image with the Hand tool.

John

Windows 10, Affinity Photo 1.10.5 Designer 1.10.5 and Publisher 1.10.5 (mainly Photo), now ex-Adobe CC CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630
peanutLion (Author) Posted February 21, 2019

Thank you for the post, John, but I'm sensing a very big problem here. I might be wrong (and please do say so, somebody, if I am), but if I zoom the image to 100%, set the sharpening layer's sliders as I think the image deserves, and then zoom out to 20% so that I can see the whole of the image, the image now looks very different.

This is a problem when retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer. But with big files you can't do that and get an accurate representation of your editing (or so it seems to me).

Let's be generous and say that for changes made in a sharpening live filter layer at a zoom of 100% (as you say is recommended) we don't actually mind using the Hand tool to move around the image. What about another live filter layer, say Gaussian Blur? Sometimes we want to blur skin (perhaps to lessen certain types of imperfections in it) but we don't want to go too far. In a big file at 100% zoom we'll only see part of the face. To ensure our blurring is still acceptable we want to zoom out to 20% to get a good idea of what the ultimate viewer will see. It seems to me that we can't do this in Affinity Photo, because doing so will give us an incorrect representation of the blur. But please tell me I'm wrong if I am – because I really hope I am.
Old Bruce Posted February 21, 2019

8 minutes ago, peanutLion said:
"if I zoom the image to 100% then set the sharpening layer's sliders as I think the image deserves, when I zoom out to 20% so that I can now see the whole of the image, now the image looks very different. This is a problem in the case of retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer. But with big files you can't do that and get an accurate representation of your editing (or so it seems to me)."

I think you should try using the Navigator Studio and hit the hamburger menu for the Advanced features. You can set a location at 100% to "set the sharpening layer's sliders as I think the image deserves", then zoom out to the 20% whole-image view and snap back to the same 100% view at the same location. 'Tis great.

Mac Pro (Late 2013) Mac OS 12.7.4 Affinity Designer 2.4.1 | Affinity Photo 2.4.1 | Affinity Publisher 2.4.1 | Beta versions as they appear. I have never mastered color management, period, so I cannot help with that.
MEB (Staff) Posted February 21, 2019

Hi peanutLion, welcome to the Affinity Forums.

There will always be some differences between viewing an image at 100% (where image and screen pixels match 1:1) and viewing it at 20%, because when you set the zoom to 20% to fit the image on screen it must be resampled for display. Small details generated by filters/adjustments (namely noise, sharpening, etc.) will suffer in the resampling process and lose definition.

A Guide to Learning Affinity Software
peanutLion (Author) Posted February 21, 2019

Many thanks, Old Bruce, but the problem is not one of simply getting around the image easily; it's one of seeing different effects at different levels of zoom.
R C-R Posted February 21, 2019

45 minutes ago, peanutLion said:
"This is a problem in the case of retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer."

The problem with that is you can't be sure how the image will be rendered by whatever app the viewer will be using. For example, most browsers will scale images to fit within the web page by interpolating their actual pixels, much like resampling an image in Affinity does. A browser also may not use the same (or any) color management. Most browsers can also scale web pages up or down, and many can selectively scale images separately from text.

All 3 1.10.8, & all 3 V2.4.1 Mac apps; 2020 iMac 27"; 3.8GHz i7, Radeon Pro 5700, 32GB RAM; macOS 10.15.7 Affinity Photo 1.10.8; Affinity Designer 1.108; & all 3 V2 apps for iPad; 6th Generation iPad 32 GB; Apple Pencil; iPadOS 15.7
peanutLion (Author) Posted February 22, 2019

First of all, I hope someone will tell me I'm talking nonsense if I am. I'm not an expert; I'm here to learn how to use AP, so I'll happily be told I'm far off track. However, at the moment AP seems to have a major and rather bizarre flaw in its design.

Thank you, R C-R, for your comment. One app/computer/screen will display the same image differently to another app/computer/screen. That problem is not what I'm on about here. The problem I'm trying to resolve is entirely different.

Thank you, also, MEB for a comment that I assume is from Serif. You have given me an appreciation of why visual differences occur when we zoom an image in and out. I just find it hard to believe that this is acceptable. An image simply should not change in appearance as we zoom in and out in an image-editing app (except in the obvious way). I hope I am wrong when I say this, but the fact that AP causes this to happen is strange. I understand that the change is a side-effect of a technique AP uses to render quickly. It's just a shame we can't disable that technique.

Our problem is that in AP we can only be certain that an edit will produce a visual effect to the degree that we like if the zoom is 100%. But with a large file, say a head & shoulders portrait, 100% zoom may show us only part of the face on screen. It's like looking through a letter box, but from the inside. Mostly that is not how we want to edit an image.

The only workaround that I can think of is this (and, again, please tell me this is all wrong if it really is):
1) make a copy of a large file
2) change the copy's document size so that the whole picture fits on screen at 100% zoom
3) do all edits on the copy at 100% zoom
4) copy the filter/adjustment layers and paste them into the original large file
5) redo any brushwork.
John Rostron Posted February 22, 2019

1 hour ago, peanutLion said:
"The only workaround that I can think of is this (and, again, please tell me this is all wrong if it really is): 1) make a copy of a large file 2) change the copy's document size so that the whole picture fits on screen at 100% zoom 3) do all edits on the copy at 100% zoom 4) copy the filter/adjustment layers and paste them into the original large file 5) redo any brushwork."

This would work for any adjustment that does not involve pixel-level manipulation, which sharpening does! Sharpening should be the last thing you do before publishing your file at its final size/resolution. If you are preparing a file for print, get it to the size/resolution required by the printer, apply sharpening, and then print. Only then will you be able to see if your sharpening is right for that printed image. This is not a foible of Affinity, but of any photo software. Most photo software allows you to soft-proof your image before printing, but there is no guarantee that it will give you the final result you want.

John
R C-R Posted February 22, 2019

1 hour ago, peanutLion said:
"An image simply should not change in appearance as we zoom in and out in an image-editing app (except in the obvious way)."

How could all the pixels of an image larger than the screen or window it is displayed in be rendered without some kind of appearance-changing interpolation? There is only a certain, unvarying number of pixels available in the display itself, so if there are more pixels than that in the image, there is no way they can all be displayed at the same time.

For example, my iMac has a screen resolution of 2560 x 1440 pixels. Say I am working on an image that is twice that resolution at 5120 x 2880 pixels. To fit the entire image onto the screen, even in Full Screen mode using every one of those screen pixels for the image, each screen pixel has to represent 4 image pixels. A pixel can have only one color, so any color variation among those 4 image pixels will have to be averaged down to one color.
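The 4-pixels-into-1 averaging described above can be made concrete. In this sketch (my own illustration, using plain 2x2 box averaging as a stand-in for whatever resampling the app actually performs), a pixel-level checkerboard, exactly the kind of fine detail sharpening produces, averages away to a flat grey:

```python
import numpy as np

# A 4x4 "image": a one-pixel checkerboard, the finest detail possible.
img = np.array([[1.0, 0.0, 1.0, 0.0],
                [0.0, 1.0, 0.0, 1.0],
                [1.0, 0.0, 1.0, 0.0],
                [0.0, 1.0, 0.0, 1.0]])

# Fit it into a 2x2 "screen": each screen pixel averages a 2x2 block.
screen = img.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(screen)  # every entry is 0.5 - the pixel-level detail is gone
```

Any detail whose scale is smaller than the block being averaged simply cannot survive the trip to the screen, no matter how the one resulting color is chosen.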
peanutLion (Author) Posted February 22, 2019

Hello R C-R. Many thanks for the comment. You ask an interesting question ("How could all the … interpolation?") but I think with a little thought we can answer it well. Let's say we do this:

1) Open a large file, zoom to 50%
2) Go to Filters menu > Gaussian Blur (instead of using a live filter layer)
3) Move the Radius slider to 10px to make the blurring of the image obvious, hit Apply
4) Zoom out repeatedly (keep pressing control/command and the minus key)
5) Observe how the blurring effect seems not to change as we zoom out.

As we do step 4) we are repeatedly getting AP to represent a greater and greater number of image pixels in each screen pixel. Yet the blurring effect is not changing as we do so. If AP can ensure that the effect does not change as we zoom out more and more when we follow steps 1-5 above, clearly AP could in principle do the same thing when we employ a live filter layer. The fact that with a live filter layer the effect DOES actually change as we zoom out/in is, I think, an error of judgement by the people who made AP, for whose skills I otherwise have nothing but admiration, btw. Again, if I'm not making sense I'd like to be told so.

Many thanks also to John Rostron for your comment. I think we are saying the same thing.
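The observation that blur looks stable under zoom while sharpening fades can be illustrated numerically. In this toy 1D sketch (my own, using pair-averaging as a stand-in for a 50% zoom; Affinity's actual mipmap resampling is more sophisticated), a broad smooth gradient, the kind of feature blur produces, survives downsampling, while a one-pixel sharpening halo loses half its amplitude:

```python
import numpy as np

def downsample(x):
    """Average adjacent pairs of pixels, like a crude 50% zoom."""
    return x.reshape(-1, 2).mean(axis=1)

# A wide, blur-like gradient survives pair-averaging as a smooth ramp...
gradient = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7])
print(downsample(gradient))  # [0.05 0.25 0.45 0.65]

# ...but a one-pixel sharpening halo loses half its amplitude:
halo = np.array([0.5, 0.5, 0.5, 1.0, 0.5, 0.5, 0.5, 0.5])
print(downsample(halo))      # [0.5 0.75 0.5 0.5] - spike of +0.5 becomes +0.25
```

This is one plausible reason the two filters behave differently at low zoom: blur only contains features wider than the averaging window, whereas sharpening lives precisely at the scale the averaging destroys.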
firstdefence Posted February 22, 2019

OK, I'll be the dissenter: I don't really get what you think you're seeing. Whether I apply a Gaussian Blur as a destructive filter or a live filter, the effect for me is the same. The blur is the same; it doesn't maintain the level of blur when the image is larger, as the smaller image would blur to nothing. I think the effect is akin to having a printed blurred image: the blur is static and cannot be changed. You are standing a foot away from it, but if you start to walk backwards the image starts to look less blurred.

iMac 27" 2019 Sonoma 14.3.1, iMac 27" Affinity Designer, Photo & Publisher V1 & V2, Adobe, Inkscape, Vectorstyler, Blender, C4D, Sketchup + more... XP-Pen Artist-22E, - iPad Pro 12.9 (Please refrain from licking the screen while using this forum) Affinity Help - Affinity Desktop Tutorials - Feedback - FAQ - most asked questions