
peanutLion (Members, 18 posts)

  1. The problem I'm on about has more to do with something other than interpolation (if by interpolation we mean the manner in which AP decides how to represent many image pixels with far fewer screen pixels). Suppose we zoom out of an image containing just the original Background layer. Visually it's much as if we were walking backwards away from a print on the wall. The interpolation, however AP does it, seems to work well. Only when we zoom quite far out does the image look odd here and there. But that's what we would expect because, after all, AP has to show far fewer pixels than are actually available in the image, and it makes a decision about how to do that. No problems there. Hats off to AP.

     We get the same "walking away from a print on the wall" feeling if we zoom out from a solely visible merged layer instead. The interpolation again seems pretty good. Hats off again.

     The problem I'm on about arises in the case of composites, which AP displays using mipmaps. At zooms other than 100%, the composite alone is not a good representation of the edits we make in the layers that make up that composite (earlier I thought it was merged layers that were incorrect, but I now realise it's not). So the problem is not so much to do with the interpolation: what's causing it, it seems to me, is the low-res mipmaps and the way AP combines them with a composite's layers.

     Here's what Serif said on 7 Dec 2018, 2.28pm (moderator MEB, paraphrased): "Only 100% zoom gives you an accurate preview of how the filter will affect the image if flattened/merged with other layers or exported." [peanutLion: MEB is talking about 100% zoom of a composite] [peanutLion: MEB helpfully goes on:] "… If you create a merged copy you will get a more accurate preview of the filter because its effect is now baked into the image and was calculated using the original image dimensions … when merged, the filter effect is always calculated/baked using the original image (not the low-res versions - called mipmaps - that we use to display/render the image at zoom levels below 100% more quickly)."
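     To make the term concrete: a mipmap chain is just a series of progressively half-sized copies of an image, so the renderer can grab one close to the current zoom level instead of resampling the full-resolution file every frame. Here's a minimal Python sketch of the idea (the 2x2 box averaging is my assumption for illustration; Serif hasn't said exactly how AP builds its mipmaps):

```python
import numpy as np

def build_mipmaps(image):
    """Build a mipmap chain by repeated 2x2 box averaging: each level
    has half the width and height of the one before, down to 1 pixel.
    (The averaging scheme is an assumption for illustration only.)
    """
    levels = [image.astype(np.float64)]
    while min(levels[-1].shape[:2]) > 1:
        prev = levels[-1]
        h, w = prev.shape[0] // 2 * 2, prev.shape[1] // 2 * 2  # trim odd edges
        p = prev[:h, :w]
        # Average each 2x2 block of pixels into one pixel.
        levels.append((p[0::2, 0::2] + p[1::2, 0::2]
                       + p[0::2, 1::2] + p[1::2, 1::2]) / 4.0)
    return levels

# A renderer showing the image at, say, 25% zoom would pick the level
# nearest to the on-screen size rather than resampling the full-res file.
chain = build_mipmaps(np.random.rand(512, 512))
print([level.shape for level in chain])
```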
  2. I'm beginning to understand what's going on when we notice that the merged layer alone looks different to the composite alone. It's the merged layer that properly represents our edits, and it does so at any level of zoom. The composite only does so when the zoom is at 100%. I had incorrectly thought it was the other way around. So it seems to me that merging the layers of a composite is actually what we should be doing as we go along. Doing so will let us zoom by any amount and still see (in the merged layer) what our use of layers in the composite has actually achieved.

     Merging is a more attractive way of working than looking at the composite and then always zooming to 100% to see the effect of our use of layers (because if the file is large, then 100% zoom will put only a small part of the image in the app window, such as pretty much just the eyes in a head & shoulders portrait). It's still a pity, though, that we have to work like that! I wonder what other image-editing apps do. It would be interesting to know why AP's use of low-res versions of our image (i.e. mipmaps) in a composite at zooms below 100% leads to an inaccurate representation on screen of our use of the layers of the composite.
  3. Remember that the problem in question concerns a merged layer, not a composite layer. At virtually all levels of zoom, a merged layer looks different to the composite of layers from which it came. It is only at 100% zoom that the merged layer appears correct. This is what we are seeing in the short videos I posted above. One way of getting around this problem is never to perform a merge until the very last moment. However, this can mean we have to put up with slower performance, as a large number of composites slows down the editing (whereas a merged layer made up of those composites does not).
  4. Hello R C-R. Many thanks for the comment. You ask an interesting question ("How could all the … interpolation?") but I think with a little thought we can answer it well. Let's say we do this:
     1) Open a large file, zoom to 50%.
     2) Go to the Filters menu > Gaussian Blur (instead of using a live filter layer).
     3) Move the Radius slider to 10px to make the blurring of the image obvious, hit Apply.
     4) Zoom out repeatedly (keep pressing control/command and the minus key).
     5) Observe how the blurring effect seems not to change as we zoom out.

     As we do step 4) we are repeatedly getting AP to represent a greater and greater number of image pixels in each screen pixel, yet the blurring effect is not changing as we do so. If AP can ensure that the effect does not change as we zoom further and further out when we follow steps 1-5 above, clearly AP could in principle do the same thing when we employ a live filter layer. The fact that in the case of a live filter layer the effect DOES actually change as we zoom out/in is, I think, an error of judgement by the people who made AP, for whose skills I otherwise have nothing but admiration, by the way. Again, if I'm not making sense I'd like to be told so. Many thanks also to John Rostron for your comment. I think we are saying the same thing.
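     The destructive-versus-live difference can be reproduced outside AP too. If a 10px blur is baked into the full-resolution pixels and the result is then downscaled for display, the blur shrinks along with the image; if the same 10px radius is instead applied to an already-downscaled version, it covers far more of the scene. A small Python sketch of the principle (NumPy + SciPy; this is only an analogy, not AP's actual pipeline):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(0)
image = rng.random((400, 400))   # stand-in for a full-res photo
sigma = 10.0                     # fixed blur radius in pixels
scale = 0.25                     # viewing at 25% zoom

# Destructive filter: blur at full resolution, then downscale for display.
baked = zoom(gaussian_filter(image, sigma), scale)

# Live-filter-on-mipmap analogue: downscale first, then blur with the
# same pixel radius on the low-res version.
on_mipmap = gaussian_filter(zoom(image, scale), sigma)

# The two previews differ: a 10px blur on a quarter-size image covers
# four times as much of the scene as a 10px blur at full resolution.
print(np.abs(baked - on_mipmap).mean())  # clearly non-zero
```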
  5. First of all, I hope someone will tell me I'm talking nonsense if I am. I'm not an expert. I'm here to learn how to use AP, so I'll happily be told I'm far off track. However, at the moment AP seems to have a major and rather bizarre flaw in its design.

     Thank you, R C-R, for your comment. One app/computer/screen will display the same image differently to another app/computer/screen. That problem is not what I'm on about here. The problem I'm trying to resolve is entirely different.

     Thank you, also, MEB for a comment that I assume is from Serif. You have given me an appreciation of why visual differences occur when we zoom an image in and out. I just find it hard to believe that this is acceptable. An image simply should not change in appearance as we zoom in and out in an image-editing app (except in the obvious way). I hope I am wrong when I say this, but the fact that AP causes this to happen is strange. I understand that the change is a side-effect of a technique AP uses to render quickly. It's just a shame we can't disable that technique.

     Our problem is that in AP we can only be certain that an edit will produce a visual effect to the degree that we like if the zoom is 100%. But with a large file, say a head & shoulders portrait, 100% zoom may show us only part of the face on screen. It's like looking through a letter box but from the inside. Mostly that is not how we want to edit an image. The only workaround that I can think of is this (and, again, please tell me this is all wrong if it really is):
     1) Make a copy of a large file.
     2) Change the copy's document size so that the whole picture fits on screen at 100% zoom.
     3) Do all edits on the copy at 100% zoom.
     4) Copy the filter/adjustment layers and paste them into the original large file.
     5) Redo any brushwork.
  6. Many thanks, Old Bruce, but the problem is not one of simply getting around the image easily; it's one of seeing different effects at different levels of zoom.
  7. Thank you for the post, John, but I'm sensing a very big problem here. I might be wrong (and please do say so, somebody, if I am), but if I zoom the image to 100%, set the sharpening layer's sliders as I think the image deserves, and then zoom out to 20% so that I can see the whole of the image, the image now looks very different. This is a problem in the case of retouching portraits because we want to view the image as a whole on screen to see what our edits will look like to the ultimate viewer. But with big files you can't do that and get an accurate representation of your editing (or so it seems to me).

     Let's be generous and say that in the case of changes made in a sharpening live filter layer at a zoom of 100% (as you say is recommended) we don't actually mind using the hand tool to move around the image. What about the case of another live filter layer, maybe Gaussian blur? Sometimes we want to blur skin (perhaps to lessen certain types of imperfections in it) but we don't want to go too far. In a big file at 100% zoom we'll only see part of the face. To ensure our blurring is still acceptable we want to zoom out to 20% to get a good idea of what the ultimate viewer will see. It seems to me that we can't do this in Affinity Photo because doing so will give us an incorrect representation of the blur. But please tell me I'm wrong if I am – because I really hope I am.
  8. Many thanks for showing your before/after pictures. I can definitely see the difference between them. Having read other posts in this forum I have realised that Affinity Photo uses a low-res version of your image (something called a mipmap, whatever that is) when you view it at a scale of less than 100%. This is what causes the problem we are discussing here. The problem only arises in the case of live filter layers, or at least so I understand. I have tried using an HSL adjustment layer instead of the Unsharp Mask live filter layer and the problem did not occur at any level of zoom.

     On the plus side, I have also tried going through the four steps I detailed above (with an unsharp mask layer) but this time with the zoom at 100%. The merged layer then appeared exactly like the product of the not-merged Background layer and the not-merged unsharp mask layer. On the minus side, I have now got to work my new understanding into my workflow for editing portraits! In other words, how do I look at a large image that is at, say, a zoom of 20% (which it has to be at so that I can get all of it in the window) and still see the correct effect of the filter layer?
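     If I've understood the mipmap explanation correctly, that would be why the HSL layer previews fine: a per-pixel adjustment gives (near enough) the same result whether it runs before or after downscaling, whereas a neighbourhood filter like unsharp mask does not, because its radius is measured in pixels of whichever version it runs on. A rough Python illustration of the distinction (my own sketch, nothing to do with AP's code; a simple exposure-style multiply stands in for the per-pixel adjustment):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(1)
image = rng.random((400, 400))   # stand-in for a full-res photo
scale = 0.25                     # viewing at 25% zoom

def exposure(img, gain=0.5):
    """Point-wise adjustment: each output pixel depends only on the
    corresponding input pixel (standing in for something like HSL)."""
    return img * gain

def unsharp(img, radius=5.0, factor=4.0):
    """Neighbourhood filter: each output pixel depends on all the
    pixels within `radius` of it."""
    return img + factor * (img - gaussian_filter(img, radius))

# Point-wise op: the order of adjusting and downscaling doesn't matter,
# so a low-res preview stays faithful.
a = zoom(exposure(image), scale)   # adjust full-res, then downscale
b = exposure(zoom(image, scale))   # adjust the low-res "mipmap"
print(np.abs(a - b).max())         # ~0 (floating-point noise only)

# Neighbourhood op: the order matters a lot, because the radius covers
# a different slice of the scene at each resolution.
c = zoom(unsharp(image), scale)    # sharpen full-res, then downscale
d = unsharp(zoom(image, scale))    # sharpen the low-res "mipmap"
print(np.abs(c - d).max())         # large: the previews disagree
```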
  9. The problem is still apparent in the case of the latest beta of AP. The fact that no one else has reported this makes me wonder whether I am doing something silly: Screen_Recording_2019-02-20_at_23_47_27.mov
  10. Many thanks for the video. I have attached my own below to illustrate the problem. Screen Recording 2019-02-20 at 22.22.04.mov
  11. Many thanks for your help, v_kyr. Here's what I've discovered: The sharpening is being reduced on merge, i.e. it's not disappearing completely! In other words, when I merge the two layers the merged layer is sharper than the Background layer by itself but not as sharp as it is supposed to be!
  12. Has anyone else had this weird problem? Here are the four steps I'm taking:
     1) I have an AP file open and it has only a Background layer. In the toolbar I select Force pixel alignment, Move by whole pixels and Snapping (represented by the magnet symbol).
     2) I add an Unsharp Mask live filter layer (not nested, but it makes no difference here) and raise Radius to 5.0px, Factor to 4 and Threshold to 0%. The image sharpens in the way you would expect.
     3) Both layers are visible (a tick in each box). I right-click on the live filter layer's icon (or the Background layer's icon) and select Merge Visible.
     4) The merged layer appears but it has lost the sharpening completely.

     Am I correct in thinking that this is a major bug? I simply cannot sharpen.
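     For reference, the textbook unsharp mask formulation (I'm only assuming AP does something similar; the exact maths isn't documented) shows what the merge should preserve. With the settings from step 2, every edge should come out strongly exaggerated, so a merged result with no sharpening at all is easy to spot:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, radius=5.0, factor=4.0, threshold=0.0):
    """Textbook unsharp mask: add back the difference between the image
    and a blurred copy, scaled by `factor`. `threshold` suppresses the
    effect where the local difference is too small. This is a generic
    formulation, not necessarily AP's exact implementation.
    """
    detail = img - gaussian_filter(img, radius)
    if threshold > 0.0:
        detail = np.where(np.abs(detail) >= threshold, detail, 0.0)
    return np.clip(img + factor * detail, 0.0, 1.0)

# Radius 5.0px, Factor 4, Threshold 0% - the settings from step 2.
rng = np.random.default_rng(2)
img = rng.random((64, 64))
sharpened = unsharp_mask(img, radius=5.0, factor=4.0, threshold=0.0)
print(np.abs(sharpened - img).mean())   # clearly non-zero
```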