
toggle off layers - lag


Recommended Posts

If you have a pixel layer (a photo) with some adjustment layers above it (say, five Curves, HSL, and Vibrance), and the underlying image is a single pixel layer only:

 

toggling off the layers above (shift-select, then toggle off) should instantly reveal the unaltered image below.

Instead it takes some time, half a second or so.

 

In theory Affinity should not have to do any processing here; it can simply display the image below.

 

This would make before/after comparisons much faster and more enjoyable.

 

cheers 

 

PS

This may be related:

https://forum.affinity.serif.com/index.php?/topic/44671-snappy-popups/

 

 


On a related issue:

 

If you place a 100% opaque image/pixel layer at the very top,

 

Affinity still calculates all the potentially expensive live filter layers below,

 

although they obviously have no visible effect.

 

This makes before/after comparisons more painful as well.

 

 


Can you post an example file showing this ~½ second delay? I could not duplicate it on my Mac with some 2448x3264 px photos from my iPhone & a variety of adjustments, including some radical Lab curve lightness channel adjustments that take 1+ seconds to render when enabled. Selecting all the adjustments & toggling them off at once very nearly instantly shows the unaltered photo -- whatever the delay is, it is too short for me to notice.

All 3 1.10.8, & all 3 V2.4.1 Mac apps; 2020 iMac 27"; 3.8GHz i7, Radeon Pro 5700, 32GB RAM; macOS 10.15.7
Affinity Photo 1.10.8; Affinity Designer 1.10.8; & all 3 V2 apps for iPad; 6th Generation iPad 32 GB; Apple Pencil; iPadOS 15.7


I think it comes down to how clicks are handled.

 

I just noticed that in another application, clicks are accepted on the mouse-down (the "push" of a click), which makes it more responsive.

 

In Affinity this only happens in a few dialogs; most dialogs have the "slower" behavior.

 

Some dialogs are very fast, always very fast.

slow.mov

Some can be infinitely slow (yes, I've exaggerated the effect a bit :))

 

 

 

Hopefully you can reproduce this, though:

"

On a related issue:

 

If you place a 100% opaque image/pixel layer at the very top,

 

Affinity still calculates all the potentially expensive live filter layers below

"

 

 


It would really help if you could provide an example file showing the behavior you mentioned in your first post and/or the behavior shown in the second post.

 

Regarding how clicks are handled, apologies if you already know this but on Macs (& I assume equivalently on Windows) a click consists of at least two distinct events, for example "mouseDown" & "mouseUp" for a simple left button click. (I am using the terminology of the Responding to Mouse Events section of this Apple Developer document.) But even for the left button other events must also be considered, like "mouseDragged" or "mouseMoved."

 

The result is that while triggering something immediately on a mouseDown event is more responsive, it also means users can't press the button & move the pointer off the targeted item (like the Adjustments button in the Layers panel) before releasing the button if they do not really mean to trigger whatever that item does. For most users this is not much of an issue, but consider users with disabilities (or anyone else) using mouse-related options in System Preferences > Accessibility; or people (like me) who have a pointing device with multiple, programmable buttons & have a button set to (in effect) split mouseDown & mouseUp events into two separate click & release button presses -- useful for avoiding aggravating RSI problems by eliminating the need to keep the button pressed.
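To make the two click policies concrete, here is a minimal sketch in Python (hypothetical names, not Apple's or Serif's actual event code) of firing on mouse-down versus the conventional arm-on-press, fire-on-release behavior that permits the drag-off cancel described above:

```python
class Button:
    """Toy model of a UI button under the two click policies discussed
    above: fire on mouse-down ("snappy"), or arm on mouse-down and fire
    only if the mouse-up also lands on the control (cancelable)."""

    def __init__(self, fire_on_press=False):
        self.fire_on_press = fire_on_press
        self.pressed = False   # armed by a mouse-down inside the control
        self.fired = 0         # how many times the action has triggered

    def mouse_down(self, inside):
        if not inside:
            return
        if self.fire_on_press:
            self.fired += 1      # snappy: act immediately on press
        else:
            self.pressed = True  # arm only; wait for the release

    def mouse_up(self, inside):
        # Conventional behavior: fire only if the release also lands on
        # the control, so dragging off before releasing cancels the click.
        if self.pressed and inside and not self.fire_on_press:
            self.fired += 1
        self.pressed = False
```

With `fire_on_press=True` there is no way to back out of a press, which is exactly the accessibility trade-off described above; with the default policy, releasing outside the control cancels the click.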

 

There is more to it but basically what I am saying is for me & probably for some others, the immediate mouseDown response of (for example) the Adjustments button is the undesired behavior rather than the other way around.



Here is another clip showing the slowdown when toggling on a 100% opaque layer:

slow2.mov

 

1. You click correctly much more frequently than incorrectly.

Thus the snappy behavior is a benefit 99% of the time, and a false click can be corrected with one Cmd+Z keypress in the other 1%.

(It cannot if you enter HDR Merge, for example, but that is another Affinity usability issue and could easily be fixed so that it accepts Esc or Cmd+Z.)

2. Affinity does not support many accessibility features anyhow, so it would not be significantly better or worse either way.

2b. The device you are describing sounds really special, to say the least. On the other hand, many people use tablets to avoid the same issue, and a tablet would work even better, as described in (3).

 

3. A further benefit of the "click down = click" behavior seen in the Adjustments panel is that you can click+drag to get the adjustment instead of needing two clicks; this is especially good on a graphics tablet.

4. Even if the Layers panel supported something like Photoshop, where you click and drag down to toggle a series of layers, this behavior could still be implemented without interfering, so the distinction between mouse-down and mouse-down+mouse-up is not necessary. (Maybe one or two buttons in the whole UI depend on it, but the rest does not.)

 

 


It would really help if you could post an example Affinity file (not a movie) showing the delayed behaviors. I can't reproduce what the movies show but I have no way of knowing if my file sizes, choice of adjustments, etc. have anything to do with that.

 

As for the mouseDown stuff, Affinity supports more accessibility features than you might think, & it should support as many as possible for what I hope are obvious reasons. "Special devices" include not just multi-button programmable ones but things like eye-tracking switches, blow tubes, & so on. (Check out the Switch Control & Dwell Control sections of System Preferences > Accessibility to get an idea of what I mean).

 

I really don't want to get into an argument about that aspect of it but also please consider that from a programming perspective this is much more complicated than it seems. The Apple Developer document & the related links make this clear. For example, the responder chain, window context, error handling, conformity requirements, object hierarchies, & so on all play a part in determining what can or should be done at either the system or application level, & more importantly what can cause undesirable side effects like application crashes, conflicts with third party add-ons, or data loss.

 

So it is not just about certain functions or icon driven actions; it is also about modality, inter-application communications, file access arbitration, & all the other arcane "under the hood" stuff that determine how responsive an app actually can be.



Have you actually created a document that takes noticeable time to compute by itself, and overlaid an opaque layer? It happens every time on 4 Macs, so it is definitely not a one-off.

 

Professional devs who can figure out how to use the Metal API and non-trivial math can probably also figure out how to deal with key presses.

 

 

 

 


3 minutes ago, MBd said:

have you actually created a document that takes noticeable time to compute by itself, and overlaid an opaque layer?

I am not sure exactly what you mean by this but as I have said more than once now, I cannot duplicate the delay behavior on my Mac. If you could just provide an example file it would make it a lot easier to see if I get the same results as you do.



8 hours ago, MBd said:

again, this should be instant every time

toggle.afphoto

Thanks for the file. I can reproduce a delay with it; however, I noticed it is dependent on the display size (zoom level) & AP's "View Quality" & "Retina Rendering" performance settings, with much shorter delays for small display sizes & the two faster but lower quality performance settings.

 

This leads me to believe that toggling on & off an 'upper' layer does not simply 'reveal' the layers below; that the displayed image is not rendered on screen layer by layer or buffered as a stack of images in memory but instead is rendered anew in real time each time there is a change in the display. Keeping a stack of all possible images in memory would quickly eat up all the memory available to the app for large (many megapixel) files, so I am not sure there is much they can do to make this an 'instant' process.
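A rough back-of-the-envelope calculation supports the memory concern. Assuming (hypothetically) 16-bit RGBA composites at the example file's 8000x4500 px canvas size, caching one fully rendered composite per layer boundary adds up quickly:

```python
# Back-of-the-envelope cost of caching one fully rendered composite per
# layer boundary. All numbers are illustrative assumptions, not
# Affinity's actual internals.
width, height = 8000, 4500      # canvas of the example file, in pixels
bytes_per_pixel = 4 * 2         # RGBA at 16 bits per channel
layer_count = 10                # one cached composite per layer

one_composite = width * height * bytes_per_pixel   # bytes per cached frame
total = one_composite * layer_count

print(one_composite / 2**20)    # ≈ 274.7 MiB per composite
print(total / 2**30)            # ≈ 2.7 GiB for a ten-layer stack
```

At those sizes even a modest ten-layer stack would tie up gigabytes, which is presumably why renderers re-composite on demand rather than buffering every intermediate image.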



2 minutes ago, MBd said:

I am not talking about toggling off the top layer

 

I am only talking about toggling it back on;

that should always be instant.

It still has to re-render every displayed pixel in the top layer, which for this file means re-rendering the displayed pixels of what is a larger-than-canvas-sized, non-pixel-aligned 9994.2 wide by 5424.7 pixel rectangle to the screen at the current display quality. Try resizing the rectangle to the canvas size (8000 x 4500 px) pixel aligned to its edges, & note what happens to the rendering speed. Better still, also set the view to Pixel Size & the view quality to the fastest, least precise settings.

 

It is never going to be able to literally render the screen image instantly no matter what settings or pixel dimensions are involved, but in terms of computational time, it is proportional to how hard the app has to work to render that image. Minimize that & you will get near instant results; otherwise there will be delays.



On 6.8.2017 at 0:27 AM, R C-R said:

It still has to re-render every displayed pixel in the top layer, which for this file means re-rendering the displayed pixels of what is a larger-than-canvas-sized, non-pixel-aligned 9994.2 wide by 5424.7 pixel rectangle to the screen at the current display quality. Try resizing the rectangle to the canvas size (8000 x 4500 px) pixel aligned to its edges, & note what happens to the rendering speed. Better still, also set the view to Pixel Size & the view quality to the fastest, least precise settings.

 

It is never going to be able to literally render the screen image instantly no matter what settings or pixel dimensions are involved, but in terms of computational time, it is proportional to how hard the app has to work to render that image. Minimize that & you will get near instant results; otherwise there will be delays.

If there is only one background layer, it IS always instant.

So it IS possible.

The software needs to stop rendering the layers below, which do not show up at all anyway because they are 100% covered by the top image.

 

 


24 minutes ago, MBd said:

if there is only one background layer, it IS always instant

so it IS possible

the software needs to stop rendering the layers below, which do not show up at all anyway because they are 100% covered by the top image

I am not sure that it actually is rendering the layers below -- it could be simply that for one background layer the app uses a pre-rendered mipmap of that layer. Regardless, did you try what I suggested about resizing & pixel aligning the rectangle? At least for a view setting of Pixel Size, do you see the same (effectively) instant results for toggling the top layer on or off that I do?
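For reference, a mipmap chain halves the image at each level down to 1x1, and a renderer typically picks the smallest pre-rendered level that is still at least as large as the requested zoom. A small illustrative sketch of that selection logic (assumed behavior, not Affinity's documented implementation):

```python
import math

def mip_level_count(width, height):
    """Number of mip levels from full size down to 1x1 (each level is
    half the previous one, rounded down)."""
    return int(math.log2(max(width, height))) + 1

def level_for_zoom(zoom):
    """Pick the smallest pre-rendered level that is still at least as
    large as the requested zoom, so drawing only ever downsamples.
    Level 0 is full size, level 1 half size, level 2 quarter size, ..."""
    if zoom >= 1.0:
        return 0    # at or above 100%, draw from the full-size image
    return int(math.floor(-math.log2(zoom)))
```

For an 8000x4500 px canvas this gives 13 levels; "nice" zooms like 50% or 25% map exactly onto a level, while odd zooms land between levels and force an extra interpolation pass, which could be one reason delays vary with zoom level.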

 

Besides, realistically how often would users be likely to place a 100% opaque layer over all the layers below that covers them completely? At best, this is an unusual 'edge' case that would require special programming to detect that condition. Is it really something they should be working on with so many other things that need attention?



Affinity always shows downsized previews unless you are zoomed in more than 100%.

 

Alignment should not make any difference; this happens just the same using same-sized (by which I mean canvas-sized) pixel layers.

 

This usage is no edge case at all.

 

As a before/after view, you may put the original image at the top and toggle it to see the difference.

 

But even more importantly:

Using live filters obviously slows AP down.

The result is that one has to Merge Visible the layers beneath.

This, however, does not yield any significant performance increase, because the layers are still calculated.

So one additionally has to turn all the underlying live filters off.

 

 


3 minutes ago, MBd said:

Alignment should not make any difference; this happens just the same using same-sized (by which I mean canvas-sized) pixel layers

This does not happen for me, at least with the resized & aligned rectangle in your example. It is also pretty obvious from watching Activity Monitor when I toggle the resized top layer that the app is not re-rendering anything -- there is no spike in CPU usage long enough for that to have occurred. 

 

Also, mipmaps are not just previews; they are pre-calculated, optimized sequences of images. As far as I know, the developers have not shared the details of the Affinity implementation, so I am not going to jump to any conclusions about that.



So, once more, a video (actually two):

 

The green layer is aligned now.

Still, one can clearly see it is much faster to toggle on without any layers underneath, for no good reason: nothing needs to be calculated and no effects are applied.

slow:fast.mov

 

 

And as proof of the live filter issue, which again:

using live filters obviously slows AP down;

the result is that one has to Merge Visible the layers beneath;

this, however, does not yield any significant performance increase, because the layers are still calculated,

 

as shown here:

real slow.mov

Mind you, this is the latest AP beta with Metal enabled on a MacBook Pro that supports Metal, but it was the same before, and is the same on another, much more powerful iMac.

 

This is clearly a flaw that hinders usage in, e.g., compositions with many layers/filters/adjustments.

 

 

 


It is very hard to tell much from the movies -- uploading example files I can try this with on my Mac would be more useful. That said, I think I have seen something similar to the first movie when I use Metal in the current beta: it appears not to be a delay during rendering but a delay before rendering even begins. Switching to the old OpenGL display renderer works much faster, so perhaps this is at least in part a bug in the beta?



"Switching to the old OpenGL display renderer works much faster, so perhaps this is at least in part a bug in the beta?"

 

From my last post:

"Mind you, this is the latest AP beta with Metal enabled on a MacBook Pro that supports Metal, but it was the same before, and is the same on another, much more powerful iMac."

 

It is very hard to tell much from the movies

// It is very easy to see how slow it is once the filter layers below are turned on, although the top layer is 100% opaque.

 

File is attached:

slow.afphoto

 

 


I can only tell you that what I get is different from what your movies show. I will continue to experiment as time permits & see if I can figure out why.



What do you mean "different"?

17 minutes ago, R C-R said:

I can only tell you that what I get is different from what your movies show. I will continue to experiment as time permits & see if I can figure out why.

So post a video in which toggling on all layers in the provided sample document (all filter/adjustment layers and the top merged layer) shows that Affinity is still as fluid as it is with only one layer.

 

(Again:

with the top layer in the document activated, Affinity should not do any calculation other than straight displaying this layer, no matter whether the layers beneath are activated or not,

 

but apparently Affinity always recalculates everything, no matter what.

 

Are you telling me this is different on your machine?)

 

 

 


3 hours ago, MBd said:

are you telling me this is different on your machine?)

For your slow.afphoto file I get different results depending on various combinations of the following:

1. AP Preferences > Performance: View Quality and/or Retina Rendering (or Metal in the beta).

2. Zoom level.

3. How many times I toggle on & off the top layer before doing something that changes the view size.

 

I am not going to make a movie because:

A. There are far too many different results to include them all in a reasonably sized movie file.

B. At least on my iMac, doing a screen capture movie impacts the performance of CPU-intensive apps.

C. There is a variable delay during screen recording between the time I click on something & the recorded indicator appears in the movie.

 

All of the above are why I said I can't tell much from movies. #3 above suggests some kind of screen buffering or mipmap stuff is going on, but it is unclear what that is. The results vary from nearly instantaneous (an imperceptible delay) to as much as 10-15 seconds or so in the worst case. Activity Monitor shows some very strange things going on for CPU use -- sometimes AP seems to be using most of all four cores & other times not, with little or no correlation to the observed on-screen rendering time. (This is with Activity Monitor's sampling frequency set to its fastest setting, which also has a minor impact on performance.)

 

Weirdly, setting View Quality to "Bilinear (Best Quality)" sometimes produces much faster (& sometimes near instantaneous) results compared to "Nearest Neighbor (Fastest)." There is (maybe) some correlation between zoom level & what I presume are pre-rendered mipmap images that do not require interpolation at certain zoom levels, but whatever the cause or source of this, it occurs at both greater than and less than 100% or Pixel Size zoom levels, and also at odd zoom levels that are not power-of-two related.

 

I have no idea what to make of all this, so I am not comfortable with any assumptions or conclusions made about any of it (by me, you, or anyone else besides the developers). There are too many variables & unknown dependencies for that. I have already spent more time on this than I want to, so barring something illuminating from the developers this will be my last post on the subject.



I get reproducible results on a dual-core MacBook, an iMac, and a quad-core MacBook, with or without screen capture, with or without Metal.

Sometimes it is a bit faster (but sometimes it is not, as you said as well), but it should always be instant.

 

And to repeat that this is not a small edge case:

 

Live filters are an integral part of Affinity Photo.

Always merging them is thus not an option, as it diminishes the advantages they make possible.

After some number of filter layers, Affinity Photo sooner or later becomes (very!) unresponsive; it is only a question of time and hardware.

 

Thus the only viable workflow that takes advantage of live filter layers and keeps Affinity Photo responsive is doing a "Merge Visible".

After that, work can proceed as normal.

As soon as one wants to modify the settings of a live filter, one has to uncheck the merged layer, adjust the live filters (with some lag), and do a Merge Visible again

(uncheck the layers because otherwise Affinity is still calculating like crazy for no reason).

 

Unchecking the live filter layers below should not be necessary.

Especially for new users this leaves the impression that Affinity Photo is unresponsive; for others it is just annoying.

 

From a technical standpoint:

Checking whether there is an opaque layer that covers everything, and then stopping all calculation of the layers below it, could easily be done once a second; that would reduce the maximum unresponsive time in such a scenario to one second.

But I'm sure the developers could just as well perform this check every two or three frames without any significant performance hit, and with a significant performance gain in many cases.
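The check proposed here is essentially occlusion culling over the layer stack. A minimal sketch, with a hypothetical layer representation (real layers would also need checks for blend modes, masks, and visibility):

```python
def visible_layer_range(layers):
    """Occlusion check over a layer stack, ordered bottom-to-top.

    Each layer is a dict with two (hypothetical) flags: 'opaque' (every
    pixel fully opaque, Normal blend mode) and 'covers_canvas' (its
    bounds enclose the whole canvas). Everything below the topmost layer
    that has both flags set is completely hidden and need not be
    rendered."""
    for i in range(len(layers) - 1, -1, -1):
        if layers[i]["opaque"] and layers[i]["covers_canvas"]:
            return layers[i:]       # render only from the occluder upward
    return layers                   # nothing occludes; render everything

stack = [
    {"opaque": True,  "covers_canvas": True},   # background photo
    {"opaque": False, "covers_canvas": True},   # live filter
    {"opaque": False, "covers_canvas": True},   # adjustment
    {"opaque": True,  "covers_canvas": True},   # 100% opaque merged copy
    {"opaque": True,  "covers_canvas": False},  # small patch on top
]
# Only the merged copy and the patch above it need rendering.
print(len(visible_layer_range(stack)))   # → 2
```

The scan is linear in the layer count and touches no pixel data, so running it on every recomposite (rather than once a second) would indeed be cheap.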

 

I think a developer reading this, right after the first post, is just like:

"Yeah, I should have done this in the very first place but just did not have the time to implement it properly..."

 

After all, this has been an issue for a very long time, since the very beginning.

 

 
