
I have been using Affinity Photo since the beta and have recently run into a performance problem. I noticed it slightly at the beginning of this year and it has become worse as the year has gone on. Once I reach a certain number of layers (7-9) or apply a specific live filter (such as motion blur), CPU usage becomes excessive. This causes large slowdowns and forces me to look for alternative ways of editing in Affinity. I tried reinstalling without success, and I have the latest version of all the software. I was wondering if anyone else has run into this and/or has a solution. I have attached an image as an example; once I added the motion blur, the CPU usage skyrocketed.

I am using a 2017 MacBook Pro (which I regret buying) with 16 GB of RAM.

2.9 GHz Intel Core i7

Radeon Pro 560 4096 MB

Intel HD Graphics 630 1536 MB

 

Screen Shot 2018-07-20 at 9.54.43 AM.png


  • Staff

Hi @PhotonFlux,

Welcome to the forums. 

This would be expected with live filters. I suggest you rasterize the effect once you're happy with the result and you will see a significant decrease in CPU usage. Live filters keep your CPU usage high because they are constantly redrawing the effect for the live preview.

Thanks,

Gabe. 
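To make the cost difference concrete, here is a minimal sketch in Python/NumPy. The class names and the blur are invented for illustration and are not Affinity's internals: a live filter re-runs its kernel every time the composite is redrawn, while a rasterized layer is just a stored pixel buffer that costs nothing extra per redraw.

```python
import numpy as np

class RasterLayer:
    """A baked layer: compositing just reads the stored pixels."""
    def __init__(self, pixels):
        self.pixels = pixels

    def render(self):
        return self.pixels  # no per-redraw work

class LiveBlurLayer:
    """A live filter: its kernel is re-evaluated on every redraw."""
    def __init__(self, source, radius=8):
        self.source = source
        self.radius = radius

    def render(self):
        # Simple horizontal box blur as a stand-in for a real filter kernel,
        # recomputed in full each time the view repaints.
        src = self.source.render()
        out = np.zeros_like(src)
        for shift in range(-self.radius, self.radius + 1):
            out += np.roll(src, shift, axis=1)
        return out / (2 * self.radius + 1)

base = RasterLayer(np.random.rand(1080, 1920, 3).astype(np.float32))
live = LiveBlurLayer(base, radius=8)

live.render()                       # every pan, zoom or edit pays this cost again
baked = RasterLayer(live.render())  # rasterizing bakes it once; later redraws are cheap
```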


11 minutes ago, owenr said:

An instance of a live filter could have an option for its output to be cached; the cache would be automatically updated when the filter's parameters are changed or the underlying composite changes, but there would be no need for the exponentially complex re-computation of the filter's output when there is an edit at a level above the filter in the layer stack. The view refresh when there is a stack of several live filters could then be a matter of a few milliseconds instead of several seconds in some cases.

But how would that work when a user pans or zooms in & out on a document? Even if a full sized image of the filter's output & its effects on underlying layers was cached to memory (which could eat up a lot of RAM), it still would have to be re-rendered at the current document display size, right?
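To put rough numbers on that concern, here is an illustrative sketch (Python/NumPy; the sizes and the resampler are hypothetical, not taken from Affinity): even with a filter's full-resolution output held in a cache, every pan or zoom still means resampling that cache for the viewport, and each cached output is itself a sizeable block of RAM.

```python
import numpy as np

def resample_nearest(image, scale):
    """Nearest-neighbour resample standing in for the real view renderer."""
    h, w = image.shape[:2]
    rows = np.clip((np.arange(int(h * scale)) / scale).astype(int), 0, h - 1)
    cols = np.clip((np.arange(int(w * scale)) / scale).astype(int), 0, w - 1)
    return image[rows][:, cols]

# Hypothetical full-resolution cache of one live filter's output:
# a 24 MP RGBA float32 buffer is roughly 366 MiB, and each cached filter adds another.
full_res_cache = np.zeros((4000, 6000, 4), dtype=np.float32)
print(full_res_cache.nbytes / 2**20, "MiB for one cached filter output")

# Zooming the view to 37% still means resampling the cache for the viewport,
# and again for every subsequent zoom level.
view = resample_nearest(full_res_cache, 0.37)
print(view.shape)  # (1480, 2220, 4)
```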


14 minutes ago, haakoo said:

By this I mean to have a lock in the live filter layer that prevents the redraw.

Also for this, I do not understand how it could work when panning or zooming in & out of a document. Maybe I am oversimplifying, but it seems to me that a filter is either live or it is not. If it is live, then it needs to be re-rendered every time anything that affects it changes.


43 minutes ago, owenr said:

Panning/zooming would be no different to panning/zooming on a document containing a Pixel layer instead of the cache of each live filter.

They already use mipmaps in a way somewhat like cacheing to reduce rendering overhead, but I am not sure how well that would work with live filters -- if nothing else, it could result in a lot more mipmaps that have to be stored somewhere, either in the file or in memory. This could also affect serialization, the file recovery feature, snapshots, or history, but I am not sure about that either.


  • Staff

@owenr there is absolutely no need to be snarky. It's counterproductive to any discussion.

@PhotonFlux live filters are indeed very taxing. In your screen grab you're using several stacked on top of one another, including convolutions and distortions (plus something nested into your main pixel layer, though I'm not sure what that is). This is why live filter layers are added as child layers by default; otherwise the temptation is to stack them as top-level layers. That doesn't help much in your case, but if you were working on a multi-layer document you would typically try to nest live filters as children of just the pixel layers you were trying to affect.

The Affinity apps use render caching during idle time, but this cache is invalidated any time the zoom level changes. What you should be able to do, in theory, is set a particular zoom level, wait a couple of seconds (or longer, depending on how many live filters you have active) and the render cache will kick in. If you then start to use tools or adjustments on a new top-level layer, performance should be improved. Honestly, this works better for multiple pixel layers where you're using masking, and less well for live filters that have to redraw on top of one another.
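As an illustration only (not Affinity's actual implementation), an idle-time render cache along those lines might look roughly like this: the expensive composite render is redone lazily once the app has been idle for a couple of seconds, the cached result is reused while nothing changes, and any zoom change or document edit throws it away.

```python
import time

class IdleRenderCache:
    """Hypothetical idle-time render cache keyed by zoom level."""
    def __init__(self, render_fn, idle_delay=2.0):
        self.render_fn = render_fn       # expensive: re-evaluates every live filter
        self.idle_delay = idle_delay     # "wait a couple of seconds"
        self.cached = None
        self.cached_zoom = None
        self.last_change = time.monotonic()

    def invalidate(self):
        # Called on any document edit or zoom change.
        self.cached = None
        self.last_change = time.monotonic()

    def on_idle(self, zoom):
        # Build the cache only once the app has been idle long enough.
        if self.cached is None and time.monotonic() - self.last_change >= self.idle_delay:
            self.cached = self.render_fn(zoom)
            self.cached_zoom = zoom

    def draw(self, zoom):
        if self.cached is not None and self.cached_zoom == zoom:
            return self.cached           # cheap: reuse the idle-time render
        return self.render_fn(zoom)      # expensive: live filters re-evaluated
```

In this toy model, working on a new top-level layer after the cache has been built corresponds to the cheap draw() path; changing the zoom level invalidates the cache and the expensive path is paid again until the next idle period.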

The live filters are also calculated entirely in software. However, as you're running a MacBook with a recent Intel iGPU, you may want to try enabling Metal Compute in Preferences > Performance. This enables hardware acceleration for many of the filter effects. There are still some kinks to be worked out (I believe you might see further improvements to hardware acceleration in 1.7) as it's a port from the iPad implementation, but it may speed things up if you're consistently using multiple stacked live filters. It also significantly decreases export times. There are some caveats (Unsharp Mask behaves differently and you may need to tweak values on existing documents), but I would say it's worth a try at least.

Hope that helps!

Product Expert (Affinity Photo) & Product Expert Team Leader

@JamesR_Affinity for tutorial sneak peeks and more
Official Affinity Photo tutorials


29 minutes ago, owenr said:

No need to post that. Your confusion is obvious.

My confusion is over how you got from what I wrote to asking if I advocate the abolishing of multiple Pixel layers, & from there to thinking I am somehow grasping for reasons not to "support" live filter output caching.

To be clear about it, I am not advocating abolishing anything, nor am I looking for reasons not to support anything that would improve the performance of the app. I am just wondering if simple cacheing of the output of live filter layers actually would do that, if it would do so any better than whatever cacheing is already used in the app, & if it would cause issues with existing features of the app that make it impractical to implement.


29 minutes ago, owenr said:

Contrast that with cacheing where each time a pixel of the top filter's output is required, the pixel would be provided immediately from the top filter's cache which would be no different to a pixel layer.

But also consider that every time the view is zoomed this cache would have to be updated, & the same would be true for any changes to any of the lower layers the filter(s) above them affect, or when the visibility of any of these layers is toggled. So CPU & memory resources would have to be allocated to do this immediately after any such changes were made, just as they are now with the render cache.

I could be wrong but it doesn't seem like this would be any better than the way it works now (which is to update the render cache in idle time), & there still has to be some way to make it work efficiently with snapshots, file recovery, app & OS level memory management, & so on, & to do that in both Windows & Mac OS environments running on diverse hardware that may not even have the same architecture.

Or if you prefer the tl;dr version: it is not as simple as one might think to implement cacheing to improve performance.
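For what it's worth, the rule being argued about can be stated in a few lines. In this hypothetical sketch (invented names, not Affinity's code), a per-filter output cache is invalidated by changes to the filter's parameters or to layers below it, but not by edits above it in the stack; the remaining costs are rebuilding the cache after those invalidations and rendering it for the view.

```python
class LiveFilter:
    """Hypothetical live filter holding a cached full-resolution output."""
    def __init__(self, params):
        self.params = params
        self._cache = None            # cached output of the filter over everything below it

    def output(self, composite_below):
        if self._cache is None:       # rebuild only after an invalidation
            self._cache = self.apply(composite_below)
        return self._cache

    def apply(self, pixels):
        return pixels                 # stand-in for the real filter kernel

    def set_params(self, params):
        self.params = params
        self._cache = None            # adjusting the filter invalidates its cache

    def on_document_edit(self, edited_layer_is_below):
        # Edits below the filter invalidate the cache; edits above it in the
        # layer stack composite on top of the cached pixels and leave it alone.
        if edited_layer_is_below:
            self._cache = None
```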


5 hours ago, owenr said:

A filter's output cache would be equivalent to a Pixel layer and it would not need updated unless some underlying layer is changed or a filter parameter is changed.

Or the zoom level is changed. You are not taking into account the render cache that contains the image rendered at the current screen resolution.

5 hours ago, owenr said:

The performance benefit of cached filter output is trivial to test for yourself by rasterising to Pixel layers as pseudo caches.

Unless you take into account the need to update its effects on underlying layers, which is the whole point of having a live filter. It is like saying flattening the document is a good test of the performance benefits of throwing away everything that is non-destructive.


@James Ritson

Thanks for the info. I'll try to set them as child layers and rasterize the ones I have set in stone. It is interesting to me that there is a jump in CPU usage going from 5 to 6 layers, or from 6 to 7 layers. Would the cache be full at that point? I do not understand how any of that works.

I have just learned so much. Thank you again everyone. 


37 minutes ago, owenr said:

A filter's output cache would be independent of the view zoom. That cache would contain a full scale image of the filter's output. That cache content wouldn't be changed by zooming the view, just like the pixels contained in a Pixel object or Image object aren't changed by zooming the view.

So where is this full scale image going to be stored, or the several of them that there would be with multiple live filters or complex layer hierarchies? You can't assume there will always be enough room in RAM for this, so the overhead of paging to & from VM, & however the OS manages that, has to be considered. If you want to avoid that, you have to consider the overhead of maintaining a temp file on the drive, which incurs an even heavier performance penalty, particularly on systems with a spinning hard drive rather than an SSD.

Besides that, the developers have already told us that not everything is necessarily loaded into memory (real or VM) at once, & that they use mipmaps to improve render times, so it is not even necessarily true that everything in a regular pixel layer is loaded into memory at different zoom levels.
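As a rough illustration of what the mipmap point implies for memory and rendering (plain NumPy, not Affinity's implementation): a pyramid of progressively halved copies costs about a third more memory than the original, and the renderer can pick the level closest to the current zoom instead of resampling the full-resolution pixels on every redraw.

```python
import numpy as np

def build_mipmaps(image):
    """Halve the image repeatedly; the whole pyramid costs roughly 1/3 extra memory."""
    levels = [image]
    while min(levels[-1].shape[:2]) > 1:
        prev = levels[-1]
        h, w = prev.shape[0] // 2 * 2, prev.shape[1] // 2 * 2
        # Average 2x2 blocks to produce the next, half-sized level.
        levels.append((prev[:h:2, :w:2] + prev[1:h:2, :w:2]
                       + prev[:h:2, 1:w:2] + prev[1:h:2, 1:w:2]) / 4)
    return levels

def pick_level(levels, zoom):
    """Choose the pyramid level whose scale best matches the view zoom."""
    level = int(np.floor(-np.log2(max(zoom, 1e-6))))
    return levels[min(max(level, 0), len(levels) - 1)]

image = np.random.rand(2048, 2048, 3).astype(np.float32)
pyramid = build_mipmaps(image)
extra = sum(level.nbytes for level in pyramid[1:]) / image.nbytes
print(f"extra memory for the mipmap chain: {extra:.0%}")   # ~33%
quarter = pick_level(pyramid, zoom=0.25)                    # 512x512 level at 25% zoom
```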

1 hour ago, owenr said:

We currently have the situation where filter output is expensively repeatedly recalculated when editing the document below or above the filter, or adjusting the filter itself, or simply panning the view when the document display image is larger than the viewport.

Why do you think this is always true for layers above the filter layer? James Ritson explained that the Affinity apps already do render caching during idle time, so I am not sure how you came to this conclusion. Can you explain anything more about that?

1 hour ago, owenr said:

However, filter output cacheing has a similar benefit but, crucially, without throwing away any non-destructiveness.

There is no way to know if that actually would improve performance or make it worse without considering everything it would affect, including its impact on memory use, render cacheing, mipmapping, anything serialized, & whatever else the apps might already be using to improve performance. We users don't know enough about that to do more than speculate about it, but I am reasonably sure the developers not only have done more than just speculate about it; they have already considered all the possibilities & implemented the ones they believe perform best overall in real world use.

That does not mean I think they are not interested in implementing anything differently that really would be better, just that it seems unlikely that this kind of cacheing would do that. I am fully aware that this is just speculation on my part & that others might come to different conclusions about it, but that is just the nature of things like this, so I don't particularly want to waste any more time on it.


30 minutes ago, owenr said:

It would have been trivial for you to test what I claimed.

But what would that prove? Again, there is no way to know if what you suggest would actually improve performance or make it worse without considering everything it would affect, testing for that, & doing so in the context of both typical & extreme workflows that tax the limits of both the app & the hardware it runs on.

