

Hello

I understand Affinity Photo can benefit from using a GPU. Since I'm on a laptop, mine is an Nvidia GeForce GTX 860M.

My question is: to make sure the software is actually working with the GPU, should I select the GPU in Affinity Photo's Preferences menu, in the Performance tab? Should I assign it directly in the Nvidia Control Panel? Or should I do both?

 

Thank you

(Attached screenshot: NvidiaAffinityGpu.png)


  • Staff

Hi Nova-Odos, and welcome to the Forums,

If you look at this link, it explains how we mainly use the CPU and only make use of the GPU to render things to screen. So changing the settings you've included won't really give you any benefit. There are other replies in that thread from the dev which are also worth reading, as they explain a bit more about how Affinity works :)


Hello, and thanks for the greeting.

I'm actually using Affinity Photo to paint an image with brushes, which I suppose falls under the "rendering to screen" topic. I'm no expert, but I do believe the GPU helps in my case, so I feel my question still goes unanswered. :P

 


Nope, he answered your question completely (I suspect "rendering to screen" refers to the overall on-screen visualization, while the brush work itself is processed on the CPU). With other CPU-based software (not Serif-related) I've used, and whose devs I've even talked to, what I heard from them was that CPU features and capability are what count, even things like CPU cache size. Affinity works similarly: it is not heavily GPU-based, unlike several dedicated digital painting programs (and AP and AD are not that kind of software). Some applications do use the GPU for brush painting, some use it only slightly, and some rely completely on the CPU; the latter seems to be mostly the case in Affinity.

I don't want to advertise other apps (more than I already do) on a company's product forum, but there are some great apps that don't use the GPU at all for brushes, and that is often better for reasons too long to explain here. To mention only a few: it is more "democratic", so weaker machines can run it, including home or office PCs with only a poor integrated card, and the task gets far more memory to work with than just the card's dedicated memory.

That GPU is also quite old. My sister has a laptop with one of those, and it is not terribly powerful, to say the least, nor does it have very modern features, so the benefit probably wouldn't be drastic anyway. As far as I have learned, you get the biggest gains in Affinity by increasing CPU power: number of cores, clock speed, a recent generation, etc. Of course RAM helps too (I'm not sure how much RAM speed matters, though), and so does an SSD (for brushes probably not so much, or maybe yes on large canvas sizes, as those disks are very good for large images in general).
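If you're curious where the work actually lands while you paint, one quick (and very unscientific) check is to watch per-core CPU load and GPU utilisation during a few brush strokes. A minimal Python sketch, assuming the third-party psutil package is installed and that nvidia-smi (which ships with the Nvidia driver) is on your PATH; this is generic OS-level monitoring, nothing Affinity-specific:

```python
# Rough sketch (not Affinity-specific): poll per-core CPU load and, if
# nvidia-smi is available, GPU utilisation while you paint a few strokes.
# Assumes "pip install psutil" and an Nvidia driver providing nvidia-smi.
import shutil
import subprocess

import psutil


def gpu_utilisation():
    """Return GPU utilisation in percent, or None if nvidia-smi is missing."""
    if shutil.which("nvidia-smi") is None:
        return None
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()


for _ in range(15):  # sample for roughly 15 seconds while painting
    cores = psutil.cpu_percent(interval=1, percpu=True)
    print("CPU per core %:", cores, "| GPU %:", gpu_utilisation())
```

If the CPU cores spike while the GPU percentage barely moves during a stroke, that matches what the devs describe in the linked thread.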

AD, AP and APub. V1.10.6 and V2.4 Windows 10 and Windows 11. 
Ryzen 9 3900X, 32 GB RAM,  RTX 3060 12GB, Wacom Intuos XL, Wacom L. Eizo ColorEdge CS 2420 monitor. Windows 10 Pro.
(Laptop) HP Omen 16-b1010ns 12700H, 32GB DDR5, nVidia RTX 3060 6GB + Huion Kamvas 22 pen display, Windows 11 Pro.

 

 


Thank you very much for both replies. I still find this technical stuff a little foggy, but SrPx's further explanation of stokerg's reply gives me a better understanding of the matter.

If I understood correctly, to get the most out of Affinity Photo I should rely on my CPU, and I suppose I can fine-tune performance directly in Affinity's Preferences menu.

Also, if I haven't misunderstood, the GPU might help performance somewhat, but since mine is a laptop version and an old model, the boost might be very small or even none at all.

I consider my question answered, but it raises another one: if Affinity works better with the CPU (for example with brushes, which are one of the main features I use), why have a setting to choose between CPU and GPU at all? In which cases is using a GPU beneficial for Affinity Photo?

Thank you both


From my understanding, if you have a graphics card that works fine with Affinity Photo, you should leave it selected as the Renderer in Affinity's Preferences > Performance settings.

That will give you the best performance.

The Warp setting bypasses your graphics card and is just there as a fall-back option for misbehaving graphics cards or as a test to see if a particular problem is graphics card related or not.

Using Warp as your renderer will in most cases provide a less optimal performance experience than using a graphics card as your renderer.

But since Affinity's main features rely more on your CPU's power and cores than on your choice of graphics card, using Warp as your renderer (if you have to) should not significantly impact your use of the software.
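If you're wondering what that Renderer list corresponds to, it's essentially the display adapters Windows reports, plus Warp, Microsoft's Direct3D software rasterizer, offered as the fallback. A quick Windows-only sketch to list them, using Python to call the built-in PowerShell Get-CimInstance cmdlet; the names won't necessarily match Affinity's dropdown word for word:

```python
# List the display adapters Windows knows about. Affinity's Renderer
# dropdown is built from hardware adapters like these, plus the Warp
# software fallback (which is not a physical adapter and so won't appear
# in this query).
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object -ExpandProperty Name"],
    capture_output=True, text=True,
)
print(result.stdout)
```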

To save time I am currently using an automated AI to reply to some posts on this forum. If any of "my" posts are wrong or appear to be total b*ll*cks they are the ones generated by the AI. If correct they were probably mine. I apologise for any mistakes made by my AI - I'm sure it will improve with time.


I think you really should leave the card selected in Preferences, and not switch to the software-only renderer.

I actually didn't reply to the main question, now that I look back, lol. Yes (as Carl says), you should definitely set your graphics card in the AP/AD preferences. As for what to set in the nVidia Windows control panel, a bit of trial and error is what I'd do if needed, as I haven't had performance issues in AP/AD yet with the projects I make in them. I remember reading from a dev that choosing the software renderer was going to be slow. It seems the final render does rely on the graphics card, so not selecting it is probably removing a key piece of the puzzle (you can read in one of the posts in that thread how smooth panning is provided by the card; it seems to be a small part of the work, but an important one). In my opinion that doesn't contradict the fact that most of the work is driven by the CPU and happens there, and that most performance gains (assuming you have set your card in preferences) come from the CPU.

After that, RAM size matters (and probably RAM speed and latency?), since the more that is done in RAM and the less on disk, the better, and then disk speed. Get an SSD, or at least, if you have a 5400 rpm disk (typical in older PC laptops), a 7200 rpm one will improve things. I've seen Dell laptops where, if you set a certain disk performance mode in the BIOS, everything runs faster (with the penalty of more noise and heat, and who knows if less durability or some overheating risk). Anyway, despite the write-cycle issue, SSDs are a ton faster than any HDD.

If that weren't a laptop (I'm a bit lost on how easy or expensive it is to replace a CPU in a laptop) but a desktop, I'd maybe try to replace the CPU with a more powerful one. Second hand, obviously, as those CPUs aren't sold anymore, and you should find one pretty cheap, at least for a desktop. The thing is, I find refurbished i5s, even Sandy Bridge full towers (desktop), at 100-200 bucks with a poor discrete or integrated card (if that card is poorer than even your current one, you could maybe plug the old one into the purchased tower). That said, you are surely better off with even a cheap new AMD board, and just a 2600 (if you overclock it) or a 2600X (if you can't afford the much better 2700X), for only a few bucks more than an old refurbished Intel. But I know how at times even 100 $ can be a barrier ;)
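If you're not sure how slow your current disk actually is, a very crude sequential-write test gives a ballpark figure. A minimal Python sketch using only the standard library; the file name is just a placeholder, and the number only reflects sequential throughput, not the scattered small reads and writes a real application does:

```python
# Very crude sequential-write test to compare an old HDD with an SSD.
# It only measures raw sequential throughput to one temp file; real
# application I/O behaves differently, so treat the result as a rough
# indicator, nothing more.
import os
import time

PATH = "disk_speed_test.bin"          # placeholder test file; removed at the end
CHUNK = os.urandom(4 * 1024 * 1024)   # 4 MiB of random data
N_CHUNKS = 64                         # 256 MiB total

start = time.perf_counter()
with open(PATH, "wb") as f:
    for _ in range(N_CHUNKS):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())              # make sure the data really hit the disk
elapsed = time.perf_counter() - start

size_mib = len(CHUNK) * N_CHUNKS / (1024 * 1024)
print(f"Wrote {size_mib:.0f} MiB in {elapsed:.2f} s "
      f"({size_mib / elapsed:.0f} MiB/s)")
os.remove(PATH)
```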

If you read that linked thread (several posts from devs), it totally matches what I have experienced with several other programs, especially Blender. I don't know if rendering is a much more controlled task than everything the Affinity apps do (probably it is), and so can be handed to the card in a more predictable way, but the fact is that while rendering with Blender Cycles on the card is fast, very fast (though obviously beaten by any CPU-based render farm full of old Xeons...), the problem that often arises is a shortage of memory. If the scene is big enough (and/or you use large textures, often my case), you are out of luck, as you can only fit in there what the GPU memory can hold. I know you can use a 1080 with 8 GB, but a lot of us aren't happy investing 700 bucks, or 900 for the new 2080 RTX (plus there will be a bottleneck if the CPU is not on par with that monster; you wouldn't be using it to anywhere near its capability, and for Affinity you're better off investing in the CPU anyway). And there are scenes that won't even fit in that memory. It seems to me the Affinity apps, especially Photo, might run into even more complex memory situations. I totally agree that there is a lot of hype around using "only the card", and that it's a bit of a myth. Most of the apps I've seen using that as a selling point were very focused digital painting tools (video editing too, but that type of task seems to fit the GPU better), not doing the bazillion other things that, for example, AP does.
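To put some rough numbers on the memory problem, here is a back-of-the-envelope sketch in Python of what a single uncompressed 16-bit RGBA layer or texture costs. Real applications add overhead (undo history, mipmaps, compositing buffers), so actual usage is higher; the point is just the order of magnitude compared with a 2-4 GB card:

```python
# Back-of-the-envelope only: memory needed for one uncompressed layer.
def layer_bytes(width, height, channels=4, bits_per_channel=16):
    return width * height * channels * (bits_per_channel // 8)

for w, h in [(4000, 4000), (8000, 8000), (16000, 16000)]:
    gib = layer_bytes(w, h) / (1024 ** 3)
    print(f"{w} x {h} px, 16-bit RGBA: ~{gib:.2f} GiB per layer")
```

A single 16000 x 16000 16-bit RGBA layer is already close to 2 GiB, which is why multi-layer documents or big Blender scenes overflow a small card's dedicated memory so quickly.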

BTW (and I know I'm derailing, but I get asked too often, locally, about which card to get for graphics work): besides needing a great machine to put such a monster card in (the bottleneck issue), the first thing is that with the 700-900 $ for the card you could instead buy an amazing new desktop (maybe a decent laptop instead, though I'm just not a fan of laptops for long sessions and pro work). Something like a Ryzen 2700X with the board and everything (a tower, not including a monitor), if you use a cheap and good retailer. And you would probably see a lot more benefit (unless you're a hardcore gamer) than with a low or average machine and a monster of a card. Productivity always goes hand in hand with a better CPU, and with good components everywhere else (RAM, disk), without breaking the bank on any one of them (like the card).

Quote

Thank you very much for both replies. I still find this technical stuff a little foggy, but SrPx's further explanation of stokerg's reply gives me a better understanding of the matter.

You are welcome. :) But that said, absolutely any of the developers' posts (even from a 2015 thread) has a lot more value and credibility than anything I could say. I'm just a user chit-chatting; they actually know this stuff, and I just read what they write and combine it with what I know (often poorly).


 

 


  • 3 weeks later...

Then maybe a better card will help. But as always, I'm a big fan of balanced machines. It's only in games that an average CPU and a great card make sense. Heck, at the price/capability ratio in the market right now (months ago it was wildly different; years ago it was Intel for almost everything), maybe AMD makes the most sense if price matters at all. Great CPUs like the i5 8400 have really gone up in price, maybe because of the fab shortage on Intel's side, or whatever, for what that and similar CPUs really offer compared with, say, a 2600 which you can overclock, or my favourite in price/capability right now, a 1700. And even in games it's starting not to be the case that the card is so crucial and core count and CPU speed less so, because besides the bottleneck issue, it's not only apps; games are becoming more and more multithreaded.

I don't see myself purchasing anything higher than a 1060 or an RX 570/580 any time soon. True, my 1050 2GB falls short for some GPU-heavy uses in Blender and other apps (GPU-based video editing, GPU-based digital painting, etc.), but even there a 1050 Ti 4GB should suffice for quite advanced productivity/graphics work (the difference is by no means just the memory; that's one of the many complaints against nVidia and its confusing naming, since the Ti is overall quite a better card in other specs than the 2 GB non-Ti card with the same name). I don't think it's very clever to buy anything above that unless the whole machine is on par in capability. That is, all of those are IMO great cards for productivity with a Ryzen 1700, 2600 or Core i5 8400; a 1700X/1800X, 2700/X, Core i5 8600K, etc. might make better use of higher-end cards. I mean, all of these will do well with a 1080 too, but you get more benefit if that large difference in money goes into other parts, especially the CPU, then RAM, or RAM first, depending on the workflow and current CPU. Speaking in general, not just about Affinity.

Maybe I'd dare to go for a really high-end card (if I REALLY needed it, which I strongly doubt) with an Intel 8700K or an AMD 2700X (IMO that one is a good fit no matter the card, thanks to the great pricing), or a 9900K / Threadripper in the much higher price and power-consumption range. I mean, a 2600 with a 1080 or 2080 is IMO putting money where productivity will see less of an advantage (indeed, the GPU will be bottlenecked) than putting it into CPU, RAM and disk, in that order. Even if they improve the card usage, for content creation I'd go for a better CPU and put the money there. Or heck, just more RAM! And try to make it at least 2666 MHz, and hopefully around CL16 latency. The card can be average, and it is also easier to replace for most of humanity than a CPU (and IMO, especially with the market still wounded by the bitcoin miners, way easier to sell second hand than a CPU, even at this moment, and even more so as the second-hand market right now is saturated with utterly worn-out cards that suffered heavy wear from 24/7 mining farms... Indeed, last I heard, nVidia is taking the financial hit now due to the drop in miner purchases. Divine punishment, if you ask me :72_imp: )


 

 

