Imaginarius Posted February 10, 2015
Does Affinity Photo support dual GPUs? And if not, is this planned? Adobe disappointed users of the new Mac Pro because they still don't support its two GPUs. However, Pixelmator does...
Dave Harris Posted February 11, 2015
We don't make use of dual GPUs, because we mostly use the GPU for drawing to the screen. Most of our real processing is done by the CPU, and that benefits greatly from having more cores.
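As a rough illustration of why extra CPU cores help (this is not Affinity's actual code, just a minimal sketch with a made-up per-pixel gain filter), work can be split into bands of rows and spread across every core with Grand Central Dispatch:

```swift
import Foundation

// Hypothetical example of leaning on every CPU core: apply a simple
// per-pixel gain to an 8-bit greyscale buffer, one band of rows per task.
func applyGain(to pixels: inout [UInt8], width: Int, height: Int, gain: Double) {
    let bands = ProcessInfo.processInfo.activeProcessorCount
    let rowsPerBand = (height + bands - 1) / bands

    pixels.withUnsafeMutableBufferPointer { buffer in
        guard let base = buffer.baseAddress else { return }
        // concurrentPerform spreads the bands across the available cores
        // and blocks until every band has finished.
        DispatchQueue.concurrentPerform(iterations: bands) { band in
            let startRow = band * rowsPerBand
            guard startRow < height else { return }
            let endRow = min(startRow + rowsPerBand, height)
            for row in startRow..<endRow {
                for col in 0..<width {
                    let i = row * width + col
                    base[i] = UInt8(min(255.0, Double(base[i]) * gain))
                }
            }
        }
    }
}

// Usage: brighten a 4000x3000 image by 20% using all cores.
var image = [UInt8](repeating: 128, count: 4000 * 3000)
applyGain(to: &image, width: 4000, height: 3000, gain: 1.2)
```

The more cores the machine has, the more bands run at once, which is why this kind of pipeline scales with CPU core count.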
Imaginarius (Author) Posted February 13, 2015
That's a bit sad to hear. Multi-threading on CPUs is welcome, though... It's just that I work with huge files with hundreds of layers, and I think every tiny bit of speed would help, even on my new Mac Pro. And the demands will only get bigger.
Staff MattP Posted February 14, 2015
I think you'll find that Affinity is quicker with most documents than other offerings, even those that support dual GPUs. I can certainly rig together documents to crucify each application (including Affinity), but in the real world of actually using the software to achieve something, you should find that Affinity's approach is scalable and effective. GPUs ultimately bring restrictions that are hard to get around, and as soon as you need to start getting around them you incur terrible performance penalties. I wouldn't argue against using GPUs if the user's requirements were always a known quantity: video editing, for example, will always be of a sane pixel size that you can plan for and work with. But we aim to allow gigapixel documents to be worked on, and getting that much data to and from the card is too costly (you can't actually store that much data on the card itself).
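For a rough sense of scale (my own back-of-the-envelope figures, not Serif's), a gigapixel document does not come close to fitting in the VRAM of the cards being discussed:

```swift
// Back-of-the-envelope VRAM check (all figures assumed, not measured).
let pixels = 1_000_000_000.0        // a one-gigapixel document
let bytesPerPixel = 8.0             // RGBA at 16 bits per channel
let documentGB = pixels * bytesPerPixel / 1_000_000_000
print("Pixel data alone: \(documentGB) GB")   // ~8 GB
// The dual FirePro cards in the 2013 Mac Pro ship with roughly 2-6 GB each,
// so the document can't simply live on the GPU; it has to be streamed.
```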
Staff Andy Somerfield Posted February 14, 2015
Further to this, for features with significant algorithmic complexity (inpainting, performant blurs, etc.), the GPU is not an option. We do take advantage of it for some things, though: the super-smooth panning and zooming of documents is entirely GPU based :) Andy.
Leftshark Posted February 14, 2015
MattP said: "...we aim to allow gigapixel documents to be worked on, and getting that much data to and from the card is too costly..."
When you say "too costly", what is that cost? Does that mean the amount of time it takes to transfer the data from RAM (or storage) to the card and back can be more than the time saved by applying the GPU to the data?
Staff Andy Somerfield Posted February 15, 2015
Transferring data *to* a GPU is fast, no problem there. The GPU can work on the data very fast, no problem there either. The problem is getting the data back into main memory: that's slow. Take a simple Gaussian blur, for example: we can perform the blur many times faster just using the CPU in main memory than by round-tripping to the GPU, because of how long it takes to get the result back. Thanks, Andy.
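To put rough numbers on that (my own assumptions, not Serif's measurements), here is what the readback alone can cost for a large document:

```swift
// Back-of-the-envelope readback cost (all figures assumed, not measured).
let megapixels = 100.0                 // a large but not extreme document
let bytesPerPixel = 8.0                // RGBA at 16 bits per channel
let readbackGBperSec = 3.0             // assumed effective GPU-to-RAM rate
let imageBytes = megapixels * 1_000_000 * bytesPerPixel
let readbackMs = imageBytes / (readbackGBperSec * 1_000_000_000) * 1000
print("Approx. \(readbackMs) ms just to copy the result back")   // ~267 ms
// That fixed cost is paid on every filter pass, before the GPU has saved anything.
```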
deeds Posted February 15, 2015
The hype around GPU processing, particularly that coming from AMD and Apple, is just noise to support their various decisions. Noise that isn't remotely accurate or informative about the real-world implications and opportunities of GPU processing. Bullshit, by any other word.
Staff Andy Somerfield Posted February 15, 2015
I'm not so sure. Sometimes you don't need to "come back" from the GPU: you can send your data there (fast), do loads of stuff (fast), then only bring the data back right at the end for saving. Video work can suit this well, but photo editing needs the data back from the card all the time, so it is often ill suited. A lot of products wheel out the "GPU accelerated" line because it sounds shiny, and often they don't really use much GPU at all, or are slow when they do. That, yes, is bullshit, but the GPU cannot be dismissed completely. Also, here's another interesting thing: some architectures (the iPad being a good example) share the same memory between the GPU and CPU, so there is no penalty in either direction. This can be very useful. There are a few filters in Photo which perform *faster* on an iPad Air 2 than on my 12-core new Mac Pro with 32GB in the office, because we can use the GPU without penalty. But that's all stuff for another day ;) Thanks, Andy.
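For anyone curious what "no penalty in either direction" looks like in code, here is a minimal Metal sketch (my own illustration, not Affinity's source; names and sizes are made up) of a buffer in shared storage, where the CPU and GPU touch the same memory and nothing needs copying back:

```swift
import Metal

// Minimal sketch of a zero-copy buffer on a unified-memory device such as
// an iPad (illustrative only, not Affinity's code).
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let pixelCount = 1_000_000
let length = pixelCount * MemoryLayout<Float>.stride

// .storageModeShared places the buffer in memory visible to both the CPU
// and the GPU, so there is no upload copy and no readback copy.
guard let buffer = device.makeBuffer(length: length, options: .storageModeShared) else {
    fatalError("Buffer allocation failed")
}

// The CPU writes straight into the memory the GPU will read...
let pixels = buffer.contents().bindMemory(to: Float.self, capacity: pixelCount)
for i in 0..<pixelCount { pixels[i] = 0.5 }

// ...and once a compute pass encoded against `buffer` has completed,
// the CPU reads the results back from the very same pointer.
```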
deeds Posted February 15, 2015
Absolutely agree. Metal is a prime example of taking advantage of that shared memory, along with a bunch of other optimisations for offloading things like physics to the GPU. PhysX was the other great loss from the fallout around Nvidia and CUDA, unfortunately.
Leftshark Posted February 15, 2015
Andy Somerfield said: "...some architectures (iPad being a good example) share the same memory between the GPU and CPU, so there is no penalty in either direction..."
Thanks for answering my earlier question. Your answer leads to another: if memory sharing between CPU and GPU can make GPU acceleration of some operations practical where it might not have been if you had to go out across a bus to a discrete GPU, is there ever a case on a Mac where GPU acceleration of a feature is more practical with Iris/Iris Pro-level integrated graphics than with a discrete GPU? I ask because if that were true, it would go against the conventional wisdom that a discrete GPU is always better. But I don't know whether Mac integrated graphics allow the same penalty-free GPU use as your iPad example of CPU/GPU shared memory.
Staff Andy Somerfield Posted February 16, 2015
Hi Leftshark, that's a good question, and yes, in theory, integrated graphics should be able to work like that. Unfortunately, while Iris shares main memory with the CPU, the ability to allocate memory which is accessible by both the CPU and the GPU is not exposed through any drivers I have seen: we are still forced to do a real copy of all the bits to and from the card, from one area of main memory to another area of main memory and back again! I've always been surprised that this wasn't made available. Thanks, Andy.
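To make the contrast with the shared-memory case concrete, here is a sketch of the explicit round trip Andy describes, using Metal's macOS-only managed storage mode purely as a stand-in for the copy-in/copy-out pattern (the 2015-era Iris drivers in question were not Metal, so treat the modes and names as illustrative only):

```swift
import Metal

// Sketch of the explicit round trip a non-unified setup forces on you
// (illustrative only; not Affinity's code or the 2015 driver stack).
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal device available")
}

let length = 4 * 1024 * 1024
guard let buffer = device.makeBuffer(length: length, options: .storageModeManaged) else {
    fatalError("Buffer allocation failed")
}

// 1. The CPU writes into its side of the buffer, then tells Metal which
//    bytes changed so the driver can copy them over to the GPU's side.
let cpuSide = buffer.contents().bindMemory(to: UInt8.self, capacity: length)
for i in 0..<length { cpuSide[i] = 0 }
buffer.didModifyRange(0..<length)

// 2. ...GPU work against `buffer` would be encoded and committed here...

// 3. After the GPU finishes, the GPU-side contents must be synchronised
//    back before the CPU is allowed to read the result.
if let commandBuffer = queue.makeCommandBuffer(),
   let blit = commandBuffer.makeBlitCommandEncoder() {
    blit.synchronize(resource: buffer)
    blit.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}
print("First byte after readback: \(cpuSide[0])")   // only now safe to read
```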
anon1 Posted June 6, 2016
YEAH, I'd definitely love to have my MacBook Pro running faster than a Mac Pro :D :rolleyes: Sometimes you're just left scratching your head at what engineers don't include. Sony software: start recording with a photo trigger, a false-colour overlay, voice transmission over the Sony Wi-Fi app with a lav mic attached to a smartphone instead of a 500€ Sennheiser, "cough cough" :blink:
verysame Posted June 24, 2017
Here's a free image editor that uses the GPU and seems pretty fast: https://pencilsheep.com/ So somehow they figured out how to use the GPU for blur, glow, distort and all the other live filters. The number of filters is pretty impressive; it's definitely worth a try. Although a web version is available, I would encourage trying the desktop version. Edit: I noticed it's for Windows/Linux only.
myclay Posted June 26, 2017
So what can Pencilsheep do better? Sorry, I'm lost with that program.
verysame Posted June 28, 2017
It's just an example of the GPU being used in an image editor. There's been some talk lately here on the forum about how the GPU is not ideal for an image editor; Pencilsheep seems to show that image editors can benefit from the GPU too.
ronnyb Posted July 10, 2017
I ran the web version on my iPhone 7. VERY impressive, I have to say! Of course the UI sucks on a mobile device, but to be fair it's a desktop app versus a mobile one...
Nikos Posted July 15, 2017
On 2/16/2015 at 9:26 AM, Andy Somerfield said: "...the ability to allocate memory which is accessible by both the CPU and GPU is not exposed through any drivers I have seen..."
And the question is, Andy: don't you use Metal on the iPad? That's the main reason Apple created Metal, to avoid data copying between the CPU and the GPU... And if you do use it on the iPad, why not use it on the Mac too?
anon1 Posted July 15, 2017
2 minutes ago, Nikos said: "...don't you use Metal on the iPad?"
Already using Metal.
Nikos Posted July 15, 2017
4 minutes ago, MBd said: "already using metal"
Well, this was my point: it makes sense to use it!