Best hardware specs, Windows PC build for Affinity Designer & Photo



I have a $1700 budget for a PC build that will run Affinity Designer & Photo at their best. I will be designing logos, manipulating photos, and illustrating art for poster prints, t-shirts, athletic wear, and displaying art on the web.

I'm interested in knowing what specs would allow for optimal use of all the tools and features. I'm also considering a 24" 1920 x 1080 IPS monitor with 85% or higher sRGB colour gamut (not included in the $1700). I'd appreciate recommendations for the best graphics card to support the monitor and the software's graphics capabilities: GTX 10 series or Quadro P series? If you could recommend hardware I can put into PCPartPicker and decide my next move from there, I would appreciate it very much.


  • Staff

Hi G822,

Welcome to the forums :)

In terms of recommended systems, our devs have mentioned before that the more cores the CPU has, and the faster those cores run, the better. Something like an Intel Core i5 with 4 cores running at ~3.5GHz.
Our apps will run with less RAM, but we recommend 8 GB where possible. Equally, any app will run from a hard drive, but I would personally consider a high-capacity SSD for the boot drive and main applications a necessity nowadays.
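Serif doesn't publish hard thresholds, but the rough guidance above (4+ cores around 3.5 GHz, 8 GB+ RAM, SSD boot drive) can be turned into a quick sanity check for a parts list. This is a hypothetical helper using this thread's suggestions, not official requirements:

```python
# Hypothetical helper: sanity-check a parts list against the rough
# minimums suggested in this thread (4+ cores, ~3.5 GHz, 8+ GB RAM,
# SSD boot drive). These are forum suggestions, not Serif's official specs.

def meets_rough_minimums(cores: int, base_clock_ghz: float,
                         ram_gb: int, has_ssd: bool) -> bool:
    return (cores >= 4
            and base_clock_ghz >= 3.5
            and ram_gb >= 8
            and has_ssd)

# An i5-class build at the suggested floor passes; a 6-core part with a
# low base clock fails under this (deliberately crude) rule.
print(meets_rough_minimums(4, 3.5, 8, True))   # True
print(meets_rough_minimums(6, 2.8, 16, True))  # False
```

Note the rule only looks at base clock; a real comparison would also weigh boost clocks and per-core performance, which this sketch ignores.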
 
My personal build is currently running an i7-6700K @ 4 GHz, 16 GB RAM @ 3200 MHz and a GTX 1070, and I have no issues running Affinity. However, this build was designed and built with gaming in mind and is potentially nearing 'overkill' for basic editing.
 
It's company policy that we're not allowed to recommend specific systems or hardware, as we can't be liable for any purchases made, but using the above as a guide should help!

Please note -

I am currently out of the office for a short while whilst recovering from surgery (nothing serious!), therefore will not be available on the Forums during this time.

Should you require a response from the team in a thread I have previously replied in - please Create a New Thread and our team will be sure to reply as soon as possible.

Many thanks!


Comparison of Photoshop vs. Affinity Photo requirements:
I have observed that AP makes better use of CPU multicore than PS, but its overall CPU usage is quite high; AP requires less RAM than PS; and AP uses the graphics card less than PS, since more of the image processing is done by the CPU and GPU power is used less. Of course, this may be specific to my hardware configuration (see footer).

Dell Precision T7910, 2x Xeon E5-2630v3, 128 GB RAM DDR4, Quadro K6000, Dell UP2716D, Huion Inspiroy Q11K V2 Pen Tablet, WIN 10 PRO
Dell Precision T3640, Xeon W1290, 128 GB RAM DDR4, Quadro RTX 5000, Eizo CG319X, Huion Inspiroy Q11K V2 Pen Tablet, WIN 10 PRO FWS
-----

Graphic software: Affinity Photo, Designer, Publisher - Adobe Photoshop, Lightroom Classic, InDesign, Illustrator, Acrobat - Zoner Photo Studio - Topaz Gigapixel Ai, Denoise Ai, Sharpen Ai

 

 


Hello Dan C,

I appreciate you responding. I have actually configured various PC builds from Dell, Alienware, Origin PC and custom PC build companies, but I don't trust myself to build a PC right now. I hope you can be patient with me and answer some questions, or point me in the right direction. I won't be gaming on this PC; it will be geared towards graphic design. All the PC builds I've researched have Intel i5s with 6 cores, and i3s with 4. I've read both sides, including that apps like Adobe Illustrator CC & Photoshop don't use more than 4 cores, and I'm not sure how Affinity compares to AI & PS. So my question is: how do these Affinity apps benefit from 2 extra cores (4 to 6)?

Another thing I noticed in an Affinity Designer YouTube tutorial is that the video showed it using just the Intel UHD Graphics 630, while the GTX 1050 in the machine was not being utilised. So my questions are: do certain features, tools and effects in Affinity Designer & Photo require a dedicated GPU? And do these apps notify you when your GPU is being, or should be, used depending on the tool in use?

Also does Affinity Designer have an auto-trace tool?

Thanks


9 minutes ago, G822 said:

Also does Affinity Designer have an auto-trace tool?

It doesn't. I use https://www.vectorizer.io/


Auto trace, nope. You can use Inkscape instead, then export as SVG, PDF, etc., and import back into AD.

From what I've been reading, the Affinity apps use the CPU quite a bit more than the GPU, so that 1050 is fine. You seem to have ruled out the Ryzen 7 2700X, but it has a good number of cores (more than Intel's 6700K, 7700K and 8700K) and performs well in benchmarks. It's also actually cheaper now (Intel prices seem to be rising due to some production plant shortage or such; it was already cheaper with Ryzen anyway). PS has always been mostly single-core, with some filter exceptions; Affinity makes much heavier use of multicore. I don't know whether the Affinity apps are specially tuned for Intel, but by all means, I'd go for that Ryzen. Or for a Threadripper.

AD, AP and APub. V1.10.6 and V2.4 Windows 10 and Windows 11. 
Ryzen 9 3900X, 32 GB RAM,  RTX 3060 12GB, Wacom Intuos XL, Wacom L. Eizo ColorEdge CS 2420 monitor. Windows 10 Pro.
(Laptop) HP Omen 16-b1010ns 12700H, 32GB DDR5, nVidia RTX 3060 6GB + Huion Kamvas 22 pen display, Windows 11 Pro.

 

 


  • Staff
9 hours ago, G822 said:

how do these Affinity apps benefit from 2 extra cores (4 to 6)?

Most of the work done by Affinity is CPU bound, so we always say the more the better, but I couldn't give exact examples of which tools/features benefit as I haven't worked on the back end of our code, apologies.
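You can put a rough number on the 4-vs-6-core question with Amdahl's law, assuming some fraction p of the work parallelizes. The p values below are illustrative guesses, since the actual parallel fraction inside Affinity isn't published:

```python
# Amdahl's law: speedup on n cores when a fraction p of the work is
# parallelizable. The p values used are illustrative assumptions,
# not measured Affinity numbers.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.8, 0.95):
    s4, s6 = amdahl_speedup(p, 4), amdahl_speedup(p, 6)
    print(f"p={p:.2f}: 4 cores -> {s4:.2f}x, 6 cores -> {s6:.2f}x "
          f"(extra gain {s6 / s4:.2f}x)")
```

The pattern matches the staff advice: the more of the workload that threads (filters, raster operations), the more those two extra cores pay off; a mostly serial workload sees almost nothing.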

10 hours ago, G822 said:

Do these apps notify you when your GPU is being, or should be, used depending on the tool in use?

The app doesn't notify you, but you can set the renderer in the settings, so you can choose between the onboard graphics chip and your dedicated GPU.

Certain tools & features will use the GPU more than the CPU, but you won't know which is which unless you're watching Task Manager!

9 hours ago, SrPx said:

You seem to have ruled out the Ryzen 7 2700X

AMD always slips my mind! :/ 


12 hours ago, Dan C said:

AMD always slips my mind! :/ 

Oh! I meant G822, not you :) ... and you were very careful not to recommend any specific hardware. (I was actually referring to G822's second post, where he talks about i3s and i5s (my preference would always be i7) but doesn't mention Ryzens.) Anyway, I've learnt from experience that some Intel users don't even want to hear the brand name mentioned (I'm an Intel user myself, lol). I like both brands; it's just that price per core and performance right now are very much in AMD's favour (due to Intel's production shortage, and Intel was already expensive). At equal prices it would be a tough decision, say between an 8700K and a 2700X for those of us who don't use the PC for gaming anymore. I'd still prefer the 2700X's extra cores, though in Blender Cycles rendering tests the 8700K gets amazing results despite having 6 cores instead of 8. So yes, a tough decision. I'd also favour AMD out of sympathy, for breaking up a bit of a monopoly, and because the platform seems to offer more upgradeability without changing the board. On the Threadripper side, only one of the several models gets really fast clocks, but the top one really has more cores than anything else, many more. The issue I see there is that the drivers and other system support still don't seem 100% mature, so AMD's high-end mainstream part, the 2700X, seems better tested and proven for now. Last time I checked, the best purchase wasn't the model with the most cores but an intermediate one (the Threadripper 1950X, if I remember right), which still had a lot of cores while keeping decent stock clocks (staying around 4.2 GHz or so, at least; otherwise the performance loss is large compared to an 8700K or 2700X).

And it seems Intel has produced a new i9, quite a powerful chip, in the mainstream line (it arrived amid some controversy over a certain "strange" review, btw, but it's a great CPU)... at around $500, if I read right ;) (the 2700X is around $300).

 


I've hidden the wall of text with the spoiler tag. It's just a lot of text about the 9900K and hardware in general.

About the new Intel i9 line... I've read a bit about it. It's a monster in performance, and we're still in the mainstream line! In comparison, the 2700X gets about 70% - 90% of the 9900K's performance (depending on the software), but the 2700X is around $300 while the i9-9900K is as high as $580. Ouch. Not a good price ratio in comparison, IMO. So it all comes down to your needs (OP, IMO you don't really need these, though the more the merrier, especially for some extreme print work) and the money available. For people doing huge RAW image processing who want the highest performance, with money not an issue, I'd go for the 9900K. It's still below the price range of Intel's extreme line, where prices run from $1k to $2k, so it's much better bang for the buck than those.

Now, it would be interesting to check how a Threadripper 2950X (16 cores / 32 threads, the best clocks of the series at 3.5 / 4.4 GHz base/boost, and 180 W TDP rather than the 250 W of the top Threadripper; still reasonable for that many cores) performs against a 9900K (8c/16t, 3.6 / 5.0 GHz, and yes, 5 GHz turbo with NO overclocking... woah). 5 GHz is a lot higher than 4.4 GHz, but 32 threads is a lot more than 16, so things like Blender or Premiere will still do best on Threadripper. A Blender Cycles render that takes 39 minutes on the 8700K takes 32 on the Ryzen 2700X, 18 on the Threadripper 2950X (surely the sweet spot of that platform) and a bit over 10 minutes on the Threadripper 2990WX, none of them overclocked. The issues with Threadripper: besides power consumption, the motherboards are more expensive, as with Intel's extreme line.

It would be extremely interesting to see how an i9-9900K compares to a Threadripper 2950X: the latter has many more cores, but the former has much higher clocks, holds them better when more cores are under load, and has much better IPC.

Still, I don't understand why Intel is releasing parts like the 9700K (8c/8t) and 9600K (6c/6t, and that one is an i5) with Hyper-Threading disabled, when multithreading is a selling point that has helped AMD greatly... :? Unless the yet-to-appear 9800K (or is that a false rumour?) is going to be multithreaded, which would at least give this line a multithreaded i7, not only the super expensive i9.

So yes, for really resource-hungry RAW editing workflows, or general editing of huge images, I would probably put all the money into a 9900K. For people like me, who never touch RAWs and work a lot in 3D rendering and video editing, the 2700X, 8700K or Threadripper 2950X makes a lot more sense (the 2990WX, which I'd never buy, surely blasts past any Intel solution in Blender Cycles rendering thanks to its 32 cores and 64 threads). At those levels of multithreading (and electricity bills) I'm not considering Intel's i9-7980XE: only 2.6 GHz base clock (4-something turbo), 18 cores / 36 threads, but around $2k... hahaha... geez... and it's still easily beaten in 3D rendering by the top Threadripper.

With the 2700X around $300, the i7-8700K at $380 (or overclock the plain 2700, which is only $265 now; overclocked it's still less powerful than the 2700X, but not by much, and it draws quite a bit less power), the i9-9900K at $580, the Threadripper 2950X at $900, the Threadripper 2990WX around $2k, and the i9-7980XE also at $2k (but well below it in certain benchmarks), and with the 2990WX's 250 W TDP prohibitive for power usage and the 2950X and 7980XE close to that edge (180 W and 165 W TDP respectively), IMO the Ryzen 2700X and i7-8700K are still a very fine purchase for users not doing that kind of RAW-based processing of really huge images. For most uses it pays better to put the saved money into more RAM, an SSD, and an average graphics card: a 1050, or better the 1050 Ti 4GB (better not only for having more memory), or a 1060 at most. If you were working heavily with wireframes, I mean 3D meshes, that's when a Quadro or similar pro card makes sense; otherwise I think you're better off with a GTX 1050 or 1060. If you want to do GPU rendering in other apps, then obviously the card is crucial, but that seems very far from your case. (In your case I'd stay with a 1050 / 1050 Ti.)

For the thread poster's usage, a Ryzen 2700X or Intel 8700K seems just about perfect. Rather than going pricier on the CPU, I'd prefer to add more RAM: for example, the $80 price difference between the Intel 8700K and the AMD Ryzen 2700X can be the difference between having 8 GB and 16 GB, and that IS HUGE. On Intel, RAM speed is not crucial; you can do well with 2666 MHz, even 2400. With AMD I'd advise never going below 2666 (motherboards will take that speed without overclocking in the BIOS), and if you're fine with overclocking RAM (very easy), I'd go for 3200, since running RAM at those speeds on the Ryzen platform is a world-changing situation. Don't go 2133 anymore. It's WAY more important to have at least 16 GB (ideally 32, if you have the money). Then add an SSD for the OS boot and programs; not for file saving, unless your work is so RAW-heavy that you'd rather replace the SSD from time to time (wearing through its write cycles sooner) in exchange for working at light speed, with your income justifying the investment. That doesn't seem to be your case; you could probably manage with just a regular hard drive for files, but still consider an SSD for the OS installation and boot, and for installing applications (Affinity, etc.). I'd also redirect the caching options of your various apps (cache folders, even Windows temp, browser caches, etc.) away from the SSD and onto the hard drive.
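One way to see why RAM matters so much for poster-size work is to estimate the raw pixel data of a single layer. This is a rough back-of-envelope sketch; real memory use in any editor will be higher once undo history, tiles and caches are counted:

```python
# Rough memory footprint of one uncompressed raster layer.
# Real memory use in any editor is higher (undo history, tiles, caches),
# so treat this as a lower bound, not an Affinity-specific figure.

def layer_bytes(width_in: float, height_in: float,
                dpi: int = 300, bytes_per_pixel: int = 4) -> int:
    w_px = int(width_in * dpi)
    h_px = int(height_in * dpi)
    return w_px * h_px * bytes_per_pixel

# A 24x36 inch poster at 300 DPI, 8-bit RGBA:
mib = layer_bytes(24, 36) / (1024 ** 2)
print(f"{mib:.0f} MiB per layer")  # ~297 MiB, so a handful of layers eats gigabytes
```

At 16-bit per channel (8 bytes per pixel) the same layer doubles to roughly 600 MiB, which is why 16 GB over 8 GB changes what you can comfortably open.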

If you go for a Ryzen 2700X and don't overclock (not really necessary with this one), then don't put any money into cooling; its stock cooler is more than enough.

 


On 10/21/2018 at 2:32 PM, SrPx said:

Auto trace, nope. You can use Inkscape instead... By all means, I'd go for that Ryzen. Or for a Threadripper.

I love my Threadripper 1950X, but from personal experience I wouldn't recommend the Affinity + Threadripper combo:

 

P.S. Other than with Affinity, I highly recommend the AMD; it's performing extremely well with all the other software I use.

Andrew
-
Win10 x64 AMD Threadripper 1950x, 64GB, 512GB M.2 PCIe NVMe SSD + 2TB, dual GTX 1080ti
Dual Monitor Dell Ultra HD 4k P2715Q 27-Inch


I have had an AMD processor for many years now with no problems, either with Affinity or any other software. See below for my spec.

John

Windows 10, Affinity Photo 1.10.5 Designer 1.10.5 and Publisher 1.10.5 (mainly Photo), now ex-Adobe CC

CPU: AMD A6-3670. RAM: 16 GB DDR3 @ 666MHz, Graphics: 2047MB NVIDIA GeForce GT 630


Aren't we splitting hairs here? I mean, what do all these performance figures mean in real terms? Are we talking milliseconds, seconds or minutes of difference?

Marketers use terms like "blazing fast performance"; what does that mean?

A good system is more than just a CPU; the motherboard and power supply are pretty crucial bits of hardware too. You can have the "fastest" CPU going, but if the system it sits in isn't configured right, it will not perform as you might expect.

iMac 27" 2019 Sonoma 14.3.1, iMac 27" Affinity Designer, Photo & Publisher V1 & V2, Adobe, Inkscape, VectorStyler, Blender, C4D, SketchUp + more... XP-Pen Artist-22E, - iPad Pro 12.9
B| (Please refrain from licking the screen while using this forum)

Affinity Help - Affinity Desktop Tutorials - Feedback - FAQ - most asked questions


11 hours ago, verysame said:

P.S. Other than with Affinity, I highly recommend the AMD; it's performing extremely well with all the other software I use.

Absolutely agree! I was supposed to be working on a Ryzen machine already, because I thought this dinosaur would have melted by around June (it gave some warnings)... but it's still kicking. A 2700X, a Threadripper, or at the very low end a 2400G, are fantastic examples of a great money/performance ratio. Of course the Intel 9900K and 8700K are fantastic purchases too, but AMD clearly gives the best bang for the buck in every range, so it's not just a matter of being cheap: in the high end you get a lot more for the same money than with Intel. And I don't really need 5 GHz as much as I need cores for rendering (mostly 3D rendering, and video editing from time to time, though that tends to be intense). I've worked a lot in those areas, and threads are EVERYTHING; the clock and everything else matter too, but rendering is where multiple cores and hyper-threading bring the biggest performance gain.

4 hours ago, John Rostron said:

I have had an AMD processor for many years now with no problems, either with Affinity or any other software. See below for my spec.

An A6! That one seems to be from 2011, am I right? Another dinosaur lover; hello, my friend (I win... 2009... :D). I will never EVER forget how an Athlon (a sweet, well-built machine) solved some show-stopping issues in getting smooth brush behaviour back in 2001, at very critical moments for the company, at a super small game studio. True, the Intel machines that couldn't deliver what we expected were just Celerons (with good cards and plenty of memory), but that's the same sad story as always: yes, those weren't meant for "workstation" usage, BUT the price was the SAME as the Athlon's!! Fine, AMD already had its own low-end CPUs in the Celeron range back then (which probably cost half the money, lol), but it's just like now. Intel fans (I'm no AMD or Intel fan, btw) complain at review sites that it's unfair to compare machines targeted at different segments just because they cost the same. To me, that's the very fair and right punishment for Intel, given those prices, generation after generation without great improvements, especially in recent years up to and including Kaby Lake (far gone are the glorious days of Sandy Bridge), and everything Ryzen has provoked...

At the end of the day we're all looking for savings, so if top performance means $580 (and it IS in the mainstream line!! versus just $300 for AMD), I look at what I can do with $580 on the AMD platform instead. By that criterion, AMD is more than ever a win-win. That said, Intel has had areas with a very different story; at the very low end it has had some of the best bargains. Right now EVERYTHING is expensive, but until around June there were very sweet spots (sadly not the type of machine I'd be interested in; we all need power around here), like the "Pentiums": the G4560 (for a long time the sweetest price/performance ratio ever), the G4600, even the G4400 were absolutely great purchases. They still are, but prices are increasing a lot, and keep climbing, I guess. It depends on your country, but prices are rocketing due to those Intel production plant shortages (or whatever the real reason is, who knows). Right now AMD is strong even in the low range: a Ryzen 1600, now that the BIOSes and drivers have received so many great updates, is still a great low-end purchase, and really cheap; overclock it and you have a very decent machine for small money. Or go even further with a 2400G and avoid the overpriced graphics cards altogether (prices aren't growing anymore, but they have NOT returned to what they were before the bitcoin mining fever). In benchmarks its integrated graphics get significantly better performance than the integrated graphics in the i3-8100, or any other Intel integrated solution. Of course, even a GTX 1050 2GB is still much better, but you get a fully functional PC for very little money, and for productivity that's no big deal.

Wise Intel users are *VERY* grateful to AMD (even if they'd never consider jumping wagon)... they weren't going to get an 8c/16t chip for a very long time if it weren't for Ryzen ;). I still remember the anger among loyal Intel users when each of the latest generations was released (meanwhile, I sat all of it out with my trusty 860, lol): with Skylake, with Kaby Lake, with practically all of them, and with huge pricing in the extreme (Core-X) range. Now the monopoly is over, and the users are the winners... (I hope that sounds familiar ;) )


4 hours ago, John Rostron said:

I have had an AMD processor for many years now with no problems, either with Affinity or any other software. See below for my spec.

John

Too bad; in my case it's a total disaster when it comes to Affinity.


1 hour ago, firstdefence said:

Aren't we splitting hairs here, I mean what do all these performance speeds mean in real terms? are we talking milliseconds, seconds or minutes differences.

These marketeers use terms like "Blazing fast performance" what does that mean?

A good system is more than just a CPU, the Motherboard and Power Supply are pretty crucial bits of hardware. You can have the "fastest" CPU going but if the system it sits in isn't configured right then it will not perform as might be expected.

Of course, bottlenecks can happen if one is not aware of how all this works...

BUT... it's not a small difference, I can tell you. Just look at rendering benchmarks with actual Blender scenes, which you can download and test yourself in Blender's Cycles renderer: the BMW scene, etc. Even looking at a reviewer who did not overclock and set everything up conservatively, all stock (much the way I work; I never overclock), it's still a significant difference. Rendering the Gooseberry scene in Cycles takes around 38 minutes on the Ryzen 2700X but 32 minutes on the 9900K. The BMW scene (always in Cycles, which if I understood correctly is now effectively the main renderer in Blender, and the one I use): 4:25 vs 4:01. The Classroom scene: 14:42 vs 13:30. Now, is that huge? Nope, in minutes. BUT imagine having to do those renders ALL day as your main freelancing work, or your actual job: then it's CRUCIAL. It doesn't always scale proportionally, but substitute hours for minutes: 32 hours versus 38 hours can mean almost a working day lost (on the rendering machine, at least).

And that's in areas where AMD tends to shine. If we go to single-core-focused activities, like working with Photoshop and some other applications (though the trend is to thread everything that can be threaded), then the 5 GHz without overclocking of the 9900K versus the 4.3 GHz of the 2700X, or the 4.2 of an overclocked 2700 (I compare that one because most people buy it to overclock), is going to be very noticeable IF you run very CPU-heavy tasks. Of course, with light editing you might not see any difference even against an overclocked 2600, or a 1700, because your activity never asks the CPU for most of its power. BUT when working with large files there will be plenty of situations where this kind of power is very noticeable, and that type of project happens mostly in professional gigs and professional work at companies. The hobbyist will prefer by a mile to wait half an hour longer on a 4-hour render if it saves them $280, obviously; for a pro, it will depend entirely on the specific needs. Anyway, AMD now covers every range, especially for 3D rendering: nothing from Intel in a single machine (render farms of Xeons aside; you can do crazy setups with any brand) can beat the 2990WX. And for my own (theoretical, lol, I'm never going to spend that money) sweet spot in the Threadripper line, the 2950X: as a comparison with the render times mentioned before, that same Gooseberry scene, which took 38 minutes on the 2700X and 32 minutes on the 9900K, takes only between 21 and 22 minutes on the TR 2950X.
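The price/performance trade can be made concrete with the numbers quoted above. The street prices are the 2018 figures from this thread, so treat them as illustrative, not current:

```python
# Price vs. render-time trade-off, using the Gooseberry render times and
# 2018 street prices quoted in this thread (illustrative, not current).

builds = {
    "Ryzen 7 2700X":      {"minutes": 38.0, "usd": 300},
    "Core i9-9900K":      {"minutes": 32.0, "usd": 580},
    "Threadripper 2950X": {"minutes": 21.5, "usd": 900},
}

baseline = builds["Ryzen 7 2700X"]["minutes"]
for name, b in builds.items():
    speedup = baseline / b["minutes"]  # render-time speedup vs. the 2700X
    print(f"{name}: {speedup:.2f}x vs 2700X at ${b['usd']} "
          f"(${b['usd'] / speedup:.0f} per unit of speedup)")
```

Run like this, the 9900K's ~19% speedup costs nearly double, which is the "not a good price ratio" point above in numbers; the 2950X only makes sense if rendering dominates your day.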

So yes, between those tiers the differences are more than synthetic in real-life usage, just as I could see how much more slowly I'd edit video on an i5 than on an i7 with otherwise identical components, a decade ago...

In PS, I'd expect the real-life performance of a 9900K at 5 GHz (that's just turbo, and it also holds higher clocks than the AMD when all cores are under load, plus the IPC advantage; and that's before overclocking, where the Intels have far more headroom than any AMD solution to date) to make a significant, genuinely work-affecting difference in PS and in any other mostly single-threaded app, or any app that mostly wants high single-core clocks. I have seen too many benchmarks showing large time differences when running packs of very mundane actions, both light and heavy PS loads, whenever the clock is higher. It's... time. Again, I'm still working on an 860, and it gets the job done; many of us have learnt fast workarounds to avoid CPU bottlenecks in whatever application, and have optimised our workflows a lot overall. Even so, with a faster machine you will work a lot faster, and on a slower machine there are a bunch of very heavy tasks it simply won't be able to accomplish.

But for the general freelancer, or even a pro worker, I'd say the 2700X is a very good spot. For working at a triple-A game studio, I'd expect a TR 2950X, one of these 9900Ks, or one of the Intel Core-X extreme range, with a 1070/1080 installed, a bunch of RAM, and SSDs, simply because the software they use every day, and the level of detail, demand it. (Though I bet a lot of studios still have their poor artists working on crappy old machines, lol.) For the tasks at hand, it would be crazy not to use something with that power. I suppose photo labs dealing with RAW images at a very professional level are a similar case: with what an Eizo pro monitor costs, I wouldn't understand cheaping out on the CPU.


1 hour ago, verysame said:

Too bad; in my case it's a total disaster when it comes to Affinity.

There have been reported issues with specific apps on Threadripper (not only your case with Affinity), from audio editing to games to 3D and so on. A chunk of that comes not just from the apps but from the system libraries most developers use, which are designed and tuned for Intel CPUs, and in several cases it's fixable. Apologies if you're fully aware of these already: careful RAM setup is a must on TR/Ryzen, meaning fast memory (3200, set via overclocking in the BIOS; it's a different world once done) and quad-channel rather than dual-channel (4 sticks, not 2), as that seems to be the way with TR. Also run the latest BIOS and all drivers, and make sure two devices aren't sharing the same PCIe channel, as that can lead to slow redraw among other things (like poor SSD performance). Or maybe it's some incompatibility at the CPU level, some processor feature, who knows... The 2950X has higher clock speeds and other advantages, but the 1950X is a beast anyway...


13 minutes ago, SrPx said:

There have been reported issues with specific apps (not only your case with Affinity)... and in several cases it's fixable.

Yup, thank you SrPx, I'm aware of those situations to watch out for.

What can I say? I'm running a lot of software here (Blender, C4D, the CC suite, CorelDRAW, Fusion, DaVinci, Darktable, RawTherapee, and other smaller graphics apps), and as weird as it may sound, Affinity is the only software that sends the CPU/power usage through the roof.

Andrew
-
Win10 x64 AMD Threadripper 1950x, 64GB, 512GB M.2 PCIe NVMe SSD + 2TB, dual GTX 1080ti
Dual Monitor Dell Ultra HD 4k P2715Q 27-Inch


Maybe it's something crazy, but... have you played with Creator Mode and Game Mode in the Ryzen Master software? Probably you have... I believe Threadrippers ship in Creator Mode by default, since Game Mode is for games: it disables half of the cores (and threads), turning the chip into a much less powerful CPU, kind of a "really good" 1800X, but it improves core-to-core latency (not an issue on Intel) and memory latency. Who knows! (I'd still check the other things mentioned earlier.) Maybe in the Affinity apps, the way TR handles latency is what's hurting your workflow... I'm not saying Game Mode is the recommendation for a 2D application, but with a beast like that I'd test any crazy random thing to get things right...

Of course, it'd be a pain to have to reboot after using an Affinity app, but if the plan was to mostly use the Affinity suite it might be worth it, especially if you don't need to swap to other apps constantly... After all, Game Mode converts the TR into an overpowered Ryzen, so to speak, while keeping the other advantages of a TR; it wouldn't be slow at anything anyway... What I doubt is that the vendor shipped you the CPU in Game Mode and that this is what's causing the performance problems (the opposite case), as I'm sure you'd have noticed with any monitoring utility... My only hunch is that Game Mode might improve PCIe-lane, memory and core-to-core latency, and that could prove critical for Affinity for some reason. It's fun to try, after all :D. Just remember to switch it back before a Blender/V-Ray render, as rendering with 32 threads is "kind of" much better :D.

Sorry if these are all uber-basic things you already know.
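A reboot-free way to approximate one aspect of Game Mode for a single app is to pin that app's process to half the logical cores. A hedged sketch: this uses `os.sched_setaffinity`, which exists on Linux (on Windows you'd use `start /affinity` or Task Manager instead), and it only limits scheduling; it does not replicate the NUMA/memory-latency changes Ryzen Master actually makes.

```python
import os

def half_core_set(total: int) -> set[int]:
    """First half of the logical CPUs, always at least one."""
    return set(range(max(1, total // 2)))

cores = half_core_set(os.cpu_count() or 1)

# Pin the current process (pid 0 = self) to that subset where the
# platform supports it; child processes inherit the restriction.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, cores)
    print(os.sched_getaffinity(0))
```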


29 minutes ago, verysame said:

Yup, thank you SrPx, I'm aware of those situations to watch out for.

What can I say? I'm running a lot of software here (Blender, C4D, the CC suite, CorelDRAW, Fusion, DaVinci, Darktable, RawTherapee, and other smaller graphics apps), and as weird as it may sound, Affinity is the only software that sends the CPU/power usage through the roof.

Then it's crazy, indeed....


3 hours ago, SrPx said:

Maybe it's something crazy, but... have you played with Creator Mode and Game Mode in the Ryzen Master software? Probably you have... I believe Threadrippers ship in Creator Mode by default, since Game Mode is for games: it disables half of the cores (and threads), turning the chip into a much less powerful CPU, kind of a "really good" 1800X, but it improves core-to-core latency (not an issue on Intel) and memory latency. Who knows! (I'd still check the other things mentioned earlier.) Maybe in the Affinity apps, the way TR handles latency is what's hurting your workflow... I'm not saying Game Mode is the recommendation for a 2D application, but with a beast like that I'd test any crazy random thing to get things right...

Of course, it'd be a pain to have to reboot after using an Affinity app, but if the plan was to mostly use the Affinity suite it might be worth it, especially if you don't need to swap to other apps constantly... After all, Game Mode converts the TR into an overpowered Ryzen, so to speak, while keeping the other advantages of a TR; it wouldn't be slow at anything anyway... What I doubt is that the vendor shipped you the CPU in Game Mode and that this is what's causing the performance problems (the opposite case), as I'm sure you'd have noticed with any monitoring utility... My only hunch is that Game Mode might improve PCIe-lane, memory and core-to-core latency, and that could prove critical for Affinity for some reason. It's fun to try, after all :D. Just remember to switch it back before a Blender/V-Ray render, as rendering with 32 threads is "kind of" much better :D.

Sorry if these are all uber-basic things you already know.

Thank you, SrPx,

I tried everything. I even called Dell support and we went through a series of steps:

1) Updated the BIOS

2) Updated the Alien Profiles in order to give more power to Affinity

3) With the remote connection, they even tweaked other settings, which didn't hurt, as I now get a little more performance in the other software :)

The result: nothing has changed on the Affinity side.

It's also worth noting that the high fan/CPU issue has been reported by other users as well. In other words, maybe it isn't related to the Threadripper at all, who knows?

I don't know what else I can do to support the developers (that sentence sounds kind of weird); the situation feels somewhat hopeless.
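For what it's worth, when filing a report for the devs it helps to attach exact system details. A small hypothetical helper using only the Python standard library (the field names are my own, not any official report format):

```python
import json
import os
import platform

def system_report() -> dict:
    """Collect basic machine details worth attaching to a bug report."""
    return {
        "os": platform.platform(),
        "machine": platform.machine(),
        "processor": platform.processor() or "unknown",
        "logical_cpus": os.cpu_count(),
        "python": platform.python_version(),  # only relevant for this script
    }

print(json.dumps(system_report(), indent=2))
```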


They remote-fixed your computer? Wow. (Well, I know it's very easy today; I've done it for free for friends and family, lol. But still, I'm impressed: vendors are getting really advanced in their level of service.)

So you have a Dell Alienware. Those tend to be very good machines. I like Dell for build quality, endurance, very low noise, low temperatures and performance (at least in my personal experience, YMMV: my last companies, all my family members, several friends... my own machine isn't one, it's built part by part, unbranded, Newegg style), and Alienware is their deluxe high-performance line. Maybe that's the problem: some custom setting, some out-of-the-ordinary, non-standard feature...

Anyway... I see... You've surely tried everything already.

Well, that's all I've got... no clue...

Maybe it's something triggered in certain new processors, some CPU feature... It's definitely not happening on old typewriters like mine. If it were happening to every user with a modern machine it would almost make more sense, but it isn't; it's hitting a small group, that's a fact. So yes, it must be something really specific...


Yes, this time I decided to go with Dell mainly for their warranty & support.

Other than the remote support, which is handy, the best part comes if some piece of hardware fails and needs to be replaced: they send a technician to your home or office within the next 1-2 business days. As I use this machine for work, I didn't want to risk building it myself, as I did with the previous one. Don't get me wrong, that build was worth the money, but not the hassle: after a month the mobo failed, and I'll let you speculate on the consequences. I can't afford to waste that much time anymore, so I preferred to pay some extra bucks and get Dell's warranty. I'm not aware of another company that does the same, and that warranty applies all over the world.


Yeah, the brand seems to be one of a kind, even more so than it has always been, from what I'm reading. And absolutely: there are times in life when the money needs to be spent or you end up in a much worse situation, and in your case downtime would simply be very counterproductive.

There's been some criticism of the brand for a few years, and I have no idea why (other than waiting for a monitor to come back from repairs, or a delayed order; I wish those were the worst issues with most brands!). My old boss used Dells for absolutely everything, and when I asked him why, he told me there was simply a large difference in everything compared with any other PC solution: service, part quality, performance, parts staying error-free for longer. I agree, and I thought so even before my long time at my last company. The two Dells I bought for my parents seven or eight years ago still run so silently that it's impossible to tell they're on when the screen is off, and they're like brand new in every respect.

Back in ancient times, a neighbour had two PCs and I went to fix her sister's; this was the original Pentium era. One was a Dell Pentium 100 (MHz), the other a generic unbranded box clocked at 150, with the same RAM and disk (the card in the Dell was actually lower-spec). I ran all sorts of tests because I found it shocking: the Dell was way faster than the Pentium 150 in a bunch of them. There was nothing particularly different about the CPUs either, and the board in the Dell was, if anything, lower-end! I've seen this with other Dells too (not all of them). Probably their labs just have stricter habits and set the bar higher, but in many cases they perform better than the specs suggest. To me they're the Macs (or what Macs once were) of the PC world...



I have a question here and would like an answer from one of the devs if possible.
I have a small Acer Aspire with an AMD A4-1250 dual-core processor, 4096 MB (4 GB) of RAM and an AMD Radeon HD 8210.

Do you honestly think this can run the Affinity suite, even for small projects?
What settings do you recommend for such a small config?

I personally want to use it for my on-the-go work because it has a touch screen and it's simply small but cool.

Blessings.

Never be the Same Again !
---
Dell Optiplex 5090 SFF
Intel Core i5-10500T @2.30GHz with 12GiB 2666MHz DDR4
Intel UHD Graphics 630 for 10th Generation
M.2 2280, 512 GB, PCIe NVMe Gen3 x4, Class 40 SSD

Windows 11 Pro x64 22H2 + LibreOffice 7.5.3


Only 2 cores, 2 threads (no multithreading), 1 GHz with no turbo, and only 4 GB... Good luck! :)

Anyway, I once had to work at a library that was left (many years after they were truly obsolete; it must have been well after 1997!) with an enhanced 286 and a very crappy 386SX. I could still use graphics programs on those and turn out some serious projects. I'm guessing you could do something, but the general advice for 2D software is a CPU of at least 2 GHz. You'll probably be limited to mock-up work at screen resolution, never print-resolution files... The devs are very busy these days, so I'll at least tell you what I think.
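To see why 4 GB of RAM rules out print-resolution work, the arithmetic is simple: pixels per side = inches times DPI, and one uncompressed RGBA raster layer costs 4 bytes per pixel. A quick sketch (the poster size is just an example):

```python
def layer_bytes(width_in: float, height_in: float, dpi: int,
                bytes_per_px: int = 4) -> int:
    """Uncompressed size of one RGBA raster layer at a given print size/DPI."""
    w_px = round(width_in * dpi)
    h_px = round(height_in * dpi)
    return w_px * h_px * bytes_per_px

poster = layer_bytes(24, 36, 300)               # 7200 x 10800 px
screen = layer_bytes(1920 / 96, 1080 / 96, 96)  # a 1920 x 1080 mock-up

print(poster // 2**20, "MiB per layer at print size")   # ~296 MiB
print(screen // 2**20, "MiB per layer at screen size")  # ~7 MiB
```

A few layers plus undo history at poster size would exhaust 4 GB on their own, while screen-resolution mock-ups stay comfortably small.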

On the bright side, Affinity uses considerably fewer resources than Photoshop and other apps, so that might play in your favour.

A machine like this is one of the very few cases where trying to overclock makes total sense to me. That said, I have very little knowledge in that area. But I'd definitely try to overclock it (if that's physically possible, which I don't know) and probably replace one of the two 2 GB RAM sticks with a 4 GB one, so you end up with 6 GB, which would surely give you much better overall performance. But only if the whole thing is really cheap to do; otherwise I wouldn't bother. Also, it's a touch-screen machine, and whether it's a laptop or an all-in-one, those are sometimes hard or risky to open up.

Also, remove all the crapware these things come preinstalled with, to keep background processes and memory usage to a minimum.

 

