Everything posted by rvst
-
The slider should never run from zero to 1TB when the system only has 128GB of RAM installed; it should run from zero to 128GB. That was the whole point of my original post: the correct way to initialise this slider is to set its range to 0 - [MAX RAM]. For sliders with a large range, the typical way most software offers finer control is to add small up/down arrow buttons next to the value field, so the value can be fine-tuned once the slider has been moved to approximately the right position.
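To illustrate the initialisation I have in mind, here is a minimal sketch in Python using the psutil library; the `slider` object and its `minimum`/`maximum` attributes are just stand-ins for whatever widget API is actually in use:

```python
import psutil  # cross-platform system information library

def init_ram_slider(slider):
    """Clamp the RAM-usage slider to the physically installed memory."""
    installed_mb = psutil.virtual_memory().total // (1024 * 1024)
    slider.minimum = 0
    slider.maximum = installed_mb  # e.g. 131072 on a 128GB machine, never a fixed 1TB
```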
-
Sorry, with all due respect, I don't buy that it's easier for the user. Two posts in two days - one by a user with LESS than 64GB of RAM and one by a user with MORE - suggest that, far from being easier to use, it's actually confusing people, since they expect the slider's maximum value to correspond to the installed physical memory. That is how almost all other software handles it. So if it's a design choice rather than a bug, it's a bad design choice in my humble opinion, especially since it's not obvious that the maximum value can be overridden manually.
-
Just to underscore this (and to demonstrate that overriding the slider with an explicit value in the field works), here are two screenshots of Affinity Photo using a LOT of RAM. In the capture below you can see that nearly 90% of the RAM in the system is in use - 113GB out of 128GB in total. So Affinity can easily use more than 64GB of RAM.
-
Yes, of course. I'm simplifying, as this isn't a technical forum and the discussion isn't really relevant to the fact that there is a bug in the Performance settings of the Affinity products. For the record, I'm a Linux kernel systems programmer, so I'm sufficiently familiar with OS-level architecture. Kernel pages will never get swapped out, and some memory is reserved for kernel routines, but beyond the base operating system requirements a process can theoretically use up to the maximum amount of memory available to userspace processes.
-
While it is relatively rare for a process to use a large amount of resident RAM, it certainly can happen, and graphics programs are among those likely to use a lot of it. If a page is in a working set (i.e. accessed frequently), it won't be swapped out to virtual memory. With sufficient RAM available, most operating systems will not swap to disk much; on Linux this behaviour can even be tuned through `/proc/sys/vm/swappiness`. Given its origins, I imagine macOS can do something similar. 64-bit processes can address far more than 4GB of RAM - the 4GB address space is a limit for 32-bit processes. I actually use Windows 10 and not macOS; my understanding is that the Windows virtual memory manager tends not to swap until physical RAM is under pressure, but I'll need to check this to be certain. Incidentally, the macOS kernel (XNU) combines the Mach microkernel with BSD components, and FreeBSD descends from the same BSD Unix lineage... This is all good info, but ultimately not a good reason to leave what is clearly a bug in place...
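For anyone curious, reading the current value of that tunable is trivial; a minimal sketch in Python (Linux only, of course):

```python
# Linux only: read the kernel's swap-aggressiveness tunable.
# 0 means avoid swapping where possible; the default is typically 60.
with open("/proc/sys/vm/swappiness") as f:
    print(f"vm.swappiness = {int(f.read())}")
```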
-
This appears to be a bug. See the recent thread I posted on the same subject: https://forum.affinity.serif.com/index.php?/topic/152526-affinity-photodesignerpublisher-performance-settings-ram-usage-limit-bug/ It does the same for me, and in effect I have double the amount of RAM that Affinity could use. The slider should, if correctly initialised, have a range of 0 - [installed amount of RAM].
-
I don't think that is the case - Affinity doesn't pre-allocate all the RAM; this slider just limits how much of it the application can use. So in the default configuration, on a machine with 128GB installed it will only ever use half the available RAM. It is entirely possible for one app, Affinity included, to use all the RAM if multiple very large images with many layers are open and being edited; it's simply that at that point some swapping to virtual memory will start to occur. I grant you, though, that this is likely to be an edge case - not many people are going to use more than 64GB of RAM. But it is a bug nonetheless...
-
I think I have tracked down the problem. It's an issue of ICC v2 versus v4 profiles. All of the professional graphics software I have displays these JPGs - the ones using the sRGB IEC61966-2.1 profile and the other color.org profiles - identically, whereas some of the colour-managed image viewers I use don't; they give differing results. So I extracted the sRGB IEC61966-2.1 ICC profile from one of the offending JPGs using the Argyll CMS tools and then inspected it with exiftool. It turns out that it is in fact a version 4 ICC profile. The reason I thought it was an Affinity issue is that all my other graphics packages export to JPG using a v2 ICC profile, not a v4 profile. Windows Photo Viewer (the colour-managed viewer carried over from Windows 7, not the unmanaged Photos app that is the default in Windows 10) is known to have compatibility issues with v4 profiles, as I imagine do some other apps. So this is clearly the cause of the problem... Thanks for the help to all those who responded to this thread.
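If anyone wants to check their own exports without Argyll or exiftool, the profile version can be read straight from the ICC header; a quick sketch in Python with Pillow (the file name is just a placeholder):

```python
from PIL import Image

icc = Image.open("exported.jpg").info.get("icc_profile")  # placeholder file name
if icc:
    # Bytes 8-9 of the ICC header hold the profile version:
    # byte 8 is the major version, the high nibble of byte 9 the minor.
    print(f"Embedded ICC profile version: {icc[8]}.{icc[9] >> 4}")
else:
    print("No embedded ICC profile")
```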
-
Curious to know what your machine specification is (RAM, CPU, and what type of disk drive)... and some details of the file you're working on: resolution, bit depth, number of layers and filters? On my old machine I noticed a lag similar to the one you mention when working on very large and complex files with many filters and layers. Very complex files chew through RAM, and if your machine is short on RAM and swapping to disk, that will slow things down enormously.
-
Forgive me, I don't mean to be snarky in my reply. I know how these different colour spaces work - it's the very reason I use ProPhoto RGB, since it is wide gamut and so are my monitors; I get better colour precision and gradation by working in a wide-gamut colour space. In a properly colour-managed viewer, a ProPhoto RGB and an sRGB version of the same photo with embedded ICC profiles look identical on my monitor, as well they should. The only time they don't look the same is if the ProPhoto RGB file lacks an embedded ICC profile, or the viewer is not colour managed, in which case the colours in the ProPhoto RGB version look washed out.

The problem I am referring to is quite different. With Affinity Photo (and ONLY with Affinity Photo), when I export a file to the sRGB colour space using the default sRGB IEC61966-2.1 profile, the file undergoes a transformation of some kind - it looks as if a non-linear tone curve has been applied to the output. I can take the IDENTICAL TIFF file, load it into multiple other applications (Lightroom and On1 are two examples of other software I use) and export it to an sRGB JPG, and in every case except Affinity Photo the result looks identical to the ProPhoto RGB version.

To investigate further, I downloaded the v4 and v2 sRGB ICC profiles from https://www.color.org/srgbprofiles.xalter, imported them both into Affinity Photo and used each as the target sRGB profile for my JPG export. The resulting sRGB JPG is correct - it matches the ProPhoto version and is quite different from the version exported with the default sRGB IEC61966-2.1 profile. There appears to be something wrong with the sRGB ICC profile in Affinity (I'm assuming this profile ships with Affinity, but I may be wrong).
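For what it's worth, the same ProPhoto-to-sRGB conversion can be reproduced outside any GUI application as a sanity check, using Pillow's ImageCms bindings to LittleCMS. The file names and profile paths below are placeholders, and this assumes an 8-bit RGB TIFF:

```python
from PIL import Image, ImageCms

im = Image.open("master_prophoto.tif")  # 8-bit RGB master; placeholder name

# Placeholder paths: a ProPhoto profile and the v2 sRGB profile from color.org.
prophoto = ImageCms.getOpenProfile("ProPhoto.icm")
srgb = ImageCms.getOpenProfile("sRGB_v2.icc")

# Convert the pixel data from ProPhoto RGB to sRGB via LittleCMS.
out = ImageCms.profileToProfile(
    im, prophoto, srgb,
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
)

# Embed the sRGB profile so colour-managed viewers interpret the JPG correctly.
with open("sRGB_v2.icc", "rb") as f:
    out.save("check_srgb.jpg", quality=95, icc_profile=f.read())
```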
-
I use ProPhoto RGB as my working space. When I export to a JPG using the sRGB IEC61966-2.1 profile (and of course I embed the profile), I end up with a substantially different-looking photo to the original, almost as if a non-linear S-shaped curve had subsequently been applied to it. The difference is marked. I've taken to exporting the file to TIFF with a ProPhoto RGB profile, importing it into Adobe Lightroom and then exporting that to sRGB, which is a rather convoluted workflow. This way, the resulting sRGB JPG looks exactly like what I have on screen when the ProPhoto RGB file is loaded in Photo. Curiously, if I export with a *wsRGB profile instead, I get the expected result - the JPG looks identical to the original file. But of course I don't want to do that, since it's not a properly standardised ICC profile. Anyone have any idea what's going on here?
-
When soft proofing, it would be nice to have a context-menu option on the soft proof layer that lets you make a selection from the gamut-warning area. This would allow, among other things, creating a mask so that adjustments affect only the out-of-gamut areas. I've worked out a way of doing this, but it is very clumsy. See the attached images for a visualisation of what I would like to be able to do in two clicks. Oh, and I'll add that my hack method isn't perfect if your image contains any 50% grey areas...
-
Actually, this is possible, if a little imperfect and very clumsy:

1. Turn the gamut warning on, then choose Edit > Copy Flattened.
2. Choose File > New From Clipboard. You will get a pixel layer with the gamut warning shown in 50% grey.
3. Choose Select > Select Sampled Colour. Alt-click the dropper and drag it onto the 50% grey.
4. Lower the tolerance to 1% and click Apply to create the selection.
5. Create a mask from the selection, then copy the mask and paste it back into the original document.

It would be nice to have a context-menu option on the soft proof layer that allows a selection to be made from the gamut-warning area.
