jorismak

Members
  1. I believe this is because the cropping (and other stuff) can be 'non-destructive'. Pick the layer you're working on / working with, right-click it and try 'rasterize'. See if the picture 'received' by Color Efex matches better now.
  2. Same here. Switching the blend mode to 'add' crashes it. (I had a source layer dragged into it and enabled the 'equations' checkmark, but hadn't changed any value yet. Blend mode to 'add' -> crash. This is in 1.6.0.75.) On a related note, can someone point me in the direction of doing the same as the following Photoshop Apply Image screenshot (note the 'invert', the 'add' blending and the scale of 2)? Basically, I want to compare two layers and make the 'difference' between the two, so that I can set the new 'difference' layer to 'linear light' and apply the changes that way. (So: compare two layers to make the difference between those two, then apply those differences to a completely different layer; see the first sketch after this list for the math.) (And no, I can't use the built-in frequency separation, I'm using this for something else.)
  3. Where do you see the option to 'convert to 8 bit'? I have never touched 8-bit mode in Affinity since the very first Windows beta.
  4. Although I congratulate you on the release of the iPad version, can you share something about how the development teams are allocated within Serif? Seeing an iPad release kinda stings, given the poor state the Affinity Photo 1.5 release is in (and the 1.6 betas not showing much improvement). If they are separate development teams, then I have nothing to whine about. If resources actually went from Affinity Designer / Photo for Windows towards an iPad version, I have serious doubts about the future commitment to that project... That being said, I don't use the Mac version of course; are Affinity Photo 1.5 and the 1.6 beta having as many issues on Mac as they do on Windows?
  5. I know a lot of people who want to 'start' with the JPEG output in a RAW editor and tweak it from there. That is what I meant. The same with all the (early) X-Trans users who just plain liked the JPEG output more but couldn't get ACR to match it closely :). Anyway, it was just a way to explain _what_ was happening, and that it is not a fault _in theory_. But you absolutely have a point that 'more pleasing out of the box' will help, especially when people start comparing raw converters. It's nice that it can produce great results, but if you need a lot of tweaking to get there, people tend to start using other stuff. Especially in the pro world, where the mantra 'time is money' is simply true. A raw converter that needs three sliders tweaked a little bit vs a raw converter that doesn't is very clear in my mind :). I also think the 'pleasing from the start' depends on how different cameras and raw files are supported. Some people have been complaining that their raw files start out way too dark, and it appears a new (beta) update fixed it. I believe (as an example) people didn't think the raw development was OK from Olympus OM-D E-M5 Mark II files, but my OM-D E-M10 Mark I files came out great from the start. So it depends on the camera body and everything. Maybe make an official 'support' request with a sample raw file from you, so they can tweak the default response? It wouldn't be the first time they did that, from what I read on the forum.
  6. It means that the output C1 and ACR create is most often not at all what your camera would create if it was set to JPEG. C1 and ACR don't try to clone the look of your camera; they develop the raw how they think it should be developed, and that means it can look different. Most notably, C1 uses a film-like curve by default that gives the images some extra 'pop' from the get-go, and ACR also has a tone curve applied that is more meant to normalize the output of all the sensors out there, so they all come out pretty similar in response. In other words, ACR (in 2012+ mode anyway) applies a tone curve and leaves the output 'neutral'. Both the C1 and ACR methods give lower contrast around the levels of well-exposed skin tones (let's say around 55% to 65% IRE), which is why they often seem 'brighter' in the mids. That all being said, AP does little to nothing to make the images 'pleasing' with default values. It's rather bare-bones and straight-shooting in the RAW development -> it does little to the data so you can tweak it any way you want. Yes, that means you might have to apply a curve with a slight S-shape to it. This will make the image pop a bit more and boost anything from 50% and higher, giving you the brightness in the mids you seem to be lacking. And if the curve is applied in luminance-only mode (something like LAB mode), that means your brightness increases and gives pop, but the color intensities stay the same (there's a small sketch of this after this list). I'm betting that if you brighten that image, you'll see the colors don't seem so saturated anymore. Since they're darker, you see more of the color. If you add white (make it brighter), the color seems less intense. So, you're absolutely right in what you see, but I don't think it's viewed as 'an issue', just a different output by default. And like I said, AP does way less to your files 'by default' than other programs like C1 and ACR do. ACR does a lot to your file that you can't turn off :). Maybe the other way of viewing things is like this: there is no 'one correct way' of developing the RAW data, only multiple ways to do it, and you can choose which you find more pleasing or which takes the least work to get to a pleasing result. Most often photographers want the RAW converter to match the camera's JPEG output by default (since that is what they saw on the display when they took the picture :)), but like I said, there is no one 'correct' way. Your camera is just one way to deal with the sensor data; programs do other stuff with it. Did you ever take a look at the (free / open-source) RawTherapee? It gives you _all_ the options. There are like six options to set brightness + contrast in there. Normal programs pick one, but there are multiple ways to do things.
  7. MEB replied on April 30 already.. just sayin' :)
  8. If you switch from Develop to Photo and then back to Develop, you're not working on your RAW data when you switch back to Develop; you're working with the pixel data developed the first time. It's like opening your file in ACR, saving it as a TIFF, then opening that TIFF again in ACR. It's not as big a problem as it sounds, but it's probably not what you want :). It's also the reason why the histogram displays 'dark' the first time and OK the second time. The first time you're working on RAW pixel data, and the histogram you're seeing (probably, I'm guessing!) has no gamma correction applied to it. It appears to be in linear space. The moment you develop it, it gets gamma corrected so you get a 'normal' picture. Hitting Develop again will get you the develop tools, but working on the pixel data from the first develop, so be careful. @ Serif / Affinity team: there are a lot of (newer) cameras which contain lens corrections in the file, and the 'usage' of this metadata has been mentioned in other topics (so I won't go into it here). But the Lensfun database will probably not contain lens corrections for cameras that already do the corrections... so waiting on corrections in a Lensfun database that will probably never appear (at least not soon) doesn't seem like the right path. RawTherapee (it's open source, right? That doesn't mean you can rip the source right out, but maybe you can arrange something or recreate something based on the idea) has an interesting feature regarding this: it has an auto-correct button which (behind the scenes) overlays the embedded JPEG preview on top of the raw data, and then automatically sets the correction parameters to make the RAW data and the embedded JPEG data 'line up' (there's a rough sketch of that idea after this list). So it doesn't read (or detect) any 'embedded correction' info; it analyzes the preview JPEG to set the settings automatically. This is at least a good fallback to make use of corrections in cameras (current or future ones) without having to know the metadata of each and every RAW format. @ the OP: ACR doesn't 'support' the Fuji film profiles, does it? I think Fuji is keeping those close to their chest. There are 3rd-party film simulations for ACR, and for Affinity you can use the LUT system and the HaldCLUT film-emulation pack floating around (check RawTherapee's site), or maybe use the LUT functions to try to make your own? Use the in-camera RAW processing to create a 'before and after' picture, then use the LUT tools to create a LUT that transforms the 'before' picture into the 'after' picture, then apply it to your RAW conversions.
  9. Sort of what I said: linear space in 8-bit is wrong and should never be used. But... linear space in 16-bit is used a lot and is completely ignored by Affinity Photo. In 32-bit floating point, linear space is even the standard / norm :).
  10. I asked about this in a previous post somewhere (I called it a bug, I believe). Affinity Photo does not support ICC profiles with a linear gamma while in 8-bit / 16-bit. They appear and work fine when working with 32-bit files. I do gamma correction with ImageMagick or Photoshop and then work with Affinity Photo. Or I load the file in Photoshop, save it as 32-bit and then open that in Affinity Photo. Then you can do all the editing you want in linear space (while actually seeing it gamma corrected for your monitor) and save it back. But you do have a 32-bit file; if you ever want to go back to a linear 16-bit file, you'd have to use Photoshop or something else again. Another way they explained it to me is that you can open your linear file in Affinity Photo, and then add a LUT layer with a 3D LUT loaded to view the file normally on your monitor. Keep that LUT layer always on top, to use as a proofing layer. When saving, remember to disable the LUT layer so you actually save linear data again. Which LUT to use and where to get it depends on the colour profile you want to work in. My files are linear Adobe RGB, so I use a LUT that converts them to normal Adobe RGB while working. A gamma of 2.2 is never exactly correct, but it's often 'close enough'. (sRGB isn't entirely 2.2; there is a part in the shadows where the gamma is different, so it is not a constant gamma. Adobe RGB is 'constant', but it's more 2.19 than 2.2 (2 + 51/256).) I don't know why linear ICC profiles are ignored / removed with 16-bit files. With 8-bit I understand (you don't want linear space in 8-bit, it will posterize like crazy; see the last sketch after this list), but 16-bit is used a lot. Scanner output is a simple example.
  11. I just tested it here. Windows defaults to 100% scaling on both my monitors (including the 15" QuadHD screen on another laptop ?!?!). If I crank the scaling to 175% and log out and log in again (as the display dialog states: "Some apps will not respond to scaling changes until you sign out"), then start Affinity, I get all the UI and text bigger _but not blurred_. The image data is still at the exact same size (pixel for pixel), thank god for that. If I make the scaling bigger but do not log out again, then yes, Affinity is displayed blurred, because it can't read the new value until you log out and back in again. I've been reading up on MSDN about the scaling changes. It seems _nothing has changed_ since Windows 7 in how the scaling works, except for one thing: previously you _had_ to log out and log in again to get any effect; what happens now is that they 'try' to apply the changes without logging out and in... but they still recommend you do. That's the blurring -> it's Windows scaling it up _until_ you log out and log back in. Since the WDDM driver model in Windows Vista, the scaling in Windows has not changed in how it works. In Windows 8 (or 8.1) they finally had the courage to enable it by default on a new Windows install if the monitor reported an actual display size, and now in the Windows 10 Anniversary update they tried to make it so you don't have to log out / log in before it takes effect (which, they warn, doesn't work in all programs, Affinity being one of them). So, to get the display scaling you want: set the scaling bigger in the Windows settings, log out, log in, tadah: Affinity does what you want. If you now have problems with _other legacy old_ software that looks blurry -> contact them to support the Windows scaling. (Disclaimer: I'm not affiliated with Serif / Affinity, just to be clear and in case I start to piss some people off :P)
  12. To "All I want is to be able to read the text in the panel boxes and settings dialogs without needing to mess with the Windows settings in any way": then I think you're wanting something very special that you can't expect programs to cater for. The Windows settings are there for a reason. To "Increasing this causes blurring": not in Affinity it doesn't... not here at least. The whole thing is that you have a small surface area with a large number of pixels, a larger-than-'normal' DPI. So you need to tell a program that it must use more pixels 'than normal' to draw text and UI, because otherwise it would be too small. The DPI scaling setting is made for this (and since Windows 8 it is set automatically by Windows if your monitor reports an actual screen size, so the DPI can be calculated). It sounds like you're asking to be driven somewhere, but you don't want to get in a car. Sounds impossible.
  13. You want the CPU usage to spike, since you want it to do the work as quickly as possible :). You don't want it to say 'I could do this adjustment in 10 milliseconds, but to keep the fans quiet I will do it in 500 milliseconds'... For what it's worth, I'm on Windows 10 x64 with a GTX 1060 6 GB, but with the much older Core i7-860 and only 8 GB of RAM... and it all runs snappy. Some actions are more CPU intensive and take a couple of seconds (for instance, I don't expect FFT stuff or the denoiser to do its thing 'instantly'; that kind of operation takes a little while in every application). The slow loading times are what is curious to me (it saves a 24 MP 16-bit TIFF file with compression faster than it opens a 24 MP 16-bit TIFF file without compression :P), but otherwise it runs snappy.
  14. DPI scaling is what you _want_ in these cases. It tells the whole system to scale UI elements up because you have a higher-DPI monitor. The 'font blurring' happens with older programs which don't support it natively. Affinity Photo does. I set my scaling higher and I get bigger UI elements, _without blurring_ and without issues. The image itself is unscaled, as it should be (that was a long discussion before :P). So what you're asking is that you _do not_ enable OS-wide UI scaling, but that programs independently discover you have a high-DPI monitor and scale up? That's not how it works, and it sounds like a very bad idea. The OS scaling is made for this. In the Windows XP era this could cause issues because some programs behaved nicely while others did not, and weird stuff happened :). They fixed that from Windows 7 upwards, so that older programs that don't support the OS scaling parameter can be scaled in their entirety. Yes, this causes 'blurring' (because it's a low-resolution image scaled up), but this way old legacy software still appears on your screen, without issues and while being bigger (because that is what you wanted :)). Modern software listens to the OS scaling natively and only scales up what makes sense (such as fonts and UI elements). I have a full-HD laptop with a < 15" screen, and I enjoy the OS DPI scaling set to (the weird) 128%. All fonts and UI elements are perfectly sharp, no blurring. What you're asking is for Affinity to say "no thanks Windows, we do not want this OS-wide DPI scaling solution, we will reinvent the wheel and do it ourselves"... that doesn't sound right.
  15. No issues with crashing here, and images with sRGB or AdobeRGB 16bit work just fine. Didn't try any wide-gamut stuff in Nik though.
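
Sketch for post 2 above: this is just the math behind that Apply Image recipe (invert + 'add' blending at scale 2, then 'linear light'), written out in Python/NumPy. It is not how Affinity or Photoshop implement it internally, and the array names are my own placeholders.

```python
# Rough sketch of the Apply Image recipe from post 2, on float images in 0..1.
import numpy as np

def make_difference(layer_a, layer_b):
    """'Apply image' with invert + add + scale 2:
    a 50% grey image carrying half of (A - B)."""
    return (layer_a + (1.0 - layer_b)) / 2.0   # == 0.5 + (A - B) / 2

def linear_light(base, blend):
    """Linear light blend: pushes the base up/down by twice the blend's
    deviation from 50% grey."""
    return np.clip(base + 2.0 * blend - 1.0, 0.0, 1.0)

# Applying the difference between A and B to a third layer C:
#   result = C + (A - B), which is what the linear-light trick achieves.
# a, b, c = ...  # float images in 0..1 (placeholders)
# diff = make_difference(a, b)
# result = linear_light(c, diff)
```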
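
Sketch for post 6: a luminance-only S-curve, assuming scikit-image is available for the RGB <-> LAB conversion. The smoothstep curve and file names are placeholders, not anything Affinity ships; the point is just that L changes while a/b stay untouched.

```python
# Luminance-only S-curve sketch (post 6). Assumes a 16-bit RGB TIFF input.
import numpy as np
from skimage import color, io

img = io.imread("photo.tif").astype(np.float64) / 65535.0   # placeholder file
lab = color.rgb2lab(img)

L = lab[..., 0] / 100.0                 # normalise L to 0..1
L = L * L * (3.0 - 2.0 * L)             # smoothstep: gentle S-curve around 50%
lab[..., 0] = L * 100.0                 # a and b channels are left as they were

out = np.clip(color.lab2rgb(lab), 0.0, 1.0)
io.imsave("photo_pop.tif", (out * 65535.0 + 0.5).astype(np.uint16))
```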
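
Sketch for post 8: I don't know how RawTherapee's auto-fit is actually implemented; this is only a toy version of the general idea of searching for a single distortion amount that makes a raw render line up with the embedded JPEG preview. It assumes OpenCV and SciPy, same-sized float32 grayscale inputs, and a one-parameter radial model, all of which are my own simplifications.

```python
# Toy 'line up the raw render with the embedded JPEG preview' idea (post 8).
import numpy as np
import cv2
from scipy.optimize import minimize_scalar

def radial_remap(img, k):
    """Warp the image with a simple one-parameter radial distortion model."""
    h, w = img.shape[:2]
    y, x = np.indices((h, w), dtype=np.float32)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    nx, ny = (x - cx) / cx, (y - cy) / cy
    r2 = nx * nx + ny * ny
    scale = 1.0 + k * r2
    map_x = (nx * scale * cx + cx).astype(np.float32)
    map_y = (ny * scale * cy + cy).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

def misalignment(k, raw_render, jpeg_preview):
    """How badly the warped raw render and the preview disagree."""
    return float(np.mean(np.abs(radial_remap(raw_render, k) - jpeg_preview)))

# raw_render, jpeg_preview: same-sized float32 grayscale arrays (placeholders).
# res = minimize_scalar(misalignment, bounds=(-0.2, 0.2), method="bounded",
#                       args=(raw_render, jpeg_preview))
# best_k = res.x   # distortion amount that makes the two images 'line up'
```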
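
Sketch for posts 9 and 10: the standard sRGB transfer curve (which is not a constant 2.2 gamma in the shadows) plus a quick count of why linear data posterizes in 8-bit. Only the standard formulas are used here, nothing Affinity-specific.

```python
# sRGB encoding and a shadow-posterization count (posts 9/10).
import numpy as np

def srgb_encode(linear):
    """Standard sRGB transfer curve: linear segment near black, ~2.4 power above.
    Adobe RGB, by contrast, uses a constant gamma of 2 + 51/256 = 2.19921875."""
    linear = np.asarray(linear, dtype=np.float64)
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.power(linear, 1.0 / 2.4) - 0.055)

# The darkest 1% of scene luminance:
#   stored linearly in 8 bit  -> only about 3 code values (0..2)
#   stored sRGB-encoded 8 bit -> about 25 code values
print(round(0.01 * 255))                      # ~3  (linear 8-bit)
print(round(float(srgb_encode(0.01)) * 255))  # ~25 (gamma-encoded 8-bit)
```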