Everything posted by jorismak

  1. (Getting the thread back alive.) Maybe I can turn the question around into the problem he seems to _really_ be having: the soft proof in Affinity Photo gives very different results from the soft proof in Photoshop. In particular, the blacks come out distinctly grey with my printer profile in the soft proof, while Photoshop still displays pretty much black. Both are set to 'relative' intent, both with black-point compensation, yet the results are clearly different. Also, if I enable the gamut warning in Photoshop, I can push the saturation of the image _way_ higher before Photoshop starts giving out-of-gamut warnings; but if I check 'gamut check' in Affinity Photo, it gives a lot of gamut warnings almost straight out of the box, and I can't push the saturation at all without getting more of them. This is all on the same (Mac) machine, with the same display profile loaded and the same printer ICC profile. In particular, the out-of-gamut warnings, which seem to come way too early, make it impossible to do any real soft proofing with Affinity Photo on my setup.
  2. DxO still hosts the last free version, I was told; it should be somewhere on their site. It's here: https://nikcollection.dxo.com/nik-collection-2012/ It requires an email address to download, though.
  3. Really? That can't be true.. I would consider that basic functionality, especially the blend modes..
  4. I decided to give Affinity Photo another go. My workflow keeps changing / evolving, so I thought I'd just try it to see if it's more usable these days compared to the early 1.5 betas. I'm running into a few things where I think I'm just missing something, or there must be another 'Affinity way' to do it. I can't seem to find a way to set the Levels adjustment to a specific black point by sampling the image, like the basic black, mid and white point droppers in Photoshop's Levels dialog. I only need the black point for now, but I can't find any of them. Currently I need to open the Info tab and add a colour sampler at a specific point (maybe even adding a blur FX underneath if I want averaged values, because the colour sampler doesn't seem to have a size control; the colour picker does, but not the sampler). I drop it on the point and get RGB values in the 0-255 range. But the Levels adjustment takes values in percent, so I have to convert the sampler values to percent with (x / 255) * 100 and enter them manually in the R, G and B channels of the Levels adjustment... quite a bit of work! The other thing I can't seem to find (and which must surely exist somewhere) is the 'Divide' blending mode. I hovered over all the blend modes, but none does what I expect from Divide, so it doesn't seem to be a simple naming difference. Now I have to duplicate a channel and use Apply Image with another layer, entering 'DR / SR', 'DG / SG' and 'DB / SB' as the formulas to divide one layer by the other. Not as slow as the Levels black point, but still a lot harder than just selecting the Divide blend mode and moving on... Where are these functions in Affinity, and/or what are they called? I can't seem to find them.
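The two conversions described above can be sketched in a few lines (a minimal illustration; the function names are mine, not Affinity's): turning a colour-sampler reading into the percent values Levels expects, and a Divide blend as straight per-channel division, mirroring the Apply Image formulas.

```python
# Sketch of the two conversions described in the post (names are mine).

def sampler_to_levels_percent(r, g, b):
    """Convert a colour-sampler reading (0-255 per channel) into the
    percent values a Levels adjustment expects, via (x / 255) * 100."""
    return tuple(round(v / 255 * 100, 1) for v in (r, g, b))

def divide_blend(dest, src, eps=1e-6):
    """'Divide' blend for normalised 0..1 channels: D / S, clamped to 1.
    Mirrors the Apply Image formulas DR/SR, DG/SG, DB/SB."""
    return tuple(min(d / max(s, eps), 1.0) for d, s in zip(dest, src))

print(sampler_to_levels_percent(25, 30, 28))  # black-point percentages
print(divide_blend((0.2, 0.4, 0.6), (0.5, 0.5, 0.5)))
```

Dropper-style black-point picking does both steps at once in Photoshop; in Affinity the percent conversion is the part that has to be done by hand.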
  5. Hi there, PS has an option in the Levels dialog to do 'auto levels', but more importantly to set the thresholds for it: you can enter the percentage of signal at which you want the level markers set, so to speak. It then has options to do that for every channel separately, or to keep the overall link between channels, plus a few other options. Is there something like that somewhere in Photo? The per-channel part I can do by separating the channels into R, G and B layers and running 'auto levels' on those... but I want to set the thresholds somewhere :).
  6. jorismak

    Affinity Photo for iPad launched at Apple WWDC

    Although I congratulate you on the release of the iPad version, can you share something about how the development teams are allocated within Serif? Seeing an iPad release kind of stings, given the poor state the Affinity Photo 1.5 release is in (and the 1.6 betas not showing much improvement). If they are separate development teams, then I have nothing to whine about. If resources actually went from Affinity Designer / Photo for Windows towards an iPad version, I have serious doubts about the future commitment to that project... That being said, I don't use the Mac version of course; are Affinity Photo 1.5 and the 1.6 beta having as many issues on Mac as on Windows?
  7. If you switch from Develop to Photo and then back to Develop, you're no longer working on your RAW data; you're working on the pixel data developed the first time. It's like opening your file in ACR, saving it as a TIFF, then opening that TIFF in ACR again. It's not as big a problem as it sounds, but it's probably not what you want :). It's also the reason the histogram displays 'dark' the first time and OK the second time. The first time you're working on RAW pixel data, and the histogram you're seeing (probably; I'm guessing!) has no gamma correction applied to it; it appears to be in linear space. The moment you develop it, it gets gamma corrected, so you get a 'normal' picture. Hitting Develop again gets you the develop tools, but working on the pixel data from the first develop, so be careful. @ Serif / Affinity team: there are a lot of (newer) cameras which contain lens corrections in the file, and the usage of this metadata has been mentioned in other topics (so I won't go into it here). But the Lensfun database will probably never contain lens corrections for cameras that already do the corrections themselves, so waiting on corrections in a Lensfun database that will probably never appear (at least not soon) doesn't seem like the right path. RawTherapee (it's open source, right? That doesn't mean you can rip the source right out, but maybe you can arrange something, or recreate something based on the idea) has an interesting feature here: it has an auto-correct button which (behind the scenes) overlays the embedded JPEG preview on top of the RAW data, then automatically sets the correction parameters to make the RAW data and the embedded JPEG line up. So it doesn't read (or detect) any embedded correction info; it analyzes the preview JPEG to set the parameters automatically.
This is at least a good fallback for making use of in-camera corrections (now, or in newer cameras in the future) without having to know the metadata of each and every RAW format. @ the OP: ACR doesn't support the Fuji film profiles, does it? I think Fuji keeps those close to their chest. There are third-party film simulations for ACR, and for Affinity you can use the LUT system and the HaldCLUT film-emulation pack floating around (check RawTherapee's site), or maybe use the LUT functions to make your own: use the in-camera RAW processing to create a 'before' and 'after' picture, then use the LUT tools to create a LUT that transforms the 'before' picture into the 'after' picture, and apply it to your RAW conversions.
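The before/after LUT idea at the end can be illustrated with a toy sketch (all names here are mine; real LUT generators build full 3D LUTs, while this per-channel 1D version only captures tone curves, not colour cross-talk):

```python
import numpy as np

def build_1d_luts(before, after):
    """Derive a per-channel 1D LUT from a 'before'/'after' 8-bit image
    pair: for each input value, average the output values it maps to.
    (A real film-emulation LUT would be 3D; this is only a sketch of
    the before/after principle.)"""
    luts = []
    for c in range(3):
        src = before[..., c].ravel()
        dst = after[..., c].ravel().astype(np.float64)
        sums = np.bincount(src, weights=dst, minlength=256)
        counts = np.bincount(src, minlength=256)
        # Unseen input values fall back to identity.
        lut = np.where(counts > 0, sums / np.maximum(counts, 1), np.arange(256))
        luts.append(lut.round().astype(np.uint8))
    return luts

def apply_1d_luts(img, luts):
    """Look each pixel up in its channel's LUT."""
    return np.stack([luts[c][img[..., c]] for c in range(3)], axis=-1)
```

Feeding it an in-camera JPEG as 'after' and a neutral development as 'before' would recover the camera's per-channel tone curve, which is the spirit of the workaround described above.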
  8. Sort of what I said: linear space in 8-bit is wrong and should never be used. But linear space in 16-bit is used a lot and is completely ignored by Affinity Photo. In 32-bit floating point, linear space is even the standard / norm :).
  9. I asked about this in a previous post somewhere (I called it a bug, I believe). Affinity Photo does not support ICC profiles with a linear gamma while in 8-bit / 16-bit; they appear and work fine when working with 32-bit files. So I do the gamma correction with ImageMagick or Photoshop and then work in Affinity Photo. Or I load the file in Photoshop, save it as 32-bit and open that in Affinity Photo. Then you can do all the editing you want in linear space (while actually seeing it gamma corrected for your monitor) and save it back; but you end up with a 32-bit file, and if you ever want to go back to a linear 16-bit file you'd have to use Photoshop or something else again. Another way that was explained to me: open your linear file in Affinity Photo and add a LUT layer with a 3D LUT loaded to view the file normally on your monitor. Keep that LUT layer always on top, as a proofing layer, and when saving, remember to disable it so you actually save linear data again. Which LUT to use and where to get it depends on the colour profile you want to work in; my files are linear Adobe RGB, so I use a LUT that converts to normal Adobe RGB while working. A gamma of 2.2 is never exactly correct, but it's often 'close enough'. (sRGB isn't entirely 2.2; there are parts in the shadows where the gamma is different, so it's not a constant gamma. Adobe RGB is constant, but it's more like 2.19 than 2.2: 2 + 51/256.) I don't know why linear ICC profiles are ignored / removed for 16-bit files. For 8-bit I understand (you don't want linear space in 8-bit, it will posterize like crazy), but 16-bit is used a lot; scanner output is a simple example.
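The gamma remarks above are easy to verify numerically: sRGB has a linear toe in the deep shadows rather than a constant gamma, and Adobe RGB's exponent is 2 + 51/256. A small sketch (function names are mine):

```python
# Illustration of the gamma remarks above: sRGB is not a constant
# gamma 2.2, and Adobe RGB's gamma is 2 + 51/256, not exactly 2.2.

ADOBE_RGB_GAMMA = 2 + 51 / 256   # = 2.19921875

def srgb_encode(lin):
    """Linear 0..1 -> sRGB-encoded 0..1 (IEC 61966-2-1 transfer curve)."""
    if lin <= 0.0031308:
        return 12.92 * lin        # linear segment in the deep shadows
    return 1.055 * lin ** (1 / 2.4) - 0.055

def gamma22_encode(lin):
    """Constant gamma 2.2, the common approximation."""
    return lin ** (1 / 2.2)

# Deep shadows diverge strongly; midtones are 'close enough':
print(srgb_encode(0.001), gamma22_encode(0.001))
print(srgb_encode(0.5), gamma22_encode(0.5))
```

This is exactly why a single gamma-2.2 correction can't turn linear sRGB data into properly encoded sRGB, while for Adobe RGB it gets you very close.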
  10. I just tested it here. Windows defaults to 100% scaling on both my monitors (including the 15" QuadHD screen on another laptop?!). If I crank the scaling to 175% and log out and in again (as the display dialog states: "Some apps will not respond to scaling changes until you sign out"), then after logging back in I start Affinity and get all the UI and text bigger, _but not blurred_. The image data is still at the exact same size (pixel for pixel), thank god for that. If I make the scaling bigger but do not log out again, then yes, Affinity displays blurred, because it can't read the new value until you log out and back in. I've been reading up on MSDN about the scaling changes. It seems _nothing has changed_ since Windows 7 in how the scaling works, except for one thing: previously you _had_ to log out and in again for it to have any effect; now Windows 'tries' to apply the changes without logging out and in, but still recommends that you do. That's the blurring: it's Windows scaling the window up _until_ you've logged out and back in. Since the WDDM driver model in Windows Vista, scaling in Windows has not changed in how it works. In Windows 8 (or 8.1) they finally had the courage to enable it by default on a fresh install if the monitor reported an actual display size, and now in the Windows 10 Anniversary update they try to apply it without a logout/login (which, as they warn, doesn't work in all programs; Affinity is one of them). So, to get the display scaling you want: set the scaling bigger in the Windows settings, log out, log in, tadah: Affinity does what you want. If you now have problems with _other, legacy_ software that looks blurry, contact them to support the Windows scaling. (Disclaimer: I'm not affiliated with Serif / Affinity, just to be clear, and in case I start to piss some people off :P)
  11. > All I want is to be able to read the text in the panel boxes and settings dialogs without needing to mess with the Windows settings in any way Then I think you're asking for something very special that you can't expect programs to cater for. The Windows settings are there for a reason. > Increasing this causes blurring Not in Affinity it doesn't... not here, at least. The whole point is that you have a small surface area with a large number of pixels, a larger-than-'normal' DPI. So you need to tell a program that it must use more pixels than normal to draw text and UI, because otherwise they would be too small. The DPI scaling setting is made for exactly this (and since Windows 8 it is set automatically by Windows if your monitor reports an actual screen size, so the DPI can be calculated). It sounds like you're asking to be driven somewhere, but you don't want to get in a car. Sounds impossible.
  12. jorismak

    Photo | Performance on Windows 10

    You want the CPU usage to spike, since you want it to do the work as quickly as possible :). You don't want it to say 'I could do this adjustment in 10 milliseconds, but to keep the fans quiet I will do it in 500 milliseconds'... For what it's worth, I'm on Windows 10 x64 with a GTX 1060 6GB, but with the much older Core i7-860 and only 8GB of RAM... and it all runs snappy. Some actions are more CPU intensive and take a couple of seconds (for instance, I don't expect FFT stuff or the denoiser to run 'instantly'; that kind of operation takes a little while in every application). The slow loading times are what's curious to me (it saves a 24 MP 16-bit TIFF with compression faster than it opens a 24 MP 16-bit TIFF without compression :P), but otherwise it runs snappy.
  13. DPI scaling is what you _want_ in these cases. It tells the whole system to scale UI elements up because you have a higher-DPI monitor. The font blurring happens with older programs which don't support it natively. Affinity Photo does: I set my scaling higher, I get bigger UI elements, _without blurring_ and without issues. The image itself is unscaled, as it should be (that was a long discussion before :P). So what you're asking is that you do _not_ enable OS-wide UI scaling, but that programs independently discover you have a high-DPI monitor and scale themselves up? That's not how it works, and it sounds like a very bad idea; the OS scaling is made for this. In the Windows XP era this could cause issues, because some programs behaved nicely while others did not, and weird stuff happened :). They fixed that from Windows 7 upwards, so that older programs which don't support the OS scaling parameter can be scaled in their entirety. Yes, this causes blurring (since it's a low-resolution image scaled up), but this way old legacy software still appears on your screen, without issues and while being bigger (because that is what you wanted :)). Modern software listens to the OS scaling natively and only scales up what makes sense (such as fonts and UI elements). I have a Full-HD laptop with a sub-15" screen, and I enjoy the OS DPI scaling set to (the weird) 128%. All fonts and UI elements are perfectly sharp, no blurring. What you're asking is for Affinity to say "no thanks, Windows, for this OS-wide DPI scaling solution, we do not want that, we will reinvent the wheel and do it ourselves"... doesn't sound right.
  14. Hmm, no... it's not 'limited to 16-bit', it's more like 'only working in 32-bit' :P. But linear-space work in 8-bit isn't recommended at all and will cause problems, so I wouldn't hold it against the Affinity team if they prevented linear spaces in 8-bit on purpose.
  15. Hi, I have some colour profiles installed under Windows that are linear variants of sRGB and Adobe RGB. I even used File -> Import ICC profile to try to add them to Affinity Photo. Yet they don't show up when editing 16-bit images! Only when I switch the colour format to 32-bit do they suddenly show up, but since Affinity automatically tries to convert to linear space, the profiles get messed up. Is there _any_ way to tell Affinity Photo that the data in the file is _already_ in linear space, i.e. to _interpret_ the data as gamma 1.0 instead of converting it? It doesn't help that the gamma slider in the Levels adjustment doesn't go higher than 2.0 either :)
  16. jorismak

    Nik Color Efex Pro 4 and Viveza wrong colors

    Changing things for Nik, while every other program supporting PS plugins seems to handle them just fine, doesn't make sense to me. Also, to the original poster: there is a known issue with Viveza and the colours (while not zoomed in); Color Efex is fine. It seems the trouble you are having is coming from somewhere else, as nobody else has that issue. Maybe something in your colour profile configuration?
  17. Thanks for the tips! I've been using LUTs in the video world, but never thought about using them in photo work. Stupid :). If the 16-bit conversion is indeed done in the working space, that means I can work around it, which is a huge help already.
  18. Neat Image 8 was released some time ago. In my mind, huge improvements, but also 'more of the same'. They added more 'assist views' to make dialing the settings in easier. Neat still has the tendency to make parts that become too blurry look weird. I don't know if I can explain it, but imagine this: you have a tree in the background, and you have to apply aggressive denoising, so the tree becomes blurry and/or loses a lot of detail. Most other noise plugins make the tree look like some kind of Gaussian blur was applied: natural-looking blur. But Neat doesn't stay natural; it starts looking like triangles and weird shapes, which kind of stinks, to be honest. Topaz is still my 'if all else fails' plugin. Good results all the time, but it requires careful dialing in and is slooow. But I love having it in my toolbox. Most denoisers have settings to set the noise level and then other settings to set the amount of reduction: after setting the noise level you dial in the amount so as to leave a preferable amount of noise in, giving a natural-looking image that isn't too blurry and has no banding. Topaz doesn't have the 'amount' settings, just the level. In other words, Topaz always goes for ultra smooth, removing all the noise entirely, giving a too-smooth, unnatural feel. The trick is then to blend some of the original image back in using layers after the plugin has run (or to use their 'add noise' slider). You've got to remember this: go for an unnatural, oversmoothed look in Topaz, then blend layers to get the amount you want. Other plugins like Neat and Noiseware have this built in. The preview assists in Topaz (and now Neat as well) are extremely handy though, and missed in Noiseware and others. These views show you the chroma channels, or boost the shadows, or show just the high frequencies, for example. It makes it very easy to dial in each slider. Neat 8 and Topaz 6 both work fine in Affinity for me. And both are under 100 bucks (well, Neat was for me with upgrades;
I don't know the new-customer price). If you have noise issues, both are extremely good to have. Shooting on a six-year-old APS-C camera (the first 24 MP APS-C ever, I think), an m4/3 camera and a lot of analog film, dealing with noise is a large part of my photo hobby :).
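The 'blend the original back in' trick described above is just a weighted average, like lowering the opacity of the denoised layer over the original; a minimal numpy sketch (function and variable names are mine):

```python
import numpy as np

def blend_back(original, denoised, amount=0.8):
    """Blend the oversmoothed denoiser output with the original to leave
    some noise in, like lowering the denoised layer's opacity.
    amount=1.0 keeps the fully smoothed result, 0.0 the untouched original."""
    return amount * denoised + (1.0 - amount) * original

noisy = np.array([0.40, 0.60, 0.55])    # toy 'original' pixel values
smooth = np.array([0.50, 0.50, 0.50])   # toy oversmoothed pixel values
print(blend_back(noisy, smooth, 0.75))
```

Plugins with a built-in 'amount' slider (Neat, Noiseware) do this internally; with Topaz you do the same thing manually with layer opacity.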
  19. LUTs sure help with looking at the project :). May I ask how you got those? Generated them somewhere, somehow? But are they also precise enough to 'apply' / rasterize the effect? For what it's worth, my test case was this: I have a 16-bit TIFF in Photoshop, with my own generated / customized profile assigned, which is Adobe RGB with a gamma of 1.0; I called it 'adobergb-lin'. That 16-bit file with adobergb-lin assigned gets converted to 32-bit in Photoshop. Photoshop at this point still lists 'adobergb-lin' as the profile assigned to the image. I save this, embedding the ICC profile. I open it in Affinity Photo and it looks OK, and Affinity Photo also lists 'adobergb-lin' as the assigned profile. This file I can't get to normal Adobe RGB inside Affinity Photo: if I convert the 32-bit TIFF to 16-bit, it becomes sRGB. If I assign the 32-bit TIFF the 'Adobe RGB (linear)' profile that Affinity Photo lists, I still get sRGB when converting to 16-bit... This is in the .54 public beta, btw. I had missed that there was a LUT adjustment layer; that opens up a lot of stuff :P. Let me try a bit today. I think it's a workaround for now, but I'll have to do some quality checks on the final output. The idea is still to convert the image to 16-bit non-linear at some stage to... well... actually view it :). It still requires some round-tripping through ImageMagick or something. Not bad, but not ideal either :).
  20. But the conversion from 32-bit (in linear Adobe RGB, for example) to 16-bit (normal Adobe RGB) does not work either... it always turns 32-bit into 16-bit sRGB, which is a waste if you were working in bigger colour gamuts.
  21. Well... with other things they just said clearly 'by design' or 'won't fix'. They acknowledged that it didn't do what you want it to do, but it did what they wanted it to do :). In other words, they listened to questions / bug reports but didn't agree with them. Which is OK, people can have different opinions :P. But they're being blatantly silent on threads about this topic, which I find very weird and annoying. As an example, how Affinity Photo loads a file versus how Photoshop loads it: I get a message in Affinity Photo when loading the file that it ignored the embedded profile and assigned my working-space profile (where it of course goes wrong). If the files are linear Adobe RGB I can somewhat manage by applying a gamma correction of 2.2 (which is still annoying to do with the 'max correction' of 2.0, requiring two Levels corrections). But for profiles such as sRGB, which don't have a pure power gamma curve, this means I can't properly convert linear sRGB into gamma-corrected sRGB inside Affinity Photo, and it doesn't open files with an embedded profile correctly. One of the little reasons why this awesome piece of software is now sitting unused on my system. Edit: I tried to be smart. Open the file in Photoshop, save it as a 32-bit floating-point TIFF, open it in Affinity Photo. This opens the 32-bit file OK (and I even see the name of the embedded profile). But when I then try to convert it to RGB-16, Affinity Photo turns it into sRGB instead of (in this case) Adobe RGB... wrong again. So while still in 32-bit I assign a profile, the 'Adobe RGB (1998) (linear)' option that Affinity Photo offers (the image previously had my own custom linear Adobe RGB profile). When I try to switch this file back to RGB-16, it _still_ makes it sRGB instead of Adobe RGB!! So even if I manage to assign Affinity Photo's own linear Adobe RGB profile, I can't convert to non-linear 16-bit Adobe RGB with it.
As long as they don't even respond to this, I'm afraid this won't ever be the program I hoped it would be.
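The 'two Levels corrections' workaround relies on simple exponent arithmetic: a Levels gamma of g maps a normalised value x to x^(1/g), so two passes multiply their gammas, and with the slider capped at 2.0 you can still reach 2.2 by applying 2.0 and then 1.1. A quick sketch (illustrative only):

```python
def levels_gamma(x, g):
    """A Levels gamma adjustment of g maps a normalised value x to x**(1/g)."""
    return x ** (1.0 / g)

# Two passes compose multiplicatively:
#   (x**(1/g1))**(1/g2) == x**(1/(g1*g2)),
# so applying gamma 2.0 and then 1.1 yields an effective gamma of 2.2.
x = 0.18
two_pass = levels_gamma(levels_gamma(x, 2.0), 1.1)
one_pass = x ** (1 / 2.2)
print(two_pass, one_pass)
```

This only reproduces a constant-gamma curve, which is why it works for linear Adobe RGB but not for sRGB's piecewise transfer function.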
  22. jorismak

    ORF (Olympus) raw file strangeness

    He said the problem was that he saw bowed lines which should not be there, and which other raw converters are not showing. In other words, other raw converters are applying distortion correction, and Affinity Photo isn't in this case. The thing is, on my Olympus ORF files the 'baked in' lens corrections are applied no matter what (while in DxO I can clearly see the uncorrected image if I choose to). So, yay, a good thing: Affinity Photo seems to use the lens-correction-in-metadata in ORF files. But, bleh, a bad thing: you can't disable it. For what it's worth, you can't disable it in Camera Raw either... whether you enable 'lens profile corrections' or not, you'll always get a corrected view. And the nasty thing in ACR is that it corrects the lens by making the image smaller. No matter what I try, I get a 3:2 image (from my 4:3 sensor..) in Camera Raw (resized to 1200 on the long side, I get 1200x800), while DxO gives me 1200x900 and Affinity Photo gives me 1200x897. DxO seems just a bit better corrected, with a bit less distortion around the edges. Colours are way different between the three (the blues in this case), but that's to be expected: Camera Raw tries to make the camera conform to their 'Adobe Standard' colour look, while DxO and Affinity Photo let the camera be its own thing... but then DxO applies some lighting corrections automatically, so the colours can of course differ because of that :). Most uninteresting picture ever, just snapped in the building near some strong lines with a cheap lens that has quite strong distortion if you don't correct it: https://1drv.ms/f/s!AgtoBEiLfbmcpl-Opm-b_M2J_Imk
  23. jorismak

    ORF (Olympus) raw file strangeness

    A couple of things: I actually don't know for sure, but I wouldn't be surprised if the lens corrections that are written into the RAW file are not applied, or not applied properly. That could be causing the bowed lines you were talking about. I could compare one of these days with my cheap 17mm f2.8; it has huge distortion when you look at the image uncorrected, but since the JPEG engine auto-corrects it and Adobe Camera Raw auto-corrects it, you might not notice it was ever there. The pixel dimensions _are normal_. This has been explained elsewhere, but different raw converters can give slightly different output in the number of pixels; around 16 to 24 pixels more (or fewer) in width or height is normal. What happens is that the outermost pixels of your sensor are unused by the JPEG engine in your camera, so it crops the image ever-so-slightly to get the image dimensions you are used to. Adobe Camera Raw crops the same way. Affinity Photo doesn't crop those outermost pixels, hence you get a few more pixels than you're used to. It's all still correct; the exact output size depends on the demosaicing algorithm. That 'solid line' you're talking about doesn't happen on my OM-D E-M10 files (same sensor, I guess? No clue about older m4/3 cameras, to be honest). But Affinity Photo does have the habit of showing a line on the left or the top in certain zoom scenarios. It looks like the pattern of a transparent background, but just one pixel wide or one pixel high. It's a rounding error in the zooming, and Photoshop has it too on my system (so it might be video-card related). Whenever I'm in doubt about it, I zoom in a bit to see whether the line is real or not. If not, I ignore it.
  24. How far do you get by upsizing all your shots (I'd still use a softer bicubic for this, not nearest neighbour), aligning the images (as layers), and then moving one of the layers by one pixel (or a subpixel, if APh can do that), maybe using the keyboard or something? Does PhotoAcute work on your raw files, or does it work on debayered pixels?
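The manual stacking idea above can be sketched roughly like this (a toy illustration with my own function names; it uses nearest-neighbour upscaling for brevity even though a softer bicubic is recommended): upscale each frame, nudge each by its offset in upscaled pixels (i.e. half-pixels of the original), then average the stack.

```python
import numpy as np

def upscale2x(img):
    """2x upscale by pixel repetition (nearest neighbour keeps this
    sketch dependency-free; a softer bicubic would be preferable)."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def shift(img, dy, dx):
    """Shift by whole upscaled pixels, i.e. half-pixels of the original."""
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def stack_average(frames, offsets):
    """Upscale every frame, apply its offset, and average the stack."""
    shifted = [shift(upscale2x(f), dy, dx) for f, (dy, dx) in zip(frames, offsets)]
    return np.mean(shifted, axis=0)

a = np.arange(16, dtype=float).reshape(4, 4)
result = stack_average([a, a], offsets=[(0, 0), (1, 0)])
print(result.shape)  # (8, 8)
```

Dedicated tools like PhotoAcute estimate the subpixel offsets from the images themselves rather than taking them as fixed inputs; this sketch only shows the shift-and-average core.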