James Ritson

Moderators

  • Content Count: 697
  • Rank: Product Expert
  • Website URL: http://www.jamesritson.co.uk

  1. Hey @one2gr82c, your screenshot confirms the same issue: you must use the filter on a pixel layer (you're currently trying to apply it to an adjustment layer). Have you watched the tutorials? Generally, you should do a Merge Visible after the initial tone stretching to create a merged, tone-stretched pixel layer, then apply the Remove Background filter to that layer. Something like this:
  2. Hey @ds.johnston, apologies it's not working for you (I did the videos you are referring to). My experience on both Mac and Windows has generally been successful with both FIT and RAW files. RAW files take longer to register and stack because they have to be debayered (and are typically higher resolution), but I haven't come across any crashing when stacking. The only crashing I've experienced has been when accidentally mixing up pixel-binned data, e.g. Bin1x1 light frames with Bin2x2 dark frames.

Do you have any details about the data you're trying to stack that you could share? E.g. pixel resolution, number of light frames and calibration frames. In particular, I see you have selected the Bad Pixel Map Tool and presumably the Detect Bad Pixels stage is taking a while: how many dark frames are you using? It would also be useful to clarify whether the app actually crashes (as in completely exits of its own volition) or whether you assume it has crashed because the progress bar is still present after an hour.

We are noticing a non-linear increase in stacking time once a certain threshold of light frames is added, which also depends on the quality of the frames. The alignment method used is quite slow but accurate, and this may be factoring into the time required. For example, my largest data set so far has been just over 550 light frames: if I stack all of them, the process can easily take over an hour, which is exacerbated by bad frames with star trailing etc. If I use Select Best Light Frames and set the percentage to around 70-80%, which rejects roughly 100-150 of these frames, the stacking time reduces to anywhere between 20-30 minutes. Worth a try if you're currently stacking all of your frames: filter out the bad ones first. As mentioned above, any more information about your data would be useful as well!
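The Select Best Light Frames idea above can be sketched in code. This is a hypothetical illustration, not Photo's implementation; the quality scores and the `select_best_frames` helper are invented for the example:

```python
# Hypothetical sketch of "Select Best Light Frames": rank frames by a
# quality score (e.g. star roundness -- the metric here is made up) and
# keep only the top percentage before stacking.

def select_best_frames(frames, keep_percent=75):
    """Keep the top keep_percent% of frames by quality score.

    frames is a list of (name, score) pairs; higher score = better frame.
    """
    ranked = sorted(frames, key=lambda f: f[1], reverse=True)
    keep = max(1, round(len(ranked) * keep_percent / 100))
    return ranked[:keep]

# Example: 8 light frames, two with star trailing (low scores)
frames = [("L1", 0.92), ("L2", 0.88), ("L3", 0.31), ("L4", 0.95),
          ("L5", 0.90), ("L6", 0.27), ("L7", 0.85), ("L8", 0.89)]
best = select_best_frames(frames, keep_percent=75)
print([name for name, _ in best])  # the two trailed frames are rejected
```

The point is simply that dropping the worst 20-30% of frames shrinks the alignment workload while sacrificing little signal.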
  3. Hi again Harald, no problem; the live stacking functionality is interesting, I was aware of it but hadn't seen any results. It would be more practical to be able to just open a single FIT file within Photo rather than having to use the Astrophotography Stack Persona. I will chase this up as I think there may have been plans to implement it. For now, however, I would recommend stacking with your individual subs and calibration frames: Photo will process everything in 32-bit floating point precision, whereas the FIT file you provided appears to be just 16-bit, which may limit flexibility in post production (particularly with tone stretching). Thanks again, James
  4. Ah, I see, that looks like a pre-stacked image, possibly from other software? I couldn't tell anything from the metadata. Photo is expecting raw or calibrated frames to stack with. Have you tried any of your individual subs rather than a stacked result from other software?
  5. Hey @hjnet, this is when adding your files to the Light frames file list? Across on RAW Options, have you tried the FITS Bayer pattern option? It should be inferred from the metadata but we're finding this isn't the case with files from some setups and it defaults to Monochrome. I think the ASI224MC uses an RGGB pattern so try setting that. Hope that helps!
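For context on why the Bayer pattern setting matters, here's a toy sketch of how an RGGB mosaic (the pattern the ASI224MC reportedly uses) lays out its colour samples. The naive "demosaic" below just averages each 2x2 cell into one RGB pixel; real debayering interpolates far more carefully, and none of this reflects Photo's internals:

```python
# An RGGB Bayer mosaic repeats this 2x2 cell of colour samples:
#   R G
#   G B
# If the stacker assumes Monochrome instead, the R/G/B samples are
# treated as plain luminance and colour is lost -- hence the setting.

def demosaic_rggb(mosaic):
    """mosaic: 2D list (even dimensions) of raw sensor values.
    Returns a half-resolution grid of (r, g, b) tuples."""
    out = []
    for y in range(0, len(mosaic), 2):
        row = []
        for x in range(0, len(mosaic[0]), 2):
            r = mosaic[y][x]
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2  # two greens per cell
            b = mosaic[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

raw = [[100, 50,  90, 60],
       [40,  20,  55, 25],
       [110, 45,  95, 58],
       [52,  30,  48, 22]]
print(demosaic_rggb(raw)[0][0])  # -> (100, 45.0, 20)
```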
  6. Hi Fivel, there is a short video for the filter; you can view it here: Hope that helps!
  7. Hello all, I will be gradually adding 1.9 functionality videos to YouTube (they are all currently on our website's Learn section). I've just finished the astrophotography videos so you can now view them on YouTube:
  • Astrophotography stacking: SHO composition
  • Astrophotography stacking: One shot colour workflow
  • Astrophotography stacking: Monochrome colour mapping
  • Astrophotography stacking: LRGB composition
  • Astrophotography stacking: HaRGB composition
  • Astrophotography stacking: Bi-colour composition
  • Astrophotography: Removing background gradients
  • Astrophotography: Equalising colours
The initial post has been updated with these videos too!
  8. There will be six full-length tutorials for launch on different astrophotography workflows 🙂 (and probably more in the future).
  9. @Danielcely @Russell_L @Griffy apologies for bumping this thread, but there's hopefully a solution: do any of you happen to be using desktop mice, and if so are they gaming mice, Intellimouse or other high-end devices? Big Sur has an issue with mice that have high polling rates, and it's causing all sorts of stuttering and performance issues across the board. I have an Intellimouse Pro and brush work in the Affinity apps is unusable; in fact, anything relying on the mouse pointer is affected, so manipulating layers, selecting UI elements etc. Trying to navigate around the Big Sur UI (especially the Dock) causes stutter and slowdown as well. The ham-fisted solution was to download the Microsoft Mouse and Keyboard Centre app on Windows and change my mouse polling rate from 1000Hz to 125Hz; now everything in macOS is fine, including performance in the Affinity apps. Worth a try on the off chance that you are all using non-Apple pointing devices? You should be able to find software for most manufacturers (e.g. Razer has a management app for both Mac and Windows) to change the polling rate and see if it improves things. Here are a few links to the polling rate issue:
https://www.reddit.com/r/MacOS/comments/juj8zs/mouse_lagging_in_big_sur/
https://discussions.apple.com/thread/252047056
https://www.reddit.com/r/MacOS/comments/jzd8iq/macos_big_sur_1101_bug_with_mouse/
https://www.reddit.com/r/MacOS/comments/k3880z/for_mouse_user_fixing_big_sur_lag_when_using_mouse/
  10. Thanks, we're aware of various unlinking issues, let us know if you find any more! So far it seems to be dragging layers up and down the layer stack (using Arrange options is fine), and moving layers in/out of enclosures or moving them to different enclosures. @Old Bruce with your mention of dragging the pair of layers, it might just be worth trying the Arrange menu options (or shortcuts, CMD+brackets/CTRL+brackets for Mac/Windows) to see if that works.
  11. Hi @CJI, I'll do my best to address the issues you've raised. I do a lot of 3D retouching work from Blender and 3ds Max/V-Ray (EXR output) so will hopefully be able to give you a few pointers here.

First of all, I assume you're using OpenEXR or Radiance HDR formats in 32-bit? 32-bit in Photo is a little different as it's not just linear but also unbounded, whereas Photo's 16-bit and 8-bit formats are gamma corrected and bounded. This means there are two main points to address: tone mapping and compositing.

How would you usually achieve tone mapping in Photoshop? In Photo, one option is to use the Tone Mapping Persona (not Develop), which gives you methods of mapping unbounded values to within the 0-1 range. You can also use OpenColorIO transforms: for example, with Blender, you can apply the Filmic view transform and looks. I did a video on that a couple of months ago. You can also try various adjustments for a more manual approach: the Exposure slider to shift extreme highlights down, for example, then Curves and Levels with a gamma adjustment.

This brings me on to compositing. Everything operates in linear space (scene linear) within 32-bit, and a gamma-corrected view transform is applied afterwards. It does mean that adjustments in particular may behave differently or seem more "sensitive". Photo allows you to use pretty much the entire roster of tools, adjustments and filters (with the exception of median operators) in 32-bit, but there are a few caveats. Most adjustments should avoid clipping or bounding values, but pre-tone mapping I would stick to Exposure, Curves, White Balance, Channel Mixer, Levels etc.

Brightness & Contrast will only operate on a 0-1 value range, but won't clip any unbounded values. Curves operates on the 0-1 range by default, but you can change the minimum and maximum input values: if you wanted to manipulate only bright values above 1, for example, you can set the minimum to 1 and the maximum to 100 (or whatever the brightest value in your document is). Adjustments like HSL, Selective Colour and others that focus on colour manipulation are best saved for post-tone mapping, where your pixel values will be in the 0-1 range. If you use OCIO adjustment layers (or check out my Blender Filmic macros, which do away with the OCIO dependency) you can add these adjustments above the tone mapping layers or group. If you want to use the Tone Mapping Persona, I'd advise you to do Layer>Merge Visible, tone map the merged pixel layer, then put your adjustments above this.

Hopefully by following the above advice you'll avoid the clipping and washing out that you describe. I suspect you may not have tone mapped your image and are trying to use adjustments like HSL and Selective Colour on the linear values? Converting to 16-bit at this point will not help, since unbounded values outside 0-1 will be clipped. You need to tone map first using the methods described above; then you can manipulate colour freely. That said, there are certain colour manipulations you can do on the linear values pre-tone map: Channel Mixer, for example, won't clip values, nor will White Balance. I also do a lot of stacked astrophotography editing in 32-bit linear, and sticking a White Balance adjustment before tone stretching is a really powerful way of neutralising background colour casts. It's useful with render compositing too, since you can completely shift the temperature and tint without introducing artefacting.

One final caveat, then: have you configured OpenColorIO at all with your Photo setup? This throws people off, because when you do have it configured, opening an EXR or HDR document will default to the OCIO transform method (the final document-to-screen conversion) rather than ICC-based. This is great if you intend to export back to EXR/HDR and simply want to do some retouching work, but if you plan to export from Photo to a standard delivery format like TIFF/PNG/JPEG you need to be using ICC Display Transform for an accurate representation. You can configure this on the 32-bit Preview panel (View>Studio>32-bit Preview).

To follow on from this, you mentioned profiles at the bottom of your post; I think you might be referring to document profiles? Don't get too hung up on this: in linear unbounded, the colour profile is somewhat arbitrary, and is only used with ICC Display Transform to convert and bound the linear values you're working with into gamma-corrected values. With Unmanaged or OpenColorIO view options, this profile does not matter. If you're aiming for web delivery, stick to sRGB (Linear) with ICC Display Transform set and everything will be fine!

Apologies for the small essay, hope you find all the above useful.
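As a rough illustration of the tone mapping stage discussed above, here's a minimal sketch that maps unbounded scene-linear values into 0-1 and then gamma-encodes them for display. The Reinhard operator is a simple stand-in of my choosing, not what the Tone Mapping Persona actually uses:

```python
# Illustrative only: tone-map unbounded linear values into 0-1
# (stand-in for the Tone Mapping Persona), then gamma-encode
# (stand-in for the ICC display transform to sRGB).

def reinhard(x):
    """Map an unbounded linear value into [0, 1)."""
    return x / (1.0 + x)

def linear_to_srgb(x):
    """Standard sRGB encoding of a 0-1 linear value."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1 / 2.4)) - 0.055

# An HDR highlight at linear 16.0 (well above 1.0) survives tone
# mapping instead of being clipped to white:
linear = 16.0
mapped = reinhard(linear)          # ~0.941, now inside 0-1
encoded = linear_to_srgb(mapped)   # ~0.974, ready for 8/16-bit export
print(round(mapped, 3), round(encoded, 3))
```

This is also why adjusting before tone mapping feels "sensitive": a small linear change near 0 becomes a large change after the gamma encode.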
  12. Well, you would need to research whether the listed monitors will accept an HDR10 signal. If they advertise HDR specifically and you can find a peak brightness value listed, such as 1000 nits/600 nits etc, then chances are you will be good to go. Even better if you can find detailed specifications and confirm they will accept HDR10. I cannot speak to the USB-C connectivity on those monitors and whether it would also carry an HDR10 signal, but since you're using a Mac with a discrete GPU I guess that's not an issue here... Unless a third party display specifically advertises it (none that I know of), EDR is limited to Apple displays and (possibly) LG's third party Thunderbolt displays. However, just to reiterate @Johannes, you would first need to update to Catalina or Big Sur. I believe EDR support is available on Mojave, but not HDR; you will need Catalina as a minimum for that. For all intents and purposes, though, they behave the same in the Affinity apps, and both are simply referred to as EDR in the user interface.
  13. Hi @Johannes, Apple opened up HDR support to most displays that accept an HDR10 signal with Catalina (EDR is their term for their own displays). I've tested it successfully on a Samsung CHG70, Acer XV273K and a couple of larger monitors/TVs. Apple displays themselves will typically have EDR, which is activated automatically and is available within the Affinity apps when working with 32-bit format documents. Displays include the 2019 MacBook Pro, 2020 iMac (I think?), Pro Display XDR etc. At one point the third party LG 5K monitors supported it in a public beta, but I'm not sure if that's still the case.

Any modern display with genuine HDR support should be fine. Be aware of lower-end models that have some kind of "HDR picture processing"; they need to actually accept an HDR10 signal. If in doubt, look for VESA-certified models, e.g. HDR400, HDR600, HDR1000 etc. Looking at the models you've listed, as long as they actually accept an HDR signal they should theoretically work with macOS's HDR compositing.

The bigger issue is making sure you have the right connectivity and OS updates. I noticed you're on Mojave; I believe you need Catalina as a minimum. The RX 580 is Polaris architecture so you should be OK there. Use an up-to-date DisplayPort cable (1.4) or HDMI cable (2.0 or higher). Try to use DisplayPort, since HDMI 2.0 has various a and b permutations which affect the maximum chroma sampling and refresh rate you can achieve at 4K HDR. I don't have the figures to hand at the moment, but I think the RX 580 might only be HDMI 2.0, so just stick with DisplayPort to be safe. You're on a Mac Pro with a discrete GPU, so you don't need to mess around with dongles. Good news there! There are USB-C DisplayPort dongles that support 4K60 HDR etc, and Apple's HDMI adapter will do 4K HDR, but I'm not sure about chroma sampling limitations there.

Assuming all your hardware supports it, and you've got Catalina or Big Sur installed, you should have an HDR toggle in your display preferences: Then in your Affinity app (typically Photo or Designer), make sure your document is in 32-bit and open the 32-bit Preview panel (View>Studio). Enable EDR and values above 1.0 will be mapped to the extended dynamic range: And that should be it! Hope that was helpful.
  14. Hey @jomaro, hopefully this should give you enough info to work with. Looks like you have an OCIO configuration set up, and your EXR file is appended with "acescg"? So Photo will convert from ACEScg to scene linear because it's found a matching profile. It's just the way 32-bit linear in Photo works: everything is converted to scene linear, then the colour space is defined either by traditional display profile conversion (ICC) or via OCIO. You choose which using the 32-bit Preview panel (View>Studio>32-bit Preview). Bear in mind that if you plan to export from Affinity Photo as a gamma-encoded format (16-bit or 8-bit JPEG, TIFF etc), you need to preview using ICC Display Transform.

So just to clarify: the document's colour profile (e.g. sRGB, ROMM RGB etc) is arbitrary and only applied when using ICC Display Transform; the scene linear values are then converted and bounded to that profile. If you choose Unmanaged, you'll be looking at the linear values. OCIO Display Transform will convert the scene linear values according to the view and device transforms selected in the combo boxes. You would need to install a given ICC profile for it to show up.

However, I suspect you probably want to be managing with OpenColorIO. Looks like you've already managed to set it up, but here's a reference video just in case: Within the 32-bit Preview panel, you will want to choose OCIO Display Transform (it will be enabled by default with a valid OCIO configuration). Then you set your view transform on the left and device transform on the right. The OCIO adjustment layer is for moving between colour spaces within the same document; you might want to do this for compositing layers with different source colour spaces, for example. You can also bake in colour space primaries if you wish (typically going from scene linear to whichever colour space you require).

Yes, Photo can convert to profiles on import/export by appending the file name with a given colour space. For example, if you imported a file named "render aces.exr", it would convert from the ACES colour space to scene linear. Similarly, if you append "aces" to your file name when you export back to EXR, it will convert the document primaries from scene linear back to ACES. Hopefully the above all helps? Let me know if you have any further questions!
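The filename convention described above could be modelled like this. This is a hypothetical sketch: the token list and the `detect_colour_space` helper are invented for illustration and are not Photo's actual matching logic (which checks against the spaces in your OCIO configuration):

```python
# Sketch of filename-based colour space tagging: match a colour-space
# token appended to the file name. KNOWN_SPACES is a made-up subset of
# what a real OCIO configuration would define.

KNOWN_SPACES = ["acescg", "aces", "srgb", "rec709"]  # hypothetical subset

def detect_colour_space(filename):
    """Return the colour-space token appended to an EXR file name, if any."""
    stem = filename.lower().rsplit(".", 1)[0]
    # Check longer tokens first so "acescg" isn't mistaken for "aces"
    for space in sorted(KNOWN_SPACES, key=len, reverse=True):
        if stem.endswith(space):
            return space
    return None  # no tag: the file would be treated as scene linear

print(detect_colour_space("render acescg.exr"))  # -> acescg
print(detect_colour_space("beauty_pass.exr"))    # -> None
```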
  15. Hi again @nater973, that's interesting regarding the monitor difference. I have a 4K monitor which is running nearer 5K (4608x2592) before being downsampled, and I saw no issues. I wonder if your BenQ has some artificial sharpening being applied, or whether it's some kind of retina scaling issue within Photo on Windows. I'll try and hook my 4K display up to my Windows box later and see if I can reproduce it. Just something to try if you have time: in Edit>Preferences>Performance you'll see a Retina Rendering option. Does the grid pattern change or disappear if you set this to Low or High quality rather than Auto? If it prompts you to restart, you can instead just go to View>New View for your settings to take effect.

If that's the noise level from stacking, what did your original exposures look like? There's still a lot of chromatic noise in your final image which really should have been cancelled out to some degree by stacking. Are you sure the image stacked successfully?

PS: if you don't need star alignment (I'm guessing you don't, as it's a wide-field shot), you could always try pre-processing the NEF files to 16-bit TIFFs, then stacking them in Photo using File>New Stack (don't stack raw files directly, as you get no tonal control). With a Mean operator (the default is Median) you might be surprised at the noise reduction you can achieve. Possibly worth a try anyway! Thanks again, James
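The Mean vs Median distinction mentioned in the PS can be shown with a toy example of a single pixel value across five aligned frames (the numbers are made up):

```python
# Toy illustration of stack operators: averaging aligned exposures
# suppresses random noise, while the median is better at rejecting
# outliers (satellite trails, cosmic ray hits, aeroplanes).

from statistics import mean, median

# Five "exposures" of the same pixel: true value ~100 plus noise,
# with one outlier (e.g. a satellite trail) in the last frame.
pixel_across_frames = [98, 103, 101, 97, 180]

print(mean(pixel_across_frames))    # 115.8 -- the outlier drags the mean up
print(median(pixel_across_frames))  # 101   -- the outlier is rejected
```

With clean frames (no outliers), Mean averages the random noise down more effectively than Median, which is the point being made above.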