James Ritson

Staff
Everything posted by James Ritson

  1. @the_tux I would certainly lean towards RAM being the issue. From profiling the GPU, it can use roughly 5.5GB of allocated memory during the merging process with all of your files selected, and will actually shoot up temporarily to over 8GB during the initial alignment process whilst the images are being decoded and held in memory. If I select just three files (picking a dark, middle and bright exposure) that usage reduces to around 2GB. Because of the unified memory architecture (as opposed to the GPU having its own pool of dedicated memory), you would have to factor in other apps, general overheads and other requirements that will easily push the memory over 8GB and therefore into swap. I've generally found the swap is very efficient with M1—in most cases I haven't even realised it's being used—but it appears there is likely a bottleneck when using the GPU for this task. I'm sorry your experience with M1 has been frustrating—I got one a couple of weeks ago and I was very sceptical, but was happy to be proven wrong. It's quiet, fast, and my hugely expensive 16" MacBook Pro is now gathering dust! Apart from a few scenarios with CLI apps that are going through the translation layer, everything is basically quicker and snappier, especially in the Affinity apps. Again, I do wonder whether this is either an optimisation issue or a memory issue. Have you tried FCP at all? For me, it's just as fast (if not faster) than when it's being used on a high-end MacBook Pro, and without obnoxiously loud fans too, which is a welcome change.
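To put rough numbers on the memory pressure described above, here's a minimal back-of-envelope sketch in Python. Only the merge figures come from the profiling mentioned in the post; the OS/app overhead is an assumption for illustration:

```python
# Back-of-envelope memory budget for the merge on a base M1 (unified memory).
# merge_peak_gb comes from the profiling described in the post; the overhead
# figure is an assumption, not a measured value.

UNIFIED_MEMORY_GB = 8.0   # total pool shared by CPU and GPU

merge_peak_gb = 8.0       # transient peak while frames are decoded and aligned
overhead_gb = 3.0         # assumed: macOS, window server, other running apps

shortfall_gb = merge_peak_gb + overhead_gb - UNIFIED_MEMORY_GB
if shortfall_gb > 0:
    print(f"~{shortfall_gb:.1f} GB likely pushed to swap during the merge")
```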
  2. Hi Dave, yes, it was a bit disappointing—the low resolution was compounded by a decision to frame everything in a window surrounded by graphics. The resampling was very poor, so UI text ended up illegible. Tomorrow's session should hopefully address these issues, as I believe the presentation will at least be full screen, and possibly streamed at 1080p. The two pre-recorded evening sessions will certainly go up on YouTube—for the two live sessions, I may well record versions of these to upload as well...
  3. Not at all, there’s nothing better than a quiet workstation! I had a 5700 XT with a single-fan blower design and it frankly drove me barmy. Same with the latest MacBook Pro models that have Navi GPUs—fans that take off if you dare to do anything remotely taxing. I ended up disabling turbo boost on the CPU and taking a performance hit just so the fans weren’t so obnoxiously loud. It becomes a big issue if you’re trying to record audio as well. I’m using a Mac Mini M1 now and I can’t recommend it enough! The thing is basically silent and I’ve never heard the fan at all. In most situations it’s basically as fast as a MacBook Pro four times the price.
  4. Not necessarily—single-fan blower cards can be quite noisy, but there are various designs that are cool and quiet even with higher-end models. On my 2070 Super the fans are actually off most of the time (it’s a triple-fan intake design). It takes strenuous activity like gaming to actually spin them up—using Photo, editing video and even rendering in Blender doesn’t tend to activate them very often. I would avoid older-generation cards if noise or heat is your concern: they’re more likely to use an older, less efficient architecture, so you might find the opposite is true...
  5. That's definitely a bottleneck nowadays—I was previously using an i7 3930K and I put off upgrading for years, believing that I wouldn't see a worthwhile increase in CPU performance. I finally moved to a 10850K and it became clear just how much the old CPU was holding my system back! The GPU benchmark score in Photo went from around 4,000 to 10,000—and everything else was notably snappier and quicker of course, including gaming. A new GPU can improve an old system, but ideally you would want to pair it with a modern setup that can drive it efficiently. The VRAM requirement depends on your workflow and setup—screen resolution, document pixel resolution, number of pixel/image layers, bit depth etc. If you really push the envelope, editing huge-resolution documents on a 4K screen for example, it's better to have a minimum of 4GB, with 8GB giving plenty of headroom for multi-layer 32-bit work. It's clear from the benchmark scores (and real-world usage) that the Affinity apps benefit from more powerful GPUs: they deliver smoother performance, especially with live filters, and quicker compositing times, especially for exporting/merging layers. I use an Acer XV273K 4K display for colour-critical work (shared between Mac and Windows setups), and I have an Asus 280Hz gaming monitor for general use/gaming which feels incredibly smooth! I use an i1 Display Pro with DisplayCAL (ArgyllCMS) for calibration and profiling. I'm quite precious about colour accuracy, so the Acer monitor was a nice surprise—I check it every couple of months or so but it hasn't drifted at all yet.
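As a rough illustration of why VRAM needs scale with document size and bit depth, here's a small Python sketch. The 8000x6000 document is hypothetical, and the footprint ignores compression, tiling and any working buffers the app actually uses:

```python
def layer_bytes(width, height, channels=4, bytes_per_channel=4):
    """Uncompressed footprint of one RGBA pixel layer at 32-bit float."""
    return width * height * channels * bytes_per_channel

# Hypothetical 8000x6000 document in 32-bit RGBA:
per_layer = layer_bytes(8000, 6000)
print(f"{per_layer / 2**20:.0f} MiB per pixel layer")  # ~732 MiB

# A handful of pixel layers alone would swamp a 2GB card, which is why
# 4GB is a sensible floor and 8GB gives headroom for multi-layer 32-bit work.
```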
  6. The majority of gaming cards are quite performant for creative workflows. Unless you need the double-precision performance of a workstation card (and 10-bit output, although I understand Nvidia's Studio drivers unlock this capability for consumer cards now), you should find any mid-to-high-end gaming card is a great addition to your setup. I run an Nvidia 2070 Super on my Windows setup and it's great; I can't complain about OpenCL performance in Affinity Photo at all. It lets me work with 32-bit precision documents in real time, applying several live filters with no real dip in performance. OptiX support in Blender for accelerated ray tracing is also pretty awesome! The main reason I got it, however, was for its hardware video encoding. The RTX architecture brought a real upgrade to the quality and efficiency of its hardware encoding—it's on par with the Medium preset of x264, except it can encode 4K 60fps in real time, which is impressive. AMD's Navi cards are also pretty good, but unfortunately there's a driver issue that means kernel compilation currently takes a huge amount of time in the Affinity apps. Hopefully AMD will address that soon, because once you get past that initial roadblock the actual OpenCL performance is really good. I've tested with a 5700 XT and it's not that far off the 2070 Super. If you're incorporating video editing and encoding into your workflow, I would lean towards Nvidia. They just seem to have a stronger focus on video encoding/decoding tech—the enhancements of the RTX line are present and equal on all models, so you could get the same benefits from a 2060 as you would from a more expensive 2080 Ti, for example. Same goes for the RTX 3000 models.
  7. Apologies that you feel misled, but they are tutorials and not designed to advertise or lay claim to any kind of speed expectation—during the recording process, the stacking would usually take anywhere from 30 seconds to 2 minutes, which understandably is a long time for a viewer to sit through when they are just trying to see a step-by-step tutorial guiding them through the workflow. Unfortunately it seems there must be an issue somewhere—I would be leaning towards either the CR3 format or maybe even the pixel resolution, both of which are issues that would need to be addressed. What pixel resolution are your images? Do you happen to be using a full frame 40-50MP (or greater) camera? The alignment implementation we use is quite slow but precise—if you were attempting to stack hundreds of light frames I would understand the process taking so long. The most challenging stack I've given Photo so far is around 550 light frames (24 megapixel resolution) with around 40-50 calibration frames of each type. With all frames enabled, including bad frames that contain star trailing, the stacking can take around an hour. The fact that you cannot stack a mere 22 frames, however, is why I'm asking about pixel resolution. Are you able to try stacking just two light frames to begin with, and perhaps using a smaller number of calibration frames, e.g. 10 each? I'm not suggesting this as a workaround, merely a way of determining whether it's the amount of data causing the issue, or if it could be related to your particular camera and RAW format. Possibly. I would however argue that DSS has years of maturity under its belt, and therefore plenty of testing, bug fixing and no doubt countless users providing data samples from all manner of setups. In order to bring this feature to 1.9 we gathered as much data as possible within the development time period: primarily FIT files of varying resolutions, both monochrome and OSC, and RAW file data from 24 megapixel cameras, mainly Sony and Canon (albeit CR2 format). When DSS first released I would imagine it wasn't perfect out of the gate. We had a lengthy beta period but obviously only a handful of users will realistically engage with that—the initial results from external users were promising however. Since release we've become aware of issues from a larger pool of users and are trying to tackle them. Any help is greatly appreciated. All the best, James
  8. Hey @one2gr82c, your screenshot confirms the same issue: you must use the filter on a pixel layer (you're currently trying to apply it to an adjustment layer). Have you watched the tutorials? Generally, you should do a Merge Visible after the initial tone stretching to create a merged, tone-stretched pixel layer, then apply the Remove Background filter to that layer. Something like this:
  9. Hey @ds.johnston, apologies it's not working for you (I did the videos you are referring to). My experience on both Mac and Windows has generally been successful with both FIT and RAW files. RAW files take longer to register and stack because they have to be debayered—and are typically higher resolution—but I haven't come across any crashing when stacking. The only crashing I've experienced has been when accidentally mixing up pixel-binned data—e.g. Bin1x1 light frames with Bin2x2 dark frames added by mistake. Do you have any details about the data you are trying to stack that you would be able to share? E.g. pixel resolution, number of light frames and calibration frames. In particular, I see you have selected the Bad Pixel Map Tool and presumably the Detect Bad Pixels stage is taking a while—how many dark frames are you using? It would be useful to clarify this, however: does the app actually crash (as in completely exit of its own volition) or do you assume it's crashed because the progress bar is still present after an hour? We are noticing a non-linear increase in stacking time after a certain threshold of light frames are added, which is also dependent on the quality of the frames. The alignment method used is quite slow but accurate, and this may be factoring into the time required. For example, my largest data set so far has been just over 550 light frames—if I stack all of them, the process can easily take over an hour, which is exacerbated by bad frames with star trailing etc. If I use Select Best Light Frames and set the percentage to around 70-80%, which rejects probably 100-150 of these frames, the stacking time reduces to between 20 and 30 minutes. Worth a try if you're actually stacking all of your frames—filter out the bad ones first? As mentioned above, any more information about your data would be useful as well!
  10. Hi again Harald, no problem—the live stacking functionality is interesting; I was aware of it but hadn't seen any results. It would be more practical to be able to just open a single FIT file within Photo rather than having to use the Astrophotography Stack Persona. I will chase this up as I think there may have been plans to implement it. For now, however, I would recommend stacking with your individual subs and calibration frames—Photo will process everything in 32-bit floating point precision, whereas the FIT file you provided appears to just be 16-bit, which may limit flexibility in post production (particularly with tone stretching). Thanks again, James
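To illustrate why 16-bit integer data limits tone-stretching headroom compared to 32-bit float, here's a toy numpy sketch (the power-curve stretch is a crude stand-in, not Photo's actual tone-stretch operator):

```python
import numpy as np

# A faint signal occupying a tiny slice of the range, as in an unstretched sub
linear = np.linspace(0.001, 0.002, 100_000)

as_16bit = np.round(linear * 65535) / 65535  # quantised to 16-bit integer steps
as_float = linear.astype(np.float32)         # float keeps relative precision

stretch = lambda x: x ** 0.2                 # crude stand-in for a tone stretch

print(len(np.unique(stretch(as_16bit))))     # ~66 distinct levels -> posterisation
print(len(np.unique(stretch(as_float))))     # ~100,000 distinct levels
```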
  11. Ah, I see, that looks like a pre-stacked image, possibly from other software? I couldn't tell anything from the metadata. Photo is expecting raw or calibrated frames to stack with. Have you tried any of your individual subs rather than a stacked result from other software?
  12. Hey @hjnet, this is when adding your files to the Light frames file list? Across on RAW Options, have you tried the FITS Bayer pattern option? It should be inferred from the metadata but we're finding this isn't the case with files from some setups and it defaults to Monochrome. I think the ASI224MC uses an RGGB pattern so try setting that. Hope that helps!
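For anyone curious what that Bayer pattern setting implies, here's a minimal numpy sketch of how RGGB data is laid out. If the app wrongly assumes Monochrome, these four interleaved colour planes get treated as one luminance image, which is why the setting matters:

```python
import numpy as np

def split_rggb(raw):
    """Pull the four Bayer planes out of a sensor frame assumed to be RGGB."""
    r  = raw[0::2, 0::2]  # red sites: even rows, even columns
    g1 = raw[0::2, 1::2]  # first green: even rows, odd columns
    g2 = raw[1::2, 0::2]  # second green: odd rows, even columns
    b  = raw[1::2, 1::2]  # blue sites: odd rows, odd columns
    return r, g1, g2, b

frame = np.random.randint(0, 4096, (960, 1280), dtype=np.uint16)  # fake 12-bit frame
r, g1, g2, b = split_rggb(frame)
```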
  13. Hi Fivel, there is a short video for the filter, you can view it here: Hope that helps!
  14. Hello all, I will be gradually adding 1.9 functionality videos to YouTube (they are all currently on our website's Learn section). I've just finished the astrophotography videos so you can now view them on YouTube:
     • Astrophotography stacking: SHO composition
     • Astrophotography stacking: One shot colour workflow
     • Astrophotography stacking: Monochrome colour mapping
     • Astrophotography stacking: LRGB composition
     • Astrophotography stacking: HaRGB composition
     • Astrophotography stacking: Bi-colour composition
     • Astrophotography: Removing background gradients
     • Astrophotography: Equalising colours
     The initial post has been updated with these videos too!
  15. There will be six full-length tutorials for launch on different astrophotography workflows 🙂 (and probably more in the future).
  16. @Danielcely @Russell_L @Griffy apologies for bumping this thread, but there's hopefully a solution—do any of you happen to be using desktop mice, and if so are they gaming mice, IntelliMouse, other high-end devices etc? Big Sur has an issue with mice that have high polling rates, and it's causing all sorts of stuttering and performance issues across the board. I have an IntelliMouse Pro and brush work in the Affinity apps is unusable—in fact, anything relying on the mouse pointer is affected: manipulating layers, selecting UI elements etc. Trying to navigate around the Big Sur UI (especially the dock) causes stutter and slowdown as well. The ham-fisted solution was to download the Microsoft Mouse and Keyboard Center app on Windows, change my mouse polling rate from 1000Hz to 125Hz, and now everything in macOS is fine, including performance in the Affinity apps. Worth a try on the off chance that you are all using non-Apple pointer devices? You should be able to find software for most manufacturers (e.g. Razer has a management app for both Mac and Windows) to change the polling rate and see if it improves things. Here are a few links to the polling rate issue: https://www.reddit.com/r/MacOS/comments/juj8zs/mouse_lagging_in_big_sur/ https://discussions.apple.com/thread/252047056 https://www.reddit.com/r/MacOS/comments/jzd8iq/macos_big_sur_1101_bug_with_mouse/ https://www.reddit.com/r/MacOS/comments/k3880z/for_mouse_user_fixing_big_sur_lag_when_using_mouse/
  17. Thanks, we're aware of various unlinking issues, let us know if you find any more! So far it seems to be dragging layers up and down the layer stack (using Arrange options is fine), and moving layers in/out of enclosures or moving them to different enclosures. @Old Bruce with your mention of dragging the pair of layers, it might just be worth trying the Arrange menu options (or shortcuts, CMD+brackets/CTRL+brackets for Mac/Windows) to see if that works.
  18. Hi @CJI, I'll do my best to address the issues you've raised. I do a lot of 3D retouching work from Blender and 3ds Max/V-ray (EXR output) so will hopefully be able to give you a few pointers here. First of all, I assume you're using OpenEXR or Radiance HDR formats in 32-bit? 32-bit in Photo is obviously a little bit different as it's not just linear but also unbounded, whereas Photo's 16-bit and 8-bit formats are gamma corrected and bounded. This probably means there are two main points to address: tone mapping and compositing. How would you usually achieve tone mapping in Photoshop? In Photo, one option is to use the Tone Mapping Persona (not Develop), which will give you methods of mapping unbounded values to within the 0-1 range. You can also use OpenColorIO transforms—for example, with Blender, you can apply the Filmic view transform and looks. I did a video on that a couple of months ago: You can also try various adjustments for a more manual approach—the Exposure slider to shift extreme highlights down, for example, then Curves and Levels with a gamma adjustment. This brings me onto compositing—everything operates in linear space (scene linear) within 32-bit, then you have a gamma corrected view transform applied afterwards. It does mean that adjustments in particular may behave differently or seem more "sensitive". Photo allows you to use pretty much the entire roster of tools, adjustments and filters (with the exception of median operators) in 32-bit, but there are a few caveats. Most adjustments should avoid clipping or bounding values, but pre-tone mapping I would stick to Exposure, Curves, White Balance, Channel Mixer, Levels etc. Brightness & Contrast will only operate on a 0-1 value range, but won't clip any unbounded values, and Curves will operate on the 0-1 range by default but you can change the minimum and maximum input values—so if you wanted to only manipulate bright values above 1, for example, you can set minimum to 1 and maximum to 100 (or whatever the brightest value is in your document). Adjustments like HSL, Selective Colour and others that tend to focus on colour manipulation are best saved post-tone mapping, where your pixel values will be in the 0-1 range. If you use OCIO adjustment layers (or check out my Blender Filmic macros which do away with the OCIO dependency) you can add these adjustments above the tone mapping layers or group. If you want to use the Tone Mapping Persona, I'd advise you to do Layer>Merge Visible and then tone map the merged pixel layer, then put your adjustments above this. Hopefully by following the above advice, you'll avoid the clipping and washing out that you describe. I suspect you may not have tone mapped your image and are trying to use adjustments like HSL, Selective Colour on the linear values? Converting to 16-bit at this point will not help the issue, since unbounded values outside 0-1 will be clipped. You need to tone map first using methods described above, then you can manipulate colour freely. That said, as I've covered above, there are certain colour manipulations you can do on the linear values pre-tone map. Channel Mixer, for example, won't clip values, nor will White Balance. I also do a lot of stacked astrophotography editing in 32-bit linear, and sticking a White Balance adjustment before tone stretching is a really powerful way of neutralising background colour casts. It's useful with render compositing too since you can completely shift the temperature and tint without introducing artefacting. 
One final caveat, then—have you configured OpenColorIO at all with your Photo setup? This throws people off, because when you do have it configured, opening an EXR or HDR document will default to the OCIO transform method (the final document to screen conversion) rather than ICC-based. This is great if you intend to export back to EXR/HDR and simply want to do some retouching work, but if you plan to export from Photo to a standard delivery format like TIFF/PNG/JPEG etc you need to be using ICC Display Transform for an accurate representation. You can configure this on the 32-bit Preview Panel (View>Studio>32-bit Preview). To follow on from this, you mentioned profiles at the bottom of your post—I think you might be referring to document profiles etc? Don't get too hung up on this—in linear unbounded, the colour profile is somewhat arbitrary, and is only used with ICC Display Transform to convert and bound the linear values you're working with into gamma corrected values. With Unmanaged or OpenColorIO view options, this profile does not matter. If you're aiming for web delivery, stick to sRGB (Linear) with ICC Display Transform set and everything will be fine! Apologies for the small essay, hope you find all the above useful.
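To make the linear-versus-bounded behaviour described in the post above concrete, here's a small numpy sketch. The Reinhard-style curve and the 2.2 gamma are assumptions for illustration; they are not the exact operators Photo's Tone Mapping Persona or ICC display transform use:

```python
import numpy as np

# Scene-linear pixel values, including unbounded highlights above 1.0
px = np.array([0.05, 0.4, 1.0, 6.0, 40.0])

exposed = px * 2.0 ** -2          # Exposure is a simple gain in linear space (-2 stops)

clipped = np.clip(px, 0.0, 1.0)   # a bounded adjustment applied naively: 6.0 and 40.0 are lost

tone_mapped = px / (1.0 + px)     # crude Reinhard-style map into 0-1 (an assumption);
                                  # colour-focused adjustments like HSL belong after this

display = tone_mapped ** (1 / 2.2)  # simplified gamma-corrected view transform
```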
  19. Well, you would need to research whether the listed monitors will accept an HDR10 signal. If they advertise HDR specifically and you can find a peak brightness value listed, such as 1000 nits/600 nits etc, then yes, chances are you will be good to go. Even better if you can find detailed specifications and confirm they will accept HDR10. I cannot speak about the USB-C connectivity on those monitors and whether it would also carry an HDR10 signal, but since you’re using a Mac with a discrete GPU I guess that’s not an issue here... Unless a third-party display specifically advertises it—none that I know of do—EDR is limited to Apple displays and (possibly) LG’s third-party Thunderbolt displays. However, just to reiterate @Johannes, you would first need to update to Catalina or Big Sur. I believe EDR support is available on Mojave, but not HDR; you will need Catalina as a minimum for that. For all intents and purposes, though, they behave the same in the Affinity apps, and both are simply referred to as EDR in the user interface.
  20. Hi @Johannes, Apple opened up HDR support to most displays that support an HDR10 signal with Catalina (EDR is their term for their own displays). I've tested it successfully on a Samsung CHG70, Acer XV273K and a couple of larger monitors/TVs. Apple displays themselves will typically have EDR which is just activated automatically, and is available within the Affinity apps when working with 32-bit format documents. Displays include the 2019 MacBook Pro, 2020 iMac (I think?), Pro Display XDR etc. At one point the third party LG 5K monitors supported it in a public beta, but I'm not sure if that's still the case. Any modern display with genuine HDR support should be fine—be aware of lower end models that have some kind of "HDR picture processing", they need to actually support an HDR10 signal. If in doubt, look for VESA certified models, e.g. HDR400, HDR600, HDR1000 etc. Looking at those models you've listed, as long as they actually accept an HDR signal they should theoretically work with macOS's HDR compositing. The bigger issue is making sure you have the right connectivity and OS updates—I noticed you're on Mojave, I believe you need Catalina as a minimum. The RX580 is Polaris architecture so you should be OK there. Use an up-to-date DisplayPort cable (1.4) or HDMI cable (2.0 or higher). Try and use DisplayPort, since HDMI 2.0 has various a and b permutations which affect the maximum chroma sampling and refresh rate you can achieve at 4K HDR etc. I don't have the figures to hand at the moment but I think the RX 580 might only be HDMI 2.0, so just stick with DisplayPort to be safe. You're on a Mac Pro with a discrete GPU, so you don't need to mess around with dongles—good news there! There are USB-C DisplayPort dongles that support 4K60 HDR etc, and Apple's HDMI adapter will do 4K HDR but I'm not sure about chroma sampling limitations there. Assuming all your hardware supports it, and you've got Catalina or Big Sur installed, you should have an HDR toggle on your display preferences: Then in your Affinity app (typically Photo or Designer), make sure your document is in 32-bit and open the 32-bit Preview panel (View>Studio). Enable EDR and then values above 1.0 will be mapped to the extended dynamic range: And that should be it! Hope that was helpful.
  21. Hey @jomaro, hopefully this should give you enough info to work with. It looks like you have an OCIO configuration set up, and your EXR file is appended with "acescg"? So Photo will convert from ACEScg to scene linear because it's found a matching profile. It's just the way 32-bit linear in Photo works—everything is converted to scene linear, then the colour space is defined either by traditional display profile conversion (ICC) or via OCIO. You choose which using the 32-bit preview panel (View>Studio>32-bit Preview). Bear in mind that if you plan to export from Affinity Photo as a gamma-encoded format (16-bit, 8-bit JPEG/TIFF etc), you need to preview using ICC Display Transform. So just to clarify, the document's colour profile (e.g. sRGB, ROMM RGB etc) is arbitrary and only applied when using ICC Display Transform—then the scene linear values are converted and bounded to that profile. If you choose Unmanaged, you'll be looking at the linear values. OCIO Display Transform will convert the scene linear values according to the view and device transforms selected in the combo boxes. You would need to install a given ICC profile for that to show up. However, I suspect you probably want to be managing with OpenColorIO. Looks like you've already managed to set it up, but here's a reference video just in case: Within the 32-bit preview panel, you will want to choose OCIO Display Transform (it will be enabled by default with a valid OCIO configuration). Then you set your view transform on the left and device transform on the right. The OCIO adjustment layer is for moving between colour spaces within the same document—you might want to do this for compositing layers with different source colour spaces, for example. You can also bake in colour space primaries if you wish (typically going from scene linear to whichever colour space you require). Yes, Photo can convert to profiles on import/export by appending the file name with a given colour space. For example, if you imported a file named "render aces.exr", it would convert from the ACES colour space to scene linear. Similarly, if you append "aces" to your file name when you export back to EXR, it will convert the document primaries from scene linear back to ACES. Hopefully the above helps—let me know if you have any further questions!
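If you ever need to replicate one of these conversions outside Photo, the OpenColorIO Python bindings can run the transforms your config defines. A minimal sketch, assuming the OCIO v2 bindings are installed and that "ACEScg" and "scene_linear" exist as colour space/role names in your particular config:

```python
import numpy as np
import PyOpenColorIO as OCIO

# Load the same config.ocio that your $OCIO variable (or Photo's OCIO preference) points at
config = OCIO.Config.CreateFromFile("config.ocio")  # path is an assumption

# Equivalent in spirit to an OCIO adjustment layer: ACEScg -> scene linear
processor = config.getProcessor("ACEScg", "scene_linear")
cpu = processor.getDefaultCPUProcessor()

pixels = np.full((4, 4, 3), 0.18, dtype=np.float32)  # a tiny grey test image
cpu.applyRGB(pixels)                                 # transforms the buffer in place
print(pixels[0, 0])
```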
  22. Hi again @nater973, that's interesting regarding the monitor difference. I have a 4K monitor which is running nearer 5K (4608x2592) before being downsampled, and I saw no issues. I wonder if your BenQ has some artificial sharpening being applied, or whether it's some kind of retina scaling issue within Photo on Windows. I'll try and hook my 4K display up to my Windows box later and see if I can reproduce it. Just something to try if you have time—in Edit>Preferences>Performance you'll see a Retina Rendering option. Does the grid pattern change/disappear if you set this to Low or High quality rather than Auto? If it prompts you to restart, you can instead just go to View>New View for your settings to take effect. If that's the noise level from stacking, what did your original exposures look like? There's still a lot of chromatic noise in your final image which really should have been cancelled out to some degree by stacking—are you sure the image stacked successfully? PS if you don't need star alignment (I'm guessing due to it being a wide field shot you don't), you could always try pre-processing the NEF files to 16-bit TIFFs then stacking them in Photo using File>New Stack (don't stack RAWs as you get no tonal control). With a Mean operator (default is Median) you might be surprised at the noise reduction you can achieve. Possibly worth a try anyway! Thanks again, James
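If you do try File>New Stack on pre-processed TIFFs, this toy numpy sketch shows why Mean outperforms Median for pure noise reduction on aligned frames, while Median earns its keep rejecting outliers such as satellite trails:

```python
import numpy as np

rng = np.random.default_rng(7)
signal = 0.2                                          # true background level
subs = signal + rng.normal(0, 0.05, (20, 100, 100))   # 20 noisy aligned exposures

print(subs.mean(axis=0).std())        # ~0.011: noise drops by ~sqrt(20)
print(np.median(subs, axis=0).std())  # ~0.014: more residual noise, but robust to outliers
```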
  23. Hi @nater973, there are a couple of things to dissect here (I'm internal staff so the samples have been shared with me, hopefully that's OK?). Firstly, are you seeing the grid pattern with a particular image viewer, or perhaps when uploaded? I've checked your output JPEG in Photo and macOS Preview and cannot see any grid artefacting. I've also tried resampling the TIFF you've provided using different resampling methods (Bilinear for softest resampling, for example) and cannot see any particular grid patterning. What's in your layer stack when editing? Are you using any live filters for example? Perhaps try flattening your document before using Document>Resize Document to see if the results differ? Whilst I can't see any grid patterning, I can see shimmering when moving between different zoom levels. There is a lot of high frequency content in your document as a result of the noisy sky. Have you applied some additional sharpening as well? I would propose at the very least removing all the colour noise—this may help with your grid pattern issue, since it should hide obvious bayer pattern noise, but it will also help greatly with compression efficiency when you come to export. You can de-noise destructively (Filters>Noise>Denoise) or non-destructively (Layer>New Live Filter Layer>Noise>Denoise). Whichever method you use, just bring the Colour slider up to about 15% and you should find it gets rid of the colour noise without sacrificing your star colour information. For export, I would also recommend using a small amount of luminance de-noising as well (before resampling your document, if you choose to do so). This will just take the "edge" off all the high frequency detail and make it more compressible. If you want to go further, some kind of low pass may help as well (e.g. small amount of gaussian blur). Just by de-noising, I managed to halve the file size of the eventual export at full resolution (JPEG, at 90 quality): it went from 31MB to 15MB. Hope the above helps—if the issue persists, could you perhaps capture a screengrab of however you are viewing the image when the grid pattern is visible? Finally, to give this post some interesting visuals, here's a comparison of the image before/after de-noising in the frequency domain! You can see this by using the Scope panel (View>Studio>Scope) and selecting Power Spectral Density. Useful for finding out the frequency "composition" of your image (and potentially how noisy it is).
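For anyone wanting to reproduce that frequency-domain view outside the Scope panel, here's a minimal numpy sketch of a power spectral density. Noisy images spread energy across high frequencies; after de-noising, energy concentrates near the centre (low frequencies), which also compresses better as JPEG:

```python
import numpy as np

def power_spectral_density(gray):
    """Log power spectrum of a single-channel float image, DC centred."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray))
    return np.log1p(np.abs(spectrum) ** 2)

noisy = np.random.default_rng(0).random((256, 256))  # stand-in for a noisy sky crop
psd = power_spectral_density(noisy)                  # view with any image display tool
```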
  24. Hey all, recently I've been re-recording some old videos and producing some new ones too. They've been going up on YouTube but I've now updated the original post with the new links. Here are all the new/re-uploaded videos:
     • New document with templates
     • Tool cycling
     • Keyboard and Mouse Brush Modifier
     • Stacking: Object removal
     • Panoramas
     • Sky replacement
     • Bitmap pattern fills
     • Selecting sampled colours
     • Selection Brush Tool
     • Freehand Selection Tool: Freehand, Polygonal and Magnetic modes
     • Procedural Texture: Nonlinear Transform Correction
     • Editing metadata
     • Retouching scanned line drawings
     • Applying Blender Filmic looks
     • Compositing 3ds Max and V-Ray render passes
     As always, hope you find them useful!