Everything posted by James Ritson

  1. @Danielcely @Russell_L @Griffy apologies for bumping this thread, but there's hopefully a solution. Do any of you happen to be using desktop mice, and if so, are they gaming mice, an IntelliMouse or other high-end devices? Big Sur has an issue with mice that have high polling rates, and it causes all sorts of stuttering and performance issues across the board. I have an IntelliMouse Pro and brush work in the Affinity apps is unusable, as is anything relying on the mouse pointer: manipulating layers, selecting UI elements and so on. Trying to navigate around the Big Sur UI (especially the Dock) causes stutter and slowdown as well. The ham-fisted solution was to download the Microsoft Mouse and Keyboard Center app on Windows, change my mouse polling rate from 1000Hz to 125Hz, and now everything in macOS is fine, including performance in the Affinity apps. Worth a try on the off chance that you are all using non-Apple pointing devices? You should be able to find software from most manufacturers (e.g. Razer has a management app for both Mac and Windows) to change the polling rate and see if it improves things. Here are a few links covering the polling rate issue:
https://www.reddit.com/r/MacOS/comments/juj8zs/mouse_lagging_in_big_sur/
https://discussions.apple.com/thread/252047056
https://www.reddit.com/r/MacOS/comments/jzd8iq/macos_big_sur_1101_bug_with_mouse/
https://www.reddit.com/r/MacOS/comments/k3880z/for_mouse_user_fixing_big_sur_lag_when_using_mouse/
  2. Thanks, we're aware of various unlinking issues, let us know if you find any more! So far it seems to be dragging layers up and down the layer stack (using Arrange options is fine), and moving layers in/out of enclosures or moving them to different enclosures. @Old Bruce with your mention of dragging the pair of layers, it might just be worth trying the Arrange menu options (or shortcuts, CMD+brackets/CTRL+brackets for Mac/Windows) to see if that works.
  3. Hi @CJI, I'll do my best to address the issues you've raised. I do a lot of 3D retouching work from Blender and 3ds Max/V-Ray (EXR output), so hopefully I can give you a few pointers here.

First of all, I assume you're using OpenEXR or Radiance HDR formats in 32-bit? 32-bit in Photo is a little different, as it's not just linear but also unbounded, whereas Photo's 16-bit and 8-bit formats are gamma corrected and bounded. That means there are two main points to address: tone mapping and compositing.

How would you usually achieve tone mapping in Photoshop? In Photo, one option is to use the Tone Mapping Persona (not Develop), which gives you methods of mapping unbounded values into the 0-1 range. You can also use OpenColorIO transforms: with Blender, for example, you can apply the Filmic view transform and looks. I did a video on that a couple of months ago. You can also try various adjustments for a more manual approach: the Exposure slider to shift extreme highlights down, for example, then Curves and Levels with a gamma adjustment.

This brings me on to compositing. Everything operates in linear space (scene linear) within 32-bit, with a gamma corrected view transform applied afterwards, so adjustments in particular may behave differently or seem more "sensitive". Photo lets you use pretty much the entire roster of tools, adjustments and filters (with the exception of median operators) in 32-bit, but there are a few caveats. Most adjustments should avoid clipping or bounding values, but pre-tone mapping I would stick to Exposure, Curves, White Balance, Channel Mixer, Levels etc. Brightness & Contrast will only operate on a 0-1 value range, but won't clip any unbounded values. Curves operates on the 0-1 range by default, but you can change the minimum and maximum input values: if you wanted to manipulate only bright values above 1, for example, you can set the minimum to 1 and the maximum to 100 (or whatever the brightest value in your document is).

Adjustments like HSL, Selective Colour and others that focus on colour manipulation are best saved for after tone mapping, when your pixel values will be in the 0-1 range. If you use OCIO adjustment layers (or check out my Blender Filmic macros, which do away with the OCIO dependency), you can add these adjustments above the tone mapping layers or group. If you want to use the Tone Mapping Persona, I'd advise you to do Layer>Merge Visible, tone map the merged pixel layer, then put your adjustments above this.

Hopefully by following the above advice you'll avoid the clipping and washing out you describe. I suspect you may not have tone mapped your image and are trying to use adjustments like HSL and Selective Colour on the linear values? Converting to 16-bit at this point will not help, since unbounded values outside 0-1 will be clipped. You need to tone map first using the methods described above; then you can manipulate colour freely. That said, as covered above, there are certain colour manipulations you can do on the linear values pre-tone map. Channel Mixer, for example, won't clip values, nor will White Balance. I also do a lot of stacked astrophotography editing in 32-bit linear, and sticking a White Balance adjustment before tone stretching is a really powerful way of neutralising background colour casts. It's useful with render compositing too, since you can completely shift the temperature and tint without introducing artefacting.
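As a rough illustration of what "mapping unbounded values into the 0-1 range" means, here's a minimal Python/numpy sketch (Python is purely illustrative here, and the simple global Reinhard curve is my own stand-in, not what the Tone Mapping Persona actually implements):

    import numpy as np

    def tonemap(linear_rgb, exposure_ev=0.0):
        """Compress unbounded scene-linear values into 0-1.
        Uses the simple global Reinhard curve x/(1+x) after an
        exposure shift in stops -- illustrative only."""
        exposed = linear_rgb * (2.0 ** exposure_ev)
        return exposed / (1.0 + exposed)

    # Unbounded 32-bit values: 8.0 is three stops above "white"
    hdr = np.array([0.05, 0.5, 1.0, 8.0, 100.0])
    print(tonemap(hdr))  # every value now lands inside 0-1

Note how the extreme highlights (8.0, 100.0) are compressed asymptotically rather than clipped, which is exactly why clipping-prone adjustments are best left until after tone mapping.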
One final caveat, then: have you configured OpenColorIO at all with your Photo setup? This throws people off, because when you do have it configured, opening an EXR or HDR document will default to the OCIO transform method (the final document-to-screen conversion) rather than ICC-based. This is great if you intend to export back to EXR/HDR and simply want to do some retouching work, but if you plan to export from Photo to a standard delivery format like TIFF/PNG/JPEG, you need to be using ICC Display Transform for an accurate representation. You can configure this on the 32-bit Preview panel (View>Studio>32-bit Preview).

To follow on from this, you mentioned profiles at the bottom of your post; I think you might be referring to document profiles? Don't get too hung up on this. In linear unbounded, the colour profile is somewhat arbitrary, and is only used with ICC Display Transform to convert and bound the linear values you're working with into gamma corrected values. With the Unmanaged or OpenColorIO view options, this profile does not matter. If you're aiming for web delivery, stick to sRGB (Linear) with ICC Display Transform set and everything will be fine! Apologies for the small essay, hope you find all the above useful.
  4. Well, you would need to research whether the listed monitors will accept an HDR10 signal. If they advertise HDR specifically and you can find a peak brightness value listed, such as 1000 nits or 600 nits, then chances are you will be good to go. Even better if you can find detailed specifications and confirm they will accept HDR10. I cannot speak to the USB-C connectivity on those monitors and whether it would also carry an HDR10 signal, but since you're using a Mac with a discrete GPU I guess that's not an issue here. Unless a third party display specifically advertises it (none that I know of), EDR is limited to Apple displays and possibly LG's third party Thunderbolt displays. However, just to reiterate @Johannes, you would first need to update to Catalina or Big Sur. I believe EDR support is available on Mojave, but not HDR; you will need Catalina as a minimum for that. For all intents and purposes, though, they behave the same in the Affinity apps, and both are simply referred to as EDR in the user interface.
  5. Hi @Johannes, Apple opened up HDR support to most displays that accept an HDR10 signal with Catalina (EDR is their term for their own displays). I've tested it successfully on a Samsung CHG70, an Acer XV273K and a couple of larger monitors/TVs. Apple displays themselves will typically have EDR, which is activated automatically and is available within the Affinity apps when working with 32-bit format documents. Displays include the 2019 MacBook Pro, 2020 iMac (I think?), Pro Display XDR etc. At one point the third party LG 5K monitors supported it in a public beta, but I'm not sure if that's still the case.

Any modern display with genuine HDR support should be fine. Be aware of lower end models that only have some kind of "HDR picture processing": they need to actually accept an HDR10 signal. If in doubt, look for VESA certified models, e.g. HDR400, HDR600, HDR1000 etc. Looking at the models you've listed, as long as they actually accept an HDR signal they should theoretically work with macOS's HDR compositing.

The bigger issue is making sure you have the right connectivity and OS updates. I noticed you're on Mojave; I believe you need Catalina as a minimum. The RX 580 is Polaris architecture, so you should be OK there. Use an up-to-date DisplayPort cable (1.4) or HDMI cable (2.0 or higher). Try to use DisplayPort, since HDMI 2.0 has various "a" and "b" permutations which affect the maximum chroma sampling and refresh rate you can achieve at 4K HDR. I don't have the figures to hand at the moment, but I think the RX 580 might only be HDMI 2.0, so just stick with DisplayPort to be safe. You're on a Mac Pro with a discrete GPU, so you don't need to mess around with dongles (good news there!). There are USB-C DisplayPort dongles that support 4K60 HDR, and Apple's HDMI adapter will do 4K HDR, but I'm not sure about chroma sampling limitations there.

Assuming all your hardware supports it, and you've got Catalina or Big Sur installed, you should have an HDR toggle in your display preferences. Then, in your Affinity app (typically Photo or Designer), make sure your document is in 32-bit and open the 32-bit Preview panel (View>Studio). Enable EDR and values above 1.0 will be mapped to the extended dynamic range. And that should be it! Hope that was helpful.
  6. Hey @jomaro, hopefully this should give you enough info to work with. It looks like you have an OCIO configuration set up, and your EXR file is appended with "acescg", so Photo will convert from ACEScg to scene linear because it's found a matching profile. It's just the way 32-bit linear in Photo works: everything is converted to scene linear, then the colour space is defined either by a traditional display profile conversion (ICC) or via OCIO. You choose which using the 32-bit Preview panel (View>Studio>32-bit Preview). Bear in mind that if you plan to export from Affinity Photo to a gamma-encoded format (16-bit/8-bit JPEG, TIFF etc), you need to preview using ICC Display Transform.

So, just to clarify: the document's colour profile (e.g. sRGB, ROMM RGB etc) is arbitrary and only applied when using ICC Display Transform, in which case the scene linear values are converted and bounded to that profile. You would need to install a given ICC profile for it to show up. If you choose Unmanaged, you'll be looking at the linear values. OCIO Display Transform will convert the scene linear values according to the view and device transforms selected in the combo boxes.

However, I suspect you probably want to be managing with OpenColorIO. It looks like you've already managed to set it up, but here's a reference video just in case. Within the 32-bit Preview panel, you will want to choose OCIO Display Transform (it will be enabled by default with a valid OCIO configuration). Then you set your view transform on the left and device transform on the right. The OCIO adjustment layer is for moving between colour spaces within the same document; you might want to do this for compositing layers with different source colour spaces, for example. You can also bake in colour space primaries if you wish (typically going from scene linear to whichever colour space you require).

Yes, Photo can convert to profiles on import/export by appending the file name with a given colour space. For example, if you imported a file named "render aces.exr", it would convert from the ACES colour space to scene linear. Similarly, if you append "aces" to your file name when you export back to EXR, it will convert the document primaries from scene linear back to ACES. Hopefully the above all helps; let me know if you have any further questions!
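To illustrate the file-naming convention, here's a small Python sketch of the idea (the token list and function name are hypothetical; in practice the recognised names come from your OCIO configuration):

    def colour_space_from_filename(filename, known_spaces=("acescg", "aces", "srgb")):
        """Mimic the 'append the colour space to the file name' convention:
        'render acescg.exr' -> 'acescg', which would then be converted
        to scene linear on import. Token list is illustrative only."""
        stem = filename.lower().rsplit(".", 1)[0]
        # Check longer tokens first so 'acescg' wins over 'aces'
        for space in sorted(known_spaces, key=len, reverse=True):
            if stem.endswith(" " + space) or stem == space:
                return space
        return None

    print(colour_space_from_filename("render acescg.exr"))  # 'acescg'
    print(colour_space_from_filename("render aces.exr"))    # 'aces'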
  7. Hi again @nater973, that's interesting regarding the monitor difference. I have a 4K monitor which runs nearer 5K (4608x2592) before being downsampled, and I saw no issues. I wonder if your BenQ has some artificial sharpening applied, or whether it's some kind of retina scaling issue within Photo on Windows. I'll try to hook my 4K display up to my Windows box later and see if I can reproduce it. Just something to try if you have time: in Edit>Preferences>Performance you'll see a Retina Rendering option. Does the grid pattern change or disappear if you set this to Low or High quality rather than Auto? If it prompts you to restart, you can instead just go to View>New View for the setting to take effect.

If that's the noise level from stacking, what did your original exposures look like? There's still a lot of chromatic noise in your final image which really should have been cancelled out to some degree by stacking; are you sure the image stacked successfully? PS: if you don't need star alignment (I'm guessing you don't, as it's a wide field shot), you could always try pre-processing the NEF files to 16-bit TIFFs and then stacking them in Photo using File>New Stack (don't stack raws, as you get no tonal control). With a Mean operator (the default is Median) you might be surprised at the noise reduction you can achieve. Possibly worth a try anyway! Thanks again, James
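If you're curious why Mean stacking works so well, here's a back-of-envelope numpy simulation (not Photo's implementation): averaging N aligned frames cuts random noise by roughly the square root of N, while Median is better at rejecting one-off outliers such as satellite trails.

    import numpy as np

    rng = np.random.default_rng(42)

    # Simulate 16 aligned exposures: the same scene plus fresh sensor noise per frame
    scene = rng.uniform(0.0, 1.0, size=(512, 512))
    frames = scene + rng.normal(0.0, 0.05, size=(16, 512, 512))

    mean_stack = frames.mean(axis=0)          # Mean operator: plain average
    median_stack = np.median(frames, axis=0)  # Median operator: outlier-resistant

    print((frames[0] - scene).std())   # ~0.05, noise in a single frame
    print((mean_stack - scene).std())  # ~0.0125, i.e. sqrt(16) = 4x lower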
  8. Hi @nater973, there are a couple of things to dissect here (I'm internal staff, so the samples have been shared with me, hopefully that's OK?).

Firstly, are you seeing the grid pattern in a particular image viewer, or perhaps when uploaded? I've checked your output JPEG in Photo and macOS Preview and cannot see any grid artefacting. I've also tried resampling the TIFF you provided using different resampling methods (Bilinear for the softest resampling, for example) and cannot see any particular grid patterning. What's in your layer stack when editing? Are you using any live filters, for example? Perhaps try flattening your document before using Document>Resize Document to see if the results differ?

Whilst I can't see any grid patterning, I can see shimmering when moving between different zoom levels. There is a lot of high frequency content in your document as a result of the noisy sky. Have you applied some additional sharpening as well? I would propose at the very least removing all the colour noise. This may help with your grid pattern issue, since it should hide obvious Bayer pattern noise, but it will also help greatly with compression efficiency when you come to export. You can de-noise destructively (Filters>Noise>Denoise) or non-destructively (Layer>New Live Filter Layer>Noise>Denoise). Whichever method you use, just bring the Colour slider up to about 15% and you should find it gets rid of the colour noise without sacrificing your star colour information.

For export, I would also recommend a small amount of luminance de-noising as well (before resampling your document, if you choose to do so). This will take the "edge" off all the high frequency detail and make it more compressible. If you want to go further, some kind of low pass may help too (e.g. a small amount of gaussian blur). Just by de-noising, I managed to halve the file size of the eventual export at full resolution (JPEG, at 90 quality): it went from 31MB to 15MB.

Hope the above helps. If the issue persists, could you perhaps capture a screen grab of however you are viewing the image when the grid pattern is visible? Finally, to give this post some interesting visuals, here's a comparison of the image before/after de-noising in the frequency domain! You can see this by using the Scope panel (View>Studio>Scope) and selecting Power Spectral Density. It's useful for finding out the frequency "composition" of your image (and potentially how noisy it is).
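Photo's Scope panel computes the power spectral density for you, but if you want to poke at the same idea outside the app, a minimal numpy sketch looks like this (illustrative only):

    import numpy as np

    def power_spectral_density(gray):
        """Squared magnitude of the 2D FFT, shifted so low frequencies
        sit at the centre; log-scaled for easier viewing."""
        spectrum = np.fft.fftshift(np.fft.fft2(gray))
        return np.log1p(np.abs(spectrum) ** 2)

    # Noise spreads energy across the high frequencies (the outer regions
    # of the plot); after de-noising, energy concentrates near the centre.
    noisy = np.random.default_rng(0).normal(0.5, 0.1, (256, 256))
    print(power_spectral_density(noisy).shape)  # (256, 256)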
  9. Hey all, recently I've been re-recording some old videos and producing some new ones too. They've been going up on YouTube, but I've now updated the original post with the new links. Here are all the new/re-uploaded videos:
  • New document with templates
  • Tool cycling
  • Keyboard and Mouse Brush Modifier
  • Stacking: Object removal
  • Panoramas
  • Sky replacement
  • Bitmap pattern fills
  • Selecting sampled colours
  • Selection Brush Tool
  • Freehand Selection Tool: Freehand, Polygonal and Magnetic modes
  • Procedural Texture: Nonlinear Transform Correction
  • Editing metadata
  • Retouching scanned line drawings
  • Applying Blender Filmic looks
  • Compositing 3ds Max and V-Ray render passes
As always, hope you find them useful!
  10. You can create an intensity brush from a mask layer; bear in mind, however, that it will use the document bounds for the brush unless you make an active selection of the mask contents first (CMD-click). I've queried this behaviour, as I can't think of any scenario where you would want the entire document bounds used for the brush contents.
  11. Hey @Roland Rick, it's worth asking whether you are applying tone mapping following on from an HDR merge, or just applying it to single exposure images? A resultant HDR merge will typically be much cleaner (as it equalises then merges exposures based on the most detailed pixel information); I'd be surprised if you were getting very noisy results from a bracketed set of images. If you're tone mapping single images, it would make more sense that local contrast especially would bring out noise. If that's the case, have you tried the Clarity filter instead? It's not quite the same effect, but it might suit your requirements. You can use the destructive version (Filters>Sharpen>Clarity), which will let you apply it quite aggressively, or the non-destructive version (Layer>New Live Filter Layer>Sharpen>Clarity), which applies it dynamically at the expense of overall strength (the effect will be less aggressive). You may notice the HDR Merge dialog has an option for noise reduction: this is typically recommended when merging RAW images, so ensure it's enabled if that's what you are doing. With JPEGs, however, there's typically some in-camera noise reduction already applied, so I would avoid applying more (although every camera is different, so always try for yourself!). Hope that helps.
  12. Apologies if I've misunderstood this, but couldn't you just use 32-bit format for your document? 32-bit in all the apps uses linear compositing, but has a gamma-corrected view transform applied non-destructively so the document view looks consistent with the exported result (as you would typically export to a non-linear format with gamma encoded values). You can toggle this behaviour on the 32-bit Preview panel if you need the linear unmanaged view (or an OpenColorIO transform), but it defaults to ICC Display Transform, which is the exact workflow you describe: all colour blending operations happen in linear space, with a final view transform to non-linear space.

If you wanted linear compositing within a non-linear RGB document format (8-bit or 16-bit), you could approximate this with two live Procedural Texture layers: one underneath the layers you wish to blend in linear space, and one above. In the one beneath, you would create an equation field for each channel and raise the channel value to the power 2.2 (e.g. R^2.2); in the one above, you would raise to the reciprocal power, so R^0.454 etc. Just an approximation, but perhaps close enough for your needs?

I've attached some screen grabs: I exported two 16x16 images with a black triangle, one composited in 32-bit and one in 16-bit. To visualise them, I loaded them back into Photo and screen grabbed the resulting document views. Finally, there is a 16-bit non-linear sRGB document using the live Procedural Texture technique mentioned above. Hope that helps!
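For anyone who wants to sanity-check the maths behind that Procedural Texture trick, here's a tiny numpy sketch of the same round trip (an approximation using a pure 2.2 power function, as in the post; real sRGB encoding has a small linear segment):

    import numpy as np

    GAMMA = 2.2

    def blend_in_linear(bottom, top, alpha):
        """Approximate linear-light blending inside a gamma-encoded document:
        decode with x^2.2 (the lower Procedural Texture layer), blend,
        then re-encode with x^(1/2.2), roughly x^0.454 (the upper layer)."""
        linear = (bottom ** GAMMA) * (1 - alpha) + (top ** GAMMA) * alpha
        return linear ** (1.0 / GAMMA)

    # 50% black over white: blending in linear light gives ~0.73,
    # noticeably lighter than the 0.5 you get blending encoded values.
    print(blend_in_linear(np.array([1.0]), np.array([0.0]), 0.5))  # ~0.73
    print(1.0 * 0.5 + 0.0 * 0.5)                                   # 0.5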
  13. Well, I've waited almost nine years for an upgrade, so I finally pulled the trigger: swapped everything out this afternoon and did a fresh install. I was sceptical about how much faster a new CPU and RAM combo would be over my old setup, but... i9 10850K at stock speeds, 64GB RAM at 3200MHz, 2070 Super. I might push for an overclock and see if that raster GPU score goes any higher. I'm also surprised that the integrated Intel 630 graphics beefs up the multi-GPU score significantly; really wasn't expecting that. I wonder if anyone's gotten hold of an Nvidia 3080 yet?
  14. Sooo... overclocked my 3930K CPU to 4.6GHz (from 3.2GHz) and my 2070 Super score has shot up considerably. Talk about CPU bound! Here's 3.2GHz, followed by 4.6GHz:
  15. MacBook (8-core i9, 64GB RAM, 5500M 8GB & Intel UHD 640): Windows (i7 3930K 3.2GHz, 32GB RAM, GeForce 2070 SUPER): Not too bad given the age of the CPU (starting to feel like it's bottlenecking the GPU). I see @jc4d has almost double the GPU raster score when paired with a beefy CPU! I did previously run a benchmark where my score was lower, at 3743, as well.
  16. Hi @stephen2020, Windows Photos is not colour managed so you may notice slight differences for sRGB colour space images—the difference you're seeing is more dramatic though. I noticed on your export settings dialog that the ICC Profile option is empty, which is strange as it should default to "Use document profile". Could you perhaps try and set this option then re-export and check again? Failing that, also try setting it to "sRGB IEC61966-2.1" and exporting. Also, you may want to try previewing your images in a colour managed application—I'm not up to date with Windows solutions nowadays but you may want to try Irfanview (which I believe has optional colour management you can enable) or perhaps a web browser like Firefox which should colour manage with profile-tagged images by default. Hope that helps!
  17. Hi @DQ_C, just chipping in as I'm a big 3D enthusiast and I do a lot of supplemental work around retouching workflows with Photo. Having native multichannel EXR support in Photo is really useful, but I agree that proper Cryptomatte support would be great too.

ArionFX sounds interesting. I always find working with tones more difficult in 32-bit because of linear compositing. I developed a couple of techniques to help with this, such as using live Procedural Texture filters to non-destructively transform to non-linear, do some tonal work (e.g. brush work with blend modes, adjustment layers), then transform back to linear. The other thing I did was develop a set of macros to recreate the Filmic transforms without any OCIO dependencies, so you can non-destructively apply the log transform and whichever look you want. Because the Filmic transforms take the document colour values into bounded space, you can then use all blend modes and adjustment layers without worrying about creating negative values. Glow and Reflect in particular do not play nicely with unbounded values.

Hopefully that helps a lot of Blender artists, because Photo's OCIO view transform (32-bit Preview panel) causes a lot of confusion: it's not designed for exporting straight from Photo to gamma corrected formats. For that, you need to be using ICC Display Transform with OCIO adjustment layers (and LUTs) to apply the transforms. Blender's OCIO configuration causes an issue with Photo because it specifies "None" as a device transform, which Photo defaults to 😑 so instead of at least seeing a gamma corrected sRGB transform, you just see linear unmanaged values when opening an EXR with that configuration active. Many users will set up OCIO, never realise this (or that there's a 32-bit Preview panel to configure it), and then get a nasty surprise when they finally export their work to 8-bit or 16-bit formats.

Anyway, I tend to use those Filmic macros for pretty much any tone mapping now (3D or just general HDR photography). Sometimes I'll just do the Log transform to compress the tonal range and then use adjustment layers until I get a decent look. Bloom/diffuse glow I tend to achieve a couple of different ways: if I have a Bloom layer (e.g. as a pass from Eevee), I'll composite that above the composite render layer with an Add blend mode, then clip an Exposure adjustment into it and tweak until I get the right look (see the sketch below). Alternatively, I'll use a live Gaussian Blur filter with an Overlay blend mode, then take advantage of Blend Ranges (parametric tonal blending, via the cog icon next to the blend mode dropdown) to blend the effect out of the shadow tones so it doesn't overpower and darken the entire scene.

That despeckling feature from ArionFX looks really useful, actually. Do you know if it tackles path tracing noise? I do a lot of volumetric lighting in Blender, and whilst the composite render pass is denoised, I'll sometimes use the render passes to enhance areas (e.g. transparency passes for foliage, or the volume passes to add atmosphere/mist), and they're typically very noisy unless you render with a huge number of samples. Sometimes that's nice, as it lends a bit of texture to the scene; sometimes not! Anyway, sorry for the ramble, just wanted to share a few ideas. I'm always keen to learn how 3D artists are using Photo in their workflows, and I'm certainly happy to help address any issues or suggest techniques.
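The Add-blend-plus-clipped-Exposure trick boils down to very simple maths. Here's a hedged numpy sketch of the compositing step (illustrative only, not Photo's internals):

    import numpy as np

    def add_bloom(base, bloom_pass, exposure_ev=0.0):
        """Add blend mode with an Exposure adjustment clipped to the
        bloom layer: the adjustment scales the pass in stops before
        it is summed onto the composite."""
        return base + bloom_pass * (2.0 ** exposure_ev)

    base = np.full((2, 2, 3), 0.2)
    bloom = np.full((2, 2, 3), 0.4)
    print(add_bloom(base, bloom, -1.0)[0, 0])  # bloom pulled down one stop: [0.4 0.4 0.4]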
P.S. I notice you mentioned premultiplied alpha. Just in case you hadn't seen them, there are a couple of things to help with this: Preferences>Colour has an option to associate alpha channels with the RGB information, which is premultiplication, and the Filters>Colours menu has options to multiply and divide alpha as well.
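Those two menu entries correspond to the standard premultiply/unpremultiply operations. Here's a short numpy sketch for anyone unfamiliar with them (the helper names are my own, purely illustrative):

    import numpy as np

    def multiply_by_alpha(rgba):
        """Straight (unassociated) alpha -> premultiplied: RGB * A."""
        out = rgba.copy()
        out[..., :3] *= rgba[..., 3:4]
        return out

    def divide_by_alpha(rgba):
        """Premultiplied (associated) alpha -> straight: RGB / A,
        leaving fully transparent pixels at zero."""
        rgb, alpha = rgba[..., :3], rgba[..., 3:4]
        out = rgba.copy()
        out[..., :3] = np.divide(rgb, alpha, out=np.zeros_like(rgb), where=alpha > 0)
        return out

    straight = np.array([[[0.8, 0.4, 0.2, 0.5]]])
    pre = multiply_by_alpha(straight)
    print(pre)                   # [[[0.4 0.2 0.1 0.5]]]
    print(divide_by_alpha(pre))  # round-trips back to the straight values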
  18. If your Eizo monitor is 2560x1440 (or 2560x1600 for 16:10), then you'll be in an even better position performance-wise! Probably a wise choice given the ARM MacBooks. I have an almost maxed-out 16" MacBook for work; it performs well, but the thermals are terrible, and the fans are almost constantly active if I'm working in Photo (or anything else, for that matter). I've had previous MacBooks with the older Polaris GPU architecture (and CPUs with fewer cores!) and they were much quieter. Cannot wait for the ARM models.

I believe compute performance will increase going from the 5500XT to the 5700 to the 5700XT, since each model has a greater number of compute units and stream processors. Whether the increase is meaningful is another matter: a 5500XT will likely be sufficient for most workflows, since even older mobile Polaris GPUs (e.g. the 455) could easily hit over 100fps when compositing, say, a live motion blur filter. It depends on your screen resolution, typical document bit depth and your workflow, however. If you work with lots of high resolution layers, you may want to opt for a more powerful GPU, especially one with more VRAM. If you just develop RAW files, add a few adjustments and filters then call it a day, you would be fine with any modern baseline GPU, even integrated solutions.

It's worth noting that not every operation is hardware accelerated yet: vector layers like a Curve path or quick shape, for example. If you use these in conjunction with complex live filters, you'll bottleneck the render pipeline with CPU operations, losing the advantage of compositing purely on the GPU. It will still be faster than software rendering, just not dramatically so. In this sense you can't really "brute force" performance with a faster GPU, so don't be too concerned about spending large amounts on a top of the line model.
  19. Hi @A_B_C, for a modern Mac system with a reasonably powerful GPU that has sufficient VRAM, you should absolutely use Metal Compute (hardware acceleration) where possible. The main hurdle to overcome is graphics memory. macOS will use a certain amount of VRAM simply to drive the display; the amount depends on your screen resolution. Previous 15" MacBook Pro models would use a resolution of 3360x2100, I believe (1680x1050 logical resolution), but the 16" models use a slightly higher resolution. Be aware of this memory usage if you plan to use an external display at 4K or 5K, and even more so if you run an external display whilst keeping the lid open to use the internal display as well.

In addition, the Affinity apps will demand graphics memory for the Metal view context. With Metal Compute, this memory requirement increases because the GPU is also being used for compositing, and this is typically where you may run into slowdown if you max out the VRAM: you'll then be eating into swap memory, which incurs a performance penalty. Under everyday editing circumstances you may never run into any memory issues; if your editing experience suddenly becomes very slow and laggy, however, you may have hit the limitations of your graphics card. I know you can configure the 5500M/5600M models with 8GB, but am I right in thinking the 5300M can only have 4GB? In that case you will simply have to work in your usual manner and try for yourself!

As an example, I do a lot of work with 32-bit OpenEXR multichannel documents: I typically have about 20-30 layers in a document that averages 5000-6000px in width (and usually similar in height). On top of all that, I run a 5K monitor. This combination is a little extreme, but I can edit comfortably with 8GB VRAM most of the time. I previously had a MacBook with a 4GB GPU, however, and found it wasn't quite sufficient; I ended up borrowing an external GPU enclosure with a card that had 8GB VRAM. For most editing, including compositing in 16-bit with multiple layers, I think you will typically be fine with 4GB. Again, though, this is also dependent on your screen resolution and document resolution.

The advice about switching to OpenGL is typically just a troubleshooting step, but it has more meaning for Photo, since you have to disable hardware acceleration to use it, thereby alleviating the significant GPU memory requirement and potentially speeding up older systems where the GPU's capabilities would not be meaningful anyway. Hope that helps!
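To put some rough numbers on the VRAM pressure, here's a back-of-envelope Python sketch (my own arithmetic only; the apps tile and manage memory far more cleverly than a flat buffer):

    def buffer_mib(width, height, channels=4, bytes_per_channel=4):
        """Uncompressed size of one full-resolution buffer in MiB."""
        return width * height * channels * bytes_per_channel / 2**20

    # An 8-bit RGBA framebuffer for a 5K display:
    print(buffer_mib(5120, 2880, bytes_per_channel=1))  # ~56 MiB before anything is edited

    # A single 6000x6000 layer in 32-bit float RGBA:
    print(buffer_mib(6000, 6000))  # ~549 MiB; a handful of these strains a 4GB card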
  20. @Luke_LC3D I'm guessing you are "placing" the ID pass EXR into the colour pass document (File>Place or drag-dropping)? In this case it becomes an embedded document, and the Flood Select Tool won't work on it. This behaviour is actually unique to the Windows version; on Mac, it's placed as an Image layer instead (so Flood Select works as expected). Previously, Flood Select wouldn't work on Image layers either (meaning you had to rasterise them to Pixel layers), but this was fixed. I'll enquire as to which behaviour is correct. In the meantime, there are a couple of ways to solve this:
  • Right click your ID pass layer and choose Rasterise to convert it to a pixel layer (you can also use the top menu, Layer>Rasterise).
  • Open your ID pass as a separate document, then copy and paste the RGB layer into your colour pass document.
Hope that helps!
  21. @kyle-io apologies for bumping an old thread, but I found it and realised that this issue has long been solved. Firstly, I've done a tutorial on the main Photo channel which covers how to apply any Filmic look you want using the OpenColorIO configuration. Second (and this may appeal more), I set about producing some macros which let you apply the Filmic looks with no OpenColorIO dependencies, so you can add them to a fresh installation of Photo and get the looks instantly. Here's a video showing how to use them, which also gives you some workflow tips. You can grab the Filmic macros from my website resources page here: http://www.jamesritson.co.uk/resources.html Hope that helps! PS: the above makes it redundant now, but the issue with your screen grabs is that you're transforming from Linear to Filmic sRGB. You want Linear to Filmic Log, then the look LUT, then finally the sRGB device transform to encode the values correctly. I've edited my original post to update the instructions!
  22. Try @kirkt's suggestion of rasterise and trim. I would presume you opened an image (diagram/plan?) then cropped it to a square ratio? In which case you have pixel information outside the current canvas bounds—usually you would see the rest of the image, but in this case it appears to be solid white that was perhaps cropped away.
  23. Hi @Gru, I'm not familiar with the end step ("make the shadows"?). Do you mean the content aware replacement once you have your stars selected? I usually do:
  • Select>Tonal Range>Select Highlights
  • Select>Grow/Shrink (somewhere between 6-12px)
  • Select>Smooth (optional, depending on the image)
  • Edit>Inpaint
The last step will inpaint all the selected stars out and typically replace them with dark sky detail (there's a rough sketch of the selection logic below). Other things you may want to try:
  • Duplicate your pixel layer (e.g. Background), go to Filters>Sharpen>Clarity and apply a negative value, then change the layer opacity to alter the strength of the star removal.
  • Select>Select Sampled Colour: useful for selecting dimmer stars, especially if they have colour fringing (e.g. red/purple). Similar procedure to the inpainting technique mentioned above.
  • Layer>New Live Filter Layer>Blur>Minimum Blur, or alternatively use the destructive Filter menu version on a duplicated pixel layer. Mask away from the important areas like your deep sky objects to ensure you don't remove too much information.
Hope that helps!
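For the curious, the Select Highlights > Grow > Smooth sequence maps onto familiar image processing operations. Here's an offline analogue in Python with scipy (the threshold values and helper name are my own assumptions):

    import numpy as np
    from scipy import ndimage

    def star_selection(luminance, threshold=0.8, grow_px=8, smooth_sigma=2.0):
        """Threshold the bright stars (Select Highlights), dilate the
        selection outward (Grow), then feather its edge (Smooth)."""
        mask = luminance > threshold
        mask = ndimage.binary_dilation(mask, iterations=grow_px)
        return ndimage.gaussian_filter(mask.astype(float), smooth_sigma)

    sky = np.random.default_rng(1).uniform(0.0, 0.3, (128, 128))
    sky[64, 64] = 1.0                # a lone star
    selection = star_selection(sky)
    print((selection > 0.5).sum())   # a feathered blob around the star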
  24. Hi @srg, your Pentax camera is actually exhibiting quite a few hot pixels, most of which are automatically remapped by Affinity Photo. I'm guessing it's the 30 second exposure which is causing the sensor to heat up and exhibit these hot pixels. The offending area near the middle of the image doesn't get remapped, however, because it's actually two adjacent hot pixels of differing intensities. Because of this, the pixel arrangement could potentially be genuinely useful information within the image, so Photo doesn't remap it. Hopefully this issue wouldn't show up with most images, just those shot with a long exposure. Some cameras will remap pixels at the Bayer level as they're written to the memory card (the Sony E-series cameras do this, I believe), and some don't. Sony cameras get a bit of flak for this from astrophotographers; they've been nicknamed "star eaters"! Unfortunately, I think the only course of action for you is to quickly inpaint out the hot pixels once you've developed the image. Alternatively, you can use the Blemish Removal Tool in the Develop Persona and just single-click over the hot pixel area. Hope that helps!
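As an aside, automatic hot pixel remapping is conceptually a local outlier test. Here's an illustrative Python/scipy sketch (my own simplification, not Photo's actual logic); a real remapper has to be far more conservative about adjacent pixels of differing intensities, since, as above, those can be genuine detail:

    import numpy as np
    from scipy import ndimage

    def remap_hot_pixels(img, sigma_thresh=6.0):
        """Replace pixels that deviate wildly from their local median.
        Simplified illustration: a production remapper must be much
        more careful around clusters that might be real image detail."""
        local_median = ndimage.median_filter(img, size=3)
        deviation = img - local_median
        hot = deviation > sigma_thresh * deviation.std()
        out = img.copy()
        out[hot] = local_median[hot]
        return out

    frame = np.random.default_rng(2).normal(0.1, 0.01, (64, 64))
    frame[32, 32] = 1.0                     # a single hot pixel
    print(remap_hot_pixels(frame)[32, 32])  # remapped back near 0.1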
  25. Hi @Dmi3ryd, the video Gareth posted would help with making selections from Material/Object ID render passes (or Cryptomatte in its bitmap form). However, it sounds like you just want to convert a pixel/image layer to a mask, is that correct? If so, copy/paste or place your alpha image into your document, then go to Layer>Rasterise to Mask. Alternatively, you can right click the layer thumbnail and Rasterise to Mask will be on that menu too.

Once the layer is a mask, you can mask other layers with it (drag-drop over the layer thumbnail, not the label text), and you can CMD-click (Mac) / Ctrl-click (Windows) the mask layer to create a selection from it. Finally, with the mask layer selected, you can also go to the bottom of the Channels panel, right click [Layer Name] Alpha and choose Create Spare Channel. This will create a channel from the mask which you can load into other masks or into your active selection.

PS: if you want to invert a mask, just select it and use Layer>Invert, or CMD+I (Mac) / Ctrl+I (Windows). One final note: if you need to multiply or divide alpha by the colour values, you need to flatten the mask into its parent pixel layer first. With a mask clipped to a layer, right click the parent layer and choose Rasterise. Then, on the Filters>Colours menu, you have Multiply by Alpha and Divide by Alpha. You can also do this non-destructively with a live Procedural Texture filter, but that's for another day 😉 Hope that helps!