marcdraco

Members
  • Content count

    40
  • Joined

  • Last visited

About marcdraco

  • Rank
    Member

  1. marcdraco

    RAW Engines

    Multiple sub-frames? Is that one of those newfangled cameras that shifts the sensor a hair for three exposures, one for each colour? (Not joking, I hear they can do that now.)
  2. This is from another thread - looking at the demosaicing feature in Raw. I noted that AP on Windows "only" has the Serif engine, which bears a striking resemblance to AMaZE (which would make sense, as it's so highly regarded). However, there are multiple other algorithms out there, including ones designed to deal with high-noise images like this one. I wonder if we could have some alternative engines built in, so we can take a crack at each and see what produces the best initial results off the sensor before we start fiddling?
  3. That would be a (short-term) fix that doesn't cost anything. I use Resolve Studio for my editing work (and even my VFX now, when I get time to learn that UI). Most of my current stuff doesn't need me to fire up a 16-core Xeon, and since electricity isn't free I'm sticking with an i3 laptop as a daily driver, which is fine for most Affinity Photo work. (More props to the developers for making this possible!)
  4. Does the free version of Resolve have that feature? It's been so long since I used it that I've forgotten and my daily driver isn't anywhere near powerful enough to run it.
  5. marcdraco

    Raw Engine

    I did some tinkering and tried a few different Raw engines on this image (this is extreme magnification), using RawTherapee so you can see the actual pattern as recorded by the camera's sensor. No noise reduction has been applied here - this really is exactly how the camera saw this section of a Microsoft mouse. The inset goes in even closer still, so you can see exactly the problem these converters have, and that's before they start applying noise reduction, colour correction, curves, sharpening … you name it. I have to say that I was very pleasantly surprised with Serif's converter, which performed as well as Aliasing Minimization and Zipper Elimination (AMaZE) did (to my eye), even on this fairly noisy image, which isn't where AMaZE is known to shine. I wonder if Serif can be persuaded to include alternative engines in future? RawTherapee suggests some algorithms for Sony cameras, and others that work well with high-ISO (noisy) images. Fascinating stuff. (See the bilinear demosaicing sketch after this list for a rough idea of what these engines are doing under the hood.)
  6. marcdraco

    Raw Engine

    I was just talking to a pal of mine (a pro photographer) about this. He likens it to a hammer: there's more than one hammer design (I've lost count, actually), and it's a case of finding the right one. On Windows I don't get a choice of which engine to use, but I can switch to open-source software like Darktable or RawTherapee to get other options.

    @firstdefence is spot on. It's a case of finding the right engine for (a) your camera and (b) the shots you take. Not every engine will work well for any given camera or image, because of this: https://en.wikipedia.org/wiki/Bayer_filter

    Perhaps the most important part of the process is demosaicing (https://en.wikipedia.org/wiki/Demosaicing), which converts the human-eye-biased sensor pattern into a proper RGB image. (See the Bayer sampling sketch after this list.) The Wikipedia article is quite easy to follow, and it will help you decide whether you want to go down that path or stick with what you have.

    Ultimately, a lot of this is personal choice, and as an artist you're the only person who can decide what looks right to you. Personally, I think Affinity Photo rocks and it's the best £50 I've spent on software. Period.
  7. I don't know if this has been suggested, but I use curves a fair bit, and particularly LAB for those really powerful wide-gamut adjustments. The current graph only has a four-by-four grid, which isn't anywhere near enough to lock in really fine changes. It would be really handy if it were possible to resize these graphs too. It's something I'm used to in other software, so I was expecting to see it here, and I assume that changing the grid is a five-minute job for an intern. Resizing is a different matter because that carries other problems … but pretty please, with knobs and bells on (and a cherry on top)? (See the curve sketch after this list for what finer control points buy you.)
  8. I'm not a Python programmer - it's just another technology that I don't have time to learn. (Remember the Woodcutter problem? No? It'd take me too long to explain. Repeat.) I prefer compiled Python simply because the interpreted version is a bit too slow. There's at least one project that uses it, in Blender IIRC, that makes a huge difference, but it's a couple of years since I used that. Damn, I'm getting old fast.
  9. @SrPx completely with you on what you said about Blender, and the UI (as I've said before) still looks like it was put together with all the charm and forethought of a sneeze. The post-2.5 releases (from memory) replaced the real rip-snorter with a slightly more suppressed atchoo. It's still awful, and because of the "not invented here" attitude of many Blenderheads (most of whom, unlike me, don't remember when Blender was an in-house project) it's rather hard to get a meaningful discussion on rectifying that. As you've said, it's a tool.

    I'm going to give LuxCoreRender a go soon - that combines the full compute power of the nVidia/AMD chips with the full core count of whatever CPU you happen to be running on. My 6 GB card just isn't quite big enough to render large/complex scenes, and even with 32 threads things are still slow. Combining the two might just give me the edge I need. I'll be interested to see what happens with Eevee and whether it proves to be something more than just a glorified GL renderer. (I know it's a trick, but it's a damn good trick.) In film it's largely a case of how long something is static on the screen - it's amazing what you can hide in a little bit of motion blur. God, I love 24 fps!

    Although I use Affinity for my everyday "memes" (they're infographics, actually), I have a toolbox with Krita, Inkscape (which is better than Designer right now, just not as pretty), MyPaint and even ArtRage for that really fine art look. It comes down to that "when all you have is a hammer" thing. I hope Serif don't "pull" tools from Photo, but for the sort of design work I do, which is mainly text flow and stroke-based shapes, it's plenty powerful enough. The only thing I needed Designer for (that Photo couldn't do) was text on a path. I know Designer does a lot more, but I just don't need those features. So between all the art software I've had to buy for support and features, I've only spent about £100, which is a deal in anyone's currency. Max/Maya are just out of my budget, not to mention the 1-2 years I'd have to spend cramming even more into my noggin - which is already well past its use-by date. My only misgiving with Blender is the weak particle system, but now that DaVinci Resolve has a professional VFX compositor built in with a decent (if sluggish) particle pipeline, I'm golden.

    But I'll say this again: Photo is bloody amazing, particularly at such a low price, and I can't see why anyone would want to be chained to Adobe these days. Ironically, some Photoshop people I've spoken to think it's too cheap! I guess that's what happens when you get used to paying through the nose: you get the idea that money = quality, and that's clearly not the case. Some people will never realise that, but that's their loss and our gain.
  10. Oh, I've not seen that before, @PedroOfOz. I've used a couple of others, like Darktable, which also does RAW of course. It's another one for the toolbox, so thanks! Oh wait, it's not actually FOSS - there's a small licence cost for commercial use. Still, not bad I guess.
  11. @annaisabel did you find that feature you wanted? I'm familiar with the technique in Photoshop, but being in LAB it's a little above my pay grade. However, if you hit the Curves adjustment and switch to LAB, there is a "picker" you can use, and you can operate on all three channels or any one of them. As I recall, there was a tutorial using a wedding dress that was ever-so-slightly off-white, and the guy used LAB to correct for that. Very clever stuff. (There's a rough sketch of that Lab cast-correction idea after this list.) Does this help? See how you can take a sample in LAB, RGB, CMYK, etc. and have multiple samples running? Pretty sweet.
  12. James Ritson has a rather jolly tutorial on this - although he didn't so much align the stars as take a sequence of high-ISO shots in fairly quick succession, so the earth's rotation didn't go and spoil everything. (See the stacking sketch after this list for why that works.)
  13. Of course, and as Woven has mentioned to me separately, this is just version 1 (beta), so Serif's team is playing catch-up. This may be a case of them not being able to see the wood for the trees and leaving this "feature" out because they weren't happy with the results. I gather that's why Designer doesn't have a Trace Bitmap facility (cf. Potrace in Inkscape). Quality control like this is rare these days as publishers rush to get software out there, and I don't think it's unfair to mention Adobe and Microsoft as two of the worst offenders in this regard. Props to Serif if that's the case, and either way I appreciate their willingness to share beta software so publicly so we can all have at it. It's a good business practice that others could learn from.
  14. I've grabbed it, Walt - I was just hanging on until V1.0 hit. However, Jens does have a very valid point which took me by surprise: there's no asset/resource/clip manager where you can drop (say) 100 photos into it and just pull and place them as you need them. I just assumed it would have one - I thought the iOS 12 assets would be a sample of clip art, but I was clearly mistaken. Now, it does have a placed-asset manager, which is fine, but I can't find a "bin" (as they're called in some of my video software). When I'm editing video I have hundreds (or more, ugh) of clips in a bin, or sets of bins, or nested bins... and I can grab and use them as needs be. Quite an oversight really - I'm pretty sure InDesign CS4 (which is what I was using two years ago, and it was old then) has this facility.
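
To illustrate what the raw engines discussed in post 5 actually do: a raw file stores a single value per photosite in a Bayer pattern, and the demosaicing engine has to interpolate the two missing colours at every pixel. Below is a minimal, textbook bilinear demosaic in Python (NumPy/SciPy). It is my own sketch for illustration only, not Serif's or RawTherapee's code; the function name bilinear_demosaic and the RGGB layout are assumptions, and AMaZE and the Serif engine are far more sophisticated than this.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(mosaic):
    """mosaic: 2-D array of raw sensor values laid out as an RGGB Bayer pattern."""
    h, w = mosaic.shape
    rows, cols = np.indices((h, w))

    # Where each colour was actually sampled in an RGGB layout.
    r_mask = (rows % 2 == 0) & (cols % 2 == 0)
    b_mask = (rows % 2 == 1) & (cols % 2 == 1)
    g_mask = ~(r_mask | b_mask)

    # Bilinear kernels: green sits on a checkerboard, red/blue on a sparser grid.
    k_green   = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_redblue = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    rgb = np.zeros((h, w, 3), dtype=float)
    for channel, mask, kernel in ((0, r_mask, k_redblue),
                                  (1, g_mask, k_green),
                                  (2, b_mask, k_redblue)):
        sampled = np.where(mask, mosaic, 0.0)          # keep only that colour's photosites
        rgb[..., channel] = convolve(sampled, kernel, mode='mirror')
    return rgb
```

Smarter engines differ mainly in how they estimate those missing values, which is why they render fine detail and noise so differently.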
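To make the Bayer filter from post 6 concrete: each photosite records only one colour, with green sampled twice as often because human vision is most sensitive to it. The snippet below simulates that sampling from an ordinary RGB image; again, this is my own illustrative sketch (the simulate_bayer name is made up), not anything from Affinity Photo.

```python
import numpy as np

def simulate_bayer(rgb):
    """rgb: H x W x 3 image; returns the single-channel mosaic an RGGB sensor would record."""
    h, w, _ = rgb.shape
    rows, cols = np.indices((h, w))
    mosaic = np.empty((h, w), dtype=rgb.dtype)

    red_sites   = (rows % 2 == 0) & (cols % 2 == 0)
    blue_sites  = (rows % 2 == 1) & (cols % 2 == 1)
    green_sites = (rows + cols) % 2 == 1          # half of all photosites are green

    mosaic[red_sites]   = rgb[..., 0][red_sites]
    mosaic[green_sites] = rgb[..., 1][green_sites]
    mosaic[blue_sites]  = rgb[..., 2][blue_sites]
    return mosaic
```

Feeding that mosaic back through a demosaic like the one above is essentially the round trip a raw converter performs, minus the sensor noise that makes the real job so hard.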
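On the curves request in post 7: a curve is just a mapping from input to output values, and a finer grid simply helps you place more control points accurately by eye. As a rough sketch of the idea (not how Affinity implements its Curves adjustment), here is a piecewise-linear curve applied to the L channel of a Lab image; the apply_l_curve helper and the 0-100 L range are my own assumptions.

```python
import numpy as np

def apply_l_curve(lab, points_in, points_out):
    """lab: H x W x 3 Lab image with L in 0-100; points_in/points_out: matching curve nodes."""
    out = lab.copy()
    # np.interp draws straight segments between nodes; more nodes = finer control,
    # which is what a denser on-screen grid helps you judge.
    out[..., 0] = np.interp(lab[..., 0], points_in, points_out)
    return out

# e.g. a gentle S-curve on lightness using five nodes:
# adjusted = apply_l_curve(lab, [0, 25, 50, 75, 100], [0, 20, 50, 80, 100])
```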
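For the wedding-dress trick mentioned in post 11, the idea is that a neutral tone should sit at a = b = 0 in Lab: sample the off-white area, see how far it drifts from zero, and pull the a and b channels back by that amount. The tutorial did this with curves, but a simple global shift shows the principle. This is a hypothetical helper of my own; neutralise_cast is not an Affinity function.

```python
import numpy as np

def neutralise_cast(lab, y0, y1, x0, x1):
    """lab: H x W x 3 Lab image; lab[y0:y1, x0:x1] is a patch that should be neutral."""
    patch = lab[y0:y1, x0:x1]
    a_shift = patch[..., 1].mean()   # how far the sample sits from neutral on the a axis
    b_shift = patch[..., 2].mean()   # ...and on the b axis
    out = lab.copy()
    out[..., 1] -= a_shift           # pull the cast back towards a = 0
    out[..., 2] -= b_shift           # ...and b = 0, leaving lightness (L) alone
    return out
```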
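Finally, the stacking approach from post 12 works because random sensor noise averages out: combine N aligned frames and the noise drops by roughly the square root of N while the signal stays put. A bare-bones sketch of my own, assuming the frames are already aligned (James Ritson's tutorial and Photo's astrophotography stacking handle the alignment for you):

```python
import numpy as np

def stack_frames(frames):
    """frames: list of H x W x 3 arrays of the same (aligned) scene."""
    stack = np.stack(frames, axis=0).astype(float)
    # The median also rejects outliers such as hot pixels or the odd aeroplane trail.
    return np.median(stack, axis=0)
```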