About sonofzell

  1. @Pšenda I owe you both an apology and immense gratitude. I completely misunderstood the suggestion you provided, and thus had originally dismissed it. I was completely unaware that existing images could be used to infer a LUT; I agree that this method is likely the easiest way to apply general color corrections inspired by the Dive+ algorithm! For anyone else who may have missed the brilliance of Pšenda's suggestion, I'll attempt to "dumb it down" to my own level of inexperience:
     1. To mimic the Dive+ output, I needed two source images: the original, unmodified photo and the color-corrected variant output by the Dive+ app (we'll refer to these as "V1" and "V2", respectively).
     2. With the unmodified image (V1) open in AP, I selected LUT from the adjustment panel. In the settings dialog, I clicked "Infer LUT", which prompted me to select the two sources (V1 & V2).
     3. [Presumably] using the differences between the two sources, AP builds a LUT that mimics the adjustments between them and applies it to the current image. The result is astoundingly similar to the desired output of the Dive+ app...
     4. I saved the new "custom" LUT as a preset, allowing me to apply the same correction to other photos with one click!
     5. The source photo above was taken in the Caribbean. Not surprisingly, the new LUT did not produce the same results when applied to photos from the soupy North Atlantic! To remedy this, I simply repeated the above steps using new source files from the murkier environment. Again, the LUT inference did an amazing job of replicating the Dive+ output.
     Obviously, there's some additional "tweaking" that could further improve the output, but this is by far the best result I've seen in replicating the Dive+ color correction on a desktop. Thank you! @donheff I've attached a few variants of the custom LUTs for you to play around with.
Dive+ Caribbean.look Dive+ Quarry.look
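Serif hasn't documented how "Infer LUT" works internally, so the following is only a conceptual sketch, in Python/NumPy, of how a before/after image pair (V1/V2) can be turned into a 3D LUT: every pixel of V1 votes for its nearest RGB grid node, and that node is replaced by the average V2 color of the pixels that voted for it. The function names and the 16-per-axis grid size are my own choices for illustration, not Affinity's.

```python
import numpy as np

def infer_lut(src, dst, size=16):
    """Estimate a 3D LUT (size x size x size RGB grid) mapping colors
    in `src` toward the corresponding colors in `dst`.
    src, dst: float RGB arrays in [0, 1] with shape (H, W, 3).
    Grid nodes that no source pixel maps to keep the identity color."""
    # Start from the identity LUT so unseen colors pass through unchanged.
    axis = np.linspace(0.0, 1.0, size)
    lut = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)

    # Nearest grid node for every source pixel, flattened to one index.
    idx = np.clip(np.rint(src * (size - 1)).astype(int), 0, size - 1)
    flat = idx[..., 0] * size * size + idx[..., 1] * size + idx[..., 2]

    # Accumulate the destination colors landing on each node, then average.
    sums = np.zeros((size ** 3, 3))
    counts = np.zeros(size ** 3)
    np.add.at(sums, flat.ravel(), dst.reshape(-1, 3))
    np.add.at(counts, flat.ravel(), 1)

    lut = lut.reshape(-1, 3)
    seen = counts > 0
    lut[seen] = sums[seen] / counts[seen, None]
    return lut.reshape(size, size, size, 3)

def apply_lut(img, lut):
    """Apply a 3D LUT with nearest-node lookup (no interpolation,
    unlike a real LUT engine, which would interpolate)."""
    size = lut.shape[0]
    idx = np.clip(np.rint(img * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```

This is also why step 5 above behaves the way it does: the inferred table only covers the colors actually present in the Caribbean source pair, so murky green North Atlantic colors fall on grid nodes the inference never touched.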
  2. I should clarify... I did not intend to imply that "File" links did not work, only that I haven't tested them (yet).
  3. Just updated to AP beta v1.7.0.305 earlier today and privately did a little "happy dance" at my desk... Text hyperlinks have arrived - time to rejoice! I've had limited time to play with the feature, but if you're like me and eager to check it out, grab the latest beta and test drive the "Text > Interactive" menu option. Anchors can be added to text within any layer, which can then be used as targets for hyperlinks. The complete offering of link targets (at least on my Mac version) includes:
     - Page
     - Anchor
     - URL
     - File
     - Email
     With the exception of "File", all of the above worked flawlessly in a PDF export. I'm not 100% clear on the intent of the "File" link, since logic dictates it would need to reference a specific directory structure that likely wouldn't be known at the time of publishing... The only application I can foresee at the moment would be publishing a "package" of sorts, containing multiple docs with links between them(?). Otherwise, I could only see the "File" target pointing to an online doc (which shouldn't differ from the "URL" option). Either way, I'm excited to poke around and find out! APhyperlinks.pdf
  4. Greetings everyone, Although this thread is dated, I'm throwing my hat in here, even if only to keep up with any new insights on this workflow. I myself have several .afdesign files that are layered 2D floor plans, and they are referenced almost daily! Similar to the original poster, I realize that "true" architectural drawings are outside Affinity's wheelhouse; however, I share the thought that (perhaps with just a few minor tweaks & features) Designer really can be useful for this application. Obviously, these 2D models are not going to cut it for any real architectural work, but for my day-to-day needs, they're a godsend... Corporate IT is showing malware on "computer Y" at "wallplate X"?... Found it. Bill from accounting forgot his office key... Found it. Contractors need a visual for where the new cubicles are being installed?... Here you go.
     To ensure proper scale, I simply used shape tools to "trace over" images of actual floor plans (or created scaled models in SketchUp/Sweet Home 3D and exported to SVG). While they may not be down-to-the-inch accurate, they're more than capable of providing a scalable reference to suit my needs. As @hildevert alluded to, there are some minor tweaks that could really add to this functionality. For example, modals/popovers for supplemental info, the ability to search text strings in layers, and connector elements would all be immensely useful for similar applications of Designer.
     For the record, I understand and fully appreciate that this is way outside the software's intended purpose. For that reason, these are certainly not features I'd bug the developers for. That being said, as the Affinity suite continues to mature & expand, I'd like to think they'd serve other purposes as well, so I'm simply offering my "+1" for this usage scenario and hopefully adding a bit of clarification. These docs may have started as a bit of a pet project, but they have since become an incredible asset, saving me countless hours over time. It's hard to put a price on answering "what wall plate is printer X using?" instantly - without having to crawl around under desks with a flashlight! Just my $.02. Cheers! 3624MKTfloorplan.afdesign
  5. YES!!!! That's the base effect I am trying to achieve - each color displaying accurately without the single-color (blue or green) saturation. I might be inclined to play with some additional tweaks, but your example is what I have been trying to achieve for weeks without any success! I'm so grateful to you for taking the time to make these edits... I suppose you know I'm going to ask for the specifics of each layer in your "Adjust" group... If I can reproduce the result you've shown here, I'd love to assemble a macro that could be applied to the hundreds of my underwater photos in need! Sincere thanks, K
  6. @DWright , thanks so much for the reply! I've played with each of the adjustments you mentioned (admittedly without any real tact or expertise) and was able to achieve similar results... While the image is definitely improved, it still seems to retain that blue-green cast. I've overlaid the app output on your example for comparison - what I'd love to achieve is the color contrast that appears in the overlay. The yellow fish tails, red rust, and blue water contrasting against the white sand are what I'm having no luck producing in Affinity. Thank you for the example and for taking the time to edit - I sincerely appreciate the help!!! K
  7. Greetings, I know what you're thinking - "this guy didn't search the forums before posting"... I actually have been researching extensively for several weeks now, including several similar posts in this forum, and I've concluded that either my situation is unique or I'm just more dense than your average user (these may not be mutually exclusive lol). So, as the title suggests, I am searching for guidance on color correction of underwater photos.
     I'll preface by stating I am NOT a photographer, so my knowledge of post-processing technique is frustratingly minimal. For that reason, I had previously written off my failures and just accepted really dismal diving photos. That changed when I was recently introduced to an iOS app called "Dive+". On a whim, I ran its color correction function on several shots I had on my phone, and I was blown away by the results! In more than one case, I actually "discovered" fish and corals in my photos that were previously invisible! The ease with which the shots were transformed inspired me to revisit touching up many of my underwater shots, but as you can imagine - iOS is certainly not the best or most efficient environment for it.
     Revisiting all the tutorials and articles on the topic, however, has brought me right back to where I began - slightly less crappy photos and a lot of swearing. The majority of methods described in tutorials (even those for Photoshop or other editors) just don't produce the same results for me as they do in the examples shown. For instance, the vast majority of underwater correction guides suggest starting with white balance. In the examples, the adjustment produces immediate color and contrast improvements, while my photos simply turn from "all blue" to "all green". Additionally, suggested techniques using Levels invariably begin with minimizing black levels and maximizing white... In all my photos, these positions are already selected (and moving them in opposite directions only worsens the output).
     While I'm certainly willing to tackle a mild-to-moderate learning curve, I have no aspirations of professional photography. What's nagging me is the ease with which this free phone app drastically improves my photos, and my inability to determine exactly HOW it corrects the images so I can replicate the process. Below is a random photo from my collection that I hope will better clarify what I'm attempting (and failing) to achieve:
     This is my original photograph:
     This image shows the output from the Dive+ iOS app:
     Applying "Auto" White Balance to the image in AP makes no discernible difference. When manually selecting a neutral area (the white dot indicates the area I selected) using the white balance "picker" as most tutorials suggest, a green overlay appears as shown below:
     This final image shows my sliders for the Levels adjustment layer. Note that the sliders for black/white levels are already at opposite extremes, negating the ability to adjust in the way most tutorials suggest. The sliders for master/red/blue/alpha all have identical positions.
     I've attached the original photo in case anyone is feeling gracious enough to play around with it. The edits described above are certainly not the only ones I've tried - I've been playing with pretty much every setting I can find in the Photo, Develop, and Tone Mapping personas. While I am able to make some minor improvements to my underwater shots, they still don't compare to the difference I get with one tap in the iOS app. Furthermore, the results are very inconsistent compared to the app. For instance, I have freshwater dive photos that have a green cast in place of the blue shown in this example. Results from the app on those photos are equally impressive, yet the only similarity in my manual edits is the lackluster result.
     I'd love to know what type of algorithm this app uses so that I can create a similar macro or workflow for editing in AP. Heck, I'd even gladly pay to add Dive+ to my workflow if it were available on the desktop, but it is iOS/Android only. If anyone can offer guidance on what I'm doing wrong, or direct me to any Affinity-specific tutorial resources, I would greatly appreciate it! Sincere thanks in advance for any advice you have! Best, Kirk
     ***NOTE: I have no affiliation with the Dive+ app, nor any software mentioned. It is not my intention to present Dive+ as an alternative or competitor to Affinity Photo; my impression is that the latter is a much more capable product in the hands of a knowledgeable user (which I am obviously not!).
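Dive+'s algorithm is proprietary, so nobody here can reproduce it exactly; but a common starting point in underwater color-correction write-ups is a gray-world white balance (scale each channel so its mean matches the overall mean) followed by a per-channel contrast stretch, which is roughly what dragging the black/white Levels sliders on each individual channel does. A minimal Python/NumPy sketch of that idea (function names and percentile choices are mine, purely for illustration, not the app's actual method):

```python
import numpy as np

def gray_world(img):
    """Gray-world white balance: assume the scene averages to gray,
    so scale each channel until its mean equals the overall mean.
    img: float RGB array in [0, 1], shape (H, W, 3)."""
    means = img.reshape(-1, 3).mean(axis=0)
    gray = means.mean()
    return np.clip(img * (gray / means), 0.0, 1.0)

def channel_stretch(img, lo=1.0, hi=99.0):
    """Per-channel percentile stretch - conceptually like moving the
    black/white Levels sliders on each color channel separately,
    which a single master-channel Levels move cannot do."""
    out = np.empty_like(img)
    for c in range(3):
        p_lo, p_hi = np.percentile(img[..., c], [lo, hi])
        out[..., c] = np.clip((img[..., c] - p_lo) / max(p_hi - p_lo, 1e-6), 0.0, 1.0)
    return out

def correct(img):
    """White balance first, then restore per-channel contrast."""
    return channel_stretch(gray_world(img))
```

The per-channel stretch addresses exactly the Levels symptom described above: on a blue-cast photo, the master sliders are already at the extremes because the blue channel spans the full range, while the red channel is squeezed into a narrow band and needs to be stretched on its own.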
  8. Apologies if this has already been referenced, but I was not able to find any relevant posts... As a user of both Designer and Photo (Win + Mac), I often find myself looking for identical "resources" in each respective program. I'm referring to custom palettes, styles, brushes, etc. I know the Affinity programs allow importing these resources application-wide OR per-document, but I can't help imagining how nice it would be to have one central resource repository that either program could recognize by default, pre-loading the resources usable in both programs - rather than jumping to app A > exporting > jumping to app B > importing, etc. If there's a current workaround that anyone is using to achieve such behavior, I'd love to hear your suggestions. Cheers! K