I've watched a bunch of the tutorial videos, but can't seem to find a way to do what I want.

Basically, I want to create a set of healing- / cloning-type adjustments on an adjustment layer, to preserve them independently of the pixels from which they were originally derived and to which they were originally applied. I want to do this so that, for example, if I fix facial blemishes on a portrait, or clean up its background, but then decide I want to change the underlying raw conversion (which I do in another program, and export as a 16-bit TIFF), I don't have to start over and redo my facial retouching. Ideally, I could have a 'zit fix adjustment layer', a 'background cloning adjustment layer', and just copy those adjustment layers and drop them on top of a new, differently-raw-converted TIFF.

I have to think others want similar capability, and that it probably exists, but I can't figure out how to get at it. Thanks!


23 hours ago, N'Awlins Contrarian said:

Basically, I want to create a set of healing- / cloning-type adjustments on an adjustment layer, to preserve them independently of the pixels from which they were originally derived and to which they were originally applied.

Healing, cloning, & the like are implemented as brush tools that interact directly with the pixels of layers, not as separate adjustment layers. There is no way to make them independent of the pixels of a layer or of their individual x & y pixel coordinates.

 

So for example, since it would be extremely unlikely that two portrait photos would have faces in exactly the same position in the photo frame, exactly the same facial blemishes, & exactly the same backgrounds, there is no independent adjustment layer that would produce the same or even similar results for both of them, much less for a larger number of photos.

All 3 1.10.8 & all 3 V2.4.1 Mac apps; 2020 iMac 27"; 3.8GHz i7, Radeon Pro 5700, 32GB RAM; macOS 10.15.7
Affinity Photo 1.10.8; Affinity Designer 1.10.8; & all 3 V2 apps for iPad; 6th Generation iPad 32 GB; Apple Pencil; iPadOS 15.7


I think what the OP means is that he wants the brushwork of the cloning/healing/inpainting tool on a separate layer. This can be done with the method @barninga describes. The main problem however remains: If you first do some retouching to a separate layer and then afterwards exchange the base image, your brushwork will immediately become visible because it will no longer match the underlying source layer (because you just altered the colors in e.g. an external RAW editor). You'd either have to throw a merged version of your retouched image into the external editor (thus losing the original pixel data) or do all the editing in AP and place some adjustment layers on top of the existing layer stack so they affect the whole composition.


I assumed when the OP said "on an adjustment layer," he meant that literally. @barninga describes something quite different from that, but regardless there is no way to make it independent of the pixels of the layer(s) of the current document or applicable to other documents, which I believe is what the OP wants.



Thanks for all your replies, and sorry if I was unclear on what I want, why, and how I think it should be able to work. Maybe two examples will illustrate the issue:

(1) My camera sits on a tripod, and other than changes in shutter speed, nothing changes as I take a series of photos of a sunrise or sunset scene as the sun rises or sets. There are some distracting foreground elements. I want to take the first image in the series, add an adjustment layer, on that layer clone out the distracting foreground elements, copy that adjustment layer, and paste it on top of each of the other photos in the series. The relationships among the source and destination pixels in each photo in the series should be the same, or very nearly so. How can I do this? If I cannot, is that not an omission that should be rectified?

Barninga / Stefano, and everyone, if I have several such TIFFs, can I stack them as layers in one file, simultaneously perform the same clone operations on all of them, then just export versions with only one of the source files visible?

(2) I take a portrait. I perform a raw conversion in other software, and export the result as a 16-bit TIFF. I want to use the healing tools to fix facial blemishes in Affinity Photo. If later I decide that I want an 11x14 inch print of the portrait instead of the 5x7 inch version I originally planned, it might be very desirable to apply stronger noise reduction during raw conversion. If my healing tool work is on an adjustment layer, I can just copy that layer and paste it over a new TIFF, the revised version from the raw converter. Again, the relationships among the original pixels and retouched pixels in each photo should be the same, or very nearly so. How can I do this? If I cannot, is that not an omission that should be rectified?

In this scenario, I don't see how what Barninga / Stefano suggested would help me--but maybe I'm missing it.

Straight up, the prospect of applying cloning and healing from a prior effort to a new file (either a new raw conversion of a prior photo, or a new photo of an essentially-identical subject) was the driving reason for my definitively concluding that I'd outgrown GIMP. Yes, better color-management facilities and support for higher bit-depths (in non-beta versions) also contributed. But this was my top hope / goal. Am I just out of luck?

Thanks!

 


5 hours ago, N'Awlins Contrarian said:

I want to take the first image in the series, add an adjustment layer, on that layer clone out the distracting foreground elements, copy that adjustment layer, and paste it on top of each of the other photos in the series.

 

Yes, you can do that.

 

Try watching the video below; you end up with a 'retouching layer' that you could copy and paste between your images. That should do it for you.

 

 

Windows PCs. Photo and Designer, latest non-beta versions.


@N'Awlins Contrarian I think part of the confusion may be because you are using "adjustment layer" in a generic sense while in Affinity Photo "adjustment layer" refers to one of the following specific kinds of layers that allow you to make non-destructive corrections and enhancements to documents or their individual layers:

 

LUT
Black and White
Brightness and Contrast
Channel Mixer
Color Balance
Curves
Exposure
Gradient Map
HSL
Invert
Levels
OCIO
Posterize
Recolor
Selective Color
Shadows/Highlights
Soft Proof
Split Toning
Threshold
Vibrance
White Balance

 

The built-in help topics Using adjustment layers and Applying adjustments (links are to the online US English versions of these topics) explain how they can be used. Note that unlike the cloning, healing, blemish removal, & other brush tools, none of them actually replaces the pixels in any layer they are applied to -- they non-destructively change the appearance of those pixels, but do not contain any pixels of their own that are independent of the pixels of the layers they affect.

 

So basically, to do what you want you must use one of the methods that replace or create pixels on pixel layers.



well, when i wrote my answer above, i missed the bit about replacing the original pixel layer with a new one from a different raw-to-image conversion. @kaffeeundsalz is right: the pixels created by the healing tools would be taken from the original image, and the healing could become strongly noticeable when you change the underlying image. my fault.

on the other hand, as @R C-R explained, an adjustment layer only contains effects, to be applied to the underlying pixels; it does not contain pixels at all. this means that one cannot use them for healing. you could achieve your goal by masking them so that the effect only applies to certain areas of the image below, but if you change the image, their interaction with the new image might not come out as expected.

@owenr's suggestion sounds interesting: if you record your healing work (on a pixel layer) as a macro, you can repeat exactly the same healing for another image: since the macro would not record the actual pixels, but only the healing tool strokes, playing it in a new image would take the pixels from the new image itself.

and (off topic), as concerns the gimp: i used it for at least ten years and i really loved it. i also customized it somewhat and recompiled it more than a few times. what drove me to AP was the non-destructive editing, and i never looked back after i switched. i still think that the gimp is a great tool: it is powerful and it is free (as in speech, not just as in beer).

take care,

stefano


Grr, nope; unfortunately the technique suggested in the tutorial video that Toltec embedded does not do what I want. Specifically, what I want is a layer (adjustment layer or whatever you want to call it) that captures the relative changes that healing and cloning make to an underlying pixel layer. While the suggested technique does produce non-destructive changes, those changes are not transferable as relative changes to, say, a different (externally done) raw conversion with different color or exposure. Instead, you get a new pixel layer that puts the actual pixels created by healing and cloning on top. Here's an illustration, showing respectively, by checking and unchecking layers, the original image, the healed image, and the same healing layer atop a different raw conversion with the color cranked up:

[Attached screenshot: Affinity--nope.jpg]

As you can see, it is merely the exact pixels generated by the healing that are on that layer, not the relative relationship of the healed pixels to the underlying pixels. If the technique had done what I want, then the third segment would have had the orange skin but with the same relative healing applied to the more normal skin color.

Back to the drawing board. Hopefully someone has another suggestion (I guess recording a macro being the only other thing to try, but that's a kluge). Thanks all.
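In pixel terms, what the screenshot shows can be sketched with a little numpy (a hypothetical illustration of the behavior, not Affinity's internals): the healing layer stores absolute colors sampled from the first development, so compositing it over a recolored base leaves mismatched patches.

```python
import numpy as np

# Hypothetical sketch: a healed pixel layer stores ABSOLUTE colors taken
# from the original development, so it cannot follow a re-developed base.
orig_base  = np.array([0.40, 0.40, 0.40])  # skin tone, first raw conversion (RGB)
heal_pixel = np.array([0.42, 0.42, 0.42])  # blemish painted over with nearby tones

# Re-develop the raw with the color 'cranked up' (much more red):
new_base = orig_base * np.array([1.5, 1.0, 1.0])

# Where the healing layer is opaque it simply wins, stamping the old,
# neutral colors on top of the new orange skin:
composited = heal_pixel
mismatch = np.abs(composited - new_base).max()  # visibly nonzero
```

A relative layer, by contrast, would need to store only `heal_pixel - orig_base` and add that difference to whatever base sits underneath.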

 


7 hours ago, N'Awlins Contrarian said:

Specifically, what I want is a layer (adjustment layer or whatever you want to call it) that captures the relative changes to an underlying pixel layer that are made by healing and cloning.

Relative to what? Cloning & healing are inherently based on the actual pixels of the original image. They are not relative to anything external to that specific image, other than in the trivial sense that the pixels are different in different images.



1 hour ago, R C-R said:

Relative to what? Cloning & healing are inherently based on the actual pixels of the original image. They are not relative to anything external to that specific image, other than in the trivial sense that the pixels are different in different images.

@R C-R i think @N'Awlins Contrarian means something like this:

when the healing tool is applied to copy the pixels from one area to another on a separate layer, what should get stored with that second layer are not the actual pixels, but the difference in hue and lightness between the pixels that are copied and the pixels that they cover ("heal").

at least, i think.

take care,

stefano
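The idea stefano describes could be sketched like this (a hypothetical numpy illustration, assuming float images in [0, 1]; no such layer type exists in Affinity Photo): record the per-pixel difference the healing produced rather than the healed pixels themselves, then replay it on a re-developed base.

```python
import numpy as np

def record_healing(original, healed):
    """Store the relative change the healing brush made, not the pixels."""
    return healed - original

def replay_healing(new_base, diff):
    """Re-apply the recorded change to a differently developed base."""
    return np.clip(new_base + diff, 0.0, 1.0)

original = np.array([[0.20, 0.60]])  # first raw conversion
healed   = np.array([[0.25, 0.55]])  # after the healing brush
diff = record_healing(original, healed)

new_base = original + 0.10               # same raw, developed with +0.1 exposure
result = replay_healing(new_base, diff)  # the healing follows the new colors
```

This handles a uniform exposure shift cleanly; whether a simple additive difference would still blend seamlessly after stronger, non-uniform color changes is another matter.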


7 minutes ago, barninga said:

@R C-R i think @N'Awlins Contrarian means something like this:

when the healing tool is applied to copy the pixels from one area to another on a separate layer, what should get stored with that second layer are not the actual pixels, but the difference in hue and lightness between the pixels that are copied and the pixels that they cover ("heal").

at least, i think.

Maybe so, but those differences are not going to be useful for other images because cloning & healing are selective processes that copy pixels from specific areas in an image to other specific areas in an image, not 'global' adjustments applied to the entire image.

 

It is extremely unlikely that either those specific source or destination areas will somehow match up with the specific areas in other images where cloning or healing is desired.



yes @R C-R you are right when referring to adjustment layers as we know them in AP (and in PS, and other photo retouching programs, i suppose). However, we could think about a special kind of adjustment layer, where the adjustment is not globally applied, but only locally, where painted by a healing tool, or some other kind of brush. A sort of mix between actual adjustment layers and regular pixel layers. it could implement several kinds of "relativeness" to the underlying layer - hue, luminosity, and so on.

i don't know whether it's possible from a programming point of view, or how tricky it could be.

take care,

stefano
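Such a layer could be sketched as a recorded difference plus a brush mask that confines it (a hypothetical numpy illustration of the design stefano suggests, not an existing Affinity Photo feature):

```python
import numpy as np

def apply_local_diff(base, diff, mask):
    """Apply a recorded relative correction only where the brush painted,
    like an adjustment layer with a built-in mask; assumes floats in [0, 1]."""
    return np.clip(base + diff * mask, 0.0, 1.0)

base = np.array([0.30, 0.30, 0.80])  # re-developed image
diff = np.array([0.10, 0.10, 0.10])  # relative change recorded from the brushwork
mask = np.array([1.00, 0.50, 0.00])  # fully brushed, feathered edge, untouched

result = apply_local_diff(base, diff, mask)
```

The mask is exactly what the healing strokes would record, and wherever it is zero the new base passes through unchanged.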


6 minutes ago, barninga said:

However, we could think about a special kind of adjustment layer, where the adjustment is not globally applied, but only locally, where painted by a healing tool, or some other kind of brush.

There is still the problem of trying to match the localities of the source(s) & destination(s) between or among different images. I can't see any way this could be done other than manually with a brush or brushlike tool, which I do not think is what @N'Awlins Contrarian wants.

 

Besides, I think there is already a method provided to apply cloning between different images, as explained in the Affinity Photo - Clone Sources video.



I think R C-R is assuming you are using different images.

But assuming you are using the same image but with a different RAW conversion (e.g. Contrast, Brightness, Colour etc) this could work.

Use the healing brush tool on a new layer to remove the spots/zits etc

If you then want to replace your base image with a differently converted RAW image: in the Develop persona, make a note of the settings you use (or simply create a preset), then go back into the Develop persona and develop the already-created healing brush tool layer using the same preset or settings.

To save time I am currently using an automated AI to reply to some posts on this forum. If any of "my" posts are wrong or appear to be total b*ll*cks they are the ones generated by the AI. If correct they were probably mine. I apologise for any mistakes made by my AI - I'm sure it will improve with time.


1 minute ago, carl123 said:

But assuming you are using the same image but with a different RAW conversion (e.g. Contrast, Brightness, Colour etc) this could work.

Maybe, but a) it is still using a brush tool selectively & b) if the image is developed with different RAW settings that affect contrast, brightness, etc. it seems unlikely that applying the same preset or settings corrections to both would work -- their locality is the same but the amount of correction that would be needed to blend the corrected & uncorrected areas seamlessly is not.

 

At least as I see it, this takes us back to the 'relative to what' problem. I don't know of any way to define that in a way that is independent of the (developed) image. Any ideas on how to do that?


