Doc Ricky

The same edits not working the same way between Photo and Designer


I am trying to recreate a pencil sketch effect in Affinity Designer. The technique is to duplicate a photo layer, desaturate the top copy (Saturation to -100% via an HSL adjustment), invert it, and then set the top layer's blend mode to Color Dodge. In Affinity Photo this produces a blank (white) image, because the inverted top layer cancels out the bottom image exactly - which is the expected result. But in Designer, the same edits produce a different result. I'll attach the screenshots.
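The cancellation described above can be checked numerically. This is a minimal Python sketch (not from the thread), assuming pixel values normalised to [0, 1] and the common Color Dodge formula `base / (1 - top)`:

```python
def color_dodge(base, top):
    # Color Dodge blend: base / (1 - top), clamped to white (1.0).
    if top >= 1.0:
        return 1.0
    return min(1.0, base / (1.0 - top))

def sketch_pipeline(gray):
    # gray: the desaturated luminance of a pixel, in [0, 1].
    inverted = 1.0 - gray            # the inverted top layer
    return color_dodge(gray, inverted)

# Every non-black pixel dodges to pure white, i.e. a blank image:
for v in (0.1, 0.25, 0.5, 0.9):
    print(sketch_pipeline(v))        # 1.0 each time
```

Since the top layer is exactly `1 - gray`, the dodge divides `gray` by itself and every non-black pixel clamps to white, which is why Photo (correctly) shows a blank result.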

Anyone else see this disparity?

31F6BF2C-1355-4F8C-A69E-B6807E1041E1.jpeg

5C49C92E-AE58-43D3-81A3-4AFD6A5AFBF2.jpeg


You aren’t comparing like with like. The ‘Background’ layer in APh is a Pixel layer, but the ‘Photo’ layer in AD is an Image layer: you need to rasterize the Image layer so that you can work on it at the pixel level.


Alfred
Affinity Designer 1.7.0.367 • Affinity Photo 1.7.0.367 • Windows 10 Home (4th gen Core i3 CPU)
Affinity Photo for iPad 1.7.0.135 • Affinity Designer for iPad 1.7.0.9 • iOS 12.3.1 (iPad Air 2)


Okay, that did it. Thank you. So, I am a little confused; hopefully someone from Affinity can explain this. What's the difference between a Photo layer and a rasterized layer in Designer? Why can't they work the same way with regard to filters and layer options? What can you do with a Photo layer that you can't do with a rasterized layer?


Photo layers can be Pixel or Image layers. If you import a photo into AP, an Image layer is created. To adjust this image at the pixel level you must first rasterize it, which converts the Image layer to a Pixel layer.

You can see what type of layer it is in the Layer Studio: it will say either Pixel or Image (see your pics above). It’s not that Designer and Photo use different layers; it’s about what type of layer is being worked on and what you are attempting to do to it.

Here's how Affinity explain it.

IPad Pro 10.5 512GB iOS 12.1 Affinity Photo 1.6.11.85 Affinity Design 1.6.4.45


Okay, that thread explains what is internally different between Image and Pixel layers (which is mostly programmatic), but for AD users this is rather confusing. What's even more confusing is why exactly this output happens when the edits are applied to Image layers - it doesn't make mathematical sense. Note that none of these operations is destructive. Why couldn't this conversion be handled like the SVG export option, where items are rasterized as needed?

I do appreciate the tip, though. Maybe this is a suggestion for the future. 

1 hour ago, Doc Ricky said:

Okay, that thread explains what is internally different between Image and Pixel layers (which is mostly programmatic), but for AD users this is rather confusing. What's even more confusing is why exactly this output happens when the edits are applied to Image layers - it doesn't make mathematical sense. Note that none of these operations is destructive.

I think it's something to do with the 'mipmaps' that Affinity uses to render things more quickly.

If you do a search on the forum, you will find out more.

Windows PCs. Photo and Designer, latest non-beta versions.

