
2 hours ago, smadell said:

Hello @KCP. I confess up front that I am not a Photoshop user, so I am not fully conversant in everything that Smart Objects can do. However, that having been said, just what function of Smart Objects are you trying to achieve? Many of the functions of Photoshop Smart Objects are present in Affinity Photo, usually without resorting to linked files and such. What specifically are you trying to do?

I showed you an example in the last two screenshots I posted for comparison.

 


1 hour ago, smadell said:

I saw the same 2 screenshots @walt.farrell - one is from Affinity Photo and the other from Photoshop. The Photoshop screenshot looks like it was a duplicate of just the dog's face, made into a Smart Object, to which a "smart" Gaussian Blur filter was applied. The Affinity Photo screenshot seems to be a background (on the bottom) and a duplicate of the dog's face (on top), to which a destructive Gaussian blur has been applied. My question about what @KCP was trying to achieve was based on those screenshots. Obviously the same "Smart Object" functionality can be easily attained using a Live Gaussian blur, constrained to the dog's face by virtue of its built-in mask.

And, although it may be a moot point, @KCP could just as easily have made the original background layer in Photoshop into a Smart Object and applied smart filters (the Gaussian Blur) directly, without the need to duplicate any part of the pixel layer.

You mentioned a "Live" Gaussian Blur, which is the first time anyone here has mentioned it. I see no live filters anywhere in AP. Is "Live" the actual equivalent? The screenshots I posted were meant to demonstrate how fast and useful layers converted to Smart Objects are in my workflow. Fwiw, I'm neither trying to fault AP nor be argumentative. I'm a pro user teetering on the fence about moving away from Adobe at this point, although I need to stress test the Affinity Suite to make sure it can handle my workload.


There are many live filters available, including Gaussian Blur.

-- Walt
Designer, Photo, and Publisher V1 and V2 at latest retail and beta releases
PC:
    Desktop:  Windows 11 Pro, version 23H2, 64GB memory, AMD Ryzen 9 5900 12-Core @ 3.00 GHz, NVIDIA GeForce RTX 3090 

    Laptop:  Windows 11 Pro, version 23H2, 32GB memory, Intel Core i7-10750H @ 2.60GHz, Intel UHD Graphics Comet Lake GT2 and NVIDIA GeForce RTX 3070 Laptop GPU.
iPad:  iPad Pro M1, 12.9": iPadOS 17.4.1, Apple Pencil 2, Magic Keyboard 
Mac:  2023 M2 MacBook Air 15", 16GB memory, macOS Sonoma 14.4.1


1 minute ago, Return said:

The live filters are the smart filters in PS, aka Smart Objects; they will remain editable afterwards.

As I said before, I'm not interested in importing Smart Objects from PS. I want to know if there is an equivalent function for creating them in AP. The point is to have that functionality in AP as the default editor for my future workflow.


It was just explanatory.
The live filters and live adjustments are similar to smart filters and remain accessible (editable) whether applied to image, vector, or pixel layers.
If you want to use them to their full potential, you must convert an image layer into a pixel layer first.



To preface, thank you to everyone for the input and thoughtful suggestions; I do appreciate them. That said, they don't demonstrate how to create Smart Objects as I showed in the screenshot from PS. Smart Objects turn all of the filters applied to them into Smart Filters, which can be edited simply by clicking on the layer. It's one click, which keeps the workflow clean, fast, and simple. Can someone do a quick screen recording to demonstrate how AP (not Designer) can do this? It would help not only me but anyone else in my situation trying to make this transition. Smart Filters are such an essential tool in PS that I'm surprised none of the Serif admins have posted a solution yet.


3 hours ago, smadell said:

I saw the same 2 screenshots @walt.farrell - one is from Affinity Photo and the other from Photoshop. The Photoshop screenshot looks like it was a duplicate of just the dog's face, made into a Smart Object, to which a "smart" Gaussian Blur filter was applied. The Affinity Photo screenshot seems to be a background (on the bottom) and a duplicate of the dog's face (on top), to which a destructive Gaussian blur has been applied. My question about what @KCP was trying to achieve was based on those screenshots. Obviously the same "Smart Object" functionality can be easily attained using a Live Gaussian blur, constrained to the dog's face by virtue of its built-in mask.

And, although it may be a moot point, @KCP could just as easily have made the original background layer in Photoshop into a Smart Object and applied smart filters (the Gaussian Blur) directly, without the need to duplicate any part of the pixel layer.

 

3 hours ago, smadell said:

To follow up, I know that it is de rigueur in Photoshop to duplicate the background layer as the very first thing one does. I assume that this has to do with the notion that Photoshop's roots are mostly as a destructive editor. That automaticity about duplicating the background layer persists to this day, even though Smart Objects and Smart Filters seem to make it unnecessary. But… I am not a Photoshop user, so maybe I'm missing the bigger picture!

"My question about what @KCP was trying to achieve was based on those screenshots. Obviously the same "Smart Object" functionality can be easily attained using a Live Gaussian blur, constrained to the dog's face by virtue of its built-in mask."

I was trying to show that if I wanted to change the amount of blur, after the fact, it's as easy as clicking on the smart filter and changing the amount. How do you do this in AP?

 


55 minutes ago, Return said:

Something like my crude example:

[screen recording attached]

A video is worth a million words. I never even noticed the hourglass. Thank you! It works perfectly. My guess is that Affinity is much easier to learn from scratch, like full-immersion language learning. I started using Adobe with the first version (Photoshop 0.63), so everything gets compared to it.


If a video is worth a thousand words (I agree with @KCP on that one), then watch the attached YouTube video from Serif. It's brief and goes over most of the issues you've brought up.

 

Affinity Photo 2, Affinity Publisher 2, Affinity Designer 2 (latest retail versions) - desktop & iPad
Culling - FastRawViewer; Raw Developer - Capture One Pro; Asset Management - Photo Supreme
Mac Studio with M2 Max (2023); 64 GB RAM; macOS 13 (Ventura); Mac Studio Display - iPad Air 4th Gen; iPadOS 17

