Affinity Photo: Sample color at (x, y) in Procedural Texture Filter



9 hours ago, GarryP said:

There’s no way to use/pick colours in there

You can use colors in the Procedural Texture Filter. Each color channel (red, green, blue) is normalized, represented as a float in the range 0…1.

For example, to output an orange(-ish) color, you can write: 

vec3(1.0, 0.7, 0.5)

It is the (approximately) normalized representation of RGB(255, 178, 127).
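To make the normalization concrete, here is a sketch in Python rather than the filter's own expression language (`normalize_rgb` is just an illustrative helper, not an Affinity function):

```python
# Illustrative helper: convert an 8-bit RGB triple to the
# normalized 0..1 floats the Procedural Texture Filter works with.
def normalize_rgb(r, g, b):
    return (r / 255.0, g / 255.0, b / 255.0)

# RGB(255, 178, 127) comes out as roughly (1.0, 0.7, 0.5)
print(normalize_rgb(255, 178, 127))
```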

 

Usually, in GLSL-like languages, there's also a way to sample colors from the source image. 
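For comparison, in GLSL proper you would sample the source with something like `texture(sampler, uv)`; the Procedural Texture Filter documents no equivalent. Purely as a conceptual sketch (in Python, with a hypothetical in-memory image and a `sample` helper of my own), sampling the color at (x, y) is just a clamped indexed lookup:

```python
# Conceptual sketch only: nearest-neighbor sampling of the color at
# (x, y) from an image stored as rows of (r, g, b) tuples.
# The Procedural Texture Filter exposes no documented equivalent.
def sample(image, x, y):
    h, w = len(image), len(image[0])
    # Clamp coordinates to the image bounds (like GLSL's CLAMP_TO_EDGE).
    xi = min(max(int(x), 0), w - 1)
    yi = min(max(int(y), 0), h - 1)
    return image[yi][xi]

img = [[(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
       [(0.0, 0.0, 1.0), (1.0, 1.0, 1.0)]]
print(sample(img, 1, 0))  # (0.0, 1.0, 0.0)
```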

[Attached screenshot: affinity_proc-orange-color-vec3.png]


See:

... which BTW is IMO unstable and often dumps core ...

Quote

Process:               Affinity Photo [2790]
Path:                  /Applications/Affinity Photo.app/Contents/MacOS/Affinity Photo
Identifier:            Affinity Photo
Version:               1.8.3 (1.8.180)
App Item ID:           824183456
App External ID:       835357412
Code Type:             X86-64 (Native)
Parent Process:        ??? [1]
Responsible:           Affinity Photo [2790]

...

...

Time Awake Since Boot: 19000 seconds
Time Since Wake:       2300 seconds

System Integrity Protection: enabled

Crashed Thread:        0  Dispatch queue: com.apple.main-thread

Exception Type:        EXC_BAD_ACCESS (SIGSEGV)
Exception Codes:       KERN_INVALID_ADDRESS at 0x0000000000000000
Exception Note:        EXC_CORPSE_NOTIFY

VM Regions Near 0:
-->
    __TEXT                 000000010de3f000-000000010deab000 [  432K] r-x/r-x SM=COW  /Applications/Affinity Photo.app/Contents/MacOS/Affinity Photo

Application Specific Information:
Performing @selector(onDescriptionChanged:) from sender SchemeTextField 0x7faf54b6aa00

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0   com.seriflabs.libcocoaui          0x0000000153232040 -[ExpressionConstantRangeView onDescriptionChanged:] + 48
1   libsystem_trace.dylib             0x00007fff88a7007a _os_activity_initiate + 75
2   com.apple.AppKit                  0x00007fff8a53cdbd -[NSApplication sendAction:to:from:] + 460
3   com.apple.AppKit                  0x00007fff8a54ef12 -[NSControl sendAction:to:] + 86
4   com.apple.AppKit                  0x00007fff8a4b355c -[NSTextField textDidEndEditing:] + 487
5   com.apple.CoreFoundation          0x00007fff91579b0c __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__ + 12
6   com.apple.CoreFoundation          0x00007fff91579a9f ___CFXRegistrationPost_block_invoke + 63
7   com.apple.CoreFoundation          0x00007fff91579a17 _CFXRegistrationPost + 407
8   com.apple.CoreFoundation          0x00007fff91579782 ___CFXNotificationPost_block_invoke + 50
9   com.apple.CoreFoundation          0x00007fff91536592 -[_CFXNotificationRegistrar find:object:observer:enumerator:] + 1922
10  com.apple.CoreFoundation          0x00007fff915357e5 _CFXNotificationPost + 693
11  com.apple.Foundation              0x00007fff98108f5a -[NSNotificationCenter postNotificationName:object:userInfo:] + 66
12  com.apple.AppKit                  0x00007fff8a49f481 -[NSTextView(NSSharing) resignFirstResponder] + 942
13  com.apple.AppKit                  0x00007fff8a39432f -[NSWindow _realMakeFirstResponder:] + 228
14  com.apple.AppKit                  0x00007fff8a3941f7 -[NSWindow makeFirstResponder:] + 123
15  com.apple.AppKit                  0x00007fff8a3b86e5 -[NSWindow endEditingFor:] + 352
16  com.apple.AppKit                  0x00007fff8a2b680e -[NSView removeFromSuperview] + 82
17  com.apple.AppKit                  0x00007fff8a3e0c7b -[NSView removeFromSuperviewWithoutNeedingDisplay] + 38
18  com.apple.AppKit                  0x00007fff8a49c719 -[NSTableRowData _removeViewAndAddToReuse:forRow:] + 79
19  com.apple.CoreFoundation          0x00007fff915603e6 __65-[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:]_block_invoke + 102
20  com.apple.CoreFoundation          0x00007fff915602d9 -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 185
21  com.apple.AppKit                  0x00007fff8a63b41c -[NSTableRowData _removeNonVisibleViewsInDictionary:] + 88
22  com.apple.AppKit                  0x00007fff8a3db00c -[NSTableRowData _removeRowsBeingAnimatedOff] + 60
23  com.apple.AppKit                  0x00007fff8a3da8a7 -[NSTableRowData removeAllKnownSubviews] + 178
24  com.apple.AppKit                  0x00007fff8a3da520 -[NSTableRowData reloadData] + 92
25  com.apple.AppKit                  0x00007fff8a3ded86 -[NSTableRowData _doWorkAfterEndUpdates] + 153
26  com.apple.AppKit                  0x00007fff8a3debf7 -[NSTableView _endUpdateWithTile:] + 134
27  com.seriflabs.libcocoaui          0x0000000153230b8d -[ExpressionConstantsViewController deleteConstant:] + 61
28  libsystem_trace.dylib             0x00007fff88a7007a _os_activity_initiate + 75
29  com.apple.AppKit                  0x00007fff8a53cdbd -[NSApplication sendAction:to:from:] + 460
30  com.apple.AppKit                  0x00007fff8a54ef12 -[NSControl sendAction:to:] + 86
31  com.apple.AppKit                  0x00007fff8a54ee3c __26-[NSCell _sendActionFrom:]_block_invoke + 131
32  libsystem_trace.dylib             0x00007fff88a7007a _os_activity_initiate + 75
33  com.apple.AppKit                  0x00007fff8a54ed99 -[NSCell _sendActionFrom:] + 144
34  libsystem_trace.dylib             0x00007fff88a7007a _os_activity_initiate + 75
35  com.apple.AppKit                  0x00007fff8a54d3be -[NSCell trackMouse:inRect:ofView:untilMouseUp:] + 2693
36  com.apple.AppKit                  0x00007fff8a595f04 -[NSButtonCell trackMouse:inRect:ofView:untilMouseUp:] + 744
37  com.apple.AppKit                  0x00007fff8a54bae8 -[NSControl mouseDown:] + 669
38  com.apple.AppKit                  0x00007fff8aaa03c9 -[NSWindow _handleMouseDownEvent:isDelayedEvent:] + 6322
39  com.apple.AppKit                  0x00007fff8aaa13ad -[NSWindow _reallySendEvent:isDelayedEvent:] + 212
40  com.apple.AppKit                  0x00007fff8a4e0539 -[NSWindow sendEvent:] + 517
41  com.apple.AppKit                  0x00007fff8a460a38 -[NSApplication sendEvent:] + 2540
42  com.seriflabs.libcocoaui          0x000000015324a1a4 -[Application sendEvent:] + 756
43  com.apple.AppKit                  0x00007fff8a2c7df2 -[NSApplication run] + 796
44  com.apple.AppKit                  0x00007fff8a291368 NSApplicationMain + 1176
45  com.seriflabs.affinityphoto       0x000000010de40c04 0x10de3f000 + 7172

 

 

☛ Affinity Designer 1.9.3 ◆ Affinity Photo 1.9.3 ◆ OSX El Capitan

16 minutes ago, v_kyr said:

See:

... which BTW is IMO unstable and often dumps core ...

 

I don't see a function that returns the colour at given coordinates.

If there is one, it would likely return the colour in the form of a vector (3D if RGB, for example), but there also seems to be no mentioned way to get the individual components from a vector.

What am I missing?

1 minute ago, anon2 said:

If there is one, it would likely return the colour in the form of a vector (3D if RGB, for example), but there also seems to be no mentioned way to get the individual components from a vector.

You can get the individual components of a vector by using its x, y, z properties.

For example, we can get `0.5` from a `vec3(0.5, 0, 0)` by accessing its `x` property:

var color = vec3(0.5, 0, 0); color.x

This code will output a 50% gray color.
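The same component access, sketched in Python with a hypothetical `Vec3` class standing in for the filter's vector type:

```python
from dataclasses import dataclass

# Hypothetical stand-in for the filter's vec3 type; only the
# .x/.y/.z component access mirrors the filter's behavior.
@dataclass
class Vec3:
    x: float
    y: float
    z: float

color = Vec3(0.5, 0.0, 0.0)
print(color.x)  # 0.5
```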

 

4 minutes ago, anon2 said:

I don't see a function that returns the colour at given coordinates.

Yup. I couldn't find anything related to color sampling either. That's why I decided to ask here. The documentation has some gaps, and I thought maybe color sampling is one of them.

1 hour ago, anon2 said:

I don't see a function that returns the colour at given coordinates.

There isn't one, and the online doc I pointed to also doesn't mention any color-sampler functionality. - Though they have a picker for the origin (x, y), I didn't check whether that works at all. However, they could then also sample/grab the color values via that picker by using another shortcut.



On 6/5/2020 at 3:45 PM, v_kyr said:

There isn't one, and the online doc I pointed to also doesn't mention any color-sampler functionality. - Though they have a picker for the origin (x, y), I didn't check whether that works at all.

Experimentally, the origin 'picker' seems to provide a way to drag the origin of (most) procedural textures around within the layer it is applied to. (Clicking doesn't seem to do anything but double-clicking on some point sets the origin to that point.)

To see how this works, try using one of the built-in Basic shapes presets (except for Graduated random squares), or any of the built-in Noise or Monochrome patterns ones. With any of them selected, drag the crosshair around on the canvas & you should see the texture move.

It does not seem to have anything to do with color sampling.

Affinity Photo 1.9.3, Affinity Designer 1.9.3, Affinity Publisher 1.9.3; 2020 iMac 27"; 3.8GHz i7, Radeon Pro 5700, 32GB RAM; macOS 10.15.7
Affinity Photo 1.92.236 & Affinity Designer 1.9.2 (showing 1.9.9) for iPad; 6th Generation iPad 32 GB; Apple Pencil; iPadOS 14.4 (18D52)

55 minutes ago, R C-R said:

It does not seem to have anything to do with color sampling.

No, as said, it actually doesn't, but it could be programmatically enhanced to be used for that functionality too.

55 minutes ago, R C-R said:

To see how this works, try using one of the built-in Basic shapes presets (except for Graduated random squares), or any of the built-in Noise or Monochrome patterns ones. With any of them selected, drag the crosshair around on the canvas & you should see the texture move.

Well, I don't use those Affinity built-in texture filters, equations panels etc.; they are too limited, uncomfortable and error-prone for me. Thus when I do such things, I approach them more in a programming fashion. On macOS I nowadays tend to do those Core Graphics and Core Image related filter things via Swift in Xcode apps and playgrounds (sometimes also via Python, in an even more interactive manner).

If you scroll down this little site from the author of the Core Image book, you will maybe see that there are many more capabilities available when using Apple's already existing APIs and some of your own Swift or Obj-C code.

 


2 minutes ago, v_kyr said:

Well, I don't use those Affinity built-in texture filters, equations panels etc.; they are too limited, uncomfortable and error-prone for me.

I sometimes find the live versions of the Procedural & other filters to be useful because they are non-destructive & there is no need to use anything other than AP to use them. Typically for the procedural ones, I either start with a built-in preset & try modifying it or use one that someone else has created & has been kind enough to post about it in the forums.

Some of the filters might also be useful with macros (possibly with batch jobs), but I do not have much experience with that.



Well, as already said, most such things are too limited and don't offer the flexibility you get when you code against the already available OS framework & API capabilities. Further, if you do such things programmatically, you also learn much more from an algorithmic standpoint, things that are otherwise hidden behind the scenes when just using AP etc. Also, I for my part need certain functions that generate specific things in my own software projects.


7 hours ago, T V said:

@darkwark Are you using this in a procedural texture to expand the function, or is this something that could be achieved by a macro?

I need this inside the Procedural Texture Filter, unfortunately. Color sampling would allow writing filters that can manipulate image data in a lot of different ways. Right now, it is only possible to generate graphics from scratch, which is fair — it is called Procedural Texture, not a Custom Image Filter after all.

On 6/6/2020 at 7:25 PM, darkwark said:

I need this inside the Procedural Texture Filter, unfortunately. Color sampling would allow writing filters that can manipulate image data in a lot of different ways. Right now, it is only possible to generate graphics from scratch, which is fair — it is called Procedural Texture, not a Custom Image Filter after all.

Unfortunately, I am at a loss on this as a procedural texture function. As @v_kyr said, there are many things lurking behind the face of the app that we are only scratching the surface of. Macros would most likely have to be incorporated.

