ShiftEleven

Members
  • Content count: 11
  • Joined
  • Last visited

About ShiftEleven
  • Rank: Newbie

  1. Thanks Walt. In most docs I've seen, the way one app can launch a UWP app is through a protocol. But I'll keep digging. The only other idea I have right now is to call Invoke-Item on the images and then let Windows use its supported mime-type mapping (rough sketch below). Gonna give it a try. The good news is that edit/open functionality in Capture One is done through a Plugin, and anyone can write a plugin.
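     A minimal sketch of that Invoke-Item idea, assuming the images sit together in a folder and that Windows has the image type associated with the editor I want (the cmdlets are stock PowerShell; nothing here is Capture One- or Affinity-specific):

     ```powershell
     # Hand each image to the Windows shell and let the registered
     # file-type association decide which app opens it.
     Get-ChildItem -Path . -Filter *.tif | ForEach-Object {
         Invoke-Item -Path $_.FullName
     }
     ```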
  2. So I've run into a conundrum - the Affinity suite of apps that I purchased from the Microsoft Store doesn't appear to have command-line support. This is because Windows apps downloaded from the Store are sandboxed away from users. You cannot directly find the `Photo.exe` file from Explorer (unless you're willing to blow away a lot of permissions). The best I could find for starting Affinity Photo from the MS Store on the command line is the following: `powershell.exe explorer.exe shell:AppsFolder\$(get-appxpackage -name SerifEuropeLtd.AffinityPhoto ^| select -expandproperty PackageFamilyName)!SerifEuropeLtd.AffinityPhoto` But that doesn't allow me to pass any arguments into the Photo executable.

     There does appear to be a solution, though. First, the Affinity suite of apps could use Command Line Activation. That would basically set up an execution alias so that command-line users could send arguments to it. The apps would just need to update their OnActivated handlers for the new CommandLineLaunch activation kind. Another option could be to create and register a Protocol. This is similar to Command Line Activation, but now the Affinity apps can open based on a URL. For example, afphoto:// could open Photo, and afphoto://file=foo.tif could load the foo.tif file into Photo (a rough sketch of driving that from a wrapper script is below). Maybe you could even supply actions to Merge or create HDR from URL parameters - but that would just be icing on the cake.

     If you're curious about my reasoning (like, who really needs to use the command line for this!?!?), it's that the UWP MS Store app boundaries are preventing integration with my DAM. I use Capture One, and C1 cannot find Affinity Photo because I cannot navigate to the app from Windows Explorer (files and packages are hidden). Having these entry points would allow me to create an extension to Capture One, or a batch command wrapper, so that I can send the files I want to edit over to Photo. To be clear, if I had purchased Photo directly from Serif, this wouldn't be an issue, since the app gets installed in Program Files and I can browse for it.

     Thanks for your consideration. I hope the links show that enabling the feature wouldn't require too much code (well, how could I really know, I haven't seen your code), and I would love to see that. I thought purchasing the MS Store version would just mean the apps would behave the same and that I wouldn't have to worry about performing updates manually - but now I know a little more about how UWP apps work and sorta regret doing so. Cheers and all the best
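     To make the protocol idea concrete, here is the kind of batch wrapper I imagine on the calling side. The afphoto: scheme and its file parameter are purely hypothetical (nothing Affinity registers today), but Start-Process on a URI does hand it off to whichever app owns the scheme:

     ```powershell
     # Send-ToPhoto.ps1 (hypothetical wrapper script)
     # Pass one or more image paths to Affinity Photo through an imagined
     # afphoto: protocol; Windows routes the URI to the registered handler.
     # A real wrapper would probably URL-encode the path as well.
     param([string[]] $Files)

     foreach ($file in $Files) {
         $full = (Resolve-Path -Path $file).Path
         Start-Process "afphoto://file=$full"
     }
     ```

     Something like `.\Send-ToPhoto.ps1 .\foo.tif .\bar.tif` is the sort of thing a Capture One extension or batch command wrapper could then call.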
  3. Do you know if it's possible to go the other way? Purchase from the Mac App Store or Microsoft Store and then get a download license to use from Serif? The reason I ask is that the Capture One integration with Photo from the Microsoft Store is non-existent for me. It's just the way UWP apps work: there isn't the command-line argument support that you get with the Serif download. Apparently this has a lot to do with the sandboxing of the Store UWP apps.
  4. This appears to be fixed in the beta of 1.7 on Windows Desktop, so here's to hoping it remains fixed in the full release. Glad I wasn't the only one experiencing this issue and that it's been resolved.
  5. So I think there are two issues in here - the one you're experiencing and the one I'm experiencing. Yours appears to be a driver support issue, as no matter what you do, the UI doesn't update below the pen. Am I understanding that correctly? Then there's the one I have, where the pen overlay works until I use some keyboard shortcut. I can get it working again by engaging the trackpad. So mine works until I start changing brush sizes or interacting with UI that isn't painting.
  6. The Surface Pro 2017 and Windows 10 allow you to hide or show the cursor: Settings > Pen & Windows Ink > Show Cursor. I'm realizing I had a gap in my instructions:
     - Select the move tool with the pen
     - Create a mask
     - Select the brush tool with the pen
     - Note that the brush size preview is gone
     - Wiggle the mouse or touch the trackpad
     - Move the pen to the screen; note that the brush size preview is available

     So it does actually work in some instances. There are just times when it gets hung up and I have to touch my trackpad to get it back for my pen.
  7. If I'm editing a mask with a brush, then I would expect the blend modes to work like they do when I'm normally painting a layer. Examples include:
     - Brush with White Color and Overlay Blend Mode should help fill in white parts of the mask, without revealing/coloring parts that are already black.
     - Brush with Black Color and Overlay Blend Mode should help fill in black parts of the mask, without hiding/coloring parts that are already white. Works best with a low flow.

     For clarity, these do what I expect if I just paint with black and white on a normal pixel layer. But none of the blend modes do anything when painting on a mask.
  8. Came here to say I see the same thing on my Surface. Sometimes the brush preview turns off even though I've turned it on in the preferences. The best way I've been able to recreate it:
     - Select the move tool with the pen
     - Create a mask
     - Select the brush tool with the pen

     At this point, the brush size preview does not show up. But if I wiggle the mouse, it comes good and I see the preview.
  9. I was wondering if anyone else has run into this issue and if there's some setup that I'm not doing correctly. So here's my process:
     - I have some hair around a person that I want to mask out.
     - I make a quick selection and then I go into "Refine Selection".
     - I refine the selection so that the hair looks properly selected.
     - I change the Overlay from the red selection, to Transparent, to White, and then to Black, and it looks like I've cut out the hair.
     - I save my changes to a mask.

     The result is that now I see halos around hair strands that I wasn't seeing in the Refine Selection screen. Anyone else?
  10. Refine mask edge

     The last two are examples of refining a selection. The topic here is refining a mask. On the desktop version there is a quick and easy way to do that. The best I've found so far in the iPad app is to:
     - Select the mask in the Layers panel
     - Move to the Channels Studio
     - Touch the three dots "..." on "Mask Alpha", then touch "Create Spare Channel"
     - Touch the three dots on "Spare Channel" and then "Load to Pixel Selection"
     - Delete the old mask layer (otherwise you double up the masks)

     From there, you now have the selection that you can further refine, and then you can save it back to a layer mask. I was hoping for something more automatic, like what exists on the Desktop.
  11. Refine mask edge

     So... how do you do it for the iPad? You said you found out, but you didn't give the answer.