
Posts posted by skiphunt

  1. Hi, 

     

Updated to 1.6.7 and so far no problems. Brilliant update. Love the 3 free brush & filter/macro packs offered with this update too. Thanks!

     

A couple of questions: I'm not clear on how "open in place" is supposed to work. Is it the exact same thing that's demonstrated in the "interworking" tutorial video?

     

    Also, what is a transparent TGA import?

Looks like my question was a bit unclear. I was asking about any cool and interesting ways Affinity Photo has been optimized for the Apple Pencil specifically. Seems there were demos showing the lighting FX feature reacting to the tilt of the Apple Pencil (though I haven't managed to get it to do that myself).

I'm aware of the Apple Pencil's general function and some of the pressure-sensitive stuff in Photo, but I was looking for anything unique, cool, tricks, etc. that Affinity has built into Photo to take advantage of the Apple Pencil in particular. 

    As I mentioned, I was completely content with my Adonit Jot Pro (non-bluetooth) stylus for fine detail editing, but curious what other groovy stuff I can try now that I also have an Apple Pencil.

  3. Hey, 

     

    I used Affinity Photo for iPad with a regular Jot Pro (non-bluetooth) stylus and felt it served pretty much any need I could come up with, so I never bought an Apple Pencil. 

Since then, I received an Apple Pencil as a gift and can definitely see the advantage over a non-bluetooth stylus for painting on the iPad. 

     

    But, what are some of the coolest things you can do in Photo with an Apple Pencil that you absolutely can't do with a regular non-bluetooth stylus? 

  4. 29 minutes ago, dmstraker said:

Buggy software seems to be the norm these days, although software has always been released with defects -- sometimes lots of them. I worked in software quality and have direct experience.

    A dilemma is that it is almost impossible to find all defects and arrival rates are often on a negative exponential curve. A practical question for developers is 'how many defects do you want to release with the software' rather than 'when will the number of defects be negligible'. Software development is highly front-loaded in costs -- you have to pay loads before you have anything to sell, and there's huge pressure to get it out there, especially if you can find a friendly, tolerant market (unlike for example, medical and military customers, who pay a lot and must have very high quality systems).

When I was working in the industry, we'd do one release per year, if that. Nowadays, with the internet as an easier channel than disks, companies do regular bug fixes and updates. For users this means a little patience, though bug-fix releases come more frequently. Affinity crashes on me regularly, but I forgive it because (a) it's great value, and (b) I love the way it works. I also hate the Photoshop price model (didn't like it either when it was hundreds of pounds for a DVD) and, after an extensive period of trialling other software, I concluded Affinity was, despite its current imperfections, (a) the best, and (b) likely to get better in the not too distant future.

     

     

To be clear, I AM sympathetic to Affinity regarding the balancing act of when to launch. However, I think the main deciding factor with Photo's launch was having it ready in conjunction with its feature in the Apple presentation, while it was fresh in the mind, to maximize launch revenue numbers. That was a marketing decision that didn't necessarily consider what the experience would be like for end users who were paying based on trust in the company. I don't know that, had I been in charge of their marketing, I wouldn't have made the same decision. But I'm not the marketing director... I'm one of the suckers who got a little burned on launch.

I also don't mind that I paid $20 for the app. It's worth the $30 they wanted to charge for it. The fact that it's now perpetually on sale and seems to be settling in the $15 range is disappointing. 

Like I've said, I will likely buy their products again. I have both Affinity Photo and Designer for the desktop. I've got Photo for iPad, and I'll likely buy the next apps they release. I just won't be foolish enough to buy them at launch again. I'll wait for a few updates and reviews until I know the app is reasonably stable, and likely for a sale... since those seem to be a quarterly thing anyway. 

     

    Once bitten... 

First off, like I said... had I not had to pay a premium for a buggy, unusable app at launch, and for months following... I wouldn't have minded paying a premium price for an app that would ultimately settle on less expensive pricing than what I paid. It's unacceptable to pay premium app prices for apps that barely even work. So, YOU give me an effin' break. 

     

    I even bought an effin' brand new iPad Pro... mostly because I thought all the problems with Affinity Photo were related to it not being able to run on an Air2. I didn't need to invest in a new machine, because the problems were with Affinity Photo, NOT the hardware.

     

    Secondly, I'm giving Affinity honest customer feedback. Maybe they'll think twice before launching an app that wasn't ready for prime-time. Don't tell me they didn't know the app wasn't ready for general use. It was a complete dog out of the gate. They launched an app they knew wasn't ready, charged what was allegedly a bargain launch price... simply because they wanted to sell a lot of units on the tail of being featured at the Apple presentation. They obviously didn't consider (or care) how the customers would have to deal with an app that was freakin' buggy as hell. 

     

That said, I'm not asking for a refund. I'm not asking for anything at all. I'm just stating that although I'll continue to use Affinity Photo (now that it's mostly usable), I won't necessarily be buying another app from this company. At least, I won't be foolish enough to buy an Affinity app at launch again. I'll just wait for the early adopters to pay the premium cost and put in all the aggravating time on what amounts to a paid beta; only after several updates and confirmation that the app is stable will I consider buying again. Maybe I'll even just wait for one of these deep sales they like to have. I bought at launch and didn't question the cost because I trusted this company to deliver a great and stable product for a fair price. With this launch, they sadly disappointed and lost my trust.

I was excited about Affinity Photo for iPad. I trusted the company and bought it at launch, before any reviews or feedback, for $19.99. This was supposed to be a temporary early-adopter discount before it went up to $29.99. Yet in the 7 months since launch, it's only been $29.99 for one week. It's even been discounted as low as $9.99 a couple of times, and now it's $14.99.

     

If I'd been able to actually use Photo at launch, it wouldn't have bothered me as much, but it was horribly buggy at launch and for months after. I even bought a new iPad Pro in the hope that it would make Photo run better. It didn't. 

     

So, basically, I stupidly paid a premium price for the agonizing privilege of being a beta tester. Had I waited until the app had been properly tested by all of the other suckers, I could've picked it up for $9.99.

     

    What’s done is done, but I can say that I’ve lost a fair amount of trust in this company, and I for one will NOT be foolish enough to buy one of your apps at launch again. Fool me once... :(

  7. 24 minutes ago, MEB said:

    Hi LilleG,

    Here's the inpainting tool in the Panorama Persona:

     

    inpainting_panorama.png

     

     

     

To use the Edit ▸ Inpaint command (which is a different thing, as carl123 pointed out) you must have a pixel selection active.

     

     

Just to be clear... it's only in the desktop version that we can auto-inpaint the extra canvas around a panorama, correct? 

My original post was specifically about doing this in iPad Photo, not the desktop version. We still can't do this on the iPad, correct?

  8. On 7/28/2017 at 0:37 PM, LilleG said:

    Ah!  Overlooked that part of your post.

     

    2 hours ago, dmstraker said:

    I've just found that you have to rasterise before inpainting into transparent corners (and after canvas stretch).

    Process:

• Layer/Rasterise…

• Select/Alpha Range/Select Opaque

• Select/Invert Pixel Selection

• Edit/Inpaint

• Select/Deselect

    Macro to do this in one click attached.

    Dave's Alpha Inpaint.afmacros

     

     

Well, this is what I'm already doing, i.e. manually inpainting it in. Thanks, but I think you misunderstood the question. What I was asking about, and what MEB replied to, is the function of automatically inpainting all of the canvas exposed when you create a panorama or straighten the image. The desktop version does this; you don't have to manually inpaint it like you're describing. The desktop version actually does an awesome job of this consistently, and I'd venture to say even more accurately than Photoshop's "content-aware" fill. 

    However, according to MEB... it doesn't look like that function made it into the iPad version.

     

  9. 9 hours ago, ErrkaPetti said:

I edited a 150 megapixel panorama on my iPad Pro 12.9" (2017) with no issue at all!

     

But, I've got the latest 1.6.4 beta 4 installed...

     

Really impressed with Affinity Photo so far, and now longing for Affinity Designer / Affinity Publisher to reach the beta stage on iPad...

     

Good to know. I haven't tried anything like that yet, but I deleted my purchased app and installed the beta to take it for a spin. 

Was that 150MP panorama based on RAW image sources? I ask because it seemed like just about every time I had a crash on my Air 2, it'd be when I was starting with a RAW source image.

Fortunately, I received an early birthday present from my sweetie :) a 10.5" iPad Pro, 256GB. 

     

Although I already own Photo iPad, I joined the TestFlight beta testing. I haven't installed the beta yet, but I just spent over 2 hours trying to get Photo iPad to crash on my new iPad. It never crashed. No issues at all. I started with a RAW image, did a regular edit... then started stacking on layers, layer FX, etc. Then I cut/pasted an even higher resolution image to blend on top of it, etc. Then I took a high-res JPEG and stacked 4 layers of the same image with various blends and FX between layers, etc. 

     

    I wanted to crash Photo first, so that when I install the current beta I could retrace my steps and see if the beta is more stable. However, I've run out of time to mess with this because I can't get Photo to crash. 

     

So, at least for me... so far... it's looking like the more powerful iPad may have resolved my prior crashing. I've already erased my Air 2, so I can't go back and compare. Maybe it's because I was working with a fresh iOS install on the new iPad? Or maybe Photo needs the latest gear? I'd hesitate to make that assumption since others here with iPad Pros are still reporting some crashing. But despite my best efforts, Photo on my new 10.5 Pro is rock solid. I'll see if I can crash it later before I take the latest beta for a spin. :)

  11. 1 minute ago, LilleG said:

    It works, but erratically and incompletely.  The only way I've found so far to get it to work as skiphunt wants is to Duplicate the layer, rotate and crop, then Rasterize the layer and use Current Layer and Below to Inpaint the transparent sections.  Even then, it acts more like the Clone tool than true inpainting.

     

Well, this is what I'm already doing, i.e. manually inpainting it in. Thanks, but I think you misunderstood the question. What I was asking about, and what MEB replied to, is the function of automatically inpainting all of the canvas exposed when you create a panorama or straighten the image. The desktop version does this; you don't have to manually inpaint it like you're describing. The desktop version actually does an awesome job of this consistently, and I'd venture to say even more accurately than Photoshop's "content-aware" fill. 

    However, according to MEB... it doesn't look like that function made it into the iPad version.

  12. Hi,

     

In the desktop version, after you've straightened an image, or created a panorama, etc., and are left with blank areas around the canvas with no data, there's an easy way to have those areas automatically inpainted/filled based on content in proximity, much like "content-aware" fill in Photoshop. 

Is there a way to do this with the iPad version? I know I can manually use the inpainting brush around the periphery, but it's a bit tedious and hit & miss. Is there any way to fill these areas with auto-inpainting like you can with the desktop version?

@stokerg thanks! I remember doing the tutorial on the desktop side and having the same question. I can sort of see the advantage in the example used for desktop Photo, where you're retouching rough skin... almost, but I wasn't sure what the benefit would be vs. one-step methods when using it on lens flares. 

     

I do see that it's a bit less ham-fisted for retouching in lots of situations, I suppose.

     

Thanks for the link on the Patch tool. Looks like it's basically cloning into a selection. Or maybe it's slightly different. I'm guessing that with cloning into a selection, you're cloning an exact region into a selected region, but with the Patch tool you don't have to match up the exact region... it will sort of fill the selected area for you based on the source reference.

     

    thx

  14. Hi skiphunt,

Affinity Photo should work equally well with JPGs or RAWs. If you are having consistent issues with RAWs, are you able to pinpoint which features/operations you are performing that lead to the issues/freezes/crashes? If we can reproduce them here, we should be able to fix them, assuming the bugs are not related to iOS/Apple (Core Image RAW).

     

It's very hard to pinpoint, but I'll try to pay closer attention. It seems random mostly, but only when I start with a RAW image. I was testing to see if RAW image import from the camera roll was working. It seems to work, but I don't get how it works. I have one camera set to take both RAW and highest-quality JPEGs, so there are two files for each shot (RAW+JPG). I use the Olympus app to transfer the image via wifi to my iPad. The app only shows one image (not two separate images). I then import that image from the camera roll, and Photo iPad seems to recognize it as RAW. It opens in the Develop persona and has "RAW" in the file description area. It appears to be the RAW version as well, i.e. low saturation and in need of sharpening, etc. The tests I'm describing are with files from an Olympus TG-4 compact. I think it's only 16MP. 

     

    If I start with one of those, sometimes I get all the way through an edit, sometimes it freezes and crashes quickly, sometimes it takes longer. Very random. 

     

However, if I start with a 20MP highest-quality JPEG from a Panasonic FZ, or other high-res JPEGs I've got handy... I have hardly any problems at all to speak of. 

     

I'm not looking for support on this per se; I'm more interested in whether it's most likely related to the software, iOS, or the hardware. I'll be upgrading to the Pro for sure, but I'd like to hold out as long as possible and stick with the Air 2 until iOS 11 comes out. If this RAW issue is more a matter of the Air 2 not being powerful enough to handle RAW images, then I just won't bother editing RAW until I can upgrade. If it's more likely software related, I'll hold off editing RAW until there are more updates. 

     

    Are there more reports of RAW image issues from Air 2 users than iPad Pro users? Or, is it about the same for each?

I'm loving Photo iPad more and more, but I can't in good conscience rate it 5 stars when I'm still getting crashes and freezes. 

What I've noticed, though, is that if I start with a JPEG image, even a hi-res one... I can do quite a bit with rarely a hiccup. However, if I start with a RAW image, it's almost certain that I'll eventually get a freeze or crash. That's even after it's been "developed". It doesn't seem to be related to resolution or file size.

     

For example, I can start with roughly a 6000x4000 highest-quality JPEG, do lots of edits, various layers, inpainting, etc., then export with no issues at all, consistently. But if I start with a paltry 4000x3000 RAW image, I likely won't even make it past half a dozen edits after developing before I get issues. 

Does this imply that Photo has an issue with RAW files only, which will be resolved in another update? Or that RAW isn't really all that well supported in the current iOS 10.3.2 and will be resolved once iOS 11 launches? Or that for crash-free RAW image editing you're simply going to have to be using an iPad Pro, i.e. that the iPad Air 2 will never be enough to handle Affinity Photo for iPad?

  16. Thanks MEB! Got it. 

     

    Now on the other question about scaling the image inside a mask shape... I just tried again... 

     

I think what I'm seeing here is the confusing difference between a clipping mask and a layer mask. If I drop my image onto my mask shape and release while there's a blue line instead of solid blue, I can select the child image under the shape layer and scale/position it independently of the shape... as I was asking about. But if I release with the full blue square, I've got a different kind of masking going on and I can't select the filled/placed image independently. 

     

I'm still not crystal clear on the differences between a clipping mask and a layer mask. And I'm not clear on whether a "nested mask" is something else altogether. But I am seeing some difference now. I'm just not able to identify which is which, and when to use one or the other. Is a "nested" mask the same as a clipping mask?

What do you mean by locking here? How are you locking the layer?

     

     

    I understood both your questions regarding scaling the image independently and feathering the shape. This is quite easy to do in the desktop version but apparently not so much on iOS (seems a few controls weren't implemented yet) - I'm checking this...

     

     

I've muddied the waters here. It appears I've mistaken the function of the lock button within the layer settings box. I was actually just selecting either the image or the mask object. 

     

Here's what I did: create a document. Place the lion image. Place the star object. Hover the star object layer over the lion image until the full blue square appears, then release. If I select the image layer, I can scale the shape and the lion image scales with it. If I select the star shape layer, I can scale and move that, but the lion image remains stationary.

     

What I was asking was whether I can place the star shape where I want in the document, then select my fill image (the lion) and scale/move/position it within the star mask. I just tried the same thing with the desktop version, and there I can scale each independently (mask shape or background image), but I can't seem to do the same in the iPad version. 

     

The feathered edge isn't a big deal. I can paint the feathering around it, but it'd be nice if I could just assign a uniform feather value all the way around without trying to paint it in manually. And, if you don't mind, can you say how this is done in the desktop version, so that when/if it's possible in the iPad version, I'll know how to do it?

     

  18. Maybe I haven't asked these questions clearly enough... 

     

1. If I drag a star-shaped object onto the canvas, then drag that onto an image of a face to make a star-shaped mask with the face inside, and want to fill up the star with just an eye from the face... I can unlock the star shape, scale it down until the eye fills the star, then relock and scale the whole shape back up. But can I ALSO leave the star-shaped mask the size that I want, select the face image I've placed into the star, and scale the face up to fill the shape with the eye instead? 

     

2. Now that I have a star-shaped mask with an eye image filling it, the edge of the star mask is hard and clean. If I want that edge to be equally softened with a feathered edge all the way around by the same 10px amount, and to remain scalable and movable within the canvas, can I select the mask edge and feather it? 

  19. Hi

I have the iPad Air 2. So far, no issues.

Perhaps there are some settings interfering. Have you tried Apple Support? While they don't give support for 3rd-party apps, they have helped me troubleshoot when I was trying to find out whether my settings were the issue.

It had nothing to do with settings. The launch build was a horror show. It's actually been MUCH better after the 1.6.2 update. This thread was from the launch build, which was not a pleasant experience at all, to put it mildly. ;)

     

Still getting the occasional crash, but it's much rarer and the app is now saving where I left off. The memory problems seem to be gone as well.

     

At present, it's almost 5-star for me. It's clearly the most powerful photo editor on iOS, and it's coming close to rivaling many respected desktop alternatives as well.

  20. Hi, I've got another question in addition to my previous one about scaling placed images within a mask object.

     

The second question is this... Let's say I have the heart-shaped object (that I used as an example in my previous question) filled with the placed lion image. Let's say I've got this lion-filled, heart-shaped masked object floating on top of an image of flowers. The heart shape has a hard edge. Is there a way I can make the masked heart-shaped object have a feathered edge instead of a hard edge? And still be able to move it around atop the background image while retaining its feathered edge?

     

By the way... I already know that I can simply select the layer object and paint/erase around the edges to soften it up. What I'm asking is: can I make the edge a uniform 10px feather all the way around without having to paint/erase it away manually?

  21. MEB, thanks!

     

    I'm still mostly confused... but I'm getting closer to understanding how it works by simply making some mask shapes and trying both ways of clip/layer/nesting, etc. 

     

As a test, I made a heart-shaped mask (using a more obvious mask shape than a rectangle to try to avoid confusion). I've got an image of a lion placed inside the heart shape. When it's "locked", I can resize the heart mask and the lion image resizes with it. If it's unlocked, I can change the scale and shape of the heart mask independent of the lion image, but I can't select the placed lion image to independently scale it up or down within the heart shape mask. I can change the scale of the heart shape to make it smaller, then relock and rescale... but I can't independently change the scale of the placed image within the heart shape. Should this be possible?

Again, this is just a test; I know how to get the result I'm describing. I'm merely trying to understand how the functions work, i.e. how to independently scale/size a placed image within a mask shape.

In some situations there's really no difference - for example with adjustments - nesting an adjustment in a pixel layer or clipping it inside the pixel layer will not produce any visual difference on canvas. This is because the nested adjustment will be applied to the whole layer, while the clipped one will only have effect inside the boundaries of the pixel layer data, which basically leads to the same result. This is the confusing part I was referring to previously.

     

     

    I just watched that video twice and still don't completely get the real difference between a clipping mask and a layer mask. I'll have to watch it again later and see if maybe it sinks in... but both clipping and layer masks appear to be capable of doing essentially the same thing. 

     

    Yes, confusing. ;)

     

Can you tell me when I'd want to use a clipping mask versus a layer mask, and vice versa?
