James Ritson

Everything posted by James Ritson

  1. Hey, on the context toolbar for the Dodge/Burn brush tools you should have a Tonal Range dropdown that allows you to target Shadows/Midtones/Highlights - think this is what you're after? Hope that helps! [Edit] Attached a screen grab
  2. What you're seeing is the 32-bit preview panel (accessible on desktop through View>Studio). This is used to preview different tonal ranges of the document and is important for a number of use cases, but for typical HDR merging where you'll be tone mapping straight after, it's arguably less useful. In the iPad version, this panel is intrinsically linked to the Hand Tool as its context toolbar in 32-bit: it's a design decision, since adding another studio on the right-hand bar just for a couple of esoteric options isn't a great use of valuable screen space. The issue is that most tools will have a context toolbar, so switching to another tool won't solve this. If you intend to tone map the HDR image, you should find that it doesn't appear in the Tone Mapping Persona. Once you've tone mapped the image, you could always convert it to 16-bit (unless you really need it to remain in 32-bit) and the 32-bit preview toolbar will disappear. The only other solution I can suggest right now is to select the Move Tool (directly underneath the Hand Tool) and then tap off somewhere on the canvas outside of the image to deselect it. Hope that helps.
  3. Hi Conrad, you're referring to the Shadows/Highlights adjustment? This is just for tonal compression - instead, try the Filter version which is under the Filters menu (no subcategory). It behaves very similarly to the implementation in the Develop Persona.
  4. Hey Mettsy, apologies if I'm missing something here, but are you just comparing the result you get straight after HDR merging (so no tone mapping)? What steps do you take in Photoshop - do you use 8-bit/16-bit adaptive tone mapping, or do you complete it in Camera Raw? After it's just been HDR merged in Photo, you'd really need to tone map the image before doing anything else. What you're seeing is simply the starting exposure or "point" that Photo has picked - notice the entire image is exposed brighter than the Photoshop result. If you went to View>Studio>32-bit Preview and brought the Exposure slider down by perhaps 0.5 or 1 stop (see the quick sketch of the stop maths below), you might find the result looks more like Photoshop's (don't forget to reset it before doing any further work). The detail that looks washed out is there; the image just needs tone mapping via global compression/local contrast, both of which you can do in the Tone Mapping persona - top left of the interface, fourth icon along. Let me know if this does the trick.
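     Since exposure "stops" come up a lot here, a quick sketch of the arithmetic (Swift, just illustrating the convention - the slider's exact response curve is Photo's implementation detail): each stop is a doubling or halving of linear light.

        import Foundation

        /// Linear-light multiplier for an exposure offset in stops:
        /// +1 stop doubles the light, -1 stop halves it.
        func exposureMultiplier(stops: Double) -> Double {
            pow(2.0, stops)
        }

        print(exposureMultiplier(stops: -0.5)) // ≈ 0.707 - pixel values scaled to ~71%
        print(exposureMultiplier(stops: -1.0)) // 0.5 - half the light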
  5. Hey Ben, Chris asked me to look into it - Motion seems to import PSDs fine here using any of the three presets (I thought preserving editability/accuracy would cause an issue, but apparently not). This is using the latest MAS 1.6.7 release. If you could attach the sample document that's not importing correctly that would be ideal - if not, however, a screenshot of your layers panel would be equally useful (with everything expanded so we can see nested layers etc).
  6. Hello all, just letting you know that I've rolled out the search for Designer help as well now. English and US should be fully searchable, but I don't believe other languages have been indexed yet - I've submitted them, so hopefully within a couple of days you'll be able to search them too. As always, if you discover any issues let me know and I'll endeavour to fix them!
  7. Hi Tom, it's available but has been consolidated to the main assistant options for the iPad version. If you open a document, then tap the document menu (next to the close document button, top left), you'll see Assistant. Tap that, and the tone curve option will be near the bottom, along with the bit depth output option. Hope that helps!
  8. Hi Barry, those are just open documents (I happen to have named them sequentially), you'll get that toolbar any time you have more than one document open. Hope that helps.
  9. Hey again, just checking in to offer a new video - this one focuses on creating an HDR result from one RAW exposure (as opposed to merging bracketed exposures): HDR from one exposure - YouTube / Vimeo @Chinderah Have you tried the online help at https://affinity.help ? It's searchable and printable (just click the print icon on the left-hand menu), and it works well on tablets/phones, so you could have it as a reference whilst you work on your desktop machine.
  10. Hey @KyleG, I didn't get a notification that you had replied, so I'm sorry I didn't see your message sooner. I've investigated and haven't found much that would help so far - I've tested on a couple of TVs and a 4K monitor and there's no overscan on any of them. The app itself is basically a bare-bones template that Xcode provides, and it's unlikely there would be anything in there to dictate display scaling. The majority of the videos are 16:9 - a few older videos may be 16:10, but they should appear pillarboxed.
     When you say other Apple TV apps work fine, are you using any other apps where critical content would be displayed outside of the safe area? Most apps design their UI to be title-safe, and video content follows this convention too (see the sketch below for how those regions are conventionally computed). Screen capture content is more difficult - short of scaling the entire video down and leaving black space, there's not much we can do to accommodate action- and title-safe regions.
     Is there any chance you could take a few pictures of your TV settings and also the Video/Audio settings on the Apple TV menu? Additionally, if you go into the Video/Audio menu and choose Calibrate>Zoom and Overscan, do you definitely see the outer white border near the "Full Screen" text? Finally, what's your HDMI Output set to - is it RGB Low/High or YCbCr? I'm aware some TVs may alter their overscan setting based on the input signal (crudely speaking, they might regard YCbCr as "television" and thus overscan), despite the picture fit setting.
     I'd like to get to the bottom of the issue and see if it's affecting other users as well, so any further information would really be appreciated. Thanks!
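     For anyone curious about those safe regions, here's a minimal sketch (assuming the conventional figures of roughly 93% action-safe and 90% title-safe for 16:9 HD - check the SMPTE/EBU guidance for authoritative numbers): a safe area is just the frame inset symmetrically by a margin fraction.

        import CoreGraphics

        /// Inset a video frame to a centred safe region.
        /// `fraction` is the portion of each dimension kept (0.9 = 90%).
        func safeArea(of frame: CGRect, keeping fraction: CGFloat) -> CGRect {
            let dx = frame.width  * (1 - fraction) / 2
            let dy = frame.height * (1 - fraction) / 2
            return frame.insetBy(dx: dx, dy: dy)
        }

        let frame = CGRect(x: 0, y: 0, width: 1920, height: 1080)
        let actionSafe = safeArea(of: frame, keeping: 0.93) // ~93%: conventional HD action-safe
        let titleSafe  = safeArea(of: frame, keeping: 0.90) // ~90%: conventional HD title-safe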
  11. It's something I'm trialling - I was going to make a quick post about it at some point. From initial tests it seems to do an OK job (better than the in-app search, in fact) - it's currently in Photo English, US and German. If it seems to be functioning well then we'll probably roll it out across all languages and for both apps.
  12. Hi, looks like you have inverted the Background (Hintergrund) layer - instead, make sure you select the HSL layer and go to Layer>Invert. This should invert the HSL layer's mask and allow you to paint back onto it. Hope that helps!
  13. The display acceleration and Metal compute hardware acceleration are two entirely separate things - using Metal for display acceleration just means it's used to present to screen (i.e. the canvas view). It should be faster than OpenGL, but between the final High Sierra beta and the public release something changed and presented issues with the way Affinity's Metal renderer is implemented. It's hopefully something that will be addressed in the future; in the meantime, the OpenGL renderer was tweaked to compensate (it's noticeably faster in 1.6 than 1.5).
     Metal compute is hardware acceleration, and is a back-port from the iPad development, where a Metal implementation was necessary to achieve good performance. In particular, equirectangular projection absolutely flies using Metal compute, often hitting 60fps at 5K resolutions and above. Complex live filters like Twirl and other distortions should also redraw much faster. At the moment, however, it's limited to integrated graphics chips, which you'll typically find either on MacBook models or the 21" iMacs.
     You'll notice enabling Metal compute will use the integrated GPU, but you don't need to check "Use only integrated GPU": Photo can still use the discrete GPU for presenting to screen and the integrated GPU for Metal compute quite separately (the sketch below shows how that split looks at the API level). Hope that clears it up a bit!
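     To illustrate the display/compute split (this isn't Photo's actual code, just a sketch of the standard Metal API it sits on): on a dual-GPU Mac the integrated chip reports isLowPower == true, so compute work can be dispatched to it while the discrete GPU carries on presenting to screen.

        import Metal

        // Enumerate every GPU; on a dual-GPU Mac the integrated chip
        // reports `isLowPower == true`.
        let devices = MTLCopyAllDevices()
        guard let device = devices.first(where: { $0.isLowPower }) ?? devices.first else {
            fatalError("No Metal device available")
        }

        // Compute work queued here runs on the integrated GPU, entirely
        // independently of whichever GPU is presenting the canvas.
        let commandQueue = device.makeCommandQueue()
        print("Dispatching compute on: \(device.name)")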
  14. Hardness at 0% is fine and is desirable for a softer edge (it's specified in the book). It definitely looks like a flow issue; Flow should be set to 100% by default - have you perhaps changed it at some point, or are you using a custom/different brush? If in doubt, switch to the Basic brush category and choose one of the first few brushes (up to the 128px sized brush) - they're all guaranteed to have 100% flow and accumulation. (See the sketch below for how flow builds up coverage.)
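     For context on why low flow looks "weak", here's a common model of flow accumulation (a sketch only - Affinity's exact brush maths isn't something I can vouch for): each dab deposits a fraction of the remaining transparency, so coverage builds towards 100% over repeated dabs.

        import Foundation

        /// Coverage after `dabs` overlapping dabs at a given flow,
        /// assuming each dab deposits `flow` of the remaining transparency.
        func coverage(flow: Double, dabs: Int) -> Double {
            1 - pow(1 - flow, Double(dabs))
        }

        print(coverage(flow: 1.0, dabs: 1)) // 1.0 - full strength in one pass
        print(coverage(flow: 0.3, dabs: 1)) // 0.3 - a single pass looks faint
        print(coverage(flow: 0.3, dabs: 5)) // ≈ 0.83 - builds up with repeated strokes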
  15. Hello all, just to let you know that in conjunction with the latest update, I've re-recorded one video and there are three new videos:
     Opening, Saving & Exporting (updated 22nd February) - Learn how to open images from different sources (Photos app, cloud storage), see how images are auto-saved, and discover how to export and share images - both in common formats (JPEG, PSD) and as Affinity documents. Watch: Vimeo / YouTube
     iPad/Desktop Interworking - Check out a quick and simple workflow to share and edit your work between the iPad and desktop versions of Affinity Photo using Open In Place. Watch: Vimeo / YouTube
     Exporting - Learn how to use the export dialog on the iPad version of Affinity Photo to share your work in a number of different formats. Watch: Vimeo / YouTube
     360 Live Editing - See how to apply Live Projection editing to 360 images and retouch them. Watch: Vimeo / YouTube
     As usual, the list in the first post has been updated. Hope you find them useful!
  16. Hey Kenzor, I'm able to import OpenEXR documents exported from Photo fine here - when you say they won't load in Unreal Engine, do you mean they won't even import? I've tried Radiance HDR and those do fail to import - not sure why though, will have to look into it. Have you tried OpenEXR?
  17. @KyleG overscan is typically a TV issue, not an app issue - have you checked your picture settings? It's usually called Picture Size or Picture Frame, and you'll have options like 16:9 (which typically overscans), Zoom, Fill, etc. There should be one called Just Scan or Fit, which will display the entire frame. Hope that helps.
  18. Hi, don't worry, you're not losing any quality - what you see during the RAW development and subsequent edits in the Photo persona is the full resolution image. It's just the resolution metadata that's incorrect. You can double check this if you develop the RAW file, then go to the Document menu at the top and choose Resize: the X and Y dimensions should match the expected full resolution values. Hope that helps!
  19. Hi Sam, if you follow the links in the Designer and Photo tutorial threads to the Vimeo versions, you can download high quality versions - there's usually a Download button to the left of Share. Given the choice, I'd go for the highest resolution available - this is usually 1080p, 1440p (QHD) or, in some rare cases, 4K UHD. You can also download the "Original" versions, which are the original uploaded video files, but they'll be significantly bigger in file size. If file size and download time aren't a concern, the originals will look very nice. Hope that helps.
  20. Hi again, check your Develop Assistant settings. At some point you've switched to 32-bit output rather than 16-bit. 32-bit uses a linear colour space, which Apple Photos doesn't interpret correctly, so it's causing the mismatch you see when passing the document back (the sketch below shows the gamma encoding a viewer has to apply to linear data). Once you switch back to 16-bit and then open a RAW file from Photos, everything should correct itself (that's my hope, anyway). If you're not sure where the assistant is, check out this video: Bear in mind that you might have to re-open the RAW file once you've changed the setting for it to apply (the above video will need an update when I have time!). Hope that solves it!
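     To show what "doesn't interpret linear correctly" means in practice, here's the standard sRGB encoding (the piecewise IEC 61966-2-1 curve) that a viewer has to apply to linear data - how Apple Photos handles this internally is an assumption on my part, but skipping a transform like this is why linear documents look too dark and contrasty:

        import Foundation

        /// Encode a linear-light value with the standard sRGB transfer curve.
        /// A viewer that skips this step shows linear data far too dark.
        func srgbEncode(_ linear: Double) -> Double {
            linear <= 0.0031308
                ? 12.92 * linear
                : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
        }

        print(srgbEncode(0.18)) // linear mid-grey 0.18 encodes to ≈ 0.46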
  21. Hi Allison (I think?), I see the issue with the first part, where you're using Edit in Affinity Photo. Apple Photos passes Affinity Photo the RAW file, which it opens in Develop from scratch - because of this, all the changes you made in Photos won't carry over. The solution here is to pass the RAW file to Affinity Photo without making any edits first, then make all your edits within Affinity Photo (you don't necessarily have to use Develop - just develop the image, then use adjustment layers in the Photo persona if you wish). Once you save the document and close it, the changes should be reflected correctly. I've tried this workflow and it works fine here (no big differences between Photo and Photos) - just avoid making any changes before opening the image in Photo. Do bear in mind that you lose a lot of flexibility this way: Apple Photos will flatten the document passed back from Photo because it will always be over 16MB in size, so you'll lose your layer structure and non-destructive editing. Hope that helps!
  22. Hi, as firstdefence mentioned above, the best solution is to download the trial and see if it works for you. I can tell you that your specs are still pretty decent - the 13" i7 is, I believe, a 4578U model (or a variant thereof), which is dual core with hyper-threading, so it should run Affinity Photo reasonably well. Because you've got an Intel Iris graphics chip, the first thing I would check is under Preferences>Performance: see if you can enable "Metal compute acceleration". If so, you should find the 360 projection will run incredibly well, and some of the live filters will benefit from Metal compute too. Hope that helps!
  23. Hi Mike, looks like you've got your colour panel set to 16-bit values rather than 8-bit. This means each colour value will top out at 65535 rather than 255 (which explains why you can't match the values given in the book - see the conversion sketch below). To set the values back to 8-bit, click the "burger" icon next to the X and choose 8-bit. I've attached a screenshot to show you. Hope that helps - let me know if you're still running into issues!
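     The conversion between the two scales is simple - a minimal sketch (my own helper for illustration, not anything in Photo):

        import Foundation

        /// Map a 16-bit channel value (0...65535) to its 8-bit equivalent (0...255).
        func to8Bit(_ value16: Int) -> Int {
            Int((Double(value16) / 65535.0 * 255.0).rounded())
        }

        print(to8Bit(65535)) // 255 - full intensity on both scales
        print(to8Bit(32768)) // 128 - mid-range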
  24. Hello all, just reappearing briefly to post a new video! It covers how to achieve the Orton effect in Photo. I've seen various videos and guides that more or less replicate the Photoshop method, but that's a destructive approach, and I wanted to demonstrate a few tweaks to make it non-destructive. Hope you find it useful! Orton Effect - YouTube / Vimeo
  25. Hey Paul, it's a tricky one. The vast majority of monitors, especially lower-end models, tend to ship with a high brightness level for the same reason you've observed: everything looks "punchier". Unless you're working specifically to a medium such as print, you should aim to make your brightness the same as the ambient light level in the room you're in - if it's brighter or darker, you won't perceive tones and detail correctly.
     However, I'd really recommend that you invest in a colorimeter (e.g. the i1Display Pro) - this way you can profile your monitor to particular conditions, including brightness levels. For example, in a typical office environment with overhead lighting, you might calibrate your brightness to 100cd/m2. 120cd/m2 is the typical value given for general office and web use, but it really depends on the environment lighting. What will shock you is just how bright monitors ship by default: most iMac 5K panels I've profiled tend to be around 170 to 180cd/m2 (this is with the automatic brightness control enabled), but I've seen other monitors come in at over 200. I had an old Hazro monitor that was highly rated for photo work, and that was insanely bright to begin with.
     With a colorimeter, you can also profile your display to a colour temperature more accurately. Most of the time you'd profile to D65 (6500K) for office and web use, as well as photo editing, but you can also profile to D55, D50 and other temperatures for print work, proofing, etc. It depends on what you need to do. You will likely find that your monitor has some sort of colour cast, even if it's slight - the 2014 iMac I use shipped with a horrible green tint, and I recently profiled a 2015 model that had a blue cast.
     To give you an example, I typically create two profiles for my photo editing, at D65 and D50, and I keep my brightness at 80cd/m2 because of dim lighting conditions. I stick with D50 most of the time (it also reduces my eye strain because it's warmer), but toggle between that and D65 to sanity check my work. I'll often create a third profile based off the office's ambient temperature (profiling software allows you to take a measurement from the colorimeter) - this is for printed work, where I want a closer idea of how it will look when printed and viewed under the same lighting conditions.
     So, a bit of a ramble, apologies... At the very least, I would recommend making sure your monitor's brightness looks "level" with the room lighting, then work from there. I would definitely recommend looking at a colorimeter though, because that way you can ensure you've taken steps to standardise your working conditions - that's all you can do, really. Different panels, different devices - they can all have varying temperatures, colour casts and brightness levels, and you'll drive yourself mad trying to satisfy every scenario. That said, if you have devices you can test on (such as phones, tablets, other monitors), all the better. Hope that helps somewhat!