
Resolution is reduced in Photo2.


kuhl


Hello.

I am a fan of the Affinity tools.
I shoot with a Sony a7III.

The a7III's pixel count is 6000 x 4000, but when I load a RAW file in Photo 2 it changes to 6024 x 4024.
(That is no longer an exact 3:2 aspect ratio.)

I am puzzled by this apparent loss of resolution.
Is there any way to improve it?

Thanks.


Isn’t it actually a higher resolution? Perhaps the camera trims the sensor output and Affinity sees the full sensor.

M1 iPad Air 10.9/256GB, iPadOS 17.1.1, Apple Pencil (2nd gen).
Affinity Photo 1.10.5, Affinity Designer 1.10.5
Affinity Publisher 2, Affinity Designer 2, Affinity Photo 2 and betas.

Official Online iPad Help documents (multi-lingual) here: https://affinity.help/

 


  • James Ritson (Staff)
8 hours ago, kuhl said:

The a7III's pixel count is 6000 x 4000, but when I load a RAW file in Photo 2 it changes to 6024 x 4024. [...] Is there any way to improve it?

Hi kuhl, as DM1 mentioned above, this is not a loss of resolution: you will find that most RAW files contain a handful of extra pixels around the edges, but these are normally cropped away as per the resolution mandated in the metadata. Photo ignores this, however, and always processes the full width and height of the image, which is why you end up with a marginally higher resolution than you would expect. The aspect ratio is still 3:2.

A good example of this is the Olympus cameras: e.g. the E-M1 mk3's actual RAW resolution is 5240x3912, but the metadata instructs software to crop it to 5184x3888. It doesn't sound like much, but the extra resolution is actually quite welcome when dealing with wider angle lenses. If Photo applies automatic lens corrections, you can actually go into the Lens tab and bring Scale down—you'll often find there is pixel data being pushed outside of the crop bounds that you can bring back, and this is made partially possible by Photo not cropping those edge pixels.
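
If you want to see those extra edge pixels for yourself outside Affinity, here is a minimal sketch using the third-party rawpy library (LibRaw bindings). The filename is a placeholder, and LibRaw's idea of the "active" area won't necessarily match Photo's output exactly, but it shows the same principle of the sensor data being larger than the nominal image size:

```python
# Minimal sketch using third-party rawpy (LibRaw), not Affinity's own pipeline:
# compare the full sensor data with the active image area LibRaw reports.
import rawpy

with rawpy.imread("DSC00001.ARW") as raw:       # placeholder filename
    s = raw.sizes
    print("full sensor data  :", s.raw_width, "x", s.raw_height)
    print("active image area :", s.width, "x", s.height)
    print("margins (top,left):", s.top_margin, ",", s.left_margin)
```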

Hope that helps!

Product Expert (Affinity Photo) & Product Expert Team Leader

@JamesR_Affinity for tutorial sneak peeks and more
Official Affinity Photo tutorials


1 hour ago, James Ritson said:

[...] this is not a loss of resolution [...] Photo ignores this, however, and always processes the full width and height of the image, which is why you end up with a marginally higher resolution than you would expect. [...]

 

Thank you for your answer.

I understand now that the image is not stretched.
Problem solved, thank you very much!


10 hours ago, kuhl said:

So maybe I felt the resolution was reduced.

The V1 pic looks sharper to me. Are these screenshots of the AP RAW view, or exported photos? Can you attach the RAW file here?


On 1/20/2023 at 5:50 AM, James Ritson said:

[...] If Photo applies automatic lens corrections, you can actually go into the Lens tab and bring Scale down: you'll often find there is pixel data being pushed outside of the crop bounds that you can bring back [...]

Taking wide-angle photos with one of my cameras, a Canon PowerShot G7X, after editing the RAW files with Affinity Photo I recover the edges by a total of about 8% (reducing the Scale from 100% to 92%). I have a lot of examples with this little camera. With the Sony A6000 I recover about 2% at the edges, and with the Canon 6D I still have to try.

(I am not sure I have worked out correctly how much of the photo I recover at each edge, but I do reduce the Scale from 100% to 92%; see the rough calculation below.)
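
As a rough sanity check of those percentages, here is a back-of-the-envelope sketch that assumes the Scale slider is a plain linear zoom about the image centre (which may not be exactly how the correction works internally):

```python
# Rough estimate of the extra field revealed by lowering Scale from 100% to 92%,
# assuming the slider is a plain linear zoom about the image centre.
scale = 0.92

extra_linear = 1 / scale - 1        # extra visible width/height overall (~8.7%)
extra_per_edge = extra_linear / 2   # split between two opposite edges (~4.3%)

print(f"extra linear field overall: {extra_linear:.1%}")
print(f"extra field per edge      : {extra_per_edge:.1%}")
```

So under that assumption, a Scale of 92% corresponds to roughly 8-9% more linear field in total, or a bit over 4% at each edge.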

recover_borders_affinity_photo_Canon_Powershot_G7x.jpg


On 1/20/2023 at 12:16 PM, kuhl said:

So maybe I felt the resolution was reduced.

 

No, Affinity does not "touch" the resolution, unless you actively crop it off, change it indirectly via lens correction, or actively resize or copy/paste into other apps.

The perceived sharpness of an image comes predominantly from sharpening (unsharp mask, high-pass filter, refinement), and your edits in V1/V2 may use different settings. (The sketch below shows what a basic unsharp mask does.)
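
As a generic illustration of what such sharpening does (a minimal unsharp-mask sketch using Pillow and NumPy; not Affinity Photo's actual implementation, and the filenames are placeholders):

```python
# Generic unsharp mask: sharpened = original + amount * (original - blurred).
# Illustration only; not Affinity Photo's actual algorithm.
import numpy as np
from PIL import Image, ImageFilter

def unsharp_mask(path, radius=2.0, amount=0.8):
    img = Image.open(path).convert("RGB")
    blurred = img.filter(ImageFilter.GaussianBlur(radius))
    a = np.asarray(img, dtype=np.float32)
    b = np.asarray(blurred, dtype=np.float32)
    out = np.clip(a + amount * (a - b), 0, 255).astype(np.uint8)
    return Image.fromarray(out)

unsharp_mask("export_v2.jpg").save("export_v2_sharpened.jpg")  # placeholder filenames
```

Different radius/amount settings (or Affinity's clarity and detail refinement sliders) change how crisp the same pixels look without changing the resolution at all.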

Next, on Apple devices it makes a big difference whether you use Apple RAW or Serif RAW (selectable in the Development Assistant).

Using "Lens Correction" can introduce slight blurriness as it stretches some pixels, mostly in the outer parts (not the centre).

Using "CA correction" may improve perceived sharpness a tiny bit.

 

So if you want to compare 2 images, please include (upload) all relevant settings used in editing:

  1. RAW engine used (Serif or Apple)
  2. OS of your device (Apple RAW results depend on OS version)
  3. Development Assistant (RAW engine, tone curves, ...)
  4. Basic panel (clarity has a huge impact)
  5. Details panel (refinement is essentially sharpening, noise reduction can reduce sharpness)
  6. Lens panel (CA, lens correction)

Using the right adjustments and slider settings will give you a result with the same sharpness as other apps.

There are some camera/lens combinations which are not imported correctly, but these are rare cases and your files are probably not affected.

Mac mini M1 A2348 | Windows 10 - AMD Ryzen 9 5900x - 32 GB RAM - Nvidia GTX 1080

LG34WK950U-W, calibrated to DCI-P3 with LG Calibration Studio / Spider 5

iPad Air Gen 5 (2022) A2589

Special interest in the procedural texture filter, editing alpha channels, RGB/16 and RGB/32 color formats, stacking, finding root causes for misbehaving files, finding creative solutions for unsolvable tasks, and finding bugs in apps.

 


On 1/20/2023 at 5:16 AM, kuhl said:

I understood that 6024*4024 is the number of pixels before camera trim.
Maybe that's why I feel the edges are sharper in Photo v1 than Photo v2.
*Photo v1 is 6000*4000.
So maybe I felt the resolution was reduced.

527B3984-6015-47F4-801F-618881E3FC54.jpeg

4604A114-D3D2-4BA2-9A8C-312133CE9645.jpeg

Are these exports of the images from Photo V1/V2, or screenshots of the Affinity viewport? Same zoom levels? Is it the exact same RAW file loaded into the Develop Persona, with absolutely no changes afterwards? Otherwise we can't really say for sure why things look different. I haven't noticed my RAWs loading differently (Nikon D5600 here). I never really used Affinity's RAW editor in V1, but now that it has non-destructive development I'm much keener to use it and am using it more regularly.
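
One crude but common way to take the eyeballing out of such a comparison is the variance of the Laplacian (more edge contrast gives a higher value). A sketch using OpenCV follows; the filenames are placeholders, and it is only meaningful for exports of the same scene at the same pixel dimensions:

```python
# Crude sharpness comparison of two exports via variance of the Laplacian.
# Only meaningful for the same scene exported at the same pixel dimensions.
import cv2

def sharpness(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

print("V1 export:", sharpness("export_v1.jpg"))   # placeholder filenames
print("V2 export:", sharpness("export_v2.jpg"))
```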


I have literally developed thousands of photos (mostly Canon CR2) with different apps (Canon DPP, Photo V1, Photo V2, Darktable, ...).

There are many differences in UI and presets, affecting colors and tones dramatically, but sharpness was never an (unsolvable) issue. I prefer Apple RAW as more cameras/lenses are supported and the results are more in line with out-of-camera results. For critical images I only use Canon DPP, as its results are superior in certain relevant details:

  • Lens correction for Canon lenses (I have 3rd-party lenses too, which get ignored by DPP)
  • CA correction
  • De-fringing is so much better. It just works (no parameters to adjust). Neither Apple nor Serif RAW delivers anything close.
  • Canon Picture Profile presets occasionally deliver just the right grading I want. Achieving the same result in Photo is much more time-consuming - you can't get an exactly identical result, but this does not matter for me when I edit a photo "to taste".
  • Last time I edited about 250 images in Nov/Dec with Photo, it crashed (actually hung forever), reproducibly every 6-8 images. So I had to close Photo after 5 images at the latest to avoid losing my edits. That changed my workflow back to developing in DPP and exporting as TIFF for later edits in Photo.


Dear All.

Sorry for the delay.
Thank you for all the replies.

The device I am using is an iPad Air (3rd generation).

The iPadOS version is '16.2'.

The version of Photo V1 is 1.10.7.
The version of Photo V2 is 2.0.3.


Regarding the two images I posted earlier:

I reinstalled both apps.
Both apps' settings were reset to defaults.

The RAW pixel count is different in the two apps.
(I now understand the reason for this.)

  • Photo V1: 6000 x 4000
  • Photo V2: 6024 x 4024

Neither app applied lens correction.
The RAW was saved as JPEG without retouching.
The screenshots were taken zoomed in to the same magnification.

Is this problem caused by the iPad?
Or will it be resolved when Affinity Photo is updated?

* These are enlarged photos of part of a building owned by someone else, so I am not sure whether I can upload the RAW.

* My English is not good, so I might say something rude. I'm sorry for that.

 

Thanks.


10 minutes ago, kuhl said:
  • Photo V1: 6000 x 4000
  • Photo V2: 6024 x 4024

 

This lets me assume that V1 used Apple RAW, as Serif RAW normally always gives the full resolution, both in V1 and V2. That could explain some of the differences.
Those pixels are simply cut off; the image is not rescaled.

If you can't upload a RAW file, simply use this one so we can all work from the same file.

_MG_6281.CR2


Please post screenshots with the Develop Assistant open, like these:
[Screenshot 2023-01-11 at 18.30.42] [Screenshot 2023-01-11 at 18.30.06]


1 hour ago, NotMyFault said:

[...] If you can't upload a RAW file, simply use this one so we can all work from the same file. [...]

 

Thank you for your kindness.
I will try it as soon as I get home.

 

I will answer your questions.

  • RAW engine used: Serif.
  • Sorry, what is 'CA correction'?

 

Thanks.


38 minutes ago, kuhl said:

Sorry, what is 'CA correction'?

Chromatic Aberration, see https://en.wikipedia.org/wiki/Chromatic_aberration

In optics, chromatic aberration (CA), also called chromatic distortion and spherochromatism, is a failure of a lens to focus all colors to the same point.[1] It is caused by dispersion: the refractive index of the lens elements varies with the wavelength of light.
https://en.wikipedia.org/wiki/Chromatic_aberration#/media/File:Chromatic_aberration_lens_diagram.svg
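
To give a rough idea of what a lateral CA correction does: the red and blue channels are rescaled very slightly about the image centre so that the coloured fringes at the edges line up with the green channel again. Below is a toy sketch with NumPy, SciPy and Pillow; the scale factors are invented for illustration, and real RAW converters use per-lens calibration data instead:

```python
# Toy lateral-CA correction: nudge the red and blue channel scales so coloured
# fringes at the edges line up with the green channel again.
# Illustration only; the scale factors are invented, real converters use
# per-lens calibration data. Filenames are placeholders.
import numpy as np
from PIL import Image
from scipy.ndimage import zoom

def rescale_about_centre(channel, factor):
    """Scale a 2-D channel about its centre, then crop/pad back to the original size."""
    h, w = channel.shape
    scaled = zoom(channel, factor, order=1)
    sh, sw = scaled.shape
    if factor >= 1.0:                       # scaled up: crop the centre back out
        t, l = (sh - h) // 2, (sw - w) // 2
        return scaled[t:t + h, l:l + w]
    out = np.zeros_like(channel)            # scaled down: pad back to size
    t, l = (h - sh) // 2, (w - sw) // 2
    out[t:t + sh, l:l + sw] = scaled
    return out

img = np.asarray(Image.open("export.jpg").convert("RGB"), dtype=np.float32)
corrected = np.stack([
    rescale_about_centre(img[..., 0], 1.0005),  # red channel, invented factor
    img[..., 1],                                # green channel left untouched
    rescale_about_centre(img[..., 2], 0.9995),  # blue channel, invented factor
], axis=-1)
Image.fromarray(np.clip(corrected, 0, 255).astype(np.uint8)).save("export_ca.jpg")
```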

 


20 hours ago, NotMyFault said:

Chromatic Aberration, see https://en.wikipedia.org/wiki/Chromatic_aberration [...]

 

 

Thank you so much.

I see, you meant chromatic aberration correction.
When importing RAWs, should I just upload the V1 and V2 screenshots?
Or should I upload the RAWs converted to JPEGs for each of V1 and V2?


2 hours ago, kuhl said:

Or should I upload the RAWs converted to JPEGs for each of V1 and V2?

Only the RAW files would allow us to try to reproduce the issue you observed.

