roadcone Posted April 11

This is not an Affinity question, so I apologise and suggest that you can stop now if you wish. I cannot find an answer to the question: what determines the dpi of an image? Now, I don't mean the resolution divided by the size I want the image to appear; loads of websites think that is the answer (and so does Google AI).

Here's the scenario: I take essentially the same photo with my camera and my phone, same place, very roughly the same view and more or less the same resolution (so although the sensor sizes may be physically different, they have a similar number of photosites). My camera provides that image at full resolution at 300dpi and the phone at full resolution at 72dpi. Two very similar 20MP images, different dpi by a factor of four. Why? Is the phone doing some smoke-and-mirrors trick where, really, it is a quarter the size, but by changing the dpi it grows in size and helps the marketing? At what point does the dpi come into existence, and what characteristic causes that to occur?

Thank you for reading this far on an OT topic.

Clive
GarryP Posted April 11

In essence, DPI can be thought of as a ‘serving suggestion’. It has no effect on the image itself; it’s just there for the creator of the image (the camera, the software, the user, etc.) to suggest how it should be displayed. It’s the same number of pixels no matter what the DPI is. A 3000 DPI image containing one million pixels contains the same information as a 10 DPI image with the same number of pixels (as long as those pixels are the same colours).

Some consumers of images (stock websites, software, etc.) might have specific requirements for DPI, but that’s for you to determine as needed. If you don’t care what the DPI is, then you can just ignore it.
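To make that concrete, here is a minimal sketch using Python with the Pillow library (one possible tool; nothing in the thread depends on it). Two files that differ only in their DPI tag decode to exactly the same pixels:

```python
from PIL import Image

# Two identical images built from the same 1000 x 1000 pixel data.
img = Image.new("RGB", (1000, 1000), "white")

# Save with very different DPI tags; the pixel data is untouched.
img.save("photo_300dpi.jpg", dpi=(300, 300))
img.save("photo_72dpi.jpg", dpi=(72, 72))

a = Image.open("photo_300dpi.jpg")
b = Image.open("photo_72dpi.jpg")

print(a.size, a.info.get("dpi"))  # (1000, 1000) with a DPI tag around (300, 300)
print(b.size, b.info.get("dpi"))  # (1000, 1000) with a DPI tag around (72, 72)

# Pixel for pixel, the two files decode to the same image.
print(list(a.getdata()) == list(b.getdata()))  # True
```

The DPI tag is just a number written into the file header; only the act of printing (or placing the image at a physical size) gives it any meaning.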
R C-R Posted April 11

Please consider studying https://affinityspotlight.com/article/understanding-dpi/ for a comprehensive treatment of what DPI means.
roadcone (Author) Posted April 11

GarryP: I think I now see my misconception. Thank you. I have a reasonable camera, and when I process RAW the JPEG output is 300dpi; I never get JPEG straight from the camera. My phone only delivers JPEG at 72dpi, so I made the thought-leap that cameras deliver 300 and phones 72.

Previously my output was for screen. Now I am doing something in Publisher for another person. Their printer has told them to check that images are a minimum of 300, and a good number of their images are high-resolution phone images at 72dpi, so they are concerned these will not be good enough to print. An awful lot more are photos from the 1930s and 1940s which, despite being scanned at 600, are from tiny originals. It took me ages to persuade them that their 72dpi images come out at 40-something inches, and that by the time one is reduced to sit on a quarto page with text down the side, it will be a lot more than 300. But they then insisted on checking each of their images and telling me which ones were unsuitable. Meanwhile, their tiny images at 600 are likely to be unsuitable precisely because they are tiny, soft and, in many cases, scratched. This completely side-tracked my thought process.

R C-R: I will follow that link, as this is a subject I had not previously needed; when using images for screens you can pretty much ignore the issue. Thank you.

Clive
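For anyone doing the same persuading, the check is one line of arithmetic: the DPI that matters is pixel count divided by printed size. A small sketch in Python with illustrative numbers (the 4000 px width, 5 in placement and 1.5 in original are assumptions, not figures from the thread):

```python
def effective_dpi(pixels: int, print_inches: float) -> float:
    """The DPI that actually matters for print: pixel count / printed size."""
    return pixels / print_inches

# A phone photo tagged 72 dpi, 4000 px wide (illustrative figure):
print(f"'native' width at 72 dpi: {4000 / 72:.0f} in")  # ~56 in

# Placed 5 in wide on a quarto page, it is effectively:
print(f"placed at 5 in wide: {effective_dpi(4000, 5):.0f} dpi")  # 800 dpi

# A 1.5 in-wide 1930s print scanned at 600 dpi yields only 900 px;
# enlarged to 5 in wide, it drops to:
print(f"tiny scan at 5 in wide: {effective_dpi(900, 5):.0f} dpi")  # 180 dpi
```

The same arithmetic explains both halves of the problem: the 72dpi phone images end up well over 300 once reduced, while the 600dpi scans of tiny originals fall well under it once enlarged.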