Lamont

PPI, not DPI, and numerical entries


In the Resize Document window there is DPI listed as an image parameter. Since DPI is relevant only to printer settings and has absolutely nothing to do with image size, I don't know why DPI shows up instead of PPI in the Resize Document window. I hope I don't have to elaborate on the fundamental differences between PPI and DPI and the proper respective usage.

 

I also have an issue with numerical data entry into the pixel box under Resize Document. Although you already set the unit in its own field under the resolution window, the latter still has "px" etc. in it. The "px" trips me up when I want to type a numerical value. Also, since I usually resize multiple images to the same size, it would be nice to either be able to make a custom preset or to have the program remember the last used settings. I don't want to have to type the same numerical values for each image I scale.
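For what it's worth, the arithmetic behind that dialog is simple enough to sketch. This is a hypothetical Python illustration (none of these names come from Affinity) of how a remembered preset is just stored values reapplied to each image:

```python
# Hypothetical sketch, not Affinity's API: the Resize Document maths is
# just pixels = inches * ppi, and a "last used" preset is stored values.

def px_to_inches(px, ppi):
    """Physical size implied by a pixel count at a given density."""
    return px / ppi

def inches_to_px(inches, ppi):
    """Pixel count needed for a physical size at a given density."""
    return round(inches * ppi)

# Remembering the last-used settings means each image gets the same
# values without retyping them:
last_used = {"width_px": 1920, "height_px": 1080, "ppi": 300}

def resize_params(preset):
    return (preset["width_px"], preset["height_px"])

print(inches_to_px(6.4, 300))   # 1920
print(px_to_inches(1920, 300))  # 6.4
```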

 

The same issue goes for the USM. Since I tend to use the same USM settings for at least several images, I don't want to have to set radius and factor each time I sharpen an image. At least make the program remember the last used setting, if a custom preset is too difficult.
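To make the preset idea concrete: an unsharp mask boils down to original + factor * (original - blurred), so a reusable preset is nothing more than a stored radius and factor. A toy 1-D sketch (my own code, not Affinity's filter):

```python
# Toy illustration of unsharp masking (USM) on a 1-D signal; "radius" is
# a box-blur half-width and "factor" the sharpening strength. This is
# not Affinity's actual implementation.

def box_blur(signal, radius):
    """Average each sample with its neighbours within `radius`."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, radius, factor):
    """sharpened = original + factor * (original - blurred)."""
    blurred = box_blur(signal, radius)
    return [s + factor * (s - b) for s, b in zip(signal, blurred)]

# A stored preset applied identically to several "images":
preset = {"radius": 1, "factor": 0.8}
edge = [0, 0, 0, 10, 10, 10]  # a soft edge
print(unsharp_mask(edge, **preset))  # overshoot appears either side of the edge
```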

 

Thank you for listening to my gripes!


Hi Lamont,

 

This hasn't fallen on deaf ears. I'm afraid the timing has been unfortunate and I can only apologise. I have personally had a few health problems, and the only other team member in a position to action the request to change DPI to PPI is on holiday, so I have only seen this for the first time now, sorry. With regards to that specific request, we have deliberately used DPI and it is not an error, although it is arguable for the case where we describe pixel density. The more specific term should read PPI, you are quite correct, but it is also commonly referred to as DPI even for screens (Apple's own high-resolution guidelines only refer to it as DPI, for example) and it is quite common and universally understood. It would need to change to say DPI/PPI each time the units changed between screen and physical types, and that would not further enhance anyone's understanding of what was meant; in fact it may detract for many users. We also show items on screen at physical size, with screen density corrections, so it does indeed have an actual physical meaning in our program.
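In concrete terms, the physical-size-on-screen behaviour described above reduces to a small calculation. A sketch with my own function names, not Serif's code:

```python
# Sketch of "show at physical size with screen density correction": a
# document's pixel width plus its PPI gives a physical width, which a
# known screen density maps back to device pixels. My own arithmetic,
# not Serif's code.

def physical_inches(image_px, image_ppi):
    return image_px / image_ppi

def screen_px_at_physical_size(image_px, image_ppi, screen_ppi):
    return round(physical_inches(image_px, image_ppi) * screen_ppi)

# A 3000 px image at 300 PPI is 10 inches wide; reproducing that
# physical width on a 110 PPI monitor takes 1100 device pixels:
print(screen_px_at_physical_size(3000, 300, 110))  # 1100
```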

 

With respect to your other suggestions, I think we are trying to work through things like that right now, so I'll make sure to mention it to the team when I am back in the office :)

 

Thanks again - and sorry to not have replied sooner :)

Matt


I also feel strongly about the PPI vs DPI issue. I don't understand why you chose Dots over Pixels. AD & AP define image sizes in Pixels, not Dots, so why jump to dots when using an abbreviation?

 

It feels a bit lazy to use DPI just because it's more commonly used than the correct term PPI. These are professional tools and should use the correct, professional terminology. Leave the common-place "slang" for consumer products.

 

I took a 5-day Photoshop course 10 years ago, and the first day was dedicated just to basic terminology like resolution, size, dimension, etc. We couldn't learn the app until we understood the basic principles, and I find today's graduates don't know them and make terrible mistakes preparing files for printing; interchanging DPI and PPI just causes confusion, handicapping them further. We don't call rectangles blocks, or strokes lines… or pixels dots. Ever seen a "16 megaDOT" digital camera? You don't get a phrase more "consumer-ish" than that.

 

If DPI is used for describing the resolution of a monitor… well, that's a bit of a debatable term, because the diodes that represent pixels are roundish (kind of pill-shaped), and since DPI is a hardware term, it's not wrong. Three diodes (RGB) create an optical illusion we perceive as a pixel. I'll really put my head on the block here and say: there are no physical pixels on a screen.

 

 

For the uninitiated who would like to know...

 

PPI: Pixels per inch (resolution of raster files - images)

DPI: Dots per inch (resolution of hardware devices like printers & scanners)

LPI: Lines per inch (the number of rows of printed dots of ink per inch on paper)

 

If PPI is low, you get pixelation

If DPI is low, you get banding

If LPI is low, you see the ink dots on the paper

 

Pixelation can be fixed by reducing the LPI

Banding can be fixed by reducing the LPI

LPI can be increased by printing onto a paper that produces less bleed / dot gain.
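The trade-offs in that list follow from two common prepress rules of thumb (my summary, not anything from the software under discussion): a printer can render roughly (DPI/LPI)² + 1 tone levels per halftone cell, and an image needs roughly 1.5 to 2 times the LPI in PPI. A quick sketch:

```python
# Common prepress rules of thumb relating DPI, LPI and PPI.

def gray_levels(dpi, lpi):
    """Approximate tone levels per halftone cell: (DPI/LPI)^2 + 1."""
    return int((dpi / lpi) ** 2) + 1

def recommended_ppi(lpi, quality_factor=2.0):
    """Rule-of-thumb image resolution for a given screen ruling."""
    return round(lpi * quality_factor)

# Why lowering LPI fights banding: a 600 DPI printer at 150 LPI renders
# only 17 tone levels, but at 85 LPI it renders 50.
print(gray_levels(600, 150))   # 17
print(gray_levels(600, 85))    # 50
print(recommended_ppi(150))    # 300
```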

 

 

… Yes, I know I'm an anal dooshbag on this topic.

 

(and to anyone who read right to the end, thank you for humoring me)


Lamont… you are my hero!

 

Please enlighten me on the second part of your post… what does USM stand for? I haven't quite followed your argument because of this.

 

Thanks.


Actually, when speaking of scanning, I have heard it referred to as SPI, which is samples per inch :)

 

Interesting. I haven't heard the term, but I have seen many consumer scanners that produce a scan of a higher resolution than their DPI should allow. I always thought they were just upsampling the image post-scanning. Perhaps what they are doing is overlapping scan passes to "figure out" what more pixels should be. It could be a better "universal" term, since some scanners have different horizontal resolutions compared to their vertical resolutions, something that always confused me… perhaps they are scanning rectangular pixels which are then chopped up through multiple passes?

 

Any scanner salesmen or old-school repro scan operators out there who could explain this to me?


Quoting Matt's reply above: "…we have deliberately used DPI and it is not an error … it would not further enhance anyone's understanding of what was meant."

I feel this was a poor choice, and I'd very much like to see it changed.


Quoting the post above: "I also feel strongly about the PPI vs DPI issue. … (and to anyone who read right to the end, thank you for humoring me)"

No, don't criticize yourself. This is a great post, and I also feel strongly about it. As of the time of this post, the issue has not been fixed. I think professional software using professional terminology is a great point, and the fact that pixels equate to digital information storage, not to the way printers or even screens deliver information, is also a great point.


NobleValerian,

 

Stirring up this many old threads is not the right way to behave in a forum. I suggest that you link to these threads from one new post in one of the current threads. It's more work but means that the conversation is kept in one place. 

 

You are flooding, which is exactly what you said you didn't want to do.


Patrick Connor

Serif (Europe) Ltd.


Quoting Matt's reply above: "With regards to that specific request, we have deliberately used DPI and it is not an error …"

 

You are joking, right?! It's a damn error AF. This is a fundamental error!

 

 
As simple as possible:
PPI (point per inch) - we use this when we talk about image resolution.
DPI (dot per inch) - we use this when we talk about screen frequency.
LPI / LPC (lines per inch / lines per centimeter) - we usually use these units when we refer to the screen frequency set on the RIP (Raster Image Processor).


 

Quoting the reply above: "You are joking right?! … As simple as possible: PPI … DPI … LPI / LPC …"

Thanks for your simplified explanation; however, I can assure you that nobody was in any doubt in the first instance. Can I ask how many users have been unable to understand what we were trying to convey? Can I also ask how many users would instead have been confused if we had mixed DPI and PPI (whichever was strictly correct in the UI at that time)? I think you'll find that actual people trying to use the software are not left confused by why it says DPI all the time, whereas they may have been confused if it sometimes said DPI and sometimes said PPI, even though that may be more correct. My case in point was actually cited in the Apple documentation, which was also incorrect but in no way ambiguous to a reader.


I think this issue can be chalked up to people either not knowing the difference between dpi and ppi or people knowing better but so used to hearing the incorrect usage that they understand the intent. It's like if a recipe calls for 3 tablespoons of sugar, many people will say 3 tablespoon fulls when the correct way to say it is 3 tablespoons full.

 

BTW MrFlexo, PPI is actually pixels per inch, not points per inch, when referring to resolution, if you wish to be proper :)


I think this issue can be chalked up to people either not knowing the difference between dpi and ppi or people knowing better but so used to hearing the incorrect usage that they understand the intent. It's like if a recipe calls for 3 tablespoons of sugar, many people will say 3 tablespoon fulls when the correct way to say it is 3 tablespoons full.

 

The correct way to write it is actually "3 tablespoonfuls", which sounds like "three tablespoon fulls" when you say it. ;)


Alfred
Affinity Designer 1.6.5.123 • Affinity Photo 1.6.5.123 • Windows 10 Home (4th gen Core i3 CPU)
Affinity Photo for iPad 1.6.9.81 • Affinity Designer for iPad 1.6.3.44 • iOS 12.0.1 (iPad Air 2)


Hi,

 

I've been into photography for 40 years, and am now teaching my daughter (doing design in college) photography, and teaching my wife how to design logos etc. for herself; I bought the two Affinity products as I've always loved Serif software and the new Affinity range (well, new to Windows!) seems to cover my two needs so well.

 

However, having already taught my other daughter (in college doing animation) about DPI / LPI / PPI, I am pretty disappointed that I have to now explain that - in Affinity - the program I am recommending they use has a stupid mistake right at the beginning - namely, asking for the DPI of a new file!

 

Over and over again I am teaching them to think for themselves and giving them the knowledge so that they can make their own way forwards ... it really doesn't help them to understand things when I explain something correctly and then have to go through with them how the SOFTWARE is wrong ... I *refuse* to teach them the mistakes people make until they've learnt and have understood the correct facts first! And, I don't want to take the easy way out and keep such mistakes going, promoting further misunderstandings in the future.

 

PLEASE. Stop trying to pander to the post-truth people, and be brave - use the correct term, PPI.

 

Gary


...

However, having already taught my other daughter (in college doing animation) about DPI / LPI / PPI, I am pretty disappointed that I have to now explain that - in Affinity - the program I am recommending they use has a stupid mistake right at the beginning - namely, asking for the DPI of a new file!

...

 

 

Well, in 1990 when I was teaching my boys how to use vector applications, image editing applications, and 3D software, I also taught them that various applications use the terms interchangeably in their interfaces.

 

My oldest son hasn't really used a computer since those days, with the exception that he started using a tablet recently. He's now 41 and was over here this morning. I asked a simple question about "image resolution" and he still knew the terms and how they can be interchangeable.

 

If I were making a point, it would be that once someone has been taught (by forums, by the help system in an application, or by actual instruction) they shouldn't be confused by how the terms are used in various applications.

 

And Serif is considering this change. I personally would welcome it, but my use of their software will not change one iota. It's simply not dependent on a label.

 

Mike


My computer is a nothing-special Toshiba laptop with unremarkable specs running Windows 10 64-bit.

