
I'm making my letter on Twitter and Facebook "open" because I'd really like to see this addressed. I believe the issue at hand is closer to a "bug", since the correct term is PPI, not DPI. You've responded to this issue in the forums, but I consider it an error that many users would like corrected. Please consider making this change. Thank you.

"Respectfully,
 
I believe ignoring the significant difference between PPI and DPI is a huge mistake, and an irresponsible choice for a company that does what yours does.
 
I understand you feel this is addressed. However, if the vibe I'm getting is correct, you feel that so many people misuse the term DPI that it's not worth the effort to address the questions and concerns a change would bring. I think that's lazy. A couple of lines in an FAQ that you could link to would take care of it.

As far as professional work goes, and the effect these properties have on the final outcome of a project, these differences are far too significant to ignore. If I create a document at 300 PPI, and then print it at 300 DPI and at 1200 DPI, the difference is immediately obvious. With the exact same pixel data, the document printed at 1200 DPI is significantly higher quality.

If I create a document at 1200 PPI, with 4 times the linear pixel density (16 times the pixel data), the quality difference between a 300 DPI and a 1200 DPI print is also immediately apparent. Again, the 1200 DPI print is far higher in quality, contrast, clarity of detail, color accuracy, and intensity.

However, if I look at two images printed at 1200 DPI, one from a 300 PPI file and one from a 1200 PPI file, the difference is almost completely indiscernible. The same is true printed at 300 DPI. In spite of one file having 16 times the pixel data in the same area, the difference between the two printed images is almost impossible to discern. Yet two images printed at different DPI from the exact same digital file are *easily* distinguishable from each other.

I'm an artist and designer, and I don't personally deal with printing images very often. That said, after 8 years of doing this work, the difference between DPI and PPI has always been simple and clear, and I don't think there's a legitimate reason to use them interchangeably, especially given how differently each one weighs on the final outcome.

I've waited a long time for you guys to come to Windows so I could be done with Adobe, and seeing DPI every day instead of PPI won't change that. I just think the responsible thing to do is set an example, use the correct term, and help clarify that there is a difference between the terms. From what I've seen in the forums, a lot of your users understand the difference, and it frustrates them too. It's a small change, but it could have a big, positive impact: the users frustrated by it no longer have to be, and the people who don't understand the difference can start to. Please consider this.

Thank you for your time."
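The arithmetic behind the letter's print comparisons can be sketched in a few lines of Python. This is an illustrative sketch only; the helper names are made up for this example and have nothing to do with Affinity:

```python
# PPI describes pixel density in the digital file; DPI describes how many
# ink dots the printer lays down per inch of paper. These helpers are
# hypothetical, purely to illustrate the distinction drawn above.

def print_size_inches(pixels_wide: int, pixels_high: int, ppi: float):
    """Physical print size when the file's pixels are mapped at `ppi`."""
    return pixels_wide / ppi, pixels_high / ppi

def dots_per_pixel(printer_dpi: float, file_ppi: float) -> float:
    """Printer dots available to render each image pixel (linear)."""
    return printer_dpi / file_ppi

# A 2400 x 3000 px file at 300 PPI prints at 8 x 10 inches:
print(print_size_inches(2400, 3000, 300))   # (8.0, 10.0)

# The same 300 PPI file rendered at 300 DPI vs 1200 DPI:
print(dots_per_pixel(300, 300))    # 1.0 dot per pixel
print(dots_per_pixel(1200, 300))   # 4.0 dots per pixel: finer ink placement
```

The same file gains quality at a higher print DPI because the printer has more dots with which to dither each pixel's colour, which is exactly why print DPI and file PPI behave so differently in the letter's examples.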


As we explained to you (briefly) on Twitter:

 

"This has been brought up a few times in the past you can see our response regarding this here :)  https://forum.affinity.serif.com/index.php?/topic/25774-its-ppi-not-dpi/"

 

You are correct in that DPI is a technically incorrect term. Like some of our users, we do understand the term and we don't need it explained again, but thanks; hopefully some readers will understand it better now.

 

We consider all requests, but if you post this multiple times, that is flooding and won't be tolerated. Your other private Facebook and Twitter direct messages are best dealt with here in public, where others can join in, or indeed in the original thread where this has been discussed in great depth: https://forum.affinity.serif.com/index.php?/topic/25774-its-ppi-not-dpi/


Patrick Connor
Serif (Europe) Ltd.




It wasn't my intention to cause "flooding" or to spam you, but with so many public channels it's hard to know the best place to engage about this.

 

Unfortunately, the forum link you shared appears to be locked. I also see moderators being somewhat dismissive, for instance when Ben says, "Anyone who really understands this doesn't worry themselves about the (very slight) infringement of terminology." You have several users in the forum who understand it very well and insist the issue be corrected, precisely because it's not "very slight".

 

Let's discuss some of the issues with his post. DPI and PPI are both incredibly relevant to current design, photo, and printing software, so using each term correctly must still be relevant too. Since every digital display I'm aware of is still built from pixels, I have no idea what his reference to "points" could mean. Your primary competitor, the industry standard for digital illustration and photo editing, uses pixels/inch, and it's not a measure of image quality over real space; it's a measure of digital information density (in pixels) relative to real space. So, politely, Ben doesn't seem to understand the topic very well at all.

 

Also, as mentioned before, I don't feel that "it's largely assumed that these can be used interchangeably" is a valid justification for perpetuating incorrect information. Maybe I'm underestimating the cost of replacing a letter in the next update, but that feels like a very lazy way to address something your users seem very knowledgeable about. And if what Ben said were true, and they're interchangeable, then there really shouldn't be any confusion when you change it from DPI to PPI, should there? Because they're interchangeable, right? (I don't mean to sound patronizing, but that's how I feel you're treating the users who really care about this issue.) If it could *really* go either way, why wouldn't you just change the letter so that it's correct?

 

Now, I also understand where Ben is coming from when he talks about printers. The vast majority of printers I have worked with also [mistakenly] ask for a document in DPI instead of PPI. However, their lack of understanding about digital files is no more of a justification than a designer's or artist's lack of knowledge about printing technology. Accepting that people get it wrong, and deciding to keep letting them get it wrong, does not sound like the proper way to handle the issue.

 

I feel that if you really understood the term, it wouldn't say DPI when I open Affinity. It's also important to note that I'm not just explaining the term; I'm also explaining why some of the most significant factors in print quality and resolution have nothing to do with your software, which I genuinely believe makes a pretty strong case for using the correct term.

 

And finally, you guys make great software.  Thank you so much for creating these programs and [finally] bringing them to Windows.  It's an awesome product!


The addition of PPI would make sense in a context like the persona I proposed a while ago: https://forum.affinity.serif.com/index.php?/topic/30454-professional-pre-press-persona-please/

 

It's fairly technical as well as practical, and not something every designer deals with all the time. Most people create assets and send them off to somebody else to deal with. Some forums are even filled with people asking what this DPI thing is when we have pixels on a screen. In general, I've noticed people use DPI and pixel count in one very specific context today: filling a physical space. "I'm filling an area about yea big with this many pixels, it'll be looked at from about that distance, so I'll need to put the pixels this close together / need x pixels per inch of surface." That's highly simplified, but it has been working for ages now.

 

I think having that option in a very print-specific context would mean that people who go there for a reason won't be confused by it. It could also help with explaining the concept to others: PPI and DPI sliders, for example, to see how your output performs.
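The "viewing distance" rule of thumb in the post above can be put into rough numbers. This sketch assumes about one arcminute of visual acuity (a common figure, not something stated in the post), and the function name is made up for this example:

```python
import math

def required_ppi(viewing_distance_inches: float, acuity_arcmin: float = 1.0) -> float:
    """PPI beyond which adjacent pixels blur together at the given distance,
    assuming the stated visual acuity (illustrative rule of thumb only)."""
    arc_rad = math.radians(acuity_arcmin / 60.0)
    return 1.0 / (viewing_distance_inches * math.tan(arc_rad))

# A print held at arm's length needs a dense file; a poster viewed
# from ten feet away needs far fewer pixels per inch of surface.
print(round(required_ppi(12)))    # 286
print(round(required_ppi(120)))   # 29
```

This is the "looked at from about that distance" step made explicit: the further away the viewer, the fewer pixels per inch the surface needs.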


So I am just curious: are you equally upset about the use of the term "raster" or "rasterize", when technically that refers to a scanning process used to create an image and not an inherent property of an image? I have not used a CRT display for decades, and I don't think any printer in common use has ever printed an image one row of dots at a time.

 

DPI was once interchangeable with PPI (back when we used monochrome dot matrix printers for desktop printing, & software & hardware were so crude that there was an unchangeable 1:1 mapping between screen pixels & printed dots). Like "raster" (which was borrowed from the analog video world), the term has become firmly entrenched in the lexicon. There is no point in trying to change it now, just like there is no point in trying to change the "i" (the inch) to the metric standard most of the world uses for linear measurements, or "raster" to something more technically correct.


Affinity Photo 1.7.2, Affinity Designer 1.7.2, Affinity Publisher 1.7.2; macOS High Sierra 10.13.6 iMac (27-inch, Late 2012); 2.9GHz i5 CPU; NVIDIA GeForce GTX 660M; 8GB RAM
Affinity Photo 1.7.2.153 & Affinity Designer 1.7.2.6 for iPad; 6th Generation iPad 32 GB; Apple Pencil; iOS 12.3.1



That is utterly absurd. I have lost count of how many forums and threads I have been through, and it's quite obvious to me that many professional artists and designers working with digital files understand the difference between PPI and DPI. The notion that any designer of the last two decades is thinking of dot-matrix technology, on screens or in printing, when creating digital files is one of the worst arguments I've ever heard. It's not as if that was a dominant, long-lasting technology, or as if the majority of the professional market is old enough for it to be a significant memory in their careers (I'm not exactly young, and I don't buy it). Though it's still not as absurd as saying there is no point trying to "change" it from DPI to PPI when the industry-standard digital photo editing tool in the world, now a verb, correctly uses PPI.

 

If that's why their program costs more than 10 times as much, I'm completely underestimating the issue  ;)

 

It seems to me the real reason not to change it is that the experienced professionals who understand the issue make up too small a portion of the user base. It's just too difficult, or expensive, to educate people with a brief forum post, help section, FAQ, or short video. And it's possibly an issue because so many printers, who usually work in DPI, can't grasp that digital files aren't measured that way. As if it isn't confusing to say your 350 DPI image is going to print great at 2400 DPI, but your 50 DPI image is too low-res to look good at any DPI. I mean, come on: if my 50 DPI image doesn't look good, just keep increasing the print DPI until it does; isn't that how it works? Aren't those things inextricably linked?! Why strive for progress or excellence when it's so much easier to keep doing everything the same stupid, incorrect way?!

 

And now I'm upset. So let me just say: I love the software. These are excellent programs, and I'm proud to be a user. I, along with several other vocal users apparently, would very much appreciate it if this issue were corrected.


The notion that any designer of the last 2 decades is thinking of dot matrix technology, on screens or with printing, when creating their digital files is one of the worst arguments I've ever heard.

That isn't even close to what I was saying.




No offense meant to anyone, but if you want to just understand the difference, google it.

 

One site, https://99designs.com/blog/tips/ppi-vs-dpi-whats-the-difference/, has a pretty good explanation, and there are many more.

 

If the discussion is about changing the software, then continue. It seems a waste of time to me, though. If you understand the difference, you know what is meant in the software's dialog boxes.


This has been an informative thread; I have enjoyed it! My "4 pence" would be: if it's inaccurate, fix it now, not later. It's important for a professional-level product to be accurate, even if some see that as a bit pedantic. Anything less depletes the product. So bite the bullet, Affinity.


But where exactly is it inaccurate? DPI is just a bit of metadata, a number that is relevant for printing & in some contexts for defining "actual size," so why not label it as such in things like the new document or Photo's "Resize Document..." dialog boxes? Would "PPI" really be any more accurate for that?





I agree: you should learn the difference if you don't know it. And I agree it won't have a huge impact if you see DPI and know they meant PPI. However, if you see DPI and don't know the difference, then seeing award-winning software use DPI could add to the confusion and miscommunication with printers.


But where exactly is it inaccurate? DPI is just a bit of metadata, a number that is relevant for printing & in some contexts for defining "actual size," so why not label it as such in things like the new document or Photo's "Resize Document..." dialog boxes? Would "PPI" really be any more accurate for that?

 

I explain this in my post, and in later posts. The digital file's resolution may only be relevant *when* you print, but it is still only a property of the digital file. You can print a 300 PPI document at 300 DPI and at 1200 DPI and get drastically different results, but printing a 300 PPI document and a 1200 PPI document at the same DPI gives almost indistinguishable results. If you didn't label them, you wouldn't know which print came from which file.

And if you don't understand the difference, then hearing that you can print a "300 DPI" image at 1200 DPI to make it better quality than at 300 DPI is inherently confusing. Because if that's true, why can't you just print my 50 DPI image at 1200 DPI to make *that* a better image? The "DPI" settings in Affinity have almost nothing to do with the final print quality of the file; that comes down to print DPI, paper quality, finish, and so on. So it's an important distinction that your document is measured in pixels and your print isn't, and describing them both the same way is confusing.

And between professional printer reviews and feedback, print-on-demand feedback, and other online discussions, there's just no legitimate way to say that calling them both DPI makes it easier for anyone, because people all over are still getting poor results and have to reach out to the printer to find out why. Which is also why so many great printers have so many terrible reviews. My prints will always be sized correctly. Everyone who understands this will always have their work sized correctly (all other things being equal, I guess). I'm guessing the vast majority of people jumping into a $50 art program are not working professionals who understand this stuff, and either way, it's still not an accurate way to describe your document.


Marj, you are right-on!

 

But don't hold your breath...

 

I have worked with electronic imaging solutions since 1981, and at the outset of "desktop publishing" in the late '80s the language migrated from 'technically correct' to 'marketing cool'; it would be difficult to reverse the trend now. It was more important for copier companies to communicate with corporate types than to conform to technically correct terms. This pre-dates the digital workflow in photography by over a decade.

 

Follow the money and you'll find the answer.

 

Selling laser and inkjet printers into the office market was, and still is, huge business. Additionally, most graphics software is geared to a less sophisticated (non-professional) market. Thirty years ago presentations were created by graphic artists, not office clerks and salespeople, and it shows...

 

So, as long as there is money to be made, we will favour 'cool' over 'correct' and 'common' over 'correct', because no one chooses 'correct' over 'money and acceptance'.

 

For those who might be interested:

 

DPI = the pitch or frequency of a print head in a laser or inkjet printer. It is not a measurement of dimension; it tells us the number of laser-beam exposures or inkjet droplets used to image each inch of the print.

 

PPI = the number of pixels per inch in a raster image (TIFF, JPEG, PNG, etc.). Metric workflows use the term 'res' for lines per mm, e.g. 'res12', which works out to about 304.8 ppi.

 

LPI = "Lines per Inch", the screen frequency for halftone images and graphics used to make printing plates. For example, we print the photos in a magazine at 175 lpi, which requires TIFF files of 300 ppi for ideal quality.
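Those conversions are simple enough to sketch. Note the halftone "quality factor" linking LPI to a recommended image PPI is a common rule of thumb (typically 1.5 to 2), not something the post specifies:

```python
# Illustrative conversions for the terms defined above.
MM_PER_INCH = 25.4

def res_to_ppi(res_lines_per_mm: float) -> float:
    """Metric 'res' (lines per mm) to pixels per inch: res12 ~ 304.8 ppi."""
    return res_lines_per_mm * MM_PER_INCH

def lpi_to_ppi(lpi: float, quality_factor: float = 1.7) -> float:
    """Suggested image PPI for a halftone screen frequency (rule of thumb)."""
    return lpi * quality_factor

print(res_to_ppi(12))    # ~304.8, the 'res12' figure above
print(lpi_to_ppi(175))   # ~297.5, close to the 300 ppi magazine example
```

With a quality factor of roughly 1.7, a 175 lpi screen calls for about a 300 ppi file, which matches the magazine example above.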

 

Enjoy, but don't waste time over this,

 

Stan


 no-one chooses  'correct' over 'money and acceptance'.

 

You're not seriously suggesting that Affinity considers the common use of DPI a selling point, are you? That people (in the round, not just the occasional edge case posting in this thread) will refuse to buy software because what you deem to be the "correct" term has not been used?

 

Because I don't believe for a second that a decision was made to use the term "DPI" so that Affinity sells more software, which is what you're implying. Sorry, but it's a frankly preposterous notion.


Keith Reeder

 

(I don't need bird photography lessons - OK..?)


 Anything less depletes the product.

 

No, it really doesn't, Marj. Not even a tiny bit.

 

This post says everything that needs to be said about the Real World implications of this "problem":

 

https://forum.affinity.serif.com/index.php?/topic/25774-its-ppi-not-dpi/?p=123411




I explain this in my post, and later posts.

None of that really answers the question I asked. What practical, real world benefit would there be to changing "DPI" to "PPI" in things like the new document or "Resize Document..." dialog boxes in the Affinity apps?




It is time some common sense was used here.

 

PPI is and always has been self-explanatory. Pixels Per Inch.

 

DPI similarly also is self-explanatory. Dots Per Inch.

 

PPI relates to digital file size/quality/density.

 

DPI relates to the conversion of an image file into a print produced by an inkjet printer, which creates hard-copy images by squirting ink onto paper at so many dots of ink per inch.

 

The two units are not interchangeable or confusable.

 

I do not claim multiple years of knowledge or a technical vocabulary large enough to confuse a physicist, but:

 

If your software is designed to work with digital files, I ask you: why can you not see that using these terms as if they were the same 'thing' is just foolish, and an indication of fundamental disregard for the subject you purport to be expert in?

 

With all due respect to people who produce a product that is as far beyond my capability as a 9-second 100 m run: this must be the simplest of fixes for you, so why make such pathetic arguments when acceptance of fact, and a fix, would gain you so much more credibility and respect?

 

I do not think it will affect sales but you never know!

 

Regards    Sharkey

 

P.S. Please do not launch into another tirade about what is or is not industry-wide acceptance, because as a part of that industry for thirty years or so, it is not accepted here!


MacPro (late 2013), 24Gb Ram, D300GPU, Eizo 24",1TB Samsung 850 Archive, 2x2Tb Time Machine,X-t2 plus 50-140mm & 18-55mm. AP, FRV & RawFile Converter (Silkypix).


None of that really answers the question I asked. What practical, real world benefit would there be to changing "DPI" to "PPI" in things like the new document or "Resize Document..." dialog boxes in the Affinity apps?

 

Sometimes getting it right is enough, no matter what the software or hardware is named.

 

Practical, real world benefit is in the eye/mind of the beholder/user. 

 

A deeper respect for the tool is usually achieved by producing a tool that demands respect!




 


I am in complete agreement.

 

Much as I am committed to this software, the obtuse nature of some of its terms and methodology does grate on occasion. It is almost an obtuse language used to avoid the obvious.





 

But even here there is confusion.

 

PPI is not a demarcation of quality. It is only dimensional.

 

An image's PPI also does not directly relate to the physical dots per inch on a printed page, except when the image's PPI matches the physical capabilities of the output device. These days that is rare.

 

A physical printer, be it an inkjet or an imagesetter, will generally lay down more printer dots per inch than the image has pixels per inch. That is, a typical inexpensive inkjet using extrapolated dpi may use 720 printer dots per inch to print an image that has 300 ppi, and a typical imagesetter output resolution is 2400 dots per inch regardless of the image sent to it.

 

While I would welcome Serif changing the label from DPI to PPI, it is, after all, just a label and a mathematical unit for various operations, and those operations are dimensionally agnostic. I have still found no "real life" person who is confused by which term is used.

 

But either way, a modicum of explanation of the effect of PPI/DPI is needed for someone new to image manipulation. Once one gets it, though, they interchange the two terms easily, just as we all have done.

 

I have a feeling I should have stayed out of another near-pointless thread. Serif has made it clear this is being reviewed; no amount of our banter will change the final decision.

 

Mike


My computer is a nothing-special Toshiba laptop with unremarkable specs running Windows 10 64-bit.

