Everything posted by LibreTraining

  1. The font includes some OpenType features which are on by default (correctly so). Some applications do not support these features, or have them off, which is why you do not see this in the other applications. Open the Typography panel, look for Positional Alternates, and disable Initial Forms, Final Forms, and Isolated Forms. Then your text will go back to "normal." This is where some font documentation would be helpful. Note: I tested v1.010 from Google Fonts (today). Some of the older versions from other font websites do not have these features.
  2. @DanM. I just tested Export to PDF using the original OTF fonts and it worked fine - with and without sub-setting. So your issue is probably just because you had both formats of the fonts installed and that confused the application on Export to PDF.
  3. For most uses there is no practical difference. So many people have heard the same nonsense repeated over and over that it has become "common knowledge" even though it is wrong. In one article linked to here recently as evidence, the author actually compared an old TrueType font (not OpenType-TT) to an OpenType-PS font - ridiculous. And then there is the "PostScript curves are better" blah, blah, blah argument. Lucas de Groot (who created Calibri) gave a talk at Typo Labs 2018 where he compared using PS curves to using TT curves for various contours on different glyphs. The score was about even as to which type of curve was better in the different cases. This is from someone who actually knows, using the actual tools - not someone regurgitating biased nonsense they read online about curve math in a theoretical setting. Real-world font design is different. There are a few use cases where there is a valid reason to use PS fonts. 99.99% of the time the user does not know what these are. Some are related to typical uses/workflows, and some are to work around bugs in software and hardware. But most of the time the blanket "OTF fonts are better" is just ignorance. For those who say they look better, I have offered to put both versions on a page and have the person tell me which one is which - no takers so far. And then there is the web fonts arena, where nearly 100% are OpenType-TT. Commercial font vendors stay out of the discussion and provide both. Whatever makes the customers happy. So you are not alone in the confusion. 🙂
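A small illustration of why the curve-format argument is mostly moot: every quadratic (TrueType-style) Bézier segment can be converted exactly to a cubic (PostScript-style) segment by degree elevation. This is just a toy sketch with made-up control points, not anything from an actual font:

```python
# Toy sketch: a quadratic (TrueType) Bezier segment converted exactly to a
# cubic (PostScript) segment via degree elevation. Control points are made up.

def quad_point(p0, p1, p2, t):
    """Point on a quadratic Bezier at parameter t."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

def quad_to_cubic(p0, p1, p2):
    """Degree elevation: c1 = p0 + 2/3*(p1-p0), c2 = p2 + 2/3*(p1-p2)."""
    c1 = (p0[0] + 2 * (p1[0] - p0[0]) / 3, p0[1] + 2 * (p1[1] - p0[1]) / 3)
    c2 = (p2[0] + 2 * (p1[0] - p2[0]) / 3, p2[1] + 2 * (p1[1] - p2[1]) / 3)
    return p0, c1, c2, p2

def cubic_point(p0, c1, c2, p3, t):
    """Point on a cubic Bezier at parameter t."""
    x = ((1 - t) ** 3 * p0[0] + 3 * (1 - t) ** 2 * t * c1[0]
         + 3 * (1 - t) * t ** 2 * c2[0] + t ** 3 * p3[0])
    y = ((1 - t) ** 3 * p0[1] + 3 * (1 - t) ** 2 * t * c1[1]
         + 3 * (1 - t) * t ** 2 * c2[1] + t ** 3 * p3[1])
    return (x, y)

p0, p1, p2 = (0, 0), (50, 100), (100, 0)
cubic = quad_to_cubic(p0, p1, p2)
print(quad_point(p0, p1, p2, 0.5))  # (50.0, 50.0)
print(cubic_point(*cubic, 0.5))     # same point, within float rounding
```

The reverse direction (cubic to quadratic) is only approximate, which is the one real mathematical asymmetry between the formats - but as the talk showed, in actual glyph design neither side wins consistently.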
  4. You probably have the Raleway variable font installed. Affinity applications do not support variable fonts. When variable fonts are not supported you get the default master. In Raleway the default master is Light. So when you do the Export to PDF that Light is what you get. Use the static fonts instead.
  5. There is no difference in features between OpenType-TT (.ttf) and OpenType-PS (.otf). For older original TrueType fonts that may be true, but for this font both versions are OpenType. This is the case for most fonts now. TTF and OTF are embedded differently in PDFs, and right now there appear to be fewer issues with TTF. Your issue may be related to the embedding permissions in this font. Please test the TTF fonts (after you have uninstalled the OTF fonts) and let us know how that goes. I seem to remember an issue with the particular setting used in this font. I am on my phone at the moment so I cannot test. If you still have issues we can change the setting to test (later today when I am back at my computer).
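For anyone curious what those embedding permissions are: they live in the font's OS/2 table as the fsType bit field, defined in the OpenType spec. Here is a small sketch that just decodes an fsType value you would read in a font editor - it does not parse a font file:

```python
# Sketch: decode the OS/2 fsType embedding-permission bits that PDF
# exporters check. Bit values are from the OpenType spec; the input is
# just an integer read from a font editor, not a parsed font file.

def describe_fstype(fstype):
    if fstype & 0x0002:
        return "Restricted License embedding (no embedding allowed)"
    perms = []
    if fstype & 0x0004:
        perms.append("Preview & Print embedding")
    if fstype & 0x0008:
        perms.append("Editable embedding")
    if not perms:
        perms.append("Installable embedding (no restrictions)")
    if fstype & 0x0100:
        perms.append("no subsetting allowed")
    if fstype & 0x0200:
        perms.append("bitmap embedding only")
    return ", ".join(perms)

print(describe_fstype(0x0004))  # Preview & Print embedding
print(describe_fstype(0x0104))  # Preview & Print embedding, no subsetting allowed
```

The 0x0100 "no subsetting" bit in particular is the kind of setting that could interact badly with Export to PDF.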
  6. Here are the other four (so now you have the full family of six). Noto.Sans.KR.TTF-Thin.Light.Medium.Black.zip I did check all the names for the full family together, so you should not have any issues. Have fun!
  7. OK. Give these a try. I only converted the Regular and Bold. Noto Sans KR TTF fonts.zip Let me know if you have any issues. I did change the name to include TTF, so you can have them installed at the same time as the OTF fonts.
  8. Did you find the character name/code point cross reference files? They are in the GitHub repository but kinda buried.
  9. There have been problems with sub-setting for a long time, so I do not think the issue is these fonts, or non-Latin characters. OTF and TTF fonts are embedded differently, so it may just be the OTFs. I do not remember now, but this has come up over and over, and it seems that sometimes it works and sometimes it does not. The usual work-around is to turn off sub-setting, but with a font this big that is kinda painful. It would take a lot more testing to narrow down the cause, but I suspect it is related to the CIDs not being handled correctly. Inside PDFs the characters are identified by a Character ID (CID), and those are not necessarily the same as the Glyph ID inside the font. When a sub-setted font is embedded in a PDF it is really a little mini-font. In your PDF the embedded TTF font has CIDs 1-14 assigned for those 14 characters. The embedded OTF fonts appear to still have the CIDs from the original full font. I do not know if that is correct, but other applications do not have this in the PDFs they produce - they have lower consecutive CIDs like the TTF example above. So I am guessing the issue is related, as visually you can see the correct glyphs are not being connected to the correct codes in the displayed PDF. And embedding the full font works (so it is not a font problem). All of this PDF stuff is very complex; I quickly get kinda lost when rummaging around inside PDFs. Reading the specs is quite an ordeal (and confusing). But I am fairly sure something is wrong here.
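To make the CID point concrete, here is a tiny illustrative sketch (plain Python, no PDF library) of the renumbering step a well-behaved exporter does when sub-setting - the glyphs actually used get small consecutive CIDs starting at 1, rather than keeping the IDs from the original full font:

```python
# Illustrative sketch (no PDF library): when sub-setting, a well-behaved
# exporter assigns the characters actually used small consecutive CIDs
# starting at 1, instead of keeping the original full-font IDs.

def build_subset_cids(text):
    """Assign each distinct character used a new consecutive CID."""
    cid_map = {}
    next_cid = 1  # CID 0 is conventionally reserved for .notdef
    for ch in text:
        if ch not in cid_map:
            cid_map[ch] = next_cid
            next_cid += 1
    return cid_map

print(build_subset_cids("Hello"))  # {'H': 1, 'e': 2, 'l': 3, 'o': 4}
```

The OTF case described above looks like this renumbering step was skipped, so the embedded mini-font still advertises IDs from the full font.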
  10. Ah, no. There are definitely some problems in this PDF. Note: I did check the fonts and did not find any issues. PDF-XChange Editor displays the Noto Sans KR Bold as Regular, and shows both Noto Sans KR Regular & Bold as not embedded, so it is having issues trying to figure out the embedded fonts. Ahh ... as I am writing I see @Luca Huelle just posted above. Nitro Pro displays this: FlexiPDF (and InfixPDF) display this: FlexiPDF (and Infix) have a useful feature to remap characters, but you can also use it to look at the codes behind various characters. Here is the font that does display properly - Noto Sans Italic. The codes info for the selected D is displayed below the table. Here is the same table for the Noto Sans KR Bold embedded font. The B is selected, but it cannot connect the glyph to the codes. The other characters are there in the table, but nothing displays. If I open the PDF in PDF Debugger and look at the fonts in the resources, the Noto Sans Italic correctly shows Glyphs: 14, which is the correct number for that embedded font. The Noto Sans KR Regular and Bold both show Glyphs: 24861 - when the actual glyph counts are 13 and 12. Not all of those are listed in the table, but most of the glyphs listed in the table are not used in the PDF. What should be there is only the glyphs used. So the info in the PDF is wacko. To test if OTF vs. TTF is an issue, I converted the Noto Sans KR Regular and Bold fonts from OTF to TTF. The TTF fonts appear to work in APub when exported to PDF. In PDF-XChange the Noto Sans KR Bold still displays as Regular, but the TTF versions both work. In FlexiPDF, the OTF Bold is still bad, and the TTF fonts work. Note that I exported-to-PDF the same test text from Word and LibreOffice. Both PDFs display correctly in all PDF editors, the glyph counts are reasonable when examined in PDF Debugger, and I can see all the correct codes/glyphs in the FlexiPDF Remap fonts dialog.
So it appears something is wrong when the OTF is sub-setted and embedded. It may be that the CIDs are not correct for a sub-setted font (they look like the originals), which would explain why embedding a full, non-sub-setted font works. Regardless, something is not right, which is probably why Acrobat is balking.
  11. @waltonmendelson If you created the CALVIN text in another application which understands that old encoding, such as Word or LibreOffice, it will appear to be OK when imported or pasted into APub - because those applications will connect what was typed to the characters in the font, and characters are what you paste. But those characters do not have the correct Unicode codes behind them. Try searching for the text in APub and it will not be found. Also, the minute you start to type something new or to correct something, you are typing in Unicode, so the new/corrected text will be in a fallback font, because CALVIN does not have the character codes you are typing. No, Affinity does not require you to buy newer fonts. The Warnock fonts are fine. I mentioned the age only because fonts do sometimes have bugs fixed. @lacerto FontCreator has a Convert to Unicode Font feature. It works OK for old converted T1 fonts - usually. You have to be on the look-out for duplicate names. And it does not seem to fix characters beyond the 233 symbol font limit. Anything beyond that and you need to do manual fixes (up to the 255). FontLab has a Generate Unicodes feature which works much better. It has a file which contains all the "standard" glyph names and their codes, and it uses those to assign the Unicode codes to the characters. You can also do it with fonttools (on GitHub) using Python. I have never tried that so I am not sure what it would do with duplicates, etc. Because the original conversions are often not the best, you kinda need to be able to see what you are doing (to fix the duplicates and other character name errors).
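For the fonttools route, the general idea is the "standard glyph name" lookup described above: map each glyph name through the Adobe Glyph List (AGL) and flag anything that collides or cannot be resolved. Here is an illustrative stdlib-only sketch using a tiny hand-copied sample of the AGL (the real list has thousands of entries; fontTools ships it as fontTools.agl):

```python
# Illustrative sketch: assign Unicode codes from glyph names, AGL-style.
# AGL_SUBSET is a tiny hand-copied sample; the real Adobe Glyph List is huge.
AGL_SUBSET = {
    "A": 0x0041,
    "a": 0x0061,
    "eacute": 0x00E9,
    "quotesingle": 0x0027,
}

def assign_unicodes(glyph_names):
    """Return (cmap, problems): code -> name for resolvable glyph names,
    plus the names that clash or cannot be mapped (need manual fixing)."""
    cmap, problems = {}, []
    for name in glyph_names:
        if name in AGL_SUBSET:
            code = AGL_SUBSET[name]
        elif name.startswith("uni") and len(name) == 7:
            code = int(name[3:], 16)  # the 'uniXXXX' naming convention
        else:
            problems.append(name)  # unknown name: manual fix needed
            continue
        if code in cmap:
            problems.append(name)  # duplicate: two names claim one code
        else:
            cmap[code] = name
    return cmap, problems

cmap, problems = assign_unicodes(["A", "eacute", "uni0042", "A", "mystery"])
print(cmap)      # {65: 'A', 233: 'eacute', 66: 'uni0042'}
print(problems)  # ['A', 'mystery']
```

The duplicate handling is exactly the part that trips up the automated tools - which is why being able to see and hand-fix the results matters.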
  12. Affinity apps only support Unicode encoding. The CALVIN font is an old (1997) conversion from an old Type 1 font. The encoding is the old Mac-Roman. Some older applications which had to deal with this back-in-the-day understand this encoding - modern Affinity applications do not. Nothing wrong that I can see with the Warnock fonts other than being a bit old - v1.009 from 2000 vs. v2.000 now. I searched for the Warnock fonts in the doc to see if I could spot anything obvious, but the document is such a mess it is hard to see anything. It looks like all manual formatting, and there does not appear to be any rhyme or reason as to when some text is Warnock or Times New Roman. You really need to use some proper styles. So your original question appears to be answered, and the rest appears to be document clean-up.
  13. I was just fixing the Calvin when you posted - it is a pre-Unicode font, so the code points are different from what is expected (all up in the PUA). Below is a quickie conversion to Unicode. It installs, and the text appears. I did not test extensively, but it should work. Now I am going to look at the other issues.
  14. That is not about old PostScript Type 1 fonts. That is about modern OpenType-PS (.otf) fonts which are OpenType file format with PostScript glyph outlines. Not the same thing. The fact that the PDF export appears to support Type 1 fonts is probably a happy accident because the PDFlib supports it. When you export to print you then go thru the Affinity rasterizer where it is definitely not supported. PostScript Type 1 fonts are not supported in Affinity apps - officially.
  15. Another workaround is to use the OTF fonts if available. OpenType-PS (.otf) fonts do not use components, so they will not have overlaps (for now). To support variable fonts, text rendering engines must properly render overlaps, so apps which support variable fonts have already dealt with this issue. Typically when font developers export their fonts from their GlyphsApp or FontLab source files, they select "Remove Overlaps" as one of the export parameters, so the fonts they provide in their repo generally do not have overlaps in the TTF files. Google Fonts takes the original source files and uses their own tools to build the fonts for GF, which does not include "remove overlaps" (fonttools can do it but it is not 100%). Overlaps are not the only rendering issue they have to contend with. There are a number of other issues where the rendering in apps does not work properly - and that includes Adopey, Apple, and others. Bugs. All the time. They have made the decision to not modify valid fonts to work around various app limitations; otherwise you end up with a bunch of non-standard franken-fonts. So until Affinity supports variable fonts, the easiest workaround is to use the OTF fonts. If it does not need to remain text, you can merge the shapes too.
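A toy model of why un-removed overlaps can render wrong: with an even-odd fill rule, a region covered by two overlapping contours counts as covered twice and becomes a hole, while a non-zero rule keeps it filled. This 1-D sketch uses intervals as stand-ins for same-direction contours - it is only an analogy, not real rasterizer code:

```python
# Toy 1-D model of fill rules: intervals stand in for overlapping contours
# that all wind the same direction. Real rasterizers work on 2-D winding
# numbers, but the even-odd failure mode is the same.

def covered_even_odd(x, contours):
    """Even-odd rule: filled only where the coverage count is odd."""
    hits = sum(1 for a, b in contours if a <= x < b)
    return hits % 2 == 1

def covered_nonzero(x, contours):
    """Non-zero rule (same-direction contours): filled where covered at all."""
    return any(a <= x < b for a, b in contours)

overlapping = [(0, 10), (5, 15)]  # two strokes overlapping on 5..10

print(covered_nonzero(7, overlapping))   # True: the overlap stays filled
print(covered_even_odd(7, overlapping))  # False: the overlap becomes a gap
```

That even-odd gap is the white-stripe artifact you sometimes see where a variable font's components cross, in apps that have not dealt with overlaps.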
  16. PostScript Type 1 fonts are not supported in Affinity apps.
  17. Don't use a font designed for Arabic for Latin text. The Latin characters included in this font are designed to be used within some Arabic text, so there are a number of localizations applied which affect the Latin text. Your odd break issue is probably because the space character is localized when the language applied to the text is Latin - which increases the width of the space. And since the "space" is then no longer actually a space character, the APub line-breaking algorithm is probably confused, so it just breaks wherever. Basically it is as if you had entered one long line of characters with no spaces. Best WAG I got without seeing the actual doc.
  18. IIRC enStep was one of the companies that bundled converted Type 1 fonts, and they had a habit of using all caps for the font names. If this is the JOAN font I suspect, it has the old encoding and is a pre-Unicode font, so the characters are not encoded where expected and you see a fallback font (Arial). Here is a quick conversion to Unicode: JoanAF-Regular.zip Note: these converted Type 1 fonts are really old (1998), are limited to fewer than 256 characters, often have no kerning at all, and have no OpenType features.
  19. @Anonym Rage Italic (RAGE.TTF) is installed with Microsoft Office. So any version since Office 2003 should install it (as far back as I have it). You can also get it from M365 Cloud Fonts (but then you need to do Install for All Users for it to work).
  20. I tested in LibreOffice and it appears they are showing the characters based on the glyph name. But what really surprised me is the text search actually worked. In early pre-Unicode Type 1 fonts the glyph name was used a lot, rather than a specific code. So my guess is this is a leftover from those days when Type 1 fonts were supported. Affinity apps have never supported Type 1 fonts, so there would be no such leftover. That's a weird one. Hmmm ... I did a little digging in the font file, and it appears to be an old Macintosh-encoded TrueType file. Type 1 fonts from back then had a specific list of glyph names, and those were always at particular glyph numbers. But those numbers don't match this TrueType font, so I do not know exactly how this is working. 😵 But what I do know is these apps are using some old encoding mapping to be able to work. The glyph names are how FontCreator converts it to Unicode, and that is fairly quick and easy.
  21. That is actually a symbol font, and all of the characters/glyphs are encoded up in the PUA (Unicode Private Use Area). So when you type something, what you are seeing is a fallback font (because the characters are not there at the expected Unicode code points). Here is a quick conversion to Unicode: BadwrenchAF-Regular.zip Note: This is a really bad font - the character weights are not consistent at all. But you should now be able to see it in AF apps.
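The conversion itself is conceptually simple: symbol fonts typically park their glyphs at 0xF000 + the ASCII code, so the fix moves each cmap entry back down. A toy sketch with a made-up cmap dict and hypothetical glyph names - real conversions (FontCreator etc.) also have to fix glyph names and duplicates:

```python
# Toy sketch: shift symbol-font cmap entries out of the PUA back to the
# ASCII range they mirror. The cmap is a made-up dict with hypothetical
# glyph names, not real font I/O.
PUA_BASE = 0xF000  # symbol fonts often encode glyphs at 0xF000 + ASCII code

def remap_pua(cmap):
    fixed = {}
    for code, glyph_name in cmap.items():
        if 0xF020 <= code <= 0xF0FF:   # PUA slot mirroring printable ASCII
            fixed[code - PUA_BASE] = glyph_name
        else:                          # already sane, keep as-is
            fixed[code] = glyph_name
    return fixed

symbol_cmap = {0xF041: "wrenchA", 0xF042: "wrenchB"}  # hypothetical glyphs
print(remap_pua(symbol_cmap))  # {65: 'wrenchA', 66: 'wrenchB'}
```

After the remap, typing 'A' (U+0041) finds a glyph instead of triggering fallback.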
  22. Yes, as long as the font you choose supports Romanian. Some fonts are not localized for Romanian so be sure to check.
  23. I have never heard of 'scaffolding' used as a term related to end-user font quality assessment. So I guess we need some clarification as to what that actually means. As mentioned already, coverage is a common end-user assessment tool. Font developers use a number of different quality assessment tools, but those would not normally be the kind of thing found in a user app. So I too am interested in what 'scaffolding' means in relation to font quality assessment.