At 11.38 +0100 2001-02-20, Jörg Knappen wrote:
>> Yes, I'm seriously thinking that splitting TS1 into, say, TSA (Adobe,
>> with expert set) and so on would be helpful to actually make sure that
>> a font that claims to be in some encoding really has the glyphs of
>> that encoding.
>I suggested making a TSA encoding for Adobe fonts years ago, but no one
>undertook the work.
I suspect I will split ts1.etx when I eventually get around to updating
it, and thus create the basic fontinst support. (\latinfamily should
probably also be reconstructed to use the proper symbol encoding, but I'm
not at all sure I will make the attempt.) That would, however, require that
the LaTeX definitions of the encodings are settled first.
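To make the idea concrete, a split-off encoding file might look roughly like
this in fontinst's ETX syntax. This is only a sketch: the file name tsa.etx
follows the TSA proposal in this thread (not any installed standard), and the
slot numbers and glyph choices are illustrative assumptions, not a worked-out
encoding.

```tex
\relax
Sketch of a hypothetical tsa.etx: a text companion encoding
restricted to glyphs that Adobe basic + expert fonts can be
expected to provide.  Slot numbers are illustrative only.

\encoding

\nextslot{39}
\setslot{quotesingle}
   \comment{The straight (typewriter) quotation mark.}
\endsetslot

\nextslot{42}
\setslot{asteriskcentered}
   \comment{A centered asterisk.}
\endsetslot

\endencoding
```

A TSE, TSM, etc. for other foundries would then differ from this file only in
which slots are populated.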
>> Similarly, T1 should then be expanded to have companion encodings which are
>> for fonts that do not have Ng etc.
>> The number of encodings wouldn't grow that much, but then you could be
>> sure that you get what you ask for and not just some square boxes in the
>> output and some error messages from dvips.
>I'm afraid the number of encodings will grow much. There are more foundries
>than Adobe around (like Monotype, Linotype, Agfa, Berthold to drop some
>names) and they all have different basic and expert glyph sets in their fonts.
How large are the differences? Is the common core of e.g. the basic
encodings nearly as large as that provided by 8a? If it is, a TSA can still
be worthwhile.
>My font book from FontShop lists about 70 foundries, and the new edition
>probably has even more of them.
>In addition, glyph sets aren't constant in time; older fonts lack the
>Euro sign that newer fonts have.
>Fonts are a real mess (not only with (La)TeX, but also with the so-called
>professional versions for PC and Mac) and I don't see that the state of
>affairs will change in the foreseeable future.
There is a change in that more and more fonts are being made with more
glyphs than can fit in a single encoding vector. This may help to
reduce the basic/expert sets problems. But yes, it will probably still be a
mess.
>IMHO, the black box replacements in vf's are an error: An unavailable glyph
>should be unavailable in the tfm file as well and provoke a harsh TeX
>error message. To catch the black thingies at proof reading stage is rather
>late and error prone.
Usually these boxes are accompanied by a \special which should make the
driver scream about them, but I agree it is mean of TeX to lie and say the
character is there when in reality it isn't. Hopefully the reconstruction
of latin.mtx (which I haven't done anything about for about a month now)
should make it easier not to insert the \unfakables if you don't want them.
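For readers not familiar with the mechanism: the black boxes come from a
construct roughly along these lines in fontinst's MTX syntax. This is a
sketch of the idea only, not the actual latin.mtx code; the rule dimensions
and the example glyph are assumptions.

```tex
% Sketch of an `unfakable' glyph: a black rule plus a warning
% that drivers can report via a \special.  Illustration of the
% mechanism, not the real latin.mtx definition.
\setcommand\unfakable#1{
   \setglyph{#1}
      \glyphrule{500}{700}
      \glyphwarning{missing glyph `#1'}
   \endsetglyph
}
\unfakable{Ng}
```

Dropping the \glyphrule (or the whole macro) in a reconstructed latin.mtx is
what would turn the silent black box into a hard missing-glyph error at
fontinst/TeX time instead.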
BTW, Jörg, any thoughts on my suggestion to add a comma accent character
(or rather two, when I think of it: one normal and one reversed, i.e.
upside-down) to TS(1|A|E)? Faking it in VFs is no problem.
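The VF fake could look roughly like this in MTX syntax. The glyph name
commaaccent and the vertical shift are my assumptions; the reversed variant
would presumably have to be built from quoteleft-like material in fonts that
have it, since VFs cannot rotate glyphs.

```tex
% Sketch: fake a comma accent from the comma glyph by shifting it.
% Glyph name and shift amount are illustrative assumptions.
\setglyph{commaaccent}
   \moveup{-50}
   \glyph{comma}{1000}
   \moveup{50}
\endsetglyph
```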