Mon, 11 Jun 2001 00:21:45 +0200
At 23.27 +0200 2001-06-10, Marcel Oliver wrote:
>Lars Hellström writes:
> > At 14.29 +0200 2001-06-05, Chris Rowley wrote:
> > >1. How should LICR strings be written out to files used only by LaTeX
> > > itself?
>UTF8 if in the LICR; everything else will be a big mess...
> > Are there technically any such files today? BibTeX reads .aux files.
> > Copying and editing the contents of .toc, .lof, etc. files is an
> > established (although maybe not kosher) method for fine-tuning the
> > table of contents and such.
>I don't think this means we have to support arbitrary encodings of
>auxiliary files. Editing such files is, after all, an undocumented
>practice, and using a UTF-8 editor will (rather elegantly) provide
>full access to these files. Do we need more?
I think Chris's original question was essentially "Can we use a different
method for encoding `language' information in internal files than in files
that are meant to be read by other programs?" and my reply was then "I
don't think there are files that are that internal." For an underlying
engine (e.g. Omega) that can handle UTF-8 I/O at the primitive level, I
believe we agreed that this would be the preferable way of encoding the
_characters_.
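To make the distinction concrete (the .toc line below is a typical
made-up example, not taken from this thread): with classic LaTeX and
inputenc, a non-ASCII character in a sectioning title is written to the
.toc file in its 7-bit-safe LICR form, whereas a UTF-8-capable engine
could write the raw bytes directly:

```latex
% LICR form, as 8-bit LaTeX + inputenc writes it (7-bit safe,
% re-readable regardless of the input encoding in force):
\contentsline {section}{\numberline {1}Nationalit\IeC {\'e}}{4}

% Raw UTF-8 form, as an engine with primitive UTF-8 I/O
% (e.g. Omega) could write the same entry:
\contentsline {section}{\numberline {1}Nationalité}{4}
```

Either line is something a user might meet when hand-editing the .toc
file; only the second requires a UTF-8 editor to display correctly.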