LATEX-L Archives

Mailing list for the LaTeX3 project


Manuel Pégourié-Gonnard <[log in to unmask]>
Reply To:
Mailing list for the LaTeX3 project <[log in to unmask]>
Tue, 16 Feb 2010 15:04:07 +0100
Joseph Wright wrote:
> On 10/02/2010 14:52, Heiko Oberdiek wrote:
>> If an expandable version is not an issue, then \pdfstrcmp
>> can be implemented in virgin TeX. If the input of \pdfstrcmp
>> consists of `other' and perhaps `space' tokens, then
>> even an expandable version can be simulated. The main problem
>> is AFAIK the conversion of a general token list to the string
>> representation in an expandable way (that means without
>> having \edef).
> What happens at present is more or less that: \tl_if_eq:nn(TF) uses
> \pdfstrcmp if available and a non-expandable version otherwise.
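
For concreteness, the \pdfstrcmp branch can be sketched in plain (pdf)TeX
roughly as follows; the macro names are made up for illustration here, not the
actual LaTeX3 code:

```latex
% \pdfstrcmp detokenizes both arguments and expands to -1, 0 or 1,
% so an expandable equality test can be built on top of it.
% (Hypothetical names; not the actual \tl_if_eq:nn(TF) implementation.)
\def\FirstOfTwo#1#2{#1}
\def\SecondOfTwo#1#2{#2}
\def\IfStrEqTF#1#2{%
  \ifnum\pdfstrcmp{#1}{#2}=0
    \expandafter\FirstOfTwo
  \else
    \expandafter\SecondOfTwo
  \fi
}
% Usage: \IfStrEqTF{abc}{abc}{equal}{different} yields "equal".
```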

By the way, I find it rather weird that \tl_if_eq doesn't really compare token
lists, but only their conversions to TeX strings. IMO it would be clearer to
have a \str_if_eq:nn(TF) function in the string module, and then conversion
functions from the tl to the str data type.
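
The most basic such conversion is just the \edef + \detokenize combination
mentioned below; as a rough sketch (the name \TlToStr is made up):

```latex
% Most basic tl -> str conversion: detokenize into a macro.
% Hypothetical name, and non-expandable, since it needs \edef.
\def\TlToStr#1#2{% #1 = macro receiving the string, #2 = token list
  \edef#1{\detokenize{#2}}% #1 now holds only catcode-12/10 tokens
}
```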

Such a conversion function could also be the place to implement "string input
methods" if they are needed. E.g., the most basic conversion function would be
something like \edef + \detokenize, but one could imagine a conversion function
that locally redefines \% as expanding to a catcode-12 %, etc., before
performing the \edef, so that a user can easily input arbitrary strings.
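
A toy sketch of such an input method (all names hypothetical): first store a
catcode-12 percent character, then let \% expand to it during the conversion:

```latex
% Store a catcode-12 percent via the usual \lccode/\lowercase trick.
\begingroup
  \lccode`\?=`\% \lowercase{\endgroup\def\PercentChar{?}}
% Conversion with a "string input method": \% yields a printable %.
\def\TlToStrInput#1#2{% #1 = target macro, #2 = token list
  \begingroup
    \let\%\PercentChar
    \xdef#1{#2}% step 1: \edef-expand; \% becomes a plain character
  \endgroup
  \edef#1{\detokenize\expandafter{#1}}% step 2: detokenize the result
}
% After \TlToStrInput\tmp{100\%}, \tmp holds the string "100%".
```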

(While we're talking about encoding, there was a thread on fctt recently about
how to percent-encode URL components. That could be another useful encoding
function in the library, in addition to those already mentioned by Heiko.)
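
As a deliberately minimal, space-only toy version of such percent-encoding
(real URL encoding must handle many more characters, and the names here are
made up):

```latex
% Catcode-12 percent, via the \lccode/\lowercase trick.
\begingroup
  \lccode`\?=`\% \lowercase{\endgroup\def\UrlPercent{?}}
% Replace each space by %20; \EndMark is only ever used as a delimiter.
\def\EncodeSpaces#1{\EncodeSpacesAux#1 \EndMark}
\def\EncodeSpacesAux#1 #2\EndMark{%
  #1%
  \ifx\relax#2\relax\else           % #2 empty -> last word, stop
    \UrlPercent 20\EncodeSpacesAux#2\EndMark
  \fi
}
% \EncodeSpaces{one two three} typesets one%20two%20three
```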