LATEX-L Archives

Mailing list for the LaTeX3 project

LATEX-L@LISTSERV.UNI-HEIDELBERG.DE

Subject:
From: Philipp Stephani <[log in to unmask]>
Reply-To: Mailing list for the LaTeX3 project <[log in to unmask]>
Date: Wed, 17 Feb 2010 03:14:17 +0100
Content-Type: text/plain
On 16.02.2010 at 15:04, Manuel Pégourié-Gonnard wrote:

> Such a conversion function could also be the place to implement "string input
> methods" if they are needed. E.g., the most basic conversion function would be
> something like \edef + \detokenize, but one could imagine a conversion function
> that locally redefines \% as expanding to a catcode-12 %, etc., before
> performing the \edef, so that a user can easily input arbitrary strings.
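
As a concrete illustration of the \edef + \detokenize idea above, here is a minimal LaTeX2e-style sketch; the macro name \strconvert and the local treatment of \% are assumptions for illustration, not an agreed-on interface:

\makeatletter
% \strconvert{<cmd>}{<tokens>}: fully expands <tokens> and stores the
% result in <cmd> as catcode-12 ("other") characters and spaces.
% \% is locally redefined as a basic "string input method": it expands
% to \@percentchar, i.e. a catcode-12 percent sign.
\newcommand*\strconvert[2]{%
  \begingroup
    \def\%{\@percentchar}%
    \protected@edef\@tempa{#2}%
    % \xdef makes the result survive the group; \detokenize turns the
    % expanded tokens into plain characters.
    \xdef#1{\detokenize\expandafter{\@tempa}}%
  \endgroup
}
\makeatother

A call like \strconvert\mystring{50\% of plot.png} then leaves only character tokens in \mystring, ready to be written to a file, the log, or the terminal.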

I have also thought a bit about this, more from a user perspective: users wonder why using \verb or \index inside macros produces weird results. So there should probably be a class of functions for defining \index- and \verb-like commands, e.g. with argument specifiers like "s" and "v" in xparse. The commands defined this way should then receive only "other" (catcode-12) tokens and should also check for this. That way, misuse of the commands can easily be detected, and in that case the command can fall back to \detokenize and print a user-friendly error message.
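
Such a definition might look like the following xparse sketch; the command name \code and the way it typesets its argument are made up for illustration:

\usepackage{xparse}

% \code is an illustrative \verb-like command: "s" grabs an optional
% star, "v" reads the body verbatim, i.e. as a run of catcode-12
% ("other") characters and spaces.
\NewDocumentCommand \code { s v }
  {%
    \IfBooleanTF {#1}
      {\texttt{\footnotesize #2}}% starred variant: smaller type
      {\texttt{#2}}%
  }

At top level, \code|\index{C++}| works as expected; once such a command ends up inside another command's argument, the verbatim grabber can no longer re-read the characters, which is exactly the situation where the proposed fall-back to \detokenize plus a friendly warning would apply.
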
I think that, apart from that, a strings module would be quite handy, as true strings (as opposed to token lists) occur in many places: file names, file I/O, terminal I/O, PDF strings, and index entries.
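
A rough sketch of what working with such a strings module could look like, written in expl3 style (the variable names and the specific functions are assumptions for illustration, not a settled interface):

\ExplSyntaxOn
% Store a file name as a true string (catcode-12 characters only),
% then reuse it both for terminal output and for file i/o.
\str_new:N   \l_my_index_str
\str_set:Nn  \l_my_index_str { chapter-one.idx }

\iow_new:N   \g_my_index_iow
\iow_open:Nn \g_my_index_iow { \l_my_index_str }
\iow_now:Nx  \g_my_index_iow { \string\indexentry {example}{1} }
\iow_term:x  { Index~entries~are~written~to~\l_my_index_str }
\iow_close:N \g_my_index_iow
\ExplSyntaxOff

Because \l_my_index_str holds nothing but character tokens, the same value can be handed to the file-writing functions, shown on the terminal, or reused wherever a plain string is expected.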
