LATEX-L Archives

Mailing list for the LaTeX3 project


David Carlisle <[log in to unmask]>
Tue, 30 Jun 1998 10:01:28 +0100
Hans writes

> LaTeX already has several very slow commands with slow parsing, for
> example the variations of "new" with the LaTeX special style of defining
> arguments.

Having a definition command that is slower than \def is acceptable.
Having a mechanism where the _use_ of commands is an order of magnitude
(or more) slower than directly calling a control sequence is not
acceptable, as long as the system is to be programmed in TeX (or a
TeX-like system such as etex or omega).

This means that while it might sometimes be useful to `parse out' the
argument specification from the command name, this would only ever be
used in limited circumstances, e.g. to define one variant form in terms
of another if for some reason the normal `base' NNNN (or nnnn) form is
unavailable. As Javier commented, the current N-n distinction is not
always perfect. The exact details of the conventions may well need
changing, but the basic principle must be that control sequences are
accessed directly as tokens at the level we are talking about (that is,
the low-level programming conventions in which higher-level markup can
be defined).
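To illustrate the point, a minimal sketch (the names \fooNn and \foocn
are purely illustrative, not part of any actual proposal): the `base'
form takes its arguments directly as tokens, so each use is a single
hash-table lookup; a variant is defined once in terms of the base, so
the extra cost of building a csname is paid only where that variant is
actually called, not by a general name-parsing layer at every use.

```latex
% Base (Nn) form: arguments are taken directly as tokens -- one
% hash-table lookup per call, no parsing of the command name.
\def\fooNn#1#2{\def#1{#2}}

% A (cn) variant, defined ONCE in terms of the base: the first
% argument is constructed from a name with \csname.  Only calls to
% this variant pay the construction cost.
\def\foocn#1#2{\expandafter\fooNn\csname#1\endcsname{#2}}
```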

This does not mean that the document-level markup has to be token
based. LaTeX already has the environment constructs, which are not:
\begin{enumerate} is 12 tokens, rather than \begingroup\enumerate,
which is two. The environment syntax can fairly easily be offered in an
alternative form, say <enumerate> ... </enumerate>, which is about the
same as \begin ... \end in terms of speed and memory usage (you have to
work a bit harder to get a full XML system, though :-). Having the
document-level markup be something that is parsed `by hand', using a
parser written in TeX, is acceptable, but only if the result of that
initial processing is a set of command tokens that can be executed in
the normal way: command tokens looked up directly in TeX's hash table.
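A toy sketch of that alternative syntax (the macro name \dispatchtag is
illustrative; a real implementation would need to handle attributes,
verbatim material, and catcode sectioning): make `<' active, read the
tag name as a delimited argument, and hand it straight to \begin or
\end, so each use ends up as an ordinary command-token lookup.

```latex
% Toy sketch only: map <name> ... </name> onto \begin{name} ... \end{name}.
\catcode`\<=\active
\def<#1>{\dispatchtag#1\relax}
% If the first character of the tag is `/', close the environment;
% otherwise open it.  #1 is the first character, #2 the rest of the name.
\def\dispatchtag#1#2\relax{%
  \if/#1\end{#2}\else\begin{#1#2}\fi}
```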

I mention this point (again) not to try to stifle discussion, but
because I got rather lost as to the level at which you intend some of
your module proposals to be used.