LATEX-L Archives

Mailing list for the LaTeX3 project

LATEX-L@LISTSERV.UNI-HEIDELBERG.DE

Hans Aberg <[log in to unmask]>
Sun, 28 Jun 1998 13:20:13 +0200
At 20:53 +0200 98/06/27, Frank Mittelbach wrote:
>a) Because TeX in its current incarnations (i.e. TeX3, eTeX, Omega) is a
>token-based interpreter, it works at an acceptable speed only if its
>internal coding is based on tokens which can be looked up directly in a
>hash table, resulting either in a primitive action or in a macro which
>consists of tokens. The recursion here has to come to an end pretty
>fast, and in current TeX code it reasonably does, but there are a
>number of recursions. If you change that to implement a manual lookup
>functionality on all levels, then you will find yourself slowing down
>the system by, I would guess, a factor of 50 or more --- and even if
>these days we have bigger machines than the one I had when I first saw
>LaTeX, this is still far outside of what is usable.
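
  To make the cost concrete (my illustration, not from Frank's mail):
compare a name resolved once at scanning time with a name that must be
reassembled and looked up on every call, which is what a manual lookup
functionality on all levels amounts to:

    % Direct: the token \foo is resolved by one hash lookup at scan time.
    \def\foo{bar}
    \foo
    % Manual lookup: the name travels as character tokens and goes
    % through \csname on each use; extra scanning and hashing per call.
    \def\lookup#1{\csname #1\endcsname}
    \lookup{foo}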

  I just want to point out that the way I worded the module proposal
does not exclude Frank's idea of a low-level naming convention:

  A module would itself define how it uses its names. So a module like \tex
could define that all its names should be interpreted directly, that is,
\tex/foo is executed directly.
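
  A minimal sketch of how such a direct interpretation could be realized
with today's TeX (my illustration, not part of the proposal): as long as
"/" is not a letter, the full name can still live in the hash table as a
single control sequence, so the one-lookup dispatch Frank describes is
preserved:

    % Define the hypothetical name \tex/foo; \csname builds the single
    % control-sequence token, and \expandafter lets \def see it.
    \expandafter\def\csname tex/foo\endcsname{\message{tex/foo called}}
    % Using it is again a single hash-table access:
    \csname tex/foo\endcsname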

  Further, modules with long names could be allowed to use short names
for their commands. For example, the module "environment" could decide to
use the names \envir/foo, or \@env/foo, or something similar. One would
then add a high-level command \environment/ which knows how to expand to
the short names. Such a high-level command is useful for those who do not
use the names much, because it is easier to remember. One can then go in
and optimize by using the short names directly.
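
  A hedged sketch of the long-to-short mapping (all names here are
hypothetical): the long form is a one-step expansion into the short form,
so optimized code that calls the short name directly saves exactly that
one expansion:

    % The short (internal) name does the actual work:
    \expandafter\def\csname envir/new\endcsname#1{\message{new env: #1}}
    % The long (user-level) name merely expands to the short one:
    \expandafter\def\csname environment/new\endcsname{%
      \csname envir/new\endcsname}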


>b) will be drastic: A current LaTeX format (without any packages
>loaded) uses about 51088 words of memory before \begin{document}. If the
>average word length in commands is 10 (which is far too low for a
>consistently implemented module concept), then this gets basically blown
>up to 500000, which is twice the amount of main memory that I have
>available on this machine for everything. I just looked: when we
>compiled the Companion, the log file showed 208957 words of memory out
>of 262141. Try to imagine to what size this would grow.
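
  (The arithmetic behind Frank's figure, as I read it: a command stored
as a single token today would instead carry its roughly 10-character
name as character tokens, costing about 10 memory words per occurrence
instead of 1, so $51088 \times 10 = 510880 \approx 500000$ words, which
already overflows the 262141-word main memory his log reports.)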

  I wondered about this: What kind of old computer is Frank using? I
have 32MB of RAM on mine, and with the computers sold today, I do not
think that is considered much. Further, that figure doubles every 18
months or so (and so does the speed), or even faster. The average
lifetime of a computer is a few years.

  So by the time LaTeX3 is out and starting to become accepted, which I
gather will take a few years, those memory concerns will no longer be
relevant.

  Speed is perhaps another matter: if a document can be processed in one
second instead of ten, that will always be a great advantage.

  Hans Aberg
                  * Email: Hans Aberg <mailto:[log in to unmask]>
                  * Home Page: <http://www.matematik.su.se/~haberg/>
                  * AMS member listing: <http://www.ams.org/cml/>
