> On Wed, 12 August 1998 21:49:41 +0200,
> Martin Schroeder writes:
> > In <...> Frank Mittelbach writes:
> > > b) will be drastic: a current LaTeX format (without any packages
> > > loaded) uses about 51088 words of memory before \begin{document}; if
> > > the average word length in commands is 10 (which is far too low with
> > > a consistently implemented module concept) then this basically gets
> > > blown up to 500000, which is twice the amount of main mem
> [...]
> Frank, either I misunderstand your ``word'' or you are wrong with this
> analysis.

i guess neither. :-) the problem is that Martin cited me out of
context. I was replying to a suggestion to replace TeX's token-based
mechanism, i.e. \foobar being internally a single token in main mem
plus a few bits of char mem, with a mechanism in which \foobar would be
7 tokens. In fact we were discussing names like \foo/bar_bas_..., i.e.
even longer streams of tokens that would have to be stored and
processed each time.
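
to make that concrete, here is a small sketch (the names are made up,
not taken from the original proposal):

  % a control sequence name is hashed once when first scanned; after
  % that \foobar is a single token in main mem, and the name itself
  % is kept just once in the string pool
  \def\foobar{some replacement text}
  \foobar % one token to store and to look up

  % a spelled-out module name such as
  %   \foo/bar_bas_...
  % scans as the control sequence \foo followed by the character
  % tokens / b a r _ b a s _ ..., a stream that has to be kept in
  % token memory and reprocessed on every use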

my claim back then was that TeX is tailored to be a token-based
program and that giving this up would be undesirable for several
reasons.
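
for reference, the arithmetic behind the quoted figure runs roughly as
follows (the main mem size is inferred from the ``twice'' in the
quote, not stated there):

  51088 words x 10 tokens per name = about 510000 words
  510000 / about 250000 words of main mem = about 2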