Thanks for the comments: it is useful to hear what people who are not
actively working with the current LaTeX3 ideas think.
> Encourage Jonathan and Taco to converge, or at least Jonathan to try
> integrating essential programming parts of LuaTeX into XeTeX.
From what I understood on the XeTeX list, this is non-trivial
(especially as LuaTeX is not finalised yet).
> TeX's inherent limitations as a programming language are a major pain.
> They will both cause performance and maintenance issues for a
> "microkernel", to a degree where one will tend to code bypassing this
> kernel for efficiency reasons.
That's of course a good point. I'm not sure how one measures this.
Perhaps we'll get a sense for the "cost" of the system if something
which is big but efficient in TeX/LaTeX2e gets translated to expl3 and
we can then compare the two. Suggestions for a good test case? As
computers get faster, this is perhaps slightly less of an issue. (Robin
Fairbairns has pointed out to me that something like my siunitx package
would have been impossible with earlier TeX systems as it would have
simply been too slow.)
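To make the idea of such a comparison concrete, here is the kind of
micro-benchmark I have in mind: the same integer loop written first in
classical TeX macros and then with the expl3 interfaces, timed with
something like pdfTeX's \pdfelapsedtime. (A sketch only; the loop bound
and timing mechanism are placeholders, not a proposed test case.)

```latex
\documentclass{article}
\usepackage{expl3}

% Classical TeX: count down by tail recursion on \ifnum
\newcount\mycount
\def\ClassicLoop{%
  \ifnum\mycount>0
    \advance\mycount by -1
    \expandafter\ClassicLoop
  \fi}

% expl3: the same loop via \int_while_do:nn
\ExplSyntaxOn
\int_new:N \l_my_int
\cs_new_protected:Npn \ExplLoop
  {
    \int_while_do:nn { \l_my_int > 0 }
      { \int_decr:N \l_my_int }
  }
\ExplSyntaxOff

\begin{document}
\mycount=100000 \ClassicLoop
\ExplSyntaxOn \int_set:Nn \l_my_int { 100000 } \ExplSyntaxOff
\ExplLoop
Done.
\end{document}
```

Running both versions under a timer would give one data point for the
"cost" of the expl3 layer, though a real test case would need to be
something substantially larger.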
> If the proposed microkernel would default to hook into Lua as its
> algorithmic interface and we can get Taco and Jonathan to converge to a
> common functionality subset that we can base this on, I think that the
> benefits could be worth the decision to ditch all other engines.
This is, I suppose, the calculation that the ConTeXt people have made.
However, Karl Berry has pointed out to me in the past that the ConTeXt
user base is rather "selective", and so their calculation involves
rather different considerations from those that apply to LaTeX.
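For readers who have not followed the LuaTeX side of this: the kind of
"algorithmic interface" being discussed is what LuaTeX's \directlua
primitive already provides, namely delegating computation from the macro
layer to Lua and handing the result back to TeX. (An illustrative
fragment only, not something from the proposal itself.)

```latex
\directlua{
  % compute in Lua, then feed the result back into the TeX input stream
  local sum = 0
  for i = 1, 1000 do sum = sum + i end
  tex.print(tostring(sum))
}
```

A microkernel defaulting to this mechanism would make such delegation
the normal route for algorithmic work, which is precisely why tying it
to a single engine family is such a significant decision.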