Javier Bezos writes:

> [This message was originally posted two weeks ago, but I have not
> received it yet. Apologies if someone is receiving it for the second
> time.]

i don't think it ever made it to the list before.

> In addition, there is a potential danger. If LaTeX is so dramatically
> rewritten, *will* existing packages work?

i would say: *if* such a dramatic rewrite is done in full, then *no*, existing packages will not work. of course you could make them all work in parallel, by supporting every kind of old syntax alongside the internal new syntax, but that in my eyes would be a horrible nightmare.

on the other hand, *assume* for the moment that there would be a full kernel, not just a number of packages as we put out; a kernel that is much more consistent in its naming conventions and much more orthogonal in functionality than what we have now. also assume that enough people (who understand TeX programming) find that kernel a good toolkit and are interested in using it. now, how many packages on CTAN are functional with the current latex, and how many man-hours would you need to convert them once and for all? many, i guess, but perhaps not so many as to make it impossible in a short time frame, assuming enough people find it worthwhile.

many ifs, true. but if you don't make a clean cut, and instead build forever on the fragile base and all the different coding layers that LaTeX currently consists of, then you will never get much further than we are now: packages not working with each other, incompatible coding for the same functionality, and, what i think is important, no structure whatsoever that makes the coding at least halfway understandable.

> For many years the LaTeX Team has been encouraging LaTeX-like syntax in
> packages; now, the proposed syntax is quite different from anything known.
> I've found most of tools very useful, but I think their command names are
> a bit clumsy. Why not a combination of the old and the new, say:
>
> \newcommand{\name}[cn]{definition}
>
> or similar?

the main point of what we suggest (at least conceptually, even if not exactly in the form presented) is dividing up the coding layers. there is a document-level coding layer, which consists of applying tags and allows for alias-like functions (which \newcommand sort of provides); there is supposed to be a design layer, which allows layout to be described by declarations; and there is a low-level coding layer, which is supposed to be used for implementing the tags and layout functions, and which would be something like the proposed L3PL.

the important point here is consistency: eg, \name:cn tells me how \name:cn handles its arguments, in contrast to \name:oo. in other words, the convention provides a scheme for defining \name:nn and all its argument-handling variants once and for all. this doesn't belong at the document level, as there one should not be concerned with expanding arguments; and without the naming convention (perhaps not exactly the one used, but something similar in spirit) it wouldn't be worthwhile on the programming level either. right now many packages, and the kernel itself, define such commands and some of their variants, but they do it ad hoc, and you can't tell by looking at the code.

> The programming experience gained
> over the past years could be lost in part (and the packages themselves)
> and LaTeX3 functionality could be lesser than LaTeX2e one, at least for
> a transition period. How long will be this period? While
> uncompatibility is acceptable to some extent, perhaps l3 is *too*
> innovatory. (Remember Jobs has partially rejected Rhapsody because of a
> similar reason.)

maybe it is, who knows? which is one of the reasons why we have put it out for discussion.
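to make the argument-specifier idea concrete, here is a hypothetical sketch in plain TeX. the names and the \catcode handling are invented for illustration only (the actual L3PL code is different); it shows how one base function \name:nn and two variants encode their argument handling directly in their names:

```latex
% ":" must have letter catcode to appear in macro names
\catcode`\:=11

% \name:nn -- base function: two plain ("n") brace-group arguments
\def\name:nn#1#2{[#1/#2]}

% \name:cn -- "c": the first argument is a csname string, turned into
% a control sequence before the base function sees it
\def\name:cn#1#2{%
  \expandafter\name:nn\expandafter{\csname #1\endcsname}{#2}}

% \name:on -- "o": the first argument (a single macro) is expanded
% once before being passed on
\def\name:on#1#2{%
  \expandafter\name:nn\expandafter{#1}{#2}}
```

so a programmer who meets \name:cn in some package knows immediately, without reading its definition, that it is \name:nn with its first argument converted from a csname; today every package reinvents such \expandafter chains ad hoc.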
and this, by the way, is also one of the reasons why we haven't put it out earlier: we have never been sure ourselves (and aren't now) whether it is something that can work or will work (even though it worked for us in prototyping many ideas).

Richard Walker said in some earlier email that he thought it bad that such ideas have been around in our heads for a long time without being made public, and that we are not making friends this way. sorry for that, but i wonder what he would have said if we had made similar (probably even worse, as far as consistency is concerned) ideas public at the same time as 2e was launched to get the latex world back into some usable state? it might have killed LaTeX, or at least 2e (just as the dc/ec fonts nearly killed it), because probably nobody would have believed that both things are possible at once: a stable production system to rely on, and, in parallel, thinking about innovative (perhaps even too innovative) ideas.

i think that we can now consider such ideas with some detachment, without fearing the worst for the stable system we have. and with Omega and etex, and perhaps some combination of the two, one day available to the community at large, we have to think along those lines anyway: using the functionality of either system in the kernel would mean a large shift and a big recoding, while not using them would mean that we stay limited forever, even once the tools are around.

so to summarize: this PL presents concepts which we think are worth looking at and thinking about. actually applying them, and how, is something that remains to be seen, hopefully after people have done a little more with it than just glancing over its documentation. my intention, at least, is to provide a language interface concept coded in this language sometime this year, to show how to build real stuff using the principles behind it. perhaps others will get interested enough to explore its potential as well.
the naming convention is unfamiliar the first time you use it, but our experience is that after a short period you stop thinking of it as clumsy. it is definitely less clumsy than the coding i had to use in the kernel for things like NFSS to make it happen, which is awfully hard to understand afterwards.

frank