LATEX-L Archives

Mailing list for the LaTeX3 project

LATEX-L@LISTSERV.UNI-HEIDELBERG.DE

Date:
Tue, 11 Aug 2009 16:45:24 +0100
From:
Joseph Wright <[log in to unmask]>
J.Fine wrote:
> Joseph Wright wrote:
> 
>> J.Fine wrote:
>>> I want documents that can be readily converted to XML.  This is a
>>> future requirement for me.  TeX macros are not.
>>
>> There are two things going on here:
>>
>> 1) How LaTeX3 default input syntax might look. The current LaTeX2e model
>> is bad for this as it is not that structured. So there is a question of
>> trying to make things more structured, which in part means better
>> separation of content from appearance.
>>
>> 2) How LaTeX3 then typesets input.
>>
>> On (1), then a separate tool may well be the best approach, as you
>> suggest. However, I think that relates more to LaTeX3 input syntax (as
>> yet undecided). xparse is more about (2) (although I see that there is
>> quite a lot of overlap to worry about).
>>
>> In some ways, I wonder if this issue is related to something that is
>> explored in xparse.  The "real" version has the experimental concept:
>>
>> \DeclareDocumentCommandInterface
>> \DeclareDocumentCommandImplementation
>>
>> with the idea that the ...Interface function sets up the user syntax,
>> while the ...Implementation one actually contains the code:
>>
>> \DeclareDocumentCommandInterface \foo { s o m } { foo-implementation }
>> \DeclareDocumentCommandImplementation { foo-implementation } 3 {<code>}
>>
>> So to translate to XML (or whatever) the \foo part does not change, but
>> only the foo-implementation part does. In that way, the awkward nature
>> of the LaTeX user interface (optional arguments, stars, ...) is still
>> handled by the same code even if the result (in foo-implementation) is
>> very different.
>>
>> So you could imagine \DeclareDocumentCommand doing these for every
>> function:
>>
>> \cs_new:Npn \DeclareDocumentCommand #1#2#3 {
>>   \DeclareDocumentCommandInterface #1 {#2}
>>     { \token_to_str:N #1 -implementation }
>>   \DeclareDocumentCommandImplementation
>>     { \token_to_str:N #1 -implementation }
>>     { \xparse_get_arg_num:n {#2} } % or some such thing!
>>     {#3}
>> }
>>
>> Writing XML is then a case of changing each implementation, without
>> changing the input at all.  Would this go in the right direction?
> 
> This is a wonderful example of the point I'm making.  One starts with TeX macros, adds more features, satisfies more requirements.

Whatever method you use to program, you have somewhere to add features
to satisfy requirements :-)

> And then one suggests programming TeX macros that write XML output.  And you find yourself programming a LaTeX-like input syntax to XML translator in TeX macros.  This is not something I want to be part of, nor something I'd like to use (even though TeX is Turing complete).

Somewhere in a LaTeX to XML conversion some code has to read LaTeX and
output XML.
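To make that concrete, here is a minimal sketch using the experimental
\DeclareDocumentCommandInterface / \DeclareDocumentCommandImplementation
concepts quoted above.  The emph-implementation label is invented purely
for illustration, and \iow_term:x (which just writes to the terminal/log)
stands in for whatever output stream would really be used:

```latex
% Sketch only: one interface, two alternative implementations
% (only one of which would be loaded at a time).
\DeclareDocumentCommandInterface \emph { m } { emph-implementation }

% Typesetting implementation:
\DeclareDocumentCommandImplementation { emph-implementation } 1
  { \itshape #1 }

% XML-writing implementation: the user syntax \emph{...} is unchanged;
% only the implementation differs.
\DeclareDocumentCommandImplementation { emph-implementation } 1
  { \iow_term:x { <emph> \exp_not:n {#1} </emph> } }
```

The point is that a document containing \emph{...} needs no change at
all; switching between typeset and XML output is a matter of loading a
different set of implementations.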

> It seems that LaTeX3 project will use only TeX macros to solve its problems.  This restricts the problems it can solve.  I think this restriction prevents the creation of LaTeX3.

I take it you see a work-flow something like:

1) Write document in LaTeX3 document syntax (broadly, LaTeX2e syntax but
with some changes, and only the document part).

2) Process this with whatever language decided on to convert from LaTeX
input syntax to an intermediate form (perhaps LaTeX input -> XML ->
LaTeX typesetting).

3) If necessary, process the intermediate form to produce final output.
For example, the most likely approach (at present) would be to typeset
the intermediate form using TeX.
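As a purely hypothetical illustration of steps (1) and (2), a trivial
document and one imaginable intermediate form (the element names here
are invented, not a proposal):

```latex
% Step (1): document written in LaTeX3 document syntax
\section{Introduction}
Some \emph{emphasised} text.
```

```xml
<!-- Step (2): a possible XML intermediate form -->
<section>
  <title>Introduction</title>
  <p>Some <emph>emphasised</emph> text.</p>
</section>
```

Step (3) would then typeset the intermediate form, most likely with TeX.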

Am I understanding you correctly?  If so, I assume you're broadly
supportive of doing TeX programming for (3) to make the typesetting work
as well as possible, but imagine (2) done with something else (I've
suggested Lua for obvious reasons, but the principle does not depend on
the programming language used).

This strikes me as rather similar to what the ConTeXt programmers are
doing, using Lua for the programming as I suggest above (although I
don't think they are taking quite such a separated out approach: anyone
familiar enough with Mk IV to comment?).
-- 
Joseph Wright
