From:     "Michael J. Downes" <[log in to unmask]>
Reply-To: Mailing list for the LaTeX3 project <[log in to unmask]>
Date:     Tue, 10 Nov 1998 10:06:02 -0500

David Carlisle writes:
>  >   Any mechanism which introduces extra packages into a central latex
>  >   `minimal distribution' must _not_ generate any extra work at the latex
>  >   support addresses (latex-l and latex-bugs).

Sebastian Rahtz <[log in to unmask]> writes:
> have you read Eric Raymond's "Cathedral and Bazaar" paper? your
> starting point is the Cathedral system, of a tightly controlled core
> run by mages; and, of course, given this, your conclusion is
> undeniable. However, Raymond does articulate Another Way (basically,
> The Linux Way) which could be applied to LaTeX.

I had just read that last week:

  http://www.redhat.com/redhat/cathedral-bazaar/

and was pondering that very question. Essentially what it would amount
to is handing off LaTeX development work to a large world-wide group
of "parallel processors" under the following strategy:

  1. No pre-screening of the developers. Anyone who wants to work on a
  piece is free to go ahead and do so.

  2. No explicit coordination from the top to divide up the work in an
  optimal way. Some people might end up working simultaneously on the
  same problem. Raymond says that in practice this has not been a
  problem for Linux because there is a natural tendency for people to
  choose different tasks (all the parallel processors are *different
  types* and choose their own work); each significant task, in fact,
  tends to get done by two or three people in slightly different ways:

    Buzzwords! "Distributed genetic optimization algorithm"

  3. The main kind of central coordination involves looking at what
  contributors have done and selecting pieces to be officially blessed
  into the standard distribution. For Linux this was done originally
  entirely by Linus, then later by designated teams for particular
  areas. I imagine this would often involve picking the best of two
  or three choices and merging in some nice details from the
  alternatives that were not selected.

  4. Very frequent releases.

One potential area for trouble that I see is that there is a symbiosis
on a given computer between the LaTeX system and the documents that it
serves. If one cannot guarantee line break fidelity in existing
documents when upgrading to a new LaTeX, people will have to choose
between

  a. Not upgrading

  b. Upgrading and adding some correction to all the old documents to
  ensure that their line breaks remain unchanged.

  c. Upgrading and keeping the old version of LaTeX installed as well,
  which means tying each document to a particular version of LaTeX, to
  particular versions of all the packages that it uses (and these often
  follow a different release schedule than the LaTeX kernel does), and
  even to particular tfm files (recall the scenario a few years ago
  with interword spacing in the tfm files generated by fontinst: first
  rather narrow, then changed to wider values in a later release). A
  sketch of how a document might at least record such requirements is
  given below.

I guess this would be equivalent in the Linux world to upgrading and
finding that some applications that worked with the old Linux don't
work with the new one. In the Linux case, however, most of the
applications would presumably be upgraded in due time by their owners,
whereas in the case of LaTeX documents *you* are the owner and have to
do all the upgrade work yourself.

I know that I find myself in this situation rather often: I go back to
a document that I wrote two years ago and try to run it through LaTeX
again; then I find that one of the pieces that it requires is missing
or doesn't work any more.
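
One small aid that LaTeX2e already provides here (a diagnostic rather
than a fix; using it this way is just a suggestion) is \listfiles:

  % In the preamble of the document:
  \listfiles
  % The log of each run then ends with a "*File List*" section giving
  % every class, package and other file that was loaded, together with
  % its date and version (where the file declares one), so that years
  % later one can at least see which pieces, and which versions of
  % them, the document was compiled with.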

The potential benefits of the bazaar model sound very attractive,
enough to warrant a serious look at what it might require.

But as a matter of fact I would say that LaTeX development is already
proceeding under something very like the bazaar model (the whole
contrib area), with only the core being under central control; and
even there, improvements have been submitted and accepted from outside
the LaTeX team. A more thorough bazaar strategy might involve little
more than making the source code repository public. I don't know the
details of how that works in the Linux world.

Michael Downes
