Mathematics Word Processing with TeX and LaTeX
A Short History of TeX. TeX (= tau epsilon chi, pronounced similarly to "blecch", not like the state known for `Tex-Mex' chili) is a computer language designed for use in typesetting; in particular, for typesetting math and other technical material (from Greek "techne" = art/craft, the stem of `technology').
In the late 1970s, Donald Knuth was revising the second volume of his multivolume opus The Art of Computer Programming. He got the galleys, looked at them, and said (approximately) "blecch!" He had just received his first samples of the new computer typesetting, and its quality was so far below that of the first edition of Volume 2 that he couldn't stand it. He thought for a while and said (approximately), "I'm a computer scientist; I ought to be able to do something about this." So he set out to learn the traditional rules for typesetting math, what constituted
good typography, and (because the fonts of symbols that he needed really didn't
exist) as much as he could about type design. He figured this would take about
6 months. (Ultimately, it took nearly 10 years, but along the way he had lots
of help from some people who should be well known to readers of this list –
Hermann Zapf, Chuck Bigelow, Kris Holmes, Matthew Carter and Richard Southall are acknowledged in the introduction to Volume E, "Computer
Modern Typefaces", of the Addison-Wesley "Computers &
Typesetting" book series.)
A year or so after he started, Knuth was invited by the American Mathematical Society (AMS) to present one of the principal invited lectures at its annual meeting. This
honor is awarded to significant academic researchers who were (mostly) trained as mathematicians but have done most of their work in areas that are not strictly mathematical (there are a number of physicists, astronomers, etc., in the annals of this lecture series, as well as computer scientists); the lecturer can speak on
any topic s/he wishes, and Knuth decided to speak on computer science in the
service of mathematics. The topic he presented was his new work on TeX (for typesetting) and Metafont
(for developing fonts for use with TeX). He presented not only the roots of the
typographical concepts but also the mathematical notions (e.g., the use of Bézier splines to shape glyphs)
on which these two programs are based. The programs sounded like they were just
about ready to use, and quite a few mathematicians, including the chair of the
Society's board of trustees, decided to take a closer look. As it turned out,
TeX was still a lot closer to a research project than to an
industrial strength product, but there were certain attractive features:
- it was intended to be used directly by authors
(and their secretaries) who are the ones who really know what they are writing
about;
- it came from an academic source, and was intended
to be available for no monetary fee (nobody said anything about how much
support it was going to need);
- as things developed, it became available on just
about any computer and operating system, and was designed specifically so
that input files (files containing markup instructions; this is not a
WYSIWYG system) would be portable, and would generate the same output on
any system on which they were processed – same hyphenations, line breaks,
page breaks, etc., etc.;
- other programs available at the time for
mathematical composition were:
- proprietary;
- very expensive;
- often limited to specific hardware;
- if WYSIWYG, the same expression in two places in
the same document might very well not look the same, never mind look the
same if processed on two different systems.
Mathematicians are traditionally, shall we say, frugal; their
budgets have not been large (before computer algebra systems, pencils, paper,
chalk and blackboards were the most important research tools). TeX came along just before the beginning of the personal computer era; although it was developed on one of the last of the "academic" mainframes (the DECsystem-10 and -20), it was very
quickly ported to some early HP workstations and, as they emerged, the new
personal systems. From the start, it has been popular among mathematicians,
physicists, astrophysicists, astronomers, and other research scientists who were plagued by the lack of the necessary symbols on typewriters and who wanted a more
professional look to their preprints.
To
produce his own books, Knuth had to tackle all the paraphernalia of academic
publishing – footnotes, floating insertions (figures and tables), etc., etc. As
a mathematician/computer scientist, he developed an input language that makes
sense to other scientists, and for math expressions, is quite similar to how
one mathematician would recite a string of notation to another on the
telephone. TeX is an interpreted language: the program accepts mixed commands and data. The command language is very low level (skip so much
space, change to font X, set this string of words in paragraph form, ...), but
is amenable to being enhanced by defining macro commands to build a very high
level user interface (this is the title, this is the author, use them to set a
title page according to AMS specifications). The handling of footnotes and similar structures is so well behaved that "style files" have been
created for TeX to process critical editions and legal
tomes. It is also (after some highly useful enhancements in about 1990) able to
handle the composition of many different languages according to their own
traditional rules, and for this reason (as well as for the low cost) it is quite widely used in Eastern Europe.
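As a small illustration (not from Knuth's talk — the formula and the `\norm' macro are my own examples), here is how a displayed formula reads in LaTeX input, much as one mathematician might recite it to another, and how a macro builds a higher-level command out of low-level ones:

```latex
\documentclass{article}
\begin{document}

% The input reads much like spoken math:
% "the integral from zero to infinity of e to the minus x squared, dx,
%  equals one half the square root of pi"
\[
  \int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2}
\]

% A macro turns low-level commands into a higher-level interface
% (\norm is a hypothetical helper, not a built-in command):
\newcommand{\norm}[1]{\left\| #1 \right\|}
The triangle inequality: $\norm{x + y} \le \norm{x} + \norm{y}$.

\end{document}
```

Note that the author writes only the logical content; spacing, sizing of the delimiters, and placement are all decided by TeX.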
Some
of the algorithms in TeX have not been bettered in any of the
composition tools devised in the years since TeX appeared. The most obvious example is the
paragraph breaking: text is considered a full paragraph at a time, not
line-by-line; this is the basic starting algorithm used in the hz-program by Peter Karow (and named for Hermann Zapf, who developed the special fonts this program needs to improve on the basics).
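The whole-paragraph optimization can be nudged from the input file. The following sketch uses real TeX primitives, though the particular values are illustrative, not recommendations:

```latex
% TeX chooses line breaks for the paragraph as a whole, minimizing
% total "badness". These primitives tune that optimization:
\tolerance=500          % how much interword stretching to tolerate
\emergencystretch=1em   % extra stretch allowed in a last-resort pass
\looseness=1            % request one line more than the optimum
Some paragraph text whose line breaks these settings influence.\par
```

Because the paragraph is optimized globally, changing a word near the end can alter a break near the beginning — something a line-by-line system cannot do.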
In
summary, TeX is a special-purpose programming language
that is the centerpiece of a typesetting system that produces publication
quality mathematics (and surrounding text), available to and usable by
individuals.
References. The standard reference for the most widely used TeX dialect, LaTeX, is Leslie Lamport’s LaTeX: A Document Preparation System, User’s Guide and Reference Manual, 2nd edition, Addison-Wesley, 1994.
The website of the TeX Users Group (TUG) at http://www.tug.org/
has lots of useful information and links such as “The Not So Short Introduction
to LaTeX2ε” at
http://www.ctan.org/tex-archive/info/lshort/english/lshort.pdf
Implementation (Windows). The LaTeX program itself is free. Currently, the most
popular implementation is MiKTeX at http://www.miktex.org/.
Installation is rather straightforward.
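A minimal document makes a good first test of the installation (the file name is arbitrary):

```latex
% hello.tex -- compile with:  latex hello.tex   (or pdflatex hello.tex)
\documentclass{article}
\begin{document}
Hello, \TeX! Here is some mathematics: $e^{i\pi} + 1 = 0$.
\end{document}
```

If this compiles to a one-page document, the installation is working.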
One also needs a TeX-friendly text editor; a nice shareware product is WinEdt at http://www.winedt.com/. A student license is $30. Download the trial version first. Installation can be a little bit confusing.
LaTeX can produce PDF files for easy distribution; you’ll need to download the free Acrobat Reader at http://www.adobe.com/products/acrobat/readstep2.html to view the PDF files. If you prefer PostScript files, you’ll need the free programs Ghostview and Ghostscript; see http://www.cs.wisc.edu/~ghost/index.html.