Growth
The success of our Interdata 8/32 portability experiment soon led to another, by Tom London and John Reiser, on the DEC VAX 11/780. This machine became much more popular than the Interdata, and Unix and C began to spread rapidly, both inside and outside AT&T. Although by the mid-1970s Unix was in use by a variety of projects within the Bell System, as well as by a small group of research-oriented industrial, academic, and government organizations outside our company, its real growth began only after portability had been achieved. Of particular note were the System III and System V versions of the system from the emerging Computer Systems division of AT&T, based on work by the company's development and research groups, and the BSD series of releases from the University of California at Berkeley, which derived from the research organizations at Bell Laboratories.
During the 1980s the use of the C language spread widely, and compilers became available on nearly every machine architecture and operating system; in particular it became popular as a programming tool for personal computers, both for manufacturers of commercial software for these machines and for end-users interested in programming. At the beginning of the decade, nearly every compiler was based on Johnson's PCC; by 1985, many independently produced compiler products were available.
Standardization
It was clear by 1982 that C required formal standardization. The best approximation to a standard, the first edition of K&R, no longer described the language in actual use; in particular, it mentioned neither the void nor the enum types. While it foreshadowed the newer approach to structures, only after it was published did the language support assigning them, passing them to and from functions, and associating the names of members firmly with the structure or union containing them. Although the compilers distributed by AT&T incorporated these changes, and most of the non-PCC compiler vendors quickly picked them up, there remained no complete, authoritative description of the language.
The first edition of K&R was also insufficiently precise on many details of the language, and it became increasingly impractical to regard PCC as a 'reference compiler'; it did not fully embody even the language described by K&R, let alone the subsequent extensions. Finally, the incipient use of C in commercial and government contract projects meant that the imprimatur of an official standard was important. Thus (at the urging of M. D. McIlroy), ANSI established the X3J11 committee under the direction of CBEMA in the summer of 1983, with the goal of producing a C standard. At the end of 1989, X3J11 produced its report [ANSI 89], which was subsequently accepted by ISO as ISO/IEC 9899-1990.
From the outset, the X3J11 committee took a cautious, conservative view of language extensions. To my great satisfaction, they took their goal seriously: 'to develop a clear, consistent, and unambiguous Standard for the C programming language which codifies the common, existing definition of C and which promotes the portability of user programs across C language environments.' [ANSI 89] The committee realized that the mere promulgation of a standard does not change the world.
X3J11 introduced only one truly important change to the language itself: it incorporated the types of the formal arguments into the type signature of a function, using syntax borrowed from C++ [Stroustrup 86]. In the old style, external functions were declared like this:
double sin();
which says only that sin is a function returning a double (that is, double-precision floating-point) value. In the new style, this was better rendered
double sin(double);
to make the argument type explicit and thus encourage better type checking and appropriate conversion. Even this addition, though it produced a noticeably better language, caused difficulties. The committee rightly felt that it was not feasible simply to outlaw 'old-style' function definitions and declarations, but it also agreed that the new forms were better. The inevitable compromise was as good as it could have been, though the language definition is complicated by allowing both forms, and writers of portable software have to contend with compilers not yet up to standard.
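As a small illustration (my own, not from the paper; halve is a hypothetical function), a visible prototype lets the compiler convert and check arguments at the call site, which an old-style declaration cannot do:

#include <stdio.h>

double halve(double);   /* new-style declaration: the argument type is part of the signature */

int main(void)
{
    /* Because the prototype is visible, the int argument 5 is converted
       to 5.0 before the call; with only an old-style declaration,
       double halve(), no conversion or checking would be performed. */
    printf("%f\n", halve(5));
    return 0;
}

double halve(double x)
{
    return x / 2.0;
}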
X3J11 also introduced a host of smaller additions and adjustments, for example the const and volatile type qualifiers and slightly different type promotion rules. Nevertheless, the standardization process did not change the character of the language. In particular, the C standard did not attempt to specify the language semantics formally, and so there can be disputes over fine points; nevertheless, it successfully accounted for changes in usage since the original description, and it is sufficiently precise to base implementations on it.
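For concreteness (these declarations are illustrative, not taken from the paper), the new qualifiers look like this:

const double pi = 3.141592653589793;  /* const: the program may not modify the object */
volatile int *status_reg;             /* volatile: every access must actually occur,
                                         e.g. for a memory-mapped device register */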
The core C language escaped almost unscathed from the standardization process, and the Standard emerged more as a better, more careful codification than as a new invention. More important changes took place in the language's environment: the preprocessor and the library. The preprocessor performs macro substitution, using conventions distinct from those of the rest of the language. Its interaction with the compiler had never been well described, and X3J11 attempted to remedy the situation. The result is noticeably better than the explanation in the first edition of K&R; besides being more comprehensive, it provides operations, such as token concatenation, previously available only through accidents of implementation.
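A brief sketch of the operations the Standard made official (the macros here are my own illustrations, not from the paper):

#define PASTE(a, b)   a ## b        /* token concatenation                          */
#define STRINGIZE(x)  #x            /* conversion of an argument to a string literal */

int PASTE(var, 1) = 10;             /* expands to: int var1 = 10;       */
char *name = STRINGIZE(var1);       /* expands to: char *name = "var1"; */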
X3J11 correctly believed that a full and careful description of a standard C library was as important as its work on the language itself. The C language itself does not provide for input-output or any other interaction with the outside world, and thus depends on a set of standard procedures. At the time of publication of K&R, C was thought of mainly as the system programming language of Unix; although we provided examples of library routines intended to be readily transportable to other operating systems, the underlying support from Unix was implicitly understood. As a result,
the X3J11 committee spent much of its time designing and documenting a set of
library routines required to be available for all conforming implementations.
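As a tiny illustration (mine, not the paper's), even the simplest hosted program depends on a library routine, not a language feature, for its output:

#include <stdio.h>   /* declarations of the standard I/O routines */

int main(void)
{
    /* printf is a library routine that every hosted implementation
       must supply; the language itself has no input-output statements. */
    printf("hello, world\n");
    return 0;
}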
The current activities of the X3J11 committee are limited by the rules of the standards process to the issuing of interpretations of the existing standard. However, the informal group originally convened by Rex Jaeschke as NCEG (Numerical C Extensions Group) was officially accepted as subgroup X3J11.1 and continues to consider extensions to C. As the name implies, many of these possible extensions are intended to make the language more suitable for numerical use: for example, multi-dimensional arrays whose bounds are dynamically determined, incorporation of facilities for dealing with IEEE arithmetic, and making the language more effective on vector machines and other advanced architectures. Not all the possible extensions are specifically numerical; they include a notation for structure literals.
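To make the flavor of these proposals concrete (the syntax below is illustrative only; it is the form such ideas later took in C99 as variable-length arrays and compound literals, not necessarily the NCEG drafts of the time):

/* An array parameter whose bound is determined at run time. */
double dot(int n, double a[n], double b[n])
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* A structure literal written directly in an expression. */
struct point { int x, y; };
void reset(struct point *p)
{
    *p = (struct point){ 0, 0 };
}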
Successors
C and even B have several direct descendants, though they do not rival Pascal in generating progeny. One side branch developed early. When Steve Johnson visited the University of Waterloo on sabbatical in 1972, he brought B with him. It became popular on the Honeywell machines there, and later spawned Eh and Zed (the Canadian answers to 'what follows B?'). When Johnson returned to Bell Labs in 1973, he was disconcerted to find that the language whose seeds he had brought to Canada had evolved back home; even his own Yacc program had been rewritten in C, by Alan Snyder. More recent descendants of C include Concurrent C [Gehani 89], Objective C [Cox 86], C* [Thinking 90] and, in particular, C++ [Stroustrup 86].
The language is also widely used as an intermediate representation (essentially, as a portable assembly language) for a wide variety of compilers, both for direct descendants such as C++ and for independent languages such as Modula-3 and Eiffel.