It suffered from the second system effect

Sunday, March 19th, 2023

I didn’t realize that Brian Kernighan had written UNIX: A History and a Memoir until I read Dwarkesh Patel’s review:

The story of the creation of Unix is kind of insane. In 1964, MIT developed CTSS, an operating system that allowed multiple users to use the same machine at the same time by quickly alternating between the users’ tasks. It was a great success, and MIT worked with Bell Labs and General Electric to develop a follow-up called Multics. Among the researchers from Bell Labs working on Multics were Ken Thompson and Dennis Ritchie. In this book, Brian Kernighan doesn’t go into too much detail about why Multics failed except to say that it suffered from the second system effect, where engineers respond to a successful product by creating an overcomplicated second version which tries to do too many things. In 1968, Bell Labs pulled out of the Multics project.

It was around this time that Ken Thompson found a little-used PDP-7 workstation (which was apparently a shitty computer even by 1969 standards). At first he built a space travel game on the machine. He then decided to write a disk scheduling algorithm on it, but he couldn’t test it without writing some other programs to load the disk with data. Kernighan writes, quoting Thompson:

“At some point I realized that I was three weeks from an operating system.” He needed to write three programs, one per week: an editor, so he could create code; an assembler, to turn the code into machine language that could run on the PDP-7; and a “kernel overlay—call it an operating system.”

Right at that time, Ken’s wife went on a three-week vacation to take their one-year-old son to visit Ken’s parents in California, and Ken had three weeks to work undisturbed.

[…]

Thompson wanted to eventually write Unix in a programming language above assembly, but existing languages like Fortran and Cobol were too big for writing operating systems on computers with 8 KB of memory. He developed a language called B, but it didn’t have the abstractions necessary to build an operating system.

Dennis Ritchie then developed C, which was typed and which formally separated pointers from integers. The type of a variable determined how many bytes you got from memory, and you could do things like add one to a pointer to get the next item in an array. But even this wasn’t enough to develop an operating system, so Ritchie eventually added structs, and at this point Unix was ready to be written in C.
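
To make that concrete, here is a toy sketch in modern C (not Ritchie’s original dialect; the struct and its fields are invented for illustration):

    #include <stdio.h>

    /* A struct groups related fields into one typed unit of memory,
       the feature Ritchie had to add before an operating system could
       be written in C. (This struct is invented for illustration.) */
    struct record {
        int  size;
        char name[14];
    };

    int main(void) {
        int nums[4] = {10, 20, 30, 40};
        int *p = nums;

        /* Pointer arithmetic is scaled by the type: p + 1 advances
           sizeof(int) bytes, not one byte. */
        printf("%d %d\n", *p, *(p + 1));      /* prints: 10 20 */

        struct record files[2] = {{512, "draft"}, {1024, "notes"}};
        struct record *f = files;

        /* The same + 1 now advances sizeof(struct record) bytes. */
        printf("%s\n", (f + 1)->name);        /* prints: notes */
        return 0;
    }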

[…]

In 1991, Richard Gabriel published an article where he explained that Unix and C were so successful because they followed the “Worse is Better” philosophy:

Both early Unix and C compilers had simple structures, are easy to port, require few machine resources to run, and provide about 50%-80% of what you want from an operating system and programming language.

Half the computers that exist at any point are worse than median (smaller or slower). Unix and C work fine on them. The worse-is-better philosophy means that implementation simplicity has highest priority, which means Unix and C are easy to port on such machines. Therefore, one expects that if the 50% functionality Unix and C support is satisfactory, they will start to appear everywhere. And they have, haven’t they?

Unix and C are the ultimate computer viruses.

This explanation helps us understand why Unix was successful and Multics was not. Naively, you would have expected Multics to succeed. After all, whereas Multics was developed by dozens of researchers across MIT, Bell Labs, and General Electric, Unix was created by a single person looking for something to do while his wife was out of town.

But maybe Unix succeeded because it was initially developed in a three-week frenzy by one person. Ken Thompson was trying to test out a disk scheduling algorithm on a shitty machine with little memory. He didn’t have the time, inclination, or pressure to complicate things. He built a simple hierarchical file system, he allowed any program to process any kind of file regardless of format, and he implemented a small number of intuitive system calls. At the time, he was doing what was expedient, but only later would the wisdom of these decisions become apparent.

Kernighan writes:

The hierarchical file system was a major simplification of existing practice, though in hindsight it seems utterly obvious—what else would you want? Rather than different types of files with properties managed by the operating system, and arbitrary limits on the depth to which files could be nested in directories, the Unix file system provides a straightforward view …

Files contain uninterpreted bytes; the system itself does not care what the bytes are, or know anything about their meaning.

Files are created, read, written and removed with half a dozen straightforward system calls. A handful of bits define access controls that are adequate for most purposes.
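
The flavor of that interface survives in POSIX today. Here is a minimal sketch (not from the book; error handling omitted, filename invented) exercising those half-dozen calls, with 0644 as the handful of access-control bits:

    #include <fcntl.h>
    #include <unistd.h>

    int main(void) {
        /* Create and write: the kernel stores the bytes without
           interpreting them; 0644 sets owner/group/other permissions. */
        int fd = open("demo.txt", O_CREAT | O_WRONLY | O_TRUNC, 0644);
        write(fd, "uninterpreted bytes\n", 20);
        close(fd);

        /* Read the bytes back, echo them, then remove the file. */
        char buf[32];
        fd = open("demo.txt", O_RDONLY);
        ssize_t n = read(fd, buf, sizeof buf);
        write(1, buf, n);                     /* fd 1 is stdout */
        close(fd);
        unlink("demo.txt");
        return 0;
    }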

If Thompson had been part of a big research project, he would have been asked to add features that complicated his system and rendered it unable to run on his shitty machine. More importantly, Unix would have become inaccessible to potential adopters trying to understand how it worked. They wouldn’t have been able to write the useful programs that would attract even more adopters.

The story of C is very similar. As Linus Torvalds explains…, you can practically see the assembly that a C program will generate. In 1972, this lack of overhead was a necessity for any programming language being used to write an operating system. Today, it is a boon to programmers who want to know exactly how their program will work at the level of individual instructions and bytes of memory.
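
A small illustration of that transparency (the assembly shown is a rough x86-64 sketch; actual output depends on the compiler and flags):

    /* Incrementing an array element in C maps almost one-to-one onto
       machine instructions: compute the address a + i*4, load, add,
       store. */
    void bump(int *a, long i) {
        a[i] = a[i] + 1;
        /* roughly:  mov eax, [rdi + rsi*4]
                     add eax, 1
                     mov [rdi + rsi*4], eax  */
    }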

Comments

  1. Bob Sykes says:

    So, today Macs run on Unix, which itself sits on C, and Windows machines run on C. Is that the takeaway?

  2. Adar says:

    Those persons able to write innovative code and operating systems no longer exist in the USA? Gone to foreign shores and for good?

  3. Freddo says:

    Bob, computers process assembly language, which is crazy detailed and error-prone for human programmers, so some smart people invented programming languages such as C so humans can program in a more abstract language and then a compiler program can do the crazy detailed hard work of translating our C program into an assembly program.

    C and its cousin C++ are very powerful languages that allow you to stick your fingers in with the engine running. That makes them very suitable for writing an operating system such as Unix/Linux/Windows, but too error-prone for writing accounting software.

    Computers require an operating system for the same reason. Programming straight to the hardware is much too detail-oriented, and everyone prefers to have an operating system that has figured out the hard parts of writing files, updating the display, and protecting memory.

    The story above sits at the inflection point where computers went from one-per-company to one-per-department and on to individual workstations, requiring a huge shift in speed, flexibility, and price that old bureaucracies such as IBM could not meet.

  4. TRX says:

    K&R C, the original standard, was extremely barebones, and implementations varied drastically across vendors.

    ANSI C added a boatload of features to the basic language and vastly expanded the “runtime library”, which technically isn’t part of the C language but is integral to any useful C program. Then came all the modern impedimentia, and C++.

  5. Pseudo-Chrysostom says:

    “This explanation helps us understand why Unix was successful and Multics was not. Naively, you would have expected Multics to succeed. After all, whereas Multics was developed by dozens of researchers across MIT, Bell Labs, and General Electric, Unix was created by a single person looking for something to do while his wife was out of town.”

    In other words, the main difference between the former cases and the latter case, is that Ken was a king in his own kingdom, however small it may be.

    All successful endeavors are like this. You have your Gerald Bulls, your William Shockleys, your Ed Halls… Kelly Johnsons, Steve Jobs, Thomas Edisons, Henry Fords… guys who acted like they were kings of their own castles, and whom other guys who also wanted a slice of kingly power tended to consider assholes.

    Large bureaucratified entities mediate interpersonal conflict by giving every constituent the feeling that they are all getting a slice of kingly power – i.e., committees, procedures, ‘meetings’, et cetera – the sense that everyone involved is an equal collaborator. But that’s not how things get done.

    The correct response to conflicting personalities is to either split them off into different teams, or make it clear that one is subordinate to the other, and who has the real authority to enforce that ordination. ‘We need to concentrate all our resources onto one thing to maximize its chances of success’ is a mental trap that doesn’t really correspond to the reality of what actually succeeds, or at the very least is much farther down the list of real limiting factors that one may encounter over the course of an endeavor than is commonly thought by practically anyone. As you can see, remarkable results can be achieved by even very small teams, by even one single guy alone, as long as they have proprietary freehold over their business.
