
May 26, 2012

Unix and its philosophy

Unix enthusiasts like me enjoy singing the praises of the "Unix philosophy", the unwritten principles that have guided the development of the system, favoring the collaboration of simple tools over big monolithic programs that do too much. In a recent blog post, John D. Cook criticizes this philosophy for being too utopian, claiming that Unix itself (at least in its modern incarnations) does not abide by it. He makes the point that users have come to expect some complexity from their tools, and that this is an acceptable alternative to the ideal Ted Dziuba describes as Taco Bell programming. In this article I will explore how well Unix agrees with its philosophy, as well as the reasons that might have led it to diverge (superficially) from its founding principles.

Unix simplicity


I have come across several definitions of the philosophy of Unix; Eric S. Raymond made a career out of attempting to define it, and yet there is no canonical statement of what it is. For the sake of argument, let us take the definition Cook uses in his article:
  • Write programs that do one thing and do it well.
  • Write programs to work together.
  • Write programs to handle text streams, because that is a universal interface.

I will deliberately address these points starting with the last one, and in doing so try to demonstrate that the most important points are not the ones you would expect: programs do not need to be dead simple for the system as a whole to follow a "simple" philosophy.

Write programs to handle text streams

The most important word here is stream. (As a side note, Linus Torvalds prefers the term "byte stream" over text stream. That is understandable: he works at such a low level, so close to the machine, that it would be too costly to deal in text.) Unix evolved from the minds of electrical engineers, and it inherited the stream/filter design of digital circuits. The approach is also familiar to functional programmers, who reason in terms of filters, maps and folds.

What does this mean? Say you have to write a script to extract data from a log file. An intuitive approach would be to pass the log file (or its path) to the script as an argument. Alas, by doing so you break the stream. The philosophy would instead encourage you to read the file as a stream of lines coming from stdin. By reading its input that way, the program frees itself from the boundaries of a physical file on a physical disk, and you allow it to leverage the power of other Unix tools like sed, grep, tr or cut. This is the Unix philosophy at its best.
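
As a rough sketch of the stream-oriented version (the script name, the log format and the ERROR marker are all invented for the example):

    #!/bin/sh
    # extract_errors.sh -- hypothetical filter: print every ERROR line
    # read on stdin. Because it reads a stream instead of opening a
    # named file, it can sit anywhere in a pipeline.
    while IFS= read -r line; do
        case $line in
            *ERROR*) printf '%s\n' "$line" ;;
        esac
    done

The same script then works on a plain file, on a rotated and compressed log, or on the output of another filter, without ever knowing where its input comes from:

    ./extract_errors.sh < /var/log/app.log
    zcat /var/log/app.log.1.gz | ./extract_errors.sh | cut -d' ' -f1 | sort | uniq -c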

Write programs that work together

In order to achieve simplicity in design, a system needs a powerful glue to coordinate the actions of its primitives. Unix's answer was the shell: a simple, often clumsy language that nevertheless does the job. Shell programming can be awkward, if not sometimes plain wrong; I spent the better part of a day trying to compute the difference between two datetimes in ksh, and believe me, it is annoying. There are, however, two areas where the shell shows its superiority.

Process management

Every shell will offer you the following features in some form:
  • Pipes: A kernel mechanism that links the output of one command to the input of another, creating a pipe for the stream to flow through.
  • Command substitution: A feature that runs a command and captures its output, typically into a variable (its cousin, process substitution, presents that output as a file).
  • Redirection: Manual control over a process's file descriptors, which blurs the line between a stream and a file.
  • Job control: Interactive control over simple parallelism when launching processes.
All of the above control how and when separate processes are launched and, more importantly, what to do with their outputs. After all, the shell is nothing more than glue making programs work together.
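
A few one-liners to make those mechanisms concrete (bash syntax; the file names and the long_running_task command are made up for the example):

    # Pipe: connect the output of one process to the input of another
    ps aux | sort -k3 -rn | head -5

    # Command substitution: capture a command's output in a variable
    today=$(date +%Y-%m-%d)

    # Process substitution: present a command's output as a file
    diff <(sort left.txt) <(sort right.txt)

    # Redirection: file descriptors blur the line between stream and file
    make > build.log 2>&1

    # Job control: run in the background, do something else, then wait
    long_running_task &
    wait $!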

String manipulation

By string manipulation I mean "pattern matching", or "globbing", depending on the shell you use. This refers to a particular syntax, too often mistaken for regular expressions, that behaves like an autocomplete for scripts. Expansion is generally environment-specific: it is affected by outside factors such as environment variables, the current working directory and the files present in it. I will not explain here how it works in full, as it would take too long; if you need more details, the man page of your favorite shell is the best place to start. For now, keep in mind that by combining the ability to manage processes with the ability to write patterns instead of literal strings, the shell allows for the creation of fine-grained commands that do the job exactly the way you want.
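
A small illustration of the difference (file names invented): a globbing pattern is expanded by the shell against the files around it, while a regular expression is matched by a tool such as grep against its input.

    # Globbing: the shell expands the pattern against files on disk
    ls report-??.txt        # report-01.txt, report-2b.txt, ...
    rm ./*.tmp              # every file ending in .tmp in this directory

    # A similar-looking regular expression means something different:
    # '.' matches any character and '*' is a repetition, not "anything"
    grep 'report-..\.txt' index.txt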

Matt Might lists the following tasks on his blog as examples of what Unix does very well. I hope you see by now how, by combining the advanced features of the shell with this stream-oriented approach to design, a user can accomplish each of them rather easily (a sketch of two of them follows the list).

  • Find the five folders in a given directory consuming the most space.
  • Report duplicate MP3s (by file contents, not file name) on a computer.
  • Take a list of names whose first and last names have been lower-cased, and properly recapitalize them.
  • Find all words in English that have x as their second letter, and n as their second-to-last.
  • Directly route your microphone input over the network to another computer's speaker.
  • Replace all spaces in a filename with underscore for a given directory.
  • Report the last ten errant accesses to the web server coming from a specific IP address.
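
As a concrete taste of the above, here is one way (among many) of handling the first and the sixth tasks; the flags used are common to GNU and BSD userlands, but details may vary.

    # Five folders consuming the most space in the current directory
    du -sk */ | sort -rn | head -5

    # Replace spaces with underscores in the filenames of a directory
    for f in ./*' '*; do
        [ -e "$f" ] || continue   # skip if nothing matched the glob
        mv -- "$f" "$(printf '%s' "$f" | tr ' ' '_')"
    done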


Unix complexity


There is one more rule left to visit: "Write programs that do one thing and do it well". "Do it well" is self-explanatory, albeit subtle. The first part of the sentence is more problematic: "do one thing". According to Cook (or another author he cites), the ls man page lists over 35 options (mine has 58; I use GNU's). This single example reveals his underlying argument: tools inevitably evolve to become complex. I want to explore, first, why these tools grew like this, and second, why I think this does not necessarily go against the philosophy, and that there is a way to circumvent it.

Legacy

I believe the complexity has historical roots. It could be seen as evolutionary, but I prefer to blame commercial interests and short-sighted marketing. In the 90s, a plethora of Unix clones appeared, and with them came lawsuits, courts and legal complications. For marketing reasons, and in fear of lawsuits, vendors decided to differentiate their offerings by adding sugar on top of their "basic" tools. Blaming businesses alone is too simple, though: even Free and Open Source software got caught up in the same game.

However, complexity is a curse you cannot easily get rid of. At the end of the day, Unix is a platform: it does not do anything by itself beyond providing support for the code base that is out there. Once vendors added their sugar (or cruft, depending on how you look at it), people started using it and relying on it, which brings us to the situation we are in today: providing compatibility across Unices and achieving some artificial portability seems a more immediate concern than achieving ideal simplicity, and I do not see any powerful actor showing interest in making things right. This is why ls has so many options, and why its man page seems to never end. There are scripts out there that use these options, and it is important to support them. Unfortunately.

Cryptic

The rise of Perl as a sysadmin programming language is another prominent example of cruft accumulating on Unix. Did we really need Perl? According to its author, Larry Wall, a linguist then working at NASA, "Perl is kind of designed to make awk and sed semi-obsolete". Given what we have already discussed, you should not be surprised that, instead of replacing the existing tools with a better one, this led to the coexistence of several redundant tools doing very similar jobs.

Why was the Unix community so quick to adopt the new language?

  • Performance: A pragmatic reason that goes beyond questions of design. sed and awk fork too many processes and were considered slow at the time; Perl was supposed to save you from having to write a C program.
  • Better regular expressions: This is the real reason Perl won. One could easily argue that these enhanced regexps should have been added to the existing tools instead of creating a new language. Why weren't they?
  • Laziness: Despite its weird and aggressive syntax, Perl scripts were still easier and quicker to write than a combination of sed and awk (a small comparison follows).
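
As a hedged illustration of that last point, here is the same toy task (swap the first two whitespace-separated fields of every line) in awk and as a Perl one-liner; the file name is invented.

    # awk: swap the first two fields of every line
    awk '{ tmp = $1; $1 = $2; $2 = tmp; print }' names.txt

    # Perl: the same thing in one flag-heavy but compact invocation,
    # with Perl's richer regular expressions on hand if the data gets messier
    perl -lane 'print join(" ", @F[1, 0, 2 .. $#F])' names.txt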

Unix users are to blame for that last point. We found it simpler to reach for a larger tool than to find convoluted ways of combining two simpler ones. We willingly indulged in convenience over rigorous purity. Combining simplistic tools often makes for complex scripts, and anything non-trivial quickly becomes cryptic and largely unmaintainable. After all, Perl comes from a man who cites "Laziness" as one of the three great virtues of the programmer.

"More features" is always tempting. Not so long ago I was quick to sing the praise of GNU for all the powerful extra features packed in their tools. In this aspect I believe Cook was right, there is a part of natural evolution that makes users demand complexity. My question is: Isn't fighting this urge more rewarding in the long run?

Right does not always prevail

I have to mention Plan9, the experiment in which the creators of Unix rewrote it from scratch to deliver what it should have been. In every aspect (arguably most), Plan9 is superior to Unix; notably, it abides by the Unix philosophy better than Unix ever did. Yet it never left the bounds of the research labs it was born in. Why is that? Raymond has a theory:

"the most dangerous enemy of a better solution is an existing codebase that is just good enough."

Unix works despite its flaws, and that may be its biggest weakness. For better or worse, it will never really feel the need to cleanse itself of the bloat it acquired over the years, and yes, there is a lesson to be taken (or remembered) here: the cost of fixing an error grows exponentially with the time we have been dragging it along.

Unix can be true to its philosophy

To summarize, I believe complexity has been piled on top of Unix for three reasons: historical commercial interests, the laziness of users, and the failure to adopt a proposed replacement. I want to finish this essay on the following note: I do not think modern Unix gets in the way of the Unix philosophy. The existence of the first two points (namely "streams" and "combinations") is enough. While tools are growing in complexity, their simple core is still present and available for use. I advocate avoiding non-standard tools and options, even when they save a few lines of code.
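
A modest example of what sticking to the standard costs: GNU sed's -i flag edits a file in place, but it is not part of POSIX (BSD sed spells it differently), so a portable script has to pay one extra line (the file name is invented).

    # GNU-specific convenience: in-place edit
    sed -i 's/foo/bar/g' config.txt

    # Portable equivalent: works on any Unix, at the price of a temporary file
    sed 's/foo/bar/g' config.txt > config.txt.tmp && mv config.txt.tmp config.txt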

The programming community came up with the wonderful word "deprecated". It is a label for any option or feature deemed bad or obsolete but kept around for compatibility with older versions. It is too late to get rid of the bloat in Unix, but it is not too late to mark all these options as deprecated. Let us help the rising generations understand these core values, so that the values do not die.

I will leave you with an amusing anecdote: the day the Unix philosophy was stronger than Donald Knuth. I hope it presents a good example of why and how the philosophy should be regarded as a programming ideal.