Quote: If you consider the achievements of K & R, Bill Joy, Ken Olsen, MIT, and PARC (Xerox's Palo Alto Research Center) as a combined effort to change the face of computing, it would be difficult to find a single improvement that is not fairly directly attributable to these five sources.
For those of you who have (somehow) managed to retain some interest in this conversation, but are not otherwise completely up to speed on computing history:
Brian Kernighan and Dennis Ritchie (K & R): Created the "C" programming language at Bell Labs in the early 1970s. C is the base language for almost all of the popular operating systems, utilities, and applications widely in use today. From C comes C++ (an "object oriented" extension of C created by Bjarne Stroustrup), the language used to create 100% of Sesame (and several hundred million other programs). C is so influential that it has changed the face of even pre-existing languages. Visual Basic, for example, owes as much to C as it does to Basic. Almost all post-C languages resemble C in many ways. PHP, Perl, even Java all borrow heavily from C. C as a language is a great example of elegance, efficiency, and well-considered design. Its flexibility, speed, and graceful execution are virtually unmatched now - more than thirty years after its inception. C was derived from a much less successful language known as BCPL. The first attempt, a failure, was called "B" - thus the next attempt was called "C". I keep waiting for "P" and "L" to be released.
Dennis Ritchie: In 1971, Ritchie, Ken Thompson, and a small (less credited) team at Bell Labs created Unix. Unix was derived from another, failed operating system called "Multics". Unix was (and is) a philosophic statement about computers and was intended to make the creation of programs (especially in C) easier. It was a "developer's system" made by the best developers for developers. The primary statement being made was that programs should each do a single task, do that single task well, but also be flexible enough to be chained together to perform much more complex tasks. To list the innovations and concepts that originate with Unix would take days. It was among the first operating systems to be written in a portable programming language (C), so it could be easily ported to any chip or computer. It used a unified and rational filesystem where everything (including devices, the network, programs, etc.) was either a file or a directory. It had networking built in. It allowed for arbitrary user interfaces (on Unix you can use any GUI or desktop program you like). It invented I/O redirection, pipes, aliasing, file links, and lightweight multi-tasking. Unlike other OSs, it was multi-user and multi-tasking by design. It accomplishes all of this with extreme grace and elegance. They wrote a small "kernel" that coordinates the hardware and software. All of the rest of Unix is a set of small, optional utilities and commands, designed to work together as the users see fit. Additionally, by emphasizing program development tools, programs made on Unix seamlessly fit into the broader philosophy of the operating system. In other words, many programs made for Unix become part of Unix (very much unlike another popular OS).
C and Unix go hand in hand. They share the same philosophy and goals: small simple parts that interact with each other interchangeably, using a consistent framework made of small simple parts that interact with each other...
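To make the "chained together" idea concrete, here is a minimal sketch in C of what a shell does when you type "ls | wc -l". The choice of ls and wc is just my illustrative example, not anything from the history above; the calls themselves (pipe, fork, dup2, execlp, wait) are the standard POSIX primitives that Unix built this philosophy on.

    /* A minimal sketch of the Unix pipe idea: run "ls | wc -l" by hand. */
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];                      /* fd[0] = read end, fd[1] = write end */
        if (pipe(fd) == -1) { perror("pipe"); return 1; }

        if (fork() == 0) {              /* child #1: becomes "ls" */
            dup2(fd[1], STDOUT_FILENO); /* its stdout now feeds the pipe */
            close(fd[0]); close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            perror("execlp"); _exit(1);
        }

        if (fork() == 0) {              /* child #2: becomes "wc -l" */
            dup2(fd[0], STDIN_FILENO);  /* its stdin now reads from the pipe */
            close(fd[0]); close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            perror("execlp"); _exit(1);
        }

        close(fd[0]); close(fd[1]);     /* parent keeps no pipe ends open */
        wait(NULL); wait(NULL);         /* reap both children */
        return 0;
    }

The point of the design is that neither ls nor wc knows, or cares, that it is talking to the other; each just reads standard input and writes standard output, which is what lets small simple parts be recombined endlessly.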
Bill Joy is a co-founder of Sun Microsystems. Mr. Joy was among the first advocates of using "better" computer technology rather than using "familiar" technology. His company brought (though they did not always invent): the GUI (invented by MIT/Xerox, but made useful by Sun), the first practical network model (TCP/IP, invented by ARPA/DoD, Vint Cerf, and many others), the first "true" practical workstations (a computer designed for the graphical display and manipulation of scientific and engineering information), the first RISC (reduced instruction set computing) CPUs, the first Unix-centric desktop computers, the first deployment of web browsers and hyperlinks, the first web servers, the first practical multi-CPU computers, the first "unified desktop" GUIs, and last (but not least) Java - which is slowly replacing Cobol (one of the worst languages of all time) as the language for business and finance (and much else).
Sun is not so well known for actual in-house innovation as for sticking its neck out by promoting good design over the cheap, easy, familiar solution. By betting the company on philosophic statements about what is good and right, Sun may well be the only company standing up to the MS onslaught. Everyone else has either folded or complied, or soon will. I hope Sun survives.
And speaking of "folding", that brings us to Ken Olsen. Back before AT&T brought us Unix, if you had a computer, it was made by IBM (or by Burroughs or NCR to resemble an IBM). It ran software made by IBM on an operating system made by IBM. At that time, IBM didn't think much of their users and, for the most part, made sure that no one but IBM-trained consultants ever got to touch, program, or otherwise interact with their computers. If you had one of these room-sized monsters, with the requisite "fishtank" (a special room just for the computer), nine-track tape drives, teletypes, "smart terminals", card punch machines, and assorted other really bad ideas, you were required to submit your data using terminals that seemed dead set on preventing you from entering data. You then processed your data by translating your program (in Cobol or Fortran) onto punch cards, which were submitted. If you were lucky, your program might be run this month. To get your data back, you submitted a program written in RPG. Again, with luck, you might get your results back the same week. Because of the non-interactive nature of this kind of data processing, errors were frequent and persisted forever, and computers were seen as a boring (but necessary) hindrance to getting work done. This view, though largely archaic, still persists.
IBM had a choice. The technology was there to use computers interactively. The IBM console (hands off!) was actually an interactive device. But they believed that their customers could not handle their own data, and even when their own field testing demonstrated otherwise, they persisted in lording over the computer.
Ken Olsen saw the potential in a computer designed to be interactive. He founded a company, Digital Equipment Corporation, for the purpose of building such a beast. Hence the PDP-1 was created - the first production computer (almost) built to be actually used. The first two models were installed at MIT, where they were used by the famous MIT "hackers" (really the Tech Model Railroad Club) to build the first text editor, the first video game, and the first graphical display. The PDP-1 was followed by the PDP-8: the first computer that would fit on a desk, the first computer to be widely used, and the first computer to cost less than $30,000. DEC went on to popularize the hard drive, the mini-computer, the 32-bit computer (VAXen), and scientific/engineering computing. But it was their PDP series that allowed the creation of nearly everything good about software. It was the PDP that invented the idea of the PC (while not quite qualifying as a PC itself). The great irony of this is that Ken Olsen never saw the personal use of computers as "useful" and predicted that there would never be a home computer market. But DEC did more to bring us the PC than IBM, MS, and Intel combined. DEC was eventually crushed by the PC market. It was bought by Compaq, which was bought by HP. Few pieces of DEC technology have survived as current products after the multiple acquisitions. A real shame. Their early OSs, RT-11 and RSX, are the origin of CP/M, which is the origin of DOS. The VAX OS, VMS, is nearly as innovative as Unix, and may qualify as the most reliable OS ever created.
MIT has invented a lot of stuff, but very little of it ever gets out of MIT. One exception is the X Window System (often called "X11"). While Xerox may have come up with the idea of a graphical user interface, their software was not much more than a demo. In fact, MIT cannot really claim to have invented much of the GUI technology, but they wrote the actual software that made it practical, useful, and good - by separating the GUI from the OS, making the part of the system that draws the screen a "server" that serves applications, which ask the server to draw their interfaces on the screen. This separates applications from the means by which they display themselves, providing a uniform interface, asynchronous updates, and network transparency. It also eliminates duplicating that code in every program that uses it. This is absolutely brilliant. It allows updates, upgrades, even complete changes, to occur without affecting either the OS or the applications in any way. It allows the user to determine the interface (in much more than just look and feel) they want to use, without having to rewrite the applications they are using. And the change is across the board, unifying all of their applications. MIT brought us the idea that the person using a computer should be in charge of how that computer is used - not the programmer, not the consultant, not the company. This philosophy led to innovations such as the web browser and the hyperlink. It is a shame that neither Apple nor MS adopted similar attitudes. By tightly integrating the GUI with the OS, both companies guarantee that even small changes to the GUI require that the entire OS and most applications be rewritten.
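For the curious, here is roughly what that client/server split looks like to a programmer: a minimal sketch using the standard Xlib C API (the window size, the text, and the "any key quits" behavior are arbitrary choices of mine). Note that the program never touches the screen itself - it just sends requests to whatever X server the DISPLAY variable points at, which may be across the network.

    /* Sketch of an X client. Compile with something like: cc x_hello.c -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void)
    {
        /* NULL means "use the DISPLAY environment variable" -- the
           server may be local or anywhere on the network. */
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot contact an X server\n");
            return 1;
        }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 300, 120, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);            /* ask the server to show the window */

        XEvent ev;
        for (;;) {
            XNextEvent(dpy, &ev);        /* events arrive asynchronously */
            if (ev.type == Expose)       /* the server asks us to (re)draw */
                XDrawString(dpy, win, DefaultGC(dpy, scr), 20, 60,
                            "drawn by the X server", 21);
            else if (ev.type == KeyPress)  /* any key quits */
                break;
        }
        XCloseDisplay(dpy);
        return 0;
    }

Notice that drawing is a request, not a direct screen write: the same binary, unchanged, displays locally or across a network - exactly the separation praised above.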
MIT also pioneered using actual studies (as opposed to guesswork and anecdotal evidence) to determine computer ease-of-use. They discovered (much to programmers' horror) that computers are actually terribly hard to use. They also confirmed that programmers were the cause of the problems - largely because they used guesswork and anecdotal evidence. What they discovered demonstrated that what was easy for beginners slowly degrades ease-of-use as the beginners become experts. Eventually a program that was seen as easy to use becomes cumbersome and intrusive. They advocated that a program, if unable to adapt its behavior to the expertise of the user, should attempt to scale its automation, option persistence, and option flatness by looking at the expected lifetime of the program in front of any one user. The longer any one user uses a program, the flatter and more persistent the options should be. (Note: automation refers to choices made by the program for the user. Option persistence refers to the conditions under which options are made available/unavailable to the user. Option flatness refers to hiding/not hiding options under other optional elements, such as multilevel pulldown menus, or options that only appear on certain dialog boxes.) This ran counter to the prevailing philosophy, which was based only on short-term ease-of-use.
Lastly, we come to Xerox's PARC. They gave us the window, the mouse, the scroll bar, the pulldown menu, the GUI in general. Really, it was their idea to throw pixels, as opposed to ASCII characters, at the screen. They, themselves, didn't do much with this idea. But they did show it to a few people who did. It was with this idea that computers stopped being the lonely province of a few people willing to memorize all of the ins and outs of eccentric and difficult command-line-driven operating systems. This led to an explosion of computer popularity. Without the GUI, computers would still only be used by programmers and business people (forced to use them). The GUI provides the means by which the non-expert can make computers do useful things without ever actually becoming an expert. Its importance to worldwide computer adoption cannot be overstated. Only by travelling the most impoverished portions of our world (or the most absolutely elite) can you find people who have not at some time done useful things with a computer. Above and beyond its attractiveness, the appeal of the GUI boils down to one critical statement: it is hard to memorize stuff. The GUI provides a means to work with a computer that does not require memorization. And while Xerox can hardly be said to have invented that, they did find a way to express it in practical, codable ways - windows, mice, menus.
I have forgotten to mention Carnegie Mellon University for inventing Mach, Richard Stallman for Emacs and the free software movement, and Jim Clark at SGI/Netscape for making us all say "Wow!" time and time again.
Additionally, it should be noted that the above is purely my opinion, is not intended to be "factual" or complete, and may well represent a gross (though unintended) misrepresentation of computer history.