Early Days, AT&T UNIX, BSD
Richard Stallman, the GNU Project, and the “Free Software” term
The Linux Kernel, GNU/Linux and the Debian Free Software Guidelines
The “Cathedral and the Bazaar” and the coining of the term “Open-Source”
Linux Becomes More Popular
Open Source and Open Content Become Mainstream

This section is not a definitive overview of the history of the free software movement; it focuses on the issues surrounding the usage of the common terms.

Early Days, AT&T UNIX, BSD

The free software movement (before it went by that name) started organically from individuals who distributed code they wrote under the public domain or under what would now be considered open-source or semi-open-source licences.

AT&T UNIX, which started in 1969, was the first showcase for this movement. Several Bell Labs engineers led by Ken Thompson developed UNIX for their own use and, because of legal restrictions AT&T faced, decided to distribute it free-of-charge, with the source included, to academic and other organizations. (That licence did not qualify as open source, but it was pretty close.) UNIX eventually sported the C programming language, which made it easier to write code that would run on many platforms, and the UNIX sources included a C compiler that was itself written in C. In the early 1970s the only computers capable of running UNIX were mainframes and the so-called “mini-computers”, so there initially were not many installations, as only large organizations could afford computers to deploy UNIX on.

That changed as integrated circuits and computers became cheaper and more powerful. Very soon, cheap UNIX-based servers and workstations became commonplace and the number of UNIX installations exploded. [1]

Nadav Har’El has prepared a coverage of the BSDs and early AT&T UNIX history.

The University of California at Berkeley (a.k.a. UCB) forked its own version of AT&T UNIX and started rewriting parts of the code, incorporating many changes of its own. The parts that the Berkeley developers wrote themselves were originally licensed to UCB and kept under a non-FOSS (= “free and open source software”) “All Rights Reserved” licence. The BSD system became very popular (perhaps even more so than the AT&T one).

When the Arpanet, the predecessor to the Internet, was retired due to its inadequacy, the Internet converted to running on top of 32-bit UNIX boxes such as those of the VAX architecture by Digital Equipment Corporation (now part of Hewlett-Packard). This caused a merging of the UNIX culture with the Arpanet enthusiasts who had exchanged code on the Arpanet, and UNIX programmers started sharing code for various components and add-ons of UNIX on the Internet.

Richard Stallman, the GNU Project, and the “Free Software” term

After a while, the legal restrictions imposed on AT&T subsided, and it started to “smell money” and believe it could do better by selling UNIX commercially. It created the AT&T System V system, touted it as better than the original AT&T UNIX and the BSDs, and sold it to vendors. System V was sold under a very restrictive licence that forced the vendors to keep the source code to themselves. Even cooperation between two different vendors was not allowed.

Gradually, vendors licensed the System V source code and ported it to their own architectures. This caused an explosion of proprietary UNIX systems. Sun Microsystems and other vendors took the BSD source code, diverged from it, and distributed the result without giving all customers full access to the code. A similar thing happened with other software distributed under similar licences.

To answer this threat, a new phenomenon sprang into existence: the “free software” movement, the GNU project and the copyleft licences, all led by one dynamic personality: Richard M. Stallman.

Richard Stallman (aka RMS) published the GNU Manifesto in 1985, which coined the term “free software” and explained the rationale behind it. The Manifesto was also a creed for the GNU project, which aimed to be a complete, UNIX-compatible replacement for UNIX systems while being entirely original work. The software of the GNU project was released as free software, under the terms of the GNU General Public License (GPL for short).

Gradually, the GNU project created more and more C code to replace the UNIX and BSD utilities. This code was already installable and usable on various flavours of UNIX, and it became part of a fully independent system once the Linux kernel was written.

The GPL licence is a free software licence that has many fine points. The most important concepts in it are:

  1. Copyleft - making sure that derived works that are distributed to the outside include the source and are distributed under the same licence. Note that this does not apply to modifications made for internal or private use.

  2. Restrictive integration with other codebases - GPL code can only be linked against code with free software licences that match certain criteria. [2]

The incentive to restrict software this way, rather than following the more traditional public domain or public-domain-like licences (as used by such software as the TeX typesetting system), was to make sure that the core GNU system would always remain free as well.

Encouraged by the growing momentum Stallman had built behind the Free Software Foundation and the GNU project, UCB changed the licence of the parts it had originated to a free software licence, now called “The Original BSD License”, which qualified as free software but, as opposed to the GPL, was public-domain-like. [3] To add to this effort, some UCB students decided to rewrite, under the BSD licence, the remaining parts that were licensed to AT&T. This task was eventually completed, resulting in a BSD system that was entirely under the BSD licence.

However, AT&T did not stand by, and sued UCB and some other organisations, claiming that it actually owned parts of the BSD operating system. This brought uncertainty into the BSD world, which would not be resolved until the 1990s, when the lawsuit was decided mostly in favour of UCB. As a result of this uncertainty, the status of some spin-offs of BSD (such as 386BSD, and operating systems derived from it, such as FreeBSD and NetBSD) was in legal limbo.

The Linux Kernel, GNU/Linux and the Debian Free Software Guidelines

In 1991, Linus Torvalds, then a student at the University of Helsinki, began writing the “Linux” kernel - a 32-bit kernel for UNIX-like operating systems. The kernel development advanced rapidly, and it was released under the GPL licence from an early stage. To complete the system and make it into a usable UNIX system, the Linux developers used various existing user-land utilities and libraries from the GNU project and other sources (such as the X Window System), and wrote a few user-land utilities from scratch.

From an early stage, this entire system was dubbed “Linux” as well. Richard Stallman has instead advocated the name “GNU/Linux” (pronounced “ggnoo-Linux”), which acknowledges the fact that the GNU project contributed the lion’s share of the system (including some prerequisites of the Linux kernel itself). Most people have not consistently followed this advice.

The importance of the Linux kernel was that it was the last brick in materialising a fully working GNU system. Since GNU tools tend to be more complete, feature-rich and generally superior to tools of other systems, this has made Linux one of the most powerful UNIX systems available. Nowadays, most UNIX servers, many UNIX workstations (and many embedded devices) run the GNU/Linux system. Linux was, thus, the spearhead that guided the acceptance of free software into the mainstream.

Debian GNU/Linux was a Linux distribution that was eventually endorsed by the GNU project. One of the aspects that made it unique was the fact it distinguished between “free” and “non-free” packages as far as the user is concerned. The guidelines for determining which software is “free” in the Debian sense were phrased by Bruce Perens.

Note that they deviate from the Free Software Definition (which was only published later on) and admit some licences that are not free. That is, “Debian Free” is a superset of free software according to the Stallman definition.

This fact is important because later on, the Debian Free Software Guidelines formed the basis for the open-source definition.

The “Cathedral and the Bazaar” and the coining of the term “Open-Source”

Eric Steven Raymond (now also known as ESR) wrote an essay titled “The Cathedral and the Bazaar” and presented it at the Linux Kongress on 21 May 1997. It contrasted the Bazaar way of managing a software project with the old “Cathedral” way, which was used by almost all non-free projects and (until that point at least) by most free ones.

“Bazaar” projects are characterised by frequent and incremental release schedules, treating the users as co-developers, and generally getting a lot of peer review, ideas, input and cooperation. Despite a common misconception, the core group of a project’s contributors still usually remains relatively small, except in some of the larger projects.

The article is considered one of the seminal works on free software, and was followed by other works in what is collectively known as the “Cathedral and the Bazaar” (or CatB for short) series. It has made Eric Raymond a famous person, at least among the community of free software hackers.

On February 3, 1998, in Palo Alto, California, a brainstorming session that Raymond attended coined the term “open source” as an alternative to “free software”. The attendees’ incentive was that, when talking to business people, “free software” would either be understood as gratis software, or be associated with the relatively anti-capitalistic views held by Richard Stallman (who claims non-free software is immoral). They decided that the term “open source” would be a better candidate for acceptance in the corporate world.

Consult the opensource.org history document for further coverage of the history of the term.

During the following week, Eric Raymond and Bruce Perens launched the opensource.org web-site and formulated the Open Source Definition, which was based on the Debian Free Software Guidelines.

The term “open source” caught on. Very soon, Richard Stallman decided to reject it on the premise that the freedom of software is more important than the “openness” of its code. While he does not oppose the openness of the code, and acknowledges that free software is open source as well, its freedom remains more important to him. For more about this stance, read the document “Free Software for Freedom” on the GNU web-site.

While some people have consistently stuck to the term “free software” and a few others have converted to using “open source” entirely, most knowledgeable people do not completely reject either term, and use each one wherever they see fit. Nevertheless, the term “open source” is more commonly used, by open source developers and even more so by non-open-source developers. See Eric Raymond’s “Terminology Wars” for more details.

Linux Becomes More Popular

Since 1997, Linux and other open-source systems have become more and more popular. Linux saw a lot of success in the server market, where cheap, off-the-shelf PCs running Linux can serve as an almost full replacement for more costly UNIX servers. Even where the latter are used, they very often run open source servers and other open source programs, utilities and frameworks.

Linux has become the number one choice for constructing clusters: large sets of computers that are networked together to form a fast computation system whose power rivals or exceeds that of super-computers. There are various kinds of clusters around. Some operate at a relatively high level, while others try to make the system believe it has as many processors as there are nodes.

Linux also had a lot of success in the embedded market, serving as the framework for creating software that is embedded in hardware.

The Internet boom not only made free software more essential to the Internet’s operation, but also enabled more and more users and developers to share their code, get help, and work together to advance it.

So far, Linux has had much more limited success as a choice for a desktop system. While it used to be the only operating system that was gaining market share (at least until the renaissance of Apple with its Mac OS X), its share is still very low in comparison to Microsoft’s solutions. Many projects have started to supply users with desktop and GUI environments and applications. Some of them are very mature, usable and successful. Only time can tell if and when Linux becomes the default solution for the desktop.

Apple’s Mac OS X was released and is based on Darwin, an open-source BSD-derived system. Mac OS X can run UNIX applications natively and supports the X Window System, which is the de facto GUI framework for UNIXes (including Linux). It is therefore a popular UNIX choice for PowerMac computers (and more recently for Macintosh computers based on Intel chips), albeit not the only one, since Linux and various open-source BSD clones and other UNIXes can run there as well.

The recent recession in the information technology market did not seem to slow down the development of open source software. Freecode (formerly Freshmeat) is as busy as ever with releases of new software, and since the recession started, many important new releases have been made for a lot of major applications, and even more for less prominent ones.

Open Source and Open Content Become Mainstream

While open source software has existed for DOS and Microsoft Windows practically since their beginning, and some of it was relatively popular, most of the software available for these platforms has been non-open-source, binary-only software, much of it from Microsoft.

This has started to change recently. The Firefox browser from the home of the Mozilla Foundation (and now also the Mozilla Corporation) is an open-source, modern and sophisticated browser that has been virally publicised by various means, such as the various “Spread Firefox” campaigns. It has become popular: as of July 2006 it has passed 10% of usage in web-site hits according to some firms, and in some countries much more. It is still gaining some market share, even if its growth has slowed somewhat.

Other cross-platform open source software includes OpenOffice.org, a powerful and usable office productivity suite for Windows, Linux and other platforms; the GIMP (GNU Image Manipulation Program), a sophisticated raster image editing program; Inkscape, a vector editing program; many open-source music and media players such as VLC; and also most peer-to-peer networking clients. These have probably seen less popularity than Firefox, but they still provide cheap, open, modifiable alternatives to traditional binary-only software.

In 2003, a study was published estimating that by 2004 more software developers would write software for Linux than for Windows. While this definitely does not mean that more people will use Linux at home, it is still a good indication of Linux’s general mainstream acceptance and usefulness.

Another important recent trend has been the rise of open content. The first edition of this article included a small section about “open content”, where I concluded by saying that “Only time can tell whether other elements of open source besides its freely distributable nature will have an impact in other areas of creative arts besides software.” Now, about three years later, I can say that by all means open content has already proven to be a great success.

Among the landmarks of open or semi-open content are:

  1. The Creative Commons project, which specifies licences for open content, semi-open content or just freely redistributable artworks, for individuals and organisations to use in their work, and supplies several resources that facilitate their publishing and use.

    Creative Commons’ licences have proven to be very popular among many web publishers for use in their works.

  2. The Wikimedia Foundation publishes several online multi-lingual wikis - web sites that are editable by ordinary web visitors - all under an open content licence. The most famous and important ones are the Wikipedias, which are free, online encyclopaedias. The English Wikipedia (still the largest) is larger than Encyclopaedia Britannica and Microsoft Encarta combined, and it is growing rapidly.

  3. There are many sites for independent musicians, such as ccMixter, Magnatune (a record label that publishes artists whose songs are under a freely redistributable licence) and Jamendo (a musical showcase for artists whose music is under any of the Creative Commons licences).

  4. From weblogs and weblog comments, to wikis, to audiocasts or video-blogs - open or semi-open content is everywhere.

[1] At the present day, UNIX clones such as Linux or the BSDs can run on regular Pentium-based computers that can be bought in PC shops. Most PCs nowadays can out-compete the UNIX workstations of a few generations back. This allows assembling a UNIX server that is much more powerful and much less costly than the past ones, and that suffices for most needs.

[2] At one point in time, this property was sometimes referred to as “viral”, a characterisation which appears to have originated in Microsoft’s early criticism of it. However, while the GPL requires programs that use GPL code to be licensed under compatible FOSS licences, the worst that can happen to violators is that they will lose the ability to legally use the GPL-licensed code, while still retaining the copyrights to their original, possibly non-FOSS, codebase.

[3] The original BSD licence also has an advertising clause, which makes it incompatible with the GPL and a problem in general. Later versions of the licence removed this clause, and use of the original BSD licence is no longer recommended by the FSF, although some FOSS packages are still distributed under it.