Philosophy of the System
“Free software” means software that respects users’ freedom and community. Roughly, it means that the users have the freedom to run, copy, distribute, study, change and improve the software. Thus, “free software” is a matter of liberty, not price. To understand the concept, you should think of “free” as in “free speech,” not as in “free beer”.

Richard Stallman, aka RMS, started the original work on the GNU tools, which in turn led to the development of the Linux kernel (created later on by Linus Torvalds).
Sometimes the word “libre” is used to capture this meaning, as opposed to just “gratis” (free of charge). In any case, the following underlying principles apply:
- The freedom to run the program as you wish, for any purpose (freedom 0).
- The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.
- The freedom to redistribute copies so you can help your neighbor (freedom 2).
- The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.
“Free software” implies Open Source (because of freedom 1), and the two movements largely overlap. Sometimes the acronym FOSS is used, for Free and Open Source Software.
The Free software movement has implicitly been around since the very beginning of computer software: in the early days, sharing source code was the norm; most programmers were in academia, with a tradition of sharing results with their peers and with the public. However, as commercial software became more important in the 1970s, closed-source proprietary software became more common. The GNU project was launched in 1983 as a reaction to this trend, with the explicit goal of creating a completely Free operating system, complete with all essential applications. In the early 1990s, the Linux kernel was created under a Free software license (the GNU GPL), and it became increasingly popular. Today, GNU/Linux is the platform of choice for many applications where the robustness, power and flexibility of FOSS are greatly appreciated, from web servers and desktop computers to embedded devices and film-industry render farms.
Linux Kernel
To make a long story short, the GNU tools were written to provide UNIX-compatible and UNIX-like tools with an open and Free code base. But the GNU tools could not be considered an OS by themselves until there was a kernel to manage how all applications interact with each other and how they communicate with the underlying hardware. Linus Torvalds, inspired by Minix, was the first to provide a working UNIX-like kernel that could work with the GNU tools. It was later referred to as “Linux”, after its creator’s name. The Linux kernel also uses a Free Software license (the GPL v2), and while it started out only being available on the x86 architecture, it has since been ported to a wide range of hardware outside of the Intel world. Note that the GNU project did not wait for Linux to start working on a kernel – it already had the GNU Hurd project in the works, and while Hurd was not ready by the time Linux arrived, it is now reaching maturity and you can find more about it on the GNU Hurd page. So, in theory, the UNIX-like tools could work on different kernels, without “Linux”.
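To make the kernel’s role concrete, here is a minimal sketch in C. The write() call is a thin user-space wrapper around a kernel system call: even this tiny program never touches the terminal hardware directly, it asks the kernel to do the work on its behalf, which is the same mechanism the GNU tools rely on.

```c
#include <unistd.h>

int main(void) {
    /* write() hands the buffer to the kernel via the write system call;
       the kernel, not the program, drives the actual output device. */
    const char msg[] = "hello from user space\n";
    write(STDOUT_FILENO, msg, sizeof msg - 1);
    return 0;
}
```

Compile it with any C compiler on a Linux system (for example, gcc hello.c) and the printing itself is performed by the kernel on the program’s behalf.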
Distributions
Nowadays it is very common to use “Linux” to describe something much larger than the kernel. Most Linux desktop distributions are actually large collections of different packages (including the GNU tools) built around a specific version of the Linux kernel. There are literally hundreds of distributions available in the wild, yet probably only about a dozen of them are really big, market-wise. Distributions usually differ by the following attributes:
- What packages they include by default
- What package manager they use
- What window manager they use
- What philosophy they follow
- Whether they are rolling releases or not
You can have extremely tiny Linux distros that fit on a CD, for the purpose of system recovery or security (to ensure nothing is ever written to disk). Larger distributions take several gigabytes and include word processors, tons of device drivers, etc. Some of them are very flexible regarding the licenses of the software they include, while others, like Trisquel, only accept Free Software in their repositories. New distros come out every single year, tackling new problems and new markets. This is the nature of Free Software: anyone can decide to fork an existing codebase (as Ubuntu did with Debian) and build something new out of it. In that sense it is difficult to speak of a unified Linux market share – the market is heavily fragmented, and depending on the popularity of one distro versus the others, trends can change fairly quickly.
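Despite that fragmentation, one convention is widely shared: most modern distributions ship a /etc/os-release file (a freedesktop.org standard) whose KEY=value pairs identify the distro. The sketch below simply prints that file, assuming it is present; the file name and format are standard, everything else is illustrative.

```c
#include <stdio.h>

int main(void) {
    /* /etc/os-release holds KEY=value pairs such as ID=debian
       or PRETTY_NAME="Ubuntu 24.04 LTS". */
    FILE *f = fopen("/etc/os-release", "r");
    if (f == NULL) {
        perror("/etc/os-release");
        return 1;
    }
    char line[256];
    while (fgets(line, sizeof line, f) != NULL)
        fputs(line, stdout);
    fclose(f);
    return 0;
}
```

Run it on Debian, Fedora, or Arch and the ID and PRETTY_NAME fields will differ, which is exactly the fragmentation described above.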