Since the inception of computing, the role of the operating system (OS) has been to control computer hardware and provide standard interfaces to it. As competition among hardware vendors grew, so did the need for elaborate operating systems designed for heterogeneous hardware environments. From this need came UNIX, Windows NT and Linux.
The Birth of UNIX
Developed by researchers at Bell Laboratories beginning in 1969, UNIX was among the first operating systems designed to be hardware independent. Built as a flexible, portable and powerful operating system, UNIX blossomed into one of the most influential pieces of software ever written and contributed considerably to the demise of mainframe computing.
From its inception, UNIX systems were designed with many of the necessary features required to make strong server systems: robust network protocols for exchanging data with clients and other servers, file systems which could be tailored for multiple uses and support for managing large numbers of users and applications. As hardware requirements evolved and with the development of multiple processors, UNIX systems were extended to support running applications in parallel (symmetric multiprocessing).
The Evolution of Windows NT
The 1980s witnessed the birth of a new computing empire with the arrival of the Microsoft Windows OS. Windows started off as a powerful desktop OS powering inexpensive personal computers (PCs); the original Windows operating system was designed to run on PC hardware and was optimized for desktop use. Microsoft recognized that Windows could not compete in the more lucrative server environment, where multiple users run on the same system. This realization led to the development of the Windows NT operating system. Microsoft Windows NT was designed as a multitasking OS, similar to UNIX; however, its creators faced the difficult task of ensuring that NT maintained backward compatibility with its desktop variant.
The Beginning of Linux
Most variants of UNIX and the NT operating system are commercial endeavors, available only under stringent and expensive licensing agreements. In 1991, this fragmentation motivated Finnish student Linus Torvalds to design and implement a UNIX-like OS called Linux. Linux is freely available for use and modification by anyone. The only caveat is that anyone who distributes changes to the OS must also make the source code available to everyone. With this philosophy, Linux has become the most celebrated piece of open-source software. A grassroots effort created to mimic the UNIX environment, Linux has found particular popularity among programmers who like having full access to the OS source code.
Which is Right for Your Business?
All computers used in an enterprise must produce perfectly reliable results at all times during the hours that business is conducted. With this in mind, the most important aspect of mission-critical computing environments is reliability, because a system that is not 100 percent reliable is worthless to the business it serves. The next aspect to consider is availability. In today's e-business world, computer downtime translates directly into a monetary loss; and, depending on the business, that loss can translate into millions of dollars per minute of downtime. Then comes serviceability and maintainability, which are closely related to availability. If a computer system does need to be taken down for service or maintenance, it is not available for business use. The ability to solve problems or maintain the system without significant downtime is essential. The last aspect to consider when comparing operating systems for business use is cost.
Throughout the rest of this article, we will look at UNIX, Windows NT and Linux in terms of reliability, availability, serviceability, maintainability and cost.
Reliability
Reliability is the measurement of how consistently and accurately a computer system produces results each and every time it is needed. Unless a computer is reliable, all other aspects of that system are irrelevant.
Over time, UNIX has proven to be an extremely reliable operating system, known to run heavy workloads for months on end without failure. Much of this reliability stems from the intense qualification efforts that OEMs undertake before releasing new versions of their products.
On the other hand, Windows NT is not known for its reliability. System failures on a weekly basis are not uncommon and can often be attributed to the fact that NT is burdened with supporting a plethora of legacy applications. As NT is extended and improved, its implementation becomes more complicated as it must stretch itself to maintain backwards compatibility. It is this complicated design which lends itself to an unstable system.
Linux's strength is its reliability. Its simplicity of design, together with thousands of engineers stressing it on varied platforms, results in a very stable operating system. Studies have rated Linux the most reliable OS running on Intel-based computers; since over 90 percent of the computer systems used for business today run on Intel hardware, that makes Linux a very viable option.
For more information on reliability, visit www.uptimes.net. This site contains data that compares system uptime between UNIX, Windows NT and Linux.
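As a quick illustration of how such uptime figures are gathered, the commands below (a minimal sketch for a UNIX or Linux shell) report how long a system has run since its last reboot:

```shell
# Report how long the system has been running, plus load averages
uptime

# On Linux, the raw counter is also exposed as a file under /proc;
# the first field is the number of seconds elapsed since boot
cat /proc/uptime
```

Tracking these numbers across a fleet of servers over months is essentially what uptime-comparison studies do.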
Serviceability
In simple terms, serviceability is a measure of how easily one can perform routine scheduled and unscheduled maintenance on a computer system. With this in mind, a computer system that is not highly serviceable will not be considered a viable candidate in a business environment.
As more and more businesses depend on their systems to provide mission-critical service, serviceability becomes increasingly important. Today, many commercial flavors of UNIX support features that allow a system to be serviced while minimizing downtime: most system administration and tuning operations can be performed live, without requiring a system reboot.
Windows NT addresses serviceability by making only a small subset of its overall system parameters remotely configurable. A well-known point of consternation is that even minor changes to application parameters, or the installation of an application, typically require a system reboot.
Linux, on the other hand, has been designed with many of the online tuning capabilities present in UNIX, and it is gaining support for hot swapping of hardware as it migrates onto enterprise server hardware.
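To make the idea of online tuning concrete, here is a minimal sketch (assuming a Linux shell with the /proc filesystem mounted) of how kernel parameters can be inspected, and changed, without a reboot:

```shell
# Kernel tunables are exposed as ordinary files under /proc/sys;
# reading one shows the value currently in effect
cat /proc/sys/kernel/ostype       # prints "Linux"

# Writing a new value (as root) takes effect immediately -- no reboot.
# For example, to enable IP forwarding on a live router:
# echo 1 > /proc/sys/net/ipv4/ip_forward
```

Contrast this with an OS where changing the equivalent setting forces a restart, taking the system out of service.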
Availability
Availability measures how readily the services of a computer system are accessible to its end users. Large computer systems, for instance, are usually designed with some degree of redundancy, so that if one component fails, the system can still provide its services to the end user.
Availability is key for enterprise systems, which must keep services up and running in the face of failures and scheduled downtime. Clustering solutions play an important role here: even minimal cluster solutions provide node-monitoring services that restart applications on another cluster node in the event of a node failure. These services enable highly available application services such as print, mail, Web and database servers.
All variants of UNIX offer value-added clustering products at this level from their respective OEMs. Microsoft also sells clustering products for NT, as does Legato (among others) with its Vinca and Octopus products. Several Linux distributors, such as Turbolinux and Red Hat, provide special CD distributions that include clustering products, and Linux users can also download open source cluster solutions such as Linux-HA or Linux Virtual Server.
Cost
The licensing costs of a Windows NT server vastly exceed those of a comparable Linux configuration. For example, a 25-user licensed copy of Windows NT Server 4.0 Enterprise Edition costs $4,000, while a solution running Red Hat's commercial Linux is available for $50, with no licensing restrictions.
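Put another way, the license figures above translate into a steep per-seat premium; the arithmetic is trivial (a shell sketch, using only the numbers quoted above):

```shell
# $4,000 NT Server license divided across its 25 licensed users
echo "$((4000 / 25)) dollars per seat"   # prints "160 dollars per seat"

# The $50 Linux distribution carries no per-user restriction,
# so its effective per-seat cost only falls as users are added.
```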
Hardware costs for UNIX and Linux systems are also considerably less than NT systems providing comparable services. This can be attributed to UNIX and Linux systems requiring substantially lower system resources including less memory, disk space and CPU horsepower.
UNIX and Linux also offer choice. Users can deploy either OS on virtually any type of hardware and choose among many user interfaces. Both are also dynamic: their kernels can be rebuilt to produce customized configurations that fit the specific computing needs at hand.
Windows NT, by contrast, currently runs only on Intel hardware and offers no choice of user interface. The NT kernel is static, with no ability to build a customized version: one size fits all.
Whether you choose UNIX, Windows NT or Linux, know that all three now share a piece of the enterprise computing market. UNIX is firmly established, having been a robust OS for many years and offering a rich set of applications. Windows NT continues to grow its share of the enterprise as it adds more of the features present in commercial versions of UNIX. And Linux, the upstart in this arena, is posting the highest growth rate of all server operating systems, thanks to its reliability and robustness, low total cost of ownership and appeal to traditional UNIX data centers.