Decision Points
Role
The first decision when specifying any computer system is the machine's role. Will you be sitting at the console running productivity applications or browsing the web? If so, a familiar desktop is best. Will the machine be accessed remotely by many users or provide services to remote users? Then it's a server. Servers typically sit in a rack and share a keyboard and monitor with many other computers, since console access is generally used only for configuration and troubleshooting. Servers generally run a command line interface (CLI), which frees up resources for the real purpose of the computer: serving information to clients (any user or system that accesses resources remotely). Desktop systems primarily run a graphical user interface (GUI) for ease of use.
Function
Next, determine the functions of the machine. Is there specific software it needs to run, or specific functions it needs to perform? Will there be hundreds, or even thousands, of these machines running at the same time? What is the skill set of the team managing the computer and software?
Life Cycle
The service lifetime and risk tolerance of the server also need to be determined. Operating system and software upgrades come on a periodic basis, called a release cycle. Vendors support older versions of software only for a certain period of time before they stop offering updates; this period is called a maintenance cycle or life cycle.
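As a concrete illustration, here is a minimal Python sketch that reads /etc/os-release, the standard file in which Linux distributions identify themselves; the optional SUPPORT_END field, published by some distributions, records when the vendor's maintenance cycle ends.

    # A minimal sketch: identify the running Linux release and, where
    # the distribution publishes one, its vendor end-of-support date.
    # The SUPPORT_END field is optional and absent on many systems.

    def read_os_release(path="/etc/os-release"):
        info = {}
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    info[key] = value.strip('"')
        return info

    release = read_os_release()
    print(release.get("PRETTY_NAME", "unknown distribution"))
    print("Supported until:", release.get("SUPPORT_END", "not published"))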
In an enterprise server environment, maintenance and release cycles are critical considerations because major upgrades are time-consuming and expensive. Instead, the server hardware itself is often replaced, because the increased performance is worth the extra expense, and the human resources involved in an upgrade are often many times more costly than the hardware.
Consider This
There is a fair amount of work involved in upgrading a server due to specialized configurations, application software patching, and user testing, so a proactive organization will seek to maximize its return on investment in both human and monetary capital.
Modern data centers are addressing this challenge through virtualization. In a virtual environment, one physical machine can host dozens, or even hundreds, of virtual machines, decreasing space and power requirements and allowing the automation of many tasks previously done manually by systems administrators. Scripting programs allow virtual machines to be created, configured, deployed, and removed from a network without the need for human intervention. Of course, a human still needs to write the script and monitor these systems, at least for now.
The need for physical hardware upgrades has also decreased immensely with the advent of cloud service providers like Amazon Web Services, Rackspace, and Microsoft Azure. Similar advances have helped desktop administrators manage upgrades in an automated fashion with little to no user interruption.
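To make the idea of scripted provisioning concrete, the Python sketch below uses boto3, the Amazon Web Services SDK, to launch a virtual machine and later remove it. The region, image ID, and instance type are placeholder assumptions, and real use requires configured AWS credentials.

    # A minimal sketch of a scripted virtual machine life cycle,
    # assuming the boto3 AWS SDK is installed and credentials are
    # already configured. The AMI ID and region are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Create and deploy a new virtual machine from a machine image.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder image ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print("Launched", instance_id)

    # When no longer needed, the machine is removed just as easily.
    ec2.terminate_instances(InstanceIds=[instance_id])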
Stability
Individual software
releases can be characterized as beta or stable depending
on where they are in the release cycle. When a software release has many new
features that haven’t been tested, it’s typically referred to as beta.
After being tested in the field, its designation changes to stable.
Users who need the
latest features can decide to use beta software. This is often done in the
development phase of a new deployment and provides the ability to request
features not available on the stable release.
Production servers typically use stable software unless a needed feature is not available in the stable release and the utility it provides outweighs the risk of running code that has not been thoroughly tested.
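Version numbers usually encode this distinction. As a small illustration, the Python sketch below uses the third-party packaging library (an assumption, not part of the standard library) to show how a beta version compares with a stable one:

    # A small sketch using the third-party "packaging" library to
    # show how beta (pre-release) versions relate to stable ones.
    from packaging.version import Version

    beta = Version("2.0b1")  # a beta of the upcoming 2.0 release
    stable = Version("1.9")  # the current stable release

    print(beta.is_prerelease)    # True: not yet a stable release
    print(stable.is_prerelease)  # False
    print(beta > stable)         # True: newer, but less field-tested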
Software in the open source realm is often released for peer review very early in its development process, and can very quickly be put into testing and even production environments, providing extremely useful feedback and code submissions that fix issues or add needed features.
Conversely,
proprietary software will often be kept secret for most of its development,
only reaching a public beta stage when it’s almost ready for release.
Compatibility
Another loosely related concept is backward compatibility, which refers to the ability of later operating systems to remain compatible with software made for earlier versions. This is usually a concern when an operating system must be upgraded but an application software upgrade is not possible due to cost or lack of availability.
The norm for open source software development is to ensure backward compatibility first and break things only as a last resort. The common practice of maintaining and versioning libraries of functions helps greatly. Typically, a library used by one or more programs is given a new version number when significant changes occur, but it keeps all the functions (and compatibility) of earlier versions that existing software may be hard-coded to use.
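A hypothetical Python example of that practice: a library renames a function in a new release but keeps the old name working, so programs written against the earlier version continue to run.

    # A hypothetical library module illustrating backward compatibility:
    # version 2 renames a function but keeps the version 1 name working
    # so that existing programs do not break.
    import warnings

    def fetch_records(source):  # the new, preferred name in version 2
        with open(source) as f:
            return [line.strip() for line in f]

    def get_records(source):  # the version 1 name, kept for old callers
        warnings.warn(
            "get_records() is deprecated; use fetch_records()",
            DeprecationWarning,
        )
        return fetch_records(source)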
Cost
Cost is always a factor when specifying new systems. Microsoft charges annual licensing fees for users, servers, and other software, as do many other software companies. Ultimately, the choice of operating system will be affected by available hardware, staff resources and skills, cost of purchase, maintenance, and projected future requirements.
Virtualization and outsourced support services offer the modern IT organization the promise of paying only for what it uses rather than building in excess capacity. This not only controls costs but also offers opportunities for people both inside and outside the organization to provide expertise and value.
Interface
The first electronic
computer systems were controlled by means of switches and plugboards similar to
those used by telephone operators at the time. Then came punch cards and
finally a text-based terminal system similar to the Linux command line
interface (CLI) in use today. The graphical user interface
(GUI), with a mouse and buttons to click, was pioneered at Xerox PARC (Palo
Alto Research Center) in the early 1970s and popularized by Apple Computer in
the 1980s.
Today, operating systems offer both GUI and CLI interfaces; however, most consumer operating systems (Windows, macOS) are designed to shield the user from the ins and outs of the CLI.