Computer Viruses
In the past decade, computer and networking technology has seen enormous growth.
This growth, however, has not come without a price. With the advent of the
"Information Highway", as it has been coined, a new methodology of crime
has been created. Electronic crime has been responsible for some of the most
financially devastating victimizations in society. In the recent past, society
has seen malicious editing of the Justice Department web page (1), unauthorized
access into classified government computer files, phone card and credit card
fraud, and electronic embezzlement. All these crimes are committed in the name
of "free speech." These new breed of criminals claim that information
should not be suppressed or protected and that the crimes they commit are really
not crimes at all. What they choose to deny is that the nature of their actions
are slowly consuming the fabric of our country?s moral and ethical trust in
the information age. Federal law enforcement agencies, as well as commercial
computer companies, have been scrambling to
"educate" the public on how to prevent computer crime from happening
to them. They inform us whenever there is an attack and provide us with mostly
ineffective anti-virus software, yet we are left feeling isolated and
vulnerable. I do not feel that this defensive posture is effective because it is
not proactive. Society is still being attacked by highly skilled computer
criminals about whom we know very little: who they are, their motives, and their
tools of the trade. Therefore, to be effective in defense, we must understand
how these attacks take place from a technical standpoint. To some degree, we
must learn to think like a computer criminal. Then we will be in a better position
to defend against these victimizations, which affect us on both a financial and an
emotional level. In this paper, we will explore these areas of which we know so
little, and we will also see that computers are really extensions of people. An
attack on a computer's vulnerabilities is really an attack on people's
vulnerabilities. Today, computer systems are under attack from a multitude of
sources. These range from malicious code, such as viruses and worms, to human
threats, such as hackers and phone "phreaks." These attacks target
different characteristics of a system. This leads to the possibility that a
particular system is more susceptible to certain kinds of attacks. Malicious
code, such as viruses and worms, attacks a system in one of two ways, either
internally or externally. Traditionally, the virus has been an internal threat
(an attack from within the company), while the worm, to a large extent, has been
a threat from an external source (a person attacking from the outside via modem
or connecting network). Human threats are perpetrated by individuals or groups
of individuals who attempt to penetrate systems through computer networks,
public switched telephone networks or other sources. These attacks generally
target known security vulnerabilities of systems. Many of these vulnerabilities
are simply due to configuration errors.

Malicious Code

Viruses and worms are
related classes of malicious code; as a result they are often confused. Both
share the primary objective of replication. However, they are distinctly
different with respect to the techniques they use and their host system
requirements. This distinction is due to the disjoint sets of host systems they
attack. Viruses have been almost exclusively restricted to personal computers,
while worms have attacked only multi-user systems. A careful examination of the
histories of viruses and worms can highlight the differences and similarities
between these classes of malicious code. The characteristics shown by these
histories can be used to explain the differences between the environments in
which they are found. Viruses and worms have very different functional
requirements; currently no class of systems simultaneously meets the needs of
both. A review of the development of personal computers and multi-tasking
workstations will show that the gap in functionality between these classes of
systems is narrowing rapidly. In the future, a single system may meet all of the
requirements necessary to support both worms and viruses. This implies that
worms and viruses may begin to appear in new classes of systems. A knowledge of
the histories of viruses and worms may make it possible to predict how malicious
code will cause problems in the future.

Basic Definitions

To provide a basis for further discussion, the following definitions will be used throughout the report:
Trojan Horse – a program which performs a useful function, but also performs an unexpected action;
Virus – a code segment which replicates by attaching copies of itself to existing executables;
Worm – a program which replicates itself and causes execution of the new copy;
Network Worm – a worm which copies itself to another system by using common network facilities and causes execution of the copy on that system.
In essence, a computer program which has
been infected by a virus has been converted into a "trojan horse". The
program is expected to perform a useful function, but has the unintended side
effect of viral code execution. In addition to performing the unintended task,
the virus also performs the function of replication. Upon execution, the virus
attempts to replicate and "attach" itself to another program. It is
the unexpected and uncontrollable replication that makes viruses so dangerous.
As a result, the host or victim computer can suffer extensive
damage from the virus before anyone realizes what has happened. Viruses are
currently designed to attack single platforms. A platform is defined as the
combination of hardware and the most prevalent operating system for that
hardware. As an example, a virus can be referred to as an IBM-PC virus,
referring to the hardware, or a DOS virus, referring to the operating system.
"Clones" of systems are also included with the original platform.

History of Viruses

The term "computer virus" was formally defined by
Fred Cohen in 1983, while he performed academic experiments on a Digital
Equipment Corporation VAX system. Viruses are classified as being one of two
types: research or "in the wild." A research virus is one that has
been written for research or study purposes and has received almost no
distribution to the public. On the other hand, viruses which have been seen with
any regularity are termed "in the wild." The first computer viruses
were developed in the early 1980s. The first viruses found in the wild were
Apple II viruses, such as Elk Cloner, which was reported in 1981 [Den90].
Viruses were found on the following platforms: the Apple II, IBM PC, Macintosh, Atari,
and Amiga. These computers made up a large percentage of the computers sold to the
public at that time. As a result, many people fell prey to Elk Cloner and
viruses similar in nature. People suffered losses of data ranging from personal
documents to financial business data with little or no protection or recourse.
Viruses have "evolved" over the years due to efforts by their authors
to make the code more difficult to detect, disassemble, and eradicate. This
evolution has been especially apparent in the IBM PC viruses, since there are
more distinct viruses known for the DOS operating system than for any other. The
first IBM-PC virus appeared in 1986 [Den90]; this was the Brain virus. Brain was
a boot sector virus and remained resident in the computer until "cleaned
out". In 1987, Brain was followed by Alameda (Yale), Cascade, Jerusalem,
Lehigh, and Miami (South African Friday the 13th). These viruses expanded the
target executables to include COM and EXE files. Cascade was encrypted to deter
disassembly and detection. Variable encryption appeared in 1989 with the 1260
virus. Stealth viruses, which employ various techniques to avoid detection, also
first appeared in 1989; examples include Zero Bug, Dark Avenger, and Frodo (4096 or 4K).
In 1990, self-modifying viruses, such as Whale, were introduced. The year 1991
brought the GP1 virus, which is "network-sensitive" and attempts to
steal Novell NetWare passwords. Since their inception, viruses have become
increasingly complex and equally destructive. Examples from the IBM-PC family of
viruses indicate that the most commonly detected viruses vary according to
continent, but Stoned, Brain, Cascade, and members of the Jerusalem family have
spread widely and continue to appear. This implies that highly survivable
viruses tend to be benign, replicate many times before activation, or are
somewhat innovative, utilizing some technique never used before in a virus.
Personal computer viruses exploit the lack of effective access controls in these
systems. The viruses modify files and even the operating system itself. These
are "legal" actions within the context of the operating system. While
more stringent controls are in place on multi-tasking, multi-user operating
systems (LANs or Unix), configuration errors and security holes
(security bugs) make viruses on these systems more than theoretically possible.
This leads to the following initial conclusions:
Viruses exploit weaknesses in operating system controls and human patterns of system use/misuse;
Destructive viruses are more likely to be noticed and eradicated;
An innovative virus may have a larger initial window to propagate before it is discovered and the "average" anti-viral product is modified to detect or eradicate it.
If
we reject the hypothesis that viruses do not exist on multi-user systems because
they are too difficult to write, what reasons could exist? Perhaps the explosion
of PC viruses (as opposed to viruses on other personal computer systems) can provide a
clue. The population of PCs and PC compatibles is by far the largest.
Additionally, personal computer users exchange disks frequently. Exchanging
disks is not required if the systems are all connected to a network. In this
case, large numbers of systems may be infected through the use of shared network
resources. One of the primary reasons that viruses have not been observed on
multi-user systems is that administrators of these systems are more likely to
exchange source code than executables. They tend to be more protective of
copyrighted materials, so they exchange locally developed or public domain
software. It is more convenient to exchange source code, since differences in
hardware architecture may preclude exchanging executables. It is this type of
attitude towards network security that could be viewed as victim precipitation.
The network administrators place themselves in a position to be attacked, even though
they are unaware that they are doing so. The following additional conclusion can
be made: to spread, viruses require a large population of similar systems and the
exchange of executable software. The earlier conclusions still hold as well: destructive
viruses are more likely to be eradicated, and an innovative virus may have a larger
initial window to propagate before it is discovered and the "average" anti-viral
product is modified to detect or eradicate it.
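To make that last point concrete, consider how the "average" anti-viral product of this era works: it scans executable files for fixed byte patterns (signatures) previously extracted from known virus samples. The short Python sketch below illustrates the idea only; the virus names and signature bytes in it are invented for illustration, and a real product would rely on a far larger, regularly updated signature database.

# Minimal sketch of signature-based virus detection, the approach used by the
# "average" anti-viral product discussed above. The virus names and signature
# bytes are hypothetical, chosen purely for illustration.
import sys
from pathlib import Path

# Hypothetical signature database: virus name -> byte pattern taken from a known sample.
SIGNATURES = {
    "Example.BootSector.A": bytes.fromhex("eb3c904d5344"),
    "Example.FileInfector.B": b"\x90\x90\xcd\x21\xeb\xfe",
}

def scan_file(path: Path) -> list[str]:
    """Return the names of all known signatures found in the file's bytes."""
    data = path.read_bytes()
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

def scan_tree(root: Path) -> None:
    """Scan COM and EXE files under root and report any signature matches."""
    for path in root.rglob("*"):
        if path.is_file() and path.suffix.lower() in (".com", ".exe"):
            for name in scan_file(path):
                print(f"{path}: matches signature for {name}")

if __name__ == "__main__":
    scan_tree(Path(sys.argv[1]) if len(sys.argv) > 1 else Path("."))

This sketch also makes the final conclusion plain: a virus that encrypts its body with a varying key, as the 1260 virus did, or that otherwise uses a technique the scanner has never seen, matches no stored pattern, and so it enjoys a window of free propagation until its signature is added to the database.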