IN THIS THREE-PART SERIES PETER HAYES LOOKS AT THE PAST,
PRESENT AND FUTURE OF THE INTERNET - AND OTHER NETWORKED
COMPUTERS - FROM A TECHNICAL AS WELL AS A POLITICAL POINT
OF VIEW. TODAY - IN PART ONE - HE LOOKS AT THE BIRTH OF
NETWORKED COMPUTING AND THE EMERGENCE OF THE INTERNET.
Innovation and invention are two areas of life that have
little respect for political correctness. They are often
found grabbing whatever chance or opportunity happens to
come their way: the Cold War may have been a time of great
East/West tension and distrust, but it was also a boom time
for certain science and technology projects looking for
government funding.
Unfashionable and ungracious as it might seem in cold
print, without this superpower stand-off the world would
probably not be as advanced as it is today in the fields
of satellite launch and communication, space exploration,
or the networking of computers.
The first US "internet" (note the small "i" - the
difference will be explained in the next paragraph) was
designed to be the main communications method on "the day
after" a Russian nuclear strike. With conventional
communications systems destroyed (or at least presumed to
be destroyed), a complex array of interlinked computers
would still have a chance to communicate through any of
many possible connections.
(In a computer-dictionary sense, an "internet" (with a
small "i") is just another name for a network. However, in
most instances it is used to describe a collection of
networks connected by so-called "routers." The Internet
(with a capital "I") is the sophisticated "multi-protocol"
internet system that many computer users now subscribe to
in order to gain access to E-mail and the World Wide Web
(WWW).)
From this unlikely Cold War beginning came what we now
know as the Internet, sometimes also known as the
Information Superhighway. Curiously, the third common term
used for this area of computing - Cyberspace - came from
the world of literature: William Gibson used the term in
his novel "Neuromancer," where it described the "society
that gathers around the world of computers."
The problems of computer networking are varied and exist
at many levels. However, the number one problem - at any
point in its short history - has been dealing with
networking errors, or "retaining system integrity" in more
technical language.
Simply linking two pieces of non-aligned computer
equipment together is not particularly difficult for a
skilled programmer. However, unless the software is very
sophisticated, any crash (or unexpected occurrence) on a
linked system can easily bring down the whole network -
the so-called "domino effect."
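To make the "domino effect" concrete, here is a minimal
sketch - in Python, a language chosen purely for
illustration - of the defensive style of programming that
keeps one machine's failure from toppling its neighbour.
The host name and port are hypothetical, and the error
handling is deliberately simple.

    import socket

    PEER_HOST = "peer.example.net"  # hypothetical address of the linked machine
    PEER_PORT = 9000                # hypothetical port number

    def poll_peer():
        """Try to read a status message from the peer.

        Any network failure is contained here, so a crash on the
        far side cannot "domino" into a crash on this side: we
        log it and carry on.
        """
        try:
            with socket.create_connection((PEER_HOST, PEER_PORT), timeout=5) as conn:
                return conn.recv(1024)
        except OSError as exc:
            print("peer unavailable (%s); continuing without it" % exc)
            return None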
Keeping a network up and running through system crashes
and third-party miscalibrations (not to mention possible
mischief such as hacking) will always be the most
difficult task - past, present or future.
While clearly less than perfect, today's Internet is
quite a mini-miracle. The politics will be discussed in
the second and third parts of this series, but suffice it
to say that many independent parties have to show a great
deal of skill, investment and co-operation before a single
byte of information can be exchanged.
Internet law and order is achieved through a system of
so-called "firewalls" and "supervisor modes." In
non-technical language, this means minimising the effects
of distant third-party errors by constantly checking - and
if necessary "repairing" - the system. However, this
doesn't mean that data loss cannot occur, or that an
individual user cannot crash off the system and have to
log on again.
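As a rough illustration of that "constantly checking and
repairing" idea, here is a hedged Python sketch in which
every message carries a checksum; a receiver that finds a
mismatch discards the data and asks for it again rather
than passing corruption along. The framing and function
names are invented for this example - real Internet
protocols such as TCP do the equivalent job far more
elaborately.

    import zlib

    def frame(payload):
        """Prepend a CRC-32 checksum so the receiver can verify the payload."""
        return zlib.crc32(payload).to_bytes(4, "big") + payload

    def unframe(message):
        """Return the payload if the checksum matches, else None."""
        received_crc = int.from_bytes(message[:4], "big")
        payload = message[4:]
        if zlib.crc32(payload) != received_crc:
            return None  # corrupted in transit: caller should ask for a resend
        return payload

    assert unframe(frame(b"hello")) == b"hello"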
(Only on a couple of occasions have outside events
noticeably damaged the efficiency of the Internet - and in
one of those cases the cause was physical fire damage.)
The first person to envisage a collection of computers
sharing common information was J.C.R. Licklider, who
worked at the Massachusetts Institute of Technology (MIT)
near Boston.
In 1962 he wrote a paper envisioning a "Galactic Network"
of computers and how it would benefit mankind. Another
early pioneer was Leonard Kleinrock, whose studies
resulted in a research paper stating - correctly, as it
turned out - that computers would have to develop a
so-called "packet switching" protocol in order to
communicate reliably.
(Today packet switching is the heart and soul of both the
Internet and nearly all computer-to-computer information
systems - including so-called Local Area Networks or
LANs.)
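To give a feel for the idea, here is a minimal Python
sketch of packet switching: the message is cut into small,
numbered packets that may travel by different routes and
arrive out of order, and the receiver uses the numbering
to reassemble the original. Real protocols add addressing,
checksums and retransmission on top of this bare scheme.

    import random

    def packetise(message, size=8):
        """Cut a message into (sequence number, chunk) packets."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Rebuild the message whatever order the packets arrived in."""
        return b"".join(chunk for _, chunk in sorted(packets))

    original = b"Packets may take different routes across the network."
    packets = packetise(original)
    random.shuffle(packets)       # simulate out-of-order arrival
    assert reassemble(packets) == original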
In 1965 Thomas Merrill and Lawrence Roberts managed to
create a network between a computer called "TX-2" in
Boston and a Californian computer called "Q-32," using
only a normal phone line. In those days computers were not
mass-produced, and the connection was only made possible
by a huge amount of inter-university co-operation.
In 1969 the government-funded ARPANET (Advanced Research
Projects Agency NETwork) opened for business. This small
network linked mainframe computers at four US sites: UCLA,
the Stanford Research Institute, UC Santa Barbara and the
University of Utah. The system crossed the Atlantic in
1973 when University College London and Norway's NORSAR
became part of the network.
With great irony, ARPANET was closed twenty years to the
day after it opened, marked in 1989 with a UCLA conference
and networking debate. In the words of the cliche, ARPANET
had become a victim of its own success!
During the 1960s other systems were devised for passing
information over a telephone line. As early as 1964 the
Post Office (now BT) developed the so-called "Viewdata"
system, which allowed computer data to be sent down a
standard phone line. At its peak the system had 30,000
terminals, which looked a little like portable televisions
with a number pad for requesting information.
(In 1974 a similar system called CEEFAX - better known
today as Teletext - was launched, using spare lines of the
television signal to send one-way data.)
In 1980 Tim Berners-Lee started work towards what we
today call the World Wide Web. Working at the European
CERN laboratory near Geneva, he put together a "hypertext"
system that allowed text and graphics to appear
side-by-side. Hypertext is a system in which the
instructions (or code) are embedded in the central text
rather than kept separate.
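As a rough illustration of instructions living inside the
text itself, here is a small Python sketch that scans a
fragment of hypertext-style markup and pulls out the
embedded link instructions. The markup is modelled on
HTML, the language of the Web, and the parsing is
deliberately naive.

    import re

    # A fragment of hypertext: the link "instructions" sit inside the prose.
    DOCUMENT = ('See <a href="history.html">the history</a> '
                'and <a href="web.html">the Web</a>.')

    # Find every embedded link: the target address and the clickable words.
    for target, label in re.findall(r'<a href="([^"]+)">([^<]+)</a>', DOCUMENT):
        print("%r points to %s" % (label, target))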
The joy of the system was that it used third-party
software embedded in the user's computer and so could be
platform-independent. Today the WWW is the dominant force
on the Internet, although various other systems rivalled
it while the standard was becoming established.
Next time we will look at the state of the Internet
today: how standards are established and debated, as well
as some of the governing bodies such as Network Solutions
Inc. We will also start to explore the role of the
so-called "backbone" companies that provide the raw
hardware and expertise that speeds the huge amount of data
traffic across the Internet.
(C)
Peter Hayes