
Computer Fundamentals

Charles Babbage is considered the father of computing for his concept of the Analytical Engine, proposed in 1837. The Analytical Engine contained an Arithmetic Logic Unit (ALU), basic flow control, and integrated memory, and is hailed as the first general-purpose computer concept.
Alan Turing is considered by many to be the father of modern computer science as the world knows it. He formalized the concepts of algorithm and computation with one of his inventions, the Turing machine. Alan Turing was born on June 23, 1912 in England, the son of Julius and Sara Turing.
What is a computer?
A computer is an electronic device that manipulates information, or data. The computer is able
to work because there are instructions in its memory directing it.
It has the ability to:
store,
retrieve,
and process data.
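To make these three abilities concrete, here is a toy Python sketch that stores a value, retrieves it, and processes it. The names in it are invented for the example, and the dictionary only stands in for memory; real computers do this with memory hardware and machine instructions.

# A toy illustration of store, retrieve, and process.
# The dictionary stands in for the computer's memory.

memory = {}

memory["score"] = 41        # store: write a value into memory
value = memory["score"]     # retrieve: read the value back
result = value + 1          # process: compute something new from it

print(result)               # prints 42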
You may already know that you can use a computer to type documents, send email, play
games, and browse the Web. You can also use it to edit or create spreadsheets, presentations,
and even videos.
Hardware vs. software
Before we talk about different types of computers, let's talk about two things all computers
have in common:
Hardware and Software.
Hardware is any part of your computer that has a physical structure, such as the keyboard or mouse. It also includes all of the computer's internal parts.
Software is any set of instructions that tells the hardware what to do and how to do it.
Examples of software include web browsers, games, and word processors. Everything you do
on your computer will rely on both hardware and software. For example, right now you may be
viewing this lesson in a web browser (software) and using your mouse (hardware) to click from
page to page.
As you learn about different types of computers, ask yourself how they differ in their hardware.
As you progress through this lesson, you'll see that different types of computers also often use
different types of software.

Generations

Computer Generations
Generation, in computer terminology, is a change in the technology with which a computer is built and used. Initially, the term generation was used to distinguish between varying hardware technologies. Nowadays, generation includes both hardware and software, which together make up an entire computer system. Five computer generations are recognized to date. Each generation is discussed below along with its time period and characteristics; the approximate dates given for each generation are those normally accepted. The five main generations of computers are as follows.
First Generation
The period of the first generation: 1946-1959. Vacuum tube based.
The period of the first generation was 1946-1959. The computers of the first generation used vacuum tubes as the basic components for memory and for the circuitry of the CPU (Central Processing Unit). These tubes, like electric bulbs, produced a lot of heat and burned out frequently; the installations were therefore very expensive and could be afforded only by very large organizations. In this generation, mainly batch-processing operating systems were used. Punched cards, paper tape, and magnetic tape were used as input and output devices. The computers of this generation used machine code as the programming language.
Second Generation
The period of the second generation: 1959-1965. Transistor based.
The period of the second generation was 1959-1965. In this generation transistors were used; they were cheaper, consumed less power, and were more compact, more reliable, and faster than the vacuum-tube machines of the first generation. In this generation, magnetic cores were used as primary memory, with magnetic tape and magnetic disks as secondary storage devices. Assembly language and high-level programming languages like FORTRAN and COBOL were used. The computers used batch-processing and multiprogramming operating systems.
Third Generation
The period of the third generation: 1965-1971. Integrated circuit based.
The period of the third generation was 1965-1971. The computers of the third generation used integrated circuits (ICs) in place of transistors. A single IC contains many transistors, resistors, and capacitors along with the associated circuitry. The IC was invented by Jack Kilby. This development made computers smaller, more reliable, and more efficient. In this generation, remote processing, time-sharing, and multiprogramming operating systems were used. High-level languages (FORTRAN II to IV, COBOL, PASCAL, PL/1, BASIC, ALGOL-68, etc.) were used during this generation.
Fourth Generation
The period of the fourth generation: 1971-1980. VLSI microprocessor based.
The period of the fourth generation was 1971-1980. The computers of the fourth generation used Very Large Scale Integration (VLSI) circuits. VLSI circuits, with about 5,000 transistors and other circuit elements and their associated circuitry on a single chip, made the microcomputers of the fourth generation possible. Fourth generation computers became more powerful, compact, reliable, and affordable. As a result, they gave rise to the personal computer (PC) revolution. In this generation, time-sharing, real-time, network, and distributed operating systems were used. High-level languages like C, C++, and dBASE were used in this generation.
Fifth Generation
The period of the fifth generation: 1980 onwards. ULSI microprocessor based.
The period of the fifth generation is 1980 to date. In the fifth generation, VLSI technology became ULSI (Ultra Large Scale Integration) technology, resulting in the production of microprocessor chips with ten million electronic components. This generation is based on parallel processing hardware and AI (Artificial Intelligence) software. AI is an emerging branch of computer science that explores the means and methods of making computers think like human beings. High-level languages like C, C++, Java, and .NET are used in this generation.
AI includes:
Robotics.
Neural networks.
Game playing.
Development of expert systems to make decisions in real-life situations.
Natural language understanding and generation.
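To give a small taste of one of these areas, the toy Python sketch below trains a single artificial neuron, the simplest building block of a neural network, to reproduce the logical AND function. It is only an illustration; the learning rate and number of passes are arbitrary choices for the example, not part of any standard.

# A toy single-neuron "neural network" learning the AND function.
# This is a minimal illustration only; values are arbitrary choices.

inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]                 # the AND truth table

w1, w2, bias = 0.0, 0.0, 0.0           # weights start at zero

def predict(x1, x2):
    # step activation: fire only if the weighted sum crosses zero
    return 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0

for _ in range(20):                    # a few training passes
    for (x1, x2), target in zip(inputs, targets):
        error = target - predict(x1, x2)
        w1   += 0.1 * error * x1       # nudge weights toward the target
        w2   += 0.1 * error * x2
        bias += 0.1 * error

for x1, x2 in inputs:
    print(x1, x2, "->", predict(x1, x2))   # learned outputs: 0, 0, 0, 1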
Types of Computer
1. Supercomputer
The fastest, largest, most powerful, and most expensive type of computer. Supercomputers are employed for specialized applications that require an immense amount of mathematical calculation (number crunching): for example, weather forecasting, scientific simulations, (animated) graphics, fluid dynamics calculations, nuclear energy research, electronic design, and analysis of geological data (e.g. in petrochemical prospecting).

2. Mainframe Computer
This is a little smaller and less powerful than the supercomputer but, like the supercomputer, it is also expensive.
3. Mini Computer
It is a multi-user computer system capable of supporting hundreds of users simultaneously.
4. WorkStation
It is a single-user computer system, similar to a personal computer but with a more powerful microprocessor.
A workstation is a computer used for engineering applications (CAD/CAM), desktop publishing, software development, and other applications that require a moderate amount of computing power and relatively high-quality graphics capabilities. Workstations generally come with a large, high-resolution graphics screen, a large amount of RAM, built-in network support, and a graphical user interface. Most workstations also have a mass storage device such as a disk drive, but a special type of workstation, called a diskless workstation, comes without a disk drive. Common operating systems for workstations are UNIX and Windows NT.
Like PCs, workstations are single-user computers, but they are typically linked together to form a local-area network, although they can also be used as stand-alone systems.
5. Personal Computer (PC)
A PC can be defined as a small, relatively inexpensive computer designed for an individual user. PCs are based on microprocessor technology, which enables manufacturers to put an entire CPU on one chip. Businesses use personal computers for word processing, accounting, desktop publishing, and for running spreadsheet and database management applications. At home, the most popular uses for personal computers are playing games and surfing the Internet.
Although personal computers are designed as single-user systems, these systems are normally linked together to form a network. In terms of power, high-end models of the Macintosh and PC nowadays offer the same computing power and graphics capability as low-end workstations by Sun Microsystems, Hewlett-Packard, and Dell.
There are two main types of personal computers.
1. PC compatibles (PCs)
This type of computer began with the original IBM PC, introduced in 1981. Other companies began creating similar computers, which were called IBM PC Compatible (often shortened to PC). Today, this is the most common type of personal computer, and it typically includes the Microsoft Windows operating system.
2. Macintosh (Macs)
The Macintosh computer was introduced in 1984, and it was the first widely sold personal computer with a graphical user interface, or GUI (pronounced gooey). All Macs are made by one company (Apple), and they almost always use the Mac OS X operating system.
The main differences between the two are the operating systems and the processors they use.
This category of computer also includes two additional types: mobile computers and handheld computers. The most popular type of mobile computer is the notebook or laptop computer, and a handheld computer is a very small PC that you can hold in your hand.
Desktop Computer
Many people use desktop computers at work, home, and school. Desktop computers
are designed to be placed on a desk, and they're typically made up of a few different
parts, including the computer case, monitor, keyboard, and mouse.
When most people hear the word computer, they think of a personal computer such
as a desktop or laptop. However, computers come in many shapes and sizes, and they
perform many different functions in our daily lives. When you withdraw cash from an
ATM, scan groceries at the store, or use a calculator, you're using a type of computer.
Laptop computers
The second type of computer you may be familiar with is a laptop computer, commonly called a
laptop. Laptops are battery-powered computers that are more portable than desktops, allowing
you to use them almost anywhere.

Tablet computers
Tablet computers, or tablets, are handheld computers that are even more portable than laptops. Instead of a keyboard and mouse, tablets use a touch-sensitive screen for typing and navigation. The iPad is an example of a tablet.
Servers
A server is a computer that serves up information to other computers on a network.
For example, whenever you use the Internet, you're looking at something that's stored
on a server. Many businesses also use local file servers to store and share files
internally.
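To make this concrete, the short Python sketch below turns the machine it runs on into a tiny file server, using the standard library's http.server module. The port number is an arbitrary choice for the example.

# A minimal file server: shares the current directory over the network.
# Other computers can then fetch files from it with a web browser.

from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("", 8000), SimpleHTTPRequestHandler)  # port 8000 is arbitrary
print("Serving files on port 8000...")
server.serve_forever()

Browsing to http://localhost:8000 (or to the machine's address from another computer on the network) shows the served files; the machine is now acting as a very small server.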
Other types of computers
Many of today's electronics are basically specialized computers, though we don't always think
of them that way. Here are a few common examples.
Smartphones: Many cell phones can do a lot of things computers can do, including browsing
the Internet and playing games. They are often called smartphones.
Wearables: Wearable technology is a general term for a group of devices, including fitness trackers and smartwatches, that are designed to be worn throughout the day. These devices are often called wearables for short.
Game consoles: A game console is a specialized type of computer that is used for playing
video games on your TV.
TVs: Many TVs now include applications, or apps, that let you access various types of online content. For example, you can stream video from the Internet directly onto your TV.

The Origin of the Internet


The Internet has revolutionized the computer and communications world like nothing before.
The invention of the telegraph, telephone, radio, and computer set the stage for this
unprecedented integration of capabilities.
The Internet is at once a world-wide broadcasting capability, a mechanism for information
dissemination, and a medium for collaboration and interaction between individuals and their
computers without regard for geographic location.
The Internet represents one of the most successful examples of the benefits of sustained
investment and commitment to research and development of information infrastructure.
Beginning with the early research in packet switching, the government, industry and academia
have been partners in evolving and deploying this exciting new technology.
This is intended to be a brief, necessarily cursory and incomplete history. Much material
currently exists about the Internet, covering history, technology, and usage. A trip to almost any
bookstore will find shelves of material written about the Internet.
This history revolves around four (4) distinct aspects.
1. There is the technological evolution that began with early research on packet switching
and the ARPANET (and related technologies), and where current research continues to
expand the horizons of the infrastructure along several dimensions, such as scale,
performance, and higher-level functionality.
ARPANET stands for Advanced Research Projects Agency Network; it adopted the TCP/IP protocols on January 1, 1983. Its sponsoring agency, ARPA, was renamed DARPA (Defense Advanced Research Projects Agency) in 1972, changed back to ARPA in 1993, and back to DARPA again in 1996.
2. There is the operations and management aspect of a global and complex operational
infrastructure.
3. There is the social aspect, which resulted in a broad community of Internauts working
together to create and evolve the technology.
4. And there is the commercialization aspect, resulting in an extremely effective transition
of research results into a broadly deployed and available information infrastructure.
The Internet today is a widespread information infrastructure, the initial prototype of what is
often called the National (or Global or Galactic) Information Infrastructure. Its history is complex
and involves many aspects - technological, organizational, and community. And its influence
reaches not only to the technical fields of computer communications but throughout society as
we move toward increasing use of online tools to accomplish electronic commerce, information
acquisition, and community operations.

The first recorded description of the social interactions that could be enabled through networking
was a series of memos written by J.C.R. Licklider of MIT in August 1962 discussing his "Galactic
Network" concept. He envisioned a globally interconnected set of computers through which
everyone could quickly access data and programs from any site. In spirit, the concept was very
much like the Internet of today. Licklider was the first head of the computer research program
at DARPA, starting in October 1962. While there, he convinced his successors, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.
Leonard Kleinrock at MIT published the first paper on packet switching theory in July 1961 and
the first book on the subject in 1964. Kleinrock convinced Roberts of the theoretical feasibility
of communications using packets rather than circuits, which was a major step along the path
towards computer networking.
In August 1968, after Roberts and the DARPA funded community had refined the overall
structure and specifications for the ARPANET, an RFQ was released by DARPA for the
development of one of the key components, the packet switches called Interface Message
Processors (IMPs).
Computers were added quickly to the ARPANET during the following years, and work proceeded
on completing a functionally complete Host-to-Host protocol and other network software. In
December 1970 the Network Working Group (NWG) working under S. Crocker finished the initial
ARPANET Host-to-Host protocol, called the Network Control Protocol (NCP). As the ARPANET
sites completed implementing NCP during the period 1971-1972, the network users finally could
begin to develop applications.
In October 1972, Kahn organized a large, very successful demonstration of the ARPANET at the
International Computer Communication Conference (ICCC). This was the first public demonstration of this new network technology. It was also in 1972 that the initial
"hot" application, electronic mail, was introduced.
In March of that year, Ray Tomlinson at BBN wrote the basic email message send and read software,
motivated by the need of the ARPANET developers for an easy coordination mechanism. In July,
Roberts expanded its utility by writing the first email utility program to list, selectively read, file,
forward, and respond to messages. From there email took off as the largest network application
for over a decade. This was a harbinger of the kind of activity we see on the World Wide Web
today, namely, the enormous growth of all kinds of "people-to-people" traffic.

The Internet Society was formed in 1991, under the auspices of Kahn's Corporation for National Research Initiatives (CNRI) and the leadership of Cerf, then with CNRI.
In 1992, yet another reorganization took place: the Internet Activities Board was reorganized and renamed the Internet Architecture Board, operating under the auspices of the
Internet Society. A more "peer" relationship was defined between the new IAB and IESG, with
the IETF and IESG taking a larger responsibility for the approval of standards. Ultimately, a
cooperative and mutually supportive relationship was formed between the IAB, IETF, and
Internet Society, with the Internet Society taking on as a goal the provision of service and other
measures which would facilitate the work of the IETF.
The recent development and widespread deployment of the World Wide Web has brought with
it a new community, as many of the people working on the WWW have not thought of
themselves as primarily network researchers and developers. A new coordination organization
was formed, the World Wide Web Consortium (W3C). Initially led from MIT's Laboratory for
Computer Science by Tim Berners-Lee (the inventor of the WWW) and Al Vezza, W3C has taken
on the responsibility for evolving the various protocols and standards associated with the Web.
Thus, through the over two decades of Internet activity, we have seen a steady evolution of
organizational structures designed to support and facilitate an ever-increasing community
working collaboratively on Internet issues.
Commercialization of Technology
Commercialization of the Internet involved not only the development of competitive, private
network services, but also the development of commercial products implementing the Internet
technology. In the early 1980s, dozens of vendors were incorporating TCP/IP into their products
because they saw buyers for that approach to networking. Unfortunately, they lacked both real information about how the technology was supposed to work and real information about how customers planned to use this approach to networking. Many saw it as a nuisance add-on that had to be glued on
to their own proprietary networking solutions: SNA, DECNet, Netware, NetBios. The DoD had
mandated the use of TCP/IP in many of its purchases but gave little help to the vendors regarding
how to build useful TCP/IP products.
In 1985, recognizing this lack of information availability and appropriate training, Dan Lynch in
cooperation with the IAB arranged to hold a three-day workshop for ALL vendors to come and learn
about how TCP/IP worked and what it still could not do well. The speakers came mostly from the
DARPA research community, who had both developed these protocols and used them in day-to-day work. About 250 vendor personnel came to listen to 50 inventors and experimenters. The
results were surprises on both sides: the vendors were amazed to find that the inventors were
so open about the way things worked (and what still did not work) and the inventors were
pleased to listen to new problems they had not considered, but were being discovered by the
vendors in the field. Thus a two-way discussion was formed that has lasted for over a decade.
After two years of conferences, tutorials, design meetings and workshops, a special event was
organized that invited those vendors whose products ran TCP/IP well enough to come together
in one room for three days to show off how well they all worked together and also ran over the
Internet. In September of 1988 the first Interop trade show was born. 50 companies made the
cut. 5,000 engineers from potential customer organizations came to see if it all did work as was
promised. It did. Why? Because the vendors worked extremely hard to ensure that everyone's
products interoperated with all of the other products - even with those of their competitors. The
Interop trade show has grown immensely since then and today it is held in 7 locations around
the world each year to an audience of over 250,000 people who come to learn which products
work with each other in a seamless manner, learn about the latest products, and discuss the
latest technology.
In parallel with the commercialization efforts that were highlighted by the Interop activities, the
vendors began to attend the IETF meetings that were held 3 or 4 times a year to discuss new
ideas for extensions of the TCP/IP protocol suite. Starting with a few hundred attendees mostly
from academia and paid for by the government, these meetings now often exceed a thousand
attendees, mostly from the vendor community and paid for by the attendees themselves. This
self-selected group evolves the TCP/IP suite in a mutually cooperative manner. The reason it is
so useful is that it is composed of all stakeholders: researchers, end users and vendors.
Network management provides an example of the interplay between the research and
commercial communities. In the beginning of the Internet, the emphasis was on defining and
implementing protocols that achieved interoperation.
As the network grew larger, it became clear that the sometimes ad hoc procedures used to manage the network would not scale. Manual configuration of tables was replaced by distributed
automated algorithms, and better tools were devised to isolate faults. In 1987 it became clear
that a protocol was needed that would permit the elements of the network, such as the routers,
to be remotely managed in a uniform way. Several protocols for this purpose were proposed,
including Simple Network Management Protocol or SNMP (designed, as its name would suggest,
for simplicity, and derived from an earlier proposal called SGMP), HEMS (a more complex design from the research community), and CMIP (from the OSI community). A series of meetings led to the decision that HEMS would be withdrawn as a candidate for standardization, in order to help
resolve the contention, but that work on both SNMP and CMIP would go forward, with the idea
that SNMP could be a more near-term solution and CMIP a longer-term approach. The market could choose the one it found more suitable. SNMP is now used almost universally for network-based management.
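To give a rough flavor of what uniform remote management means, here is a toy Python sketch imitating the manager/agent pattern that SNMP formalizes: each device exposes named variables, and a manager polls them all in one uniform way. This is not the real SNMP protocol or any real SNMP library; every class and name here is invented for the illustration.

# Toy imitation of the SNMP manager/agent idea (NOT real SNMP).
# Each "agent" exposes named management variables; the "manager"
# can query any device in the same uniform way.

class ToyAgent:
    def __init__(self, name, variables):
        self.name = name
        self.variables = variables     # e.g. {"uptime": 12345}

    def get(self, variable):
        # in real SNMP this would be a GetRequest sent over the network
        return self.variables.get(variable)

devices = [
    ToyAgent("router-1", {"uptime": 86400, "packets_in": 1882}),
    ToyAgent("switch-7", {"uptime": 3600, "packets_in": 95210}),
]

# The manager polls every element of the network uniformly.
for device in devices:
    print(device.name, "uptime:", device.get("uptime"))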
In the last few years, we have seen a new phase of commercialization. Originally, commercial
efforts mainly comprised vendors providing the basic networking products, and service providers
offering the connectivity and basic Internet services. The Internet has now become almost a
"commodity" service, and much of the latest attention has been on the use of this global
information infrastructure for support of other commercial services. This has been tremendously
accelerated by the widespread and rapid adoption of browsers and the World Wide Web
technology, allowing users easy access to information linked throughout the globe. Products are
available to facilitate the provisioning of that information and many of the latest developments
in technology have been aimed at providing increasingly sophisticated information services on
top of the basic Internet data communications.
On October 24, 1995, the FNC unanimously passed a resolution defining the term Internet. This
definition was developed in consultation with members of the internet and intellectual property
rights communities.
RESOLUTION: The Federal Networking Council (FNC) agrees that the following language reflects
our definition of the term "Internet". "Internet" refers to the global information system that
(i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons;
(ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and
(iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein.
The Internet has changed much in the two decades since it came into existence. It was conceived
in the era of time-sharing, but has survived into the era of personal computers, client-server and
peer-to-peer computing, and the network computer. It was designed before LANs existed, but
has accommodated that new network technology, as well as the more recent ATM and frame
switched services. It was envisioned as supporting a range of functions from file sharing and
remote login to resource sharing and collaboration, and has spawned electronic mail and more
recently the World Wide Web. But most important, it started as the creation of a small band of
dedicated researchers, and has grown to be a commercial success with billions of dollars of
annual investment.
One should not conclude that the Internet has now finished changing. The Internet, although a
network in name and geography, is a creature of the computer, not the traditional network of
the telephone or television industry. It will, indeed it must, continue to change and evolve at the speed of the computer industry if it is to remain relevant. It is now changing to provide new
services such as real time transport, in order to support, for example, audio and video streams.
The availability of pervasive networking (i.e., the Internet) along with powerful affordable
computing and communications in portable form (i.e., laptop computers, two-way pagers, PDAs,
cellular phones), is making possible a new paradigm of nomadic computing and communications.
This evolution will bring us new applications - Internet telephone and, slightly further out,
Internet television. It is evolving to permit more sophisticated forms of pricing and cost recovery,
a perhaps painful requirement in this commercial world. It is changing to accommodate yet
another generation of underlying network technologies with different characteristics and
requirements, e.g. broadband residential access and satellites. New modes of access and new
forms of service will spawn new applications, which in turn will drive further evolution of the net
itself.
The most pressing question for the future of the Internet is not how the technology will change,
but how the process of change and evolution itself will be managed. As this paper describes, the
architecture of the Internet has always been driven by a core group of designers, but the form of
that group has changed as the number of interested parties has grown. With the success of the
Internet has come a proliferation of stakeholders - stakeholders now with an economic as well
as an intellectual investment in the network.
We now see, in the debates over control of the domain name space and the form of the next
generation IP addresses, a struggle to find the next social structure that will guide the Internet in
the future. The form of that structure will be harder to find, given the large number of concerned
stakeholders. At the same time, the industry struggles to find the economic rationale for the large
investment needed for the future growth, for example to upgrade residential access to a more
suitable technology. If the Internet stumbles, it will not be because we lack for technology, vision,
or motivation. It will be because we cannot set a direction and march collectively into the future.

History of World Wide Web


The web is a wonderful place. It connects people from across the globe, keeps us updated with
our friends and family, and creates revolutions never before seen in our lifetime. It has certainly
come a long way since its humble beginnings back in the early 1980s.
In order to understand the history of the World Wide Web, it's important to understand the differences between the World Wide Web and the Internet. Many people refer to them as the same thing, but in fact, although everyday users may perceive them as one and the same, they are very different.
The internet is a series of huge computer networks that allows many computers to connect and communicate with each other globally. Upon the internet reside a series of languages which allow information to travel between computers. These are known as protocols. For instance, some common protocols for transferring email are IMAP, POP3, and SMTP. Just as email is a layer on the internet, the World Wide Web is another layer which uses different protocols.
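To make the idea of a protocol concrete, the sketch below hands a message to a mail server using SMTP, one of the email protocols just mentioned, via Python's standard smtplib module. The server name and addresses are invented placeholders; a real program would use the details of an actual mail server.

# Sending a message with SMTP, one of the email protocols named above.
# The host and addresses below are invented placeholders.

import smtplib

message = """From: alice@example.com
To: bob@example.com
Subject: Hello

Sent across the internet using the SMTP protocol."""

with smtplib.SMTP("mail.example.com", 25) as server:  # connect to a mail server
    server.sendmail("alice@example.com",              # envelope sender
                    ["bob@example.com"],              # envelope recipient(s)
                    message)                          # the message itself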
The World Wide Web uses three key technologies:

HTML (HyperText Markup Language) - The language that we write our web pages in.

HTTP (HyperText Transfer Protocol) - Although other protocols can be used, such as FTP, this is the most common protocol. It was developed specifically for the World Wide Web and is favored for its simplicity and speed. This protocol requests the HTML document from the server and serves it to the browser.

URL (Uniform Resource Locator) - The last piece of the puzzle required to allow the web to work. This is the address which indicates where any given document lives on the web. It can be defined as <protocol>://<node>/<location>; the short sketch below takes a real URL apart into these pieces.

In the beginning
Ideas for the World Wide Web date back to as early as 1946 when Murray Leinster wrote a short
story which described how computers (which he referred to as 'Logics') lived in every home, with each one having access to a central device where information could be retrieved. Although the story has several differences from the way the web works today, it does capture the idea of a huge information network available to everyone in their homes.
The real vision and execution for the World Wide Web didn't come about until around 40 years later, in 1980, when an English chap by the name of Tim Berners-Lee was working on a project known as 'Enquire'. Enquire was a simple database of the people and software at the place where Berners-Lee worked. It was during this project that he experimented with hypertext. Hypertext is text, displayed on a device, that contains hyperlinks to other text. The Berners-Lee Enquire system used hyperlinks on each page of the database, each page referencing other relevant pages within the system.
(Pictured: Tim Berners-Lee, inventor of the World Wide Web, at the London 2012 Olympic opening ceremony.)
Berners-Lee was a physicist, and in his need to share information with other physicists around the world he found that there was no quick and easy solution for doing so. With this in mind, in 1989 he set about putting together a proposal for a centralized database which contained links to other documents. This would have been the perfect solution for Tim and his colleagues, but it turned out nobody was interested, and nobody took any notice - except for one person. Tim's boss liked his idea and encouraged him to implement it in their next project. The new system was given a few different names, such as TIM (The Information Mine), which was turned down because its initials spelled out Tim's name. After a few suggestions, there was only one name that stuck: the World Wide Web.

The First Browsers


By December 1990, Tim had joined forces with another physicist, Robert Cailliau, who rewrote Tim's original proposal. It was their vision to combine hypertext with the internet to create web pages, but no one at the time could appreciate how successful the idea would become. Despite the little interest, Berners-Lee continued to develop the three major components of the web: HTTP, HTML, and the world's first web browser. Funnily enough, this browser was also called 'WorldWideWeb', and it doubled as an editor.
On August 6th, 1991, the World Wide Web project was announced to the world, with the man himself describing it:
The WWW project was started to allow high energy physicists to share data, news, and
documentation. We are very interested in spreading the web to other areas, and having
gateway servers for other data.
Boring, perhaps, but this is the world's first website.
The page outlined the plans for the World Wide Web. It was also this year that HTML was born
and the first publicly available description of HTML was released. Some of these tags are still in
use today, such as h1-h6 tags, paragraph tags and anchor tags. If we take a look at the source
code from the world's first web page, we can see some of these in use.
http://www.w3.org/History/19921103-hypertext/hypertext/WWW/TheProject.html
<HEADER>
<TITLE>The World Wide Web project</TITLE>
<NEXTID N="55">
</HEADER>
<BODY>
<H1>World Wide Web</H1>The WorldWideWeb (W3) is a wide-area<A
NAME=0 HREF="WhatIs.html">
hypermedia</A> information retrieval
initiative aiming to give universal
access to a large universe of documents. <P>
Everything there is online about
W3 is linked directly or indirectly
to this document, including an <A
NAME=24 HREF="Summary.html">executive
summary</A>

Shortly afterwards other browsers were released, each bringing differences and improvements.
Let's take a look at some of these browsers.

Line Mode Browser - February 1992. This was also brought to us by Berners-Lee. It was the first browser to support multiple platforms.

Viola WWW Browser released - March 1992. This is widely suggested to be the world's
first popular browser. It brought with it a stylesheet and scripting language, long before
JavaScript and CSS.

Mosaic Browser released - January 5th, 1993. Mosaic was very highly rated when it first came out. It was developed at the University of Illinois.

Cello Browser released - June 8th, 1993. This was the first browser available for
Windows.

Netscape Navigator 1.1 released - March 1995. This was the first browser to introduce
tables to HTML.

Opera 1.0 released - April 1995. This was originally a research project for a Norwegian
telephone company. The browser is still available today and is currently at version 12.

Internet Explorer 1.0 released - August 1995. Microsoft decided to get in on the act when its Windows 95 operating system was released. This was the browser that ran exclusively on it.
