
Nanocomputers


1. INTRODUCTION

1.1 NANOCOMPUTERS
The computer has already gone through a dynamic revolution. Thirty years ago, computers
were the size of a room and incredibly slow. Few people then would have imagined
supercomputers that can do over a trillion calculations per second. Today, the average person's
desktop computer is more powerful than the fastest computers of 30 years ago. The only
way this progress can continue is if a new type of computer is developed. This computer is known as a
nanocomputer, and it may one day replace the modern computer, whose advancement will eventually
be halted by economic and scientific constraints.

The constraints for computers come from the circuits that form them. The most
important component of a computer is its "brain", commonly referred to as the central
processing unit. Computer chip manufacturers, such as Intel, spend billions of dollars to build
plants and do research that will allow these chips to shrink in size. However, the costs of
research and plants are increasing at a substantial rate. Once the components of these chips
come close to the size of atoms, the cost of building a plant may be in the trillions of dollars (Ellis).
What's worse, nothing can become smaller than an atom, so further advances in computer speed will
not be possible. However, two upsides do exist: scientists estimate that this limit will not be reached
until around the year 2010, and they are already working on developing a nanocomputer (Markoff).

The earliest computers, built in the middle of the 20th century, used vacuum tubes for
switching. These machines were so massive and bulky, and demanded so much electricity to
operate, that they required buildings and power plants of their own. In addition, they consumed
far more energy while delivering far less processing power. So scientists needed to shrink computers to
make them more powerful: the smaller an electronic system can be made, the more
processing power can fit into a given physical volume, the less energy is required to run it, and
the faster it can work (because distances among components are reduced, minimizing charge-
carrier transit time).

But the technology of putting circuits on silicon, the basis of current computer chips, is reaching
the natural limits of the wafers that hold those circuits, turning up the pressure for a breakthrough.
The answer is nanocomputing. Nanocomputing deals with building computers whose fundamental
components measure only a few nanometers. Such computers, generally known as nanocomputers,
have circuitry so small that it can only be seen through a microscope; a nanocomputer is a computer
whose physical dimensions are microscopic.

Nanocomputing is part of the emerging field of nanotechnology, so before going any further
we should make clear what nanotechnology is.

1.2 NANOTECHNOLOGY

Nanotechnology is an exciting, emerging scientific and technological field that is making a
splash in 2002. Nanotechnology involves man's ability to create and manipulate molecular
structures to create potentially new materials, devices, machines or objects. A nanometre is a
billionth of a metre, that is, about 1/80,000 of the diameter of a human hair, or 10 times the
diameter of a hydrogen atom. If you blew up a baseball to the size of the earth, the atoms of
the baseball would be about the size of grapes. If you could take a ruler and measure 10 atoms
side by side, you would have a nanometer. The world is on the brink of a technological
revolution beyond any human experience. It has the potential to bring wealth, health and
education to every person on the planet. And it will be able to do this without consuming natural
resources or spewing pollution into the environment. Nanotechnology is about building things
atom by atom, molecule by molecule.

The goal of nanotechnology is to build tiny devices called "nanomachines". To build
things on such a small scale, you have to be able to manipulate atoms individually. The challenge
of nanotechnology is to place atoms precisely where you wish on a structure.

Does this all sound like science fiction? Actually, it's based upon scientific fact. Although
we have yet to build a nanomachine, the principles nanotechnology uses are the well-
established physical properties of atoms and molecules. Life itself would be impossible without
molecular machines. They are working in your body right now. For example, consider a protein
in the human body. You could think of it as a machine that moves molecules. One such protein is the
oxygen pump used in red blood cells. The heat of other molecules around it powers it. A channel opens
periodically to the center of the protein, allowing oxygen to pass in from the outside and bind with
iron for transport throughout the body.

Scientists can now construct natural proteins and even synthesize new ones with novel
properties never seen in nature. With enough understanding, we may be able to turn proteins
into microscopic tools to do the jobs we want.

Manufactured products are made from atoms. The properties of those products
depend on how those atoms are arranged. If we rearrange the atoms in coal we can make
diamond. If we rearrange the atoms in sand (and add a few other trace elements) we can make
computer chips. If we rearrange the atoms in dirt, water and air we can make potatoes.

Today's manufacturing methods are very crude at the molecular level. Casting, grinding,
milling and even lithography move atoms in great thundering statistical herds. It's like trying to
make things out of LEGO blocks with boxing gloves on your hands. Yes, you can push the
LEGO blocks into great heaps and pile them up, but you can't really snap them together the
way you'd like.

In the future, Nanotechnology will let us take off the boxing gloves. We'll be able to
snap together the fundamental building blocks of nature easily, inexpensively and in most of the
ways permitted by the laws of physics. This will be essential if we are to continue the revolution
in computer hardware beyond about the next decade, and will also let us fabricate an entire new
generation of products that are cleaner, stronger, lighter, and more precise.

It's worth pointing out that the word "Nanotechnology" has become very popular and is
used to describe many types of research where the characteristic dimensions are less than about
1,000 nanometers. For example, continued improvements in lithography have resulted in line
widths that are less than one micron: this work is often called "Nanotechnology." Sub-micron
lithography is clearly very valuable but it is equally clear that lithography will not let us build
semiconductor devices in which individual dopant atoms are located at specific lattice sites.
Many of the exponentially improving trends in computer hardware capability have remained
steady for the last 50 years. There is fairly widespread belief that these trends are likely to
continue for at least another several years, but then lithography starts to reach its fundamental
limits. If we are to continue these trends we will have to develop a new "post-lithographic"
manufacturing technology which will let us inexpensively build computer systems with mole
quantities of logic elements that are molecular in both size and precision and are interconnected
in complex and highly idiosyncratic patterns. Nanotechnology will let us do this.

An early promoter of the industrial applications of Nanotechnology, Albert Franks, defined it as
'that area of science and technology where dimensions and tolerances in the range of 0.1 nm to
100 nm play a critical role'. It encompasses precision engineering as well as electronics;
electromechanical systems (e.g. 'lab-on-a-chip' devices); and mainstream biomedical applications
in areas as diverse as gene therapy, drug delivery and novel drug discovery techniques.
Nanotechnology is all about manipulating and controlling things on a small scale.

Computers reproduce information at almost no cost. A push is well underway to invent
devices that manufacture at almost no cost, by treating atoms discretely, like computers treat
bits of information. This would allow automatic construction of consumer goods without
traditional labor, the way a Xerox machine produces unlimited copies without a human retyping the
original information.

Electronics is fueled by miniaturization. Working smaller has led to tools capable of
manipulating individual atoms, the way the proteins in a potato manipulate the atoms of soil, air and
water to make copies of themselves.

The shotgun marriage of chemistry and engineering called "Nanotechnology" is ushering
in the era of self-replicating machinery and self-assembling consumer goods made from cheap
raw atoms. Utilizing the well-understood chemical properties of atoms and molecules (how they
"stick" together), Nanotechnology proposes the construction of novel molecular devices
possessing extraordinary properties. The trick is to manipulate atoms individually and place
them exactly where needed to produce the desired structure. This ability is almost in our grasp.
The anticipated payoff for mastering this technology is far beyond any human accomplishment
so far...

Some applications of nanotechnology that one could scarcely imagine even in dreams:

- Self-assembling consumer goods
- Computers billions of times faster
- Extremely novel inventions (impossible today)
- Safe and affordable space travel
- Medical nanotechnology... a virtual end to illness, aging and death
- No more pollution, and automatic cleanup of existing pollution
- Molecular food synthesis... an end to famine and starvation
- Access to a superior education for every child on Earth
- Reintroduction of many extinct plants and animals
- Terraforming, here and elsewhere in the Solar System

In a world of information, digital technologies have made copying fast, cheap, and
perfect, quite independent of the cost or complexity of the content. What if the same were to
happen in the world of matter? The production cost of a ton of terabyte RAM chips would be
about the same as the production cost of steel. Design costs would matter, production costs
wouldn't.

By treating atoms as discrete, bit-like objects, molecular manufacturing will bring a
digital revolution to the production of material objects. Working at the resolution limit of matter,
it will enable the ultimate in miniaturization and performance. By starting with cheap, abundant
components (molecules) and processing them with small, high-frequency, high-productivity
machines, it will make products inexpensive. It should also let us design computers that each execute
more instructions per second than all of the semiconductor CPUs in the world combined.

Taking all of this into account, it is clear that Nanotechnology should improve our lives
in any area that would benefit from the development of better, faster, stronger, smaller, and
cheaper systems.

2. DRAWBACKS OF NANOCOMPUTERS

Proponents predict that nanotechnology will ignite the industrial revolution of the 21st
century, the effects of which will have a greater impact on the world's population than
antibiotics, integrated circuits, and human-made polymers combined. Now that scientists have
the tools and are developing the skills to manipulate nature's building blocks, nanotech
enthusiasts tout some mind-bending scenarios.

Yet some business-world skeptics wonder when nanotechnology's promise will
translate into practical applications, or whether it will at all. With a few exceptions, nanotechnology
still resides in university laboratories and corporate and government research facilities, where
scientists are devising the technology to build structures and substances that are smaller, lighter,
faster, stronger, and more efficient than conventional products.

"This is pioneering new technology. It takes time for these ideas and areas to
germinate," says Pat Dillon, a programs director at Minnesota Project Innovation Inc., a
Minneapolis organization that helps small Minnesota companies identify and capitalize on federal
research-funding opportunities. "In the meantime, people in the commercial world are looking at
what's happening at the university level wondering, 'How am I going to be able to build a
business around this technology? What's the opportunity? What's the product? Who's going to
buy it?'"

In other words, can nanotechnology live up to its fantastic hype? Willie Hendrickson heads a new
nanotechnology operation in Rushford. "This is a promising area, but we don't want to over-
promise, or promise in an unrealistic timeframe," says Rick Kiehl, an electrical engineering
professor at the University of Minnesota. "As with any scientific or technological development,
if people hear too much about it too early, a sense of disappointment inevitably builds up, and
we don't want to see that happen. The people who expect this field to produce something
tomorrow are only referring to a very small part of it. And in my opinion, this is more of a
medium- to long-range proposition."

"Unlike most technologies, which are quite specific and limited to one industry,
nanotechnology cuts across almost every imaginable industry," Uldrich says. "My job is to get
the state focused on the future. I'm not an advocate of putting money down on one horse and
hoping it takes off; that's like pinning the state's hopes on picking the next Microsoft. With
nanotechnology, we're betting on the racetrack. If we start investing in the broad science right
now, we'll be able to position ourselves formidably in a wide variety of industries: the material
sciences, biomedical, telecommunications, and on and on."

A nanocomputer is a computer whose physical dimensions are microscopic. Several types of
nanocomputers have been suggested or proposed by researchers and futurists.

3. TYPES OF NANOCOMPUTERS

Electronic nanocomputers would operate in a manner similar to the way present-day
microcomputers work. The main difference is one of physical scale. More and more transistors
are squeezed into silicon chips with each passing year; witness the evolution of integrated
circuits (ICs) capable of ever-increasing storage capacity and processing power. The ultimate
limit to the number of transistors per unit volume is imposed by the atomic structure of matter.
Most engineers agree that technology has not yet come close to pushing this limit. In the
electronic sense, the term Nanocomputer is relative. By 1970s standards, today's ordinary
microprocessors might be called Nanodevices.

Chemical and biochemical Nanocomputers would store and process information in
terms of chemical structures and interactions. Biochemical Nanocomputers already exist in
nature; they are manifest in all living things. But these systems are largely uncontrollable by
humans. We cannot, for example, program a tree to calculate the digits of pi or program an
antibody to fight a particular disease (although medical science has come close to this ideal in
the formulation of vaccines, antibiotics, and antiviral medications). The development of a true
chemical Nanocomputer will likely proceed along lines similar to genetic engineering. Engineers
must figure out how to get individual atoms and molecules to perform controllable calculations
and data storage tasks.

Mechanical Nanocomputers would use tiny moving components called Nanogears to
encode information. Such a machine is reminiscent of Charles Babbage's analytical engines of
the 19th century. For this reason, mechanical Nanocomputer technology has sparked
controversy; some researchers consider it unworkable. All the problems inherent in Babbage's
apparatus, according to the naysayers, are magnified a millionfold in a mechanical
Nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even
proposed the evolution of Nanorobots that could operate, or be controlled by, mechanical
Nanocomputers.

A Quantum Nanocomputer would work by storing data in the form of atomic
quantum states or spin. Technology of this kind is already under development in the form of
single-electron memory (SEM) and quantum dots. The energy state of an electron within an
atom, represented by the electron energy level or shell, can theoretically represent one, two,
four, eight, or even 16 bits of data. The main problem with this technology is instability.
Instantaneous electron energy states are difficult to predict and even more difficult to control. An
electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking
an atom can cause one of its electrons to jump to a higher energy state.

3.1 ELECTRONIC NANOCOMPUTERS

The power, flexibility, and ease of manufacture of conventional microelectronic two-
state devices have been and continue to be at the heart of the revolution in computer and
information technology that has swept the world during the past half century. Among the key
properties of these solid-state devices has been that they have lent themselves to the
miniaturization of electronic devices, especially computers. First, in the 1950s and 1960s,
solid state devices--transistors--replaced vacuum tubes and miniaturized all the devices (e.g.,
radios, televisions, and electronic computers) that originally had been invented and
manufactured using tube technology. Then, starting in the mid-1960s, successive generations of
smaller transistors began replacing larger ones. This permitted more transistors and more
computing power to be packed in the same small space. In fact, as noted by Gordon Moore, a
co-founder of the Intel Corporation, the number of transistors on a solid-state silicon integrated
circuit "chip" began doubling every 18 months. This trend, now known as Moore's Law, has
continued to the present day. Very soon, however, if computers are to continue to get smaller
and more powerful at the same rate, fundamentally new operational principles and fabrication
technologies such as Nanolithography will need to be employed for miniature electronic devices.
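
To make the quoted doubling rate concrete, here is a small, hedged Python sketch that simply projects transistor counts forward under a strict 18-month doubling assumption; the 1971 starting point (roughly 2,300 transistors, the Intel 4004) is used purely as an example, and real chips have not tracked this idealized curve exactly.

    # Project transistor counts under an assumed 18-month doubling period.
    # Starting point: the Intel 4004 of 1971, with roughly 2,300 transistors.
    start_year, start_count = 1971, 2_300
    doubling_period_years = 1.5
    for year in (1981, 1991, 2001):
        doublings = (year - start_year) / doubling_period_years
        print(year, f"~{start_count * 2 ** doublings:,.0f} transistors per chip")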

Nanolithography is used to create microscopic circuits. It is the art and science of
etching, writing, or printing at the microscopic level, where the dimensions of characters are on
the order of nanometers (units of 10^-9 meter, or millionths of a millimeter). This includes various
methods of modifying semiconductor chips at the atomic level for the purpose of fabricating
integrated circuits (ICs).

Instruments used in Nanolithography include the Scanning Probe Microscope (SPM)
and the Atomic Force Microscope (AFM). The SPM allows surface viewing in fine detail
without necessarily modifying it. Either the SPM or the AFM can be used to etch, write, or print
on a surface in single-atom dimensions.

3.2 MECHANICAL COMPUTERS

Nanotechnology pioneers Eric Drexler, Ralph Merkle and their collaborators
favor Nanocomputer designs that resemble miniature Babbage engines: mechanical
Nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-
scale gears spinning on shafts and bearings. This idea originated in a 1959 talk entitled "There's
Plenty of Room at the Bottom" presented by the late Nobel-Prize-winning physicist Richard
Feynman. Feynman pointed out that such tiny machinery was not prohibited by any known
physical principle.

Merkle envisions that these tiny machines and computers would be assembled
by the mechanical positioning of atoms or molecular building blocks one atom or molecule at a
time, a process known as "mechanosynthesis." Once assembled, the mechanical
Nanocomputer would operate a bit like a vastly scaled-down, complex, programmable version
of the mechanical calculators that were familiar office tools in the period 1940 through 1970,
preceding the introduction of widely available, inexpensive solid-state electronic calculators.
Strong arguments can be made in favor of such an approach. For one thing, quantum mechanics
assures that the molecular-scale moving parts should not be subject to the large frictional effects
that defeated earlier attempts to build complex macroscopic mechanical computers, such as
those designed by Charles Babbage in the 1830s and 1840s. However, there are near-term
drawbacks. One such drawback is that the fabrication of such nanomechanical devices is likely
to require "hand-made" parts assembled one atom or molecular subunit at a time using STMs in
processes that are relatively slow. While this might be done, it would be tedious work to move
even a few atoms into a specific position this way, and it would be increasingly more difficult to
manufacture reliably the many precision parts for the computer. It is possible, though, that this
problem might be alleviated, somewhat, by the perfection and evolution of recently developed
STM arrays that could build many nanoscale components in parallel. Stereospecific chemical
reactions and chemical self-assembly also might be applied to help realize a mechanical
nanocomputer.

3.3 CHEMICAL & BIO-CHEMICAL COMPUTERS

In general terms, a chemical computer is one that processes information by making
and breaking chemical bonds, and it stores logic states or information in the resulting chemical
(i.e., molecular) structures. A chemical nanocomputer would perform such operations
selectively among molecules taken just a few at a time in volumes only a few nanometers on a
side. Proponents of a variant of chemical nanocomputers, biochemically based computers, can
point to an "existence proof" for them in the commonplace activities of humans and other
animals with multicellular nervous systems.

Nonetheless, artificial fabrication or implementation of this category of "natural," or
biomimetic, biochemically based computers seems far off, because the mechanisms of animal
brains and nervous systems are still poorly understood. In the absence of this deeper
understanding, research on biochemically based computers has proceeded in alternative
directions. One alternative direction has been to adapt naturally occurring biochemicals for use
in computing processes that do not occur in nature. An important example of this is
the DNA computer: a nanocomputer that uses DNA (deoxyribonucleic acid) to store information and
perform complex calculations. In 1994, University of Southern California computer scientist
Leonard Adleman suggested that DNA could be used to solve complex mathematical problems.
Adleman is often called the inventor of DNA computers. His article in a 1994 issue of the
journal Science outlined how to use DNA to solve a well-known mathematical problem, called
the directed Hamiltonian path problem, a relative of the "traveling salesman" problem. The
goal of the problem is to find a route between a number of cities, going through each
city only once. As you add more cities to the problem, the problem becomes more difficult.
Adleman chose to find such a route between seven cities.

Fig.: Adleman's DNA-based computer

You could probably draw this problem out on paper and come to a solution faster than
Adleman did using his DNA test-tube computer. Here are the steps taken in the Adleman DNA
computer experiment:

- Strands of DNA represent the seven cities. In genes, genetic coding is represented by
  the letters A, T, C and G. Some sequence of these four letters represented each city and
  possible flight path.
- These molecules are then mixed in a test tube, with some of these DNA strands
  sticking together. A chain of these strands represents a possible answer.
- Within a few seconds, all of the possible combinations of DNA strands, which represent
  answers, are created in the test tube.

Adleman then eliminates the wrong molecules through chemical reactions, which leaves
behind only the flight paths that connect all seven cities.
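
To make the generate-and-filter idea above concrete, here is a small, purely classical Python sketch of the same logic: generate every candidate ordering of the cities "in parallel" (here, simply enumerated), then filter out the orderings that use a non-existent flight. The seven-city flight list is invented for illustration and is not Adleman's actual encoding; the wet-lab version performs the generation and filtering with DNA strands and chemical reactions rather than loops.

    # Toy classical analogue of Adleman's DNA experiment (illustrative graph only).
    from itertools import permutations

    cities = ["A", "B", "C", "D", "E", "F", "G"]
    flights = {("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F"),
               ("F", "G"), ("A", "C"), ("B", "D"), ("C", "E"), ("D", "F")}

    def is_valid_path(path):
        # Keep a candidate only if every consecutive pair is a real flight.
        return all((a, b) in flights for a, b in zip(path, path[1:]))

    # "Mix the strands": every ordering that starts at A and ends at G ...
    candidates = (("A",) + p + ("G",) for p in permutations(cities[1:-1]))
    # ... then "chemically filter" the ones built only from allowed flights.
    solutions = [path for path in candidates if is_valid_path(path)]
    print(len(solutions), "valid path(s), for example:", solutions[0])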

The success of the Adleman DNA computer proves that DNA can be used to calculate
complex mathematical problems. However, this early DNA computer is far from challenging
silicon-based computers in terms of speed. The Adleman DNA computer created a group of
possible answers very quickly, but it took days for Adleman to narrow down the possibilities.
Another drawback of his DNA computer is that it requires human assistance. The goal of the
DNA computing field is to create a device that can work independent of human involvement.

Three years after Adleman's experiment, researchers at the University of Rochester
developed logic gates made of DNA. Logic gates are a vital part of how your computer carries
out functions that you command it to do. These gates convert binary code moving through the
computer into a series of signals that the computer uses to perform operations. Currently, logic
gates interpret input signals from silicon transistors and convert those signals into an output signal
that allows the computer to perform complex functions.

The Rochester team's DNA logic gates are the first step toward creating a computer
that has a structure similar to that of an electronic PC. Instead of using electrical signals to
perform logical operations, these DNA logic gates rely on DNA code. They detect fragments of
genetic material as input, splice together these fragments and form a single output. For
instance, a genetic gate called the "And gate" links two DNA inputs by chemically binding
them so they're locked in an end-to-end structure, similar to the way two Legos might be
fastened by a third Lego between them. The researchers believe that these logic gates might be
combined with DNA microchips to create a breakthrough in DNA computing.
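
As a rough software caricature of the "And gate" behavior just described, the sketch below emits a single spliced output strand only when both input fragments have been detected; the sequences are invented placeholders, not the Rochester team's actual strands, and real gates of course operate chemically rather than in code.

    # Toy model of a DNA AND gate: output appears only if BOTH inputs are present.
    def dna_and_gate(detected_fragments, input_a="ACGT", input_b="TTAG"):
        if input_a in detected_fragments and input_b in detected_fragments:
            # "Lock" the two inputs end to end, like two Legos joined by a third.
            return input_a + input_b
        return None

    print(dna_and_gate({"ACGT", "TTAG"}))  # both detected -> "ACGTTTAG"
    print(dna_and_gate({"ACGT"}))          # one missing   -> None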

DNA computer components -- logic gates and biochips -- will take years to develop into
a practical, workable DNA computer. If such a computer is ever built, scientists say that it will
be more compact, accurate and efficient than conventional computers. In the next section, we'll
look at how DNA computers could surpass their silicon-based predecessors, and what tasks
these computers would perform.

The main benefit of using DNA computers to solve complex problems is that different
possible solutions are created all at once. This is known as parallel processing. Humans and
most electronic computers must attempt to solve the problem one process at a time (linear
processing). DNA itself provides the added benefits of being a cheap, energy-efficient resource.

To put this in perspective, more than 10 trillion DNA molecules can fit into a volume no larger
than 1 cubic centimeter. With this, a DNA computer could hold 10 terabytes of data and
perform 10 trillion calculations at a time.
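
A quick back-of-the-envelope check of those figures, under the simple assumption (made here only for illustration) that each molecule contributes roughly one byte of usable storage:

    # Rough arithmetic behind the density claim above (1 byte/molecule assumed).
    molecules_per_cm3 = 10 * 10**12          # "more than 10 trillion" molecules
    bytes_per_molecule = 1                   # illustrative assumption
    capacity_tb = molecules_per_cm3 * bytes_per_molecule / 10**12
    print(f"~{capacity_tb:.0f} TB of data in one cubic centimeter")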

3.4 QUANTUM NANOCOMPUTERS

3.4.1 BASIC TERMS

1) Quantum Computer:

A quantum computer is a machine, as-yet hypothetical, that performs calculations based
on the behavior of particles at the sub-atomic level. Such a computer will be, if it is ever
developed, capable of executing far more millions of instructions per
second (MIPS) than any previous computer.

2) Qubits:

Engineers have coined the term qubit (pronounced KYEW-bit) to denote the fundamental
data unit in a quantum computer. A qubit is essentially a bit (binary digit) that can take on
several, or many, values simultaneously.

3) Teleportation:

Teleportation involves dematerializing an object at one point, and sending the details of
that object's precise atomic configuration to another location, where it will be reconstructed.
What this means is that time and space could be eliminated from travel -- we could be
transported to any location instantly, without actually crossing a physical distance.

4) Quantum mechanics:

As the name implies, a quantum computer involves quantum mechanics. A quantum
computer uses subatomic particles, such as electrons and protons, to solve problems.
According to quantum mechanics, electrons can be in many different places and many different
states at the same instant in time (Carey A1). The possibility that an electron can be
anywhere, or be in different states, is what is supposed to make quantum computers extremely fast
(Carey A1). This, along with other laws of quantum mechanics, also presents the greatest challenges for
building quantum computers.

3.4.2 INTRODUCTION

Recently, there has been serious interest in the possibility of fabricating and applying
nanoscale quantum computers. In a quantum computer, it is proposed that massively parallel
computations can be performed through the "natural" mechanism of interference among the
quantum waves associated with the nanoscale components of a multicomponent, coherent
quantum state. Proposed quantum computers would represent each bit of information as a
quantum state of some component of the computer, e.g., the spin orientation of an atom.
According to quantum mechanics, the state of each nanoscale component of a system can be
represented by a wave. These quantum matter waves are analogous to light waves, except that
their wavelengths tend to be much shorter, in inverse proportion to the momentum of the
quantized component. Thus, the quantum waves can be manipulated in the space of only a few
nanometers, unlike most light of moderate, nondestructive energy, which has wavelengths of
several hundred nanometers. By carefully setting up the states for the components of the
quantum system, a desired computation is performed through the wave interference among the
quantized components. All discrete computational paths would be considered at once, at the
speed of light, through the wave interference patterns--fast, intrinsic parallel computation. Given
the correct initial preparation of the entire multicomponent computational system, constructive
interference among the components' waves would emphasize those wave patterns which
correspond to correct solutions to the problem, and destructive interference would weed out the
incorrect solutions.

The idea for a quantum computer is based upon the work of Paul Benioff in the early
1980s and that of David Deutsch and Richard Feynman, in the mid-1980s. Although quantum
computers originally were proposed as a theoretical construct for considering the limits of
computation, some researchers have suggested that fundamentally hard and economically
important problems could be solved much more rapidly using quantum interference and
parallelism. In particular, Peter Shor has proven that a quantum computer could factor large
numbers very rapidly and thereby, perhaps, provide cryptographers with a powerful new tool
with which to crack difficult codes. Some proposals have been made for implementing such a
computer. Seth Lloyd, in particular, has attracted much attention recently with a mechanism he
has proposed for the practical implementation of quantum computers. There have been some
quantitative arguments, though, that cast doubts upon the specifics and the ultimate utility of
Lloyd's proposals. More general reservations about proposed quantum computers include the
fact that they would have to be assembled and initialized with great and unprecedented
precision. Quantum computers would be very sensitive to extremely small physical distortions
and stray photons, which would cause the loss of the phase coherence in the multicomponent
quantum state. Thus, they would have to be carefully isolated from all external effects and
operated at temperatures very close to absolute zero. Even then, some errors are likely to be an
intrinsic problem, though they do not rule out the eventual application of quantum computers to
solve certain classes of difficult problems.

3.4.3 QUANTUM BASICS

For a non-scientist, understanding how quantum computing works is darn near
impossible. But the basics are worth taking a stab at.

Atoms have a natural spin or orientation, in the way a needle on a compass has an
orientation. The spin can either be up or down. That coincides nicely with digital technology,
which represents everything by a series of 1s and 0s. With an atom, a spin pointing up can be 1;
down can be 0. Flipping the spin up or down could be like flipping the switch on and off (or
between 1 and 0) on a tiny transistor.

So far so good. But here's one of the weird parts, which also is the source of quantum
computing's great power. An atom, which is not visible to the naked eye, can be both up and
down at the same time until you measure it. The act of measuring it -- whether using instruments
or a powerful microscope -- forces it to choose between up or down.

Don't ask why it works that way. It's part of quantum mechanics, which is a set of laws --
like Einstein's theory of relativity -- that govern the universe. In this case, the laws govern the
tiniest objects in the universe. Quantum mechanics is entirely unlike anything in the world of
ordinary experiences. The laws are so bizarre they seem made up. Yet they've been proven
time and again.

Since an atom can be up and down at once -- called putting it into a superposition -- it's not
just equal to one bit, as in a traditional computer. It's something else. Scientists call it a qubit. If
you put a bunch of qubits together, they don't do calculations linearly like today's computers.
They are, in a sense, doing all possible calculations at the same time -- in a way, straddling all
the possible answers. The act of measuring the qubits stops the calculating process and forces
them to settle on an answer.
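
The "up and down at once" picture can be sketched numerically. The plain-Python snippet below holds a single qubit in an equal superposition and then measures it repeatedly; each measurement forces a definite 0 or 1, with probabilities given by the squared amplitudes. This is only a classical simulation of the bookkeeping, not a quantum computation, and the amplitudes are chosen purely for illustration.

    # Simulating repeated measurement of one qubit in an equal superposition.
    import random

    amp_up = amp_down = 2 ** -0.5        # equal superposition of "up" and "down"
    p_up = abs(amp_up) ** 2              # Born rule: probability of reading "up"

    def measure():
        # Measurement collapses the superposition to a definite classical value.
        return 1 if random.random() < p_up else 0

    counts = {0: 0, 1: 0}
    for _ in range(1000):
        counts[measure()] += 1
    print(counts)                        # roughly 500 of each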

Forty qubits could have the power of one of today's supercomputers. A supercomputer
trying to find one phone number in a database of all the world's phone books would take a
month, Chuang says. A quantum computer could do it in 27 minutes.
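
The month-versus-minutes comparison reflects the square-root speedup of quantum search: classical search examines on the order of N entries, while quantum search needs on the order of the square root of N queries. The sketch below shows the shape of that advantage with an assumed database size and query rate; the exact figures quoted above depend on machine details not given here.

    # Square-root speedup of quantum-style search, with assumed numbers.
    import math

    entries = 4_000_000_000              # assumed size of the phone-book database
    checks_per_second = 1_500            # assumed rate of checking entries

    classical_seconds = entries / checks_per_second
    quantum_seconds = math.sqrt(entries) / checks_per_second
    print(f"classical search: ~{classical_seconds / 86_400:.0f} days")
    print(f"quantum search:   ~{quantum_seconds / 60:.1f} minutes")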

3.4.4 QUANTUM NANOCOMPUTER VERSUS CLASSICAL COMPUTER

The only similarity a quantum computer should have to an ordinary computer is
usefulness. A quantum computer will not be a machine in a box; instead it may look like some
big magnets surrounded by other stuff. A quantum computer may differ from a modern
computer in other ways also. For example, a quantum computer may not have the permanent
data storage a modern computer has with a hard drive. However, a quantum computer will
certainly need a device similar to a monitor in order to be of any use to an average person. The
composition of a quantum computer helps give it many advantages.

Scientists are trying to develop a quantum computer due to its potential. A quantum
computer is supposed to be able to solve a problem all at once instead of in steps (Markoff). A
modern computer takes a problem and quickly solves a single step then moves on to the next
one. If there are trillions of things to search through, like every word on the Internet, this can be
extremely slow (Markoff). However, a quantum computer would be exponentially faster than a
modern computer.

- A quantum computer shows no resemblance to a modern computer.
- Perhaps an even more useful task for the quantum computer involves factoring
  numbers.
- Modern computers have been used a lot for factoring large numbers.

"The largest number ever known to have been factorized had 129 digits. It took a
network of supercomputers working in parallel eight months to find the answer! To factorize a
1,000-digit number would take our most powerful conventional supercomputers more than the
estimated 100 billion years the universe has left to run" (Ellis). The difficulty of factoring large numbers
is the basis of modern cryptography (Ellis), which is used to secure things on the Internet such as
financial transactions and email (Ellis). However, "a quantum computer would break the most
sophisticated code in no time flat" (Ellis). This basically means someone could intercept
someone else's email messages. A quantum computer's potential goes beyond cryptography.

A quantum computer may prove useful in mathematics and physics. A
quantum computer would be fast enough for physicists to do computer
simulations of nuclear explosions and other physical processes (Powell).
A quantum computer could also enable mathematicians to solve seemingly impossible
problems. While these two things might not sound exciting to people outside the
scientific community, a quantum computer will probably be most useful for
things that currently seem impossible.

One thing that scientists have deemed possible is teleportation (Hall).
However, teleportation has many problems and challenges in front of it, just like
the quantum computer. Because teleportation uses quantum physics, the
development of the quantum computer may help scientists learn more to solve
problems related to teleportation. Fortunately, researchers have made some
progress developing a quantum computer.

Two physicists, Neil Gershenfeld and Isaac Chuang, have built a very basic
quantum computer (Winters 94). They were able to solve two simple problems.
They used liquid alanine to solve the problem one plus one (Winters 94). They
were able to solve another problem in liquid chloroform (Markoff). This
problem was to select a correct telephone number given four different numbers
(Winters 94). The physicists hope to be able to make a more complex quantum
computer that is able to factor 15 into 5 and 3 (Winters 95). While all of these
tasks are simple for a modern computer, the process to solve the problem was
done differently than it would have been done on a modern computer. All
possible answers were checked at the same time, compared to checking each
answer until the correct one was found (Markoff).

3.4.5 QUANTUM NANO-ENCRYPTION

Quantum encryption pioneers promise to put the world's first uncrackably secure
networks online by early 2003. Based on the quantum properties of photons, quantum
encryption guarantees absolutely secure optical communications.

Three independent experiments recently have demonstrated such systems. Geneva-
based id Quantique SA encoded a secure transmission on a 70-kilometer fiber-optic link in
Europe; MagiQ Technologies Inc., here, used a 30-km link; and researchers at Northwestern
University (Evanston, Ill.) demonstrated a 250-Mbit/second quantum encrypted transmission
over a short link.

"Our quantum random-number generator and our single-photon detector module are
available now and are in use by several customers around the world," said Gregoire Ribordy, a
manager at id Quantique. A beta version of a third product, a quantum-key distribution system,
"has been fully tested, and we are in advanced discussions with several potential launch
customers," he added.

3.4.6 SECURING THE INTERNET

For its part, MagiQ says that its Navajo system is currently at the alpha stage and
promises real beta sites on selected campuses in the United States in the first quarter. Both
companies are also talking about secure through-the-air communications with satellites.

Northwestern, meanwhile, vows to have a 2.5-Gbit/s quantum-encryption technology
capable of securing the Internet backbone in five years. It says that commercial partners are
working with the technology.

There is strong interest in quantum encryption because of its ability to completely eliminate
the possibility of eavesdropping. Today's encryption/decryption methods are only as good as the
length of the key - a 56- to 256-bit value used to scramble the data to be transmitted with a
one-way function - that's used to encrypt a message. A common way to create such a one-way
function is to multiply two large prime numbers, a simple operation for a computer to perform.
However, going backward - that is, taking a large number and finding its prime factors - is very
difficult for computers to execute.
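
A tiny sketch of that asymmetry: multiplying two primes takes a single step, while recovering them from the product by naive trial division means searching up to the square root of the number. The primes below are tiny, illustrative stand-ins for the hundreds-of-digits primes used in real keys.

    # Easy forward direction, slow backward direction (naive trial division).
    p, q = 1_000_003, 1_000_033          # small "secret" primes, for the demo only
    n = p * q                            # multiplying is instant

    def factor_by_trial_division(n):
        d = 2
        while d * d <= n:
            if n % d == 0:
                return d, n // d
            d += 1
        return None

    print("recovered factors:", factor_by_trial_division(n))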

Other methods use some hard mathematical problem to create one-way functions, but any
scheme of that kind is vulnerable both to advances in computational power and new
breakthroughs in mathematics.

3.4.7 APPLICATIONS OF QUANTUM NANOCOMPUTING

Quantum computers might prove especially useful in the following applications :

- Breaking ciphers
- Statistical analysis
- Factoring large numbers
- Solving problems in theoretical physics
- Solving optimization problems in many variables

3.4.8 DISADVANTAGES OF QUANTUM NANOCOMPUTING

The technology needed to build a quantum computer is currently beyond our reach. This
is due to the fact that the coherent state, fundamental to a quantum computer's operation, is
destroyed as soon as it is measurably affected by its environment. Attempts at combating this
problem have had little success, but the hunt for a practical solution continues.

4. FUTURE SCOPE OF NANOTECHNOLOGY


Here's a date for your diary: November 1st, 2011. According to a group of
researchers calling themselves the Nanocomputer Dream Team, that's the day they'll unveil a
revolutionary kind of computer, the most powerful ever seen. Their nanocomputer will be made
out of atoms.

First suggested by Richard Feynman in 1959, the idea of nanotechnology, constructing things
at the atomic level, is now a major research topic worldwide. Theoreticians have already come
up with designs for simple mechanical structures like bearings, hinges, gears and pumps, each
made from small collections of atoms. These currently exist only as computer simulations, and
the race is on to fabricate the designs and prove that they can work.

Moving individual atoms around at will sounds like fantasy, but it's already been
demonstrated in the lab. In 1989, scientists at IBM used a scanning tunneling microscope to shuffle 35
xenon atoms into the shape of their company's logo. Since then a team at IBM's Zurich labs has
achieved the incredible feat of creating a working abacus on the atomic scale.

Each bead is a single molecule of buckminsterfullerene (a buckyball), comprising 60
atoms of carbon linked into a football shape. The beads slide up and down a copper plate,
nudged by the tip of a scanning probe microscope.

The Nanocomputer Dream Team wants to use these techniques to build an atomic
computer. Such a computer, they say, can then be used to control simple molecular construction
machines, which can then build more complex molecular devices, ultimately giving complete
control of the molecular world.

The driving force behind the Dream Team is Bill Spence, publisher of Nanotechnology
magazine. Spence is convinced that the technology can be made to work, and has enlisted the
help of over 300 enthusiasts with diverse backgrounds - engineers, physicists, chemists,
programmers and artificial intelligence researchers. The whole team has never met, and
probably never will. They communicate by email and pool their ideas on the Web. There's only
one problem. Nobody is quite sure how to build a digital nanocomputer.

The most promising idea is rod logic, invented by nanotechnology pioneer Eric Drexler,
now chairman of the leading nano think tank The Foresight Institute. Rod logic uses stiff rods
made from short chains of carbon atoms. Around each rod sits a knob made of a ring of atoms.
The rods are fitted into an interlocking lattice, where each rod can slide between two positions,
and be reset by a spring made of another few atoms. Drexler has shown how to use such an
arrangement to achieve the effect of a conventional electronic transistor, where the flow of
current in one wire is switched on and off by current in a different wire. Once you have
transistors, you can build a NAND gate. From NAND gates you can construct every other
logic element a computer needs.
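
The universality claim in that last step is easy to check in software. The short sketch below builds NOT, AND, OR and XOR out of nothing but a NAND function and verifies their truth tables; plain Python stands in here for the mechanical (or electronic) gates described above.

    # Every basic gate derived from NAND alone, then checked exhaustively.
    def nand(a, b):
        return not (a and b)

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))
    def xor_(a, b): return and_(or_(a, b), nand(a, b))

    for a in (False, True):
        for b in (False, True):
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)
            assert xor_(a, b) == (a != b)
    print("AND, OR and XOR built purely from NAND all behave correctly.")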

Apart from the immensely difficult problem of physically piecing together these
molecular structures, massive calculations are required to determine if particular molecular
configurations are even possible. The Dream Team will perform these molecular simulation
calculations using metacomputing, where each person's PC performs a tiny part of the overall
calculation, and the results are collated on the Web. There are already prototype tools for
experimenting with molecular configurations, such as NanoCAD, a freeware nano design system
including Java source code.
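
A bare-bones illustration of the metacomputing idea, sketched in Python (the tools mentioned above happen to ship Java source, but the pattern is the same): a large sweep of candidate molecular configurations is split into chunks, each chunk is handled by a separate worker process standing in for a volunteer's PC, and the partial results are merged. The "energy" function is a made-up placeholder, not a real molecular force field.

    # Splitting a big simulation sweep across workers and merging the results.
    from concurrent.futures import ProcessPoolExecutor

    def energy(configuration_id):
        # Placeholder for an expensive molecular-simulation step.
        return (configuration_id % 97) * 0.015

    def worker(chunk):
        # Each "volunteer PC" evaluates its own slice of configurations.
        return [(c, energy(c)) for c in chunk]

    if __name__ == "__main__":
        configurations = list(range(10_000))
        chunks = [configurations[i::8] for i in range(8)]     # 8 volunteers
        with ProcessPoolExecutor(max_workers=8) as pool:
            results = [pair for part in pool.map(worker, chunks) for pair in part]
        best = min(results, key=lambda pair: pair[1])
        print("lowest-energy configuration found:", best)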

This may all sound like pie in the sky, but there's serious research and development
money being spent on nanotechnology. A recent survey counted over 200 companies and
university research groups working in the field. And April 1997 saw the foundation of Zyvex,
the world's first nanotechnology manufacturing company. Zyvex's goal is to build an assembler,
the basic element required for nanotechnology. The assembler will itself be a machine made
from molecules, fitted with atom sized tools for manipulating other molecules to build other
machines. It will also be capable of replicating itself from the materials around it.

While they may lack any actual working prototypes of their technology,
nanotechnologists are certainly not short of ideas. Once you have the ability to make molecular
machines, the possibilities are amazing and often bizarre. One idea is utility fog, where billions of
submicroscopic molecular robots each containing a nanocomputer are linked together to form a
solid mass. Controlled by a master nanocomputer, the robots could alter their configurations to
create any object you like.

Nanotechnology does come with one tiny drawback, however. What happens if a
molecular machine goes haywire, and instead of building, starts demolishing the molecules
around it? The world would quite literally fall apart.

Nanocomputers, if they ever appear, will be extraordinary things. But if, like most
computer systems, they have bugs, they could also be very nasty indeed.

5. APPLICATIONS OF NANOCOMPUTERS
5.1 NANOSPACE
Space science has long played a role in the research and development of advancing
technologies. Spacecraft are being launched with hulls composed of carbon fibers, a
lightweight, high-strength material. Combine that with smaller on-board computers that perform
hundreds of times faster than computers used on spacecraft just a decade ago, and one can see
why there have been incredible advances in space exploration in just the past few years. The
advancements in material science and computer science have allowed the building, launching
and deploying of space exploration systems that continually do more and more as they become
smaller and lighter.

Some of the latest avenues being explored in space science that fall more into the nano realm
include smart materials for the hulls of spacecraft. These would be materials primarily
composed of nanotube fibers with nano-sized computers integrated into them. Along with being
even lighter, these materials will also be far stronger. One idea is to create a surface that
will help transfer the aerodynamic forces working on a spacecraft during launch. When the craft
is launched, the nanocomputers will flex the craft's hull to offset pressure differences in the hull
caused by the craft's acceleration through the atmosphere. Then the same nanocomputer
network in the hull would go to work heating the shaded side of the craft, cooling the sun-
exposed side, and even creating heat shielding for reentry. To equalize the surface temperature
now, a spacecraft must be kept rotating, and although a slight spin is good for maintaining the
attitude of a craft, it sometimes interferes with the mission plan, such as when a spacecraft is taking
photographs or is in the process of docking with another craft; likewise, spacecraft today have to be
oriented just right upon reentry. With a smart-material hull, ablative material could be gathered in
real time, obviating any crucial departures in mission landing plans.

Another avenue being investigated is a concept of nanorobotics called
"swarms". Swarms are nanorobots that act in unison, like bees. Theoretically, they will act as
a flexible, cloth-like material; being composed of what are called buckytubes, this cloth will be
as strong as diamond. Add nanocomputers to this cloth of nanomachines and you have smart
cloth. This smart cloth could be used to keep astronauts from bouncing around inside their
spacecraft while they sleep, a problem that arises when the auto pilot computer fires the course
correction rockets. The cloth like material will be able to offset the sudden movements and
slowly move the sleeping astronaut back into position. Still another application for the nano
robot swarms, being considered, is that the smart cloth could be used in the astronauts space
suits. This material will not only be capable of repairing itself quickly or controlling the
environment inside the suit, but it will also be able to communicate to its wearer what it is doing
and what's going on outside the suit. On the planet Mars for example a suit made of smart cloth
could extract oxygen from the carbon dioxide in the atmosphere for the wearer. The same suit
could extract solar energy to power the suit.

This suit would also literally be a lifesaver on Earth. Imagine a firefighter wearing a suit
that could extract oxygen from the environment he is in. "Foam swarms," not even worn as a suit
but sprayed from a container about the size of an average hand-held fire extinguisher, could be
used to extract and store dangerous toxins and flammables. The smart foam, under the control of a
firefighter, would act as a portable environment that would engulf any victims found, protecting
them from heat and toxic gas and supplying them with oxygen. The smart foam would be able to
shape some of itself into a suit for the victim and begin to monitor the victim's vitals, and it could
even report the victim's condition, including broken bones and so on, to an on-site medic or doctor,
or to an off-site one by wireless satellite communication. Upon sensing a broken bone, the smart suit
could even begin to reinforce the limb and create a cast on the spot, a cast that would be able to act
on the damaged bone so the victim could walk out on a broken leg. The smart foam would also
be able to use different strategies to dissipate heat; for example, it could shape itself into a
radiator so as to dump heat away from the firefighters and victims.

A space suit is nothing more nor less than an incredible spaceship itself, so this same smart
cloth could be the superstructure of a deep-space probe, replete with an on-board A.I. computer
capable of creating the science experiments needed en route to its destination and capable not
only of making changes in mission plans but of creating new experiments as they are needed or
wanted. The same super-explorer could even create its own solar-energy-gathering panels if
appropriate, or use RTG technology with plutonium, and it would also be able to repair itself. And
while all of the above is going on, the craft could even expand its own computing capabilities if
need be.

Another application of nanorobots would be in carrying out construction projects in hostile
environments. For example, with just a handful of self-replicating robots utilizing local materials
and local energy, it is conceivable that space habitats could be completely constructed by remote
control, so that the inhabitants need only show up with their suitcases. Colonization of space
begins to make economic sense then, since it would take only one Saturn-type rocket to create a
complete space colony on Mars, for example. An engineer or a team of engineers could check
up on the construction of the habitat via telepresence, utilizing cameras and sensors created on
the surface of Mars by the nanobots, all from the comfortable confines of Earth. Then, once the
habitat is complete, humans can show up to orchestrate the expansion of the exploration. Venus
could be explored with nanorobots too. Super hulls could be fashioned by nanorobots to
withstand the intense pressures and corrosive gases of the Venusian atmosphere, to safely house
nanorobot-built sensors and equipment. The potential in all of this is getting a lot more space
exploration accomplished with less investment of resources and a lot less danger to human
explorers.

5.2 CANCEL CANCER

1) Provide a brief history of the science that led to the development of the
technical application the team has selected. Explain the problem your
team has selected.

Nanotechnology was first proposed on December 29th 1959 by Richard Feynman in his
speech, "There's Plenty of Room at the Bottom." He gave this speech at the annual meeting of
the American Physical Society at the California Institute of Technology. In his speech he stated
"I am not afraid to consider the final question as to whether, ultimately - in the great
future - we can arrange the atoms the way we want; the very atoms, all the way down!"
Feynman stated that the laws of physics do not prevent us from manipulating individual atoms or
molecules. The only limitation they had was no appropriate method to do so. He also predicted
that at some point in time, the methods for atomic manipulation would be available. Feynman
declared that we should use the bottom-up approach instead of the top-down approach. The
top-down approach would modify the shape and size of materials to meet specific needs for
assembly of these materials. The bottom-up approach would produce components made of
single molecules held together by covalent bonds, which are much stronger than the forces that
hold together macro-components. By using the bottom-up approach, the amount of information
that can be stored in devices would be incredible.

However, it wasn't until 1974 that Norio Taniguchi created the term "Nanotechnology"
at the University of Tokyo. Taniguchi distinguished engineering at the micrometer scale from a
new, sub-micrometer level, which he called "nano-technology."

In 1981, Gerd Binnig and Heinrich Rohrer invented the Scanning Tunneling
Microscope (STM) at IBM's Zurich Research Laboratory, and the Atomic Force
Microscope followed in 1986. With these two inventions scientists could not only
photograph individual atoms, but could actually move atoms around. In addition, Don Eigler and
colleagues at IBM's Almaden labs were able to spell "IBM" with 35 xenon atoms on a nickel
surface, using a scanning tunneling microscope to control the atoms.

The investigation of nanotechnology expanded in 1985 when Professor Richard
Smalley, Robert Curl, Jr., and Harold Kroto discovered a new form of carbon, which was a
molecule of sixty carbon atoms. This supercarbon has become one of a growing number of
building blocks for a new class of nano-sized products. Then, eleven years later, they won the
Nobel Prize.

In 1986, Eric Drexler wrote Engines of Creation. It dealt with the concept of
molecular manufacturing and included a full-length examination of the possibilities of
nanotechnology. He proposed the idea of universal assemblers, robotic-type machines, which
form the basis of molecular manufacturing. These would allow us to build anything atom by
atom, molecule by molecule, using an "assembler" controlled by a computer to move
the atoms where you desire. As he began to explain this theory, he stated, "In biology,
enzymes stick molecules together in new patterns all the time. In every cell, there are
programmable, numerically controlled machine tools that manufacture protein
molecules according to genetic instructions. If you can build one molecular machine,
then you can use it to build an even better molecular machine.”

In the spring of 1988, Eric Drexler taught the first formal course in nanotechnology while
visiting Stanford University. He suggested the possibility of nano-sized objects that are self-
duplicating robots or nanomachines that would roam around in the body killing cancer cells.

Drexler received a doctorate degree in the field of nanotechnology from MIT in 1991.
In 1992, he published another book, called Nanosystems, to provide a graduate-level
introduction to the physical and engineering foundations of the field.

The publication of an issue of Scientific American that focused on nanotechnology in
September 2001 is considered a milestone. The June 15 and July 1 issues of Red Herring were
also considered milestones.

Many scientific advances led to the concept of nanotechnology, and many contributors helped shape it, from Feynman giving his speech to Drexler teaching a course. Nanotechnology will see many more advances in the future and will benefit our society.

2) Identify two scientists or engineers that have made major contributions to this technical application.
Shuming Nie has made a major contribution to the field where medicine and nanotechnology meet. Nie's work has shown that nanoparticles can be used to detect cancer at its early stages. Applying his latest technology to molecules that are linked with cancer may be the key to improved cancer detection, which in turn could save thousands of lives.

Nie's latest technology involves quantum dots. His work, in a way, color-codes biological molecules such as genes and proteins, allowing doctors to see and identify the exact location of selected molecules in the cells and tissues of a living person. The technique works in several steps. The fluorescent quantum dots, which conduct very little electricity, are embedded inside micron-sized polystyrene beads; different colors are produced by varying the sizes and quantities of the dots embedded in each bead. Nie then attaches these beads to biological molecules such as antibodies or proteins and applies them to cells and tissue samples in the laboratory. The antibodies attached to the beads bind to specific target molecules, making it easier to identify where those molecules are located and how many are present. When targeted at cancer cells, this technique shows whether or not cancer cells are present.

The second major contributor is Eric Drexler, although he cannot be given all the credit, for he built on the ideas of Richard Feynman; Drexler was a one-time student of Feynman. Drexler is considered the founding father of nanotechnology as we know it today because of his many ideas concerning molecular engineering and manufacturing. These ideas are set out in his book Engines of Creation, in which Drexler laid down the possible foundations of nanotechnology.

Linking everyday objects to atoms and molecules may help us understand Drexler's concepts better. One can visualize an atom as a physical object, such as a marble. A molecule, which is more complex, is a clump of atoms joined together; a fairly complex molecule can be pictured as a group of marbles stuck together into something the size of a fist. Depending on the chemical properties of the various atoms, their bonds will "snap" and "unsnap." A molecule's shape is built up much as children build toys with an Erector Set, and its shapes and functions resemble familiar things such as levers, gears, motors, and pulleys.

Drexler proposed that a submicroscopic device could be assembled from atoms. This device would have a robotic arm, controlled by a computer, able to move atoms around and position them exactly where desired. This type of robotic arm is called an "assembler." Assemblers are similar to enzymes in biology, which stick molecules together in new patterns all the time. If one such molecular device could be built, it could then be used to build even better ones.

These molecular machines would be made with extreme precision, down to the most minuscule details. Because their parts are so much smaller than the everyday objects we are used to, they would be roughly a million times faster than the moving parts we are familiar with. Yet even though a single nanomachine would be incredibly fast and precise, it could not, by itself, change anything on the scale of a human body, simply because it is so small. Many of these little nanomachines would be needed at the same time to do something useful for a human being, and the way to get so many machines is to program them to reproduce, or replicate, themselves.


A single molecular machine could replicate itself quickly using simple fuel and raw-material molecules available to it. The new molecular machine would then be able to replicate itself too. With each new nanomachine replicating itself, there would soon be a whole army of nanomachines able to carry out complex procedures, including making almost anything imaginable, as long as they have enough fuel and raw materials and the energy cost is kept low.
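To make the replication argument concrete, the sketch below is an illustrative back-of-the-envelope calculation (not from the original text): it assumes, purely for illustration, that every machine copies itself once per hour with unlimited fuel and raw material, and counts how many doublings are needed to reach useful numbers of machines.

# Illustrative sketch: exponential growth of self-replicating assemblers.
# Assumptions (hypothetical): each machine produces one copy per hour and
# raw materials/energy are never the limiting factor.

def generations_needed(target_count: int) -> int:
    """Return how many doubling steps are needed to reach target_count machines."""
    count, generations = 1, 0
    while count < target_count:
        count *= 2          # every existing machine builds one copy of itself
        generations += 1
    return generations

if __name__ == "__main__":
    for target in (1_000, 1_000_000, 1_000_000_000_000):
        g = generations_needed(target)
        print(f"{target:>16,} machines after {g:>3} doublings (~{g} hours)")

Even a trillion machines requires only about forty doublings under these assumptions, which is why replication, rather than one-at-a-time manufacture, is central to Drexler's argument.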

Drexler also proposed an idea for building nanocomputers. His nanocomputers would not work electronically; instead, they would have many tiny mechanical parts in motion. Such computers could nevertheless be considerably faster than the computers of today, because the information inside them only has to travel across tiny distances. Drexler estimated that these nanocomputers would be able to execute on the order of a billion instructions per second. As he put it, "Eventually the whole integrated circuit technology base is going to be replaced."
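As a rough illustration of why shrinking a computer makes it faster even when the parts move mechanically, the sketch below uses assumed, illustrative figures (a ~10 nm logic element and a mechanical signal speed of ~10 km/s, neither taken from Drexler) to estimate a signal transit time and a crude upper bound on operation rate.

# Rough, illustrative estimate of why tiny mechanical logic can still be fast.
# Assumed values (not from Drexler): a ~10 nm logic element and a mechanical
# signal speed of ~10 km/s (order of the speed of sound in a stiff solid).

element_size = 10e-9        # m, assumed size of one mechanical logic element
signal_speed = 1.0e4        # m/s, assumed mechanical signal propagation speed

switching_time = element_size / signal_speed        # time for a signal to cross the element
max_rate = 1.0 / switching_time                     # crude upper bound on operations per second

print(f"signal transit time : {switching_time:.1e} s")
print(f"crude rate bound    : {max_rate:.1e} operations/s")   # ~1e12, well above 1e9 instructions/s

With these assumptions the transit time is about a picosecond, so a billion instructions per second is a conservative figure.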

In addition, Drexler foresaw the great things nanotechnology could accomplish in the field of medicine. The highly capable nanocomputers mentioned above could play an important role in many areas of medicine: software incorporating artificial intelligence would allow surgical procedures to be carried out with molecular-level precision. Most interesting of all, Drexler himself said that the simplest medical use of nanotechnology would be the selective destruction of particular cells or other targets in the body, which is very close to the project our team has selected: using nanotechnology to destroy only cancerous cells and not healthy ones.

3) Describe in detail how the technical application the team selected impacts the problem outlined in Component One.
The use of Cancel Cancer will dramatically decrease the cancer death rate all over the world. It will save many lives and raise the hopes of patients' families and friends. Cancel Cancer is intended to replace chemotherapy and other means of treating cancer. Compared with chemotherapy, it will not leave the patient with a weakened immune system or extreme exhaustion, and it will kill only cancerous cells without harming healthy ones. By simply drinking a solution containing nanodevices, the patient avoids the familiar side effects of chemotherapy.

4) Identify the limitations and benefits of using the technical application the team has selected to solve the problem.
The nanodevice that we plan to build will bring benefits not just to medicine but to millions of people. First of all, the very prospect of early cancer detection and better treatment can lift the spirits of many cancer patients and those close to them. Beyond raising hopes, the device will greatly improve cancer treatment itself.

Since our device will allow doctors to single out only cancerous cells, it will not destroy healthy ones. This would be a big advance in cancer treatment, because one of the most common treatments, chemotherapy, kills cancerous cells but destroys many healthy cells as well. This leaves patients tired, extremely weak, and with a weakened immune system at a time when their immune system is already compromised by the cancer. With the precise destruction of only cancerous cells, these effects could be prevented or at least greatly reduced. In chemotherapy, some patients also suffer all the side effects with no benefit: they are left exhausted while their cancer does not improve. Our project would do away with these troubles, and cancer would move from being an often incurable disease toward becoming a disease of the past.

Chemotherapy also requires the patient to visit a clinic every day, which is very expensive. Our project removes these hassles: once you have been treated, you are cured, and there is no need to keep going back and forth.

Although there are many benefits to curing cancer with nanotechnology, there are also limitations. More than 1,700 people die from cancer every day in the United States and Canada. With our new invention, many more people will survive cancer, causing the population to increase in a world that is already overcrowded; there may not be enough food, water, or other resources to go around.

In addition, thousands of chemotherapists currently help treat cancer. Many of them would likely lose their jobs once our invention appears, and many companies could go out of business because our invention offers an improved and better way to treat cancer.


There is also the possibility that our machine will malfunction and even harm the patient. This would create an enormous dilemma, because it places the patient's life in grave danger; if the patient is injured or dies, the tragedy will certainly affect the patient's family and friends.

5.3 BIO-NANOTECHNOLOGY

Implications for Designing More Effective Tissue Engineering Materials

Nanotechnology can be defined as the use of materials and systems whose structures and components exhibit novel and significantly changed properties obtained by gaining control of structure at the atomic, molecular, and supramolecular levels. Although many advanced properties have been observed for materials with constituent fibre, grain, or particle sizes below 100 nm in traditional science and engineering applications (such as catalytic, optical, mechanical, magnetic, and electrical applications), few advantages of these materials have been explored for tissue-engineering applications. Yet nanophase materials may give researchers control over interactions with biological entities (such as proteins and cells) in ways previously unimaginable with conventional materials. This is because the organs of the body are themselves nanostructured, so cells are accustomed to interacting with materials that have nanostructured features. Despite this, the implants currently being investigated as the next generation of tissue-engineering scaffolds have micron-structured features. In this light, it is not surprising that the optimal tissue-engineering material has not yet been found.

Over the past two years, Purdue has provided significant evidence to the research
community that nanophase materials can be designed to control interactions with proteins and
subsequently mammalian cells for more efficient tissue regeneration. This has been demonstrated
for a wide range of nanophase material chemistries including ceramics, polymers, and more
recently metals. Such investigations are leading to the design of a number of more successful
tissue-engineering materials for orthopedic/dental, vascular, neural, bladder, and cartilage
applications. In all applications, compared to conventional materials, the fundamental design
parameter necessary to increase tissue regeneration is a surface with a large degree of
biologically-inspired nanostructured roughness. In this manner, results from the present
collection of studies have added increased tissue-regeneration as another novel property of
nanophase materials.

5.4 NANOMETROLOGY


Nanometrology involves high-precision measurement techniques combined with nano-positioning systems. Capabilities and applications in nanometrology are based on Differential Capacitance Micrometry. Current applications in nanometrology are:

Precision Deformation Measurement

 Tectonics

 Mining

Precision Displacement Measurement

 Gravity Gradiometer

The Capacitance Micrometry technology was originally developed in the mid-seventies and uses relative position measurement within a locally defined reference standard. It can be configured to give picometre resolution over a one-hundred-micron range, or used at lower resolution over a larger dynamic range.
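As a quick sanity check of these figures (an illustrative calculation, not part of the original specification), a ratiometric resolution of 1 part in 10^8 applied to a 100 µm reference range does indeed correspond to picometre-level resolution:

# Illustrative check: ratiometric resolution of 1 part in 1e8 over a 100-micron range.
# The relation used is: absolute resolution = range * ratiometric resolution.

RANGE_M = 100e-6          # 100 micrometres, the selected reference range
RATIOMETRIC = 1e-8        # resolution of ~1 part in 10^8 of the reference length

absolute_resolution_m = RANGE_M * RATIOMETRIC
print(f"absolute resolution ~ {absolute_resolution_m:.1e} m "
      f"({absolute_resolution_m * 1e12:.1f} pm)")   # ~1 pm, i.e. picometre resolution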

The current measurement systems feature:

 Ratiometric resolution to approximately 1 part in 10^8 of the selected range reference length

 Worst-case non-linearity against a laser reference of < 10^-5 over 0.4 of the reference range

 Large dynamic range, e.g. 10 mm to 100 µm

 Long-term stability and high repeatability

 Allows active feedback for positioning

 No high-vacuum requirement

5.5 EARTH STRAIN MEASUREMENT


A Nanometrics Instrumentation Application

 To apply short- and long-term monitoring of mining-induced strain variations at selected points in underground or open-cut mining operations.

 To use the strain measurements to predict rock-mass response to mining (e.g. pit slope stability, subsidence).


The Earth Strain Measurement Group provides precision strain-monitoring systems for long-term monitoring of mining-induced strain variations at selected points on the mine plan. The instruments allow continuous monitoring of tensor plane strain within the range 10^-3 to 10^-9. The technology was originally developed for earthquake strain monitoring, which requires extremely high sensitivity, stability, and dynamic range, but it is now used in mine-scale monitoring environments.

This technique will be used to measure loads induced in highwall mining or in the walls of deep open-pit mining operations. Key advantages are that the mine-scale, engineering-induced strain response of large structures can be measured from significant distances, that loads induced by slow creep processes over large areas can be monitored, that long-term slow deformations can be tracked with high reliability, and that elastic failure processes can be observed. The measurements can be made remotely, without any disturbance to mining operations. Direct estimates of the effect of blasts on wall loading can be obtained, as can the subsequent creep and slump processes. For mining applications, strain monitoring complements microseismic monitoring, which documents the location and amplitude of elastic failure and stress-concentration processes.

5.6 AIRBORNE GRAVITY MEASUREMENT

Geophysical methods capable of measuring the acceleration due to the Earth's gravitational field are amongst the earliest applications of the geophysical sciences. Gravity surveying is one of the important techniques in modern exploration for nearly all mineral and petroleum commodities. The significance of this method has increased in recent times and will continue to do so, as major advances in satellite positioning technology provide cost-effective access to surface surveys over much larger regions than previously possible.

The exploration industry has recently renewed its interest in large-scale gravity surveys and has developed a greater appreciation of the contribution of these data sets. The need to acquire large gravity data sets rapidly over highly prospective areas is renewing demand for airborne gravity facilities capable of measurements accurate enough to detect small and lenticular mineral orebodies. Gravity gradiometry provides the best opportunity to achieve this accuracy and can be performed at the sampling rates necessary for targets of industry interest, on the order of a few hundred metres.

This project will develop an airborne gravity gradiometer capable of providing measurements from low-flying aircraft at a rate and sensitivity suitable for detecting buried orebodies down to a scale of approximately 300 m at burial depths of 200 m. The measurements will be integrated with other geophysical measurements from the same or other airborne platforms to enhance exploration capability.

The project has a planned duration of five years and a budget of $15 million. It involves high-risk research with great potential impact on the industry. The project aims at the detection of geophysically significant subsurface anomalies, potentially associated with ore bodies or hydrocarbon deposits, by rapid vehicle-mounted surface or airborne regional gravitational studies. The existence of a gravitational anomaly depends directly on the presence of a mass excess or deficit associated with the deposit.

The magnitude of a typical anomaly relative to the unperturbed gravity field is proportional to the total mass excess (or deficit) and is inversely proportional to the square of the distance between its effective centre and the point of observation.

In principle it is not possible to distinguish the accelerations acting on a body due to gravitational effects from those due to kinematic effects associated with changes in the body's velocity. Thus most gravity measurement is performed from stationary platforms fixed to the Earth's surface, and its precision is limited by the vibration noise sources common in the Earth. The gravitational anomaly of an ore body of density contrast 300 kg/m^3 and dimension of, say, 200 m, buried below, say, 100 m of overburden, is typically 2x10^-6 m/s^2, which is about 0.00002% of the normal Earth gravity field. This relatively small effect is normally measured in units of microgals (µGal), and would correspond to approximately 200 µGal.
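To make these magnitudes concrete, the sketch below is an illustrative calculation using the figures quoted above, with the ore body approximated as a buried sphere of the stated density contrast; it reproduces the order of magnitude of the anomaly and converts it to µGal.

# Illustrative estimate of the gravity anomaly of a buried ore body,
# approximated as a sphere of radius 100 m (i.e. ~200 m across) whose centre
# lies 200 m below the observation point (100 m of overburden plus 100 m radius).
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
density_contrast = 300.0 # kg/m^3
radius = 100.0           # m, half of the ~200 m body dimension
depth_to_centre = 200.0  # m

excess_mass = density_contrast * (4.0 / 3.0) * math.pi * radius**3   # kg
anomaly = G * excess_mass / depth_to_centre**2                        # m/s^2, point-mass formula

print(f"mass excess     : {excess_mass:.2e} kg")
print(f"gravity anomaly : {anomaly:.2e} m/s^2")
print(f"                = {anomaly / 1e-8:.0f} uGal")                 # 1 uGal = 1e-8 m/s^2
print(f"fraction of g   : {anomaly / 9.81 * 100:.5f} %")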

To date, most resource-significant measurements have been made using instruments of the LaCoste/Romberg type, which are essentially ultrasensitive spring balances detecting the small difference in weight caused by the gravity anomaly. The measurements are subject to a wide variety of environmental influences, and they must be performed relative to a standard point that is re-occupied regularly during the survey as a fixed reference for removing instrument drift. With great care, measurements over reasonable areas can be achieved to about 5 µGal, making this technology appropriate for mapping regions of known potential. The procedure is slow, and it requires extensive information on local topography and geology because the normal variation of gravity with height is approximately 300 µGal per metre. This type of relative gravity instrument has in fact been used, with great difficulty, from moving platforms, and in particular from aircraft, where altitude control using precision radar altimeters and pressure sensors to hold vertical position to as little as one metre still imposes limitations of the order of a few hundred µGal on the gravity data.
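As an illustration of the base-station procedure just described (a simplified sketch with made-up readings, not the actual survey-reduction software), repeated readings at a reference point can be used to remove a linear instrument drift from the intervening field readings:

# Simplified sketch of drift correction in a relative gravity survey.
# A base station is read at the start and end of a loop; intervening field
# readings are corrected by assuming the meter drift is linear in time.
# All readings and times are made-up illustrative values (mGal, hours).

def drift_corrected(readings, base_start, base_end):
    """readings: list of (time, value) field readings taken between the two base readings."""
    t0, g0 = base_start
    t1, g1 = base_end
    drift_rate = (g1 - g0) / (t1 - t0)            # mGal per hour of apparent drift
    return [(t, g - g0 - drift_rate * (t - t0)) for t, g in readings]

if __name__ == "__main__":
    base_start = (0.0, 1523.410)                  # base station reading at start of loop
    base_end = (4.0, 1523.450)                    # same station, 0.04 mGal of drift later
    field = [(1.0, 1523.910), (2.0, 1524.180), (3.0, 1523.700)]
    for t, g in drift_corrected(field, base_start, base_end):
        print(f"t = {t:.1f} h : anomaly relative to base = {g:+.3f} mGal")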

For this reason emphasis for large scale geophysical prospecting has moved towards
gradiometry. In principle, measurement of the gradient of the gravity field over a known baseline
allows one to cancel out the accelerations due to the motion of the platform itself. Gradient
measurements also have some advantages in detection of boundaries of anomalies.

The vertical component of the gradient above the ore body discussed above, measured from an aircraft at approximately 300 m, is approximately 1x10^-9 m/s^2 per metre, which is 1 Eotvos. (The Eotvos is the unit of gravity gradient; 1 Eotvos corresponds to 10^-9 s^-2.) The gradient would be about eight times larger at the Earth's surface. For a gradiometer, the vertical variation of the measured gradient is far less significant than the vertical variation of gravity is for a gravimeter, so precise control of aircraft altitude is not a critical issue.
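Continuing the illustrative sphere model used above (again a sketch, with the observation distance taken as roughly 500 m from the body's effective centre, i.e. the quoted ~300 m flight height plus ~200 m to the centre), the vertical gradient comes out at the Eotvos level quoted in the text:

# Illustrative estimate of the vertical gravity gradient of the same ~200 m ore body
# (point-mass approximation), observed from an aircraft ~300 m above the surface,
# i.e. ~500 m from the body's effective centre.
import math

G = 6.674e-11
density_contrast = 300.0
radius = 100.0
excess_mass = density_contrast * (4.0 / 3.0) * math.pi * radius**3

distance = 500.0                                  # m, aircraft to body centre
gradient = 2.0 * G * excess_mass / distance**3    # s^-2; point-mass vertical gradient, falls off roughly as 1/r^3

EOTVOS = 1e-9                                     # 1 Eotvos = 1e-9 s^-2
print(f"vertical gradient at aircraft height ~ {gradient / EOTVOS:.1f} Eotvos")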


Useful gravity gradient data for exploration will require measurements below the 1 Eotvos level. This will certainly require active stabilisation of the instrument platform, to displacement noise of about 0.01 m s^-2 Hz^-1/2 in the vertical and rotation noise better than 10^-5 rad s^-1 Hz^-1/2. This is certainly possible on a quality stabilised platform.

Many major laboratories have been involved in gradiometer research over the last fifteen years. One major direction of this work has been towards superconducting gravimeters (relative and gradiometric), which exploit several somewhat exotic but benign properties of materials at liquid-helium temperatures. These instruments are essentially superconducting versions of the spring or differential-spring gravimeters, with the mechanical springs replaced by magnetic-field levitation. Stability comes from the inherent stability of the persistent currents that support the superconducting proof mass. Commercial versions of the gravimeter with excellent long-term stability (about 5 µGal per year) and sensitivity better than a µGal are available, but they cannot be used from moving platforms.

Another direction of research has produced instruments similar to the Bell Aerospace Rotating Gravity Gradiometer. Generically, these instruments consist of precisely matched accelerometer pairs that are rotated about an axis so that the gradient being measured modulates their outputs at harmonics of the rotation frequency. The outputs are then differenced to recover the gradient. The US Air Force GGSS (Gravity Gradiometer Survey System) used two orthogonal pairs of accelerometers to produce two gradients; three such systems mounted in mutually orthogonal configuration provided six gradients, which is sufficient to determine the gradient tensor fully. The system of three gradiometers was inertially stabilised by three gimbals controlled by two two-degree-of-freedom gyroscopes and three orthogonal accelerometers. When flown in a specially equipped C-130 transport aircraft, navigation was performed by the autopilot using the inertial outputs of the measurement platform. Gradients have been measured to a few tens of Eotvos units under ideal conditions.

Initially, the CSIRO system proposed here will measure only the vertical gravity gradient. The work is proceeding with eight full-time staff, and the project will be performed in four stages over four years, for completion in 2005.
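The sketch below illustrates the rotating-pair principle described above (a toy simulation with made-up gradient values, baseline, and noise level, not the GGSS design): two accelerometers on opposite ends of a spinning baseline see a differenced signal modulated at twice the rotation frequency, its amplitude set by the local gradient, while a common platform acceleration cancels in the difference.

# Toy simulation of a rotating gravity gradiometer: two accelerometers on the ends
# of a spinning baseline. Their differenced output is modulated at twice the spin
# frequency by the gravity gradient, while common platform accelerations cancel.
# Gradient values, baseline and noise level are made-up illustrative numbers.
import math
import random

EOTVOS = 1e-9                    # 1 Eotvos = 1e-9 s^-2
Gxx, Gyy, Gxy = 50 * EOTVOS, -30 * EOTVOS, 10 * EOTVOS   # assumed local gradient components
L = 0.4                          # m, accelerometer separation (baseline)
omega = 2 * math.pi * 0.25       # rad/s, spin rate
dt, n = 0.01, 200_000            # sample interval and number of samples

sum_cos = 0.0
for i in range(n):
    t = i * dt
    c, s = math.cos(omega * t), math.sin(omega * t)
    # along-baseline gravitational acceleration difference between the two ends
    diff = L * (Gxx * c * c + 2 * Gxy * s * c + Gyy * s * s)
    platform = 1e-3 * random.gauss(0.0, 1.0)   # common platform acceleration, identical on both
    out1 = +0.5 * diff + platform
    out2 = -0.5 * diff + platform
    demod = (out1 - out2) * math.cos(2 * omega * t)   # platform term cancels in the difference
    sum_cos += demod

# lock-in average: <diff * cos(2wt)> = L * (Gxx - Gyy) / 4
recovered = 4 * (sum_cos / n) / L
print(f"recovered (Gxx - Gyy) ~ {recovered / EOTVOS:.1f} Eotvos "
      f"(true value {(Gxx - Gyy) / EOTVOS:.1f})")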


6. CONCLUSION
How long will it take to develop nanocomputers and the nanotechnology they require? The correct scientific answer is: I don't know.

Having said that, it is worth pointing out that the trends in the development of computer
hardware have been remarkably steady for the last 50 years. Plotted on semilog paper as a
function of year, such parameters as

 the number of atoms required to store one bit

 the number of dopant atoms in a transistor

 the energy dissipated by a single logic operation

 the resolution of the finest machining technology

 many others
have all declined with remarkable regularity, even as the underlying technology base has
changed dramatically. From relays to vacuum tubes to transistors to integrated circuits to Very
Large Scale Integrated circuits (VLSI) we have seen steady declines in the size and cost of logic
elements and steady increases in their performance.
If we extrapolate these trends, we find they reach interesting values in the 2010 to 2020 time frame. The number of atoms required to store one bit in a mass memory device reaches 1. The number of dopant atoms in a transistor reaches 1 (while fundamental device physics might force us to use more than one dopant atom, it is clear that some not-too-large integer should suffice). The energy dissipated by a single logic operation reaches kT for T = 300 kelvins, which is roughly the energy of a single air molecule bouncing around at room temperature. The finest machining technologies reach a resolution of roughly an atomic diameter.
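For reference, the sketch below evaluates kT at room temperature using the Boltzmann constant and compares it with the switching energy of a present-day logic gate; the gate energy used is an assumed, illustrative order-of-magnitude figure, not a value from the text.

# Quick check of the thermal energy scale kT at room temperature, and how far
# today's logic energies sit above it. The "assumed gate energy" is an
# illustrative order-of-magnitude figure, not a measured value from the text.

K_BOLTZMANN = 1.380649e-23   # J/K
T = 300.0                    # K, room temperature

kT = K_BOLTZMANN * T
assumed_gate_energy = 1e-15  # J, rough illustrative figure for a present-day logic operation

print(f"kT at 300 K            : {kT:.2e} J")
print(f"assumed gate energy    : {assumed_gate_energy:.1e} J")
print(f"ratio (gate energy/kT) : {assumed_gate_energy / kT:.0f}x")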


Such performance seems to require a manufacturing technology that can arrange individual atoms in the precise structures required for molecular logic elements, connect those logic elements in the complex patterns required for a computer, and do so inexpensively for billions of billions of gates. In short, if we are to keep the computer hardware revolution on schedule, then it seems we will have to develop nanotechnology in the 2010 to 2020 time frame.

Of course, extrapolating straight lines on semilog paper is a philosophically debatable method of technology forecasting. While we can confidently state that no fundamental law of nature prevents us from developing nanotechnology on this schedule (or even faster), there is equally no law of nature that says the trends of the past must continue unchanged into the future, or that this schedule will not slip. For example, while Babbage proposed the stored-program computer in the 1830s, it was about a century before anyone actually built one.

In 1993 the author, as co-chair of the Third Foresight Conference on Molecular Nanotechnology, asked the assembled attendees how long they thought it would take to develop nanotechnology, as defined here. By show of hands, answers in the range from 2010 to 2040 predominated (about two thirds of the audience).

Regardless of what extrapolation of trends or polls might suggest, we should keep firmly
in mind that how long it takes depends on what we do (or don't do). A focused effort with
resources appropriate to the magnitude of the task would speed development. If we do little, or
focus resources on short term goals, fundamental developments might be much delayed (just as
Babbage's computer was delayed by a century). To quote Alan Kay:

"The best way to predict the future is to invent it."


