Tuesday, May 1, 2007

Nanotechnology
Nanotechnology is a field of applied science and technology covering a broad range of topics. The main unifying theme is the control of matter on a scale smaller than 1 micrometer, normally between 1 and 100 nanometers, as well as the fabrication of devices on this same length scale. It is a highly multidisciplinary field, drawing on areas such as colloidal science, device physics, and supramolecular chemistry. Much speculation exists as to what new science and technology might result from these lines of research. Some view nanotechnology as a marketing term that describes pre-existing lines of research applied to the sub-micron size scale.

Despite the apparent simplicity of this definition, nanotechnology actually encompasses diverse lines of inquiry. Nanotechnology cuts across many disciplines, including colloidal science, chemistry, applied physics, materials science, and even mechanical and electrical engineering. It could variously be seen as an extension of existing sciences into the nanoscale, or as a recasting of existing sciences using a newer, more modern term. Two main approaches are used in nanotechnology: one is a "bottom-up" approach, in which materials and devices are built from molecular components that assemble themselves chemically using principles of molecular recognition; the other is a "top-down" approach, in which nano-objects are constructed from larger entities without atomic-level control.

The impetus for nanotechnology has stemmed from a renewed interest in colloidal science, coupled with a new generation of analytical tools such as the atomic force microscope (AFM) and the scanning tunneling microscope (STM). Combined with refined processes such as electron beam lithography and molecular beam epitaxy, these instruments allow the deliberate manipulation of nanostructures and have in turn led to the observation of novel phenomena. The manufacture of polymers based on molecular structure, or the design of computer chip layouts based on surface science, are examples of nanotechnology in modern use. Despite the great promise of numerous nanotechnologies such as quantum dots and nanotubes, real applications that have moved out of the lab and into the marketplace have mainly exploited the advantages of colloidal nanoparticles in bulk form, as in suntan lotion, cosmetics, protective coatings, and stain-resistant clothing.

History of nanotechnology
Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time.

Pre-Nanotechnology
Humans have unwittingly employed nanotechnology for thousands of years, for example in making steel and in vulcanizing rubber. Both of these processes rely on the properties of stochastically-formed atomic ensembles mere nanometers in size, and are distinguished from chemistry in that they don't rely on the properties of individual molecules. But the development of the body of concepts now subsumed under the term nanotechnology has been slower.

The first mention of some of the distinguishing concepts in nanotechnology (but predating use of that name) was in 1867 by James Clerk Maxwell when he proposed as a thought experiment a tiny entity known as Maxwell's Demon able to handle individual molecules.

In the 1920s, Irving Langmuir and Katharine B. Blodgett introduced the concept of a monolayer, a layer of material one molecule thick. Langmuir was awarded the Nobel Prize in Chemistry for his work.

Conceptual origins
The topic of nanotechnology was again touched upon by "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and Van der Waals attraction would become more important. This basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products. At the meeting, Feynman announced two challenges and offered a $1,000 prize to the first individual to solve each one. The first challenge involved the construction of a nanomotor, which, to Feynman's surprise, was achieved by November 1960 by William McLellan. The second challenge involved the possibility of scaling down letters small enough to fit the entire Encyclopedia Britannica on the head of a pin; this prize was claimed in 1985 by Tom Newman.[1]
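Feynman's point about the changing balance of forces can be made concrete with a rough back-of-the-envelope calculation. The sketch below is not part of the talk; the water-droplet scenario and the constants used are illustrative assumptions. It compares the weight of a droplet, which scales with volume (r^3), against a surface-tension force, which scales only with length (r), showing why surface forces come to dominate as the size shrinks.

import math

def force_ratio(radius_m, density=1000.0, gamma=0.072, g=9.81):
    # Weight scales with volume (r^3); a surface-tension force scales with length (r),
    # so their ratio grows rapidly as the droplet shrinks.
    weight = density * (4.0 / 3.0) * math.pi * radius_m ** 3 * g
    surface = 2.0 * math.pi * radius_m * gamma
    return surface / weight

for r in (1e-3, 1e-6, 1e-7):  # 1 mm, 1 micrometer, 100 nm
    print(f"r = {r:.0e} m  ->  surface-tension force / weight = {force_ratio(r):.2e}")

For a millimeter-scale droplet the two forces are within an order of magnitude of each other; at 100 nm the surface force exceeds the weight by roughly nine orders of magnitude.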

In 1965 Gordon Moore observed that silicon transistors were undergoing a continual process of scaling downward, an observation which was later codified as Moore's law. Since his observation, transistor minimum feature sizes have decreased from 10 micrometers to the 45-65 nm range in 2007; one minimum feature is thus roughly 180 silicon atoms long.
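The "roughly 180 silicon atoms" figure follows from simple arithmetic. The snippet below is a minimal illustration, assuming an effective atomic spacing of about 0.25 nm for silicon; that spacing value is an assumption for the estimate, not a figure from the text.

def atoms_per_feature(feature_nm, atomic_spacing_nm=0.25):
    # Approximate number of silicon atoms spanning one minimum feature.
    return feature_nm / atomic_spacing_nm

for feature in (10000, 65, 45):  # 10 micrometers (1965-era) down to 45 nm (2007)
    print(f"{feature:>6} nm feature  ~  {atoms_per_feature(feature):,.0f} atoms across")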

The term "nanotechnology" was first defined by Tokyo Science University, Norio Taniguchi in a 1974 paper (N. Taniguchi, "On the Basic Concept of 'Nano-Technology'," Proc. Intl. Conf. Prod. Eng. Tokyo, Part II, Japan Society of Precision Engineering, 1974.) as follows: "'Nano-technology' mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or one molecule." Since that time the definition of nanotechnology has generally been extended upward in size to include features as large as 100 nm. Additionally, the idea that nanotechnology embraces structures exhibiting quantum mechanical aspects, such as quantum dots, has been thrown into the definition.

Also in 1974 the process of atomic layer deposition, for depositing uniform thin films one atomic layer at a time, was developed and patented by Dr. Tuomo Suntola and co-workers in Finland.

In the 1980s the idea of nanotechnology as deterministic, rather than stochastic, handling of individual atoms and molecules was conceptually explored in depth by Dr. K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and the books Engines of Creation: The Coming Era of Nanotechnology and Nanosystems: Molecular Machinery, Manufacturing, and Computation (ISBN 0-471-57518-6). Drexler's vision of nanotechnology is often called "molecular nanotechnology" (MNT) or "molecular manufacturing," and Drexler at one point proposed the term "zettatech," which never became popular.

Experimental advances
Nanotechnology and nanoscience got a boost in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1986 and carbon nanotubes a few years later. In another development, the synthesis and properties of semiconductor nanocrystals were studied, leading to a rapidly growing number of semiconductor nanoparticles known as quantum dots.

At present in 2007 the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches, wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods such as nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive.

For the future, some means has to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms, combined with culling of the less-successful variants and reproduction of the more-successful variants, and macroscale engineering design likewise proceeds by a process of design evolution from simplicity to complexity, as set forth somewhat satirically by John Gall: "A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works." [2] A breakthrough in MNT is needed which proceeds from the simple atomic ensembles that can be built with, e.g., an STM to complex MNT systems via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulating at the nanoscale compared to the macroscale, which makes deterministic selection of successful trials difficult; in contrast, biological evolution proceeds via the action of what Richard Dawkins has called the "blind watchmaker" [3], comprising random molecular variation and deterministic reproduction/extinction.
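The variation/culling/reproduction loop described above is essentially an evolutionary algorithm. The toy sketch below is purely illustrative; the scalar "designs," the fitness target, and all parameters are invented for the example. It shows the bare mechanism: random mutation of candidates, culling of the worse half, and copying of the better half.

import random

def evolve(target=0.75, population_size=20, generations=50, mutation=0.05):
    # Start from random scalar "designs" and evolve them toward the target value.
    population = [random.random() for _ in range(population_size)]
    for _ in range(generations):
        # Random variation: perturb every design slightly.
        population = [x + random.uniform(-mutation, mutation) for x in population]
        # Culling: keep the half whose values lie closest to the target.
        population.sort(key=lambda x: abs(x - target))
        survivors = population[:population_size // 2]
        # Reproduction: duplicate the survivors to refill the population.
        population = survivors + list(survivors)
    return min(population, key=lambda x: abs(x - target))

print(f"best design after evolution: {evolve():.3f}")

The point of the analogy is the selection step: at the macroscale a designer (or nature) can tell which variants work, whereas at the nanoscale the difficulty of observation makes that selection the hard part.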

References
1. Gribbin, John. Richard Feynman: A Life in Science. Dutton, 1997, p. 170.
2. Gall, John (1986). Systemantics: How Systems Really Work and How They Fail, 2nd ed. Ann Arbor, MI: The General Systemantics Press.
3. Dawkins, Richard. The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design. W. W. Norton, reissue edition, 1996.
Retrieved from "http://en.wikipedia.org/wiki/History_of_nanotechnology"
