
A Republic of Science

Inquiry and innovation in science and medicine

The United States came into being during the Age of Enlightenment (circa 1680 to 1800), a period in which writers and thinkers rejected the superstitions of the past. Instead, they emphasized the powers of reason and unbiased inquiry, especially inquiry into the workings of the natural world. Enlightenment philosophers envisioned a "republic of science," where ideas would be exchanged freely and useful knowledge would improve the lot of all citizens.

From its emergence as an independent nation, the United States has encouraged science and invention. It has done this by promoting a free flow of ideas, by encouraging the growth of "useful knowledge," and by welcoming creative people from all over the world.

The United States Constitution itself reflects the desire to encourage scientific creativity. It gives Congress the power "to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." This clause formed the basis for the U.S. patent and copyright systems, which ensured that inventions and other creative works could not be copied or used without the creator's receiving some kind of compensation.


In the early decades of its history, the United States was relatively isolated from Europe and also rather poor. Nonetheless, it was a good place for science. American science was closely linked with the needs of the people, and it was free from European preconceptions.

Two of America's founding fathers were scientists of some repute. Benjamin Franklin conducted a series of experiments that deepened human understanding of electricity. Among other things, he proved what had been suspected but never before shown: that lightning is a form of electricity. Franklin also invented such conveniences as bifocal eyeglasses and a stove that bears his name. (The Franklin stove fits into a fireplace and circulates heat into the adjacent room.)

Thomas Jefferson was a student of agriculture who introduced various types of rice, olive trees, and grasses into the New World. He stressed the scientific aspect of the Lewis and Clark expedition (1804-06), which explored the Pacific Northwest, and detailed, systematic information on the region's plants and animals was one of that expedition's legacies.

Like Franklin and Jefferson, most American scientists of the late 18th century were involved in the struggle to win American independence and forge a new nation. These scientists included the astronomer David Rittenhouse, the medical scientist Benjamin Rush, and the natural historian Charles Willson Peale.

During the American Revolution, Rittenhouse helped design the defenses of Philadelphia and built telescopes and navigation instruments for the United States' military services. After the war, Rittenhouse designed road and canal systems for the state of Pennsylvania. He later returned to studying the stars and planets and gained a worldwide reputation in that field.

As surgeon general, Benjamin Rush saved countless lives of soldiers during the Revolutionary War by promoting hygiene and public health practices. By introducing new medical treatments, he made the Pennsylvania Hospital in Philadelphia an example of medical enlightenment, and after his military service, Rush established the first free clinic in the United States.

Charles Willson Peale is best remembered as an artist, but he also was a natural historian, inventor, educator, and politician. He created the first major museum in the United States, the Peale Museum in Philadelphia, which housed the young nation's only collection of North American natural history specimens. Peale excavated the bones of an ancient mastodon near West Point, New York; he spent three months assembling the skeleton, and then displayed it in his museum. The Peale Museum started an American tradition of making the knowledge of science interesting and available to the general public.

American political leaders' enthusiasm for knowledge also helped ensure a warm welcome for scientists from other countries. A notable early immigrant was the British chemist Joseph Priestley, who was driven from his homeland because of his dissenting politics. Priestley, who came to the United States in 1794, was the first of thousands of talented scientists who emigrated in search of a free, creative environment. Others who came more recently have included the German theoretical physicist Albert Einstein, who arrived in 1933; Enrico Fermi, who came from Italy in 1938 and who produced the world's first self-sustaining nuclear chain reaction; and Vladimir K. Zworykin, who left Russia in 1919 and later invented the television camera.

Other scientists came to the United States to take part in the nation's rapid growth. Alexander Graham Bell, who arrived from Scotland by way of Canada in 1872, developed and patented the telephone and related inventions. Charles P. Steinmetz, who came from Germany in 1889, developed new alternating-current electrical systems at General Electric Company. Later, other scientists were drawn by America's state-of-the-art research facilities. By the early decades of the 20th century, scientists working in the United States could hope for considerable material, as well as intellectual, rewards.


During the 19th century, Britain, France, and Germany were at the forefront of new ideas in science and mathematics. But if the United States lagged behind in the formulation of theory, it excelled in using theory to solve problems: applied science. This tradition had been born of necessity. Because Americans lived so far from the wellsprings of Western science and manufacturing, they often had to figure out their own ways of doing things. When Americans combined theoretical knowledge with "Yankee ingenuity," the result was a flow of important inventions. The great American inventors include Robert Fulton (the steamboat); Samuel F.B. Morse (the telegraph); Eli Whitney (the cotton gin); Cyrus McCormick (the reaper); and Thomas Alva Edison, the most fertile of them all, with more than a thousand inventions credited to his name.

Edison was not always the first to devise a scientific application, but he was frequently the one to bring an idea to a practical finish. For example, the British engineer Joseph Swan built an incandescent electric lamp in 1860, almost 20 years before Edison. But Edison's was better. Edison's light bulbs lasted much longer than Swan's, and they could be turned on and off individually, while Swan's bulbs could be used only in a system where several lights were turned on or off at the same time. Edison followed up his improvement of the light bulb with the development of electrical generating systems. Within 30 years, his inventions had introduced electric lighting into millions of homes.

Another landmark application of scientific ideas to practical uses was the innovation of the brothers Wilbur and Orville Wright. In the 1890s they became fascinated with accounts of German glider experiments and began their own investigation into the principles of flight. Combining scientific knowledge and mechanical skills, the Wright brothers built and flew several gliders. Then, on December 17, 1903, they successfully flew the first heavier-than-air, mechanically propelled airplane.

An American invention that was barely noticed in 1947 went on to usher in a new age of information sharing. In that year John Bardeen, William Shockley, and Walter Brattain of Bell Laboratories drew upon highly sophisticated principles of theoretical physics to invent the transistor, a small substitute for the bulky vacuum tube. This and a device invented 10 years later, the integrated circuit, made it possible to package enormous amounts of electronic circuitry in tiny containers. As a result, book-sized computers of today can outperform room-sized computers of the 1960s, and there has been a revolution in the way people live -- in how they work, study, conduct business, and engage in research.

In the second half of the 20th century American scientists became known for more than their practical inventions and applications. Suddenly, they were being recognized for their contributions to "pure" science, the formulation of concepts and theories. The changing pattern can be seen in the winners of the Nobel Prizes in physics and chemistry. During the first half-century of Nobel Prizes -- from 1901 to 1950 -- American winners were in a distinct minority in the science categories. Since 1950, Americans have won approximately half of the Nobel Prizes awarded in the sciences.


One of the most spectacular -- and controversial -- accomplishments of U.S. technology has been the harnessing of nuclear energy. The concepts that led to the splitting of the atom were developed by the scientists of many countries, but the conversion of these ideas into the reality of nuclear fission was the achievement of U.S. scientists in the early 1940s.

After German physicists split a uranium nucleus in 1938, Albert Einstein, Enrico Fermi, and Leo Szilard concluded that a nuclear chain reaction was feasible. In a letter to President Franklin Roosevelt, Einstein warned that this breakthrough would permit the construction of "extremely powerful bombs." His warning inspired the Manhattan Project, the U.S. effort to be the first to build an atomic bomb. The project bore fruit when the first such bomb was exploded in New Mexico on July 16, 1945.

The development of the bomb and its use against Japan in August of 1945 initiated the Atomic Age, a time of anxiety over weapons of mass destruction that has lasted through the Cold War and down to the antiproliferation efforts of today. But the Atomic Age has also been characterized by peaceful uses of atomic energy, as in nuclear power and nuclear medicine.

The first U.S. commercial nuclear power plant started operation in Illinois in 1956. At the time, the future for nuclear energy in the United States looked bright. But opponents criticized the safety of power plants and questioned whether safe disposal of nuclear waste could be assured. A 1979 accident at Three Mile Island in Pennsylvania turned many Americans against nuclear power. The cost of building a nuclear power plant escalated, and other, more economical sources of power began to look more appealing. During the 1970s and 1980s, plans for several nuclear plants were canceled, and the future of nuclear power in the United States remains uncertain.

Meanwhile, American scientists have been experimenting with renewable sources of energy, including solar power. Although solar power generation is still not economical in much of the United States, recent developments might make it more affordable.

In 1994 Subhendu Guha, executive vice president of United Solar Systems in Troy, Michigan, was lecturing on the benefits of solar energy and showing a picture of solar cells arrayed on the roof of a house. An architect in the audience said, "But it's so ugly. Who would want that on their house?" That remark got Guha thinking about how to make the photovoltaics look more like the roof, instead of mounting the solar cells on frames that jut skyward.

Two years later, Guha's innovation came off the assembly line -- solar shingles that can be nailed directly onto the roof. The shingles are made from stainless steel sheeting, coated with nine layers of silicon, a semiconducting film, and protective plastic. Roofers install the shingles just as they do normal ones, but they must drill a hole in the roof for electrical leads from each shingle. Guha believes that his shingles will be economical in some parts of the United States as their energy efficiency improves and their production cost decreases. Meanwhile, they are already being used in Egypt, Mexico, and other developing countries. In 2002, United Solar Systems increased its production capacity by installing the largest known photovoltaic solar-cell manufacturing machine at its Michigan plant.

Another application of solar power is being tested at the U.S. Department of Energy's National Solar Thermal Test Facility in Albuquerque, New Mexico. Scientists have paired parabolic dishes that collect sunlight with engines that automatically operate the system in remote locations. The primary applications of the Advanced Dish Development System (ADDS) are water pumping and village electrification. The system shows promise for parts of the southwestern United States and for developing countries.


Running almost in tandem with the Atomic Age has been the Space Age. American Robert H. Goddard was one of the first scientists to experiment with rocket propulsion systems. In his small laboratory in Worcester, Massachusetts, Goddard worked with liquid oxygen and gasoline to propel rockets into the atmosphere. In 1926 he successfully fired the world's first liquid-fuel rocket, which reached a height of 12.5 meters. Over the next 10 years, Goddard's rockets achieved modest altitudes of nearly two kilometers, and interest in rocketry increased in the United States, Great Britain, Germany, and the Soviet Union.

Expendable rockets provided the means for launching artificial satellites, as well as manned spacecraft. In 1957 the Soviet Union launched the first satellite, Sputnik I, and the United States followed with Explorer I in 1958. The first manned space flights were made in the spring of 1961, first by Soviet cosmonaut Yuri Gagarin and then by American astronaut Alan B. Shepard, Jr.

From those first tentative steps to the 1969 moon landing to today's reusable space shuttle, the American space program has brought forth a breathtaking display of applied science. Communications satellites transmit computer data, telephone calls, and radio and television broadcasts. Weather satellites furnish the data necessary to provide early warnings of severe storms. Space technology has generated thousands of products for everyday use -- everything from lightweight materials used in running shoes to respiratory monitors used in hospitals.


As in physics and chemistry, Americans have dominated the Nobel Prize for physiology or medicine since World War II. The National Institutes of Health, the focal point for biomedical research in the United States, has played a key role in this achievement. Consisting of 24 separate institutes, the NIH occupies 75 buildings on more than 120 hectares in Bethesda, Maryland. Its budget in 2000 was almost $23 billion.

The goal of NIH research is knowledge that helps prevent, detect, diagnose, and treat disease and disability -- everything from the rarest genetic disorder to the common cold. At any given time, grants from the NIH support the research of about 35,000 principal investigators, working in every U.S. state and several foreign countries. Among these grantees have been 91 Nobel Prize-winners. Five Nobelists have made their prize-winning discoveries in NIH laboratories.

NIH research has helped make possible numerous medical achievements. For example, mortality from heart disease, the number-one killer in the United States, dropped 41 percent between 1971 and 1991. The death rate for strokes decreased by 59 percent during the same period. Between 1991 and 1995, the cancer death rate fell by nearly 3 percent, the first sustained decline since national record-keeping began in the 1930s. And today more than 70 percent of children who get cancer are cured.

With the help of the NIH, molecular genetics and genomics research have revolutionized biomedical science. Building on initial work dating from the 1980s and 1990s, international research teams have constructed a map of the human genome. The genome is defined as all the DNA in the human body, and it was mapped to 99.99 percent accuracy in a project completed in 2003 involving hundreds of scientists at 20 sequencing centers in China, France, Germany, Great Britain, Japan, and the United States.

Scientists will use this new knowledge to develop genetic tests for susceptibility to diseases such as colon and breast cancer and, eventually, to develop preventive drug treatments.

Research conducted by universities, hospitals, and corporations also contributes to improvement in diagnosis and treatment of disease. NIH funded the basic research on Acquired Immune Deficiency Syndrome (AIDS), for example, but many of the drugs used to treat the disease have emerged from the laboratories of the American pharmaceutical industry; those drugs are being tested in research centers across the country.

One type of drug that has shown promise in treating HIV, the virus that causes AIDS, is the protease inhibitor. After several years of laboratory testing, protease inhibitors were first given to patients in the United States in 1994. One of the first tests (on a group of 20 volunteers) showed not only that the drugs made the amount of virus in the patients' blood almost disappear, but also that the patients' immune systems rebounded faster than anyone had thought possible.

Doctors have combined protease inhibitors with other drugs in "combination therapy." While the results are encouraging, combination therapy is not a cure, and, so far, it works only in the blood; it does not reach the other parts of the body -- the brain, lymph nodes, spinal fluid, and testes -- where the virus hides. Scientists continue to experiment with combination therapy and other ways to treat the disease, while they search for the ultimate solution -- a vaccine against it.


While the American medical community has been making strides in the diagnosis and treatment of disease, the American public also has become more aware of the relationship between disease and personal behavior. Since the U.S. surgeon general first warned Americans about the dangers of smoking in 1964, the percentage of Americans who smoke has declined from almost 50 percent to approximately 25 percent. Smoking is no longer permitted in most public buildings or on trains, buses, and airplanes traveling within the United States, and most American restaurants are divided into areas where smoking is permitted and those where it is not. Studies have linked a significant drop in the rate of lung cancer to a nationwide decline in cigarette smoking.

The federal government also encourages Americans to exercise regularly and to eat healthful diets, including large quantities of fruits and vegetables. More than 40 percent of Americans today exercise or play a sport as part of their regular routine. The per capita consumption of fruits and vegetables has increased by about 20 percent since 1970.

President George W. Bush believes Americans can do even better, and to that end he launched a national health and fitness initiative in 2002. The president encouraged Americans to adopt four "guideposts" for a healthy lifestyle: to exercise for 30 minutes every day; to eat a nutritious diet; to get preventive medical screenings; and to avoid smoking, drug use, and excessive alcohol consumption. At a White House event, at which he also introduced the new members of the President's Council on Physical Fitness and Sports, Bush said, "We're making great progress in preventing and detecting and treating many chronic diseases, and that's good for America....Yet we can still improve....When America and Americans are healthier, our whole society benefits."


InfoUSA is maintained by the Bureau of International Information Programs (IIP), U.S. Department of State

The numerical data in this section is solely for informational purposes. Please consult the original sources for updated information.