Document Type

Book Chapter

Publication Date



Toxicity testing is a key part of the process of assessing the hazards, safety, or risk that chemicals and other substances pose to humans, animals, or the environment. Standardized methods for such testing, typically involving animals, began to emerge during the first half of the 20th century. In 1959, British scientists William Russell and Rex Burch proposed a framework for reducing, refining, or replacing animal use in toxicology and other forms of biomedical experimentation. This “3Rs” or “alternatives” approach emerged at a time of growing sensitivity to the use of animals in experimentation, and progress in its implementation has been spurred by a growing appreciation of both the power of emerging science and technology and the limitations of animal-based approaches. The 3Rs approach, although slow to be embraced, increasingly became a framework for change in toxicity testing during the last quarter of the 20th century. These years saw measurable growth in research activity related to the 3Rs, along with the establishment of 3Rs-based organizations and centers, journals, websites, funding sources, and conferences. As the field matured, principles for validating new and revised alternative tests were formulated and pioneered. The 3Rs framework reached a tipping point in 2007 with the publication of a U.S. National Research Council report proposing a radically different, largely animal-free approach to toxicity testing, encapsulated in the phrase “21st Century Toxicology.” This chapter reviews these developments, examines 3Rs trends in the toxicological literature, presents measures of the impact of 3Rs activity, and concludes with a summary of some of the remaining challenges to the development, validation, regulatory acceptance, and implementation of 3Rs methods.