Since prehistoric times, humans have altered the state of food to extend its shelf life or improve its taste. At least 300,000 years ago, early humans harnessed fire to cook and preserve meat, and later discovered that adding salt could preserve it without cooking. In ancient Greece and Rome, wine was often mixed with honey, herbs, spices and even saltwater, chalk or lead, the last of which served as both a sweetener and a preservative.

Over time, the practice of adulterating food for economic gain emerged. During the Middle Ages, imported spices were extremely valuable, and because of their high prices and limited supply, merchants sometimes cut them with cheap substitutes such as ground nutshells, pits, seeds, juniper berries, stones or dust. In response, trade guilds were formed to supervise the quality of products and prevent the adulteration of food, and laws were drafted throughout Europe to regulate the quality of bread, wine, milk, butter and meat. Following the Reformation, however, the influence of the guilds waned, and their laws along with it.

During the 18th and 19th centuries, as the United States shifted from an agricultural to an industrial economy and urbanization disconnected people from food production, the debasement of food for profit became rampant. Milk was often watered down and colored with chalk or plaster, substances that were also used to bulk up flour. Lead was added to wine and beer, and coffee, tea and spices were routinely mixed with dirt, sand or the leaves of other plants. Although a number of laws forbade the addition of harmful substances to food, they were difficult to enforce because there were no reliable tests to prove the presence of contaminants.

In 1820, the chemist Frederick Accum drew attention to numerous cases of food tampering with the publication of his book “A Treatise on Adulterations of Food, and Culinary Poisons.” By that time, microscopes could detect differences between ingredients, and chemical reactions could reveal the presence of toxic substances. Nevertheless, growing numbers of urban dwellers were reliant on others for food, which was often transported great distances with primitive refrigeration and poor sanitation, and the practice of adding dangerous additives to extend a food’s shelf life continued.

By the end of the 19th century, advances in chemistry had enabled manufacturers to mask food deterioration in ways that were difficult to detect. At the same time, homegrown elixirs, tinctures and “medicines” containing opium, cocaine, heroin and other drugs were sold without restrictions, warnings or ingredient labels. In 1902, Dr. Harvey W. Wiley, chief chemist of the U.S. Department of Agriculture’s Bureau of Chemistry (the forerunner of the Food and Drug Administration), organized a group of volunteers to test the effects of ingesting some of the most common food preservatives in use at the time, such as borax, copper sulfate, sulfurous acid and formaldehyde. Known as the Poison Squad, this group of men agreed to eat increasing amounts of each chemical while carefully tracking its effects on their bodies. The shocking reports garnered widespread attention, and in 1906 Congress passed both the Meat Inspection Act and the original Food and Drugs Act, prohibiting the manufacture and interstate shipment of adulterated and misbranded foods and drugs. It wasn’t until 1958, however, that manufacturers were required to conduct the testing necessary to prove a substance’s safety before introducing it to the public.

Although the safety of our food supply has greatly improved over the last century, food fraud involving dilution, substitution and mislabeling persists within the global food industry. As processed foods with multiple ingredients are increasingly sourced from numerous countries and supply chains become more complex, tracing the sources of contamination, whether intentional or not, has become a significant challenge.