Why Add Chlorine to Our Drinking Water?

Water drawn from sources such as lakes and wells can carry germs that cause disease. To combat these, water utilities add a disinfectant, most commonly chlorine or chloramine, to kill pathogens such as Salmonella, Campylobacter, and norovirus.
In the 19th century, the U.S. suffered recurrent outbreaks of waterborne diseases such as cholera, typhoid fever, and dysentery. Mortality was high, especially among children, and life expectancy was significantly lower, with many people not reaching age 40. Continuous chlorination of a U.S. public water supply began in 1908 in Jersey City, NJ, a pivotal moment in public health. Because chlorination sharply reduced waterborne illness, it was rapidly adopted nationwide and transformed the nation's health.
Chlorine is toxic to microorganisms, which lets it kill the harmful bacteria, viruses, and other pathogens carried by source water, delivering an immediate public health benefit. Its adoption was a major contributor to the roughly 50% rise in American life expectancy over the 20th century, and the U.S. Centers for Disease Control and Prevention (CDC) counts drinking water disinfection among the great public health achievements of that century.
Beyond disinfection, chlorine helps control unpleasant tastes and odors in water and prevents the growth of molds, algae, and bacteria within water supply infrastructure such as storage reservoirs and distribution pipes.
Consequently, the U.S. Environmental Protection Agency (EPA) requires water providers to maintain a detectable level of disinfectant throughout their distribution systems. This residual keeps protecting against pathogens as water travels from the treatment plant to consumers' taps, upholding water safety and safeguarding public health.
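To make the idea of a residual concrete, here is a minimal Python sketch of how a reading might be screened against the two relevant limits: the EPA's Maximum Residual Disinfectant Level (MRDL) of 4.0 mg/L for chlorine, and the requirement that a residual remain detectable. The 0.2 mg/L "detectable" floor, the function name, and the sample readings are illustrative assumptions, not part of any real monitoring system.

```python
# Sketch: screen chlorine residual readings (mg/L) against EPA limits.
MRDL_MG_L = 4.0            # EPA Maximum Residual Disinfectant Level for chlorine
DETECTABLE_MIN_MG_L = 0.2  # assumed "detectable" floor; actual practice varies

def classify_residual(chlorine_mg_l: float) -> str:
    """Classify a single chlorine residual reading."""
    if chlorine_mg_l < DETECTABLE_MIN_MG_L:
        return "below detectable residual: pathogens could regrow in pipes"
    if chlorine_mg_l > MRDL_MG_L:
        return "above MRDL: exceeds the allowed disinfectant level"
    return "within expected range"

# Hypothetical readings at points progressively farther from the plant;
# residuals normally decay as water moves through the system.
samples = {"treatment plant": 1.2, "mid-system tap": 0.6, "far tap": 0.1}
for location, reading in samples.items():
    print(f"{location}: {reading} mg/L -> {classify_residual(reading)}")
```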