Tuesday, December 9, 2008

The Birth of Modern Dairy

The roots of modern dairy regulation can be traced all the way back to the War of 1812. The war with Britain disrupted trade, cutting off America’s access to rum and molasses from the British West Indies. The domestic whiskey market exploded, and distilleries popped up outside every major city. By 1829 there were over 1,000 distilleries in New York alone.

One of the natural by-products of the distilling process is “whiskey slop,” the acidic leftover grain mash that remains after the starch has been fermented and the alcohol distilled off. Distillery owners realized that cows fed whiskey slop produced more milk at a lower cost than cows fed any other way, and it became common to house dairy cattle next to a distillery and channel the hot slop directly into the feeding troughs. It was a perfect system that allowed distillery owners to capitalize on the growing urban demand for both whiskey and milk.

The cows were kept packed tightly in pens, standing deep in their own filth, and quickly sickened on their deficient diet. The milk they produced so abundantly was thin and bluish, and too low in butterfat to be made into butter or cheese. Dairy owners frequently added starch, flour, plaster of Paris, and chalk to thicken the milk and correct its color before sending it into the cities in unrefrigerated train cars, pooled with the milk from many other dairies.

As the distilleries and slop operations expanded, infant mortality in American cities rose at an alarming rate, approaching 50 percent in many cities over the course of the 1800s. The reformer Robert Hartley began to research this troubling trend in the 1830s. He observed that even though poverty was a major problem in European cities, infant mortality had been steadily declining in Europe during the same period it had been increasing in the U.S. In London, for example, the mortality rate of children younger than five fell from 74.5 percent in 1729 to 31.8 percent in 1829. Yet the causes normally associated with mortality, such as overcrowding, extreme poverty, and malnutrition, were much less prevalent in American cities than in European ones.

Hartley determined that the important difference was milk. Europeans drank bad milk, but much less of it. The average European family could only afford a quart of milk a week. In American cities, an average family went through a quart of milk a day. Additionally, Hartley believed that the brewery slop fed to European cows was less damaging than the distillery slop fed to American cows.

The idea that milk was the cause of many disease outbreaks began to take hold. When an outbreak of typhoid occurred in Oakland in 1893, the source was traced to a local dairy farm. Although the cows were found to be in good condition, and none tested positive for disease, the New York Times concluded, “nothing can be proved, it is true, but the probability is that the ravages of typhoid in Oakland were due to this milk.”

Two movements began to develop as milk was increasingly linked to disease outbreaks and infant mortality. One was the move toward supplying pasteurized milk, led primarily by the philanthropist Nathan Straus, who donated pasteurization stations to many cities. The other was the campaign for “certified” milk, which would come from dairies that had been inspected and certified as sanitary and disease free. The certified milk movement attacked the problem at its source, requiring farmers to test their cattle for tuberculosis, keep the cows in healthy condition, milk them in a sanitary manner, and transport the milk in clean, refrigerated vehicles. The concern extended beyond the cattle: investigators found that dairy employees were frequently ill, making them a source of contamination that could spread disease to consumers.

At first many saw pasteurization as a stopgap measure until the quality of dairies, transportation methods, and distribution centers could be properly regulated and monitored. Regulations passed in Chicago in 1912 created two grades of milk, “inspected” and “pasteurized,” with the understanding that pasteurized milk was of lower quality than milk inspected and certified disease-free. Legislators were wary of pasteurized milk; one said it could mean “cooked dirt, cooked dung [… and that] a false sense of security is conveyed by the term ‘pasteurized.’” The majority of medical professionals favored certified milk, believing it would contain very few dangerous pathogens.

The certified milk movement could not tackle the problem quickly enough, however. Even as cities experimented with a mixture of certification standards and pasteurization, they were beset by a wave of epidemics traced to milk. Outbreaks of tuberculosis, hoof-and-mouth disease, and infantile paralysis (polio) pushed health commissioners to take harsher stances on milk regulation, and mandatory pasteurization was implemented first as a temporary measure, then made permanent as it was credited with limiting outbreaks of disease.

Although many doctors remained convinced of the superior quality of certified milk, and many consumers objected to the flavor of pasteurized milk, pasteurization became a social cause taken up by public health advocates and infant welfare societies. As the epidemics slowed, the idea that unpasteurized milk was inherently dangerous took hold, and children were taught ditties that ingrained in them the dangers of raw milk. Pasteurized milk was finally accepted, and then expected, by consumers.
