Elizabeth Street, Brownsville, Texas, September 1967 [National Oceanic and Atmospheric Administration, https://commons.wikimedia.org/wiki/File:Wea00713.jpg]
This Week in Environment and Health
September has been brutal. Barely a heartbeat passed before Hurricane Harvey’s destruction of Houston was followed by Irma’s devastation of entire Caribbean islands and the Florida coast. Irma was declared a “500-year” rain event, while Harvey ranks as a “1000-year” flood event. And they happened within two weeks of each other. (The storms mark the first time the U.S. has been hit by two Category 4 storms in the same year, let alone the same month.) Scientific consultants and environmentalists were called “anti-development” when they warned Houston civic leaders about development that erased important wetlands and expanded housing in the floodplain, and about resistance to building code reforms. The storms have also ignited a national conversation about climate change. One Florida resident tweeted in response to EPA chief Scott Pruitt’s charge that discussing climate change while hurricanes struck communities showed “insensitivity” to Floridians: “This is bullshit. . . . I just evacuated. I f'ing want us all talking about climate change.” We can no longer avoid this conversation.
This September, however, also marks the fiftieth anniversary of a different storm: Hurricane Beulah. Beulah made landfall in Brownsville, Texas, on September 20, 1967. The storm caused severe flooding and roughly $1 billion in damage, second only to Hurricane Betsy at that time. It also spawned more than 100 tornadoes, which accounted for most of the storm’s more than 650 deaths. The National Weather Service considered Beulah one of the worst Gulf storms of the century, and the name was retired. For people living in the Rio Grande Valley the hurricane marked a watershed moment, highlighting the healthcare needs of the region.
Dr. Mario Ramirez emerged as one of the heroes in the aftermath of Beulah, leading medical care efforts in Rio Grande City and ministering to both American and Mexican refugees. Mexican organizations sent aid to Valley residents: mobile aid trucks stocked with radios and spare parts, and driven by mechanics, helped areas hit by Beulah. (Contrast this with the current-day border patrol’s decision to maintain border checkpoints.) Ramirez also took President Lyndon B. Johnson and Texas governor John Connally on their tour of the area as they surveyed the damage. Residents considered the hurricane a turning point, noticing changes in their local environments – the disappearance of horned frogs, for instance – although they remain less clear about more significant transformations.
So what do we know fifty years later? We know that such storms affect the natural and built environments in ways that simple economic calculations obscure. Conjunto legend Gilberto Perez wrote a corrido, a traditional Mexican ballad – “Las Crecientes de Beulah” – immortalizing the hurricane. As a consequence of climate change, we know that more of these storms will happen. So studying how people were affected and responded, and what changes local, state, and national agencies made, becomes all the more important as we face more such severe weather. And good historians will recover the voices and experiences of ordinary people dealing with extraordinary circumstances. The “behemoth of a storm” and others like it tell us much about how we used to deal with disaster and suggest ways we should respond in the future. In places like the Valley, which have undergone tremendous population growth, disaster planning depends on studying past examples (and failures). Reminding ourselves of past crises is the first step in changing consciousness and making different choices, choices that better address who pays the price for the decisions made. Remembering reminds us that things can get worse, and that an ounce of prevention may be worth a pound of cure.
By Christopher Sellers [posted by Amy Hay]
With our current president targeting the EPA for the biggest budget cut of any major federal agency (31%), and with climate skeptics now in charge not just of the Executive Branch but of Congress, the anti-environmentalism wave of the mid-2010s has reached a historic high-water mark. This latest surge has taken much of the American environmental community—including environmental historians—by surprise. It shouldn’t have, for what Sam Hays dubbed the “environmental opposition” has a long, inventive, and obdurate history in the United States.
Before modern environmentalism itself first took shape, predecessor movements both inside the U.S. and beyond certainly aroused their share of opposition. Timber barons as well as local trappers and hunters, including Native Americans, resisted early park-making. Public health advocates clashed with corporations insistent upon their right to pollute streams or the air. These precedents echo in today’s proposals by the Trump administration to rescind national monument designations and to reconsider recent rules for mercury and other air toxics. But only when a single movement began bundling all these issues together as “environmental,” especially from the 1960s onward in the United States, did a modern anti-“environmental” politics first arise.
As I have argued in a recent piece in Vox, this opposition first found electoral traction during the 1970s through Western as well as Southern coalitions forged between suburban and rural voters. A Sagebrush Rebellion erupted among western miners, ranchers, and other rural property-owners against federal environmental rules for rural lands. They were joined by politicians like Anne Gorsuch, representing suburban Denver in the Colorado legislature, who extended her own animus against government interference to regional planning.
Gorsuch would go on to head the EPA from 1981 to 1983 during the presidency of Ronald Reagan. Along with James Watt at the Department of the Interior, she launched an assault on federal environmental agencies that is the closest precedent in American history to what we are now seeing under President Trump. Notably, Trump also successfully nominated her son to become the latest and youngest member of the Supreme Court.
In the South, meanwhile, Newt Gingrich began running for Congress in a rural-suburban district in Georgia in 1972. He first ran as both a Republican and an environmentalist, and lost; in 1978, when he set aside his environmentalist allegiances, he began winning. Over the next fifteen years, Gingrich’s stand-offishness toward environmental issues would help energize a Republicanization of white suburban and rural voters in Georgia. Gingrich’s 1994 Contract with America, which helped the Republicans take over Congress, led to further legislative pushes to undermine many federal environmental laws. Among their accomplishments was the 1996 Congressional Review Act, which today’s Congress is using to try to revoke many Obama-era rules.
Other important trends have also helped nourish anti-environmentalism’s rise. As the environmental movement has become more racially diverse via an environmental justice movement, white environmentalists have also identified increasingly with gentrifying downtowns and “walkability.” Especially in places like Georgia, as environmental issues have become increasingly coded as black and urban, redistricting and gerrymandering have further amplified the use of anti-environmentalism as a political wedge for mobilizing white rural and suburban votes. The growing prominence of climate and other global environmental issues has also made environmental causes in the U.S. more vulnerable, in the view of some social scientists. Suggesting that this global turn may have come at the expense of more local environmental concerns, they point to the response it has provoked: a “heightened level of anti-environmental activity” generated “by the conservative movement [including think tanks] and Congressional Republicans.”
The successes of today’s anti-environmentalism will likely attract much further study by future historians, and our understanding will deepen. But first, environmental historians need to take today’s crisis as a wake-up call. We need to reconsider our decades-long declaration of independence from political history, which has effectively pushed politics to the margins of what we consider “real” environmental history.
By Kate Brown
“Chernobyl,” a Ukrainian journalist once quipped, “is something we remember once a year.”
On April 26, 2017, the global news media will dutifully remember the 1986 explosion at the Chernobyl Nuclear Power Plant in northern Ukraine, then a republic of the Soviet Union. The 31st anniversary will likely differ little from earlier anniversaries. The public will likely see haunting photos of the abandoned city of Pripyat, and beautiful photos of wolves and moose in the Chernobyl ecological reserve, created inside the evacuated 30-kilometer ring around the plant in the weeks after the disaster. Photos of animals in the Zone are often used to argue that nature rights itself even after the world’s worst nuclear disaster. Nature, in this formula, includes humans. “Life has returned to normal,” the website of the International Atomic Energy Agency states, “in much of the affected area, and people are carrying on with their daily activities.”
Traveling along the southern edge of the Chernobyl Zone of Alienation this summer, I noticed that residents of contaminated regions had adapted to their precarious existence not by returning to dairy farming and raising pigs and cattle, as they had before the accident, but by harvesting the most radioactive products in their environment—wild berries and mushrooms—which are then sold into European markets as wild and organic. In 1986, the European Community established a high emergency permissible norm for food products of 600 becquerels per kilogram. Emergency norms are supposed to be temporary, serving as guidelines until the emergency is deemed over. Because of differences of opinion and inaction, the European Parliament never lowered the emergency norm.
Meanwhile, in contaminated regions of Ukraine and Belarus, no official ever recommended re-starting the regional economy by means of the sale of radioactive forest products. Selling radioactive berries to EU markets occurred outside any rational plan. Locals cannot afford the big investments needed for dairy or wheat farming. All they require to pick berries is a plastic bucket and a ride to the forest. And the returns are big. Pickers can make $25 a day selling berries, half the monthly salary of a public school teacher.
It is hard to know the biological cost of this exchange of Chernobyl-contaminated berries for euros because there has been little public discussion and almost no medical research on the long-term, low-dose ingestion of radioactive isotopes. When commenting on health effects from the disaster, most reporters use information provided by the World Health Organization website. The WHO information sheet on Chernobyl health effects states that 28 emergency workers died of radiation sickness soon after the accident, that there was a rise in leukemia among emergency workers, and that a large fraction of the 6,000 cases of thyroid cancer among children were likely attributable to radioactive iodine intake. That is the sum total of reported damage. The WHO website cites studies no later than 2006, when it published with the International Atomic Energy Agency the Chernobyl Forum Report. When the Forum Report was published, researchers and activists were outraged by what they saw as a minimization of Chernobyl’s health effects. Greenpeace estimated that a million people would be harmed by Chernobyl radiation. The Ukrainian government last year put the death toll among Ukrainians from Chernobyl at 150,000. Unfortunately, however, the Chernobyl Forum Report, despite the controversy, has managed to become the consensus assessment of Chernobyl’s health effects.
A person doesn’t have to look far, however, to see evidence that more was going on than the 28 original deaths and 6,000 cases of childhood thyroid cancer. I found in archival records that UN agencies, including the WHO and the IAEA, worked in the early nineties to gain control over Chernobyl health assessments and minimize them. UN staffers and UN-appointed scientific experts denied and disappeared evidence they had verified as they fought off recognition, from 1990 to 1996, of a childhood thyroid cancer epidemic. I found in Ministry of Health records of the former USSR a great wash of documentation indicating that the regions most contaminated by Chernobyl fallout experienced, in the first five years after the accident, a slowly advancing public health disaster. Regional public health officials tracked an alarming rise in several disease categories. Adults and children suffered from a wide range of puzzling problems—from dizziness, dry mouth, headaches, and nose bleeds to chronic disease of the digestive tract and the respiratory, circulatory, and immune systems. More women than before had trouble getting pregnant, staying pregnant, and having babies that thrived. That evidence is now thirty years old. A survey of more recent research tracks with some of these findings.
Wladimir Wertelecki, a medical geneticist at the University of California, San Diego, and his colleagues in the Rivni Province of Ukraine have been tracking birth defects since 2000. The Rivni Province, 200 miles west of Chernobyl, was less contaminated than the regions immediately surrounding the plant. Wertelecki and his colleagues followed all 96,438 births in the province from 2000 to 2010. They found the province as a whole had rates of neural tube birth defects, microcephaly, and microphthalmia that were among the highest in Europe. In the most contaminated northern margins of the province the rates were higher still: instead of 18.3 birth defects for every 10,000 live births, they recorded 27.0. In a later study, Wertelecki’s group found that the whole-body counts of cesium 137 in people in the northern Polesian region of Rivni are higher than officially permissible upper limits.
Other researchers have found that children in Chernobyl regions have unusually high rates of irritable bowel syndrome, which might also be linked to immune system disorders. Harvard researchers found a significant elevation of dental cavities in children in contaminated regions of Ukraine over those in non-contaminated regions, caused, they concluded, by a permanent decrease in salivary production and flow. Another study showed a significant association between childhood exposure to cesium 137 in local soils and decreased lung capacity and increased airway reactivity, caused by damage to developing lungs from frequent childhood bronchial infections. The authors of these small and sparsely funded Chernobyl studies all comment that few or no studies have been done on the subjects they examine.
Historians Susanne Bauer, Christopher Sellers, and Hiroshi Ichikawa have written about how Soviet scientists during the Cold War took a different trajectory than their western counterparts. Bauer records how Soviet researchers used the human body as a tool to record environmental damage. Sellers argues Soviet toxicologists were less attached to industry science and more interested in protecting workers in factories. Ichikawa finds that Soviet scientists, mobilized by the damage they saw among people near Soviet testing grounds and nuclear weapons sites, emerged as important activists internationally to end nuclear testing. Soviet doctors, these historians find, were performing more sensitive, more body-centered research on industrial toxins than their counterparts in the West. Perhaps it is time to take another, more serious look at the Soviet Chernobyl medical records to reach a new consensus on the disaster’s effects.
M. R. Sheikh Sajjadieh, L. V. Kuznetsova, and V. B. Bojenko, “Low internal radiation alters innate immune status in children with clinical symptom of irritable bowel syndrome,” Toxicology and Industrial Health 26 (2010): 525-31.
K. Spivak, C. Hayes, and J. H. Maguire, “Caries prevalence, oral health behavior, and attitudes in children residing in radiation-contaminated and non-contaminated towns in Ukraine,” Community Dentistry and Oral Epidemiology 32 (2004): 1-9.
Erik R. Svendsen et al., “Cesium 137 Exposure and Spirometry Measures in Ukrainian Children Affected by the Chernobyl Nuclear Incident,” Environmental Health Perspectives 118, no. 5 (May 2010).
Susanne Bauer, “Mutations in Soviet Public Health Science: Post-Lysenko Medical Genetics, 1969–1991,” Studies in History & Philosophy of Biological & Biomedical Sciences 47 (September 2014): 163-72.
Christopher Sellers, “The Cold War over the Worker’s Body: Cross-National Clashes over Maximum Allowable Concentrations in the Post-World War II Era,” in Toxicants, Health and Regulation since 1945, ed. Soraya Boudia and Nathalie Jas (London: Pickering and Chatto, 2013), 24-45.
Hiroshi Ichikawa, “Radiation Studies and Soviet Scientists in the Second Half of the 1950s,” Historia Scientiarum 25, no. 1 (2015): 78-93.
By Dawn Biehler, 19 December 2016
On December 16, 2016, the U.S. Centers for Disease Control and Prevention announced that an advanced form of coal workers’ pneumoconiosis, or black lung disease, seems to be resurging among miners in Appalachia. An investigation by National Public Radio released last week also found nearly one thousand people with current or past occupational exposure suffering from progressive massive fibrosis (PMF) across eleven clinics in four states since 2011. This cluster represents a dramatic spike in reported morbidity: the Coal Workers’ Health Surveillance Program identified only 31 cases of this potentially fatal form of black lung in the entire US during the 1990s.
Measures for preventing and monitoring black lung have long been inadequate. After industry denied its culpability for workers’ sickness and death for decades, the 1969 Federal Coal Mine Health and Safety Act offered free periodic health screenings to coal miners, but few have participated in this voluntary program. The CDC found that only 17% of Kentucky miners took advantage of the free screenings over the past five years. In regions with limited economic opportunities, disclosure of diagnoses to employers through this program may result in loss of livelihood, so younger miners may avoid reporting until black lung progresses into PMF. Indeed, since 2011, only 99 cases of PMF from across the US have been reported to the surveillance program at the National Institute for Occupational Safety and Health, as compared with NPR’s finding of 962 patients across just eleven clinics, where black lung cases went largely unreported to the national program.
The CDC’s report focuses on the breakdown of surveillance of workers’ bodies, but we must also understand why mine environments remain unsafe, in spite of regulations that have tightened ambient dust standards. Historians of environmental health have much to offer here. According to Alan Derickson’s Black Lung: Anatomy of a Public Health Disaster, the state and industry denied that exposure caused disease for much of the early to middle twentieth century, often with the support of company physicians. Along with Derickson’s book, Barbara Ellen Smith’s Digging Our Own Graves chronicled activist miners’ struggles to obtain information from unions, secure federal intervention, and legitimize their own embodied experiences of disease and environment. Their actions resulted in the adoption of the Coal Act of 1969, and of new safety procedures and equipment to limit exposure to dust from coal and the rock in which coal is embedded. Derickson and Smith’s stories about the complicity of the state in corporate exploitation, and the importance of activists in resisting exploitation and defining disease, should give us pause in this moment of weakened worker power and as a new presidential administration promises policies friendly to the coal industry.
Furthermore, Arthur McIvor’s and Ronald Johnston’s oral histories of mine workers in Britain show that mine environments remained unhealthy in part because new equipment functioned poorly. Foremen continued to demand high productivity even as new safety procedures slowed workers down. Male breadwinners in this highly gendered occupation, in communities where mine work defined men’s identities, feared loss of take-home pay and machismo if they could not keep up. Although from a different national context, these stories presage some of the dynamics of the latest chapter of black lung’s story in Appalachia.
Current miners in the US complain that respirators and water sprays are clumsy and cut into their productivity. We also know that industry inspectors for many years forged results of federally-mandated air quality tests. This is also a story about the depletion of a natural resource: coal companies have already mined the thickest coal seams, and are now sending miners after thinner seams, where they must drill through quartz and other minerals, increasing exposure to not only coal but also rock dust. Finally, the loss of coal mining jobs in the US may be bringing about a surge among former miners seeking care as laid-off workers realize they have only three years to claim federal benefits. As miners seek care, often long after the damage is done, their advanced cases are straining the program that provides compensation for those who develop black lung. History can thus help place the story of miners’ vulnerable bodies in the context of our current moment of economic, political, and environmental change.
Last week, the World Health Organization declared that Zika virus is no longer a global public health emergency. While last year’s outbreaks and Zika’s rapid spread to new areas prompted “urgent and coordinated” research and action under the International Health Regulations, authorities have agreed to transition to a more sustained response to address the continued threat of Zika. WHO still stresses that Zika remains a serious public health problem.
It is too soon to know yet whether health authorities at multiple scales will continue to support a strong response to Zika now that WHO has declared the emergency over, though observers have expressed fear of reduced funding for research. The history of public health and infrastructure suggests some patterns to watch out for. Outbreaks such as this have often been symptoms, at least in part, of long-term official neglect of sanitation, housing, or water infrastructure. Consider, for example, the 1993 epidemic of cryptosporidiosis in Milwaukee, WI, most frequently blamed on poorly-maintained water filtration systems.
While Milwaukee’s outbreak was quite generalized throughout the population, many others strike poor neighborhoods and regions the hardest as infrastructure degrades because of injustices beyond the control of resident communities. In such cases, diseases become a constant part of the health landscape – much as Zika appears to be doing. Tuberculosis plagued black neighborhoods in Baltimore in the early twentieth century as racial segregation practices crowded African-Americans into the city’s most depreciated housing. Similarly, Zika has afflicted Brazilian favelas and their counterparts in other countries the most severely – poor neighborhoods where inadequate municipal sanitation leaves trash that collects standing water where Aedes aegypti mosquitoes breed. Streets and informal housing may be riddled with pockets that also collect water, and holes that allow Aedes inside. Communities in Latin America have struggled for decades with other viruses borne by the same mosquito, notably dengue fever, chikungunya, and yellow fever.
Since 2012, I have been the environmental historian/urban geographer on a research team examining the distribution of a related mosquito, Aedes albopictus, in Baltimore. A. albopictus has yet to transmit a local case of Zika in Baltimore (there have been travel-related cases), but decades of racial segregation and infrastructure neglect have cast a long shadow on mosquito ecology here. Illegal trash dumping, inadequate waste collection, and most of all abandoned buildings are prevalent in neighborhoods deprived of loans for home maintenance by redlining in the 1930s. These areas have three times as many mosquitoes as mostly-white neighborhoods privileged with fairly constant levels of investment in infrastructure.
Reflecting on the history of outbreaks from typhoid to smallpox to SARS, Judith Walzer Leavitt and Lewis Leavitt have challenged governments and the public to seize upon infectious disease crises as opportunities to reinvest in robust public health infrastructure, broadly conceived, for the benefit of all. Global and local societies have left their poorest members to live with persistent threats to health and well-being. Zika’s continued threat presents yet another chance to funnel fear of disease into support for communities from Brazil to Baltimore.
ProPublica investigates Dr. Alvin Young, the Air Force specialist charged with overseeing Agent Orange research - testing, application, side effects - and his role in denying veterans' benefits for their exposure to Agent Orange herbicides, which included a dioxin contaminant.
One year later . . .
CNN goes to Flint, Michigan, a year after the water crisis broke
Florida regulators have voted to raise the allowable levels of toxins in drinking water