In the late 1980s, a Japanese scientist named Koji Minoura stumbled on a medieval poem that described a tsunami so large it had swept away a castle and killed a thousand people. Intrigued, Minoura and his team began looking for geological evidence of the tsunami beneath rice paddies, and discovered not one but three massive, earthquake-triggered waves that had wracked the Sendai coast over the past 3,000 years.
In a 2001 paper, Minoura concluded that the possibility of another tsunami was significant. But Tokyo Electric Power was slow to respond to the science, leaving the Fukushima Daiichi nuclear power plant unprepared for the 15-meter wave that inundated it in 2011. The result was a $188 billion natural disaster. More than 20,000 people died.
For the past several decades, paleo-hydrologist Victor Baker of the University of Arizona has been using techniques similar to Minoura’s to study the flood history of the Colorado Plateau. Like Minoura, he’s found that floods much larger than any in recorded history are routine occurrences. And like Minoura, he feels his research is being largely ignored by agencies and public utilities with infrastructure in the path of such floods.
Earlier this month, when a spillway at the nation's tallest dam, in Oroville, California, nearly buckled under the pressure of record rainfall, the consequences of underestimating flood risks came into sharp relief. Dams aren't built to withstand every curveball nature can throw, only the weather events that engineers deem likely enough to occur within a dam's lifespan. When many Western dams were built in the mid-20th century, the best science for determining such probabilities came from historical records and stream gauges.
But that record only stretches back to the late 1800s, a timespan Baker calls “completely inadequate.” Today, technology allows scientists to reconstruct thousands of years of natural history, giving us a much clearer picture of how often super-floods occur. “The probability of rare things is best evaluated if your record is very long,” Baker explains.
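Why record length matters comes down to simple arithmetic: a "100-year flood" is just a flow with a 1 percent chance of occurring in any given year, and a short record can easily contain no example of one at all. Here is a minimal sketch of that arithmetic, assuming independent flood years (a standard simplification) and round record lengths chosen only for illustration:

```python
# How likely is a record of a given length to contain at least one
# N-year flood? Assumes flood years are independent, a standard
# simplification; the record lengths below are illustrative only.
def chance_of_observing(return_period: float, record_years: int) -> float:
    annual_prob = 1.0 / return_period          # e.g., 1% for a 100-year flood
    return 1.0 - (1.0 - annual_prob) ** record_years

for rp in (100, 500):
    for record in (120, 3000):                 # ~gauge era vs. paleo-record
        p = chance_of_observing(rp, record)
        print(f"{rp}-year flood in a {record}-year record: {p:.0%}")
```

A 500-year flood has only about a one-in-five chance of showing up in a 120-year gauge record, but it is a near-certainty across a 3,000-year paleo-record, which is why the longer record constrains its size so much better.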
By combing the Colorado River, the Green River and other Southwestern rivers for sediment deposits and other flood evidence, then carbon-dating what he finds, Baker has concluded that the short-term record severely underestimates the size and frequency of large floods. On the Upper Colorado near Moab, Utah, Baker and his team estimated the 500-year flood at roughly 246,000 cubic feet per second (cfs), more than double the 112,000 cfs that scientists had estimated from the stream-gauge record alone. Baker's calculations put the 100-year flood at 171,000 cfs, also much greater than the previous estimate of 96,000 cfs. By comparison, the legendary flooding of 1983 and 1984 that nearly overwhelmed Arizona's Glen Canyon Dam, just downstream, peaked at 125,000 cfs. (The dam has since been bolstered, and today engineers say it can handle flows up to 220,000 cfs.)
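As a toy illustration of how such a revision happens (this is not Baker's actual method, which reads flood stages from slackwater deposits and other geologic evidence), one can fit a standard extreme-value distribution to a hypothetical gauge record with and without a few ancient peaks added. All numbers here are invented:

```python
# Toy flood-frequency sketch: fit a Gumbel (EV1) distribution to annual
# peak flows and see how a few hypothetical paleoflood peaks shift the
# estimated 100- and 500-year floods. All values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical ~120 years of annual peak flows (cfs) from a stream gauge.
gauge_peaks = rng.gumbel(loc=30_000, scale=12_000, size=120)

# Hypothetical peaks inferred from carbon-dated flood deposits.
paleo_peaks = np.array([180_000.0, 210_000.0, 250_000.0])

def t_year_flood(peaks: np.ndarray, return_period: float) -> float:
    """Fit a Gumbel distribution and return the T-year flood quantile."""
    loc, scale = stats.gumbel_r.fit(peaks)
    return stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)

for label, data in [("gauge only", gauge_peaks),
                    ("gauge + paleo", np.concatenate([gauge_peaks, paleo_peaks]))]:
    print(f"{label}: 100-yr flood ≈ {t_year_flood(data, 100):,.0f} cfs, "
          f"500-yr flood ≈ {t_year_flood(data, 500):,.0f} cfs")
```

Pooling paleo peaks directly with annual gauge peaks, as done here, is statistically crude (real paleoflood analyses treat ancient floods as censored observations over long windows), but the direction of the effect is the same: a handful of very large, very old floods pulls the estimated 100- and 500-year quantiles sharply upward.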
In California, too, super-floods may be more common than previously thought. United States Geological Survey hydrologist Michael Dettinger and UC Berkeley paleoclimatologist B. Lynn Ingram have studied the paleo-flood record across a broad swath of California and found that such floods recur at least once every 200 years, and perhaps more often. The last one was in 1862. Thousands of people died, towns were submerged and the state's economy was devastated, yet it was nowhere near the worst: one flood in the 1600s was at least twice as big.
In 2013, Dettinger and Ingram wrote in Scientific American that California was due for another huge water year. Their prediction has proven prescient. So much rain and snow has pounded California this winter that as of Feb. 21, half the state was under flood, rain or snow warnings. Creeks have overflowed their banks and flooded homes, and water managers were forced to spill excess water over Oroville Dam's emergency spillway for the first time in the dam's 49-year history. On the night of Feb. 12, the sediment-choked water began eroding a hole in the spillway, threatening to release a wall of water. More than 180,000 residents fled to higher ground.
Luckily, emergency crews were able to patch the spillway, and residents trickled back home. But Oroville isn't alone: across the country, some 2,000 dams whose failure could cause loss of life are in need of repair, according to the Association of State Dam Safety Officials. And in many ways, Californians dodged a bullet. This winter's precipitation was nowhere near as heavy as the storms Dettinger and Ingram have studied, and if Oroville's reservoir hadn't been depleted by years of drought, floodwaters could easily have overwhelmed the dam.
Does this mean dams like Oroville and Glen Canyon need to be fortified to withstand bigger storms? Officials from the Bureau of Reclamation are confident that Glen Canyon, at least, is equipped to handle even "extremely large hydrologic events." And the U.S. Army Corps of Engineers is reluctant to apply paleo-hydrology research to existing infrastructure, in part because we've altered rivers so much that some Corps scientists believe ancient flood records are no longer realistic indicators of current risks.
But Baker believes it would be foolhardy not to at least draw up contingency plans for the possible failure of some of the West's biggest dams. That Japanese officials were warned about Fukushima and didn't act is "an embarrassment," Baker says. "We may have some similar things occurring in the United States, if we don't seriously pay attention to this science."
Krista Langlois is a correspondent with High Country News.