
The True Scope of the Disaster in Puerto Rico

The U.S. flag, next to a damaged Puerto Rican flag, flies in the municipality of Yabucoa.

Just about nobody believes Puerto Rico’s official death toll for Hurricane Maria. Researchers and journalists alike generally accept that the island’s tally of 64 people killed by the storm last September is a massive undercount, so obviously inaccurate that the Puerto Rican government has agreed to review and revise its figures. But with Puerto Rico still in disarray—from the storm’s casualties, population changes from migration, and the absence of basic services—information on the complete human cost of the catastrophe is still woefully incomplete.

What little is known, however, portends a grim conclusion: that Hurricane Maria is one of the most significant and destructive natural disasters in recent American history.

A new study in The New England Journal of Medicine, conducted in part by researchers at Harvard University, sheds new light on what’s really happened on the island. The team found that there were over 4,600 deaths potentially attributable to the hurricane, a 70-fold increase over official estimates. The survey also measured high rates of migration among people displaced by the storm and, after it passed, long periods where residents faced a loss of basic services.

As I spent time reporting from Puerto Rico three weeks after Maria, two things became clear: The storm had a staggering impact on the island, and it was almost impossible to translate that impact to observers on the mainland. People are used to gauging the scale of far-off events by relying on official estimates of death tolls, dollar amounts of damages, and the like. But in the immediate chaos following the storm, the “official” story was clearly inadequate. Some residents just went missing. Some got swept away in floods. Entire branches of extended families went silent. Mudslides and floods essentially turned remote places in the island’s mountainous interior into islands in their own right. In my attempts to assess the human burden of the hurricane, I asked everyone I interviewed—over two dozen people—if they knew someone who’d disappeared, died, or had fled to the mainland. Each person told me “yes.”

Official counts are obviously more difficult to perform than my anecdotal one, and not just because of scale: Further complicating the picture are mismatched systems in hospitals and morgues that might double-count some victims or misidentify others, as well as tough decision-making over just what counts as a hurricane-related death. In its survey of over 3,200 Puerto Rican households, the team behind the new study tried to get around those difficulties by asking families directly about the deaths of loved ones.

Using Machine Learning Tools to Gain New Insights from Earthquake Data

Scientists at Columbia University have developed a new way to study earthquakes. Using machine learning algorithms, they picked out different types of earthquakes from three years of recordings. According to the researchers, these methods pick out very subtle differences in the raw data that seismologists are just learning to interpret.

The scientists focused on earthquake recordings from The Geysers in California, one of the world’s oldest and largest geothermal fields. They assembled a catalog of 46,000 earthquake recordings, each represented as energy waves in a seismogram. They then mapped changes in the waves’ frequency through time, which they plotted as a spectrogram: a kind of musical roadmap of the waves’ changing pitches, were they to be converted to sound.
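As a rough sketch of this step, a raw seismogram can be turned into a spectrogram with a short-time Fourier transform. The snippet below uses only NumPy; the sampling rate, window length, and synthetic trace are illustrative assumptions, not values from the study.

```python
import numpy as np

def spectrogram(trace, fs=100.0, win=256, hop=128):
    """Map a 1-D seismogram to a (frequency x time) power spectrogram.

    fs  : sampling rate in Hz (illustrative, not from the study)
    win : samples per short-time window
    hop : step between successive windows
    """
    window = np.hanning(win)
    frames = [trace[i:i + win] * window
              for i in range(0, len(trace) - win + 1, hop)]
    # One-sided FFT of each window; power = squared magnitude.
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    return freqs, spec.T  # rows: frequency bins, columns: time windows

# A synthetic 10-second "quake": a chirp whose pitch rises over time.
t = np.linspace(0, 10, 1000)
trace = np.sin(2 * np.pi * (5 + 2 * t) * t)
freqs, spec = spectrogram(trace)
print(spec.shape)  # (frequency bins, time windows)
```

Each column of `spec` is the spectrum of one short window, so reading the columns left to right shows how the signal’s frequency content evolves, which is exactly what the plotted spectrogram visualizes.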

Seismologists ordinarily dissect seismograms to evaluate a quake’s size and where it started. Looking instead at an earthquake’s frequency content allowed the researchers to apply machine-learning tools that can pick out patterns in music and human speech with minimal human input. With these tools, the researchers reduced each earthquake to a spectral “fingerprint” reflecting its subtle differences from the other quakes, and then used a clustering algorithm to sort the fingerprints into groups.

Using these machine learning algorithms, they found that repeating patterns of earthquakes appear to match the seasonal rise and fall of water-injection flows into the hot rocks below, suggesting a link to the mechanical processes that cause rocks to slip or crack, triggering an earthquake. The clustering also tied the signals to the fluctuating amounts of water injected below ground at The Geysers during the energy-extraction process, giving the researchers a possible explanation for why the computer grouped the signals as it did.

Felix Waldhauser, a seismologist at Lamont-Doherty, said, “The work now is to examine these clusters with traditional methods and see if we can understand the physics behind them. Usually, you have a hypothesis and test it. Here you’re building a hypothesis from a pattern the machine has found.”

Scientists noted, “These methods could also help reduce the likelihood of triggering larger earthquakes — at The Geysers, and anywhere else fluid is pumped underground, including at fracking-fluid disposal sites. Finally, the tools could help identify the warning signs of a big one on its way — one of the holy grails of seismology.”

The research grew out of an unusual artistic collaboration. As a musician, Benjamin Holtzman, a geophysicist at Columbia’s Lamont-Doherty Earth Observatory, had long been attuned to the strange sounds of earthquakes. With sound designer Jason Candler, Holtzman had converted the seismic waves of recordings of notable quakes into sounds, and then sped them up to make them audible to the human ear. Their collaboration, with study coauthor Douglas Repetto, became the basis for Seismodome, a recurring show at the American Museum of Natural History’s Hayden Planetarium that puts people inside the earth to experience the living planet.

As the exhibit evolved, Holtzman began to wonder if the human ear might have an intuitive grasp of earthquake physics. In a series of experiments, he and study coauthor Arthur Paté, then a postdoctoral researcher at Lamont-Doherty, confirmed that humans could distinguish between temblors propagating through the seafloor or more rigid continental crust and originating from a thrust or strike-slip fault.

Encouraged, and looking to expand the research, Holtzman reached out to study co-author John Paisley, an electrical engineering professor at Columbia Engineering and member of Columbia’s Data Science Institute. Holtzman wanted to know if machine-learning tools might detect something new in a gigantic dataset of earthquakes. He decided to start with data from The Geysers because of a longstanding interest in geothermal energy.

Paisley said, “It was a typical clustering problem. But with 46,000 earthquakes it was not a straightforward task.”

Paisley’s solution was a topic-modeling algorithm that picks out the common frequencies in the dataset. Applying a second algorithm, the team identified the most common frequency combinations in each 10-second spectrogram and used them to compute its unique acoustic fingerprint. Finally, a clustering algorithm, without being told how to organize the data, grouped the 46,000 fingerprints by similarity.
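The article does not name the exact algorithms, so the following is only a minimal sketch of the pipeline’s shape: each spectrogram is collapsed into a spectral fingerprint (here a simple time-averaged, normalized spectrum stands in for the topic-model step), and a plain k-means loop then groups the fingerprints by similarity without labels. The two synthetic “quake types” are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def fingerprint(spec):
    """Collapse a (frequency x time) spectrogram into a fingerprint:
    the time-averaged power per frequency bin, normalized to unit
    length. (A crude stand-in for the topic-model step.)"""
    f = spec.mean(axis=1)
    return f / np.linalg.norm(f)

def kmeans(X, centers, iters=50):
    """Plain k-means: group fingerprints by similarity, no labels given."""
    k = len(centers)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each fingerprint to its nearest center.
        d = np.linalg.norm(X[:, None] - centers[None, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned fingerprints.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two invented "quake types": power concentrated in low vs. high bins.
low_profile = np.array([5.0, 5.0, 5.0, 5.0, 0.1, 0.1, 0.1, 0.1])
high_profile = low_profile[::-1]
specs = [np.outer(p, np.ones(10)) + 0.1 * rng.random((8, 10))
         for p in [low_profile] * 20 + [high_profile] * 20]

X = np.array([fingerprint(s) for s in specs])
# Seed the two centers with one example of each type for a stable demo.
labels = kmeans(X, X[[0, -1]].copy())
print(labels)  # first 20 events in one cluster, last 20 in the other
```

On real data the fingerprints would come from 46,000 spectrograms rather than 40 toy ones, but the overall flow (spectrogram, fingerprint, cluster) is the same.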

When the researchers matched the clusters against average monthly water-injection volumes across The Geysers, a pattern jumped out: a high injection rate in winter, as cities send more runoff water to the area, was associated with more quakes and one type of signal, while a low summer injection rate corresponded to fewer quakes and a different signal, with transitional signals in spring and fall.
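That kind of seasonal matching can be quantified with a correlation coefficient between monthly injection volumes and the monthly count of quakes assigned to a given cluster. All numbers below are invented for illustration; they are not data from The Geysers.

```python
import numpy as np

# Invented monthly figures (Jan-Dec): higher injection in winter,
# lower in summer -- NOT real measurements from The Geysers.
injection = np.array([9.0, 8.5, 7.0, 6.0, 5.0, 4.0,
                      3.5, 4.0, 5.0, 6.5, 8.0, 9.0])  # e.g. million gallons
quakes = np.array([310, 295, 260, 230, 200, 170,
                   160, 175, 205, 240, 280, 305])     # events in one cluster

# Pearson correlation: values near +1 mean the two series rise
# and fall together over the year.
r = np.corrcoef(injection, quakes)[0, 1]
print(round(r, 3))
```

A strong positive correlation like this would support, though not prove, the link between injection volume and the quake type the clustering isolated.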

Now, scientists are planning to apply these methods to recordings of other naturally occurring earthquakes.
