March 24, 2010 By Elaine Pittman
Photo: The magnitude 6.7 Northridge earthquake caused $2.5 billion in damage to roads and buildings, according to FEMA. Photo courtesy of FEMA
Earthquakes are unlike hurricanes and floods because they can't be predicted. But what if there were a warning system to alert residents that the ground would start shaking in 10 to 15 seconds? Although that doesn't sound like much time, even 15 seconds would give people the opportunity to improve their safety and prevent data loss. This is what seismologists and the U.S. Geological Survey (USGS) are trying to provide California residents -- a publicly available earthquake early warning system.
"The idea is that you detect the beginnings of the earthquake, and then you rapidly assess the magnitude that earthquake poses, and provide a warning to people before the shaking starts," said Richard Allen, seismology professor at the University of California (UC), Berkeley. "We're talking about very short periods of time -- a few seconds to a few tens of seconds."
The combination of new technologies and expanded understanding about earthquakes is letting seismologists move closer to issuing public alerts before people feel the first tremor. "There's a recognized need for more rapid earthquake information, particularly in our digital age," said David Oppenheimer, seismologist for the USGS. "There are certain applications that could be used with earthquake early warning to mitigate the impacts of an earthquake. That's our mission."
Earthquake early warning isn't a new idea. On Oct. 1, 2007, the Japan Meteorological Agency launched the most advanced early warning system to date, which provides alerts through media outlets and Internet applications when an earthquake is detected. Systems also have been established in Mexico City, Turkey, Taiwan and Romania, Allen said.
Early warning systems detect primary waves, the first tremors of an earthquake, which travel at roughly 1 to 5 miles per second through the Earth's crust, according to the Nevada Seismological Lab at the University of Nevada. The damage comes from the next round of waves, called shear or secondary waves, which can topple buildings and cause the destruction associated with large temblors. These "s-waves" move slower than their predecessors, which provides a window of time in which the public could be alerted. Because the two types of waves travel at fairly constant speeds, seismologists can estimate when the ground will begin shaking in a given area.
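The arithmetic behind that warning window can be sketched in a few lines: the farther a site is from the epicenter, the larger the gap between the fast P-wave (which triggers the alert) and the slower, damaging S-wave. The wave speeds below are illustrative averages chosen for the example, not values from the article, and the function name is hypothetical.

```python
# Illustrative sketch of the early-warning window: seconds between the
# P-wave reaching sensors and the damaging S-wave reaching a site.
# Speeds are assumed averages within the 1-5 miles-per-second range cited.
P_WAVE_SPEED = 3.5  # miles per second (primary wave, detected first)
S_WAVE_SPEED = 2.0  # miles per second (shear wave, causes the damage)

def warning_window(distance_miles):
    """Return the seconds of warning available at a given distance
    from the epicenter, assuming instantaneous detection and alerting."""
    p_arrival = distance_miles / P_WAVE_SPEED  # when the alert can go out
    s_arrival = distance_miles / S_WAVE_SPEED  # when strong shaking arrives
    return s_arrival - p_arrival

# A site 50 miles from the epicenter gets roughly 10-11 seconds of warning.
print(round(warning_window(50), 1))
```

The sketch also shows the system's inherent limitation: sites very close to the epicenter get little or no warning, since the two arrival times nearly coincide.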
In August 2006, the USGS funded a $900,000 project to take the algorithms that various groups had developed for earthquake early warning and get them running on real-time seismic systems. The effort is a collaboration among the USGS, the Swiss Seismological Institute, the Southern California Earthquake Center and UC Berkeley. Seismologists monitored the performance of three algorithms running statewide in California, according to Allen. He said the algorithms constantly detected earthquakes and accurately predicted the shaking from the two largest quakes of the three-year test period -- a pair of magnitude 5.4 earthquakes, one in the San Francisco Bay Area and one in the Los Angeles region.
The initial test period proved the concept was technically feasible. In August 2009, phase two began as another three-year project funded by the USGS for $1.2 million. "The goal of this three-year project is to develop a prototype warning system that actually provides warning to a small group of users, maybe 10 institutional users," Allen said. "So we're currently in the process of taking the best from each of these three algorithms and combining them into a single algorithm that will form