Physicists who have developed a new method of forecasting large earthquakes didn't have to wait long for validation of their technique.
One Southern California earthquake occurred, as they'd forecast, while their research paper was awaiting publication, and another came just three days after the paper was published in the Feb. 19 issue of the Proceedings of the National Academy of Sciences.
But John Rundle of the University of Colorado, lead author of the study, isn't exactly jumping up and down with joy. He says the earthquakes gave him and his colleagues some measure of "professional satisfaction," especially since nobody was hurt, but it will be a long time before he or anyone else knows whether he is really on to something.
Rundle and Kristy Tiampo of the university's Cooperative Institute for Research in Environmental Sciences, along with colleagues at NASA's Jet Propulsion Laboratory in Southern California and the Los Alamos National Laboratory in New Mexico, say they have come up with a new way of forecasting large earthquakes, and they have laid it all out there for the rest of us to see if they are right.
Looking at the Big Picture
If they are, California is in for a lot of shaking before this decade ends, and Rundle thinks — although he quickly admits he doesn't know for sure — that he knows where those quakes are most likely to occur.
He doesn't know exactly when they will hit, and a forecast that calls for them to strike sometime in this decade is of limited use to urban planners, but Rundle says at this point it's just "a real time experiment to see if this works."
A forecast differs from a prediction: it offers the probability of a quake occurring within a designated time frame, as opposed to the far more precise call for an earthquake of a certain magnitude on a certain fault on a certain date. Seismologists would like to be that precise, but so far no one has figured out how to predict just when and where the next quake is going to hit.
So most forecasts are based on evidence that any particular fault is likely to rupture sometime in the next few years.
Rundle and his colleagues have taken a different approach. They have compiled a 70-year history of smaller earthquakes (magnitude 3 and above) throughout Southern and Central California in an effort to determine changes in seismic patterns since 1932. It is those changes, rather than the potential of any particular fault, that rule the seismic roost in any earthquake zone, according to the researchers.
The scientists divided the region into about 3,000 "boxes" measuring about seven miles on each side, resulting in a grid-like map of Central and Southern California. (Seven miles is roughly the length of the rupture along a fault required to produce an earthquake of magnitude 6.)
All earthquakes of magnitude 3 or greater were plotted on the map, along with the time they occurred. That gave the scientists a way to determine how seismic patterns had changed over time.
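The gridding step can be sketched in a few lines. This is a simplified illustration, not the researchers' actual code: the box size is approximated as a tenth of a degree, and the tiny sample catalog is invented for the example.

```python
import math
from collections import defaultdict

# One box is roughly 0.1 degree on a side, approximating the study's
# seven-mile boxes (an assumption for illustration only).
BOXES_PER_DEGREE = 10

# (year, latitude, longitude, magnitude) -- a toy catalog, not real data
catalog = [
    (1952, 35.00, -119.00, 4.1),
    (1952, 35.03, -118.96, 3.4),   # falls in the same box as the quake above
    (1994, 34.20, -118.50, 6.7),
    (1994, 34.21, -118.54, 2.9),   # below magnitude 3, excluded
]

def box_of(lat, lon):
    """Map an epicenter to the integer index of its grid box."""
    return (math.floor(lat * BOXES_PER_DEGREE),
            math.floor(lon * BOXES_PER_DEGREE))

# Count magnitude-3-and-above quakes in each box, as the study did.
counts = defaultdict(int)
for year, lat, lon, mag in catalog:
    if mag >= 3.0:
        counts[box_of(lat, lon)] += 1
```

Keeping the event times alongside the counts, as the researchers did, is what makes it possible to see how each box's activity rises or falls over the decades.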
The result: "complicated mixtures of seismic quiescence and seismic activation," Rundle says.
Most Quakes Mapped in Empty Regions
"What happens is some of these faults start to slide a little bit, leading up to an earthquake, and as they slide there are other faults in the area that have been active but they have different directions, or orientations, with respect to the one that's starting to slide," he adds. "The ones that have an unfavorable orientation will tend to get squeezed shut, and the activity on them will tend to get shut down."
So activity begins to increase on some faults, "building up toward a larger earthquake, whereas on other faults in that same region, the activity begins to decrease."
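The activation-and-quiescence idea can be caricatured with a toy calculation: compare a box's average yearly quake rate in a recent window against its earlier long-term rate, and flag boxes where the rate has swung sharply up or down. The researchers' actual method is considerably more involved; the function and numbers below are simplified assumptions.

```python
def rate_change(yearly_counts, split_year):
    """Average yearly quake rate at or after split_year, minus the rate
    before it. Positive means activation; negative means quiescence."""
    before = [c for yr, c in yearly_counts if yr < split_year]
    after = [c for yr, c in yearly_counts if yr >= split_year]
    return sum(after) / len(after) - sum(before) / len(before)

# Invented (year, quake count) histories for two hypothetical boxes
box_a = [(1970, 2), (1980, 3), (1990, 8), (2000, 9)]   # activating
box_b = [(1970, 7), (1980, 6), (1990, 1), (2000, 0)]   # going quiet

print(rate_change(box_a, 1990))  # 6.0  -> activity building up
print(rate_change(box_b, 1990))  # -6.0 -> activity shutting down
```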
All of that data was fed into a supercomputer that generated a map showing the seismic potential for all of Central and Southern California. The color-coded map shows areas where the scientists believe quakes of magnitude 5 or greater are most likely to occur during this decade.
Interestingly, if the researchers are right, most of the quakes will be in fairly sparsely populated areas, particularly the desert regions east of Los Angeles. None should occur directly below the city itself, although suburbs to the north and east may get some shaking.
One quake measuring 5.1 hit the Big Bear area in the mountains east of Los Angeles on Feb. 10, 2001, just as the scientists completed their research. A 5.1 earthquake hit the Anza area east of San Diego on Oct. 31, 2001, while the paper was awaiting publication. And a third hit just below the border with Mexico on Feb. 22 of this year. All were in areas that the researchers had forecast.
There have been no other quakes greater than magnitude 5 in the region during that period, so the researchers have a perfect score, so far.
Stating the Obvious?
The study has generated considerable interest in the seismological community, but some believe Rundle and his colleagues are just stating the obvious. They aren't likely to miss often, because the areas they have highlighted are well known for their seismic activity.
"In the final analysis, his method says there will be future earthquakes in areas that have had earthquakes in the past," says Mary Lou Zoback, who has worked in earthquake forecasting for the U.S. Geological Survey.
But Rundle points out that some of the areas that should be in for a large quake, according to his forecast, "have had smaller earthquakes, but haven't yet had a really big earthquake. What we are actually doing is using the smaller earthquakes to try and forecast the bigger earthquakes."
That was precisely the case for the Baja quake of Feb. 22, which hit in an area that had only recorded smaller quakes.
Rundle says he expects his team's forecast to be right about 80 to 90 percent of the time, but interestingly, that's not really what this is all about. The forecast grew out of a broader effort to understand something called "nonlinear threshold systems."
Brain, Web and Earthquakes
Earthquake forecasting lends itself well to the study of a perplexing behavior shared by systems that would seem to have little in common, including superconductors, the World Wide Web, and even the human brain.
All are considered "leaky threshold systems," in which an "avalanche of events" leads to another event, often of far greater consequence.
The human brain, for example, receives data from a wide range of sources. I see a chocolate bar on my desk, my memory tells me it tastes great, my nose tells me it smells good, my eyes tell me it's really chocolate, and eventually that "avalanche" of events causes me to take a bite.
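A leaky threshold system of this kind can be caricatured in a few lines: inputs accumulate, the stored level leaks away a little at each step, and nothing happens until the total crosses a threshold, at which point the system "fires" and resets. The threshold and leak values here are arbitrary, chosen only to illustrate the idea.

```python
def run_threshold(inputs, threshold=10.0, leak=0.5):
    """Leaky threshold caricature: accumulate stimuli, leak a little each
    step, and fire (then reset) whenever the total crosses the threshold."""
    level, fired_at = 0.0, []
    for step, stimulus in enumerate(inputs):
        level = max(level - leak, 0.0) + stimulus
        if level >= threshold:
            fired_at.append(step)  # the "avalanche" event
            level = 0.0            # reset after firing
    return fired_at

# Small repeated stimuli eventually trip the threshold; a single large
# stimulus trips it at once.
print(run_threshold([3, 3, 3, 3, 0, 0, 12]))  # fires at steps 3 and 6
```

The appeal of such models is that the same accumulate-leak-fire skeleton can stand in for very different systems, from a nerve cell deciding to fire to stress building on a fault until it slips.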
Rundle, a specialist in complex systems, says he turned to earthquakes because they are an interesting "threshold system," and if he can figure out what makes them tick, the same method might be applicable to other fields.
But none of those problems is more complex, or more vexing, than pinning down just when and where the next earthquake will hit. The field of earthquake prediction, or forecasting, is littered with failures by people who thought they had figured it out.
There is a little yellow dot on Rundle's map in the Cholame Hills of Central California. The dot signifies that there is a fair potential for a large earthquake in that area during this decade, and it sits squarely on a rural community called Parkfield.
That's a famous name among seismologists. Earthquakes of magnitude 6 or greater hit that segment of the notorious San Andreas Fault in 1857, 1881, 1901, 1922, 1934, and 1966. That suggested a certain earthquake periodicity in that area, and in the 1980s scientists with the U.S. Geological Survey confidently predicted that a major quake would hit there before 1993.
Tens of millions of dollars' worth of equipment was rushed to the area by scientists who hoped to catch an earthquake in the act, thus measuring every event that precedes a major quake.
Now, nine years past due, Parkfield still waits for the Big One.
Lee Dye’s column appears weekly on ABCNEWS.com. A former science writer for the Los Angeles Times, he now lives in Juneau, Alaska.