"What happens is some of these faults start to slide a little bit, leading up to an earthquake, and as they slide there are other faults in the area that have been active but they have different directions, or orientations, with respect to the one that's starting to slide," he adds. "The ones that have an unfavorable orientation will tend to get squeezed shut, and the activity on them will tend to get shut down."
So activity begins to increase on some faults, "building up toward a larger earthquake, whereas on other faults in that same region, the activity begins to decrease."
All of that data was fed into a supercomputer that generated a map showing the seismic potential for all of Central and Southern California. The color-coded map shows areas where the scientists believe quakes of magnitude 5 or greater are most likely to occur during this decade.
Interestingly, if the researchers are right, most of the quakes will be in fairly sparsely populated areas, particularly the desert regions east of Los Angeles. None should occur directly below the city itself, although suburbs to the north and east may get some shaking.
One quake measuring 5.1 hit the Big Bear area in the mountains east of Los Angeles on Feb. 10, 2001, just as the scientists completed their research. A 5.1 earthquake hit the Anza area east of San Diego on Oct. 31, 2001, while the paper was awaiting publication. And a third hit just below the border with Mexico on Feb. 22 of this year. All were in areas that the researchers had forecast.
There have been no other quakes of magnitude 5 or greater in the region during that period, so the researchers have a perfect score, so far.
Stating the Obvious?
The study has generated considerable interest in the seismological community, but some believe Rundle and his colleagues are just stating the obvious. They aren't likely to miss often, because the areas they have highlighted are well known for their seismic activity.
"In the final analysis, his method says there will be future earthquakes in areas that have had earthquakes in the past," says Mary Lou Zoback, who has worked in earthquake forecasting for the U.S. Geological Survey.
But Rundle points out that some of the areas that should be in for a large quake, according to his forecast, "have had smaller earthquakes, but haven't yet had a really big earthquake. What we are actually doing is using the smaller earthquakes to try and forecast the bigger earthquakes."
That was precisely the case for the Baja quake of Feb. 22, which hit in an area that had only recorded smaller quakes.
Rundle says he expects his team's forecast to be right about 80 to 90 percent of the time, but interestingly, that's not really what this is all about. The forecast grew out of a broader effort to understand something called "nonlinear threshold systems."
Brain, Web and Earthquakes
Earthquake forecasting lends itself well to the study of a perplexing behavior shared by systems that would seem to have little in common, including superconductors, the World Wide Web, and even the human brain.
All are considered "leaky threshold systems," in which an "avalanche of events" leads to another event, often of far greater consequence.
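The avalanche behavior of such threshold systems is often illustrated with toy models like the Bak-Tang-Wiesenfeld "sandpile": cells are loaded slowly, one grain at a time, and any cell that reaches a threshold topples, passing grains to its neighbors and sometimes triggering a cascade. The sketch below is a generic illustration of that idea, not the researchers' actual model; all parameter choices (grid size, threshold of 4) are assumptions for demonstration.

```python
import random

def avalanche(grid, size, threshold=4):
    """Topple every cell at or above the threshold, passing one grain
    to each of its four neighbors (grains fall off the open edges).
    Returns the number of topplings the cascade triggers."""
    toppled = 0
    unstable = [(r, c) for r in range(size) for c in range(size)
                if grid[r][c] >= threshold]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < threshold:
            continue  # already relaxed by an earlier toppling
        grid[r][c] -= threshold
        toppled += 1
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < size and 0 <= nc < size:
                grid[nr][nc] += 1
                if grid[nr][nc] >= threshold:
                    unstable.append((nr, nc))
        if grid[r][c] >= threshold:
            unstable.append((r, c))
    return toppled

def run(steps=10000, size=20, seed=0):
    """Drop grains one at a time onto random cells and record the
    size of the avalanche each single grain sets off."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(steps):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1  # slow, steady loading: one grain per step
        sizes.append(avalanche(grid, size))
    return sizes

sizes = run()
# Early drops topple nothing; once the pile nears its threshold state,
# a single added grain can occasionally trigger a system-wide cascade.
```

The point of the model is the mismatch of scales: most added grains do nothing, but the same tiny input can occasionally release an event far larger than the trigger, which is the qualitative signature shared by the systems described above.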