The recent devastation in Haiti has reminded us once again of the powerful natural forces at work all around us. Beneath our feet is a flurry of activity that we are powerless to control and largely ineffective at predicting. But it’s not for lack of trying; given the impact that earthquakes can have, not only on the geographic area where they strike but on the country and environment as a whole, it’s no surprise that scientists put a great deal of effort into anticipating where the next one will hit. Earthquakes can lead to socio-economic upheaval and, depending on the infrastructure in the affected area, can produce adverse environmental effects, such as large dams or nuclear power plants being damaged or destroyed in the quake or in the ensuing aftershocks.
Manitoba is the least likely province in Canada to experience an earthquake. We are situated near the centre of the North American Plate, one of nine large tectonic plates shuffling around the Earth at (literally) the speed your fingernails grow. Earthquakes occur when a tectonic plate, like a slow-moving jigsaw puzzle piece with a jagged edge, gets stuck against a neighbouring piece. Stress builds up as potential energy until it overcomes the friction holding the plates in place, causing a break and a sudden release of the stored energy.
The most common quakes occur in the lithosphere, the layer composed of the crust and upper mantle, with a thickness ranging from a couple of hundred kilometres beneath some continental regions to just a few kilometres near the mid-ocean ridges. Quakes are centred on fault lines, which are usually located at the boundaries between tectonic plates. However, a small portion of quakes, roughly five per cent, occur away from these boundaries, along faults that divide a plate into two or more large blocks. Since the majority of earthquakes occur on faults at tectonic plate boundaries, we can make some forecasts as to how likely the next earthquake is at a given location, and with what magnitude it will strike.
Forecasts are not to be confused with predictions: a forecast assigns probabilities to when and with what force an earthquake will strike, while a prediction assigns a specific date and location to an earthquake. According to the U.S. Geological Survey (USGS), earthquake forecasts are similar to weather forecasts in that they compare today’s seismic activity with the results of similar seismic activity in the past. This type of “farmer’s almanac” forecasting also assists emergency response and government organizations in estimating the probability of aftershocks and, with that, determining when it is safe for citizens to return to their homes and begin repairs.
Although forecasting is undeniably important, earthquake prediction is the science we need to perfect if we want to prevent the loss of life and mass destruction.
One exciting new area of earthquake prediction comes from observations made by the Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions (DEMETER) microsatellite (dubbed a microsatellite because of its modest weight of 120 kg), which was launched in June 2004 from Baikonur, Kazakhstan.
DEMETER carries onboard instruments that monitor ultra-low frequency (ULF), extremely low frequency (ELF) and high frequency emissions, which can be associated with electromagnetic and ionospheric disturbances. It is these disturbances that appear inextricably linked to seismic events. In a recent investigation that used DEMETER to monitor electromagnetic anomalies over the Indonesian region, disturbances accompanying seismic activity were found in both the ULF and ELF ranges. Further investigation of low-frequency wavelengths with DEMETER and other orbiting satellites, such as the Stanford-built QuakeSat, could play an important role in monitoring electric and magnetic anomalies and, in turn, predicting the earthquakes those anomalies are associated with.
Despite the many prediction methods currently being explored by scientists, our best coping mechanism for earthquakes is still detection. It may surprise you to learn that earthquake detection dates as far back as 132 A.D., when the Chinese philosopher Zhang Heng designed an elaborate instrument in which earthquake tremors would dislodge bronze balls held by dragons, dropping them into the mouths of bronze toads below. During an earthquake, a pendulum inside the device would move, causing a dragon to drop its ball, which would land in the corresponding toad’s mouth. Depending on which dragon dropped its ball, the direction of the earthquake could be determined. Although the instruments currently in use are far more technologically advanced, the principles are essentially the same.
When an earthquake occurs, different seismic waves travel out from the centre of the earthquake, known as the hypocentre (the epicentre being the point on the surface directly above the hypocentre). The first and fastest-moving waves, known as primary waves (p-waves), are the driving force behind early detection systems. The speeds at which p-waves and secondary waves (s-waves) travel mean that there is roughly a one-second separation between their arrivals for every 8 km of distance from the hypocentre.
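For readers who like to see the arithmetic, here is a minimal Python sketch of that rule of thumb; the function name and the example distances are purely illustrative assumptions, not part of any real warning system.

```python
# Back-of-the-envelope arithmetic for the p-wave / s-wave arrival gap,
# using the rule of thumb above: roughly one second of separation for
# every 8 km of distance from the hypocentre.

KM_PER_SECOND_OF_LAG = 8.0  # rule-of-thumb conversion factor

def warning_seconds(distance_km: float) -> float:
    """Approximate head start between the p-wave and s-wave arrivals."""
    return distance_km / KM_PER_SECOND_OF_LAG

for distance in (40, 80, 160):
    print(f"{distance} km from the hypocentre: ~{warning_seconds(distance):.0f} s head start")
```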
These few seconds can be instrumental in getting people to a safer location, whether by evacuating buildings that are not earthquake-proof or by moving away from objects that could topple onto them. Although the s-waves produce some damage, it is the surface waves arriving after the p- and s-waves that are responsible for the brunt of the devastation associated with earthquakes. Most of the characteristic tremors that accompany earthquakes are due to Rayleigh waves, a type of surface wave that rolls along the ground much as a wave rolls across the ocean. The energy contained in the surface waves largely determines the magnitude of the quake.
The best-known measure of earthquake magnitude is the Richter Magnitude Scale (RMS), developed by Charles Richter and Beno Gutenberg in 1935, which rates the severity of an earthquake based on the amplitude of the seismic waves recorded on a seismograph. The ground oscillations beneath the instrument are recorded as a zig-zag trace, similar to the one produced by a polygraph (lie detector), on a record called a seismogram; the seismograph itself consists of a seismometer, an amplifier, and traditionally a hardcopy display. Modern seismographs digitize the output and store it on removable disks or send it to a central site to be recorded and analyzed.
The output is then interpreted as a magnitude on a logarithmic scale, with every whole-number increase indicating a 10-fold increase in measured amplitude. Thus, a magnitude five is 10 times stronger than a magnitude four; a magnitude six is 100 times stronger than a magnitude four; and so on (the arithmetic is sketched below). The RMS has no discernible upper limit, but since every whole-number step on the scale corresponds to the release of approximately 31 times more energy than the step before, quakes of magnitude seven and higher can be devastating, depending on where they originate. But devastation does not factor into the magnitude rating of the RMS, only the strength of the seismic waves. To measure devastation, American seismologists Harry Wood and Frank Neumann developed the Modified Mercalli Scale (MMS).
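The ratios above can be sketched in a few lines of Python. The roughly 31-fold energy factor corresponds to 10 raised to the power 1.5, the exponent in the standard Gutenberg-Richter energy relation; the magnitudes compared here are illustrative, not tied to any particular quake.

```python
# Amplitude and energy ratios between two magnitudes on a logarithmic scale.
# Each whole-number step is a 10-fold increase in recorded wave amplitude
# and roughly a 31.6-fold (10**1.5) increase in released energy.

def amplitude_ratio(m1: float, m2: float) -> float:
    """How many times larger the recorded amplitude of m1 is compared to m2."""
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    """Roughly how many times more energy a magnitude-m1 quake releases than m2."""
    return 10 ** (1.5 * (m1 - m2))

print(amplitude_ratio(5, 4))       # 10  -> ten times the amplitude
print(amplitude_ratio(6, 4))       # 100 -> one hundred times the amplitude
print(round(energy_ratio(7, 6)))   # 32  -> roughly 31-32 times the energy
```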
The Modified Mercalli Scale (MMS) is an arbitrary ranking based on the “effects of an earthquake at a given place, on natural features, on industrial installations and on human beings.” In Port-au-Prince, Haiti, the MMS rating was 10 out of a possible 12, indicating that the damage was very destructive, but not total. A 12 on the MMS represents total destruction, with virtually all buildings destroyed. Of course, knowing that the Haiti earthquake measured 7.0 on the RMS or 10 on the MMS offers little comfort to the people of Haiti, or to the millions affected by this tragedy the world over. But monitoring systems and accurate records can serve as lessons for future research in earthquake prediction and forecasting.