Forecasting Earthquakes with Analytics

3AI December 14, 2020

We know the quakes are coming. We just don’t know how to tell enough people early enough to avoid the catastrophe ahead. Around the world, more than 13,000 people are killed each year by earthquakes, and almost 5 million have their lives affected by injury or loss of property. Add to that $12 billion a year in economic losses to the global economy (the average annual toll between 1980 and 2008). Understandably, scientists have long been asking whether earthquakes can be predicted more accurately.

Unfortunately, the conventional answer has often been “no”. For many years, earthquake prediction relied almost entirely on monitoring the frequency of quakes and natural warning signs in the surroundings, and on using this to establish when they were likely to recur. A case in point is the Haicheng earthquake that struck eastern China on February 4, 1975. Just prior to the quake, temperatures were unusually high, atmospheric pressure was abnormal, and many snakes and rodents emerged from the ground as a warning sign. With this information, the State Seismological Bureau (SSB) was able to issue a prediction that helped save many lives. However, the warning came only on the day the earthquake struck, so heavy property losses could not be avoided. Had the quake been predicted a few days earlier, the affected cities could have been evacuated completely, and this is exactly where big data fits in.

Nature is constantly giving cues about impending events; it is simply up to us to tune in to those cues so that we can act accordingly. Because these cues are widespread, big data is well suited to bringing them together in one central location, so that analysis and the resulting predictions become more accurate. Common signals that can be tracked this way include the movement of animals and the atmospheric conditions that precede earthquakes.

Scientists today predict where major earthquakes are likely to occur based on the movement of the Earth’s tectonic plates and the location of fault zones. They calculate quake probabilities by looking at the history of earthquakes in a region and detecting where pressure is building along fault lines. These estimates can go wrong, because strain released along one section of a fault line can transfer strain to another section. French scientists say this is also what happened in the recent quake, noting that the 1934 quake on the eastern segment had transferred part of the strain to the adjacent section where the latest quake was triggered.
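To make the frequency-based approach concrete, here is a minimal sketch in Python of how a recurrence probability might be estimated from a historical catalogue. The event years, observation span, and the simple Poisson model are illustrative assumptions, not data from any real catalogue.

```python
# Minimal sketch (illustrative only): estimating the probability of at least one
# magnitude 6+ quake within a given horizon, using a simple Poisson recurrence
# model fitted to a hypothetical historical catalogue for one region.
import math

# Hypothetical catalogue: years in which M6+ events were recorded in the region
event_years = [1897, 1912, 1934, 1956, 1988, 2015]

observation_span = 2020 - 1890                      # years covered by the catalogue
annual_rate = len(event_years) / observation_span   # mean events per year (lambda)

def prob_at_least_one(rate_per_year: float, horizon_years: float) -> float:
    """P(N >= 1) over the horizon, assuming events follow a Poisson process."""
    return 1.0 - math.exp(-rate_per_year * horizon_years)

print(f"Estimated annual rate: {annual_rate:.3f} events/year")
print(f"P(M6+ within 30 years): {prob_at_least_one(annual_rate, 30):.2f}")
```

A model like this captures only the long-run average; as the paragraph above notes, it says nothing about strain transferred between fault sections, which is one reason such forecasts can miss.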

Academics often argue that accurate earthquake prediction is inherently impossible: conditions for potential seismic disturbance exist along all tectonic fault lines, and a build-up of small-scale seismic activity can trigger larger, more devastating quakes at any point. However, all of this is changing. Big data analysis has opened up the game to a new breed of earthquake forecasters who combine satellite and atmospheric data with statistical analysis, and their striking results seem to be proving the naysayers wrong.

One of these innovators is Jersey-based Terra Seismic, which uses satellite data to predict major earthquakes anywhere in the world with 90% accuracy. Using its satellite big data technology, the company says it can in many cases forecast major (magnitude 6+) quakes from one to 30 days before they occur in all key seismically active countries. It uses open-source software written in Python, running on Apache web servers, to process large volumes of satellite data taken each day from regions where seismic activity is ongoing or appears imminent. Custom algorithms analyze the satellite images and sensor data to extrapolate risk, based on historical records of which combinations of circumstances have previously led to dangerous quakes.
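Terra Seismic’s actual algorithms are proprietary, so the sketch below is only an illustration of the general pattern the paragraph describes: learn which combinations of satellite-derived indicators preceded past major quakes, then score new observations. All feature names and data here are hypothetical and synthetic.

```python
# Illustrative sketch only: not Terra Seismic's method. Learns which combinations
# of (hypothetical) satellite-derived indicators preceded past M6+ quakes and
# scores new observations for risk.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000  # hypothetical daily observations across monitored regions

df = pd.DataFrame({
    "thermal_anomaly": rng.normal(size=n),    # surface temperature anomaly (z-score)
    "tec_change": rng.normal(size=n),         # ionospheric total electron content change
    "deformation_rate": rng.normal(size=n),   # ground-deformation rate
})
# Synthetic label: did a major quake follow within 30 days? (loosely tied to the features)
df["quake_within_30_days"] = (
    (0.8 * df["thermal_anomaly"] + 0.6 * df["deformation_rate"] + rng.normal(size=n)) > 1.5
).astype(int)

X = df[["thermal_anomaly", "tec_change", "deformation_rate"]]
y = df["quake_within_30_days"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Probability of a major quake following each held-out observation (higher = greater risk)
risk_scores = model.predict_proba(X_test)[:, 1]
print("Hold-out accuracy:", round(model.score(X_test, y_test), 3))
```

In a real system the labels would come from an earthquake catalogue and the features from daily satellite feeds, but the structure, historical combinations in, risk scores out, is the same.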

Of course, plenty of other organizations have monitored these signs, but it is big data analytics that is now providing the leap in accuracy. Monitored in isolation, any one of these metrics might be meaningless, given the huge number of factors that determine where a quake will hit and how severe it will be. But with the ability to monitor all potential quake areas and to correlate any data point on one quake with any other, predictions can become far more precise, and far more accurate models of likely quake activity can be constructed on the basis of statistical likelihood.
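The intuition that single indicators are weak but coinciding indicators are informative can be shown with a small sketch. The regions, indicator names, readings, and flagging threshold below are all made up for illustration.

```python
# Minimal sketch of the idea above: individual indicators are noisy, but a combined
# anomaly score across several signals is harder to produce by chance.
# All regions, indicators, and values are synthetic/hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
regions = ["region_A", "region_B", "region_C"]

# Hypothetical standardised anomaly readings (z-scores) for three indicators
signals = pd.DataFrame(
    rng.normal(size=(3, 3)),
    index=regions,
    columns=["thermal", "ionospheric", "deformation"],
)

# A region is flagged only when several indicators are anomalous at once,
# which is far less likely to happen by chance than a single spike.
combined_score = signals.clip(lower=0).sum(axis=1)
flagged = combined_score[combined_score > 2.0]
print(signals.round(2))
print("Regions flagged for closer monitoring:\n", flagged)
```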

So once again we see Big Data being put to use to make the impossible possible – and hopefully cut down on the human misery and waste of life caused by natural disasters across the globe.
