
Forecasting Earthquakes with Analytics

3AI December 14, 2020

We know the quakes are coming. We just don’t know how to tell enough people early enough to avoid the catastrophe ahead. Around the world, more than 13,000 people are killed by earthquakes each year, and almost 5 million have their lives affected by injury or loss of property. Add to that $12 billion a year in losses to the global economy (the average annual toll between 1980 and 2008). Understandably, scientists have long been asking whether earthquakes can be predicted more accurately.

Unfortunately, the conventional answer has often been “no”. For many years, earthquake prediction relied almost entirely on monitoring the frequency of past quakes and of natural events in the surroundings, and using this to estimate when quakes were likely to recur. A case in point is the Haicheng earthquake that struck northeastern China on February 4, 1975. Just prior to this earthquake, temperatures were unusually high, ground pressure was abnormal, and many snakes and rodents emerged from the ground, all taken as warning signs. With this information, the State Seismological Bureau (SSB) was able to issue a prediction that helped save many lives. However, the warning came only on the day the earthquake struck, so there was no time to prevent heavy loss of property. Had the earthquake been predicted a few days earlier, it might have been possible to evacuate the affected cities completely, and this is exactly where big data fits in.

Nature is constantly giving cues about impending events; it is up to us to tune in to those cues so that we can act accordingly. Because these cues are widespread, big data is well suited to gathering them in one central location, so that analysis and the resulting predictions become more accurate. Common signals that can be tracked this way include the movement of animals and the atmospheric conditions that precede earthquakes.

Scientists today predict where major earthquakes are likely to occur based on the movement of the Earth’s tectonic plates and the location of fault zones. They calculate quake probabilities by looking at the history of earthquakes in a region and detecting where pressure is building along fault lines. These estimates can go wrong, because strain released along one section of a fault line can transfer strain to another section. This is also what happened in the recent quake, say French scientists, noting that the 1934 quake on the eastern segment had transferred part of the strain to the adjacent section where the latest quake was triggered.

Academics often argue that accurate earthquake prediction is inherently impossible, since conditions for potential seismic disturbance exist along all tectonic fault lines and a build-up of small-scale seismic activity can trigger larger, more devastating quakes at any point. However, all of this is changing. Big Data analysis has opened up the game to a new breed of earthquake forecasters who combine satellite and atmospheric data with statistical analysis, and their striking results seem to be proving the naysayers wrong.

One of these innovators is Jersey-based Terra Seismic, which uses satellite data to predict major earthquakes anywhere in the world with 90% accuracy. Using satellite Big Data technology, the company says it can in many cases forecast major (magnitude 6+) quakes from 1 to 30 days before they occur in all key seismically prone countries. It uses open-source software, written in Python and running on Apache web servers, to process large volumes of satellite data taken each day from regions where seismic activity is ongoing or seems imminent. Custom algorithms analyze the satellite images and sensor data to extrapolate risk, based on historical records of which combinations of circumstances have previously led to dangerous quakes.
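Terra Seismic’s actual algorithms are proprietary, but the general pattern described above, scoring today’s satellite-derived anomalies against historical combinations that preceded large quakes, can be sketched as a simple supervised classifier. The feature names, synthetic data, and model choice below are illustrative assumptions only, not the company’s pipeline.

```python
# Hypothetical sketch: score earthquake risk from satellite-derived anomaly
# features against historical outcomes. Feature names and data are invented
# for illustration; real labels would come from earthquake catalogs and real
# features from daily satellite retrievals.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic historical archive: one row per region-day with three anomaly
# indices (e.g. thermal anomaly, outgoing longwave radiation, ionospheric TEC).
n = 5000
X = rng.normal(size=(n, 3))

# Synthetic label: did a magnitude 6+ quake follow within 30 days?
# (An arbitrary rule, used here only to make the example runnable.)
y = ((0.8 * X[:, 0] + 0.6 * X[:, 2] + rng.normal(scale=0.5, size=n)) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Score today's observations for a region of interest.
today = np.array([[1.9, 0.2, 1.4]])        # hypothetical anomaly readings
risk = model.predict_proba(today)[0, 1]    # probability of an M6+ event in the window
print(f"estimated 30-day M6+ risk: {risk:.2f}")
```

The scoring step at the end is the operational part: each day, fresh satellite-derived features for a region are pushed through a model trained on historical outcomes, and regions whose estimated risk crosses a threshold are flagged for closer monitoring.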

Of course, plenty of other organizations have monitored these signs before, but it is big data analytics that is now providing the leap in accuracy. Monitored in isolation, these particular metrics might be meaningless, given the huge number of factors involved in determining where a quake will hit and how severe it will be. But with the ability to monitor all potential quake areas and to correlate any data point on one quake with any other, predictions can become far more precise, and far more accurate models of likely quake activity can be constructed, based on statistical likelihood.
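The statistical intuition here, that several individually weak signals can add up to a strong one when combined, can be illustrated with a toy Bayesian update. The prior and likelihood ratios below are invented numbers for illustration, not measurements from any real forecasting system.

```python
# Toy illustration: combining weak, assumed-independent precursor signals
# via Bayes' rule in odds form. All numbers are hypothetical.
prior = 0.02  # assumed baseline probability of a major quake in the window

# Likelihood ratio of each observed signal: P(signal | quake) / P(signal | no quake)
likelihood_ratios = {
    "thermal anomaly": 3.0,
    "gas emission spike": 2.5,
    "unusual animal movement reports": 1.8,
}

# Convert the prior to odds, multiply in each signal, convert back to a probability.
odds = prior / (1 - prior)
for name, lr in likelihood_ratios.items():
    odds *= lr
    prob = odds / (1 + odds)
    print(f"after {name}: P(quake) = {prob:.2f}")
```

Each signal on its own barely moves the needle, but together they lift the probability by an order of magnitude, which is exactly why pooling many data sources in one place matters.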

So once again we see Big Data being put to use to make the impossible possible, hopefully cutting down on the human misery and loss of life caused by natural disasters across the globe.
