Degree of success in natural disaster prediction by AI
A natural disaster is a natural event, such as a flood, earthquake, or hurricane, that causes damage or loss of life. It affects both living and non-living things.
Natural disasters are inevitable in our world. Because they take many different forms, it is difficult to predict each and every one. Meteorologists can track a hurricane with precision, but seismologists cannot predict exactly when and where an earthquake will occur.
Prediction of disasters requires extensive research and funding. To predict a natural disaster we have to collect extensive past data, record live data, and derive patterns from the historical data. By comparing past and live data, scientists can predict future events to some extent. Trends are calculated and used to predict earthquakes, tsunamis, and volcanic eruptions.
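As a minimal sketch of this past-versus-live comparison (all numbers synthetic and purely illustrative), a trailing moving average over historical readings can serve as the "trend", and live readings can be flagged when they exceed it by some factor:

```python
# Sketch: flag anomalies by comparing live readings against a trend
# computed from past data. All values here are invented for illustration.

def moving_average(values, window):
    """Trailing moving averages over `values` with the given window size."""
    return [
        sum(values[i - window:i]) / window
        for i in range(window, len(values) + 1)
    ]

def flag_anomalies(past, live, window=5, factor=2.0):
    """Flag live readings that exceed `factor` times the most recent trend."""
    baseline = moving_average(past, window)[-1]  # latest trend from past data
    return [reading > factor * baseline for reading in live]

past_activity = [3, 4, 3, 5, 4, 4, 3, 5]  # e.g. daily micro-tremor counts
live_activity = [4, 5, 12, 3]             # incoming live readings

print(flag_anomalies(past_activity, live_activity))  # [False, False, True, False]
```

Real systems would use far richer features and models, but the structure is the same: a baseline from past data, compared against live measurements.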
We can also predict natural disasters through constant surveillance. Offshore cameras in hurricane-prone areas make it possible to recognize strong winds and waves, which also helps in tsunami prediction. By monitoring ocean currents, weather events can be forecast in advance, warning nearby areas at risk of hurricanes and tornadoes. These short-term warnings, however, are only effective when relief programs are planned and effectively carried out, and this method is very costly and inefficient.
For cost-effectiveness and timely information about natural disasters, predicting them in advance is the only practical solution. It is not always reliable, because disasters are unexpected and do not always follow trends, but it saves far more time and resources than constant surveillance.
Techniques for Earthquake Prediction
Earthquakes release seismic waves that propagate beneath the surface of the Earth. Seismometers installed at different geographical positions record the vertical motion of surface waves. Plate boundaries may be divergent, convergent, or transform, and movement along these boundaries, commonly known as faults, causes most major earthquakes. The point where an earthquake originates is its focus. The recorded waves are summed, and time series data is collected for further processing.
There are four aspects of this time series data that can be considered for experimentation with respect to geophysical analysis:
- Analyze the earthquake data recorded at different time points, independent of common-source or common-receiver gathers.
- Analyze the earthquake data set in fixed- or variable-length time intervals to uncover hidden patterns.
- Gather layer data, such as the layer between the Eurasian and Indian plates, at time points to better analyze and study the seismic patterns of a layer with respect to time.
- Gather and analyze the Earth's lithosphere layer data with respect to time intervals.
Such identified characteristics of an earthquake can easily be scaled down using an activation function.
Figure 1: Illustration of criteria for fitness function
– Feed Forward Neural Network
An FFN is typically used with a sigmoid activation function. FFNs have been applied to seismic electric signals, predicted magnitudes, and pre-determined future seismic events. Prediction of structural responses for a structure has 80.55% accuracy, while overall event prediction is reported at 71%. An FFN is able to predict both long- and short-term shocks. Outputs of different layers are not fed back to earlier layers.
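A minimal sketch of a feed-forward pass with sigmoid activations follows; the weights and input features are toy values, not a trained seismic model:

```python
import math

def sigmoid(x):
    """Standard logistic activation, squashing any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, w_hidden, w_out):
    """One forward pass: inputs -> hidden layer -> single output unit."""
    # hidden layer: weighted sum of inputs passed through the sigmoid
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in w_hidden]
    # output layer: weighted sum of hidden activations through the sigmoid
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# toy weights and features, purely illustrative
features = [0.5, -1.2, 0.3]
w_hidden = [[0.2, -0.4, 0.1], [-0.3, 0.8, 0.5]]
w_out = [0.6, -0.9]
print(feed_forward(features, w_hidden, w_out))
```

Note that information only flows forward here, which is exactly the property distinguishing an FFN from recurrent architectures.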
– Particle Swarm Optimization
PSO is used for building a prior-knowledge system and for selecting input values for a BPN (Back Propagation Neural Network) based network. It can determine local earthquake characteristics. PSO works on the principle of a swarm of particles searching for the optimal solution in a defined search space, and it converges to a solution more efficiently than a general BPN.
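The swarm-search principle can be sketched as follows; this is a generic textbook-style PSO minimizing a toy function, not the earthquake-specific variant, and the inertia/acceleration constants are conventional defaults:

```python
import random

random.seed(42)  # fixed seed so the illustrative run is reproducible

def pso(fitness, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimal PSO: each particle tracks its personal best; the swarm
    shares a global best, and velocities are pulled toward both."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# minimize the sphere function; the optimum is at the origin
best, best_val = pso(lambda p: sum(x * x for x in p), dim=2)
```

In the cited setting, the fitness function would score candidate BPN input selections instead of the sphere function used here.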
– Genetic Algorithm
Rock mass stability is estimated for planning purposes, and structural formation has been studied using GA, lowering data uncertainty. GA has been used for building-settlement forecasts after main shocks and, in combination with support vector machines, on earthquake data sets. GA can work with improper or incomplete seismic data, has been found highly efficient in predicting future earthquakes, and is commonly used in research with different alterations.
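A generic sketch of the GA loop (tournament selection, one-point crossover, random-reset mutation) on a toy fitness function; this is illustrative only, not one of the seismic pipelines cited above:

```python
import random

random.seed(1)  # fixed seed for a reproducible illustrative run

def genetic_algorithm(fitness, dim, pop_size=30, generations=60,
                      bounds=(-1.0, 1.0), mutation_rate=0.2):
    """Minimal real-valued GA: selection, crossover, mutation."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament selection of size 2: fitter individual wins
            a, b = random.sample(pop, 2)
            return a if fitness(a) > fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, dim) if dim > 1 else 0
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if random.random() < mutation_rate:  # random-reset mutation
                child[random.randrange(dim)] = random.uniform(lo, hi)
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# maximize a toy fitness peaked at (0.5, 0.5)
best = genetic_algorithm(lambda g: -sum((x - 0.5) ** 2 for x in g), dim=2)
```

The tolerance of GA for noisy fitness values is what makes it suitable for the improper or incomplete seismic data mentioned above.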
– Clustering
Spatial clustering is used versus temporal clustering for earthquake data sets; spatial clustering has been identified in the data while building earthquake forecast models using a differential method. A set of clusters is developed from a huge set of unsupervised data, which divides the overall scenario into many sub-scenarios. Clustering is also used in the MSc algorithm with a different aspect.
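As an illustrative sketch of spatial clustering, plain k-means can group epicenter coordinates into sub-scenarios; the coordinates and cluster count below are synthetic, invented for the example:

```python
import random

random.seed(0)  # reproducible synthetic data and initialization

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: (p[0] - centroids[i][0]) ** 2
                                                 + (p[1] - centroids[i][1]) ** 2)
            clusters[nearest].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # keep the old centroid if a cluster emptied out
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids, clusters

# two synthetic groups of epicenters as (lat, lon) pairs
region_a = [(34.0 + random.uniform(-0.2, 0.2), 72.0 + random.uniform(-0.2, 0.2))
            for _ in range(20)]
region_b = [(36.5 + random.uniform(-0.2, 0.2), 70.5 + random.uniform(-0.2, 0.2))
            for _ in range(20)]
centroids, clusters = kmeans(region_a + region_b, k=2)
```

Each resulting cluster is one spatial sub-scenario that can then be modeled separately.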
Artificial-intelligence-based techniques have been widely used for earthquake time series prediction. The results of traditional probability-estimation approaches can be enhanced by particle swarm optimization and genetic algorithm based approaches; PSO and GA are capable of finding the actual fault intensity in a particular region. This work is an attempt to cover different AI strategies for earthquake prediction and to cross-check them.
Techniques for Water Storm Prediction
Water storms occur due to intense sustained winds over oceans. Hurricanes, cyclones, and typhoons are all water storms; their names differ only by the geographical location of the storm. There are different artificial intelligence techniques to predict storms; some are given below.
– Nonlinear AI ensemble
A new nonlinear artificial
intelligence ensemble prediction (NAIEP) model has been developed for
predicting typhoon intensity based on multiple neural networks with the same
expected output and using an evolutionary genetic algorithm (GA).
Ensemble numerical prediction
(ENP) model, whether created with different physical process parameterization
schemes or with different initial conditions from a Monte Carlo approach,
formally consists of many different ensemble members. By optimizing the network
structure and the connection weight of ANNs, genetic evolution is able to
create a number of different neural network individuals.
Ensemble prediction of NWP is
motivated by the fact that NWP forecasts are sensitive both to small
uncertainties in the initial conditions and model errors, so it is hard to
further improve the accuracy of single model deterministic predictions.
To construct an NAIEP model, a
number of individual neural networks are first created and then integrated to
build an ensemble prediction model.
A GA is used to construct the members of the ensemble, and a three-layer back-propagation (BP) network is used as the basic model for the neural networks. The major computational steps are summarized below:
1. Generate the connection weights and thresholds from the input layer to the hidden layer and from the hidden layer to the output layer, and set the global convergence error, ε, of the model.
2. Perform supervised training of the network with the learning-matrix samples, calculate the error between the real output and the expected output of the network, and tune the connection weight coefficients from input layer to hidden layer and from hidden layer to output layer using the error back-propagation learning algorithm of the BP network.
3. If the calculated output error of the model is greater than ε, return to step 2; otherwise, end the training and compute the prediction value using the connection weights, thresholds of the network, and the predictors of the prediction model.
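The three steps above can be sketched as a plain BP training loop. This toy version learns logical AND (with a constant bias input appended) rather than a typhoon-intensity data set, and the layer sizes, learning rate, and convergence error are arbitrary choices for the example:

```python
import math
import random

random.seed(0)  # reproducible weight initialization

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_ih, w_ho):
    """Forward pass through one hidden layer and a single output unit."""
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_ih]
    return h, sigmoid(sum(w * hi for w, hi in zip(w_ho, h)))

def train_bp(samples, n_hidden=3, lr=0.5, eps=0.01, max_epochs=10000):
    n_in = len(samples[0][0])
    # Step 1: generate connection weights (input->hidden and hidden->output)
    w_ih = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    w_ho = [random.uniform(-1, 1) for _ in range(n_hidden)]
    for _ in range(max_epochs):
        total_err = 0.0
        for x, target in samples:
            # Step 2: forward pass, then tune weights by back-propagating the error
            h, y = forward(x, w_ih, w_ho)
            err = target - y
            total_err += err * err
            delta_o = err * y * (1 - y)
            for j in range(n_hidden):
                delta_h = delta_o * w_ho[j] * h[j] * (1 - h[j])
                w_ho[j] += lr * delta_o * h[j]
                for i in range(n_in):
                    w_ih[j][i] += lr * delta_h * x[i]
        # Step 3: stop once the total error drops below the convergence error eps
        if total_err < eps:
            break
    return w_ih, w_ho

# toy training set: logical AND, with a constant bias input of 1 appended
samples = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
w_ih, w_ho = train_bp(samples)
```

In the NAIEP scheme, many such trained networks (with GA-evolved structures and weights) would then be combined into the ensemble.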
The meteorological ensemble modeling approach of the GNN opens up a vast range of possibilities for operational forecasting.
– Back propagation Neural Network
An artificial neuron is modeled on the human neuron, which has dendrites, a cell body, and an axon. In an NN, each input is multiplied by the weight of its connection to the neuron, which determines how strongly that input is forwarded; the neuron then sums all the weighted inputs. The sum is passed through the hidden layer, where the results are calculated, and then on to the output layer. A back-propagation NN has one input layer, one output layer, and one hidden layer, which makes the output error easy to calculate.
To predict a storm or any other disaster, information is collected and fed to the neural network. First the data is normalized, then it is fed to the input layer. From the input layer the data is transferred to the hidden layer, where calculations are performed by applying the sigmoid function. The hidden-layer results are collected and summed, and this sum is the input to the output layer.
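The normalization step mentioned above can be done in several ways; one common choice is min-max scaling, sketched here on synthetic readings:

```python
def min_max_normalize(values):
    """Scale raw readings into [0, 1] before feeding them to the input layer."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

wind_speeds = [12.0, 45.0, 30.0, 88.0, 60.0]  # synthetic raw readings (km/h)
print(min_max_normalize(wind_speeds))
```

Scaling keeps all inputs in a comparable range, which helps the sigmoid units avoid saturation during training.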
An NN with back propagation is a self-training system: it collects data, trains itself for different conditions and scenarios, and produces results.
NNs with back propagation, like other NNs, are ultimately pattern-recognition techniques. They offer short-term predictive skill and are not a replacement for meteorologists, but they can help in understanding meteorological problems and can resolve many complicated patterns that are difficult for humans and simple programs to handle.
Degree of success in natural disaster prediction by AI
For some people, weather forecasts are merely reassurance of a good day ahead. For others they are everything: their bread and butter depends on them. By applying artificial intelligence we have been able to transform the lives of many people by giving them advance warning.
Companies and governments are collecting data on wind, water, and soil from satellites and from devices installed on the ground. Applying artificial intelligence together with a physical understanding of the environment can significantly improve the prediction skill for multiple types of high-impact weather, which includes events like severe thunderstorms, tornadoes, and hurricanes.
One example the paper highlighted is that machine learning can provide more accurate hail forecasting. Hail causes billions of dollars in damage every year, so even a modest improvement in hail warnings could produce significant savings by getting individuals to move their cars and themselves to safety. Providing these types of warnings to car insurance companies is one way IBM is commercializing its forecasts.
IBM and Panasonic are working heavily on their weather forecasting systems, improving them day by day by applying newly developed artificial intelligence and computational intelligence techniques. Better weather forecasting allows airlines to adjust their routes to reduce fuel use, improve safety, and increase on-time arrivals.
Better weather predictions have a direct effect on many areas of life. They directly affect agriculture, where up to 90% of crop losses are caused by weather conditions; such losses can be reduced with proper weather forecasting. If damage is inevitable, we can save money and time by not planting the kinds of crops that are unsuited to the expected weather.
Transportation is also improved by weather forecasting: rerouting or grounding flights near storms saves more lives and money than ever before. The company Safety Line is using Panasonic's weather forecasting to optimize the climb profile for commercial aircraft, and claims its system can reduce fuel consumption by up to 10 percent during ascent. To put that in perspective, US airline carriers spent $24.6 billion on fuel last year.
Better weather forecasting also saves lives and helps speed up rebuilding efforts. For severe storms, IBM has started combining its weather forecasting tools with information about utilities' distribution networks and data about local ground cover. Using machine learning, it predicts likely outages, and IBM claims its damage predictions are 70 to 80 percent accurate 72 hours before a storm is expected.
The potential sources of weather-related data will continue to grow dramatically, and new advances in machine learning are making it possible for government agencies and companies to make better use of all this data. Weather forecasting can never be truly perfect, but AI will allow the practice to keep improving in both accuracy and resolution.
The more refined and localized our weather information gets, the easier it will be to find distinct patterns and connections. Even small improvements in weather forecasting will give companies useful new data by revealing new correlations, along with more lead time to take advantage of them.
Natural disasters are inevitable and unpredictable in nature; nobody can say exactly what will happen next. Over time, however, human beings have learned to extract information from past events and build patterns from that information. In the past those patterns were unclear and difficult to compute, but modern technology helps to collect data and draw results from it effectively.
Artificial intelligence plays a key role in pattern recognition and in analyzing past events to predict future ones. Techniques such as neural networks (NN), genetic algorithms (GA), particle swarm optimization (PSO), clustering, and many more help us find patterns and make predictions. Alone, these algorithms do not generate good results, but merging two algorithms yields better results and a better understanding of current and future events.
Using artificial intelligence techniques, the success rate of prediction is about 60-70%. Although this is not highly accurate, it helps save resources and lives. With weather prediction, fewer crops are destroyed than before, and air traffic is managed well, with warnings issued before anything bad can happen. In short, artificial intelligence has made a positive impact on people's lives by giving them useful information in advance.