Global Anti-Forecasting of Catastrophic (M6.2+) Earthquakes
Building a Free Cross-Platform Online Service for Global Anti-Forecasting (Positive Prediction of Absence) of M6.2+ Catastrophic Earthquakes
According to recent discoveries in geophysics, catastrophic (~M6+) earthquakes are caused by external (astrophysical) phenomena. The project will establish a free cross-platform online service for disseminating scientific anti-forecasts that identify upcoming periods of a "calm Earth" (absence of catastrophic seismicity) months ahead. The service will improve earthquake preparedness globally.
As the PI's discovery of an astrophysical cause of tectonics has shown, strong (M6.2+) earthquakes are not caused by internal (geophysical) forces, as previously believed, but by external (astrophysical) phenomena. The discovery was accepted for publication in the peer-reviewed journal Earthquake Science (Editor-in-Chief: Professor Chen Yuntai, member of the Chinese Academy of Sciences).
In brief: as three heavenly bodies revolve in a virtually co-planar solar system such as ours, they occasionally align dynamically (their relative angular velocities fall in step). When Earth stays aligned with two other bodies for 3+ days, the bodies magnify each other's oscillations to the point of shaking, felt on Earth as M6.2+ quakes. This physical phenomenon is observed as an increase-peak-decrease pattern in the magnitudes of M5.6+ earthquakes.
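The alignment mechanism described above can be sketched as a toy model: bodies on idealized circular, co-planar orbits are considered "in conjunction" when they fall within a small arc of one line through the Sun for 3+ consecutive days. The orbital periods, body names, tolerance, and time span below are illustrative assumptions, not values from the PI's research.

```python
import math

# Hypothetical mean angular velocities (degrees/day) for three bodies on
# idealized circular, co-planar orbits -- illustrative values only.
OMEGA = {"Earth": 360.0 / 365.25, "BodyA": 360.0 / 224.7, "BodyB": 360.0 / 687.0}

def longitude(body, day, phase0=0.0):
    """Heliocentric longitude in degrees on a given day (toy model)."""
    return (phase0 + OMEGA[body] * day) % 360.0

def aligned(day, tol_deg=5.0):
    """True when all three bodies lie within tol_deg of one line through
    the Sun (conjunction or opposition both count as alignment here)."""
    # Fold longitudes modulo 180 so opposite sides of the Sun coincide.
    folded = sorted(longitude(b, day) % 180.0 for b in OMEGA)
    gaps = [folded[i + 1] - folded[i] for i in range(len(folded) - 1)]
    gaps.append(180.0 - folded[-1] + folded[0])  # wrap-around gap
    spread = 180.0 - max(gaps)  # smallest arc containing all bodies
    return spread <= tol_deg

def conjunction_windows(days, min_len=3):
    """Return (start_day, end_day) runs of >= min_len consecutive aligned days."""
    windows, start = [], None
    for d in range(days + 1):
        if aligned(d):
            start = d if start is None else start
        elif start is not None:
            if d - start >= min_len:
                windows.append((start, d - 1))
            start = None
    if start is not None and days + 1 - start >= min_len:
        windows.append((start, days))
    return windows

print(conjunction_windows(3650))
```

A real implementation would replace the circular-orbit toy with published ephemerides; this sketch only shows the window-finding logic.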
In seismology, the science of geophysics is not used for earthquake forecasting; future seismic activity is instead guessed from statistical analyses of past quakes alone, as a last resort, because the overall physics underlying seismotectonics has been unknown. Here, geophysics alone is used, based on the discovery of astrophysical phenomena guiding planetary physics, with no need to resort to statistics.
Every science tries to reliably predict the natural processes it studies. When that is not possible because the nature of a process is unknown, the next best thing scientists can do is forecast. Classically, forecasts are less reliable than predictions because they substitute statistics for the missing physics. Thus seismology is an application of the science of geophysics to earthquake forecasting based on past quakes.
Once the physics of a process becomes known, it is all scientists need to make either a prediction or an anti-forecast (a safely inferred absence of an event), which is a prediction in its own right. In the case of earthquakes: knowing when we will be seismically safe is as important as knowing when we will not.
The project will establish the first free online service for global scientific anti-forecasting of M6+ earthquakes, reliably announcing upcoming periods with no hazardous quakes anywhere on Earth, months ahead.
The funds will be used to purchase equipment, pay students to collect and analyze data, and pay a developer/test engineer to create the cross-platform services. Specifically, we will install high-end computing platforms for the personnel involved.
Renting shared computing power may be necessary during phase One testing in order to develop and fine-tune the service for reliability and scalability.
Towards the project's end, a reliable, long-established non-profit foundation (preferably one with a long tradition) will be sought to take over the service and maintain it on a permanent basis.
The project methodology is based on:
- near-real-time data collection during the testing phase
- real-time data collection and synthetic data creation for the monitoring co-service
- data analysis, including numerical/spectral analyses
- verification of the online service model against science
- verification of the monitoring co-service against science.
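As one illustration of the numerical/spectral analyses listed above, the sketch below searches a daily event-count series for a periodicity using a discrete Fourier transform. The series is synthetic (Poisson background plus a weak injected cycle); the period, rates, and series length are made-up stand-ins for a real M5.6+ catalog.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily counts of M5.6+ events: Poisson background plus a weak
# injected periodicity (period and amplitude are illustrative, not measured).
days = 3 * 365
period = 29.5  # hypothetical forcing period in days
rate = 1.5 + 0.5 * np.sin(2 * np.pi * np.arange(days) / period)
counts = rng.poisson(rate)

# Discrete Fourier transform of the mean-removed series.
spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
freqs = np.fft.rfftfreq(days, d=1.0)  # cycles per day

# Strongest spectral peak (skip the zero-frequency bin).
peak = np.argmax(spectrum[1:]) + 1
print(f"dominant period = {1.0 / freqs[peak]:.1f} days")
```

With a real catalog, the same transform would be applied to observed counts; the synthetic injection here only demonstrates that the pipeline recovers a known period.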
Prior Activities Plan
The project will make use of data modeling in phase One. Both synthetic and real data will be verified before being used in the project.
Variations of the projected online service, represented as vector spaces, will be designed and compared against synthetic and real data, as well as against each other.
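One possible reading of this comparison, sketched here as an assumption rather than the project's actual design: each service variant scores daily bins with a "quiet Earth" confidence, and variants are compared to observed data and to each other by cosine similarity. All vectors and numbers below are fabricated for illustration.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(1)

# Hypothetical example over 30 daily bins: `observed` marks days that had
# no M6.2+ event (1 = quiet); each variant scores the same bins in [0, 1].
observed = (rng.random(30) > 0.2).astype(float)
variant_a = np.clip(observed + rng.normal(0, 0.2, 30), 0, 1)  # tracks the data
variant_b = rng.random(30)                                    # uninformed baseline

print("A vs observed:", round(cosine(variant_a, observed), 3))
print("B vs observed:", round(cosine(variant_b, observed), 3))
print("A vs B:       ", round(cosine(variant_a, variant_b), 3))
```

The same similarity measure serves both comparisons the text mentions: variant against data, and variant against variant.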
- Salaries (PI, developer, grad students) $75,000
- Equipment (2+ high-end platforms) $35,000
- Production costs (server, webdesign) $20,000
- Publishing (announcing, marketing) $15,000
- Travel (meetings, presentations) $5,000
- TOTAL $150,000
The budget items are essential for completion of the project. A project like this would cost considerably more if funded through the usual avenues, not just because of overhead but also because of the technical complexity and IT expertise required, here handled by an external developer/freelancer and, if necessary, an IT consultant.
High-end computing power is essential, as reflected in the Equipment item. The testing stage in phase One is essential and therefore has to be supported by a strong hardware architecture with scientific and commercial licensed software. Renting shared computing power may also be necessary.
The budgeted Production costs are minimal, as they include payments for items such as server rental/purchase for 5+ years.
The budgeted Travel costs are minimal too, given the number of personnel (4-5, depending on the number of students needed).
The budgeted Publishing expenses include marketing: though the service is free, it must be advertised so that the public becomes aware that it exists.
This research consists of 2 phases:
- May 01, 2018 - Create testing service along with background monitoring service
- Nov 01, 2018 - Launch full online service
In phase One, the feasibility and scalability of the proposed online service's variations will be tested against science for real-time data dissemination, accuracy, and reliability.
In phase Two, the full service will be established, released, publicly announced, and marketed.
The background monitoring service (developed during phase One) will be adjusted and included as part of the full service, for operational redundancy and increased reliability.
The background research by the PI regards the Earth as a mechanical oscillator forced externally by secondary gravitational-tidal effects, primarily mass-resonance magnification during conjunctions of Earth and two other heavenly bodies in our Solar system lasting 3+ days. This magnification was found to cause tectonic earthquakes.
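The forced-oscillator picture corresponds to the textbook damped driven harmonic oscillator, whose steady-state amplitude is magnified when the driving frequency approaches the natural frequency. The sketch below uses arbitrary dimensionless parameters, not geophysical values, and is only an illustration of resonance magnification, not the PI's model.

```python
import math

def steady_state_amplitude(omega_drive, omega0=1.0, gamma=0.05, f0=1.0):
    """Steady-state amplitude of the damped driven oscillator
    x'' + 2*gamma*x' + omega0**2 * x = f0*cos(omega_drive*t)."""
    return f0 / math.sqrt((omega0**2 - omega_drive**2) ** 2
                          + (2 * gamma * omega_drive) ** 2)

# Amplitude peaks sharply near resonance (drive frequency ~ natural
# frequency) and stays small far from it.
for w in (0.2, 0.9, 1.0, 1.1, 2.0):
    print(f"omega = {w:.1f}: amplitude = {steady_state_amplitude(w):.2f}")
```

The damping parameter `gamma` controls how tall and narrow the resonance peak is; smaller damping means stronger magnification at resonance.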
Based on the PI's research into Earth's magnified oscillations as a cause of seismotectonics, and on a modeling-independent approach to spectral analysis of geophysical data which he devised, global M5.6+ seismicity has been monitored in near-real time in relation to Earth's conjunctions over the past 3 years, with monthly results presented via the testing platform at www.seismo.info.
A near-real-time dissemination test model, utilizing third-party services such as Twitter and Facebook, has been developed. More than half a million tweets and posts have been disseminated.
Risks and challenges
The main challenge is to establish a reliable service across platforms and architectures.
Other challenges include securing the post-project life of the projected online service.