Thursday, February 25, 2010
Poseidon, Sisyphus & Existentialism
The Existentialist view is that life can often be absurd, a view exemplified by Camus' The Myth of Sisyphus. Camus' conclusion is that although we face trials, we must "see Sisyphus happy." The lesson is that we must learn to march to the beat of a different drummer.
Wednesday, February 24, 2010
Earthquake Thermodynamics?
With respect to the unexpected deviations from the distributions for Southern California, it may also be that there is some coupling among the faults, or the statistics may be affected to some extent by absorption in the medium. Seismologists make assumptions about the location of the earthquake, the location of the seismograph, and the path connecting them. In optics the medium can absorb some of the radiation, and we think of it as a filter. The same may be true here, with the Earth acting as a filter. It is less likely that a discrete spectrum is present in the earthquake data.
Tuesday, February 23, 2010
Are the Data and the Fit Holographic?
One of the problems with the fit that was obtained is that the deviations of the histogram points from the fit curve are greater than expected. Near the peak they are well beyond the 3σ bounds. This indicates that the fluctuations are probably not statistical in nature. One has to consider the possibility that there is some correlation among events in the occurrence of earthquakes. We can no longer assume that they are statistically independent events. Among the possibilities are a common cause, aftershocks following a major earthquake, or an earthquake at one location stimulating earthquakes elsewhere. Is there an earthquake equivalent of a laser? If so, we should probably be more concerned about our neighbors.
One item that I forgot to mention is that the peak of the fit is at M 1.086. I also need to add a correction, or maybe just a clarification, concerning the failure of the "equilibrium distribution." What failed was the assumption that the distribution we were looking for was connected with the equilibrium distribution of the mechanism. Each mechanism and its transition matrix has an equilibrium solution, and we can't exclude mechanisms just because they have equilibrium solutions. It is more likely that the simple feedback mechanism considered failed to give a distribution of the correct shape with a peak near M 1.1*.
*edit: It was possible to find a g function that produced a peak for an earthquake distribution, but the peak was not where g = 1, which contradicted the initial assumptions.
Monday, February 22, 2010
A Simplified Fit for the Earthquake Distribution
Sunday, February 21, 2010
Assumptions for the Markov Process
Saturday, February 20, 2010
Some Hindsight on the Assumption of an Equilibrium Distribution
Curve fits can be difficult since there is often more than one minimum, and a number of minima close together can interfere with each other and prevent a computer program from converging to a solution even if one starts with an approximate solution. Even if one can find a local minimum, that does not guarantee that it is the global minimum, that is, the lowest of all the minima. So there may be competing mechanisms.
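The multiple-minima problem can be sketched with a toy objective and a multi-start fit. The function and starting points below are purely illustrative, not the earthquake fit itself; the point is only that the optimizer lands in different minima depending on where it starts, so one keeps the lowest.

```python
import numpy as np
from scipy.optimize import minimize

# A toy objective with two nearby minima, illustrating why a fit
# can converge to a local rather than the global minimum.
def f(x):
    return (x[0]**2 - 1.0)**2 + 0.1 * x[0]

# Start the optimizer from several initial guesses (multi-start).
starts = [-2.0, -0.5, 0.5, 2.0]
results = [minimize(f, [x0]) for x0 in starts]
best = min(results, key=lambda r: r.fun)
print([round(r.x[0], 3) for r in results])  # different starts find different minima
print(round(best.x[0], 3))                  # keep the lowest: the global minimum
```

Here the global minimum is near x = -1, but starts on the right side converge to the shallower minimum near x = +1.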
Conclusion: Equilibrium Distribution = FAIL
A New Fit for the Southern California Earthquake Data
The gain function is what one would expect, being greater than one to the left of the peak and less than one to the right, where the distribution steadily decreases. Where the slope of the plot of logN is approximately constant, the ratio of consecutive numbers of earthquakes would be the same, and one would expect g to be a flat line for those magnitudes. In the plot below the solid line represents the g values computed from the polynomial, and the points are the computed ratios of consecutive numbers of earthquakes, which serve as a check on the results.
The fraction of states, r, for which there is an earthquake is 1/2 at the peak, as one would expect, since g = 1 at the peak and r(1+g) = 1. I borrowed the term gain for the consecutive ratios from the analogy with the ratio of the output to the input of a feedback amplifier. A feedback amplifier is stable as long as the gain is not -1, which cannot happen here since the earthquake counts are whole numbers. So the earthquake mechanism is probably stable too, and one would expect any fluctuations to be damped out.
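The relation between the counts, the gain, and r can be checked numerically. The histogram counts below are made-up placeholders, not the actual Southern California data; the pattern (g > 1 left of the peak, g < 1 right of it, r ≈ 1/2 where g crosses 1) is what matters.

```python
import numpy as np

# Hypothetical histogram counts N_k per magnitude bin (illustrative only;
# not the actual Southern California counts).
N = np.array([120., 180., 240., 260., 250., 200., 150., 100., 60., 30.])

# "Gain" g_k = N_{k+1} / N_k, the ratio of consecutive counts.
g = N[1:] / N[:-1]

# r = 1 / (1 + g), the fraction of states producing an earthquake;
# g crosses 1 at the peak, where r = 1/2.
r = 1.0 / (1.0 + g)
peak = int(np.argmax(N))
print(np.round(g, 3))   # > 1 before the peak, < 1 after
print(peak)             # index of the peak bin
```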
Friday, February 19, 2010
A Review of the Earthquake Mechanism's Fit to the Observed Earthquake Numbers
For different intervals of time one can estimate the total number of earthquakes in a particular interval as N = R ΔT. This number is then multiplied by the relative probabilities to estimate the number of earthquakes for each histogram interval.
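The estimate N = R ΔT, scaled by the relative probabilities, is a one-liner. The rate R, interval ΔT, and probabilities below are assumed placeholder values for illustration.

```python
import numpy as np

# Estimate expected counts per histogram interval from an overall rate.
# R, dT, and p are illustrative placeholders, not fitted values.
R = 1200.0   # earthquakes per year (assumed)
dT = 0.5     # interval of time in years
p = np.array([0.05, 0.15, 0.30, 0.25, 0.15, 0.10])  # relative probabilities (sum to 1)

N_total = R * dT       # total expected earthquakes: N = R * dT
N_bins = N_total * p   # expected number in each histogram interval
print(N_total)         # 600.0
print(N_bins)
```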
*edit: The g function was assumed to be for the number of earthquakes. There appears to be a discrepancy between the fitted g function and the ratio of consecutive numbers of expected earthquakes. They should agree, and I don't know what went wrong. The fit may have compensated for an error, so one cannot assume that the earthquake mechanism works properly.
Wednesday, February 17, 2010
Improved Fit for Southern California Earthquake Data (2000-2009)
The new fit shows higher values for the earthquake rates above the peak in r.
Tuesday, February 16, 2010
Southern California Earthquake Data Fit for 2000 - 2009
The earthquake rates for the Markov process of the assumed mechanism indicate a central peak and are relatively smaller at both large and small magnitudes.
Friday, February 12, 2010
WISE NEO and filtering images
Tuesday, February 9, 2010
A Mechanism Fitting Earthquake Activity with a Peak
The values of r and f are all that is needed to describe the mechanism and are the components of the transition matrix. The transition matrix can be used to find the equilibrium values for the occupancy of the states. These values are unchanged when multiplied by the transition matrix. To get the number of earthquakes in each magnitude interval one multiplies the equilibrium values by r_k. In the plot below logN is the logarithm of the number of earthquakes observed and logN_fit is the logarithm of the number computed from the equilibrium values. Note that this mechanism does have a peak unlike the linear fit found previously. The Poisson distribution does not appear to be an equilibrium distribution. k is the value of k used in the Poisson fit and is approximately 140 + M/ΔM with ΔM = 0.1. The peak is at M 1.1 (k = 151) and there are 10 intervals per M 1.0 step.
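The equilibrium calculation from r and f can be sketched as follows. The chain here is a small 4-state stand-in with a constant r, not the fitted mechanism; it shows how the transition matrix is built, how the equilibrium occupancy (unchanged under multiplication by the matrix) is found by iteration, and how r_k times the occupancy gives the earthquake numbers.

```python
import numpy as np

# A minimal sketch of the equilibrium occupancy for a transition matrix
# built from r (earthquake probability) and f (advance probability).
# The 4-state chain and constant r are illustrative, not the fitted model.
n = 4
r = np.full(n, 0.2)      # probability of an earthquake from each state
f = 1.0 - r              # probability of advancing to the next state
T = np.zeros((n, n))
for k in range(n - 1):
    T[k + 1, k] = f[k]   # advance: state k -> k+1
    T[0, k] = r[k]       # earthquake: return to state 0
T[0, n - 1] = 1.0        # last state always resets

# The equilibrium occupancy p satisfies T p = p: iterate until unchanged.
p = np.full(n, 1.0 / n)
for _ in range(500):
    p = T @ p
p /= p.sum()

N_quakes = r * p         # expected earthquakes per state: r_k times occupancy
print(np.round(p, 4))
print(np.round(N_quakes, 4))
```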
Saturday, February 6, 2010
What Went Wrong with the Poisson Fit?
Thursday, February 4, 2010
A Simple Markov Process for Earthquakes
The length of the chain can be extended to any number of states. If one chooses 15 states to correspond to the histogram intervals in the table computed for the fit to the 2009 earthquakes then the number of earthquakes for the given interval of time will be r times the number occupying each state for all except the last. The numbers agree surprisingly well for such a simple model. The model is not sophisticated enough to explain a peak in the earthquake distribution.
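The limitation mentioned above can be seen directly: with a single constant r, the equilibrium occupancy falls off geometrically, so r times the occupancy decreases monotonically and no peak is possible. The numbers below (r = 0.3, 15 states) are illustrative only.

```python
# With constant r the equilibrium occupancy of a 15-state chain is
# geometric, so the expected earthquake counts decrease monotonically:
# this simple chain cannot produce a peak. Illustrative numbers only.
r = 0.3
n_states = 15
occupancy = [(1.0 - r) ** k for k in range(n_states)]
total = sum(occupancy)
occupancy = [x / total for x in occupancy]
quakes = [r * x for x in occupancy[:-1]]   # r times occupancy, all but last state
print(all(a > b for a, b in zip(quakes, quakes[1:])))  # True: no peak
```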
A Problem with the Poisson Distribution.
Conclusion: Poisson = FAIL
Tuesday, February 2, 2010
Multiple Events from a Single Source
It is easily seen that the probability distribution for multiple observations is the Poisson distribution.
The P_k can be used to determine the expected number of observations from a single source. The number will range from 0 to e, 2.71828... .
The expected number of events is an increasing function of P, but it is not a linear function of the probability. So it is possible that more than one earthquake will happen on the same fault in a given interval if they occur at random. Like lightning, don't count on it not happening in the same place twice.
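The Poisson probabilities P_k and the possibility of repeat events are easy to check numerically. The rate lam below is an assumed illustrative value, not taken from the earthquake data.

```python
import math

# Poisson probabilities P_k = lam^k e^{-lam} / k! for a single source,
# with an illustrative (assumed) rate lam.
lam = 1.0
P = [lam**k * math.exp(-lam) / math.factorial(k) for k in range(20)]
expected = sum(k * p for k, p in zip(range(20), P))
print(round(sum(P), 6))    # the probabilities sum to ~1
print(round(expected, 6))  # expected number of events equals lam
print(round(P[2], 4))      # P_2 > 0: two events in one interval is possible
```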
Continuous Probability for a Single Event
A plot helps visualize the continuous distribution (solid red curve) and its relation to the infinitesimal rate. In the plot it was assumed that r = 0.5 (per unit of time). The infinitesimal rate is just the slope of the continuous probability distribution at t = 0 (dotted blue line). This is the justification for using Δp = r Δt as an estimate of the probability for a small interval Δt. The probability (dP) that an event will occur after a long period of time decreases because it has probably already happened. The distribution is determined by the infinitesimal rate, and the average time to an occurrence is 1/r. It should be noted that this is the probability that just one event will occur before time t (or in a time interval of width t).
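The two claims in this post, that the slope at t = 0 equals r and that the mean waiting time is 1/r, can be verified numerically for the exponential waiting-time distribution P(t) = 1 - e^{-rt} with the same r = 0.5 used in the plot.

```python
import math

# Exponential waiting-time distribution with rate r = 0.5, as in the plot:
# P(t) = 1 - e^{-rt}, density dP/dt = r e^{-rt}.
r = 0.5
P = lambda t: 1.0 - math.exp(-r * t)

# The slope at t = 0 approximates r, justifying dp ~ r dt for small intervals.
h = 1e-6
slope0 = (P(h) - P(0.0)) / h
print(round(slope0, 4))   # ~0.5 = r

# The mean waiting time is 1/r (numerical check by integrating t * density).
mean = sum(t * r * math.exp(-r * t) * 0.001 for t in
           (i * 0.001 for i in range(40000)))
print(round(mean, 2))     # ~2.0 = 1/r
```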