Dear Web Hub Telescope,

Wonderfulness! I’m looking forward to a detailed article comparing the classical methods described by Dutton with the modern approach espoused by Baez et al.

Thanks, Daniel.

Sorry, here is the formula again; there was a missing }.


Just in case: the matrix $P$ is formed by dividing each element by a column-dependent number, namely the sum of all the elements in that column.

That is a good reason if you are planning to do complicated operations with the matrix, for example finding the inverse. But adding all the elements and then dividing each element by the sum is not a complicated operation. Is the data available? Maybe I can try to do it (with some help). If the maps do agree, I would say that proves the good scientific intuition in defining the threshold; if not, I think there is something to be explained.

There is another thing. For each column $j$ of the original matrix $A$, let $s_j = \sum_i A_{ij}$ be the sum of the elements of that column, and let $P$ be the matrix with elements $P_{ij} = A_{ij}/s_j$. Then $P$ is a stochastic matrix: for each column, all the elements add to 1. By Perron’s theorem, 1 is an eigenvalue and all other eigenvalues have norm less than 1.

The eigenvalue 1 is simple if the graph is connected, which should be the case here. The corresponding eigenvector is an equilibrium measure for the stochastic process associated with $P$. The point is that this measure is precisely the one defined above.
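A minimal sketch of this normalization and the equilibrium eigenvector, in NumPy; the matrix `A` below is a made-up nonnegative example standing in for the real data, and all the variable names are illustrative:

```python
import numpy as np

# A made-up nonnegative matrix standing in for the real data.
A = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 3.0],
              [3.0, 2.0, 0.0]])

s = A.sum(axis=0)   # column sums s_j
P = A / s           # P_ij = A_ij / s_j: each column of P now sums to 1

# Eigenvector of P for the eigenvalue 1 (the equilibrium measure).
vals, vecs = np.linalg.eig(P)
k = np.argmin(np.abs(vals - 1.0))
pi = np.real(vecs[:, k])
pi = pi / pi.sum()  # scale it into a probability vector

print(P.sum(axis=0))            # [1. 1. 1.]
print(np.allclose(P @ pi, pi))  # True: pi is invariant under P
```

If the underlying graph is connected, this eigenvector is unique up to scale and has all entries of one sign, so dividing by its sum gives the equilibrium measure directly.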

Could there be a climatic interpretation of this? Flow of information?

I am very intrigued by the very weakly correlated white strip just south of the most correlated zone. I mean, most of the ocean is light blue, and there is a clearly white strip just south of the red, green, and dark blue zone.

I enjoyed it very much.

Take a few steps back, and consider the question of why this material — at a very general level — could potentially be of interest to (1) you, and (2) the audience at NIPS. What would be the abstract for this talk?

Here are some possible ingredients:

– New area of application for network theory

– New area of application for machine learning

– Application area represents a pressing human concern

– Azimuth project is searching for ways that mathematicians, scientists and programmers can contribute to the understanding of significant environmental problems

– Made a decision to investigate a more concrete problem

– In this talk, I will begin by giving background and context on the El Niño phenomenon and its physics; then discuss climate network structures that have been posited as indicators for the occurrence of El Niño events; then evaluate a specific paper that uses this framework and makes specific, testable hypotheses about the preconditions for the occurrence of an El Niño event.

I would also suggest a section that talks about the role of machine learning in this study.

Good Luck!

*“the non-linear differential equations favored by Professor Dutton?”*

HenryB, the Azimuth Project is also working this angle.

Yes, you can always explain a dataset perfectly with a sufficiently complex model, but that model will be mostly useless on new data. This is known as *overfitting*. The most reliable way to guard against it is to fit the model on only part of the data and evaluate it on the remaining unseen *held-out set*. In this case you can tell just by eyeballing the scatter plots that there is no magic non-linear relationship (except one that wildly oscillates to fit individual points). There is just a general up-and-to-the-right trend plus a lot of noise. Most of the error comes from the noise, so fitting a smooth curve is unlikely to be a massive improvement over a straight line.
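As a minimal sketch of that guard, here is a held-out evaluation on synthetic data (the actual scatter-plot values are not reproduced here, so the upward-trend-plus-noise data, the split, and the degree-12 polynomial below are all illustrative assumptions, not anything from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = 2.0 * x + rng.normal(0.0, 1.0, 60)  # upward trend plus a lot of noise

# Hold out every third point; fit only on the rest.
held_out = np.arange(60) % 3 == 0
train = ~held_out

def poly_mse(degree, eval_idx):
    """Fit a polynomial of the given degree on the training points,
    return its mean squared error on the points selected by eval_idx."""
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x[eval_idx])
    return np.mean((y[eval_idx] - pred) ** 2)

# The flexible model always fits the training points at least as well...
print("train, line:    ", poly_mse(1, train))
print("train, deg 12:  ", poly_mse(12, train))
# ...but on the held-out points the wiggly curve has no such guarantee,
# and typically does worse.
print("held out, line:  ", poly_mse(1, held_out))
print("held out, deg 12:", poly_mse(12, held_out))
```

The training error of the degree-12 fit is guaranteed to be at most that of the straight line (its model class contains every line), which is exactly why training error alone tells you nothing; only the held-out numbers can distinguish signal from noise-fitting.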

I decided to eliminate the ‘global warming pause’ stuff.
