Exploring Climate Data (Part 2)

guest post by Blake Pollard

I have been learning to make animations using R. This is an animation of the profile of the surface air temperature at the equator. So, the x axis here is the longitude, running approximately from 120° E to 280° E. I pulled the data from the region that Graham Jones specified in his code on GitHub: it’s the equatorial line in the region that Ludescher et al. used:
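A single frame of an animation like this is just a plot of temperature against longitude for one day. Here is a minimal sketch in R of what one such frame might look like; the vectors `lons` and `temps` are hypothetical stand-ins for the data actually extracted with Graham Jones’ code (the temperatures below are fake, generated just to make the plot run):

```r
# One frame of a temperature-profile animation: temperature vs. longitude.
# 'lons' matches the 2.5-degree NCEP grid from 120E to 280E; 'temps' is
# fabricated data standing in for a real extracted day of temperatures.
lons  <- seq(120, 280, by = 2.5)            # longitudes in degrees east
temps <- 300 + 2 * sin((lons - 120) / 40)   # fake temperatures in kelvin

plot(lons, temps, type = "l",
     xlab = "longitude (degrees east)",
     ylab = "surface air temperature (K)",
     main = "Equatorial temperature profile (one day)")
```

Looping such a plot over successive days, one frame per day, gives the animation above.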

For this animation I tried to show the 1997-1998 El Niño. Typically the Pacific is much cooler near South America, due to the upwelling of deep cold water:

(Click for more information.) That part of the Pacific gets even cooler during La Niña:

But it warms up during El Niños:


You can see that in the surface air temperature during the 1997-1998 El Niño, although by summer of 1998 things seem to be getting back to normal:

I want to practice making animations like this. I could make a much prettier and better-labelled animation that ran all the way from 1948 to today, but I wanted to think a little about what exactly is best to plot if we want to use it as an aid to understanding some of this El Niño business.

13 Responses to Exploring Climate Data (Part 2)

  1. John Baez says:

    It’s great that you can make animated gifs using R! What tricks did you use, after you got the necessary data?

    • Blake Pollard says:

      I used a package called ‘animation’, which you can easily install within R. You also need some software called ‘ImageMagick’, which you can download online. The tricky bit is pointing R to the directory where you installed ImageMagick. I think it has something to do with the use of forward slashes versus backslashes in the file path in R or Windows or whatever system you use. Anyway, I read some forums and found something that works, but I don’t understand why it works:

      ani.options(convert = shQuote('C://Program Files//ImageMagick-6.8.9-Q16//convert.exe'))

      Once R can find the function ‘convert’, which is part of ImageMagick and is the real workhorse of the whole thing, converting a bunch of images into a .gif is pretty easy.

      The command,
      saveGIF({ loop that makes images })

      creates a GIF. Inside the brackets is a loop that generates the images which will be the frames of your animation. You do all your cosmetics for the animation inside the brackets. You can also specify a directory to save the animation in; otherwise it ends up in some obscure temp folder.

      A good package for making plots/images is ggplot2.
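Putting the pieces above together, a minimal self-contained `saveGIF` call might look like the sketch below. This is not Blake’s actual code, just an illustration of the pattern he describes: a loop inside the braces draws one frame per iteration, and the `animation` package (with ImageMagick installed) stitches the frames into a GIF. The file name and interval are arbitrary choices:

```r
# Requires: install.packages("animation") and an ImageMagick install.
library(animation)

# On Windows you may need to point R at convert.exe first, e.g.:
# ani.options(convert = shQuote('C://Program Files//ImageMagick-6.8.9-Q16//convert.exe'))

saveGIF({
  for (day in 1:50) {
    # Each pass through the loop draws one frame of the animation.
    # Here we just animate a fake travelling wave as a placeholder.
    x <- seq(0, 2 * pi, length.out = 100)
    plot(x, sin(x + day / 5), type = "l", ylim = c(-1, 1),
         xlab = "longitude (arbitrary units)",
         ylab = "fake temperature anomaly",
         main = paste("Day", day))
  }
}, movie.name = "profile.gif", interval = 0.1)
```

The `interval` argument sets the delay between frames in seconds, and `movie.name` is where the GIF ends up, relative to the working directory.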

  2. Nick Stokes says:

    Blake,
    I’ve found animated GIFs fairly limiting. I found that I could do much better with a bit of JavaScript: you can give users full control of play speed, pausing, etc. Here is one effort, with R code and HTML/JS (referring to this). If you want to get really fancy, you can move on to WebGL.

    • davetweed says:

      So I’m a complete rank amateur interested in producing simple animated graphs. In my very limited reading it seems like anything other than GIF is going to fail in some commonly used web browsers. Is this impression wrong?

  3. WebHubTelescope says:

    Nick, have you joined the Azimuth Forum? We could use your insight on all things climate in the El Niño prediction project.

  4. arch1 says:

    Thanks, this is already very helpful! The part of the upwelling that never goes away must be within a few tens of miles of the continental shelf, right?

    My visual system craves more. If all you have is data for this single latitude, one possibility might be a two-part display: The top part would be a ‘3-D’ graph plotting temp=f(longitude, time), equipped with a manual slider controlling which time-slice gets displayed in the *bottom* graph (which would otherwise be just like the ones in this posting).

    • arch1 says:

      …and if this seems worthwhile, you could later add a second slider controlling an additional (space-slice) graph.

    • WebHubTelescope says:

      This is the connection between sea-level height and ENSO. A sensitive measure of tidal gauge readings along a coastline is closely correlated to measurements in the middle of the Pacific Ocean.
      http://azimuth.mathforge.org/discussion/1480/tidal-records-and-enso/

      Timely post Blake, thanks!

    • John Baez says:

      arch1 wrote:

      My visual system craves more. If all you have is data for this single latitude…

      No, Blake was using air surface temperature data available from here:

      NCEP/NCAR Reanalysis 1: Surface.

      NCEP is the National Centers for Environmental Prediction, and NCAR is the National Center for Atmospheric Research. They have a bunch of files here containing worldwide daily average temperatures on a 2.5 degree latitude × 2.5 degree longitude grid (that’s 144 × 73 grid points), from 1948 to 2010. And if you go here, the website will help you get data from within a chosen rectangle in a grid, for a chosen time interval. These are NetCDF files.

      I discussed all this and how to use Graham Jones’ program for converting these NetCDF files into a format suitable for work with the statistical programming language R in El Niño Project (Part 5). Blake used this program to extract temperatures at just a single latitude, but it’s equally easy to get temperature data for any sub-rectangle of the grid for any interval of time between 1948 and 2010.
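For readers who want to try this directly, the NetCDF files can also be read from R with the `ncdf4` package, pulling out just a sub-rectangle of the grid via the `start` and `count` arguments of `ncvar_get`. The sketch below is an illustration, not Graham Jones’ program: the file name is an example of the NCEP naming convention, and the particular indices (one latitude, a band of longitudes, one year of days) are hypothetical choices:

```r
# Requires: install.packages("ncdf4") and a downloaded NCEP reanalysis file.
library(ncdf4)

# Open one year of daily surface air temperature (example file name).
nc <- nc_open("air.sig995.1997.nc")

# Extract a sub-rectangle of the 144 x 73 grid: here, 65 longitudes at a
# single latitude, for 365 days. Indices are illustrative; the dimensions
# of the "air" variable are (lon, lat, time).
air <- ncvar_get(nc, "air",
                 start = c(49, 37, 1),     # starting lon, lat, day indices
                 count = c(65, 1, 365))    # how many of each to read

nc_close(nc)

# 'air' is now a 65 x 365 matrix: one temperature profile per day.
```

Each column of the resulting matrix is one day’s longitude profile, ready to be plotted as a frame of an animation.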

      I urge people out there to play around with this and report back! Blake will do more, but he also has to write a thesis with me on the network theory of Markov processes, so it would be great if other people joined in the fun. I can post more articles in this series about other data visualizations if people 1) give them to me and 2) clearly explain them. Making the code available is good too.

  5. plarryhotter says:

    Maybe consider backdropping an image of the geographical region measured, for intuition.

    • On the context modeling server, I incorporated semantic lookup of climate properties. Here is a snapshot which includes an image backdrop.

      Most of the data is available from DBpedia, a triple-store semantic database holding structured information extracted from Wikipedia.

      The semantic web is extremely powerful for this kind of stuff.
