ANALYSIS MENU
How analysis works

To fully understand how the analysis routines work, you have to know a little bit about the theory behind them. All meteorological fields can be described as a series of waves. For example, assume that at a certain time the coldest spot in the country is International Falls, MN at 12 deg F and the hottest spot is Death Valley, CA at 96 deg F. Draw a long line on your map that passes through both of those points. If you gather data from cities along this line and graph their temperatures, the graph will show undulations, rises, and falls. These are mathematical waves in the temperature field.

Assume that you find that the temperature 200 miles on each side of Death Valley (along the line) is 94 deg F. You could then assume that most of the southwestern United States is hot. The temperature graph would show a gradual rise toward Death Valley and a gradual fall as you move away. But what if the temperature 200 miles on each side is instead 52 deg F? With Death Valley at 96, this would appear on your graph as a sharp upward "spike" at the location corresponding to Death Valley. These are two distinct mathematical waves: the first (with gradual rises and falls) means that the temperature field is composed of large, or long wavelength, features; the second (with sharp spikes and dips) means that the field is composed of short, or small wavelength, features. In short, gradual changes mean that a meteorological field has long wavelength characteristics, while sharp, localized changes imply small wavelength features.
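
To make this idea concrete, here is a short Python sketch (the temperatures and numbers are invented for illustration and are not part of Digital Atmosphere). It builds two temperature profiles along such a line, one with a gradual rise toward Death Valley and one with a sharp spike, and uses a Fourier transform to show how much of each profile is contained in the long wavelengths.

    import numpy as np

    # Sample temperatures at 64 evenly spaced points along the imaginary line.
    # The profiles below are made up purely to illustrate the two cases.
    x = np.linspace(0.0, 1.0, 64)
    broad_rise  = 70 + 26 * np.exp(-((x - 0.5) / 0.30) ** 2)   # gradual rise toward Death Valley
    sharp_spike = 52 + 44 * np.exp(-((x - 0.5) / 0.03) ** 2)   # narrow "spike" at Death Valley

    for name, temps in [("broad rise", broad_rise), ("sharp spike", sharp_spike)]:
        spectrum = np.abs(np.fft.rfft(temps - temps.mean()))
        # Fraction of the wave energy contained in the four longest waves
        long_wave_share = spectrum[1:5].sum() / spectrum[1:].sum()
        print(f"{name}: {long_wave_share:.0%} of the signal is in long wavelengths")

The broad rise turns out to be almost entirely long wavelength, while the spike spreads its energy across many short wavelengths.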

Digital Atmosphere uses a 30 x 30 mesh (grid) to sample these conditions. The mesh is a set of lines drawn across the map, 30 across and 30 down, whose intersections sample the wave structure present in the field you choose to analyze. Using equally-spaced grid points is the easiest way to perform computerized analysis calculations. Digital Atmosphere must find where each reporting station falls within the grid and map its value onto the grid, taking into consideration how far the station is from a gridpoint and deciding what to do with gridpoints that aren't assigned any data. This is what takes up the majority of the processing time.
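
As a rough illustration of this first step, the Python sketch below assigns a few station reports to the nearest points of a 30 x 30 mesh. The map bounds, station list, and the simple lat/lon-to-grid mapping are assumptions made for this sketch, not Digital Atmosphere's internals.

    import numpy as np

    # Hypothetical map bounds and a few made-up station reports (lon, lat, temp).
    LON_MIN, LON_MAX = -125.0, -65.0
    LAT_MIN, LAT_MAX = 25.0, 50.0
    N = 30                                    # the 30 x 30 mesh

    stations = [(-122.4, 37.8, 52.0),         # San Francisco
                (-118.2, 34.1, 85.0),         # Los Angeles
                (-93.4, 48.6, 12.0)]          # International Falls

    grid = np.full((N, N), np.nan)            # NaN marks gridpoints with no data yet

    for lon, lat, temp in stations:
        # Find the gridpoint nearest to the station and assign the report to it.
        col = round((lon - LON_MIN) / (LON_MAX - LON_MIN) * (N - 1))
        row = round((lat - LAT_MIN) / (LAT_MAX - LAT_MIN) * (N - 1))
        grid[row, col] = temp                 # most gridpoints remain unassigned

    print(np.count_nonzero(~np.isnan(grid)), "of", N * N, "gridpoints received data")

Only a handful of gridpoints receive data; the rest must be filled in somehow, which is what the techniques described next are for.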

So why not just map values to the grid and interpolate between all the points? Digital Atmosphere's nearest neighbor technique does something like this. It maps data to gridpoints and fills in unassigned gridpoints using the nearest data value, a process known as expansion. However, if the temperature in San Francisco is 52 and in Los Angeles is 85 (and assuming we have NO other data), the analyzed field anywhere in California will show either 52 or 85 degrees! This is clearly not correct, so after expansion, Digital Atmosphere runs a smoothing algorithm. Smoothers "smooth" out the mesh by changing the value at each gridpoint to an average of itself and its neighbors. Unfortunately, this can damp out features in the field: too much smoothing will remove all of the short wavelength features.
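
A minimal Python sketch of expansion followed by a single smoothing pass might look like the following. It is illustrative only; Digital Atmosphere's actual smoother and its handling of the mesh may differ.

    import numpy as np

    def expand(grid):
        # Fill each unassigned (NaN) gridpoint with the value of the nearest assigned gridpoint.
        filled = grid.copy()
        rows, cols = np.where(~np.isnan(grid))           # gridpoints that already hold data
        for r in range(grid.shape[0]):
            for c in range(grid.shape[1]):
                if np.isnan(filled[r, c]):
                    nearest = ((rows - r) ** 2 + (cols - c) ** 2).argmin()
                    filled[r, c] = grid[rows[nearest], cols[nearest]]
        return filled

    def smooth(grid):
        # One smoothing pass: average each gridpoint with its four neighbors.
        padded = np.pad(grid, 1, mode="edge")
        return (padded[1:-1, 1:-1] + padded[:-2, 1:-1] + padded[2:, 1:-1]
                + padded[1:-1, :-2] + padded[1:-1, 2:]) / 5.0

    # Tiny 5 x 5 demonstration: only two "stations" have been mapped to the mesh.
    grid = np.full((5, 5), np.nan)
    grid[1, 1] = 52.0                                    # e.g. San Francisco
    grid[3, 4] = 85.0                                    # e.g. Los Angeles
    analysis = smooth(expand(grid))                      # expansion, then one smoothing pass
    print(np.round(analysis, 1))

Before smoothing, every gridpoint holds either 52 or 85; the smoothing pass blends the boundary between them, and each additional pass would blur it (along with any real detail) further.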

A better way is to use the Weighted, Barnes, or Cressman techniques. These techniques look at each gridpoint, find where it corresponds to on the earth, examine the values of the stations surrounding that location, and compute a carefully weighted average that best represents what the value at that gridpoint probably is. The Barnes algorithm goes a step further: it goes back, checks how incorrect its mesh turned out to be, and applies an adjustment to produce the final analysis. The Cressman algorithm does the same thing, but it begins its analysis using values at stations over a wide area, then gradually tightens it up, running corrections with values at stations over a progressively narrower area. This allows Cressman to accurately resolve large waves as well as small, localized waves.
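
The Python sketch below shows the general idea behind this kind of successive-correction analysis. It is a simplified stand-in rather than Digital Atmosphere's actual code: the observations, the weight function, the shrinking radii, and the shortcut of using each gridpoint's own value as the first guess are all assumptions chosen for illustration.

    import numpy as np

    # Made-up observations (x, y in grid units, plus a temperature); purely illustrative.
    obs_x = np.array([3.0, 11.0, 22.0, 27.0])
    obs_y = np.array([5.0, 20.0, 8.0, 25.0])
    obs_vals = np.array([52.0, 61.0, 85.0, 96.0])

    N = 30
    grid_x = np.arange(N, dtype=float)
    grid_y = np.arange(N, dtype=float)
    grid_vals = np.full((N, N), obs_vals.mean())   # crude first guess: the mean of all reports

    def correction_pass(grid_vals, radius):
        # One pass: nudge each gridpoint toward nearby observations, weighting every
        # observation within "radius" by the Cressman weight (R^2 - d^2) / (R^2 + d^2).
        corrected = grid_vals.copy()
        for i in range(N):
            for j in range(N):
                d2 = (obs_x - grid_x[j]) ** 2 + (obs_y - grid_y[i]) ** 2
                w = np.where(d2 < radius ** 2, (radius ** 2 - d2) / (radius ** 2 + d2), 0.0)
                if w.sum() > 0.0:
                    corrected[i, j] += np.sum(w * (obs_vals - grid_vals[i, j])) / w.sum()
        return corrected

    # Successive corrections: a large radius captures the long waves first, then
    # progressively smaller radii restore the short, localized waves.
    for radius in (20.0, 10.0, 5.0):
        grid_vals = correction_pass(grid_vals, radius)

Shrinking the radius between passes is what lets the analysis pick up the broad pattern first and the local detail last, which is the behavior described above for Cressman.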

Let's now take a closer look at each type of analysis.