How the Weather Channel Gets Its Forecasts

Few viewers of the Weather Channel, or of any other weather forecast outlet, appreciate how technical the systems behind those predictions are. The process involves massive computing power, huge amounts of observational data, complex scientific models, and highly skilled forecasters. It’s an expensive activity, but ultimately one that can save lives.

The accuracy of the computer-generated forecast depends heavily on knowledge of the current state of the atmosphere. This can never be fully known, but the ever-growing array of weather satellites means that the atmosphere is measured at ever-finer resolution and with ever-increasing accuracy. Satellites make many millions of observations of atmospheric temperature, wind, and humidity as they scan the Earth. Observations also come from aircraft, weather balloons, radar, ships, and surface land stations. All these observations are collated, processed, and compared with a previous model forecast to produce the best estimate of the current atmospheric state. Sophisticated data assimilation techniques are used to ensure that the forecasting model starts from the most realistic state possible.
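
As a rough illustration of the idea only (operational centres use far more sophisticated variational and ensemble methods, and the values below are invented for the example), blending a previous forecast with new observations, weighted by how much each is trusted, can be sketched like this:

```python
# Toy illustration of the idea behind data assimilation: blend a previous
# model forecast (the "background") with new observations, weighting each
# by how much we trust it. Real systems use variational or ensemble methods
# over millions of observations; names and numbers here are illustrative.
import numpy as np

background = np.array([288.0, 290.5, 285.2])   # prior forecast temperatures (K)
observation = np.array([287.1, 291.0, 284.0])  # satellite/station observations (K)

background_error_var = 1.5 ** 2   # assumed error variance of the forecast
observation_error_var = 0.8 ** 2  # assumed error variance of the observations

# Weight (gain): trust the observations more when the background is less certain.
gain = background_error_var / (background_error_var + observation_error_var)

analysis = background + gain * (observation - background)
print("Analysis (initial state for the model):", analysis)
```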

The sheer volume of data requires enormous computing power to process it quickly enough to be useful. We’re not talking about a top-of-the-range PC here: if you ran a weather prediction system on one of those, you might just see your one-day forecast within a couple of months. We’re talking about supercomputers, hundreds or thousands of times more powerful than a typical PC.

Having derived the initial state of the atmosphere, the forecasting model drives it forward in time by applying the dynamical and physical equations that govern the atmosphere, such as the equations of motion. To achieve this, the model atmosphere is divided into boxes, each covering a given latitude/longitude area and depth. The smaller the boxes, the more accurate the forecast is likely to be, but the more computational power is required. The equations are applied within each box, and data flows between each box and its neighbours. This is repeated over a sequence of timesteps, typically a few minutes each, until the desired forecast length is reached. Each timestep requires many billions of calculations.
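
A heavily simplified sketch of that idea, stepping a single field forward along one row of grid boxes with a basic upwind scheme (real models solve the full three-dimensional equations for wind, temperature, moisture and more; the grid size, wind speed and timestep here are illustrative only):

```python
# Toy illustration of stepping a gridded model forward in time. A single
# field (temperature) is advected along one row of grid boxes by a constant
# wind, each box exchanging data with its upwind neighbour every timestep.
import numpy as np

n_boxes = 60
dx = 50_000.0        # grid spacing in metres (box width)
dt = 300.0           # timestep in seconds (5 minutes)
wind = 10.0          # constant eastward wind, m/s
n_steps = 288        # 288 x 5 minutes = a 24-hour forecast

# Initial state: a warm anomaly in the middle of the domain.
temperature = 285.0 + 5.0 * np.exp(-((np.arange(n_boxes) - 30) ** 2) / 20.0)

for step in range(n_steps):
    # Each box is updated from its own value and its upwind (western) neighbour.
    upwind = np.roll(temperature, 1)
    temperature = temperature - wind * dt / dx * (temperature - upwind)

print("Forecast temperature field after 24 hours:", temperature.round(2))
```

Even this toy run performs tens of thousands of arithmetic operations; a real global model does the equivalent over millions of boxes and many variables, which is why supercomputers are needed.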

At the end of all that, you have a forecast several days into the future. The products can be used raw, for automated output such as display on websites. But the full value of the computer-generated forecast is only realised by having a trained forecaster analyse it. He or she can judge how reliable the forecast is and apply corrections for known characteristics of the computer model.
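
As a simple illustration of the kind of correction involved (the bias value and output format are invented for the example), a known overnight cold bias might be removed from the raw model output like this:

```python
# Toy illustration of post-processing raw model output with a known bias.
# Suppose verification has shown the model runs about 1.2 C too cold
# overnight at a particular location; a forecaster (or an automated
# statistical scheme) can correct for that. Values are illustrative only.
raw_model_forecast = {"day_max_c": 21.4, "night_min_c": 8.1}

known_night_cold_bias_c = -1.2  # model minus observations, from past verification

corrected = dict(raw_model_forecast)
corrected["night_min_c"] = raw_model_forecast["night_min_c"] - known_night_cold_bias_c

print("Raw model forecast:   ", raw_model_forecast)
print("Forecaster-corrected: ", corrected)
```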

Computer models are currently not reliable enough on their own without a forecaster’s skills, and forecasters would not be able to make predictions beyond a couple of days without the models. Both are essential parts of producing the end product.

So when you watch the next Weather Channel forecast, don’t underestimate the amount of effort that has gone into producing it. And the effort continues: numerical weather prediction is a highly active research area, attracting huge investment in many countries.