Manipulating the vast datasets and performing the complex calculations necessary to modern numerical weather prediction requires some of the most powerful [[supercomputer]]s in the world. Even with the increasing power of supercomputers, the [[forecast skill]] of numerical weather models extends to only about six days. Factors affecting the accuracy of numerical predictions include the density and quality of observations used as input to the forecasts, along with deficiencies in the numerical models themselves. Post-processing techniques such as [[model output statistics]] (MOS) have been developed to improve the handling of errors in numerical predictions.
A more fundamental problem lies in the [[Chaos theory|chaotic]] nature of the [[partial differential equation]]s that govern the atmosphere. It is impossible to solve these equations exactly, and small errors grow with time (doubling about every five days). Present understanding is that this chaotic behavior limits accurate forecasts to about 14 days, even with accurate input data and a flawless model. In addition, the partial differential equations used in the model need to be supplemented with [[Parametrization (climate)|parameterizations]] for [[solar radiation]], [[moist processes]] (clouds and [[precipitation (meteorology)|precipitation]]), [[heat transfer|heat exchange]], soil, vegetation, surface water, and the effects of terrain. In an effort to quantify the large amount of inherent uncertainty remaining in numerical predictions, [[ensemble forecasting|ensemble forecasts]] have been used since the 1990s to help gauge the confidence in the forecast and to obtain useful results farther into the future than otherwise possible. This approach analyzes multiple forecasts created with an individual forecast model or multiple models.
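The chaotic error growth and ensemble idea described above can be sketched with the Lorenz (1963) system, a standard three-variable toy model of atmospheric convection; the integrator, perturbation size, and step counts here are illustrative choices, not any operational forecast system:

```python
import math
import random

def lorenz_rhs(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Lorenz (1963) equations: a minimal chaotic "atmosphere".
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt=0.01):
    # Classical 4th-order Runge-Kutta time step.
    k1 = lorenz_rhs(state)
    k2 = lorenz_rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = lorenz_rhs(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = lorenz_rhs(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def integrate(state, steps, dt=0.01):
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state

def spread(members):
    # Root-mean-square distance of ensemble members from their mean:
    # a simple measure of forecast uncertainty.
    n = len(members)
    mean = tuple(sum(m[i] for m in members) / n for i in range(3))
    return math.sqrt(sum(sum((m[i] - mean[i]) ** 2 for i in range(3))
                         for m in members) / n)

random.seed(0)
control = integrate((1.0, 1.0, 1.0), 1000)  # spin up onto the attractor
# Ensemble: copies of the control state with tiny random perturbations,
# mimicking unavoidable errors in the observed initial conditions.
members = [tuple(c + random.gauss(0.0, 1e-6) for c in control)
           for _ in range(10)]

early = spread([integrate(m, 100) for m in members])
late = spread([integrate(m, 1500) for m in members])
print(f"ensemble spread after  100 steps: {early:.2e}")
print(f"ensemble spread after 1500 steps: {late:.2e}")
```

Although every member starts within one part in a million of the same state, the spread grows by orders of magnitude over the longer integration: the analogue of initial-condition errors doubling every few days and eventually swamping the forecast, which is exactly the uncertainty an ensemble is meant to quantify.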