Talk:Load (computing)

: The article itself says that the load is calculated as an ''exponentially damped/weighted moving average''. As I understand it, that means a load of 5 in the last minute counts as more significant than a load of 5 in the minute before that. However, the example ''A load of 3.73 means that during the last minute, the CPU was overloaded by 273%'' assumes an arithmetic average and is therefore flawed.<br>PS: I still admit that the example, even if not absolutely correct, might help to give a new reader a good idea of this complex matter :D
:[[User:212.55.216.242|212.55.216.242]] 16:11, 15 August 2007 (UTC)
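For reference, a minimal sketch of how such an exponentially damped moving average behaves, assuming Linux's 5-second sampling interval for the 1-minute average (the names here are illustrative; the kernel's real code uses fixed-point arithmetic):

<syntaxhighlight lang="python">
import math

# Sketch of an exponentially damped moving average as used for the
# 1-minute load average. Linux samples the run queue every 5 seconds;
# each sample decays older ones by e^(-5/60).
SAMPLE_INTERVAL = 5.0    # seconds between samples (Linux's LOAD_FREQ)
WINDOW = 60.0            # averaging window in seconds (1-minute average)
DECAY = math.exp(-SAMPLE_INTERVAL / WINDOW)

def update_load(load, runnable):
    """Fold one sample of the runnable-task count into the average."""
    return load * DECAY + runnable * (1.0 - DECAY)

# After 60 seconds (12 samples), a sample's weight has decayed to
# DECAY**12 = e^-1, about 37%. So a burst of 5 runnable tasks seen a
# minute ago counts far less than a burst of 5 right now -- unlike an
# arithmetic mean, which would weight both equally.
</syntaxhighlight>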
 
 
:I totally agree that the example is wrong. What if you have 1 highest-priority job that needs 1 minute of CPU time, and 3 lowest-priority jobs that each need 5 seconds of CPU time? And what if the lowest-priority jobs are only scheduled when the highest-priority job is either blocked, sleeping or done?
:Then you would have a load average of 4 for the first minute. Remember, the total CPU time needed is 1 minute 15 seconds, so you would only need a 25% faster CPU to do all the work in one minute. The example in the article is undoubtedly wrong and misleading.
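A quick back-of-the-envelope check of those numbers (a sketch; the job durations are the ones assumed in the comment above):

<syntaxhighlight lang="python">
# Hypothetical job mix from the comment above: one 60-second job
# plus three 5-second jobs, all runnable during the first minute.
jobs = [60.0, 5.0, 5.0, 5.0]         # CPU seconds each job needs

runnable = len(jobs)                  # all 4 queued -> load average ~ 4
total_work = sum(jobs)                # 75 CPU seconds of work in total
speedup_needed = total_work / 60.0    # 75/60 = 1.25 -> a 25% faster CPU

print(runnable, total_work, speedup_needed)  # 4 75.0 1.25
</syntaxhighlight>

So a load average of 4 does not imply the machine needs to be 4 times faster, which is the commenter's point against the "overloaded by 273%" reading.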
 
== Unix or Windows or both ==