Program optimization: Difference between revisions

In some cases, adding more [[main memory|memory]] can help make a program run faster. For example, a filtering program will commonly read each line, filter it, and output that line immediately. This uses only enough memory for one line, but performance is typically poor due to the latency of each disk read. Caching the result is similarly effective, though it also requires more memory.
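The tradeoff above can be sketched in Python; this is a minimal illustration, and the function and file names are hypothetical:

```python
import io

def filter_lines(stream, predicate):
    """Stream lines one at a time: memory use stays at roughly one
    line, but on a real file each read may pay disk latency."""
    for line in stream:
        if predicate(line):
            yield line

# On a real file, a larger read buffer trades memory for fewer disk
# reads, e.g.: open("data.txt", buffering=1 << 20)
src = io.StringIO("keep 1\ndrop this\nkeep 2\n")
kept = list(filter_lines(src, lambda line: line.startswith("keep")))
print(kept)  # ['keep 1\n', 'keep 2\n']
```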
 
==When to optimize==
<!-- This section is linked from [[Python (programming language)]] -->
 
Typically, optimization involves choosing the best overall algorithms and data structures.<ref name="fallacy">{{cite web|url=https://ubiquity.acm.org/article.cfm?id=1513451|title=The Fallacy of Premature Optimization}}</ref> Frequently, algorithmic improvements can yield performance gains of several orders of magnitude, whereas micro-optimizations rarely improve performance by more than a few percent.<ref name="fallacy"/> If one waits to optimize until the end of the development cycle, then changing the algorithm may require a complete rewrite.
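As a small sketch of the scale difference, replacing a quadratic pairwise scan with a hash-based check changes the complexity class, which no micro-optimization of the inner loop could match (function names are illustrative):

```python
def has_duplicates_quadratic(items):
    # O(n^2): compares every pair; micro-optimizing this loop
    # still leaves quadratic growth for large inputs.
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            if a == b:
                return True
    return False

def has_duplicates_linear(items):
    # O(n) expected time: a set membership test replaces
    # the inner loop entirely.
    return len(set(items)) != len(items)

print(has_duplicates_quadratic([1, 2, 3, 2]))  # True
print(has_duplicates_linear([1, 2, 3, 2]))     # True
```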
Optimization can reduce [[readability]] and add code that exists only to improve [[Computer performance|performance]]. This may complicate programs or systems, making them harder to maintain and debug. As a result, optimization or performance tuning is often performed at the end of the [[development stage]].
 
[[Donald Knuth]] made the following two statements on optimization:
<blockquote> "In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal and I believe the same viewpoint should prevail in software engineering"<ref name="autogenerated268"/></blockquote>
 
"Premature optimization" describes a situation in which a programmer lets performance considerations affect the design of a piece of code prematurely. This can result in a design that is not as clean as it could have been, or in code that is incorrect, because the code is complicated by the optimization and the programmer is distracted by optimizing. The phrase is also often used as a rallying cry against all optimization.<ref>{{cite web|url=https://ubiquity.acm.org/article.cfm?id=1513451|title=The Fallacy of Premature Optimization}}</ref>
 
When deciding what to optimize, [[Amdahl's Law]] should be used to prioritize parts of the program based on the actual time spent in each part, which is not always clear from looking at the code without a [[Profiling (computer programming)|performance analysis]].
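Amdahl's law is straightforward to apply; a minimal sketch (the function name is illustrative):

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of the runtime is sped up
    by a factor s: 1 / ((1 - p) + p / s)."""
    return 1.0 / ((1.0 - p) + p / s)

# Even a near-infinite speedup of a part taking 10% of runtime
# caps the overall gain at about 1.11x:
print(round(amdahl_speedup(0.10, 1e9), 2))  # 1.11
```

This is why profiling matters: optimizing a part that accounts for little of the total runtime cannot improve the whole program by much.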
A better approach is therefore to design first, code from the design, and then [[profiling (computer programming)|profile]]/[[Benchmark (computing)|benchmark]] the resulting code to see which parts should be optimized. A simple and elegant design <!-- how is this produced, if not prematurely? --> is often easier to optimize at this stage, and profiling may reveal unexpected performance problems that would not have been addressed by premature optimization.
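A minimal profiling pass with Python's standard `cProfile` and `pstats` modules shows how the hot parts are identified after the code is written (the profiled functions are hypothetical stand-ins):

```python
import cProfile
import io
import pstats

def hot_path():
    # Dominates runtime in this toy program.
    return sum(i * i for i in range(100_000))

def cold_path():
    return 42

def program():
    hot_path()
    cold_path()

profiler = cProfile.Profile()
profiler.enable()
program()
profiler.disable()

out = io.StringIO()
# Sort by cumulative time so the hottest functions appear first.
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats()
report = out.getvalue()
print("hot_path" in report)  # True
```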
 
In practice, it is often necessary to keep performance goals in mind when first designing software, yet programmers must balance various tradeoffs. Development cost is significant, and hardware is fast.
 
Modern compilers and operating systems are efficient enough that the intended performance increases sometimes fail to materialize. Since compilers perform many automatic optimizations, some manual optimizations may yield an identical executable. Also, hardware may reduce the impact of micro-optimization; for example, hardware may cache data that is also cached at the software level.
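CPython's bytecode compiler illustrates the point: constant expressions are folded automatically at compile time, so hand-precomputing a value buys nothing. A small sketch (the variable name is illustrative; behavior shown is for CPython 3.8+):

```python
# CPython folds the constant expression "60 * 60 * 24" at compile
# time, so the precomputed literal offers no advantage.
folded = compile("seconds = 60 * 60 * 24", "<demo>", "exec")
literal = compile("seconds = 86400", "<demo>", "exec")
print(86400 in folded.co_consts)  # True: folding happened at compile time
print(folded.co_consts, literal.co_consts)
```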
 
==Macros==