==Overview==
Although the term "optimization" is derived from "optimum",<ref>{{Cite book |last1=Antoniou |first1=Andreas |url=https://link.springer.com/content/pdf/10.1007/978-1-0716-0843-2.pdf |title=Practical Optimization |last2=Lu |first2=Wu-Sheng |series=Texts in Computer Science |publisher=[[Springer Publishing|Springer]] |year=2021 |edition=2nd |pages=1 |doi=10.1007/978-1-0716-0843-2 |isbn=978-1-0716-0841-8 |language=en}}</ref> achieving a truly optimal system is rare in practice; producing one is referred to as [[superoptimization]]. Optimization typically focuses on improving a system with respect to a specific quality metric rather than making it universally optimal. This often leads to trade-offs, where enhancing one metric may come at the expense of another.
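As a sketch of such a trade-off (the example and function name are illustrative, not drawn from the cited sources), memoization in Python spends memory on a cache in order to save repeated computation:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # trade-off: spends memory (the cache) to gain speed
def fib(n: int) -> int:
    """Naive recursive Fibonacci, made linear-time by caching results."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# With the cache this is instant; without it, the same recursion
# would take an exponential number of calls.
print(fib(90))
```

Here the "metric" being improved is running time, at the expense of the memory held by the cache, illustrating that the optimization is better with respect to one measure rather than universally.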
Furthermore, achieving absolute optimization often demands disproportionate effort relative to the benefits gained. Consequently, optimization processes usually stop short of a fully optimal solution, ending once further improvements no longer justify the effort.
==Levels of optimization==
===Compile level===
Use of an [[optimizing compiler]] with optimizations enabled tends to ensure that the [[executable program]] is optimized at least as much as the compiler can predict.
===Assembly level===
<!-- This section is linked from [[Python (programming language)]] -->
Typically, optimization involves choosing the best overall algorithms and data structures.<ref>{{cite web|url=https://ubiquity.acm.org/article.cfm?id=1513451|title=The Fallacy of Premature Optimization}}</ref> Frequently, algorithmic improvements yield performance gains of several orders of magnitude, whereas micro-optimizations rarely improve performance by more than a few percent.<ref>{{cite web|url=https://ubiquity.acm.org/article.cfm?id=1513451|title=The Fallacy of Premature Optimization}}</ref> If one waits to optimize until the end of the development cycle, then changing the algorithm may require a complete rewrite.
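As an illustrative sketch of this point (the example is not from the cited sources), replacing a linear scan of a list with a hash-based set changes the asymptotic cost of a membership test, an algorithmic improvement that no micro-optimization of the original scan could match:

```python
def common_slow(a, b):
    """O(len(a) * len(b)): 'x in b' scans the list b for every element of a."""
    return [x for x in a if x in b]

def common_fast(a, b):
    """O(len(a) + len(b)): a set gives average O(1) membership tests."""
    b_set = set(b)
    return [x for x in a if x in b_set]

a = list(range(0, 10_000, 2))
b = list(range(0, 10_000, 3))
assert common_slow(a, b) == common_fast(a, b)  # same result, very different cost
```

For large inputs the data-structure change dwarfs anything achievable by tuning the loop body itself.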
Frequently, micro-optimization can reduce [[readability]] and complicate programs or systems, making them more difficult to maintain and debug.
[[Donald Knuth]] made the following two statements on optimization:
<blockquote> "In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal and I believe the same viewpoint should prevail in software engineering"<ref name="autogenerated268"/></blockquote>
"Premature optimization" is often used as a rallying cry against all optimization in all situations for all purposes.<ref>{{cite web|url=https://ubiquity.acm.org/article.cfm?id=1513451|title=The Fallacy of Premature Optimization}}</ref><ref>{{cite web|url=https://www.javacodegeeks.com/2012/11/not-all-optimization-is-premature.html|title=Not All Optimization is Premature}}</ref><ref>{{cite web|url=https://www.infoworld.com/article/2165382/when-premature-optimization-isn-t.html|title=When Premature Optimization Isn't}}</ref><ref>{{cite web|url=https://prog21.dadgum.com/106.html|title="Avoid Premature Optimization" Does Not Mean "Write Dumb Code"}}</ref> Frequently, [[SOLID|clean code]] practices can result in code that is more complicated than simpler, more efficient code.<ref>{{cite web|url=https://devshift.substack.com/p/premature-abstractions|title=Premature Abstractions}}</ref>
When deciding what to optimize, effort should be focused on the bottlenecks where the program actually spends its time, which can be identified with a [[Profiling (computer programming)|profiler]].
Modern compilers are efficient enough that the intended performance increases sometimes fail to materialize. Since compilers perform many automatic optimizations, some manual optimizations may yield an identical executable. Hardware may also reduce the impact of micro-optimization; for example, hardware may already cache data that is cached at the software level.
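One concrete setting where this effect can be observed directly (a sketch, specific to CPython, whose compiler folds constant expressions): hand-computing a constant is a micro-optimization that produces identical bytecode to the readable form, so nothing is gained:

```python
import dis

# CPython folds the constant expression at compile time, so both
# source forms compile to the same code object contents.
code_expr = compile("x = 60 * 60 * 24", "<example>", "exec")
code_manual = compile("x = 86400", "<example>", "exec")

assert 86400 in code_expr.co_consts              # folded before execution
assert code_expr.co_code == code_manual.co_code  # identical bytecode

dis.dis(code_expr)  # disassembly shows the folded constant being loaded
```

The "optimized" source is harder to read yet executes the same instructions, which is the hallmark of a pointless micro-optimization.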
In practice, it is often necessary to keep performance goals in mind when first designing software, but the programmer balances the goals of design and optimization.
==Macros==
In particular, for [[just-in-time compiler]]s the performance of the [[Run time environment|run time]] compile component, executing together with its target code, is the key to improving overall execution speed.
==False optimization==
Sometimes, "optimizations" may hurt performance. Parallelism and concurrency add significant overhead, including coordination costs and energy usage, and a sequential implementation can outperform a parallel one; C code, for example, rarely uses explicit multiprocessing, yet it typically runs faster than code written in most other programming languages. Disk caching, paging, and swapping often significantly increase energy usage and hardware wear. Running processes in the background to improve startup time slows down all other processes.
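A sketch of the concurrency overhead in one setting (CPython, where the global interpreter lock prevents CPU-bound threads from running in parallel anyway): dispatching tiny tasks to a thread pool adds scheduling and queueing cost without any speedup, so the "optimized" version is typically slower:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n * n

nums = list(range(10_000))

start = time.perf_counter()
sequential = [square(n) for n in nums]
t_seq = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    threaded = list(pool.map(square, nums))
t_par = time.perf_counter() - start

assert sequential == threaded  # identical results...
# ...but for tasks this small the thread pool's coordination overhead
# usually dominates the work itself, so t_par generally exceeds t_seq.
print(f"sequential: {t_seq:.4f}s, threaded: {t_par:.4f}s")
```

The timings vary by machine, so no particular ratio is asserted here; the point is that the parallel dispatch machinery is pure overhead when the per-task work is trivial.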
==See also==