Computer algebra

A simpler expression than this is generally desired, and simplification is needed when working with general expressions.
 
This simplification is normally done through [[rewriting|rewriting rules]].<ref>{{cite book <!--Deny Citation Bot-->|last1=Buchberger |first1=Bruno |first2=Rüdiger |last2=Loos |chapter=Algebraic simplification |chapter-url=https://www.risc.jku.at/people/buchberg/papers/1982-00-00-B.pdf |title={{harvnb|Buchberger|Collins|Loos|Albrecht|1983}} |pages=11–43 |doi=10.1007/978-3-7091-7551-4_2 }}</ref> There are several classes of rewriting rules to be considered. The simplest are rules that always reduce the size of the expression, like {{math|''E'' &minus; ''E'' → 0}} or {{math|sin(0) → 0}}. They are systematically applied in computer algebra systems.
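For illustration, the following is a minimal sketch (in Python, not taken from any actual computer algebra system; the function name <code>simplify</code> and the tuple-based expression representation are only assumptions made for the example) of how size-reducing rules such as {{math|''E'' &minus; ''E'' → 0}} and {{math|sin(0) → 0}} can be applied systematically, bottom-up, to an expression tree:

<syntaxhighlight lang="python">
# Illustrative sketch only: expressions are numbers, symbol strings,
# or tuples (operator, operand, ...).  Size-reducing rules are applied
# bottom-up, so every subexpression is simplified before its parent.
def simplify(expr):
    if not isinstance(expr, tuple):
        return expr                          # numbers and symbols are left unchanged
    op, *args = expr
    args = [simplify(a) for a in args]       # simplify operands first
    if op == "-" and len(args) == 2 and args[0] == args[1]:
        return 0                             # E - E  ->  0
    if op == "sin" and args == [0]:
        return 0                             # sin(0)  ->  0
    return (op, *args)

print(simplify(("sin", ("-", "x", "x"))))    # sin(x - x) simplifies to 0
</syntaxhighlight>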
 
A difficulty occurs with [[associative operation]]s like addition and multiplication. The standard way to deal with associativity is to consider that addition and multiplication have an arbitrary number of operands, that is, that {{math|''a'' + ''b'' + ''c''}} is represented as {{math|"+"(''a'', ''b'', ''c'')}}. Thus {{math|''a'' + (''b'' + ''c'')}} and {{math|(''a'' + ''b'') + ''c''}} are both simplified to {{math|"+"(''a'', ''b'', ''c'')}}, which is displayed as {{math|''a'' + ''b'' + ''c''}}. For expressions such as {{math|''a'' &minus; ''b'' + ''c''}}, the simplest way is to systematically rewrite {{math|&minus;''E''}}, {{math|''E'' &minus; ''F''}} and {{math|''E''/''F''}} as, respectively, {{math|(&minus;1)⋅''E''}}, {{math|''E'' + (&minus;1)⋅''F''}} and {{math|''E''⋅''F''<sup>&minus;1</sup>}}. In other words, the internal representation of the expressions contains no subtraction, division, or unary minus, except within the representation of numbers.
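A minimal sketch of such a representation (again in Python and purely illustrative; the helper names <code>add</code>, <code>sub</code>, <code>neg</code> and <code>div</code> do not refer to any particular system's internals) flattens nested applications of an associative operator and rewrites subtraction, division and unary minus when expressions are constructed:

<syntaxhighlight lang="python">
# Illustrative sketch only.  Associative operators are stored in n-ary
# form, and subtraction, division and unary minus are eliminated on
# construction, as described above.
def flatten(op, args):
    out = []
    for a in args:
        if isinstance(a, tuple) and a[0] == op:
            out.extend(a[1:])                # merge nested "+" into one n-ary "+"
        else:
            out.append(a)
    return (op, *out)

def add(*args): return flatten("+", args)
def mul(*args): return flatten("*", args)
def neg(e):     return mul(-1, e)            # -E     ->  (-1)*E
def sub(e, f):  return add(e, neg(f))        # E - F  ->  E + (-1)*F
def div(e, f):  return mul(e, ("^", f, -1))  # E / F  ->  E * F^(-1)

print(add("a", add("b", "c")))   # ('+', 'a', 'b', 'c')
print(add(add("a", "b"), "c"))   # ('+', 'a', 'b', 'c')  -- same internal form
print(add(sub("a", "b"), "c"))   # ('+', 'a', ('*', -1, 'b'), 'c')
</syntaxhighlight>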
 
== History ==
At the beginning of computer algebra, circa 1970, when the long-known [[algorithm]]s were first put on computers, they turned out to be highly inefficient.<ref>{{cite book <!--Deny Citation Bot-->|title={{harvnb|Buchberger|Collins|Loos|Albrecht|1983}} |pages=95–113 |doi=10.1007/978-3-7091-7551-4_8 |first1=Erich|last1=Kaltofen|chapter=Factorization of polynomials |s2cid=18352153 }}</ref> Therefore, a large part of the work of researchers in the field consisted of revisiting classical [[algebra]] to make it [[Computable function|effective]] and of discovering [[algorithmic efficiency|efficient algorithms]] that implement this effectiveness. A typical example of this kind of work is the computation of [[polynomial greatest common divisor]]s, which is required to simplify fractions. Surprisingly, the classical [[Euclid's algorithm|Euclidean algorithm]] turned out to be inefficient for polynomials over infinite fields, and thus new algorithms needed to be developed. The same was also true for the classical algorithms of [[linear algebra]].
 
== See also ==