An [[algorithm]] is fundamentally a set of rules or defined procedures that is typically designed and used to solve a specific problem or a broad set of problems.
Broadly, algorithms define processes, sets of rules, or methodologies to be followed in calculations, data processing, data mining, pattern recognition, automated reasoning and other problem-solving operations. With the increasing automation of services, more and more decisions are being made by algorithms.
The following is a '''list of well-known algorithms''', along with one-line descriptions for each.
==Automated planning==
===Graph algorithms===
{{further|Graph theory}}
* [[Coloring algorithm]]: Graph coloring algorithm.
* [[Hopcroft–Karp algorithm]]: computes a [[maximum cardinality matching]] in a bipartite graph
====Graph search====
{{further|Search algorithm}}
* [[A* search algorithm|A*]]: special case of best-first search that uses heuristics to improve speed (a minimal sketch appears after this list)
* [[B*]]: a best-first graph search algorithm that finds the least-cost path from a given initial node to any goal node (out of one or more possible goals)
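As a concrete companion to the A* entry above, the following is a minimal sketch of A* on a small 4-connected grid, using the Manhattan distance as an admissible heuristic. The function name, the 0/1 grid encoding and the unit step cost are illustrative assumptions, not part of any reference implementation.

<syntaxhighlight lang="python">
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid of 0 (free) / 1 (blocked) cells; returns a path or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance, an admissible heuristic for unit-cost grid moves
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        f, g, cur = heapq.heappop(open_heap)
        if cur == goal:  # reconstruct the path by walking the predecessor links
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if g > best_g.get(cur, float("inf")):
            continue  # stale heap entry; a cheaper route to cur was already recorded
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # routes around the blocked middle row
</syntaxhighlight>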
** [[Bogosort]]: the list is randomly shuffled until it happens to be sorted
** [[Slowsort]]
** [[Stooge sort]]
* Hybrid
===Numerical algorithms===
{{further|Numerical analysis}}
====Differential equation solving====
====Elementary and special functions====
{{further|Special functions}}
* [[Computing π|Computation of π]]:
** [[Bailey–Borwein–Plouffe formula]]: (BBP formula) a spigot algorithm for the computation of the nth binary digit of π
** [[Borwein's algorithm]]: an algorithm to calculate the value of 1/π
** [[Chudnovsky algorithm]]: a fast method for calculating the digits of π
** [[Gauss–Legendre algorithm]]: computes the digits of [[pi]]
* [[Division algorithm]]s: for computing quotient and/or remainder of two numbers
** [[Goldschmidt division]]
** [[Long division]]
** [[Newton–Raphson division]]: uses [[Newton's method]] to find the [[Multiplicative inverse|reciprocal]] of D, and multiplies that reciprocal by N to find the final quotient Q (a minimal sketch appears after this list)
** [[Non-restoring division]]
** [[Restoring division]]
** [[SRT division]]
* Exponentiation:
** [[Addition-chain exponentiation]]: exponentiation by positive integer powers that requires a minimal number of multiplications
** [[Exponentiating by squaring]]: an algorithm used for the fast computation of [[Arbitrary-precision arithmetic|large integer]] powers of a number (a minimal sketch appears after this list)
* Hyperbolic and Trigonometric Functions:
** [[BKM algorithm]]: computes [[Elementary function (differential algebra)|elementary functions]] using a table of logarithms
** [[CORDIC]]: computes hyperbolic and trigonometric functions using a table of arctangents
* [[Montgomery reduction]]: an algorithm that allows [[modular arithmetic]] to be performed efficiently when the modulus is large
* [[Multiplication algorithm]]s: fast multiplication of two numbers
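As a concrete companion to the Newton–Raphson division entry above, the following is a minimal floating-point sketch: it refines an estimate of the reciprocal 1/D with the iteration x ← x·(2 − D·x) and then multiplies by N. The function name, the linear initial guess and the fixed iteration count are illustrative assumptions; hardware implementations work on fixed-point significands instead.

<syntaxhighlight lang="python">
def newton_raphson_divide(N, D, iterations=6):
    """Approximate N / D by Newton-Raphson refinement of the reciprocal of D."""
    if D == 0:
        raise ZeroDivisionError("division by zero")
    sign = -1.0 if (N < 0) != (D < 0) else 1.0
    N, D = abs(float(N)), abs(float(D))
    # Scale D into [0.5, 1) so a simple linear initial guess converges.
    shift = 0
    while D >= 1.0:
        D /= 2.0
        shift += 1
    while D < 0.5:
        D *= 2.0
        shift -= 1
    x = 48.0 / 17.0 - 32.0 / 17.0 * D  # classic initial estimate of 1/D on [0.5, 1)
    for _ in range(iterations):
        x = x * (2.0 - D * x)          # quadratic convergence toward 1/D
    return sign * N * x / 2.0 ** shift  # undo the scaling applied to D

print(newton_raphson_divide(355.0, 113.0))  # approximately 3.1415929
</syntaxhighlight>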
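Likewise, the exponentiating-by-squaring entry can be illustrated by a short sketch for non-negative integer exponents, optionally reduced modulo m as in modular exponentiation; the function and parameter names are illustrative assumptions.

<syntaxhighlight lang="python">
def power(base, exponent, modulus=None):
    """Compute base**exponent (optionally mod modulus) with O(log exponent) multiplications."""
    if exponent < 0:
        raise ValueError("this sketch handles non-negative integer exponents only")
    result = 1
    while exponent:
        if exponent & 1:  # low bit set: multiply the current base into the result
            result = result * base if modulus is None else (result * base) % modulus
        base = base * base if modulus is None else (base * base) % modulus
        exponent >>= 1    # square the base and move to the next bit of the exponent
    return result

print(power(3, 13))        # 1594323
print(power(3, 13, 1000))  # 323
</syntaxhighlight>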
====Interpolation and extrapolation====
{{further|Interpolation|Extrapolation}}
* [[Birkhoff interpolation]]: an extension of polynomial interpolation
* [[Cubic interpolation]]
====Linear algebra====
{{further|Numerical linear algebra}}
* Krylov methods (for large sparse matrix problems; ranked by SISC as the third most important class of numerical methods of the 20th century, after the fast Fourier transform and fast multipole methods)
* [[Eigenvalue algorithm]]s
** [[Arnoldi iteration]]
** [[Rayleigh quotient iteration]]
* [[Gram–Schmidt process]]: orthogonalizes a set of vectors (a minimal sketch appears after this list)
* [[Matrix multiplication algorithm]]s
** [[Cannon's algorithm]]: a [[distributed algorithm]] for [[matrix multiplication]] especially suitable for computers laid out in an N × N mesh
** [[Biconjugate gradient method]]: solves systems of linear equations
** [[Conjugate gradient]]: an algorithm for the numerical solution of particular systems of linear equations
** [[Gaussian elimination]]
** [[Gauss–Jordan elimination]]: solves systems of linear equations
** [[Gauss–Seidel method]]: solves systems of linear equations iteratively
** [[Levinson recursion]]: solves equations involving a [[Toeplitz matrix]]
** [[Stone's method]]: also known as the strongly implicit procedure or SIP, is an algorithm for solving a sparse linear system of equations
** [[Successive over-relaxation]] (SOR): method used to speed up convergence of the [[Gauss–Seidel method]]
** [[Tridiagonal matrix algorithm]] (Thomas algorithm): solves systems of tridiagonal equations
* [[SMAWK Algorithm]]
* [[Sparse matrix]] algorithms
** [[Cuthill–McKee algorithm]]: reduce the [[bandwidth (matrix theory)|bandwidth]] of a [[symmetric sparse matrix]]
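As a concrete companion to the Gram–Schmidt process entry above, the following is a minimal sketch of the classical procedure: each vector has its projections onto the previously accepted vectors subtracted and is then normalized. The function name, the tolerance and the use of plain Python lists rather than a numerical linear algebra library are illustrative assumptions (in floating point, the modified Gram–Schmidt variant is usually preferred for stability).

<syntaxhighlight lang="python">
import math

def gram_schmidt(vectors, tol=1e-12):
    """Orthonormalize a list of equal-length vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:  # subtract the projection of w onto each basis vector found so far
            dot = sum(wi * qi for wi, qi in zip(w, q))
            w = [wi - dot * qi for wi, qi in zip(w, q)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        if norm > tol:   # skip vectors that are (numerically) linearly dependent
            basis.append([wi / norm for wi in w])
    return basis

for q in gram_schmidt([[3.0, 1.0], [2.0, 2.0]]):
    print(q)  # two orthonormal vectors spanning the same plane
</syntaxhighlight>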
===Optimization algorithms===
{{main|Mathematical optimization}}
* [[Hybrid algorithm|Hybrid]] algorithms
* [[Alpha–beta pruning]]: search to reduce the number of nodes in the minimax algorithm (a minimal sketch appears at the end of this section)
* [[A hybrid BFGS-Like method]] (see https://doi.org/10.1016/j.cam.2024.115857)
* [[Branch and bound]]
* [[Bruss algorithm]]: see [[odds algorithm]]
** [[Greedy randomized adaptive search procedure]] (GRASP): successive constructions of a greedy randomized solution and subsequent iterative improvements of it through a local search
** [[Hungarian method]]: a combinatorial optimization algorithm which solves the [[assignment problem]] in polynomial time
* [[Conjugate gradient method]]s (see https://doi.org/10.1016/j.jksus.2022.101923)
* [[Constraint satisfaction]]{{anchor|Constraint satisfaction}}
** [[AC-3 algorithm]]: a general algorithm for constraint satisfaction
** [[Chaff algorithm]]: an algorithm for solving instances of the [[Boolean satisfiability problem]]
** [[Davis–Putnam algorithm]]: check the validity of a first-order logic formula
** [[Difference map algorithm]]: a general algorithm for constraint satisfaction
** [[DPLL algorithm|Davis–Putnam–Logemann–Loveland algorithm]] (DPLL): an algorithm for deciding the satisfiability of propositional logic formula in [[conjunctive normal form]], i.e. for solving the [[CNF-SAT]] problem
** [[Exact cover]] problem
*** [[Algorithm X]]: a [[nondeterministic algorithm]]
*** [[Dancing Links]]: an efficient implementation of Algorithm X
** [[Min conflicts algorithm]]: a general algorithm for constraint satisfaction
*** [[Fitness proportionate selection]] – also known as roulette-wheel selection
*** [[Stochastic universal sampling]]
*** [[Tournament selection]]
*** [[Truncation selection]]
** [[Memetic algorithm]]
** [[Swarm intelligence]]
* [[Hyperparameter optimization#Grid search|Grid Search]]
* [[Harmony search]] (HS): a [[metaheuristic]] algorithm mimicking the improvisation process of musicians
* [[A hybrid HS-LS conjugate gradient algorithm]] (see https://doi.org/10.1016/j.cam.2023.115304)
* [[Interior point method]]
* [[Line search]]
* {{anchor|Linear programming}}[[Linear programming]]
** [[Benson's algorithm]]: an algorithm for solving linear [[vector optimization]] problems
** [[Karmarkar's algorithm]]: The first reasonably efficient algorithm that solves the [[linear programming]] problem in [[polynomial time]].
** [[Simplex algorithm]]: an algorithm for solving [[linear programming]] problems
* [[Local search (optimization)|Local search]]: a metaheuristic for solving computationally hard optimization problems
** [[Random-restart hill climbing]]
* [[Stochastic tunneling]]
* [[Subset sum problem|Subset sum]] algorithm
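Returning to the alpha–beta pruning entry at the top of this list, the following is a minimal sketch of minimax search with alpha–beta cutoffs over a game tree given as nested lists whose leaves are static evaluations; the function name and the toy tree are illustrative assumptions.

<syntaxhighlight lang="python">
def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    """Minimax value of a game tree (nested lists; leaves are numbers) with alpha-beta pruning."""
    if not isinstance(node, list):  # leaf: return its static evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:       # beta cutoff: the minimizer will never allow this branch
                break
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:           # alpha cutoff
            break
    return value

tree = [[[3, 5], [6, 9]], [[1, 2], [0, -1]]]  # depth-3 toy tree
print(alphabeta(tree))                        # 5, with several subtrees pruned
</syntaxhighlight>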
==Computational science==
===Astronomy===
* [[Doomsday algorithm]]: day of the week
* [[Zeller's congruence]] is an algorithm to calculate the day of the week for any Julian or Gregorian calendar date (a minimal sketch appears after this list)
* various [[Computus|Easter algorithms]] are used to calculate the day of Easter
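As a concrete companion to the Zeller's congruence entry above, the following is a minimal sketch for Gregorian calendar dates; the function name and the return of an English weekday name are illustrative assumptions (the Julian-calendar form uses a slightly different century term).

<syntaxhighlight lang="python">
def zeller_weekday(year, month, day):
    """Day of the week for a Gregorian date via Zeller's congruence."""
    if month < 3:        # January and February count as months 13 and 14 of the previous year
        month += 12
        year -= 1
    K = year % 100       # year within the century
    J = year // 100      # zero-based century
    h = (day + (13 * (month + 1)) // 5 + K + K // 4 + J // 4 + 5 * J) % 7
    # In Zeller's convention h = 0 is Saturday, 1 is Sunday, and so on.
    return ["Saturday", "Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday"][h]

print(zeller_weekday(2000, 1, 1))  # Saturday
</syntaxhighlight>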
===Bioinformatics===
{{see also|List of algorithms#Sequence alignment|l1=Sequence alignment algorithms}}
* [[Basic Local Alignment Search Tool]] also known as BLAST: an algorithm for comparing primary biological sequence information
* [[Bloom filter]]: probabilistic data structure used to test for the existence of an element within a set. Primarily used in bioinformatics to test for the existence of a [[k-mer]] in a sequence or sequences (a minimal sketch appears after this list).
* [[Kabsch algorithm]]: calculate the optimal alignment of two sets of points in order to compute the [[RMSD|root mean squared deviation]] between two protein structures.
* [[Velvet (algorithm)|Velvet]]: a set of algorithms manipulating [[de Bruijn graph]]s for genomic [[sequence assembly]]
* [[Sorting by signed reversals]]: an algorithm for understanding genomic evolution.
* [[Maximum parsimony (phylogenetics)]]: an algorithm for finding the simplest phylogenetic tree to explain a given character matrix.
* [[UPGMA]]: a distance-based phylogenetic tree construction algorithm.
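As a concrete companion to the Bloom filter entry above, the following is a minimal sketch that derives k bit positions per item by double hashing a SHA-256 digest; the class name, the filter size and the number of hash functions are illustrative assumptions (a real k-mer index would size the bit array from the expected number of elements and the target false-positive rate).

<syntaxhighlight lang="python">
import hashlib

class BloomFilter:
    """Minimal Bloom filter: no false negatives, tunable false-positive rate."""

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item):
        # Derive k positions from one digest via double hashing.
        digest = hashlib.sha256(item.encode("utf-8")).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big")
        return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

bf = BloomFilter()
for kmer in ("ACGTA", "CGTAC", "GTACG"):
    bf.add(kmer)
print("ACGTA" in bf, "TTTTT" in bf)  # True, almost certainly False
</syntaxhighlight>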
===Geoscience===
{{further|Geoscience}}
* [[Vincenty's formulae]]: a fast algorithm to calculate the distance between two latitude/longitude points on an ellipsoid
* [[Geohash]]: a public ___domain algorithm that encodes a decimal latitude/longitude pair as a hash string
===Linguistics===
===Physics===
* [[Demon algorithm]]: a [[Monte Carlo method]] for efficiently sampling members of a [[microcanonical ensemble]] with a given energy
* [[Featherstone's algorithm]]: computes the effects of forces applied to a structure of joints and links
* [[Glauber dynamics]]: a method for simulating the [[Ising model]] on a computer
* [[Ground state]] approximation
** [[Variational method]]
* [[Sweep and prune]]: a broad phase algorithm used during [[collision detection]] to limit the number of pairs of solids that need to be checked for collision
* [[VEGAS algorithm]]: a method for reducing error in [[Monte Carlo simulation]]s
===Statistics===
** [[Expectation-maximization algorithm]]
** [[Fuzzy clustering]]: a class of clustering algorithms where each point has a degree of belonging to clusters
*** [[Fuzzy clustering#Fuzzy c-means clustering|Fuzzy c-means]]
*** [[FLAME clustering]] (Fuzzy clustering by Local Approximation of MEmberships): define clusters in the dense parts of a dataset and perform cluster assignment solely based on the neighborhood relationships among objects
** [[KHOPCA clustering algorithm]]: a local clustering algorithm, which produces hierarchical multi-hop clusters in static and mobile environments.
** [[k-means clustering]]: cluster objects based on attributes into partitions
** [[k-means++]]: a variation of this, using modified random seeds
** [[k-medoids]]: similar to k-means, but chooses datapoints or [[medoid]]s as centers
** [[Linde–Buzo–Gray algorithm]]: a vector quantization algorithm to derive a good codebook
** [[Lloyd's algorithm]] (Voronoi iteration or relaxation): group data points into a given number of categories; a popular algorithm for [[k-means clustering]] (a minimal sketch appears after this list)
** [[Single-linkage clustering]]: a simple agglomerative clustering algorithm
** [[SUBCLU]]: a subspace clustering algorithm
** [[WACA clustering algorithm]]: a local clustering algorithm with potentially multi-hop structures; for dynamic networks
** [[Ward's method]]: an agglomerative clustering algorithm, extended to more general Lance–Williams algorithms
* [[Estimation theory|Estimation Theory]]
** [[Expectation-maximization algorithm]]: a class of related algorithms for finding maximum likelihood estimates of parameters in probabilistic models
*** [[Ordered subset expectation maximization]] (OSEM): used in [[medical imaging]] for [[positron emission tomography]], [[single-photon emission computed tomography]] and [[X-ray]] computed tomography.
** [[Kalman filter]]: estimate the state of a linear [[Dynamical system|dynamic system]] from a series of noisy measurements
** [[Odds algorithm]] (Bruss algorithm): optimal online search for a distinguished value in sequential random input
* [[False nearest neighbor algorithm]] (FNN) estimates [[fractal dimension]]
* [[Hidden Markov model]]
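As a concrete companion to the Lloyd's algorithm and k-means clustering entries above, the following is a minimal sketch of Lloyd's iteration on 2-D points, alternating between assigning each point to its nearest center and recomputing each center as the mean of its cluster; the function name, the random seeding and the convergence test are illustrative assumptions (k-means++ differs only in how the initial centers are chosen).

<syntaxhighlight lang="python">
import random

def kmeans(points, k, iterations=100, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and center recomputation."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # naive random seeding
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:             # assignment step
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        new_centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c)) if c else centers[i]
            for i, c in enumerate(clusters)  # update step; empty clusters keep their old center
        ]
        if new_centers == centers:   # converged
            break
        centers = new_centers
    return centers

data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
print(kmeans(data, 2))  # one center near (0.1, 0.1) and one near (5.0, 5.0)
</syntaxhighlight>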
===Computer graphics===
{{further|Computer graphics}}
* [[Binary space partitioning]]
* [[Clipping (computer graphics)|Clipping]]
** [[Line clipping]]
* [[Slerp]] (spherical linear interpolation): quaternion interpolation for the purpose of animating 3D rotation
* [[Summed area table]] (also known as an integral image): an algorithm for computing the sum of values in a rectangular subset of a grid in constant time (a minimal sketch follows)
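As a concrete companion to the summed area table entry above, the following is a minimal sketch that builds the table with one extra row and column of zeros and then answers inclusive rectangle-sum queries in constant time; the function names and the coordinate convention are illustrative assumptions.

<syntaxhighlight lang="python">
def build_summed_area_table(grid):
    """sat[r][c] holds the sum of grid[0..r-1][0..c-1]; the extra zero row/column simplifies queries."""
    rows, cols = len(grid), len(grid[0])
    sat = [[0] * (cols + 1) for _ in range(rows + 1)]
    for r in range(rows):
        for c in range(cols):
            sat[r + 1][c + 1] = grid[r][c] + sat[r][c + 1] + sat[r + 1][c] - sat[r][c]
    return sat

def rect_sum(sat, r1, c1, r2, c2):
    """Sum of grid[r1..r2][c1..c2] (inclusive) in O(1) using the summed area table."""
    return sat[r2 + 1][c2 + 1] - sat[r1][c2 + 1] - sat[r2 + 1][c1] + sat[r1][c1]

grid = [[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]]
sat = build_summed_area_table(grid)
print(rect_sum(sat, 0, 0, 2, 2))  # 45, the whole grid
print(rect_sum(sat, 1, 1, 2, 2))  # 5 + 6 + 8 + 9 = 28
</syntaxhighlight>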
===Cryptography===
** [[Elliptic-curve Diffie–Hellman]] (ECDH)
* [[Key derivation function]]s, often used for [[password hashing]] and [[key stretching]] (a usage sketch appears at the end of this section)
** [[Argon2]]
** [[bcrypt]]
** [[PBKDF2]]
** [[scrypt]]
* [[Message authentication code]]s (symmetric authentication algorithms, which take a key as a parameter):
** [[keyed-hash message authentication code|HMAC]]: keyed-hash message authentication
** [[Advanced Encryption Standard]] (AES), winner of [[NIST]] competition, also known as [[Rijndael]]
** [[Blowfish (cipher)|Blowfish]]
** [[Salsa20#ChaCha variant|ChaCha20]]: updated variant of Salsa20
** [[Threefish]]
** [[Twofish]]
** [[Data Encryption Standard]] (DES), sometimes DE Algorithm, winner of NBS selection competition, replaced by AES for most purposes
** [[International Data Encryption Algorithm|IDEA]]
** [[RC4 (cipher)]]
** [[Salsa20]]
** [[Tiny Encryption Algorithm]] (TEA)
* [[Post-quantum cryptography]]
* [[Proof-of-work system|Proof-of-work algorithms]]
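As a usage illustration for the key derivation function entries above, the following is a minimal sketch of password hashing with PBKDF2 through Python's standard-library hashlib.pbkdf2_hmac; the salt size, iteration count and derived-key length shown are illustrative assumptions and should be chosen according to current guidance in a real system.

<syntaxhighlight lang="python">
import hashlib
import hmac
import os

def hash_password(password, iterations=600_000):
    """Derive a 32-byte key from a password with PBKDF2-HMAC-SHA256 and a random salt."""
    salt = os.urandom(16)  # illustrative salt size
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations, dklen=32)
    return salt, key

def verify_password(password, salt, expected_key, iterations=600_000):
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations, dklen=32)
    return hmac.compare_digest(key, expected_key)  # constant-time comparison

salt, key = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, key))  # True
print(verify_password("wrong guess", salt, key))                   # False
</syntaxhighlight>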
===Digital logic===
* Boolean minimization
** [[Espresso heuristic logic minimizer]]: a fast algorithm for Boolean function minimization
** [[Petrick's method]]: another algorithm for Boolean simplification
** [[Quine–McCluskey algorithm]]: also called the Q–M algorithm, a programmable method for simplifying Boolean equations
===Machine learning and statistical classification===
** [[LPBoost]]: [[linear programming]] boosting
* [[Bootstrap aggregating]] (bagging): technique to improve stability and classification accuracy
* [[Cluster analysis|Clustering]]: a class of [[unsupervised learning]] algorithms for grouping and bucketing related input vectors
* [[Computer Vision]]
** [[Grabcut]] based on [[Graph cuts in computer vision|Graph cuts]]
** [[C4.5 algorithm]]: an extension to ID3
** [[ID3 algorithm]] (Iterative Dichotomiser 3): use heuristic to generate small decision trees
* [[k-nearest neighbors]] (k-NN): a non-parametric method for classifying objects based on the closest training examples in the [[feature space]] (a minimal sketch appears after this list)
* [[Linde–Buzo–Gray algorithm]]: a vector quantization algorithm used to derive a good codebook
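As a concrete companion to the k-nearest neighbors entry above, the following is a minimal sketch of a k-NN classifier using Euclidean distance and a majority vote over the k closest training examples; the function name, the value of k and the toy data are illustrative assumptions.

<syntaxhighlight lang="python">
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train is a list of (feature_vector, label) pairs; returns the majority label of the k nearest."""
    neighbours = sorted(
        train,
        key=lambda item: math.dist(item[0], query),  # Euclidean distance in feature space
    )[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b")]
print(knn_classify(train, (1.1, 1.0)))  # 'a'
print(knn_classify(train, (4.8, 5.2)))  # 'b'
</syntaxhighlight>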
===Parsing===
* [[GLR parser]]: an algorithm for parsing any [[context-free grammar]] by [[Masaru Tomita]]. It is tuned for deterministic grammars, on which it runs in almost [[linear time]], with O(n<sup>3</sup>) time in the worst case.
* [[Inside-outside algorithm]]: an O(n<sup>3</sup>) algorithm for re-estimating production probabilities in [[probabilistic context-free grammar]]s
* [[Lexical analysis]]
* [[LL parser]]: a relatively simple [[linear time]] parsing algorithm for a limited class of [[context-free grammar]]s
* [[LR parser]]: A more complex [[linear time]] parsing algorithm for a larger class of [[context-free grammar]]s. Variants:
** [[Look-ahead LR parser|LALR (look-ahead LR) parser]]
** [[Operator-precedence parser]]
** [[Simple LR parser]]
** [[Simple precedence parser]]
* [[Packrat parser]]: a [[linear time]] parsing algorithm supporting some [[context-free grammar]]s and [[parsing expression grammar]]s
* [[Pratt parser]]
* [[Recursive descent parser]]: a [[top-down parsing|top-down parser]] suitable for LL(''k'') grammars
* [[Shunting-yard algorithm]]: converts an infix-notation math expression to postfix (a minimal sketch appears after this list)
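As a concrete companion to the shunting-yard entry above, the following is a minimal sketch that converts an already tokenized infix expression with +, -, *, / and parentheses into postfix (reverse Polish) notation; the function name and the restriction to left-associative binary operators are illustrative assumptions.

<syntaxhighlight lang="python">
def shunting_yard(tokens):
    """Convert an infix token list to postfix (RPN) using Dijkstra's shunting-yard algorithm."""
    precedence = {"+": 1, "-": 1, "*": 2, "/": 2}
    output, operators = [], []
    for tok in tokens:
        if tok in precedence:
            # Pop operators of greater or equal precedence (all four are left-associative).
            while operators and operators[-1] in precedence and precedence[operators[-1]] >= precedence[tok]:
                output.append(operators.pop())
            operators.append(tok)
        elif tok == "(":
            operators.append(tok)
        elif tok == ")":
            while operators and operators[-1] != "(":
                output.append(operators.pop())
            operators.pop()            # discard the matching "("
        else:
            output.append(tok)         # operand goes straight to the output
    while operators:
        output.append(operators.pop())
    return output

print(shunting_yard(["3", "+", "4", "*", "(", "2", "-", "1", ")"]))
# ['3', '4', '2', '1', '-', '*', '+']
</syntaxhighlight>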
===Quantum algorithms===
** [[Byte pair encoding]] (BPE)
** [[Deflate]]
** [[Lempel–Ziv]]
*** [[LZ77 and LZ78]]
*** [[LZJB|Lempel–Ziv Jeff Bonwick]] (LZJB)
*** [[Lempel–Ziv–Markov chain algorithm]] (LZMA)
*** [[Lempel–Ziv–Oberhumer]] (LZO): speed oriented
*** [[LZRW|Lempel–Ziv Ross Williams]] (LZRW)
*** [[Lempel–Ziv–Stac]] (LZS)
*** [[Lempel–Ziv–Storer–Szymanski]] (LZSS)
*** [[LZWL]]: syllable-based variant
*** [[LZX]]
* [[Entropy encoding]]: coding scheme that assigns codes to symbols so as to match code lengths with the probabilities of the symbols
** [[Arithmetic coding]]: advanced [[entropy]] coding
** [[Wavelet compression]]: form of data compression well suited for [[image compression]] (sometimes also video compression and audio compression)
* [[Transform coding]]: type of data compression for "natural" data like audio signals or photographic images
* [[Vector quantization]]: technique often used in lossy data compression
* [[Video compression]]
===Digital signal processing===
====Image processing====
{{further|Digital image processing}}
* [[Blind deconvolution]]: image de-blurring algorithm when [[point spread function]] is unknown.
* [[Connected-component labeling]]: find and label disjoint regions
* [[Dithering]] and [[half-toning]]
** [[Scale-invariant feature transform|SIFT]] (Scale-invariant feature transform): is an algorithm to detect and describe local features in images.
** {{visible anchor|SURF ([[speeded up robust features|Speeded Up Robust Features]])}}: is a robust local feature detector, first presented by [[Herbert Bay]] et al. in 2006, that can be used in computer vision tasks like object recognition or 3D reconstruction. It is partly inspired by the SIFT descriptor. The standard version of SURF is several times faster than SIFT and claimed by its authors to be more robust against different image transformations than SIFT.<ref>{{cite web |url=http://www.vision.ee.ethz.ch/~surf/eccv06.pdf |title=Archived copy |website=www.vision.ee.ethz.ch |access-date=13 January 2022 |archive-url=https://web.archive.org/web/20070221214147/http://www.vision.ee.ethz.ch/~surf/eccv06.pdf |archive-date=21 February 2007 |url-status=dead}}</ref><ref>{{cite web |url=http://glorfindel.mavrinac.com/~aaron/school/pdf/bay06_surf.pdf |title=Archived copy |access-date=2013-10-05 |url-status=dead |archive-url=https://web.archive.org/web/20131006113018/http://glorfindel.mavrinac.com/~aaron/school/pdf/bay06_surf.pdf |archive-date=2013-10-06 }}</ref>
* [[Histogram equalization]]: use the image histogram to improve contrast (contrast enhancement); a minimal sketch appears after this list
* [[Richardson–Lucy deconvolution]]: image de-blurring algorithm
* [[Median filtering]]
* [[Seam carving]]: content-aware image resizing algorithm
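As a concrete companion to the histogram equalization entry above, the following is a minimal sketch for an 8-bit grayscale image stored as nested lists, using the standard cumulative-distribution remapping; the function name and the toy image are illustrative assumptions.

<syntaxhighlight lang="python">
def equalize_histogram(image, levels=256):
    """Remap 8-bit grayscale values so their cumulative distribution is roughly uniform."""
    # Histogram of pixel intensities.
    hist = [0] * levels
    for row in image:
        for v in row:
            hist[v] += 1
    total = sum(hist)
    # Cumulative distribution function.
    cdf, running = [0] * levels, 0
    for i, count in enumerate(hist):
        running += count
        cdf[i] = running
    cdf_min = next(c for c in cdf if c > 0)
    # Standard equalization mapping onto the full [0, levels - 1] range.
    lut = [round((cdf[i] - cdf_min) / max(total - cdf_min, 1) * (levels - 1)) for i in range(levels)]
    return [[lut[v] for v in row] for row in image]

image = [[52, 55, 61], [59, 79, 61], [85, 91, 87]]  # a dark 3x3 patch
print(equalize_histogram(image))                    # values spread across 0..255
</syntaxhighlight>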
===Process synchronization===
{{further|Process scheduler|Process synchronization}}
* [[Dekker's algorithm]]
* [[Lamport's Bakery algorithm]]