{{Short description|Set-to-real map with diminishing returns}}
{{Cleanup bare URLs|date=September 2022}}
{{Use American English|date = January 2019}}
In mathematics, a '''submodular set function''' (also known as a '''submodular function''') is a [[set function]] that, informally, has the property that the additional value contributed by a single element when it is added to an input set decreases as the input set grows. Submodular functions have a natural [[diminishing returns]] property which makes them suitable for many applications, including [[approximation algorithms]], [[game theory]] (as functions modeling user preferences) and [[electrical network]]s. Recently, submodular functions have also found immense utility in several real-world problems in [[machine learning]] and [[artificial intelligence]], including [[automatic summarization]], [[multi-document summarization]], [[feature selection]], [[Active learning (machine learning)|active learning]], sensor placement, image collection summarization and many other domains.<ref name="LB" /><ref name="TIWB" /><ref name="KG1" /><ref name="KG" />
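The diminishing-returns property can be illustrated with a small brute-force check. The following Python sketch (the function names, tolerance and test function are chosen for this illustration and are not part of the article's definitions) verifies the inequality <math>f(S\cup\{x\})-f(S)\geq f(T\cup\{x\})-f(T)</math> for all <math>S\subseteq T</math> and <math>x\notin T</math> on a small ground set:

<syntaxhighlight lang="python">
# Illustrative sketch: brute-force check of the diminishing-returns property
# f(S ∪ {x}) - f(S) >= f(T ∪ {x}) - f(T) for all S ⊆ T ⊆ Ω and x ∉ T.
import math
from itertools import combinations

def subsets(ground):
    """All subsets of the ground set, as frozensets."""
    return [frozenset(c) for r in range(len(ground) + 1)
            for c in combinations(ground, r)]

def has_diminishing_returns(f, ground, tol=1e-12):
    """Return True if f satisfies the diminishing-returns inequality on ground."""
    for S in subsets(ground):
        for T in subsets(ground):
            if not S <= T:
                continue
            for x in ground - T:
                if f(S | {x}) - f(S) < f(T | {x}) - f(T) - tol:
                    return False
    return True

# Example: f(S) = sqrt(|S|) has diminishing returns because sqrt is concave.
ground = frozenset({1, 2, 3, 4})
print(has_diminishing_returns(lambda S: math.sqrt(len(S)), ground))  # True
</syntaxhighlight>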
; Budget-additive functions : Any function of the form <math>f(S)=\min\left\{B,~\sum_{i\in S}w_i\right\}</math>, where <math>w_i\geq 0</math> for each <math>i</math> and <math>B\geq 0</math>, is called budget-additive.{{citation needed|date=August 2014}}
; Coverage functions : Let <math>\Omega=\{E_1,E_2,\ldots,E_n\}</math> be a collection of subsets of some [[matroid|ground set]] <math>\Omega'</math>. The function <math>f(S)=\left|\bigcup_{E_i\in S}E_i\right|</math> for <math>S\subseteq \Omega</math> is called a coverage function. This can be generalized by adding non-negative weights to the elements (see the code sketch after this list).
; [[Entropy (information theory)|Entropy]] : Let <math>\Omega=\{X_1,X_2,\ldots,X_n\}</math> be a set of [[random variables]]. Then for any <math>S\subseteq \Omega</math> we have that <math>H(S)</math> is a submodular function, where <math>H(S)</math> is the entropy of the set of random variables <math>S</math>, a fact known as [[Shannon's inequality]].
; [[Matroid]] [[matroid rank|rank functions]] : Let <math>\Omega=\{e_1,e_2,\dots,e_n\}</math> be the ground set on which a matroid is defined. Then the rank function of the matroid is a submodular function.<ref name=F22>Fujishige (2005) p.22</ref>
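The coverage and budget-additive examples above can be checked directly against the submodularity inequality <math>f(S)+f(T)\geq f(S\cup T)+f(S\cap T)</math>. The following sketch (the sets, weights and budget are illustrative, not taken from a source) does this on a small instance:

<syntaxhighlight lang="python">
# Illustrative sketch: verify f(S) + f(T) >= f(S ∪ T) + f(S ∩ T) for a small
# coverage function and a small budget-additive function.
from itertools import combinations

def subsets(ground):
    return [frozenset(c) for r in range(len(ground) + 1)
            for c in combinations(ground, r)]

def is_submodular(f, ground, tol=1e-12):
    ss = subsets(ground)
    return all(f(S) + f(T) >= f(S | T) + f(S & T) - tol
               for S in ss for T in ss)

# Coverage function: Ω indexes subsets E_i of a ground set Ω'.
E = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d", "e"}}
coverage = lambda S: len(set().union(*(E[i] for i in S))) if S else 0

# Budget-additive function: min(B, sum of non-negative weights), here B = 4.
w = {1: 2.0, 2: 1.5, 3: 3.0}
budget_additive = lambda S: min(4.0, sum(w[i] for i in S))

ground = frozenset(E)
print(is_submodular(coverage, ground))         # True
print(is_submodular(budget_additive, ground))  # True
</syntaxhighlight>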
=== Lovász extension ===
This extension is named after mathematician [[László Lovász]]. Consider any vector <math>\mathbf{x}=\{x_1,x_2,\dots,x_n\}</math> such that each <math>0\leq x_i\leq 1</math>. Then the Lovász extension is defined as <math>f^L(\mathbf{x})=\mathbb{E}(f(\{i|x_i\geq \lambda\}))</math> where the expectation is over <math>\lambda</math> chosen from the [[uniform distribution (continuous)|uniform distribution]] on the interval <math>[0,1]</math>. The Lovász extension is a convex function if and only if <math>f</math> is a submodular function.
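Because the level set <math>\{i|x_i\geq\lambda\}</math> changes only at the coordinates of <math>\mathbf{x}</math>, the expectation defining the Lovász extension reduces to a finite sum over the sorted coordinates. The following sketch (the function names and test function are illustrative) evaluates it this way, assuming <math>f(\varnothing)</math> is defined:

<syntaxhighlight lang="python">
# Illustrative sketch: exact evaluation of the Lovász extension
# f^L(x) = E_λ[ f({ i : x_i >= λ }) ] with λ uniform on [0, 1].
import math

def lovasz_extension(f, x):
    """f: set function on frozensets of indices 0..n-1; x: coordinates in [0, 1]."""
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i], reverse=True)  # x[σ(1)] >= ... >= x[σ(n)]
    value = (1.0 - x[order[0]]) * f(frozenset())  # λ above every coordinate gives the empty set
    level = set()
    for k, i in enumerate(order):
        level.add(i)                              # level set {σ(1), ..., σ(k+1)}
        upper = x[i]
        lower = x[order[k + 1]] if k + 1 < n else 0.0
        value += (upper - lower) * f(frozenset(level))
    return value

# Example with the submodular function f(S) = sqrt(|S|):
print(lovasz_extension(lambda S: math.sqrt(len(S)), [0.5, 0.2, 0.9]))  # ≈ 1.171
</syntaxhighlight>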
=== Multilinear extension ===
Consider any vector <math>\mathbf{x}=\{x_1,x_2,\dots,x_n\}</math> such that each <math>0\leq x_i\leq 1</math>. Then the multilinear extension is defined as <math>F(\mathbf{x})=\sum_{S\subseteq \Omega} f(S) \prod_{i\in S} x_i \prod_{i\notin S} (1-x_i)</math>. Intuitively, <math>F(\mathbf{x})</math> is the expected value of <math>f(S)</math> when the set <math>S</math> is formed by including each element <math>i</math> independently with probability <math>x_i</math>.
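For small ground sets <math>F(\mathbf{x})</math> can be evaluated by enumerating all subsets. The following sketch (illustrative names and test data) does so and checks it on a modular function, for which the extension is simply <math>\sum_i w_i x_i</math>:

<syntaxhighlight lang="python">
# Illustrative sketch: evaluate the multilinear extension
# F(x) = sum over S of f(S) * prod_{i in S} x_i * prod_{i not in S} (1 - x_i).
from itertools import combinations

def multilinear_extension(f, x):
    n = len(x)
    total = 0.0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            S = frozenset(subset)
            prob = 1.0                       # probability of drawing exactly the set S
            for i in range(n):
                prob *= x[i] if i in S else 1.0 - x[i]
            total += prob * f(S)
    return total

# Example: for the modular function with weights w = (1, 3, 5),
# F(x) = 1*x_0 + 3*x_1 + 5*x_2.
w = [1.0, 3.0, 5.0]
f = lambda S: sum(w[i] for i in S)
print(multilinear_extension(f, [0.5, 0.2, 0.9]))  # 5.6
</syntaxhighlight>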
=== Convex closure ===
Consider any vector <math>\mathbf{x}=\{x_1,x_2,\dots,x_n\}</math> such that each <math>0\leq x_i\leq 1</math>. Then the convex closure is defined as <math>f^-(\mathbf{x})=\min\left(\sum_S \alpha_S f(S):\sum_S \alpha_S 1_S=\mathbf{x},\sum_S \alpha_S=1,\alpha_S\geq 0\right)</math>. The convex closure of any set function is convex over <math>[0,1]^n</math>. It can be shown that <math>f^L(\mathbf{x})=f^-(\mathbf{x})</math> for submodular functions.
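For small ground sets the convex closure can be computed directly as a [[linear programming|linear program]] over the weights <math>\alpha_S</math>. The sketch below (an illustration assuming SciPy's <code>linprog</code>; the test function is the same submodular <math>f(S)=\sqrt{|S|}</math> as above) does this; for a submodular function the optimal value coincides with the Lovász extension:

<syntaxhighlight lang="python">
# Illustrative sketch: convex closure f^-(x) as a linear program,
#   minimize    sum_S α_S f(S)
#   subject to  sum_{S ∋ i} α_S = x_i for all i,  sum_S α_S = 1,  α_S >= 0.
import math
from itertools import combinations
from scipy.optimize import linprog

def subsets(n):
    return [frozenset(c) for r in range(n + 1)
            for c in combinations(range(n), r)]

def convex_closure(f, x):
    n, ss = len(x), subsets(len(x))
    c = [f(S) for S in ss]                                            # objective coefficients
    A_eq = [[1.0 if i in S else 0.0 for S in ss] for i in range(n)]   # marginal constraints
    A_eq.append([1.0] * len(ss))                                      # weights sum to 1
    b_eq = list(x) + [1.0]
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.fun

f = lambda S: math.sqrt(len(S))            # a submodular function
print(convex_closure(f, [0.5, 0.2, 0.9]))  # ≈ 1.171, equal to the Lovász extension here
</syntaxhighlight>

Replacing the minimization with a maximization in the same program yields the concave closure described in the next subsection.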
=== Concave closure ===
Consider any vector <math>\mathbf{x}=\{x_1,x_2,\dots,x_n\}</math> such that each <math>0\leq x_i\leq 1</math>. Then the concave closure is defined as <math>f^+(\mathbf{x})=\max\left(\sum_S \alpha_S f(S):\sum_S \alpha_S 1_S=\mathbf{x},\sum_S \alpha_S=1,\alpha_S\geq 0\right)</math>.