==Segmentation using Normalized Cuts==
===Graph theoretic formulation===
The set of points in an arbitrary feature space can be represented as a weighted undirected complete graph <math>G = (V, E)</math>, where the nodes of the graph are the points in the feature space. The weight <math>w_{ij}</math> of an edge <math>(i, j) \in E</math> is a function of the similarity between the nodes <math>i</math> and <math>j</math>. In this context, we can formulate the image segmentation problem as a graph partitioning problem that asks for a partition <math>V_1, \cdots, V_k</math> of the vertex set <math>V</math>, where, according to some measure, the vertices in any set <math>V_i</math> have high similarity, and the vertices in two different sets <math>V_i, V_j</math> have low similarity.
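As an illustration, the edge weights are often taken to be a Gaussian function of the distance between feature vectors. A minimal sketch (the Gaussian kernel and the parameter <code>sigma</code> below are common choices, not mandated by the formulation):

```python
import numpy as np

def similarity_graph(features, sigma=1.0):
    """Adjacency matrix W of the complete weighted graph over feature points.

    w_ij = exp(-||F_i - F_j||^2 / sigma^2): an assumed Gaussian similarity;
    nearby points get weights near 1, distant points weights near 0.
    """
    F = np.asarray(features, dtype=float)          # shape (n_points, n_dims)
    d2 = ((F[:, None, :] - F[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / sigma ** 2)
    np.fill_diagonal(W, 0.0)                        # no self-loops
    return W
```

Smaller values of <code>sigma</code> make the similarity fall off faster with distance, producing a more sharply clustered graph.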
In the normalized cuts approach,<ref>Jianbo Shi and Jitendra Malik (1997): "Normalized Cuts and Image Segmentation", ''IEEE Conference on Computer Vision and Pattern Recognition'', pp. 731–737.</ref> for any cut <math>(S, \overline{S})</math> in <math>G</math>, <math>ncut(S, \overline{S})</math> measures the similarity between different parts, and <math>nassoc(S, \overline{S})</math> measures the total similarity of vertices in the same part.
Since <math>ncut(S, \overline{S}) = 2 - nassoc(S, \overline{S})</math>, a cut <math>(S^{*}, {\overline{S}}^{*})</math> that minimizes <math>ncut(S, \overline{S})</math> also maximizes <math>nassoc(S, \overline{S})</math>.
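The identity <math>ncut(S, \overline{S}) = 2 - nassoc(S, \overline{S})</math> follows from <math>assoc(S, V) = cut(S, \overline{S}) + assoc(S, S)</math>, and can be checked numerically. A small sketch on an arbitrary symmetric weight matrix:

```python
import numpy as np

def ncut_and_nassoc(W, S):
    """Normalized cut and normalized association of a bipartition (S, S-bar).

    ncut   = cut(S,S̄)/assoc(S,V)   + cut(S,S̄)/assoc(S̄,V)
    nassoc = assoc(S,S)/assoc(S,V) + assoc(S̄,S̄)/assoc(S̄,V)
    """
    W = np.asarray(W, dtype=float)
    S = np.asarray(S, dtype=bool)
    T = ~S
    cut = W[np.ix_(S, T)].sum()          # weight crossing the partition
    assoc_SV = W[S].sum()                 # total weight incident to S
    assoc_TV = W[T].sum()                 # total weight incident to S-bar
    ncut = cut / assoc_SV + cut / assoc_TV
    nassoc = W[np.ix_(S, S)].sum() / assoc_SV + W[np.ix_(T, T)].sum() / assoc_TV
    return ncut, nassoc
```

For any bipartition, the two quantities returned sum to 2, so minimizing one is equivalent to maximizing the other.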
Computing a cut <math>(S^{*}, {\overline{S}}^{*})</math> that minimizes <math>ncut(S, \overline{S})</math> is an [[NP-hard]] problem. However, we can find in polynomial time a cut <math>(S, \overline{S})</math> of small normalized weight <math>ncut(S, \overline{S})</math> using [[spectral clustering|spectral techniques]].
===The Ncut Algorithm===
Minimizing <math>\frac{y^t (D - W) y}{y^t D y}</math> subject to the constraints above is [[NP-hard]]. To make the problem tractable, we relax the constraints on <math>y</math>, and allow it to take real values. The relaxed problem can be solved by solving the generalized eigenvalue problem <math>(D - W)y = \lambda D y</math> for the second smallest generalized eigenvector.
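A sketch of the relaxed solve: the substitution <math>z = D^{1/2} y</math> turns the generalized problem into an ordinary symmetric eigenproblem <math>D^{-1/2}(D - W)D^{-1/2} z = \lambda z</math>, which NumPy can handle directly (the graph is assumed connected, so <math>D</math> is invertible):

```python
import numpy as np

def second_generalized_eigpair(W):
    """Solve (D - W) y = lambda * D y for the second smallest eigenpair.

    Substituting z = D^{1/2} y gives the equivalent symmetric problem
    D^{-1/2} (D - W) D^{-1/2} z = lambda * z, solvable with eigh.
    """
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)                      # degrees: diagonal of D
    D_is = np.diag(d ** -0.5)              # D^{-1/2}
    M = D_is @ (np.diag(d) - W) @ D_is     # normalized Laplacian
    vals, vecs = np.linalg.eigh(M)         # eigenvalues in ascending order
    return vals[1], D_is @ vecs[:, 1]      # map z back to y = D^{-1/2} z
```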
'''The partitioning algorithm:'''
# Given a set of features, set up a weighted graph <math>G = (V, E)</math>, compute the weight of each edge, and summarize the information in <math>D</math> and <math>W</math>.
# Solve <math>(D - W)y = \lambda D y</math> for eigenvectors with the smallest eigenvalues.
# Use the eigenvector with the second smallest eigenvalue to bipartition the graph (e.g. by thresholding its entries).
# Decide whether the current partition should be subdivided further, and recursively repartition the segments if necessary.
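Putting the steps together, a minimal two-way Ncut sketch on a toy 1-D feature set (the Gaussian weights, the value of <code>sigma</code>, and the threshold-at-zero split are illustrative assumptions):

```python
import numpy as np

def ncut_bipartition(features, sigma=2.0):
    """Two-way normalized-cut sketch: build W and D, solve the relaxed
    eigenproblem, and split on the sign of the second eigenvector."""
    F = np.asarray(features, dtype=float).reshape(len(features), -1)
    d2 = ((F[:, None, :] - F[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / sigma ** 2)            # assumed Gaussian similarity
    np.fill_diagonal(W, 0.0)
    deg = W.sum(axis=1)
    D_is = np.diag(deg ** -0.5)
    _, vecs = np.linalg.eigh(D_is @ (np.diag(deg) - W) @ D_is)
    y = D_is @ vecs[:, 1]                   # relaxed indicator vector
    return y >= 0                           # threshold at zero: one common heuristic
```

On two well-separated intensity clusters the second eigenvector is close to piecewise constant, so the sign split recovers the grouping.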
===Limitations===
Solving a standard eigenvalue problem for all eigenvectors (using the [[QR algorithm]], for instance) takes <math>O(n^3)</math> time. This is impractical for image segmentation, where <math>n</math> is the number of pixels. Since only a few of the smallest eigenvectors are needed, iterative solvers such as the [[Lanczos algorithm]] reduce the cost substantially, especially when the weight matrix is sparse.
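For example, SciPy's ARPACK wrapper <code>eigsh</code> (a Lanczos-type iterative solver) can compute only the <math>k</math> smallest eigenpairs instead of a full dense decomposition; a sketch:

```python
import numpy as np
from scipy.sparse import csr_matrix, diags
from scipy.sparse.linalg import eigsh

def smallest_eigvecs_sparse(W, k=2):
    """Compute only the k smallest eigenpairs of the normalized Laplacian
    with an iterative Lanczos-type solver, avoiding the O(n^3) cost of a
    full dense eigendecomposition."""
    W = csr_matrix(W)
    d = np.asarray(W.sum(axis=1)).ravel()
    D_is = diags(d ** -0.5)
    M = D_is @ (diags(d) - W) @ D_is
    # which='SM' requests only the smallest-magnitude eigenvalues
    vals, vecs = eigsh(M, k=k, which='SM')
    return vals, vecs
```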
==OBJ CUT==
<math>E(m, \Theta) = \sum_x \left( \phi_x(D|m_x) + \phi_x(m_x|\Theta) \right) + \sum_{x,y} \left( \Psi_{xy}(m_x, m_y) + \phi(D|m_x, m_y) \right)</math> (1)
The term <math>\phi_x(D|m_x) + \phi_x(m_x|\Theta)</math> is called a unary term, and the term <math>\Psi_{xy}(m_x, m_y) + \phi(D|m_x, m_y)</math> is called a pairwise term. A unary term depends only on the label of a single pixel, while a pairwise term couples the labels of pairs of pixels.
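A toy evaluation of such an energy, with the <math>\Theta</math>-dependence folded into fixed cost tables (the function name and table values below are arbitrary illustrations, not part of the OBJ CUT formulation):

```python
def energy(m, unary, pairwise, edges):
    """E(m) = sum_x unary[x][m_x] + sum_(x,y) pairwise[(m_x, m_y)].

    unary[x][l] is the cost of giving pixel x the label l, and pairwise
    maps a pair of labels to a coupling cost: illustrative stand-ins for
    the phi and Psi terms of equation (1).
    """
    total = sum(unary[x][m[x]] for x in range(len(m)))       # unary terms
    total += sum(pairwise[(m[x], m[y])] for x, y in edges)   # pairwise terms
    return total
```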
# The corresponding LPS model is matched to D to obtain the samples <math>\Theta_1, \cdots, \Theta_s</math>.
# The objective function given by equation (2) is determined by computing <math>E(m, \Theta_i)</math> and using <math>w_i = g(\Theta_i|Z)</math>.
# The objective function is minimized using a single [[max-flow min-cut theorem|max-flow]] computation.
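The final step exploits the fact that a binary labelling energy with submodular pairwise terms can be minimized exactly by one max-flow/min-cut computation on an s-t graph. A pure-python sketch with Edmonds–Karp and a Potts pairwise term (a real implementation would use a dedicated max-flow library; the function name and the Potts simplification are assumptions):

```python
from collections import deque

def minimize_binary_energy(unary, edges, lam):
    """Minimize E(m) = sum_x unary[x][m_x] + lam * sum_(x,y) [m_x != m_y]
    over binary labellings with a single max-flow computation.

    Standard s-t construction: node x takes label 1 when it ends up on
    the source side of the minimum cut.
    """
    n = len(unary)
    s, t = n, n + 1
    cap = [[0.0] * (n + 2) for _ in range(n + 2)]  # residual capacities
    for x, (c0, c1) in enumerate(unary):
        cap[s][x] += c0  # s->x is cut when m_x = 0, paying unary[x][0]
        cap[x][t] += c1  # x->t is cut when m_x = 1, paying unary[x][1]
    for x, y in edges:
        cap[x][y] += lam  # Potts pairwise edge: cut iff labels differ
        cap[y][x] += lam

    def bfs_path():
        parent = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n + 2):
                if v not in parent and cap[u][v] > 1e-12:
                    parent[v] = u
                    if v == t:
                        return parent
                    q.append(v)
        return None

    while True:  # Edmonds-Karp: augment along shortest residual paths
        parent = bfs_path()
        if parent is None:
            break
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)  # bottleneck capacity
        for u, v in path:
            cap[u][v] -= b
            cap[v][u] += b

    # Nodes still reachable from s in the residual graph get label 1.
    seen, q = {s}, deque([s])
    while q:
        u = q.popleft()
        for v in range(n + 2):
            if v not in seen and cap[u][v] > 1e-12:
                seen.add(v)
                q.append(v)
    return [1 if x in seen else 0 for x in range(n)]
```

With no coupling each pixel takes its cheapest label independently; with strong coupling the labelling becomes spatially coherent.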
===Example===