Multinomial logistic regression: Difference between revisions

As a set of independent binary regressions: sloppy notation, you’re not summing over one particular outcome, you’re summing over all possible outcomes
Line 74:
Using the fact that all ''K'' of the probabilities must sum to one, we find:
 
:<math>\Pr(Y_i=K) \,=\, 1- \sum_{j=1}^{K-1} \Pr (Y_i = j) \,=\, 1 - \sum_{j=1}^{K-1}{\Pr(Y_i=K)}\;e^{\boldsymbol\beta_j \cdot \mathbf{X}_i} \;\;\Rightarrow\;\; \Pr(Y_i=K) \,=\, \frac{1}{1 + \sum_{j=1}^{K-1} e^{\boldsymbol\beta_j \cdot \mathbf{X}_i}}</math>.
 
We can use this to find the other probabilities:
 
:<math>
\Pr(Y_i=k) = \frac{e^{\boldsymbol\beta_k \cdot \mathbf{X}_i}}{1 + \sum_{j=1}^{K-1} e^{\boldsymbol\beta_j \cdot \mathbf{X}_i}} \;\;\;\;,\;\;k < K
</math>.
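
For concreteness, here is a minimal numerical sketch of how these formulas turn the ''K''&nbsp;&minus;&nbsp;1 coefficient vectors into a full probability distribution over the ''K'' outcomes. The variable names and toy coefficient values below are illustrative assumptions, not taken from the article:

<syntaxhighlight lang="python">
import numpy as np

# Assumed toy setup: K = 3 outcomes, 2 features, so K - 1 = 2 coefficient vectors.
betas = np.array([[0.5, -1.0],   # beta_1
                  [1.5,  0.3]])  # beta_2
x_i = np.array([2.0, 1.0])       # feature vector X_i for observation i

# Linear scores beta_j . X_i for j = 1, ..., K-1
scores = betas @ x_i

# Shared denominator 1 + sum_j exp(beta_j . X_i)
denom = 1.0 + np.exp(scores).sum()

# Pr(Y_i = k) = exp(beta_k . X_i) / denom  for k < K,
# Pr(Y_i = K) = 1 / denom                  for the reference outcome K
probs = np.append(np.exp(scores), 1.0) / denom

print(probs, probs.sum())  # the K probabilities sum to 1
</syntaxhighlight>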