Gaussian function: Difference between revisions

For the general form of the equation the coefficient ''A'' is the height of the peak and {{math|(''x''<sub>0</sub>, ''y''<sub>0</sub>)}} is the center of the blob.
 
If we set
<math display="block">
\begin{align}
a &= \frac{\cos^2\theta}{2\sigma_X^2} + \frac{\sin^2\theta}{2\sigma_Y^2}, \\
b &= -\frac{\sin 2\theta}{4\sigma_X^2} + \frac{\sin 2\theta}{4\sigma_Y^2}, \\
c &= \frac{\sin^2\theta}{2\sigma_X^2} + \frac{\cos^2\theta}{2\sigma_Y^2},
\end{align}
</math>then we rotate the blob by a positive, counter-clockwise angle <math>\theta</math> (for negative, clockwise rotation, invert the signs in the ''b'' coefficient).<ref>{{cite web |last1=Nawri |first1=Nikolai |title=Berechnung von Kovarianzellipsen |url=http://imkbemu.physik.uni-karlsruhe.de/~eisatlas/covariance_ellipses.pdf |access-date=14 August 2019 |url-status=dead |archive-url=https://web.archive.org/web/20190814081830/http://imkbemu.physik.uni-karlsruhe.de/~eisatlas/covariance_ellipses.pdf |archive-date=2019-08-14}}</ref>
 
 
To recover the parameters <math>\theta</math>, <math>\sigma_X</math> and <math>\sigma_Y</math> from <math>a</math>, <math>b</math> and <math>c</math>, use
 
<math display="block">\begin{align}
\theta &= \frac{1}{2}\arctan\left(\frac{2b}{a-c}\right), \quad \theta \in [-45^\circ, 45^\circ], \\
\sigma_X^2 &= \frac{1}{2 (a \cdot \cos^2\theta + 2 b \cdot \cos\theta\sin\theta + c \cdot \sin^2\theta)}, \\
\sigma_Y^2 &= \frac{1}{2 (a \cdot \sin^2\theta - 2 b \cdot \cos\theta\sin\theta + c \cdot \cos^2\theta)}.
\end{align}</math>
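As a numerical sketch of the forward and inverse formulas above (the helper names <code>abc_from_params</code> and <code>params_from_abc</code> are illustrative, not from the article; note that the sign of the recovered angle depends on which rotation convention is used for the sign of ''b'', so only its magnitude is compared here):

```python
import numpy as np

def abc_from_params(theta, sx, sy):
    # Forward formulas for a, b, c (counter-clockwise convention)
    a = np.cos(theta)**2 / (2 * sx**2) + np.sin(theta)**2 / (2 * sy**2)
    b = -np.sin(2 * theta) / (4 * sx**2) + np.sin(2 * theta) / (4 * sy**2)
    c = np.sin(theta)**2 / (2 * sx**2) + np.cos(theta)**2 / (2 * sy**2)
    return a, b, c

def params_from_abc(a, b, c):
    # Inverse formulas: arctan works in radians, so theta lies in
    # (-pi/4, pi/4), i.e. (-45 deg, 45 deg); requires a != c
    theta = 0.5 * np.arctan(2 * b / (a - c))
    ct, st = np.cos(theta), np.sin(theta)
    sx2 = 1.0 / (2 * (a * ct**2 + 2 * b * ct * st + c * st**2))
    sy2 = 1.0 / (2 * (a * st**2 - 2 * b * ct * st + c * ct**2))
    return theta, np.sqrt(sx2), np.sqrt(sy2)

a, b, c = abc_from_params(0.3, 2.0, 1.0)
theta, sx, sy = params_from_abc(a, b, c)
print(abs(theta), sx, sy)  # angle magnitude and both widths are recovered
```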
 
 
Example rotations of Gaussian blobs can be seen in the following examples:
Also see [[multivariate normal distribution]].
 
=== Higher-order Gaussian or super-Gaussian function or generalized Gaussian function ===
A more general formulation of a Gaussian function with a flat top and Gaussian fall-off is obtained by raising the argument of the exponent to a power <math>P</math>:
<math display="block">f(x) = A \exp\left(-\left(\frac{(x - x_0)^2}{2\sigma_X^2}\right)^P\right).</math>
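A minimal sketch of this formula (the function name and the sample values of ''A'', ''x''<sub>0</sub> and ''σ'' are illustrative): with <math>P = 1</math> it reduces to the ordinary Gaussian, while larger <math>P</math> flattens the top and steepens the fall-off.

```python
import numpy as np

def super_gaussian(x, A=1.0, x0=0.0, sigma=1.0, P=1):
    # A * exp(-(((x - x0)^2 / (2 sigma^2))^P); P = 1 is the ordinary Gaussian
    return A * np.exp(-(((x - x0)**2) / (2 * sigma**2))**P)

# Near the center, higher P stays closer to the peak value (flat top)
for P in (1, 2, 4):
    print(P, super_gaussian(0.5, P=P))
```

Note that the value at <math>|x - x_0| = \sqrt{2}\,\sigma</math> (where the inner expression equals 1) is <math>A e^{-1}</math> for every <math>P</math>, so all the curves cross there.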
[[Category:Articles containing proofs]]
[[Category:Articles with example MATLAB/Octave code]]
 
 
import numpy as np
import matplotlib.pyplot as plt

def gaussian_weight(x, x_i, tau):
    # Gaussian kernel: weight decays with squared distance from the query point
    return np.exp(-(x - x_i)**2 / (2 * tau**2))

def locally_weighted_regression(X, y, tau):
    X = np.array(X)
    y = np.array(y)
    n = len(X)
    y_pred = []

    for query_point in X:
        # Weight every training point by its Gaussian proximity to the query point
        weights = np.array([gaussian_weight(query_point, x_i, tau) for x_i in X])
        W = np.diag(weights)
        X_mat = np.c_[np.ones(n), X]  # add intercept (column of ones)
        # Weighted least squares: theta = (X^T W X)^+ X^T W y
        theta = np.linalg.pinv(X_mat.T @ W @ X_mat) @ X_mat.T @ W @ y
        y_pred.append(theta[0] + theta[1] * query_point)

    return np.array(y_pred)

# Generate noisy sine data
np.random.seed(42)
X = np.linspace(-3, 3, 50)
y = np.sin(X) + np.random.normal(0, 0.2, X.shape)
tau = 0.5

# Perform locally weighted regression
y_pred = locally_weighted_regression(X, y, tau)

# Plot results
plt.figure(figsize=(10, 6))
plt.scatter(X, y, color='red', label='Data Points')
plt.plot(X, y_pred, color='blue', label='LWR Fitted Curve')
plt.title('Locally Weighted Regression')
plt.xlabel('X')
plt.ylabel('y')
plt.legend()
plt.show()