
Zhu Lin's webpage

Logistic Regression

Adjust parameters and visualize the optimization process in real time.

Logistic Regression Model

Formula: ŷ = sigmoid(XW + b)

where sigmoid(z) = 1 / (1 + e^(-z))
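The model above can be sketched in a few lines of NumPy. This is a minimal illustration, not the page's own implementation; the function names `sigmoid` and `predict` are chosen for clarity.

```python
import numpy as np

def sigmoid(z):
    # sigmoid(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def predict(X, W, b):
    # y_hat = sigmoid(XW + b)
    return sigmoid(X @ W + b)
```

Here `X` is an (n, d) matrix of inputs, `W` a length-d weight vector, and `b` a scalar bias, so `predict` returns a probability per data point.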


Loss Function (Binary Cross-Entropy):

Loss = - (1 / n) Σ [ y * log(ŷ) + (1 - y) * log(1 - ŷ) ]

where n is the number of data points.

Note: the negative sign turns maximizing the log-likelihood into an equivalent minimization problem.
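The loss can be computed directly from the formula. A sketch, assuming NumPy arrays; the small `eps` clip is a standard numerical guard against log(0) and is an addition of this example, not part of the formula above.

```python
import numpy as np

def bce_loss(y, y_hat, eps=1e-12):
    # Binary cross-entropy:
    # Loss = -(1/n) * sum(y*log(y_hat) + (1-y)*log(1-y_hat))
    y_hat = np.clip(y_hat, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
```

For example, predicting 0.9 for a positive label and 0.1 for a negative label gives a loss of -log(0.9) per point, which is small; confident wrong predictions are penalized heavily.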


Update Rule (Gradient Descent):

Update W: W = W - α * (1 / n) Σ (ŷ - y) * X

Update b: b = b - α * (1 / n) Σ (ŷ - y)

where α is the learning rate.
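Putting the prediction, loss gradient, and update rule together gives a complete training loop. A minimal sketch, assuming NumPy; the defaults for `alpha` and `epochs` are illustrative choices, not values from the page.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, alpha=0.1, epochs=1000):
    n, d = X.shape
    W = np.zeros(d)  # weight vector, one entry per feature
    b = 0.0          # scalar bias
    for _ in range(epochs):
        y_hat = sigmoid(X @ W + b)       # forward pass
        error = y_hat - y                # (ŷ - y)
        W -= alpha * (X.T @ error) / n   # W = W - α (1/n) Σ (ŷ - y) X
        b -= alpha * error.mean()        # b = b - α (1/n) Σ (ŷ - y)
    return W, b
```

On a simple 1-D dataset with labels 0 below some threshold and 1 above it, the learned decision boundary (where XW + b = 0) settles near the midpoint between the two classes.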


Trained Parameters (updated live in the demo): Weight (W) and Bias (b).

Gradient Descent Visualization: the demo plots the cross-entropy loss over iterations alongside the weight vector (W, b).