Maximum likelihood estimator and cross entropy in logistic regression

lzhangstat
Dec 16, 2021


Cross entropy is a widely used loss function in classification problems, and logistic regression is no exception. In the logistic regression setting, however, the cross-entropy loss carries an extra meaning: minimizing the cross-entropy is the same as maximizing the log likelihood of the logistic regression model.

Let’s go through the derivation step by step.

First, define the cross-entropy loss function.
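
With binary labels y_i ∈ {0, 1} and predicted probabilities p_i, the cross-entropy loss averaged over n observations can be written as follows (the symbols x_i, β, and σ are my own notation for the features, coefficients, and sigmoid function, and may differ from the notation in the pictures):

\[
J(\beta) \;=\; -\frac{1}{n}\sum_{i=1}^{n}\Big[\, y_i \log p_i + (1 - y_i)\log(1 - p_i)\,\Big],
\qquad
p_i = \sigma(x_i^{\top}\beta) = \frac{1}{1 + e^{-x_i^{\top}\beta}} .
\]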

Then let’s write down the log likelihood of a logistic regression. It splits naturally into two parts, one for the observations with y_i = 0 and one for those with y_i = 1. A bit of simplification leads to exactly the cross-entropy loss, up to sign (and a constant factor of n). That’s why minimizing the cross-entropy is the same as maximizing the log likelihood of logistic regression.
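
Sketching that step in the same notation: under the logistic regression model each y_i is Bernoulli with success probability p_i = σ(x_iᵀβ), so P(y_i = 1 | x_i) = p_i and P(y_i = 0 | x_i) = 1 − p_i, which combine into the single expression p_i^{y_i}(1 − p_i)^{1 − y_i}. The log likelihood of the sample is therefore

\[
\ell(\beta)
= \log \prod_{i=1}^{n} p_i^{\,y_i}\,(1 - p_i)^{1 - y_i}
= \sum_{i=1}^{n}\Big[\, y_i \log p_i + (1 - y_i)\log(1 - p_i)\,\Big]
= -\,n\,J(\beta),
\]

so maximizing ℓ(β) over β is exactly the same problem as minimizing the cross-entropy loss J(β).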

The detailed proof is shown in the picture below.
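
As a quick numerical sanity check, here is a minimal NumPy sketch (the data, feature dimension, and coefficient vector are made up for illustration) that computes the log likelihood and the average cross-entropy from the same predicted probabilities and confirms they differ only by sign and the factor n:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n observations, d features, and an arbitrary coefficient vector beta.
n, d = 100, 3
X = rng.normal(size=(n, d))
beta = np.array([0.5, -1.0, 2.0])
p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities sigma(x_i' beta)
y = rng.binomial(1, p)                # binary labels drawn from those probabilities

# Log likelihood of the logistic regression model at beta.
log_lik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Average cross-entropy loss for the same probabilities and labels.
cross_entropy = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# The two quantities differ only by sign and the factor n,
# so minimizing one is the same as maximizing the other.
print(log_lik, -n * cross_entropy)
```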
