Implementing Maximum Likelihood Estimation (MLE) in Python

Maximum Likelihood Estimator

In intraday trading, predicting the direction of stock prices, market rates, and other risk factors is of utmost importance, as large amounts of money are at stake. Much of this prediction is done by trading bots that implement different machine learning models. But how do these models estimate the unknown parameters present in the market?

Well, this is done with the help of a concept called the Maximum Likelihood Estimator. The Maximum Likelihood Estimator (MLE) is not just used in intraday trading, but also in other fields like biology and medicine, marketing and advertising, and many more.

The Maximum Likelihood Estimator (MLE) is a statistical method to estimate the unknown parameters of a probability distribution based on observed data. It finds the parameter value that maximizes the likelihood function. Implemented in Python, MLE can estimate the proportion of red marbles in a bag by drawing samples and calculating the proportion that are red.

In this article, we will understand in depth what MLE is, and how to implement it in Python Programming Language.

Recommended: Probability Distributions with Python (Implemented Examples)

What is the Maximum Likelihood Estimator (MLE)?

As mentioned above, MLE is a statistical method that estimates the unknown parameters of a probability distribution based on some data. Let’s take a deeper dive into what MLE is.

Any observed data is described by a probability distribution, generally with unknown parameters. These unknown parameters influence the shape of the distribution. To estimate them, MLE constructs a function, called the likelihood function, that represents the probability of observing the actual data points in the given dataset. It is similar to sampling, where you draw conclusions about the population based on the collected sample.

MLE then finds the value of the parameter which maximizes the likelihood function. This calculated value is considered the most likely parameter based on the observed data.

Maximum Likelihood Function
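In general, if x1, x2, …, xn are independent observations from a distribution with density f(x; θ) and unknown parameter θ, the likelihood function is the product of the individual densities, and the MLE is the value of θ that maximizes it:

L(θ) = f(x1; θ) × f(x2; θ) × … × f(xn; θ)

θ̂ = argmax over θ of L(θ)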

MLE relies on calculus: the estimate is found by differentiating the likelihood function and setting the derivative to zero. In practice, the log-likelihood is often maximized instead of the likelihood itself, because sums are easier to differentiate than products and both reach their maximum at the same parameter value. MLE is easy to implement and is consistent: as the sample size increases, the estimate converges to the true parameter value, and for large samples it also tends to be more efficient than most other estimators.
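As a worked example, consider the red-marble experiment used later in this article, where each draw follows a Bernoulli distribution with unknown proportion p. If k out of n drawn marbles are red, the log-likelihood and its maximizer are:

log L(p) = k × log(p) + (n − k) × log(1 − p)

d/dp log L(p) = k/p − (n − k)/(1 − p) = 0  ⟹  p̂ = k/n

So the MLE is simply the observed fraction of red marbles, which is exactly what the Python code later in this article computes.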

As with everything, MLE does have some disadvantages. One of them is that it requires an appropriate model: if the assumed distribution does not match the data, the estimate can be biased and will not represent the true value of the parameter. Another disadvantage is that it is sensitive to outliers, which can significantly distort the likelihood function, as the short sketch below illustrates.
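Here is a minimal sketch of that outlier sensitivity, using made-up numbers rather than the marble example: for a normal distribution, the MLE of the mean is simply the sample mean, so a single extreme value can pull the estimate far from the true center.

import statistics

# A small sample roughly centered around 5
clean_sample = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.3]

# The same sample with one extreme outlier added
sample_with_outlier = clean_sample + [50.0]

# For a normal distribution, the MLE of the mean is the sample mean
print("MLE of mean (clean):", statistics.mean(clean_sample))          # 5.0
print("MLE of mean (with outlier):", statistics.mean(sample_with_outlier))  # 10.0

A single outlier shifts the estimate from 5.0 to 10.0, which is why checking your data and model assumptions matters before trusting an MLE.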

Let us now understand MLE with an example in the Python programming language.

MLE in Python

In the example below, we are going to estimate the proportion of red marbles in a bag of marbles using MLE. We will draw 100 samples and then estimate the proportion of red marbles in the bag. Let us now look at the code below.

import random

def generate_data(n_samples, p_red):
  """Generates n_samples data points from a Bernoulli distribution with probability p_red."""
  data = []
  for _ in range(n_samples):
    data.append(1 if random.random() < p_red else 0)
  return data

def mle_red_proportion(data):
  """Estimates the proportion of red marbles using MLE."""
  # Calculate the number of red marbles
  n_red = sum(data)

  # Estimate the proportion (MLE)
  p_red_hat = n_red / len(data)
  return p_red_hat

# Generate random data (replace with your desired values)
n_samples = 100
p_red = 0.6  # True proportion of red marbles (unknown)
data = generate_data(n_samples, p_red)

# Estimate the proportion of red marbles using MLE
p_red_hat = mle_red_proportion(data)

# Print the results
print("Number of samples:", n_samples)
print("Estimated proportion of red marbles:", p_red_hat)

Let us now look at the output to understand the implementation of MLE in Python.

Maximum Likelihood Estimator Output

From the result, we estimate that about 62% of the marbles in the bag are red. Since the data is generated randomly, the exact estimate will vary slightly from run to run, but it should stay close to the true proportion of 0.6.
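If you want to see the "maximization" step explicitly rather than relying on the closed-form k/n formula, you can also evaluate the Bernoulli log-likelihood over a grid of candidate values of p and pick the best one. This is a minimal sketch that reuses the data list generated above and should land on approximately the same estimate:

import math

def log_likelihood(p, data):
  """Bernoulli log-likelihood of the data for a candidate proportion p."""
  n_red = sum(data)
  n_other = len(data) - n_red
  return n_red * math.log(p) + n_other * math.log(1 - p)

# Evaluate the log-likelihood on a grid of candidate proportions (0 and 1 excluded)
candidates = [i / 1000 for i in range(1, 1000)]
best_p = max(candidates, key=lambda p: log_likelihood(p, data))

print("Grid-search MLE of the proportion:", best_p)

The grid search agrees with the closed-form estimate up to the grid's resolution, which confirms that dividing the count of red marbles by the total number of draws really is the likelihood-maximizing choice.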

Conclusion

Here you go! Now you can estimate the unknown parameters of a probability distribution using the Maximum Likelihood Estimator (MLE). MLE is a simple yet powerful concept for estimating unknown parameters. With some sample data, you can now easily implement MLE in Python to solve real-world problems like estimating consumer preferences. In what other creative ways can you think of applying MLE?

Hope you enjoyed reading it!

Recommended: How To Calculate Power Statistics?

Recommended: Logistic Regression in Predictive Analytics: A Comprehensive Guide