If you are not sure whether or not your answer is correct, a useful way to check is to plot the log-likelihood function and see whether your supposed MLE visually appears to give the maximising value. I will do this below, but first I will set out the mathematics for obtaining the MLE in the general case.
MLE in the general case: For IID data from this distribution, each observation contributes $\ln \theta + (\theta-1) \ln x_i - x_i^\theta$ to the log-likelihood, so we have:
$$\ell_\mathbf{x}(\theta) = n \ln \theta + (\theta-1) \sum_{i=1}^n \ln x_i - \sum_{i=1}^n x_i^\theta \quad \quad \text{for } \theta>0.$$
The corresponding score function is:
$$s_\mathbf{x}(\theta) = \frac{d\ell_\mathbf{x}}{d\theta}(\theta) = \frac{n}{\theta} + \sum_{i=1}^n (1-x_i^\theta) \ln x_i,$$
and the observed information is:
$$I_\mathbf{x}(\theta) = - \frac{d^2\ell_\mathbf{x}}{d\theta^2}(\theta) = \frac{n}{\theta^2} + \sum_{i=1}^n x_i^\theta (\ln x_i)^2 > 0.$$
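(These two quantities are easy to compute directly in R, and having them as functions is handy for checking the iteration later. This is a minimal sketch of my own; the helper names SCORE and OBS_INFO are my choices, not anything from your problem:)
#Sketch: direct implementations of the score and observed information
#(my own helper functions, used for checking further below)
SCORE <- function(theta, x) {
  length(x)/theta + sum((1-x^theta)*log(x)); }
OBS_INFO <- function(theta, x) {
  length(x)/theta^2 + sum(x^theta*(log(x))^2); }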
We can see from the positive observed information that the log-likelihood is strictly concave, which means that the MLE will be at the unique critical point (unless the log-likelihood is monotone, in which case the supremum is approached at the boundary of the parameter range, and there is no MLE). The critical point is given implicitly by solving the score equation:
$$0 = s_\mathbf{x}(\hat{\theta}) = \frac{n}{\hat{\theta}} + \sum_{i=1}^n (1 - x_i^{\hat{\theta}}) \ln x_i.$$
There is no closed-form expression for the MLE in this case, so we need to find it numerically.
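(As an aside, because the score is strictly decreasing, any standard one-dimensional root-finder will also locate the MLE. Here is a minimal sketch using base R's uniroot together with the SCORE helper sketched above; the bracketing interval c(0.01, 10) is an assumption and may need widening for other data:)
#Alternative sketch: solve the score equation with a root-finder
#(the bracketing interval is an assumption; widen it if the score
#does not change sign over this range)
MLE_ROOT <- function(x) {
  uniroot(function(theta) SCORE(theta, x), interval = c(0.01, 10))$root; }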
Iterative algorithm for MLE: Applying Newton's method, with your chosen starting point, gives:
$$\hat{\theta}_0 = 1 \quad \quad \quad \hat{\theta}_{k+1} = \hat{\theta}_{k} + \frac{s_\mathbf{x}(\hat{\theta}_k)}{I_\mathbf{x}(\hat{\theta}_k)} = \hat{\theta}_{k} + \frac{n \hat{\theta}_k + \hat{\theta}_k^2 \sum_{i=1}^n (1 - x_i^{\hat{\theta}_k}) \ln x_i}{n + \hat{\theta}_k^2 \sum_{i=1}^n x_i^{\hat{\theta}_k} (\ln x_i)^2}.$$
(Note: The starting point you have chosen is a reasonable one. With some calculus, it can be shown that $\mathbb{E}(X) = \Gamma(1 + 1/\theta)$, so one could instead obtain a starting point by solving $\bar{x} = \Gamma(1 + 1/\hat{\theta}_0)$; a sketch of this appears after the code below. However, that equation itself requires numerical solution, so it offers little advantage. The value you have chosen is reasonable, and the iteration should converge rapidly in any case.) We can implement this iterative algorithm in the following R code:
#Create function to find the MLE via iteration
#The input m is the number of iterations to perform (default is five iterations)
MLE_ITERATION <- function(x, m = 5) {
  n     <- length(x);
  theta <- rep(1, m+1);
  for (k in 1:m) {
    NUMERATOR   <- n*theta[k] + theta[k]^2 * sum((1-x^theta[k])*log(x));
    DENOMINATOR <- n + theta[k]^2 * sum(x^theta[k]*(log(x))^2);
    theta[k+1]  <- theta[k] + NUMERATOR/DENOMINATOR; }
  theta; }
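For completeness, here is a sketch of the moment-based starting value discussed in the note above. It solves $\bar{x} = \Gamma(1 + 1/\theta)$ numerically with uniroot; the search interval c(0.05, 20) is an assumption, and a solution only exists when mean(x) exceeds the minimum of the gamma function (about 0.8856):
#Sketch: moment-based starting value (requires its own numerical solve,
#which is why it offers little advantage over starting at 1)
THETA_START <- function(x) {
  uniroot(function(theta) gamma(1 + 1/theta) - mean(x),
          interval = c(0.05, 20))$root; }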
Application to your data set: You have the data vector $\mathbf{x} = (0.60, 5.17, 0.23)$. With $m=10$ iterations (which is more than you need) you get the MLE $\hat{\theta} = 0.6771516$. The log-likelihood and the MLE are shown in the plot produced below. Here is the R code used to generate the MLE and the plot:
#Enter your data
x <- c(0.60, 5.17, 0.23);
#Choose number of iterations
m <- 10;
#Generate the iterations, and display the last value
THETA_ITER <- MLE_ITERATION(x, m);
THETA_ITER[m+1];
[1] 0.6771516
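As a quick sanity check (using the SCORE helper sketched earlier), the score evaluated at the final iterate should be approximately zero:
#Sanity check: the score should (approximately) vanish at the MLE
SCORE(THETA_ITER[m+1], x);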
#Generate vectorised log-likelihood function
LOG_LIKE <- function(theta) {
  LL <- rep(0, length(theta));
  for (i in 1:length(theta)) {
    LL[i] <- length(x)*log(theta[i]) + (theta[i]-1)*sum(log(x)) - sum(x^theta[i]); }
  LL; }
DATA <- data.frame(Theta    = 1:200/100,
                   Log_Like = LOG_LIKE(1:200/100));
#Plot the log-likelihood function with MLE
library(ggplot2);
ggplot(data = DATA, aes(x = Theta, y = Log_Like)) +
  geom_line(size = 1.2) +
  geom_vline(xintercept = THETA_ITER[m+1],
             size = 1.2, linetype = 'dashed', colour = 'red') +
  theme(plot.title    = element_text(hjust = 0.5, face = 'bold'),
        plot.subtitle = element_text(hjust = 0.5)) +
  ggtitle('Plot of Log-Likelihood Function') +
  labs(subtitle = '(Red line shows MLE - obtained via iteration)') +
  xlab(expression(theta)) + ylab('Log-Likelihood');
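Finally, as an independent check of the Newton iteration (my own addition, not part of the workflow above), base R's optimize can maximise the log-likelihood directly; it should agree with the iterated value to several decimal places. The search interval c(0.01, 2) is an assumption matching the plotted range:
#Independent check: maximise the log-likelihood directly
optimize(LOG_LIKE, interval = c(0.01, 2), maximum = TRUE)$maximum;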