part(a)
In the method of moments we formulate the moments of the probability law of the distribution to which the random variables belong, equate these moments to the moments obtained from the sample at hand, and solve for the unknown parameters.
But \(e^{-\frac {\left \vert x\right \vert }{\sigma }}\) is symmetric around \(x=0\) due to the absolute value of \(x\) in the exponent (this assumes \(\sigma \) is positive, which is of course true), but it is multiplied by negative \(x\) to the left of the y-axis and by positive \(x\) to the right of the y-axis; hence the area to the left of the y-axis is equal in magnitude but opposite in sign to the area to the right of the y-axis. Hence the above integral is zero, and so \(\mu _{1}=0\).
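To be concrete, assuming the density is the double exponential \(f\left ( x\mid \sigma \right ) =\frac {1}{2\sigma }e^{-\frac {\left \vert x\right \vert }{\sigma }}\), this first moment is
\[
\mu _{1}=\int _{-\infty }^{\infty }x\,\frac {1}{2\sigma }e^{-\frac {\left \vert x\right \vert }{\sigma }}\,dx=0
\]
since the integrand is an odd function.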
This moment provides no information. Find the second moment.
Due to the symmetry of \(e^{-\frac {\left \vert x\right \vert }{\sigma }}\), and since \(x^{2}\) is even and symmetric around \(x=0\), the above integral is twice the integral from \(x=0\) to \(\infty \), and it becomes
Integration by parts gives
Hence
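Under the same assumed density, the calculation works out to
\[
\mu _{2}=\int _{-\infty }^{\infty }x^{2}\frac {1}{2\sigma }e^{-\frac {\left \vert x\right \vert }{\sigma }}dx=\frac {1}{\sigma }\int _{0}^{\infty }x^{2}e^{-\frac {x}{\sigma }}dx=\frac {1}{\sigma }\left ( 2\sigma ^{3}\right ) =2\sigma ^{2}
\]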
Now find \(\mu _{2}\) from the sample itself and substitute for it in the above. From the sample,
Since the mean of the population was found to be zero, we can take the mean of the sample \(\bar {X}=0\)
Hence \(\mu _{2}\) from the sample becomes
Replacing the above in (1), we obtain the estimate of the population \(\sigma \) as
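That is, with the sample second moment \(\hat {\mu }_{2}=\frac {1}{n}\sum _{i=1}^{n}X_{i}^{2}\) and setting \(2\hat {\sigma }^{2}=\hat {\mu }_{2}\),
\[
\hat {\sigma }=\sqrt {\frac {1}{2n}\sum _{i=1}^{n}X_{i}^{2}}
\]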
___________________________________________________________________________________________
part(b)
The MLE of \(\sigma \) is found as follows. Since the random variables are i.i.d. we write (where \(L\left ( \sigma \right ) \) means \(lik\left ( \sigma \right ) \) and \(l\left ( \sigma \right ) \) means \(\log \left ( lik\left ( \sigma \right ) \right ) \))
Therefore
Now we find the MLE, which is the value of \(\sigma \) that maximizes the above function.
Hence
The above is the MLE estimate of the parameter \(\sigma .\)
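In summary, under the assumed double exponential density the calculation is
\[
l\left ( \sigma \right ) =\sum _{i=1}^{n}\log \left ( \frac {1}{2\sigma }e^{-\frac {\left \vert x_{i}\right \vert }{\sigma }}\right ) =-n\log \left ( 2\sigma \right ) -\frac {1}{\sigma }\sum _{i=1}^{n}\left \vert x_{i}\right \vert
\]
\[
\frac {dl}{d\sigma }=-\frac {n}{\sigma }+\frac {1}{\sigma ^{2}}\sum _{i=1}^{n}\left \vert x_{i}\right \vert =0\qquad \Rightarrow \qquad \hat {\sigma }=\frac {1}{n}\sum _{i=1}^{n}\left \vert X_{i}\right \vert
\]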
part(c)
The asymptotic distribution of the MLE \(\hat {\sigma }\) is normal with mean \(\sigma \) and variance \(\frac {1}{nI\left ( \sigma \right ) }\)where
But
and
Hence
We need to find \(E\left \vert X_{i}\right \vert \); since the variables are i.i.d., each has the same expected value as \(X\), hence
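Under the assumed density,
\[
E\left \vert X\right \vert =\int _{-\infty }^{\infty }\left \vert x\right \vert \frac {1}{2\sigma }e^{-\frac {\left \vert x\right \vert }{\sigma }}dx=\frac {1}{\sigma }\int _{0}^{\infty }xe^{-\frac {x}{\sigma }}dx=\frac {1}{\sigma }\left ( \sigma ^{2}\right ) =\sigma
\]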
Therefore from (2)
Hence the Fisher information is
Hence MLE \(\hat {\sigma }_{n}\ \)has an asymptotic distribution \(\sim N\left ( \sigma ,\frac {1}{nI\left ( \sigma \right ) }\right ) \)
i.e.
and
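Putting the pieces together, under the assumed density \(\log f\left ( x\mid \sigma \right ) =-\log \left ( 2\sigma \right ) -\frac {\left \vert x\right \vert }{\sigma }\), so
\[
I\left ( \sigma \right ) =-E\left [ \frac {\partial ^{2}}{\partial \sigma ^{2}}\log f\left ( X\mid \sigma \right ) \right ] =-\frac {1}{\sigma ^{2}}+\frac {2E\left \vert X\right \vert }{\sigma ^{3}}=\frac {1}{\sigma ^{2}}
\]
and therefore \(\hat {\sigma }_{n}\sim N\left ( \sigma ,\frac {\sigma ^{2}}{n}\right ) \) asymptotically, with standard error \(\frac {\sigma }{\sqrt {n}}\).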
part(a)
The random variable here is the lifetime of a component.
In this problem the contribution to the likelihood function of \(\lambda \) comes from only one random variable. Hence we need to find the pdf of this random observation, which is an order statistic: it is the minimum among \(n\) random variables, where \(n=5\) here.
Since this is an exponential distribution, we know that the distribution of \(X_{\left ( 1\right ) }\) is given by (from section 3.7, chapter 3, textbook)
where in the above, \(t\) is the time of the first failure in each sample taken (the sample size is 5 in this problem).
Hence
so for \(n=5,\) the likelihood function is
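Explicitly, since \(P\left ( X_{\left ( 1\right ) }>t\right ) =P\left ( \text {all }X_{i}>t\right ) =e^{-n\lambda t}\), the density of the minimum is
\[
f_{X_{\left ( 1\right ) }}\left ( t\right ) =n\lambda e^{-n\lambda t}
\]
so for \(n=5\) the likelihood of the single observation \(t\) is \(lik\left ( \lambda \right ) =5\lambda e^{-5\lambda t}\).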
Hence we need to find the maximum of the above function. Since we have only one r.v., there is no need to take logs; use the standard method:
Solve for \(\hat {\lambda }\)
But here \(n=5\) and the time of first failure is \(t=100\); hence, writing \(T=100\), the above becomes
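Setting the derivative to zero gives
\[
\frac {d}{d\lambda }\left ( 5\lambda e^{-5\lambda T}\right ) =5e^{-5\lambda T}\left ( 1-5\lambda T\right ) =0\qquad \Rightarrow \qquad \hat {\lambda }=\frac {1}{5T}=\frac {1}{500}=0.002
\]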
part(b)
Since \(\hat {\lambda }=\frac {1}{5T}\), where \(T\) is a r.v. (the first time to fail) which has the density \(5\lambda e^{-5\lambda t}\), and an exponential density has the form \(\tau e^{-\tau t}\), we conclude that the sampling distribution of \(\hat {\lambda }\) is that of a multiple (\(\frac {1}{5}\)) of one over an exponential random variable with parameter \(\tau =5\lambda \).
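One way to make this explicit: for \(y>0\),
\[
P\left ( \hat {\lambda }\leq y\right ) =P\left ( \frac {1}{5T}\leq y\right ) =P\left ( T\geq \frac {1}{5y}\right ) =e^{-\frac {\lambda }{y}}
\]
so the density of \(\hat {\lambda }\) is \(\frac {\lambda }{y^{2}}e^{-\frac {\lambda }{y}}\).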
(When asked to find distribution of some r.v., do we always have to express in terms of "known" distributions?)
___________________________________________________________________________________________
part(c)
We need to find the standard deviation of the sampling distribution of \(\hat {\lambda }\) found above.
Since we found that \(\hat {\lambda }\) is distributed as one over an exponential distribution with parameter \(\tau \), and the variance of an exponential with parameter \(\tau \) is \(\frac {1}{\tau ^{2}}\), the variance of \(\hat {\lambda }\) is \(\tau ^{2}\).
Hence the standard error is the square root of this variance, so the standard error of the MLE \(\hat {\lambda }\) is \(\tau =5\lambda \).
part (a)
First, I want to say that I am using the following definition of the Gamma pdf (using \(\beta \) instead of \(\lambda \)), since the data given have units of time and are not a rate (i.e. 1/time). So I am using this definition of the Gamma PDF
Now to answer part (A).
Yes. The following shows the histogram of the data, and a plot of a Gamma distribution with the shape parameter \(\alpha =1\) and scale parameter \(\beta \) set to the average of the data.
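A minimal sketch of how such an overlay can be produced, assuming the arrival times are stored in a list named data (the name is a placeholder):

(* histogram of the data with a Gamma(1, mean) pdf overlaid *)
Show[
 Histogram[data, Automatic, "PDF"],
 Plot[PDF[GammaDistribution[1, Mean[data]], x], {x, 0, Max[data]}, PlotStyle -> Red]
]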
part(b)
Using the method of moments, we need 2 equations since we have to estimate the 2 parameters \(\alpha ,\lambda \). For the Gamma distribution
Now from the data itself, calculate the first and second moments, equate them to the above, and solve for \(\alpha ,\lambda \); these will be our estimates. This little code does the above
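In terms of the rate \(\lambda =1/\beta \), the Gamma moments are \(\mu _{1}=\frac {\alpha }{\lambda }\) and \(\mu _{2}-\mu _{1}^{2}=\frac {\alpha }{\lambda ^{2}}\), which give \(\hat {\alpha }=\frac {\bar {X}^{2}}{\hat {\sigma }^{2}}\) and \(\hat {\lambda }=\frac {\bar {X}}{\hat {\sigma }^{2}}\). A minimal sketch of this calculation, again assuming the data is in the list data:

xbar = Mean[data];
s2 = Mean[data^2] - xbar^2;   (* second central moment, with n in the denominator *)
alphaMoM = xbar^2/s2          (* shape estimate *)
lambdaMoM = xbar/s2           (* rate estimate; the scale estimate is 1/lambdaMoM *)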
Now using the MLE method. For \(\alpha \)
Hence we obtain the 2 equations
Setting the second equation to zero we obtain
Substituting the above into the first equation and setting it to zero we obtain
and solve for \(\hat {\alpha }\) numerically. Once we find \(\hat {\alpha }\) we then also find \(\hat {\lambda }=\frac {\hat {\alpha }}{\bar {X}}\).
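The equation to solve is \(\log \hat {\alpha }-\psi \left ( \hat {\alpha }\right ) =\log \bar {X}-\frac {1}{n}\sum \log X_{i}\), where \(\psi \) is the digamma function. A minimal sketch of the numerical solution, assuming the same data list and the method of moments estimate alphaMoM from above as the starting point:

xbar = Mean[data];
logbar = Mean[Log[data]];
alphaMLE = a /. FindRoot[Log[a] - PolyGamma[0, a] == Log[xbar] - logbar,
   {a, alphaMoM}]            (* start the root search at the MoM estimate *)
lambdaMLE = alphaMLE/xbar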
part(c)
Now fit this model again, and compare the MLE fit to the method of moments fit.
This plot shows the fits more closely, on top of each other. They are very close, so it is hard to see the difference other than near the high-frequency part.
The fits above both look reasonable.
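A minimal sketch of how such an overlay can be produced, using the data list and the estimates alphaMoM, lambdaMoM, alphaMLE, lambdaMLE defined above:

(* histogram with both fitted Gamma pdfs on top of it *)
Show[
 Histogram[data, Automatic, "PDF"],
 Plot[{PDF[GammaDistribution[alphaMoM, 1/lambdaMoM], x],
       PDF[GammaDistribution[alphaMLE, 1/lambdaMLE], x]},
  {x, 0, Max[data]}, PlotStyle -> {Red, Dashed}]
]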
part(d)
Use the bootstrap method.
For the method of moments.
Try \(n=500\), the same size as the sample. Use the method of moments parameters to generate \(n\) random variables from a Gamma distribution. The first time, use the parameters estimated from the data as shown above.
Now, use the sample generated above to estimate the parameters again, also using the method of moments. Use these parameters to generate another \(n\) random variables. Repeat this process for say \(N=5000\) iterations and find the variances of the parameters \(\alpha ,\lambda \); the standard errors are then the square roots of these variances.
Here is the code to do the above and the result
(Last minute update) I am getting large results for the standard error from the bootstrap method. I think I have something wrong. Here is the result I get and the code.
For the method of moments, I get a standard error for alpha of 918 and for lambda of 18.
For MLE I get
Standard error for alpha=1.68697*10^8
Standard error for lambda=60.2585
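For reference, a minimal sketch of a standard parametric bootstrap (every resample generated from the original method of moments estimates, rather than chaining one resample to the next), assuming the data list and the estimates alphaMoM, lambdaMoM defined above:

n = Length[data]; nBoot = 1000;
boot = Table[
   Module[{sample, m, v},
    sample = RandomVariate[GammaDistribution[alphaMoM, 1/lambdaMoM], n];
    m = Mean[sample]; v = Mean[sample^2] - m^2;
    {m^2/v, m/v}],             (* MoM estimates of alpha and lambda for this resample *)
   {nBoot}];
StandardDeviation /@ Transpose[boot]   (* bootstrap standard errors of {alpha, lambda} *)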
part(e) and (f)
Ran out of time.
Mathematica notebook for corrected version
Text file of gamma arrivals data
16/20