Doctor of Philosophy (PhD)
R. Carter Hill
Maximum entropy estimation is a relatively new estimation technique in econometrics. We carry out several Monte Carlo experiments based on real data in order to understand the properties of the maximum entropy estimator. We compare the maximum entropy and generalized maximum entropy estimators with traditional estimation techniques in linear regression, binary choice, and multinomial choice models. In addition, we discuss maximum entropy estimation in censored and truncated regression models. We find that the generalized maximum entropy estimator dominates the logit and multinomial logit estimators in Monte Carlo experiments. In discrete choice models, the generalized maximum entropy estimator allows us to estimate the unknown probabilities and the unknown errors jointly, which yields more uniform predicted probabilities and reduces the variance of the parameter estimates. In the linear regression problem, the generalized maximum entropy estimator allows us to impose nonsample information about the unknown parameters and errors. However, we must specify a set of support points for the unknown parameters and errors, which is not always straightforward. We find that when the nonsample information we specify is correct, the generalized maximum entropy estimator has lower risk than either the ordinary least squares or the inequality restricted least squares estimator. From our sampling experiments using real data, we conclude that maximum entropy estimation is a viable technique in several econometric models.
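To make the support-point idea concrete, the following is a minimal sketch of a generalized maximum entropy (GME) estimator for a linear regression. The synthetic data, the particular support points, and the use of scipy's SLSQP solver are illustrative assumptions, not the dissertation's actual experiments. Each coefficient is written as a probability-weighted average over its support points, each error likewise, and the joint entropy of those probabilities is maximized subject to the data constraints and adding-up conditions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Small synthetic linear model y = X @ beta + e (illustrative data only)
n, K = 10, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Nonsample information: support points for parameters (Z) and errors (V)
M, J = 3, 3
Z = np.array([[-5.0, 0.0, 5.0]] * K)   # wide, symmetric parameter supports
V = np.array([-3.0, 0.0, 3.0])         # error supports (a "3-sigma"-style choice)

def unpack(q):
    p = q[:K * M].reshape(K, M)        # parameter probabilities, one row per beta_k
    w = q[K * M:].reshape(n, J)        # error probabilities, one row per observation
    return p, w

def neg_entropy(q):
    # Minimize sum q ln q  <=>  maximize joint entropy of (p, w)
    return np.sum(q * np.log(q + 1e-12))

def data_constraint(q):
    # Enforce y_t = x_t' (Z p) + v' w_t for every observation t
    p, w = unpack(q)
    beta = (Z * p).sum(axis=1)
    e = w @ V
    return y - (X @ beta + e)

def adding_up(q):
    # Each probability vector must sum to one
    p, w = unpack(q)
    return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

q0 = np.full(K * M + n * J, 1.0 / 3.0)  # uniform start = maximum entropy point
res = minimize(neg_entropy, q0, method="SLSQP",
               bounds=[(1e-10, 1.0)] * q0.size,
               constraints=[{"type": "eq", "fun": data_constraint},
                            {"type": "eq", "fun": adding_up}])

p_hat, w_hat = unpack(res.x)
beta_gme = (Z * p_hat).sum(axis=1)      # recovered coefficient estimates
print(beta_gme)
```

The sketch illustrates the sensitivity discussed above: the estimates depend on the chosen supports, and with correct (e.g., symmetric, suitably wide) supports the entropy objective shrinks the estimates toward the support centers, which is the source of the risk reduction relative to least squares.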
Campbell, Randall Charles, "An Empirical Examination of Maximum Entropy Estimation." (1999). LSU Historical Dissertations and Theses. 6914.