Persistent and transient productive inefficiency: a maximum simulated likelihood approach
Date: 2016
Abstract
The productive efficiency of a firm can be seen as composed of two parts, one persistent and one transient. The received empirical literature on the measurement of productive efficiency has paid relatively little attention to the difference between these two components. Ahn and Sickles (Econ Rev 19(4):461–492, 2000) suggested some approaches that pointed in this direction. The possibility was also raised in Greene (Health Econ 13(10):959–980, 2004. doi:10.1002/hec.938), who expressed some pessimism over the possibility of distinguishing the two empirically. Recently, Colombi (A skew normal stochastic frontier model for panel data, 2010) and Kumbhakar and Tsionas (J Appl Econ 29(1):110–132, 2012), in a milestone extension of the stochastic frontier methodology, have proposed a tractable model based on panel data that promises to provide separate estimates of the two components of efficiency. The approach developed in the original presentation proved very cumbersome to implement in practice. Colombi (2010) notes that FIML estimation of the model is ‘complex and time consuming.’ In a sequence of papers, Colombi (2010), Colombi et al. (A stochastic frontier model with short-run and long-run inefficiency random effects, 2011; J Prod Anal, 2014), Kumbhakar et al. (J Prod Anal 41(2):321–337, 2012) and Kumbhakar and Tsionas (2012) have suggested other strategies, including a four-step least squares method. The main point of this paper is that full maximum likelihood estimation of the model is neither complex nor time consuming. The extreme complexity of the log likelihood noted in Colombi (2010) and Colombi et al. (2011, 2014) is reduced by using simulation and exploiting the Butler and Moffitt (Econometrica 50:761–764, 1982) formulation. In this paper, we develop a practical full information maximum simulated likelihood estimator for the model. The approach is very effective and strikingly simple to apply, and it uses all of the sample distributional information to obtain the estimates. We also implement the panel data counterpart of the Jondrow et al. (J Econ 19(2–3):233–238, 1982) estimator for technical or cost inefficiency. The technique is applied in a study of the cost efficiency of Swiss railways.
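The following is a minimal sketch, in Python, of how a maximum simulated likelihood criterion of this general kind can be set up. It assumes a specification of the form y_it = x_it'beta + w_i + h_i + v_it + u_it for a cost frontier, with w_i normal firm heterogeneity, h_i half-normal persistent inefficiency, v_it normal noise and u_it half-normal transient inefficiency; conditional on (w_i, h_i), each observation then has the familiar normal/half-normal stochastic frontier density, and the firm-level likelihood is obtained by averaging over simulated draws of (w_i, h_i). All function and variable names, the simulated data set, and the use of plain pseudo-random rather than Halton draws are illustrative choices and are not taken from the paper.

import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def sf_logpdf(eps, sv, su):
    # Log of the normal/half-normal convolution density for a cost frontier,
    # i.e. eps = v + u with v ~ N(0, sv^2) and u = |N(0, su^2)|.
    s = np.sqrt(sv ** 2 + su ** 2)
    lam = su / sv
    return np.log(2.0 / s) + norm.logpdf(eps / s) + norm.logcdf(lam * eps / s)

def neg_simulated_loglik(theta, y, X, firm_ids, draws):
    # Simulated log likelihood: for each firm, average the product of the
    # conditional stochastic frontier densities over R draws of (w_i, h_i).
    k = X.shape[1]
    beta = theta[:k]
    sv, su, sw, sh = np.exp(theta[k:])           # scale parameters kept positive
    resid = y - X @ beta
    ll = 0.0
    for idx, firm in enumerate(np.unique(firm_ids)):
        e_i = resid[firm_ids == firm]            # residuals for firm i, shape (T_i,)
        w = sw * draws[idx][:, 0]                # simulated firm heterogeneity, shape (R,)
        h = sh * np.abs(draws[idx][:, 1])        # simulated persistent inefficiency
        eps = e_i[None, :] - w[:, None] - h[:, None]     # (R, T_i)
        lp = sf_logpdf(eps, sv, su).sum(axis=1)          # log joint density per draw
        ll += lp.max() + np.log(np.mean(np.exp(lp - lp.max())))   # log of simulated mean
    return -ll

# Hypothetical simulated data set and estimation.
N, T, R = 100, 8, 200
firm_ids = np.repeat(np.arange(N), T)
X = np.column_stack([np.ones(N * T), rng.normal(size=N * T)])
beta_true = np.array([1.0, 0.5])
w_true = rng.normal(0.0, 0.3, N)
h_true = np.abs(rng.normal(0.0, 0.4, N))
v = rng.normal(0.0, 0.2, N * T)
u = np.abs(rng.normal(0.0, 0.3, N * T))
y = X @ beta_true + w_true[firm_ids] + h_true[firm_ids] + v + u
draws = [rng.normal(size=(R, 2)) for _ in range(N)]   # Halton draws would typically be used
theta0 = np.concatenate([np.zeros(2), np.log([0.2, 0.2, 0.2, 0.2])])
fit = minimize(neg_simulated_loglik, theta0, args=(y, X, firm_ids, draws), method='BFGS')
print(fit.x[:2], np.exp(fit.x[2:]))              # estimated beta and standard deviations

In practice the pseudo-random draws would be replaced by Halton sequences, and firm- and period-level inefficiency estimates would be recovered from the simulated conditional distribution of (h_i, u_it), in the spirit of the panel data Jondrow et al. (1982) estimator mentioned in the abstract.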
Related items
Showing items related by title, author, creator and subject.
- Zhao, X.; Scarrott, C.; Oxley, Leslie; Reale, M. (2010) This article introduces a new approach for estimating Value at Risk (VaR), which is then used to show the likelihood of the impacts of the current financial crisis. A commonly used two-stage approach is taken, by combining ...
- Awange, Joseph; Palancz, B.; Lewis, R.; Lovas, T.; Heck, B.; Fukuda, Y. (2016) Traditionally, the least-squares method has been employed as a standard technique for parameter estimation and regression fitting of models to measured points in data sets in many engineering disciplines, geoscience fields ...
- Yick, John S. (2000) The influence of observations on the outcome of an analysis is of importance in statistical data analysis. A practical and well-established approach to influence analysis is case deletion. However, it has its drawbacks ...