Estimating most productive scale size with stochastic data in data
envelopment analysis
M. Khodabakhshi ⁎
Department of Mathematics, Faculty of Science, Lorestan University, Khorram Abad, Iran
Article info
Article history: Accepted 4 March 2009
Keywords: Stochastic data; Most productive scale size (MPSS); Chance constraints; Software companies

Abstract
This article estimates the most productive scale size in stochastic data envelopment analysis (DEA). Jahanshahloo and
Khodabakhshi [Jahanshahloo, G.R. and Khodabakhshi, M., Using input–output orientation model for determining
most productive scale size in DEA. Applied Mathematics and Computation 2003, 146(2–3), 849–855.] studied the most
productive scale size in classic data envelopment analysis. Classic data envelopment analysis requires that the
values of all inputs and outputs be known exactly. However, this assumption may not hold, because data in many
real applications cannot be measured precisely. One important way to deal with imprecise data is to
consider stochastic data in DEA. Therefore, this research studies the most productive scale size with
stochastic data in DEA. To that end, the input–output orientation model introduced by Jahanshahloo and
Khodabakhshi (2003) is extended to stochastic data envelopment analysis. To solve the stochastic model, a
deterministic equivalent is obtained. Although the deterministic equivalent is non-linear, it can be converted to a
quadratic program. Furthermore, data from software companies are used to apply the proposed approach. The
performance of the software companies is evaluated based on their scale sizes in both classic and stochastic data
envelopment analysis.
© 2009 Elsevier B.V. All rights reserved.
1. Introduction
Data envelopment analysis (DEA) was initiated by Charnes et al. (1978),
whose first model became known as the CCR model. The CCR model is a
linear programming problem and is readily computable. Research on
efficiency measurement for production units in the field of operational
research took off with the introduction of the CCR model. For
example, Banker et al. (1984) provided some models for estimating
technical and scale inefficiencies in data envelopment analysis. They
extended DEA by adding a convexity constraint to obtain a new model
known as the BCC model, a variable returns-to-scale version of the
CCR model. Tone (2001) also introduced a non-radial model, known
as the slack-based measure, to evaluate the efficiency of decision making units.
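As context for the models discussed above, the input-oriented CCR envelopment model can be sketched as a small linear program: minimize the input contraction factor θ subject to the evaluated unit being dominated by a non-negative combination of the observed units. The sketch below is illustrative only and not from this paper; the function name and toy data are assumptions, and it relies on scipy.optimize.linprog.

```python
# Minimal sketch of the input-oriented CCR envelopment model (assumed
# formulation, standard in the DEA literature; not the paper's model).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs.
    Decision variables: [theta, lambda_1, ..., lambda_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimize theta
    # sum_j lambda_j x_ij - theta * x_io <= 0   (input constraints)
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # -sum_j lambda_j y_rj <= -y_ro             (output constraints)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([b_in, b_out])
    bounds = [(0, None)] * (1 + n)              # theta, lambda >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]
```

For instance, with one input x = (2, 4, 8) and one output y = (2, 2, 4), the first unit has the best output-to-input ratio and scores θ = 1, while the other two score θ = 0.5 under constant returns to scale.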
The original DEA models, CCR and BCC, only allow changes in the input
combination of a decision making unit within the limits of its observed
inputs. Cooper et al. (2001), in a notable study of congestion management
in the Chinese textile industry, increased the labor input, reduced the
capital input, and showed that the new combination could yield constructive results.
The idea initiated by Cooper et al. (2001) motivated further work and
new models in DEA, e.g., Jahanshahloo and Khodabakhshi (2004) and
Khodabakhshi (2009). These models allow more flexibility in changing
the input combination to find the maximum possible output and
can be useful for resource management. Since 1978 there has been a
surge of research on DEA, and many further models have been introduced in
the literature. See, for example, Andersen and Petersen (1993), Adler
et al. (2002), and Li et al. (2007), which present different approaches for
ranking the efficient units identified by the original DEA models. Note that
although efficient units are benchmarks for inefficient ones, they are not
comparable among themselves under the original DEA models. A thorough
discussion of developments in DEA up to 1996 can be found in
Cooper et al. (1996). The model proposed in Cooper et al. (1996) for
determining the most productive scale size has a fractional objective
function. Jahanshahloo and Khodabakhshi (2003) also provided an
input–output orientation model, with a linear objective function, for
estimating most productive scale size units. Both models can determine most
productive scale size units, but the latter is easier to solve because of its
linear objective function. One may refer to Cooper et al. (2000) and
Thanassoulis (2001), which cover most of the developments and
extensions in DEA. One of the advantages of the DEA
method is that it requires neither a priori weights nor explicit
specification of functional relations among the multiple inputs and
outputs. However, one of its weaknesses is that DEA does not allow for
stochastic variations in input–output data, such as measurement errors
and data entry errors. Traditionally, the coefficients of DEA models,
i.e., the input and output data of the different decision making units
(DMUs), are assumed to be measured with precision. However, as some
authors point out (see, e.g., Liu, 1999), this is not always possible.
To remove this weakness in the classic
Economic Modelling 26 (2009) 968–973
⁎ Tel.: +98 9126278846; fax: +98 661 2201333.
E-mail address: mkhbakhshi@yahoo.com.
doi:10.1016/j.econmod.2009.03.002