IEEE TRANSACTIONS ON POWER DELIVERY, VOL. 24, NO. 3, JULY 2009 1319
Statistical Analysis of Exceptional Events:
The Italian Regulatory Experience
Elena Fumagalli, Luca Lo Schiavo, Anna Maria Paganoni, and Piercesare Secchi
Abstract—In the analysis of reliability performance of distribu-
tion utilities, as well as in continuity of supply regulation, criteria
are needed for separating normal operation data from exceptional
events. In recent years, a number of statistical methodologies have
been proposed for this purpose. We present here the new method-
ology that was adopted by the Italian regulatory authority at the
beginning of 2008. The decision is supported by a statistical anal-
ysis of the number of faults on the medium voltage (MV) and low
voltage (LV) networks, for each six-hour time interval in a three
year time span, for different provinces and distribution companies.
The new methodology is employed in the reward and penalty mech-
anisms that regulate the SAIDI, SAIFI, and MAIFI indicators and,
with some original provisions, also in the Guaranteed Standard on
maximum restoration times.
Index Terms—Continuity of supply, exceptional events, perfor-
mance-based regulation, reliability, statistical methodology.
I. INTRODUCTION
RELIABILITY performance of distribution utilities has received considerable attention in recent years. The introduction of continuity of supply regulation in several European
duction of continuity of supply regulation in several European
countries and an increased awareness by the customers are the
key factors in this process. Analyses of continuity of supply indicators are fundamental for setting regulatory targets, monitoring
utility performance, and disseminating information to the public
[1], [2].
One of the main problems in the analysis is how to identify
events that are exceptional with respect to normal performance
(where, by exceptional events, we mean events outside the control of the distribution utilities that can affect their reliability performance, such as severe weather). The exclusion of these extreme cases from the data set enables utilities, regulators, and the public to observe more meaningful trends in “normal operation” performance that would otherwise be hard to capture.
In addition, regulatory instruments employed in quality regula-
tion usually penalize and/or reward utilities on the basis of ex-
pected performance. It is therefore crucial to understand clearly
when failure to meet regulatory targets is due to the utility's behavior or to events outside the utility's control. Moreover, even if some events, such as extreme weather conditions, are unavoidable, regulators have become increasingly interested in controlling the efficiency and effectiveness of utility restoration schemes under such conditions.
Manuscript received March 06, 2008; revised August 20, 2008. Current version published June 24, 2009. Paper no. TPWRD-00148-2008.
E. Fumagalli is with the Department of Management, Economics and Industrial Engineering, Politecnico di Milano, Milan, Italy (e-mail: elena.fumagalli@polimi.it).
L. Lo Schiavo is with the Italian Regulatory Authority for Electricity and Gas, Quality and Consumers Affairs, Milan, Italy (e-mail: lloschiavo@autorita.energia.it). Views expressed in this paper do not necessarily reflect those of the institution he currently works for.
A. M. Paganoni and P. Secchi are with MOX, Department of Mathematics, Politecnico di Milano, Milan, Italy (e-mail: anna.paganoni@polimi.it; piercesare.secchi@polimi.it).
Digital Object Identifier 10.1109/TPWRD.2008.2007013
Traditional criteria for separating continuity of supply data
into normal operation data and exceptional data are based on
definitions of exceptional events, given in terms of number of
customers interrupted, duration of the interruption, weather
conditions, extent of the mechanical damage to the distribution
system, and combinations of these factors. Criteria of this sort,
however, are not always sufficiently unambiguous or objective
in the implementation phase [3], [4]. For similar reasons, they
have also been studied and discussed in the literature [5]–[7].
A few recent contributions have attempted to overcome these difficulties through the use of statistical methodologies [4], [7]–[9]. A statistical approach is indeed expected to offer significant advantages, reducing ambiguity and increasing fairness. Nevertheless, statistical analysis
of exceptional events can be performed in very different ways,
depending on the choice (often the availability) of the quality
indicator, on the spatial and temporal units of such a measure,
and on the statistical methodology employed. In addition, the
choice of the threshold that will separate normal from excep-
tional events allows for a fair amount of discretion.
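One common way to operationalize such a threshold is to flag days whose reliability measure deviates strongly from the sample mean. The sketch below is purely illustrative: the data, the use of daily SAIDI as the indicator, and the cutoff of two standard deviations are our own assumptions, not taken from any of the cited methodologies.

```python
import statistics

# Hypothetical daily SAIDI values (minutes lost per customer per day).
daily_saidi = [1.2, 0.8, 1.5, 0.9, 1.1, 14.7, 1.0, 0.7, 1.3, 22.4]

# Illustrative rule: a day is "exceptional" if its SAIDI exceeds the
# sample mean by more than k standard deviations. The value k = 2 is
# an arbitrary example; in practice the regulator chooses the cutoff.
k = 2
mu = statistics.mean(daily_saidi)
sigma = statistics.stdev(daily_saidi)
threshold = mu + k * sigma

normal = [v for v in daily_saidi if v <= threshold]
exceptional = [v for v in daily_saidi if v > threshold]
```

Note how much discretion remains even in this simple scheme: a different k, a different indicator, or a different temporal unit would classify the borderline days differently.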
In this paper, after reviewing the existing statistical ap-
proaches (Sections II and III), we present the new methodology
that was adopted by the Italian regulatory authority at the
beginning of 2008. Section IV describes the statistical basis for
this decision and Section V discusses the regulatory aspects.
Section VI concludes and indicates directions for further work.
II. STATISTICAL METHODOLOGIES:
U.S. AND U.K. APPROACHES
A. Beta Method
After careful consideration of several alternatives, the IEEE Distribution Reliability Working Group¹ created, in 2004, a statistical methodology called the Beta Method, which allows segmentation of reliability data into normal and exceptional categories [7]. The main purpose of the methodology is to enable utilities and regulators to study reliability performance that is observed during normal operating periods (i.e., to identify trends in this performance). To this end, the large statistical effects of major events need to be separated from normal operation data. Other desirable properties of the methodology are: fairness
¹Part of the Distribution Subcommittee that reports to the IEEE Power and Energy Society (PES) Transmission and Distribution Technical Committee.
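In its best-known form (the "2.5 beta" rule later incorporated in IEEE Std 1366), the Beta Method fits a log-normal model to daily SAIDI: with α and β the mean and standard deviation of the natural logarithms of the daily values, a day is classified as a Major Event Day when its SAIDI exceeds T_MED = exp(α + 2.5β). A minimal sketch, assuming this form and using invented data:

```python
import math
import statistics

# Hypothetical daily SAIDI history (minutes/customer/day); in practice
# several years of data are used and zero-SAIDI days are excluded.
daily_saidi = [0.9, 1.4, 0.7, 1.1, 2.0, 0.8, 1.3, 35.0, 1.0, 1.6]

logs = [math.log(v) for v in daily_saidi if v > 0]
alpha = statistics.mean(logs)   # mean of the natural logs
beta = statistics.stdev(logs)   # standard deviation of the natural logs

# Major Event Day threshold under the 2.5-beta rule.
t_med = math.exp(alpha + 2.5 * beta)

major_event_days = [v for v in daily_saidi if v > t_med]
```

Working on logarithms accommodates the heavy right tail of daily SAIDI, which is why a plain mean-plus-k-sigma rule on the raw values tends to perform poorly on the same data.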
0885-8977/$25.00 © 2008 IEEE