Advances in Computational Research, ISSN: 0975–3273, Volume 1, Issue 2, 2009, pp-1-09
Copyright © 2009, Bioinfo Publications
A Comprehensive Note on Complexity Issues in Sorting Algorithms
Parag Bhalchandra*, Nilesh Deshmukh, Sakharam Lokhande, Santosh Phulari*
*School of Computational Sciences, Swami Ramanand Teerth Marathwada University, Nanded, MS,
India, 431606, Email: pub1976@rediffmail.com
Abstract- Since the dawn of computing, the sorting problem has attracted a great deal of research. In the past, many researchers have attempted to optimize it using empirical analysis. We have investigated the complexity values researchers have obtained and observed that there is scope for fine tuning in the present context. Strong evidence to that effect is also presented. We aim to provide a useful and comprehensive note to researchers about how the complexity aspects of sorting algorithms can best be analyzed. It is also intended to make current researchers consider whether their own work might be improved by the suggested fine tuning. Our work is based on knowledge gained from a literature review of the experimentation and survey papers concerned with the performance improvement of sorting algorithms. Although written from the perspective of a theoretical computer scientist, it is intended to be of use to researchers from all fields who want to study sorting algorithms rigorously.
Keywords: Algorithm analysis, Sorting algorithm, Empirical analysis, Computational complexity notations.
1. Introduction
Searching and sorting are tasks frequently encountered in various computer applications. Since they reflect fundamental operations that must be tackled quite often, researchers have attempted in the past to develop algorithms that are efficient in terms of memory and time requirements, i.e., their space and time complexities.
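The time complexity of a sorting algorithm can be probed empirically. As a minimal sketch (the function names and input sizes are ours, not taken from the literature under review), the following Python script times a quadratic-time insertion sort against Python's built-in O(n log n) sort on doubling input sizes:

```python
import random
import time

def insertion_sort(a):
    """Simple quadratic-time sort: its running time roughly
    quadruples whenever the input size doubles."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def time_sort(sort_fn, n):
    """Return the wall-clock time taken to sort n random integers."""
    data = [random.randrange(n) for _ in range(n)]
    start = time.perf_counter()
    sort_fn(data)
    return time.perf_counter() - start

# Compare how the running times grow as the input size doubles.
for n in (1000, 2000, 4000):
    print(f"n={n}: insertion={time_sort(insertion_sort, n):.4f}s "
          f"builtin={time_sort(sorted, n):.6f}s")
```

On each doubling of n, the insertion sort column should grow by roughly a factor of four, while the built-in sort grows only slightly faster than a factor of two, illustrating the difference in growth rates discussed below.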
Together with searching, sorting is probably the most heavily used operation in computing; statistically, computers spend around half of their time sorting. Sorting algorithms remain attractive because the amount of time computers spend on sorting has always been a matter of research attention.
For this reason, the development of fast,
efficient and inexpensive algorithms for sorting
and ordering lists and arrays is a fundamental
field of computing. By optimizing sorting,
computing as a whole becomes faster. When we set out to develop or use a sorting algorithm on large problems, we came across previous research literature that clearly advised concentrating on how long the algorithm might take to run. We found that the time needed by most sorting algorithms depends on the amount of data, i.e., the size of the problem, and that in order to analyze an algorithm we need a relationship showing how the time it requires depends on that amount. For one algorithm, doubling the amount of data doubles the time needed; analysis of another algorithm shows that doubling the amount of data increases its time by a factor of four. The time needed by the latter algorithm therefore grows much more rapidly than that of the first. We have also discovered that factors other than the chosen sorting algorithm affect the running time [1]. This is simply because, just as different people carrying out a solution to a problem may work at different speeds, different computers work at different speeds even when they use the same sorting method. The differing speeds of computers can be due to different "clock speeds", the rates at which steps in the program are executed by the machine, and different "architectures", the ways in which the internal instructions and circuitry of the computer are organized. Consequently, the analysis of a sorting algorithm cannot predict exactly how long it will take on a particular computer. We also found that the analysis of efficiency depends
considerably on the nature of the data. For example, if the original data set is almost ordered already, a sorting algorithm may behave rather differently than if the data set originally contains random data or is ordered in the reverse direction. The purpose of this investigation is to magnify the analysis of sorting algorithms considering all possible factors and to make a concise note of it. Our work may be useful for applications that seek to determine which sorting algorithm is fastest for lists of different lengths and, therefore, which algorithm should be used depending on the list length. For example, Shell sort should be used for sorting small arrays (fewer than 1000 items). It has the advantage of being an in-place and non-recursive algorithm, and is faster than Quicksort up to a certain point. For larger arrays, the best choice is Quicksort, which uses recursion to sort lists, leading to higher system usage but significantly faster results. We have
attempted to review the rich body of sorting literature with regard to utility and performance, so as to make a critical analysis of it and discover tuning factors. These factors are intended to help the reader avoid wasted effort in producing correct complexity values. Most of this paper concentrates on the study of algorithms for problems in the standard format, where both instances and outputs are finite objects and the key metrics are resource usage (typically time and memory). Several of the suggestions enunciated here may be somewhat controversial, but we have, at least, elaborated