Available Online at www.ijcsmc.com
International Journal of Computer Science and Mobile Computing
A Monthly Journal of Computer Science and Information Technology
ISSN 2320–088X
IJCSMC, Vol. 2, Issue 7, July 2013, pg. 262–272
RESEARCH ARTICLE
© 2013, IJCSMC All Rights Reserved
Enhance Luhn Algorithm for Validation
of Credit Card Numbers
Khalid Waleed Hussein¹, Dr. Nor Fazlida Mohd. Sani², Professor Dr. Ramlan Mahmod³, Dr. Mohd. Taufik Abdullah⁴
¹⁻⁴ Faculty of Computer Science & IT, University Putra Malaysia (UPM), Kuala Lumpur, Malaysia
¹ Khaled_it77@yahoo.com, ² fazlida@fsktm.upm.edu.my, ³ ramlan@fsktm.upm.edu.my, ⁴ mtaufik@fsktm.upm.edu.my
Abstract— The Luhn algorithm is the first line of defense on many e-commerce sites and is used to validate a variety of identification numbers, including credit card numbers. However, a vast number of digit strings pass the check, and at such volumes the algorithm alone cannot distinguish among these numbers. A variety of tests show that the Luhn algorithm suffers from weaknesses, including its failure to determine the length and type of the credit card number being analyzed. We intend to enhance the Luhn algorithm for the validation of credit card numbers. The enhancement is expected to be useful for the many e-commerce sites that use the algorithm.
Keywords— Security; Luhn algorithm; Credit Card Number Validation; Visa Card Validation; JCB Number Validation
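Since the paper centers on the standard Luhn check and its limitations, a minimal sketch of that check may help the reader. This is the well-known public algorithm (double every second digit from the right, reduce doubles above 9, test the sum modulo 10), not the enhancement proposed in this paper; the function name is ours.

```python
def luhn_valid(number: str) -> bool:
    """Standard Luhn check on a digit string (spaces are ignored)."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    # Walk the digits from right to left, doubling every second one.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9  # equivalent to summing the two digits of the double
        total += d
    return total % 10 == 0

# The widely used Visa test number passes; altering one digit fails.
print(luhn_valid("4111 1111 1111 1111"))  # True
print(luhn_valid("4111 1111 1111 1112"))  # False
```

Note that the check accepts any digit string whose checksum is 0 modulo 10, regardless of its length or issuer prefix, which is exactly the weakness the abstract identifies.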
I. INTRODUCTION
Credit cards are the most frequently used online payment method, accounting for about 95% of all online transactions, and are the primary means of paying for goods and services purchased online [1]. With the increasing use of credit cards on the Internet has come a dramatic increase in credit card fraud [2]. Typing errors are among the most common errors that occur when a user attempts to retype his or her credit card number