Friday, April 8, 2011

Credit Card validation: Luhn algorithm

I was paying my phone bill online when I was taken aback by an alert stating that my credit card number was incorrect. I was sure it was right, but I double-checked anyway: the number I had entered was 16 digits, with no leading/trailing spaces and no stray non-numeric characters.

How, then, could the page validate my card number in the browser, without going to the server? Are all card numbers generated according to some mathematical rule? I knew all Visa cards begin with a 4 and MasterCard numbers with a 5, but are there more rules/algorithms that could be programmed in client-side JavaScript?

The curious part of me opened the Developer Tools. (And I must note that the Chrome Developer Tools are much better than Firebug; it's only that I am used to Firebug. But let's put that comparison aside for a later post.) Half expecting to find uncompressed JavaScript, I started digging into the scripts section. I was about to close the Developer Tools when I found a script embedded in the HTML. And voila, it turned out to be exactly what I was looking for: totally uncompressed, and with comments intact!

I spent the next 10 minutes going through the stack of JS functions and found a neat algorithm for shooing away invalid card numbers, beyond the usual checks on the number of digits, the starting digits, and so on:

  • Double every second digit, starting from the right. If the result has two digits, add those two digits together.
  • Sum all the digits obtained as above together with all the remaining (undoubled) digits.
  • If the sum so obtained is not a multiple of 10, the card number is invalid!
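The steps above can be sketched in a few lines of client-side JavaScript. (This is my own reconstruction of the check, not the site's actual code; the function name is mine. Note that for a doubled digit, adding its two digits is the same as subtracting 9 when the result exceeds 9.)

```javascript
// Luhn check: returns true if the card number passes the checksum.
function luhnValid(cardNumber) {
  // Keep digits only, so "4111 1111 1111 1111" also works.
  var digits = cardNumber.replace(/\D/g, "").split("").map(Number);
  var sum = 0;
  // Walk from the rightmost digit, doubling every second one.
  for (var i = digits.length - 1, pos = 0; i >= 0; i--, pos++) {
    var d = digits[i];
    if (pos % 2 === 1) {
      d *= 2;
      if (d > 9) d -= 9; // same as summing the two digits of the result
    }
    sum += d;
  }
  // Valid iff the total is a multiple of 10.
  return sum % 10 === 0;
}
```

For example, the well-known test number 4111111111111111 passes the check, while changing its last digit makes it fail.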

Googling gave me tons of results on credit card number validation; Wikipedia told me this is called the Luhn algorithm, patented by one Hans Peter Luhn (the patent was filed in 1954). It has been used as the validation algorithm by almost all card-issuing networks across the world for close to six decades! And I was so surprised at the JavaScript validation error...
