
Tuesday, August 19, 2008

1's & 0's: ASCII Tables

First you need to know that every letter and character you type is stored as a number. One of the easiest number conventions to understand is called ASCII (say “Ask-ee”). This standard dates back to the early 1960's as a conventional way of storing the English alphabet, and it stands for American Standard Code for Information Interchange. There are other formats out there, but for our discussion we'll stick with this one.

So, according to the ASCII standard, the letter “A” gets the decimal (number) value of 65. Why, you ask? There's no deep reason; that's simply the value the standard assigned many years ago. The letter “B” is 66, and so on up the alphabet. You can view a table here or look up ASCII on Wikipedia if you are really inquisitive and want to read more.
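
If you'd like to see these values for yourself, here is a quick sketch in Python (just my choice of language for illustration; any language that lets you look at character codes will do the same job):

    # Print the ASCII (decimal) value of a few letters
    for letter in "ABC":
        print(letter, "=", ord(letter))   # A = 65, B = 66, C = 67

    # And go the other way: turn the number 65 back into the letter "A"
    print(chr(65))                        # A
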

So, how does a computer store the number 65 that represents the ASCII letter “A”? To understand that we need to explore how 0's and 1's can represent any number. This gets a little interesting, but if you follow it you'll know why a computer “sees” a kilobyte (K) of memory as 1024 bytes and NOT 1000 bytes.
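
As a preview, here is a small Python sketch that writes 65 out in binary, the actual 0's and 1's the computer stores (again, the language choice and variable name are just for illustration):

    # Show the binary (base 2) form of the ASCII value for "A"
    value = ord("A")             # 65
    print(bin(value))            # 0b1000001
    print(format(value, "08b"))  # 01000001 -- padded out to a full 8-bit byte

    # 1024 is a power of two (2 to the 10th), which is why a "K" of memory
    # comes out to 1024 bytes instead of a round 1000
    print(2 ** 10)               # 1024
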

