
Difference Between Binary and ASCII

Binary vs ASCII
 

Binary code is a method used in computers and digital devices to represent and transfer text, symbols, or processor instructions. Since computers and digital devices perform their fundamental operations based on two voltage levels (high or low), every piece of data involved in a process has to be converted into that form. The natural way to accomplish this is to represent the data in the binary numeral system, which uses only two digits, 1 and 0. For example, every keystroke on your keyboard produces a string of 1s and 0s that is unique to that character and is sent as the output. The process of converting data into binary code is called encoding. Many encoding methods are used in computing and telecommunications.
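The keystroke example above can be sketched in a few lines of Python. This is a minimal illustration, not how a keyboard driver actually works: the built-in ord() looks up each character's numeric code, and format() renders it as a binary string.

```python
def to_binary_string(text):
    """Return the 8-bit binary string for each character in text."""
    # ord(ch) gives the character's numeric code; '08b' formats it
    # as a zero-padded, 8-digit binary number.
    return [format(ord(ch), '08b') for ch in text]

# Each character maps to a unique string of 1s and 0s:
print(to_binary_string('Hi'))  # ['01001000', '01101001']
```

Here 'H' has the code 72 (binary 01001000) and 'i' has the code 105 (binary 01101001), so each keystroke yields a distinct bit pattern.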

ASCII, which stands for American Standard Code for Information Interchange, is a standard encoding for alphanumeric characters used in computers and related devices. ASCII was introduced by the United States of America Standards Institute (USASI), now known as the American National Standards Institute (ANSI).

More about Binary Codes

The simplest way to encode data is to assign a specific value (usually a decimal number) to each character, symbol, or instruction, and then convert that value to a binary number, which consists only of 1s and 0s. The sequence of 1s and 0s is called a binary string. The length of the binary string determines the number of different characters or instructions that can be encoded. With only one digit, just two different characters or instructions can be represented; with two digits, four. In general, a binary string of n digits can represent 2^n different characters, instructions, or states.
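The doubling relationship between string length and capacity can be checked directly; this short sketch just evaluates 2**n for a few lengths, including the 7- and 8-bit lengths that ASCII uses:

```python
# Each extra binary digit doubles the number of representable
# characters, giving 2**n combinations for an n-digit string.
for n in (1, 2, 7, 8):
    print(f'{n} digit(s): {2 ** n} possible characters')
```

For 7 digits this gives 128 and for 8 digits 256, which are exactly the character counts of original and extended ASCII discussed below.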

Many encoding methods exist with different lengths of binary strings; some have a constant length and others a variable length. Binary codes with constant-length bit strings include ASCII, extended ASCII, UCS-2, and UTF-32. UTF-16 and UTF-8 are variable-length binary codes. Huffman encoding and Morse code can also be considered variable-length binary codes.
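The fixed- versus variable-length distinction is easy to observe with Python's standard codecs. In this sketch, UTF-8 spends one byte on an ASCII character but more on others, while UTF-32 (big-endian, to avoid the byte-order mark) always spends four:

```python
# UTF-8 is variable-length: 1 byte for ASCII characters, 2-4 bytes
# for others. UTF-32 is fixed-length: always 4 bytes per character.
for ch in ('A', 'é', '€'):
    utf8_len = len(ch.encode('utf-8'))
    utf32_len = len(ch.encode('utf-32-be'))
    print(f'{ch!r}: UTF-8 = {utf8_len} byte(s), UTF-32 = {utf32_len} bytes')
```

Here 'A' takes 1 byte in UTF-8, 'é' takes 2, and '€' takes 3, yet all three take exactly 4 bytes in UTF-32.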

More about ASCII

ASCII is an alphanumeric character encoding scheme introduced in the 1960s. The original ASCII uses a 7-bit binary string, which enables it to represent 128 characters. A later version, called extended ASCII, uses an 8-bit binary string, giving it the ability to represent 256 different characters.

ASCII primarily includes two types of characters: control characters (represented by decimal values 0-31 and 127) and printable characters (represented by decimal values 32-126). For example, the delete control character is given the value 127 decimal, which is represented by 1111111. The character a, which is given the value 97 decimal, is represented by 1100001. ASCII can represent letters in both cases, numbers, symbols, and control keys.
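The values quoted above can be verified in Python, since ord() returns the ASCII code of a character and a '07b' format spec prints its 7-bit binary form:

```python
# Verify the examples: 'a' is 97 decimal (1100001 binary), and the
# delete control character is 127 decimal (1111111 binary).
print(ord('a'), format(ord('a'), '07b'))        # 97 1100001
print(ord('\x7f'), format(ord('\x7f'), '07b'))  # 127 1111111

# Codes 32-126 are the printable characters; 0-31 and 127 are
# control characters.
printable = [chr(code) for code in range(32, 127)]
print(len(printable))  # 95
```

Counting the printable range 32-126 inclusive gives 95 printable characters; the remaining 33 of the 128 codes are control characters.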

What is the difference between Binary Code and ASCII?

• Binary code is a general term for a method of encoding characters or instructions, while ASCII is only one of the globally accepted conventions for encoding characters, and it was the most commonly used binary encoding scheme for more than three decades.

• Binary codes can have different lengths depending on the number of characters or instructions and on the encoding method, but ASCII uses only a 7-bit binary string (8 bits for extended ASCII).
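The second point can be illustrated by encoding the same text under two different binary codes; in this sketch, ASCII spends one byte per character while UTF-32 spends four:

```python
# The same text under two binary codes of different lengths:
# ASCII uses 1 byte per character, UTF-32 always uses 4.
text = 'binary'
print(len(text.encode('ascii')))      # 6 bytes
print(len(text.encode('utf-32-be')))  # 24 bytes
```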





Copyright © 2010-2012 Difference Between. All rights reserved.