Every variable you define in your program must end up as real signals that the computer can understand.
A computer, as a machine, can't understand English letters! It can't understand any language except machine language.
Now imagine machine language as a language of only two characters, 0 and 1: 0 means there is no electric signal and 1 means an electric signal exists.
When you define a character in any programming language, the compiler converts it to machine language so the computer can understand it.
Every character is converted to 8 bits, for example (01110111).
These 8 bits are fixed for each character or symbol,
and the standard that defines which 8 bits mean which symbol is the ASCII code (American Standard Code for Information Interchange).
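To make that concrete, here is a minimal sketch in Python (the language is my own choice, not from the original post) that shows a character's ASCII code and its 8-bit pattern:

    # show the ASCII code of a character and its 8-bit pattern
    ch = 'w'
    code = ord(ch)              # ord() gives the ASCII code: 119
    bits = format(code, '08b')  # the same number written as 8 bits: '01110111'
    print(ch, code, bits)       # prints: w 119 01110111

    # going the other way: chr() turns an ASCII code back into the character
    print(chr(0b01110111))      # prints: w

So 01110111 from the example above is simply the 8-bit ASCII code for the letter 'w'; the hardware only ever sees those bits as electric signals.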
In summary, ASCII codes are a representation of characters and symbols that lets the machine understand them in its own language, which is called machine language.