How does a computer convert text into binary (0s and 1s)?
Computers convert text and other data into binary using assigned ASCII (American Standard Code for Information Interchange) values. Once a character's ASCII value is known, that value can be converted to binary. In the following example, we take the word "hope" and show how it is converted into binary the computer understands.
Let's take the first character, h, and break down the process. When the lowercase letter h is typed on the keyboard, the keyboard sends a signal to the computer as input. The computer's operating system knows the ASCII value for h is 104, which the computer can convert to the binary value 01101000.
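As a rough sketch of this step, Python's built-in ord() function returns a character's ASCII value, and format() can show that value as eight binary digits:

```python
# Sketch: convert the letter "h" to its ASCII value and binary form
letter = "h"
ascii_value = ord(letter)                  # the ASCII value: 104
binary_value = format(ascii_value, "08b")  # eight binary digits: "01101000"
print(letter, "->", ascii_value, "->", binary_value)
```

Running this prints h -> 104 -> 01101000, matching the conversion described above.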
See our binary, decimal, and hexadecimal conversions page for further information about how the conversion happens.
After the h is converted into binary, the computer can store and process the data as ones (on) and zeros (off).
See our hard drive page for information about computer hard drives and how information is stored on magnetic media like hard drives.
When storing this data, each character takes 8 bits (1 byte), which means storing "hope" as plain text takes 4 bytes, or 32 bits.
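A small sketch can confirm this byte count. Encoding "hope" as ASCII produces one byte per character, and each byte can be shown as eight binary digits:

```python
# Sketch: count the storage needed for "hope" stored as ASCII text
word = "hope"
data = word.encode("ascii")                # one byte per character: b'hope'
print(len(data), "bytes =", len(data) * 8, "bits")

# Show each character's byte as eight binary digits
bits = " ".join(format(byte, "08b") for byte in data)
print(bits)  # 01101000 01101111 01110000 01100101
```

The output shows 4 bytes = 32 bits, with the first group, 01101000, being the letter h from the earlier example.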
For information about computer storage sizes, see: How much is 1 byte, kilobyte, megabyte, gigabyte, etc.?
How does the computer convert binary back into text?
When the computer needs to convert binary data back to human-readable text, it reverses the process shown above. For example, the computer may convert the binary 01101000 to the decimal value 104, which the ASCII standard maps to the letter h. Hence, you see the letter 'h' output on your computer's monitor.
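The reverse direction can be sketched the same way: int() with base 2 turns the binary string back into a decimal value, and chr() maps that value back to its character:

```python
# Sketch: convert a binary string back into its character
binary_value = "01101000"
ascii_value = int(binary_value, 2)  # interpret as base-2: 104
letter = chr(ascii_value)           # ASCII value 104 is "h"
print(binary_value, "->", ascii_value, "->", letter)
```

This prints 01101000 -> 104 -> h, undoing the text-to-binary conversion exactly.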
See our binary page for full information about how and why binary is used on a computer.
How do I determine the ASCII values?
See our ASCII page for a full chart of characters with their related decimal and binary values, and further information about the ASCII standard.