Most people know that computers are "just a bunch of ones and zeroes," but few understand what that really means. How do ones and zeros become numbers, words, and pictures? This talk starts at the beginning, explaining binary numbers and showing how they are used to describe basic data types. We begin to uncover what the statement "just a bunch of ones and zeroes" really means.
34. Memory.
Turing Machine.
“Turing's greatest contribution to the development of the digital computer [was the] idea of controlling the function of a computing machine by storing a program of … numerically encoded instructions in the machine's memory”
http://www.alanturing.net/
63. Unicode.
• 1 million+ glyphs
• Allows a huge number of characters to be represented with as few bits as possible.
• Uses a variable-width encoding, a form of data compression.
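As a minimal sketch of the "as few bits as possible" point above: the standard UTF-8 encoding of Unicode is variable-width, so common ASCII characters take one byte while rarer glyphs take up to four. This assumes a runtime with the standard TextEncoder API (modern browsers and Node.js):

```javascript
// UTF-8 is a variable-width encoding: the byte count
// grows with the character's Unicode code point.
const encoder = new TextEncoder(); // encodes strings to UTF-8 bytes

console.log(encoder.encode("A").length); // 1 byte (ASCII)
console.log(encoder.encode("é").length); // 2 bytes
console.log(encoder.encode("€").length); // 3 bytes
console.log(encoder.encode("𝄞").length); // 4 bytes
```

This is the compression-like trick: frequent characters cost the fewest bits, yet the scheme can still reach every one of the million-plus code points.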
85. JavaScript.
// Convert Number to binary string
let num = 9;
let binary = num.toString(2); // "1001"

// Convert binary string back to Number
let parsed = parseInt(binary, 2); // 9
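Building on the conversions above, a small sketch of padding the binary string to a fixed width, assuming an 8-bit byte view (padStart is a standard ES2017 String method):

```javascript
// toString(2) drops leading zeros, so pad to 8 bits
// to show the full byte.
let byte = (9).toString(2).padStart(8, "0"); // "00001001"
console.log(byte);

// The round trip still recovers the original Number.
console.log(parseInt(byte, 2)); // 9
```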