In this section, we're going to have a look at circuits and switches, coding schemes, binary digits, 2^n, the 5 generations of computers, Moore's Law, bits, bytes, KB, MB, GB, TB, and machine language.
Computers run on electricity. This is important, because electricity has two discrete states: on or off. Because of this, we can assign a coding scheme to those two states, so that the meaning depends on whether the switch is on or off.
For example, a light switch can be on or off. We can associate a coding scheme with this. A store may have an Open sign, which is a light. The light switch (or circuit) can be used to signal that the store is open for business or closed for business. And based upon whether or not that circuit or switch is on or off, it can mean one thing or another thing. This is the fundamental foundation of how computers work, just scaled up about a billion times.
The words circuit, switch, and transistor are often used interchangeably. You have circuits, which are either on or off, and the arrangements of those on or off circuits have some meaning, or are associated with some coding scheme.
So, if we have one light, we can have 2 messages (on or off). How many messages can we have with 2 lights? We can have 4 different messages with 2 lights: 1.) on, on, 2.) off, off, 3.) on, off, 4.) off, on. And what if we had 3 lights? 8 messages.
The formula for figuring out how many messages we can have with any given number of lights is 2^n, where n is the number of lights (or circuits, switches, or transistors). So, 2^3 is 2*2*2, two to the power of three.
1 light = 2 messages
2 lights = 4 messages
3 lights = 8 messages
4 lights = 16 messages
5 lights = 32 messages
6 lights = 64 messages
7 lights = 128 messages
8 lights = 256 messages
9 lights = 512 messages
10 lights = 1024 messages
These are numbers you've likely always heard about with computers. This is called base two. It's base two, because it's on or off; there are two states. It is binary.
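The table above can be computed directly. Here's a minimal Go sketch (the function name `messages` is our own, just for illustration):

```go
package main

import "fmt"

// messages returns how many distinct messages n on/off lights
// (circuits, switches, transistors) can represent: 2^n.
func messages(n uint) int {
	return 1 << n // shifting 1 left by n bits computes 2^n
}

func main() {
	for n := uint(1); n <= 10; n++ {
		fmt.Printf("%2d lights = %4d messages\n", n, messages(n))
	}
}
```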
With a coding scheme, we decide what the meaning is. Here's an example:
Coding Scheme Example
scheme | meaning |
---|---|
on on | let's party |
on off | movie night |
off on | study night |
off off | sleeping |
Something else we can do, instead of saying on and off, is abbreviate that to 1 and 0. So, 1 represents on and 0 represents off.
Here's the coding scheme example again, this time using 1s and 0s:
Coding Scheme Example
scheme | meaning |
---|---|
1 1 | let's party |
1 0 | movie night |
0 1 | study night |
0 0 | sleeping |
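A coding scheme like this is really just a lookup table, which is easy to express in code. A minimal Go sketch of the scheme above (the names `scheme` and `decode` are our own):

```go
package main

import "fmt"

// scheme maps each two-bit pattern to the meaning we decided on.
var scheme = map[string]string{
	"1 1": "let's party",
	"1 0": "movie night",
	"0 1": "study night",
	"0 0": "sleeping",
}

// decode looks up the meaning of a bit pattern.
func decode(bits string) string {
	return scheme[bits]
}

func main() {
	fmt.Println(decode("1 0")) // movie night
}
```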
And now you know why, when you see something related to computers, you so often see 0s and 1s.
So, that is how a computer works. Computers run on electricity. Electricity has two discrete states: on or off. Based upon the state of a circuit, such as a light, we can associate messages with that state, e.g. if we have 1 light, on means "open" and off means "closed." That is called a coding scheme. Instead of writing on or off over and over, we'll just abbreviate that as 1 meaning on and 0 meaning off.
Here's another example of a made-up coding scheme:
scheme | meaning |
---|---|
0 0 0 | A |
0 0 1 | B |
0 1 0 | C |
1 0 0 | D |
1 1 0 | E |
1 0 1 | F |
0 1 1 | G |
1 1 1 | H |
With this coding scheme, we can translate a message stored in binary:
scheme | meaning |
---|---|
1 1 1 | |
1 1 0 | |
0 0 0 | |
1 0 0 | |
0 0 0 | |
0 1 0 | |
1 1 1 | |
1 1 0 | |
The message stored here in binary is HEADACHE. Of course, we need to have the coding scheme to know this.
scheme | meaning |
---|---|
1 1 1 | H |
1 1 0 | E |
0 0 0 | A |
1 0 0 | D |
0 0 0 | A |
0 1 0 | C |
1 1 1 | H |
1 1 0 | E |
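The decoding step above can be sketched in Go. Assuming the made-up three-bit scheme from the tables, we walk through the patterns and look each one up:

```go
package main

import (
	"fmt"
	"strings"
)

// letters is the made-up three-bit coding scheme from the tables above.
var letters = map[string]string{
	"0 0 0": "A", "0 0 1": "B", "0 1 0": "C", "1 0 0": "D",
	"1 1 0": "E", "1 0 1": "F", "0 1 1": "G", "1 1 1": "H",
}

// decode translates a sequence of three-bit patterns into a message.
func decode(patterns []string) string {
	var b strings.Builder
	for _, p := range patterns {
		b.WriteString(letters[p])
	}
	return b.String()
}

func main() {
	msg := []string{
		"1 1 1", "1 1 0", "0 0 0", "1 0 0",
		"0 0 0", "0 1 0", "1 1 1", "1 1 0",
	}
	fmt.Println(decode(msg)) // HEADACHE
}
```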
Let's now take a look at binary digits. 0s and 1s are referred to as binary digits. Abbreviating the term "binary digits" gives us the term bits.
On & off, 1 & 0, binary digits, bits, and machine language are all words used to refer to this idea that, within a computer, it's all nothing but a bunch of zeros and ones, or switches that are on or off. It's all just a bunch of binary digits, or bits. That's the language that computers speak; it's machine language.
Circuits, switches, transistors, and even gates are all words used to refer to this thing within a computer that can be either on or off. It's a circuit, it's a switch, it's a transistor, it's a gate that can be either open or closed. You will hear people use all of those words to talk about the same thing: the ability of computers to store on/off states.
A coding scheme that has been used in the past, and is still widely used today, is ASCII. We can see that in ASCII, letters of the alphabet are associated with patterns of 0s and 1s.
Binary | Glyph |
---|---|
100 0001 | A |
100 0010 | B |
100 0011 | C |
100 0100 | D |
100 0101 | E |
100 0110 | F |
100 0111 | G |
100 1000 | H |
100 1001 | I |
100 1010 | J |
100 1011 | K |
100 1100 | L |
100 1101 | M |
100 1110 | N |
100 1111 | O |
101 0000 | P |
101 0001 | Q |
101 0010 | R |
101 0011 | S |
101 0100 | T |
101 0101 | U |
101 0110 | V |
101 0111 | W |
101 1000 | X |
101 1001 | Y |
101 1010 | Z |
So, if we have these switches, these circuits, in this arrangement of on/off: 100 0001, that's the capital letter "A." Literally, when you hit capital "A" on your keyboard, somewhere within your computer seven circuits or switches go into that arrangement of on and off.
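We can see this in Go, where a character is just a number underneath. This sketch prints a character's ASCII code and its seven binary digits (the helper name `bits` is our own):

```go
package main

import "fmt"

// bits renders a character's ASCII code as seven binary digits.
func bits(r rune) string {
	return fmt.Sprintf("%07b", r)
}

func main() {
	// 'A' is the number 65, which is 100 0001 in binary:
	// exactly the seven-circuit arrangement shown in the table above.
	fmt.Printf("%c = %d = %s\n", 'A', 'A', bits('A'))
	fmt.Printf("%c = %d = %s\n", 'Z', 'Z', bits('Z'))
}
```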
The most popular coding scheme today is UTF-8. The Wikipedia entry for UTF-8 includes a graph showing the decline of ASCII and the rise of UTF-8 on the web. ASCII worked at one time, but we needed a way to capture all the languages of the world. The same people who created the Go programming language also created UTF-8, the world's most popular coding scheme. The first part of UTF-8 incorporates ASCII; that is, the first part of UTF-8 is ASCII.
If you have a single binary digit, that's just called a bit. If you have 8 bits, that's called a byte. 1000 bytes is called a Kilobyte, 1000 Kilobytes is called a Megabyte, 1000 Megabytes is called a Gigabyte, and 1000 Gigabytes is called a Terabyte.
1 bit
8 bits = 1 byte
1000 bytes = 1 KB (kilobyte)
1000 KB = 1 MB (megabyte)
1000 MB = 1 GB (gigabyte)
1000 GB = 1 TB (terabyte)
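These units build up multiplicatively, which we can sketch as Go constants (using the decimal, power-of-ten definitions given in the text):

```go
package main

import "fmt"

// Storage units expressed as byte counts, each 1000x the last.
const (
	KB = 1000      // kilobyte
	MB = 1000 * KB // megabyte
	GB = 1000 * MB // gigabyte
	TB = 1000 * GB // terabyte
)

func main() {
	fmt.Println("1 TB =", TB, "bytes")
}
```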
The first computers came about in the 1940s. Among the earliest electronic general-purpose computers was the ENIAC. It was built to help calculate trajectories of shells fired from ships at sea toward land during World War II. The military needed to do those calculations more quickly and take into account factors like temperature and humidity. To use our light analogy, the ENIAC had over 16,000 lights, which could be arranged in different orders of on or off. But instead of lightbulbs, it used vacuum tubes.
You could also call these circuits. They were either on or off, and the first generation of computers used vacuum tubes to store those on or off states. The problems with vacuum tubes were that they ran hot, they burned out, and they attracted moths. Grace Hopper, one of the pioneers of computer programming, pulled a moth from a computer, and that incident is famously cited as an instance of literal "debugging."
The second generation of computers ran on transistors, which didn't burn out, ran cooler, and were a lot smaller. The transistors in second-generation computers stored the on/off states, which could be checked.
The third generation of computers used integrated circuits (circuits etched onto silicon wafers) to store the on/off states.
In 1971, they were able to fit about 2,300 transistors on an integrated circuit. In the 1980s, the transistor count was up to about 30,000, in the 1990s around 700,000, in the 2000s around 10,000,000. That's 10,000,000 on/off circuits, and they can be checked almost instantaneously, and it's all stored on something no bigger than your thumbnail. In 2011, it was closer to 2.6 billion.
As an aside, Intel stands for Integrated Electronics, a nod to integrated circuits and chips. The observation that the number of transistors on a chip doubles about every two years is called Moore's Law.
So the five generations of computers are:
- Vacuum tubes
- Transistors
- Integrated circuits (chips)
- Microprocessors (CPUs)
- We will leave this up to you to discover...