
On this page we have a look at some of the electronics behind computers. Actually, this page isn't so much about *electronics* as it is about *logic gates* and their use…

At bottom, computer electronics is based around the idea that a computer circuit can be in one of only two possible states. Typically these are “on” and “off”. However, it's not uncommon (e.g., in connector cables) to use negative and positive instead. (That way we can tell the difference between one of the two valid states and “the cable is unplugged”!)

To avoid confusion, we refer to the two possible (valid) states as *logic high* and *logic low*. Usually logic high = on and logic low = off, but not always. The terms thus name the concept they embody, not the specific mechanism used to implement it. As also noted in the basic workings section, computers count in binary — a number system that uses only the digits 0 and 1. We take the convention that logic high means 1 and logic low means 0. Last but not least, sometimes we use logic high to mean “true” and logic low to mean “false”.

The most basic components of an electronic computer are *transistors*. These are like little miniature switches. A transistor switches its output on or off based on one or more other signals flowing into it. The design of such circuits can get fairly complicated — and it's really somewhat beyond my area of expertise.

A (small) group of transistors can be put together to make various kinds of *logic gates*. Conceptually, logic gates are what a computer is made of. If you could figure out a way to make logic gates out of steam valves, then you could build a steam-powered computer.

(In fact, this would be quite easily possible. There's just not a lot of point, because such a computer would be *huge*, unbelievably slow, constantly breaking down, and generally it would serve no useful purpose. The point is, *electronic* computers make logic gates out of transistors, but you could in fact make a computer out of *anything* that implements logic gates.)

So what exactly *is* a logic gate, then?

A logic gate, aside from probably having connections to the main power supply, has exactly one output wire, and one or more input wires. The status of the output wire depends only on the *current* status of the input wires. (The importance of this statement will become clearer later.)

Almost the simplest possible gate is the *NOT gate*, otherwise known as an *inverter*. This gate has only one input, and its output is simply the opposite of whatever its input is. That is, if the input is at logic high, the output will be at logic low, and vice versa.

(There *is* an even simpler gate — a *buffer*. It also has only one input, and its output is identical to its input. What's the point of that? Well, a signal can flow from a logic gate's input(s) to its output, but *never* the other way round. Even so, buffers are not especially useful, at least in conceptual designs.)
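Since a gate's output depends only on its inputs, a gate is really just a function from input bits to an output bit. Here's a minimal sketch in Python of the two one-input gates (the function names are my own, just for illustration):

```python
def NOT(a: int) -> int:
    """Inverter: the output is the opposite of the input."""
    return 1 - a

def BUFFER(a: int) -> int:
    """Buffer: the output is identical to the input."""
    return a

# NOT flips the bit; BUFFER passes it through unchanged.
assert NOT(0) == 1 and NOT(1) == 0
assert BUFFER(0) == 0 and BUFFER(1) == 1
```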

The next gate is the *AND gate*. It has a minimum of two inputs, but can actually have an unlimited number of them. They are all “identical”. In the 2-input case, the output is at logic high only if the left input *and* the right input are at logic high. We can write that in a little table:

| In 1 | In 2 | Out |
|---|---|---|
| 0 | 0 | 0 |
| 1 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 1 | 1 |

This is called a *truth table*. One could say that “x AND y is true if x is true *and* y is true”. This is called *Boolean algebra*, and is in fact where the idea of logic gates comes from.

For more than 2 inputs, the principle simply extends. The output is high only if *all* the inputs are.
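The “all inputs” rule translates directly into code. A sketch of an AND gate with any number of inputs (again, the naming is mine, not any standard library's):

```python
def AND(*inputs: int) -> int:
    """Output is 1 only if *all* the inputs are 1."""
    return 1 if all(inputs) else 0

# The 2-input truth table from above:
assert AND(0, 0) == 0
assert AND(1, 0) == 0
assert AND(0, 1) == 0
assert AND(1, 1) == 1

# The principle extends to any number of inputs:
assert AND(1, 1, 1, 1) == 1
assert AND(1, 1, 0, 1) == 0
```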

In a similar manner to the above, an *OR gate* has its output high if the left input *or* the right input is high. The truth table is

| In 1 | In 2 | Out |
|---|---|---|
| 0 | 0 | 0 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 1 | 1 |

Again, the principle extends to more inputs; the output is high if *any* inputs are high.
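The “any input” rule is just as direct. A sketch of a multi-input OR gate, in the same illustrative style:

```python
def OR(*inputs: int) -> int:
    """Output is 1 if *any* input is 1."""
    return 1 if any(inputs) else 0

# The 2-input truth table from above:
assert OR(0, 0) == 0
assert OR(1, 0) == 1
assert OR(0, 1) == 1
assert OR(1, 1) == 1

# More inputs: high if any one of them is high.
assert OR(0, 0, 0, 1) == 1
assert OR(0, 0, 0, 0) == 0
```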

In addition to the above, there is also the *NAND gate*, which is really a normal AND gate with a NOT gate placed on its output. In other words, for any given inputs, the output of a NAND gate is the opposite of the output that an AND gate would give for the same inputs. The *NOR gate* works similarly, but with an OR gate instead of an AND.

Why the heck would you care? Because NAND and NOR gates require fewer transistors to make than their AND and OR counterparts. So building a circuit with them takes fewer transistors, while producing the same result. So it's a practical rather than theoretical thing.
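The “inverted output” relationship is easy to state in code. A sketch (my own function names again), checked against plain AND and OR over every input pair:

```python
def NAND(a: int, b: int) -> int:
    """An AND gate followed by an inverter."""
    return 1 - (a & b)

def NOR(a: int, b: int) -> int:
    """An OR gate followed by an inverter."""
    return 1 - (a | b)

# For every pair of inputs, NAND gives the opposite of AND,
# and NOR gives the opposite of OR.
for a in (0, 1):
    for b in (0, 1):
        assert NAND(a, b) == 1 - (a & b)
        assert NOR(a, b) == 1 - (a | b)
```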

There are in fact *two* types of OR gate. The one mentioned above is the *inclusive-OR gate*, and this is the kind that is almost always meant when someone says “OR gate”. There is also a more complicated *exclusive-OR gate*, often written *XOR gate*. The output of such a gate is high if the left input *or* the right input is high, but *excluding* the case where *both* are high. The truth table is this:

| In 1 | In 2 | Out |
|---|---|---|
| 0 | 0 | 0 |
| 1 | 0 | 1 |
| 0 | 1 | 1 |
| 1 | 1 | 0 |

As you can see, this is only one digit different from the normal inclusive-OR gate.

Look at the truth table for the XOR gate again. When both inputs are low, the output is low. When both inputs are high, this case is excluded, so the output is still low. In fact, the only time the output is high is when the inputs are *different*. Whenever both inputs are *the same*, the output is low. So you could view the XOR as a kind of “difference detector”.

There is another way to look at this gate: compare in 1 to out. Notice that when in 2 is 0, in 1 = out, but when in 2 is 1, in 1 and out are opposites. So in a way, an XOR gate is like a NOT gate that you can switch on or off using another input.
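Both views — the difference detector and the switchable inverter — can be checked directly. A quick sketch (function name is mine):

```python
def XOR(a: int, b: int) -> int:
    """Output is 1 exactly when the two inputs differ."""
    return 1 if a != b else 0

# Difference detector: low when the inputs match, high when they differ.
assert XOR(0, 0) == 0 and XOR(1, 1) == 0
assert XOR(1, 0) == 1 and XOR(0, 1) == 1

# Switchable inverter: with in 2 = 0 the signal passes straight through;
# with in 2 = 1 it comes out inverted.
for a in (0, 1):
    assert XOR(a, 0) == a        # pass-through
    assert XOR(a, 1) == 1 - a    # inverted
```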

Following on from the example above, look at the truth table for the AND gate. Notice how if in 2 is 0, out is 0. However, if in 2 is 1, then out = in 1. So by turning on in 2, we let the signal from in 1 flow directly to out, but by turning in 2 off, we block that signal. Do you see why it's called an AND *gate* now?

The OR gate works similarly. By setting one input to 1, you can prevent a 0 signal from being propagated. So you can use OR to block 0s, and AND to block 1s.

The AND gate implements the “all rule”, while the OR gate implements the “any rule”. However, if you take an AND gate and attach inverters to all of its inputs, it's really now a “none rule”. Further, if you invert the output as well, it's a “not none” rule. Saying “not none” is equivalent to saying “some”, which is tantamount to “any”.

In other words, if you take an AND gate and attach inverters to all of its inputs and outputs, it behaves *exactly* like an OR gate. Perhaps unsurprisingly, if you do the same thing to an OR gate, it turns into an AND gate. In Boolean algebra, this is referred to as *duality*. (There is actually much more to it than this; I'm simplifying.)
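With only four input combinations, this duality can be verified exhaustively. A sketch (my own one-liner gates):

```python
NOT = lambda a: 1 - a
AND = lambda a, b: a & b
OR  = lambda a, b: a | b

for a in (0, 1):
    for b in (0, 1):
        # AND with all inputs and the output inverted behaves like OR...
        assert NOT(AND(NOT(a), NOT(b))) == OR(a, b)
        # ...and the same trick turns OR into AND.
        assert NOT(OR(NOT(a), NOT(b))) == AND(a, b)
```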

Note also that it is quite possible to build an XOR gate out of ANDs and ORs. The formula in question is

a XOR b = (a OR b) AND (NOT (a AND b))
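The formula can be checked against the XOR truth table for all four input pairs. A sketch:

```python
def xor_from_and_or(a: int, b: int) -> int:
    """a XOR b = (a OR b) AND (NOT (a AND b))."""
    return (a | b) & (1 - (a & b))

# Matches the XOR truth table exactly.
assert xor_from_and_or(0, 0) == 0
assert xor_from_and_or(1, 0) == 1
assert xor_from_and_or(0, 1) == 1
assert xor_from_and_or(1, 1) == 0
```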

(Unfortunately, I don't *yet* have the ability to draw a picture for you…)

As an aside, notice that if you take a NAND or NOR gate and tie both its inputs together to make a device with only 1 in and 1 out, it essentially becomes an inverter. What this means is that if you have a box of NANDs or a box of NORs, you can make them into inverters, and thereby make NAND, NOR, AND, OR and NOT gates all out of just the one thing you have.
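Here's that construction spelled out for NAND, as a sketch: every other gate is built from NAND alone, then checked against the plain definitions.

```python
def NAND(a: int, b: int) -> int:
    return 1 - (a & b)

# Tie both inputs together: the NAND becomes an inverter.
def NOT(a: int) -> int:
    return NAND(a, a)

# AND is just a NAND followed by that inverter.
def AND(a: int, b: int) -> int:
    return NOT(NAND(a, b))

# OR comes from duality: invert both inputs, then NAND them.
def OR(a: int, b: int) -> int:
    return NAND(NOT(a), NOT(b))

# Check everything against the plain definitions.
assert NOT(0) == 1 and NOT(1) == 0
for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
```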

If you *really* have the urge, your nearest electronics shop should be able to sell you some 7400 chips for a few pence each. A single 7400 chip has 14 pins. 2 pins are for the + and − connections, and the remaining 12 are for 4 NAND gates. (Each gate has 2 input pins and 1 output pin.) With a battery, some wires and a test meter or some LEDs, you can play with actual genuine logic gates, all for a few pence!

Logic gates are all very interesting, but… a bit simple, no? I mean, what *useful* stuff can you actually do with such simple things?

If you understand how binary works, you can look at building an electronic circuit to add two binary numbers together. Recall that all digits in binary are either 1 or 0. That means that if we look at 1-digit numbers, the only possible calculations we can do are 0 + 0 = 0, 0 + 1 = 1, 1 + 0 = 1, and 1 + 1 = 2. Now in binary, 2 is written as 10. What is really happening here is “1 + 1 = 0, *carry* 1”. In the other cases, there's nothing to carry, so you could say carry = 0.

So, to add two 1-digit binary numbers together, we need to build something with the following truth table:

| In 1 | In 2 | Total | Carry |
|---|---|---|---|
| 0 | 0 | 0 | 0 |
| 1 | 0 | 1 | 0 |
| 0 | 1 | 1 | 0 |
| 1 | 1 | 0 | 1 |

If you examine this for a moment, you will notice that we only carry 1 if in 1 *and* in 2 are 1. So that output is clearly just an AND gate connected to both inputs. Looking at the total column we see that it matches the truth table for an XOR gate. So, with two logic gates, we can add two 1-digit binary numbers to get a 2-digit total. This is called a *binary half adder*.
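The half adder is small enough to sketch in a couple of lines — one XOR for the total, one AND for the carry (the function name is my own):

```python
def half_adder(a: int, b: int) -> tuple:
    """Add two 1-digit binary numbers."""
    total = 1 if a != b else 0   # XOR gate
    carry = a & b                # AND gate
    return total, carry

assert half_adder(0, 0) == (0, 0)
assert half_adder(1, 0) == (1, 0)
assert half_adder(0, 1) == (1, 0)
assert half_adder(1, 1) == (0, 1)  # 1 + 1 = 10 in binary
```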

So we managed to make a machine that knows that 1 + 1 = 2. Not very impressive! How about something that can add somewhat *bigger* numbers?

Well, to add bigger numbers, we just need to chain some binary half adders together, resulting in a *binary full adder*. (There are actually more sophisticated ways to do this, but this is the simplest.) Each pair of input digits must be added, and each result must be added to the carry digit from the previous (less significant) pair to give the overall total.
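Until the picture arrives, here's a sketch of the chaining: two half adders make a full adder, and a row of full adders passes the carry along. (All the names here are mine; the bit lists run least significant digit first.)

```python
def half_adder(a, b):
    """Returns (total, carry) for two 1-digit binary numbers."""
    return (1 if a != b else 0, a & b)

def full_adder(a, b, carry_in):
    """Two half adders chained: add a, b, and an incoming carry."""
    t1, c1 = half_adder(a, b)
    total, c2 = half_adder(t1, carry_in)
    return total, c1 | c2  # carry out if either stage carried

def ripple_add(xs, ys):
    """Add two equal-length bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(xs, ys):
        total, carry = full_adder(a, b, carry)
        out.append(total)
    return out + [carry]

# 3 + 5: binary 011 + 101, written LSB first, gives 8 (binary 1000).
assert ripple_add([1, 1, 0], [1, 0, 1]) == [0, 0, 0, 1]
```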

(As soon as I can, I will add a picture here!)

Generated by Indoculate Release #2b (17-Feb-2007) [Compiled by GHC 6.6]. Document generated on 2007-03-23 20:40:32.