We are living in a time of incredible growth. The Information Age has transformed the way we live, with over half the world's population currently connected to the internet. The human race is more intertwined than ever before. But what led to this amazing era? What single invention gave rise to our smartphone-equipped generation?
Inside a computer, you'll find a motherboard, which acts like the chassis of a car, giving all the internal components something to connect to. Attached to the motherboard is the microprocessor (CPU). If you look underneath a CPU, you'll see hundreds of tiny pins. The processor itself is made up of incredibly complex regions designed to add and store numbers, yet every one of those regions is built from a single microscopic foundation: the transistor.
To truly understand the massive impact of the computer, we need to understand the history, physics, and math behind the transistor.
The History Before Transistors (Vacuum Tubes)
Before the transistor was invented, computers used vacuum tubes, which were bulky, fragile glass bulbs. A triode vacuum tube consisted of a cathode, a grid, and an anode.
When an electrical current heated the cathode, it released electrons. Because gases were removed from the glass tube to create a vacuum, the electrons faced no resistance and were immediately attracted to the positively charged anode. By applying a positive or negative voltage to the middle "grid," engineers could control this flow—turning the current ON or OFF instantly. This is the physical foundation of binary coding (ones and zeros).
The ENIAC Computer
The world's first general-purpose electronic computer, the ENIAC, used 18,000 of these vacuum tubes. Completed in 1945, it was purpose-built to calculate complex artillery trajectories during World War II. A calculation that would take a human a full day to solve took the ENIAC only 30 minutes.
However, this machine weighed 30 tons and took up an entire room! It was incredibly power-hungry, and the extreme heat meant the glass vacuum tubes constantly burned out and needed replacing.
How Does a Transistor Work?
Today, even a simple game on your phone requires computing power vastly beyond anything the ENIAC could provide. A modern smartphone processor contains over 2 billion transistors on a silicon chip no larger than a fingernail. But how does silicon actually make this possible?
Silicon is a semiconductor, meaning its ability to conduct electricity can be artificially tailored by introducing impurities into its crystal structure. Pure silicon atoms share their electrons perfectly, leaving none free to move, which makes pure silicon a poor conductor. By introducing impurities (a process called "doping"), we change how it conducts current:
- N-Type (Negative): If we add phosphorus, which has one extra valence electron, that electron is left free to roam around. This creates a negatively charged material.
- P-Type (Positive): If we add boron, which has one electron too few, it creates an empty space called a "hole." Neighboring electrons keep hopping into the hole, so the hole itself behaves like a mobile positive charge.
The NPN Junction as a Switch
When we sandwich a piece of P-type silicon between two pieces of N-type silicon, we create the world's most common transistor: the NPN Transistor. The interaction between the free electrons and the "holes" creates a barrier called the depletion layer, which acts as a wall that stops electricity from flowing.
This setup creates three main parts:
- The Source: Where the electricity enters.
- The Drain: Where the electricity exits.
- The Gate (called the Base in a bipolar transistor): The switch in the middle that controls the flow.
Electricity cannot flow from the Source to the Drain naturally. However, when a tiny positive voltage is applied to the middle Gate, it collapses the depletion layer and opens a massive conducting channel. The neat thing about this setup is that there are no moving parts—you use a tiny amount of electricity to turn a massive amount of electricity ON or OFF!
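That switching behavior can be captured in a few lines of code. This is a minimal sketch, not real device physics: the function name `transistor` and the 0/1 current model are illustrative assumptions.

```python
def transistor(gate, source=1):
    """Model a transistor as a voltage-controlled switch: current flows
    from source to drain only while the gate voltage is applied (1)."""
    return source if gate else 0

# A tiny gate signal controls the whole channel, with no moving parts.
print(transistor(gate=1))  # 1 -> channel open, current flows
print(transistor(gate=0))  # 0 -> channel blocked
```

The key property the sketch preserves is that the gate input never touches the main current path directly; it only decides whether that path conducts.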
Creating Logic Gates
The breakthrough idea with transistor technology was wiring them together to form Logic Gates. These gates take electrical inputs and make logical mathematical "decisions."
The OR Gate
If you wire two transistors in parallel, with power feeding both inputs and both outputs joined to a single light bulb, you have created an OR gate. If Switch A is ON, or Switch B is ON (or both), the light bulb lights up.
The AND Gate
If you instead wire them in series, so that the output of the first transistor feeds directly into the input of the second, you've created an AND gate. Electricity leaving the first transistor is stopped dead at the second one unless it, too, is switched ON. Both Switch A and Switch B must be ON for the electricity to reach the light bulb.
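The two wirings above can be sketched with the same switch model. This is an illustrative assumption, not circuit-accurate: `transistor`, `OR_gate`, and `AND_gate` are names chosen here for clarity.

```python
def transistor(gate, source=1):
    """A transistor modeled as a switch: passes current only when gate is ON."""
    return source if gate else 0

def OR_gate(a, b):
    # Parallel wiring: either transistor's output can light the bulb.
    return max(transistor(a), transistor(b))

def AND_gate(a, b):
    # Series wiring: the first transistor's output feeds the second's source.
    return transistor(b, source=transistor(a))

print(OR_gate(1, 0))   # 1 -> bulb ON
print(AND_gate(1, 0))  # 0 -> bulb OFF
print(AND_gate(1, 1))  # 1 -> bulb ON
```

Notice that the gates differ only in topology, parallel versus series; the underlying switch is identical.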
Understanding Binary Numbers
ON and OFF can be represented mathematically as 1 (ON) and 0 (OFF). These ones and zeros are the native language of computers, forming a numbering system called binary.
Because binary only uses two digits, we have to count up a little differently than our normal decimal system. Let's count up using only ones and zeros:
0000 = 0
0001 = 1
0010 = 2 (We ran out of 1s and 0s, so we must move to the next column)
0011 = 3
0100 = 4
The Light Bulb Trick
To make this easier, imagine four light bulbs in a row. From right to left, they represent the decimal values 1, 2, 4, and 8. When a light bulb is turned ON (1), you simply add its value!
- 1001: The "8" bulb and the "1" bulb are ON (8 + 1 = 9 in decimal).
- 1010: The "8" bulb and the "2" bulb are ON (8 + 2 = 10 in decimal).
- 1011: The "8", "2", and "1" bulbs are ON (8 + 2 + 1 = 11 in decimal).
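The light bulb trick is just positional weighting, and it takes only a few lines to sketch. The function name `bulbs_to_decimal` and the four-bulb setup are assumptions made here for illustration.

```python
def bulbs_to_decimal(bits):
    """Add up the place values (8, 4, 2, 1) of every bulb that is ON ('1')."""
    weights = [8, 4, 2, 1]
    return sum(w for w, b in zip(weights, bits) if b == "1")

for pattern in ["1001", "1010", "1011"]:
    print(pattern, "=", bulbs_to_decimal(pattern))  # 9, 10, 11
```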
How Do Computers Add Numbers? (The Full Adder)
Since switches represent ones and zeros, we can wire our logic gates together to create an adding machine called a Full Adder.
To add numbers, computers use an Exclusive OR (XOR) gate. This gate turns its output ON only when exactly one of its inputs is ON, not both. If both Switch A and Switch B are ON, the XOR output turns OFF (0), but a separate AND gate alongside it carries a "1" over to the next column. In other words: 1 + 1 = 10 (which is 2 in binary).
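One column of this adding machine, an XOR for the sum and an AND for the carry, is commonly called a half adder. A minimal sketch using Python's bitwise operators (the name `half_adder` is an assumption for illustration):

```python
def half_adder(a, b):
    """One column of addition: XOR gives the sum bit, AND gives the carry."""
    sum_bit = a ^ b   # ON only when exactly one input is ON
    carry = a & b     # ON only when both are ON -> carried to the next column
    return sum_bit, carry

print(half_adder(1, 1))  # (0, 1): 1 + 1 = 10 in binary
```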
Stringing Adders Together
In normal math, when you add 7 + 7 to get 14, you write down the 4 and "carry" the 1 to the next column. Computers do the exact same thing using a "Carry-Out" wire.
A Full Adder accepts Input A, Input B, and a "Carry-In" from the previous column. Let's look at what happens when we add 00000001 (1) plus 00000001 (1):
- In the first adder, the inputs are both 1.
- 1 + 1 = 10 in binary.
- The adder turns its own sum wire OFF (0) and sends a "Carry-Out" (1) down the wire to the next adder.
- The second adder receives the 1, turning its 2-value light bulb ON.
The final binary result is 00000010 (which is 2)!
By chaining 8 Full Adders together in a row, a computer can process 8-bit binary numbers up to 255. This exact process of logic gates opening, closing, adding, and carrying happens millions or even billions of times a second inside your CPU!
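The whole chain, known as a ripple-carry adder, can be sketched in code. This is an illustrative model, not a hardware description; `full_adder` and `add_8bit` are names chosen here.

```python
def full_adder(a, b, carry_in):
    """One Full Adder: the sum bit flips with each ON input; a carry goes
    out whenever at least two of the three inputs are ON."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (a & carry_in) | (b & carry_in)
    return sum_bit, carry_out

def add_8bit(x, y):
    """Chain 8 full adders, least significant column first, rippling the
    carry from each column into the next."""
    carry, bits = 0, []
    for i in range(8):
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        bits.append(str(s))
    return "".join(reversed(bits))

print(add_8bit(1, 1))  # '00000010' -> 2, matching the walkthrough above
```

Real CPUs use faster carry schemes, but the ripple-carry chain is exactly the "write down the digit, carry the one" procedure from grade-school arithmetic.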
The Future: Moore's Law & Quantum Limits
In 1965, the co-founder of Intel, Gordon E. Moore, noticed a trend: the density of transistors on integrated circuits doubles roughly every two years. This is known as Moore's Law, and it is the reason technology has advanced so incredibly fast over the last 50 years.
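Moore's observation is simple compound doubling. A quick sketch (the function name and the starting figure of 2 billion transistors are illustrative assumptions):

```python
def projected_transistors(start_count, years, doubling_period=2):
    """Moore's Law as compound doubling: the count doubles every ~2 years."""
    return start_count * 2 ** (years / doubling_period)

# Starting from 2 billion transistors, four years means two doublings.
print(projected_transistors(2_000_000_000, 4))  # 8000000000.0
```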
However, this incredible growth is finally starting to slow down. One reason is "Rock's Law," the observation that the cost of building a chip fabrication plant doubles roughly every four years. It is becoming increasingly difficult for manufacturers to shrink transistors while remaining profitable.
The second, much larger problem is physics. Quantum tunneling occurs when transistors become so unfathomably small that the physical barriers inside them are only a few atoms thick. At this atomic scale, electrons have a real probability of simply passing (or "tunneling") straight through the walls, making the ON/OFF switches leaky and unreliable.
With silicon transistors reaching their absolute physical limits, engineers are looking to harness quantum mechanics to create quantum computers, or focusing on decreasing power consumption rather than just increasing speed. One thing is certain: the computer industry will have to completely redefine itself in the coming decades.