
Understanding the classical world before quantum

  • Vitalii Manoilo
  • Apr 14
  • 6 min read

Updated: May 8

In my previous post, I shared my story of how I became interested in quantum computing. Before diving into quantum computing, it makes sense to first explore how classical computers work. This will make quantum concepts easier to understand later.


I won't go too deep into physics or electrical engineering, just enough to build a solid foundation.


What is a bit?


At its core, a bit stores and represents only one of two states: 0 or 1, a binary or Boolean state.

  • 0 can also mean: False, Off, No

  • 1 can also mean: True, On, Yes


Imagine a light switch: you flip it up — the lights turn on; you flip it down — the lights turn off.


This is the fundamental language of computers: every operation, no matter how complex, starts with these simple states.
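
If it helps to see this in code, here is a tiny Python sketch (my own illustration) of the light-switch idea; Python's booleans map directly onto 1 and 0:

# A bit holds exactly one of two states.
light_on = True            # switch flipped up
bit = int(light_on)        # True -> 1
print(bit)                 # prints 1

light_on = False           # switch flipped down
print(int(light_on))       # prints 0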


How computers store bits


Source: Mister rf at English Wikipedia - Own work, CC BY-SA 3.0

These states are stored and controlled by transistors, which are very tiny switches inside the computer chip.


Transistors control the flow of electric current. When a transistor is switched on (allowing current to pass), it represents a "1"; when it is switched off (blocking current), it represents a "0".


The transistors in modern computers have three main parts:

  • "Gate" controls the flow (like your finger on the light switch).

  • "Source" is where the current enters.

  • "Drain" is where the current exits if the gate is open.


Transistors are connected in specific arrangements to form logic gates.

Depending on how the transistors are arranged, logic gates perform specific operations like "AND," "OR," "NOT," and more (a small code sketch follows the list below).

  • AND gate outputs "1" only when current flows through both transistor "A" and transistor "B". Otherwise, the output is "0".

  • OR gate outputs "1" if current flows through either transistor "A," transistor "B," or both. If no current flows, the output is "0".

  • NOT gate inverts the input or flips the state. If current flows, the output is "0"; if current doesn't flow, the output is "1".
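
To make this concrete, here is a minimal Python sketch (my own illustration, not how gates are physically wired) that models each gate as a small function on 0/1 inputs and prints its truth table:

# Model logic gates as functions on bits (0 or 1).
def AND(a, b):
    return a & b          # 1 only when both inputs are 1

def OR(a, b):
    return a | b          # 1 when at least one input is 1

def NOT(a):
    return 1 - a          # flips 0 -> 1 and 1 -> 0

# Print the truth table for the two-input gates.
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b}  AND={AND(a, b)}  OR={OR(a, b)}")
print(f"NOT 0 = {NOT(0)}, NOT 1 = {NOT(1)}")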


Billions of these gates, each a combination of transistors, form complex circuits: ALUs, CPUs, GPUs, and memory. You might remember seeing some of these terms when choosing your last laptop.


How computers work

Earlier, I mentioned that a bit is the basic unit of information for a computer. Bits are grouped into bytes (8 bits each). When you hear that your computer, smartphone, or cloud storage on Google Drive, iCloud, or elsewhere has something like "16 GB" of memory, it really means:

16 gigabytes = 16 billion bytes = 128 billion bits. 

You can read about all the multiple-byte units on Wikipedia.
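
If you want to check that arithmetic, here is a quick Python sketch, assuming the decimal convention that 1 gigabyte = 10^9 bytes (operating systems sometimes report binary gibibytes instead):

# "16 GB" under the decimal convention: 1 gigabyte = 10**9 bytes, 1 byte = 8 bits.
gigabytes = 16
bytes_total = gigabytes * 10**9          # 16,000,000,000 bytes
bits_total = bytes_total * 8             # 128,000,000,000 bits
print(f"{bytes_total:,} bytes = {bits_total:,} bits")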


Now, to the actual use cases:


When you open a text editor on a computer, you open software. Let's say you type "Hello World".


  1. When you press "H", your keyboard sends the binary code for the character "H".

  2. ASCII (American Standard Code for Information Interchange) assigns a unique sequence of bits to each letter, number, punctuation mark, and other character. "H" in ASCII is 72, which is 01001000 in binary.

  3. Each following letter is also translated into binary.

  4. So "Hello World" becomes:

H → 01001000
e → 01100101
l → 01101100
l → 01101100
o → 01101111
(space) → 00100000
W → 01010111
o → 01101111
r → 01110010
l → 01101100
d → 01100100
  5. The computer stores this data in memory (RAM) as a series of bits (see the short code sketch below).
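
You can reproduce this translation yourself with a few lines of Python (my own illustration): ord() returns a character's code, and the 08b format renders it as 8 bits.

# Translate each character of "Hello World" into its ASCII code and 8-bit binary form.
text = "Hello World"
for ch in text:
    code = ord(ch)                                # ASCII code, e.g. 'H' -> 72
    print(f"{ch!r} -> {code:3d} -> {code:08b}")   # character, code, 8-bit binary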


When you press keys to type, the software reads the input and instructs the CPU to process and display the character.

  • The software has predefined instructions: "When this key is pressed, display its character on screen".

  • The CPU executes these instructions by moving and processing bits in memory.

  • The binary data is sent to the GPU and screen driver.

  • The screen knows how to represent bits as pixels.

  • So you see "Hello World" on your screen, but underneath, it's still just bits.


Let's take another example.


You open another software (Calculator app) on your computer and type "2 + 5 =", and the computer provides the result of "7".


The computer will execute similar steps as when we typed "Hello World":

  1. You type "2", and the keyboard sends the binary for "2" as a character (ASCII code 50 = 00110010). You type "+", and it sends the binary for "+" (ASCII code 43 = 00101011). You type "5", and it sends the binary for "5" as a character (ASCII code 53 = 00110101). (For now, "2" and "5" are still character data, just represented in binary.)

  2. The calculator app reads the input and recognizes which characters are numbers and which are operators.

  3. The software parses the input characters into numeric form before passing instructions to the CPU.

  4. Character "2" (ASCII code 50) becomes the numeric value 2 (binary 00000010).

  5. Character "5" (ASCII code 53) becomes the numeric value 5 (binary 00000101).

  6. The CPU executes machine instructions prepared by the software using the ALU (Arithmetic Logic Unit), a circuit built from logic gates.

  7. The calculation can be visually represented similarly to adding two numbers on paper.

  00000010   (2)
+ 00000101   (5)
------------
  00000111   (7)

Bit 1 (rightmost): 0 + 1 = 1, carry 0
Bit 2: 1 + 0 = 1, carry 0
Bit 3: 0 + 1 = 1, carry 0
Bits 4-8: 0 + 0 = 0, carry 0
  8. The CPU sends the result, in binary, to the GPU and screen driver.

  9. The screen knows how to represent those bits:

00000111 (numeric 7) -> ASCII code 55 for the character "7" (00110111) -> screen displays "7"
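
Here is a small Python sketch of the same flow. It is a heavy simplification of what a real calculator app does, but it follows the same steps: parse the typed characters into numbers, add them, and turn the result back into a character for display.

# Simplified calculator flow: characters in -> numbers -> sum -> character out.
typed = "2+5"                           # what the keyboard delivered, as characters

left, op, right = typed[0], typed[1], typed[2]
a = ord(left) - ord("0")                # character '2' (ASCII 50) -> numeric 2
b = ord(right) - ord("0")               # character '5' (ASCII 53) -> numeric 5

if op == "+":
    result = a + b                      # the ALU's job, done here by Python's +
    digit = chr(ord("0") + result)      # numeric 7 -> character '7' (ASCII 55)
    print(f"{a:08b} + {b:08b} = {result:08b}")   # 00000010 + 00000101 = 00000111
    print(f"Displayed: {digit}")                 # Displayed: 7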

All of this happens within milliseconds, so you won't even notice the lag between pressing a key on the keyboard and a letter showing up on your screen, or, in our second example, between typing and seeing 2 + 5 = 7. Behind the scenes, billions of transistors are flipping states across the CPU, ALU, and GPU to make it happen.


If you've ever worked with large datasets in Excel and run complex formulas, you might have seen your screen freeze. In reality, your computer doesn't have enough CPU power or RAM to execute everything and display the results at once, so it works through the data in chunks that fit within its memory limits and eventually returns the output you expect. Sometimes it crashes, and you must relaunch Excel, hoping your results were autosaved.


Supercomputers: powering complex problems


We explored how computers read and display information and execute operations. Our computers are sophisticated machines, yet conceptually simple: they are excellent at computing data and processing information. Even your own computer can run powerful software like the Microsoft 365 Office suite, the latest Call of Duty, or, if you ask Gen Z, Minecraft and Fortnite.


Many corporations need far more powerful computers to run statistical models over billions or even trillions of data points: banks estimating the probability that customers default, insurance companies evaluating the risk of a claim, and streaming platforms finding better movie recommendations for you. With the AI revolution, the need to build ever bigger computers has become even more pressing: training LLMs requires significant capacity for computation and data processing. That's why giants like Google, Amazon, and Microsoft continue to build out infrastructure for cloud services. For example, in Microsoft's FY24 Q4, the Intelligent Cloud segment accounted for 43% of revenue, the largest share compared to Productivity and Business Processes and More Personal Computing at 32% and 25%, respectively. Amazon and Google generate somewhat less revenue from cloud services than Microsoft, but it is still a significant and growing part of their businesses.


Cloud computing, however, simply provides access to computing resources, such as storage, servers, and applications, over the Internet to businesses and the public. The actual computations happen on physical servers housed in data centers across the globe, linked together to ensure data availability. Copies of data are often stored in different locations to achieve redundancy and prevent loss in case of failures. The hardware itself is very similar to your own computer: transistors on circuits, just at a much larger scale.


Moore’s Law and physical limits


Moore's Law is an empirical observation and not a physical law. It describes a trend of doubling the number of transistors on a microchip every two years. In other words, the transistors are getting smaller and smaller, and more of them can fit within the same physical chip area. This exponential increase in transistor density has led to faster, more powerful, and more affordable devices, enabling the processing of increasingly complex data and applications.
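
To get a feel for what doubling every two years means, here is a tiny back-of-the-envelope sketch. It starts from the roughly 2,300 transistors of the 1971 Intel 4004 and simply applies the doubling rule; real chip histories don't follow the curve exactly.

# Illustrate Moore's Law: transistor count doubles roughly every two years.
start_year, start_count = 1971, 2_300             # Intel 4004, ~2,300 transistors

for year in range(start_year, 2026, 10):
    doublings = (year - start_year) // 2          # number of two-year periods
    count = start_count * 2**doublings
    print(f"{year}: ~{count:,} transistors")

By 2021 the rule predicts tens of billions of transistors per chip, which is roughly where today's largest processors sit.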


There is a theoretical limit to how small a transistor can be, estimated at roughly 1 nanometer. Today, 3-nanometer processes are the smallest in mass production, and the chips in recent flagship smartphones are built on 3-5 nanometer processes. As transistors shrink, they run into an unusual quantum effect called quantum tunnelling. At this tiny scale, electrons behave not just as particles but also as waves. Part of this wave can pass through barriers inside the transistor, even when the transistor is supposed to block the current. It's as if an electron slips through a closed gate, not because the gate is open, but because quantum effects allow it to appear on the other side. While this might sound strange, the result is very real: electrical current leaks through the barrier, and the transistor becomes unreliable. Beyond this theoretical limit, there are also practical engineering constraints on making transistors as small as 1 nanometer.


Quantum vs. Classical: A new paradigm


So far, we’ve explored how classical computers process information, from bits and logic gates to CPUs and cloud computing. Next, we’ll explore how quantum computers operate in an entirely different paradigm, not just scaling classical computing but rewriting the rules of how we process information.



 
 

©2025 by Vitalii Manoilo. All rights reserved.
