August 1st, 2024

I don't know how CPUs work so I simulated one in code (2019)

The author simulated an 8-bit CPU to understand computer architecture, learning about bits, ALU functions, and I/O communication, while planning to explore advanced topics like RISC architectures and modern CPU features.


The author reflects on their journey to understand how CPUs work by simulating a simple 8-bit computer in code, inspired by the book "But How Do It Know?" by J. Clark Scott. Despite lacking a deep understanding of modern computing concepts, the author embarked on this project to grasp the fundamentals of computer architecture. They implemented a basic CPU that can execute simple programs, handle keyboard inputs, and render text on a display. The simulation involved creating a crude assembler and managing limited registers without a stack pointer or interrupts, which presented challenges. Through this process, the author learned about the movement of bits, the function of an ALU, and the basics of I/O communication. They also recognized the complexity behind writing assembly code and the effort required to develop even simple applications. The project was rewarding, enhancing their appreciation for computer systems and motivating them to explore more advanced topics, including RISC architectures and modern CPU features like caches. The author acknowledges that while this knowledge may not be essential for their day job, the learning experience has been enjoyable and fulfilling.

- The author simulated a simple 8-bit CPU to better understand computer architecture.

- The project involved creating a basic assembler and managing limited registers.

- Key learnings included the movement of bits, ALU functions, and I/O communication.

- The author plans to explore more advanced topics like RISC architectures and modern CPU features.

- The experience enhanced their appreciation for the complexity of computer systems.

AI: What people are saying
The comments reflect a shared enthusiasm for CPU projects and the learning experiences they provide.
  • Several commenters have undertaken similar projects, highlighting the challenges and time investment involved.
  • There is a recognition of the complexities of CPU design, including the importance of timing and signal stability.
  • Many participants reference educational experiences or resources, such as courses or YouTube channels, that have influenced their understanding of computer architecture.
  • Some comments suggest alternative methods for CPU interaction, such as using UART for communication.
  • Overall, the community expresses a strong interest in exploring and discussing CPU simulation and design.
16 comments
By @cogman10 - 6 months
So, what this project misses, which is quite hard to capture if you think of gates as just on/off switches, is that signals are not instantaneous and everything runs in parallel.

As the AND gate four gates up the chain switches, the NOT gate four gates down the chain starts to emit different and unstable signals, which may or may not be interpreted as a 1 or a 0 by the downstream gate.

That's the reason computers have a clock: to make sure all transistors in a given stage of a CPU reach a steady state before moving on to the next instruction.

This is why it's probably a good idea to work with an HDL (hardware description language) instead of just trying to wing it.
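The point about rippling, unstable signals can be sketched in a few lines. This is my illustration, not code from the article: each simulated NOT gate updates one "tick" after its input changes, so the end of a four-gate chain glitches before it settles, and a clock period must be long enough to cover the settling time before a register latches the value.

```go
// Toy model of propagation delay: each gate recomputes from the
// *previous* tick's value of its input, so a change ripples through
// the chain instead of appearing everywhere at once.
package main

import "fmt"

// stepNotChain advances a chain of NOT gates by one propagation tick.
func stepNotChain(in bool, gates []bool) []bool {
	next := make([]bool, len(gates))
	prev := in
	for i := range gates {
		next[i] = !prev // NOT gate with one tick of delay
		prev = gates[i] // the next gate sees last tick's value
	}
	return next
}

func main() {
	gates := []bool{false, false, false, false} // 4-gate chain, all low
	in := true                                  // input just switched high

	// The last gate's output flips around before reaching steady state;
	// a clock here must wait at least 4 ticks before latching it.
	for tick := 1; tick <= 4; tick++ {
		gates = stepNotChain(in, gates)
		fmt.Printf("tick %d: last gate = %v\n", tick, gates[3])
	}
}
```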

By @dang - 6 months
Discussed at the time:

I don't know how CPUs work so I simulated one in code - https://news.ycombinator.com/item?id=19969321 - May 2019 (172 comments)

By @donw - 6 months
Ben Eater did an excellent series on YouTube where he built a CPU from scratch on breadboards: https://m.youtube.com/playlist?list=PLowKtXNTBypGqImE405J256...

His channel has a lot of other really great stuff on general electronics as well.

By @henrikschroder - 6 months
One of the most enlightening courses I took in university back in the day was digital electronics. Not because I ever wanted to muck about with it, but because we actually got to build our own super-simple physical 8-bit CPU. We had registers and an ALU and RAM and eight output leds, and we got to write the microcode for the fetch-execute cycle. Clock? There was a physical switch you would toggle on and off to make it step through the cycles to slowly execute the program we wrote in our own machine code. Realizing that instructions are just a bit-pattern saying which unit should write to the bus and which unit should read from the bus was quite eye-opening.
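That "instructions are just a bit-pattern" insight can be made concrete. The control-word layout below is invented for illustration (not the commenter's actual microcode): the high bits pick which unit drives the shared bus, the low bits pick which units latch from it, and an "instruction" is nothing more than that pattern.

```go
// Sketch of a bus-based machine where a control word's bit fields
// select one bus writer and any number of bus readers.
package main

import "fmt"

const (
	// Hypothetical layout: high nibble = source, low nibble = destinations.
	srcA   = 0x10 // register A drives the bus
	srcRAM = 0x20 // RAM drives the bus
	dstB   = 0x01 // register B latches the bus
	dstOut = 0x02 // output register latches the bus
)

type machine struct {
	a, b, out, ram byte
	bus            byte
}

// step executes one control word: exactly one writer, any readers.
func (m *machine) step(ctrl byte) {
	switch ctrl & 0xF0 { // who writes to the bus?
	case srcA:
		m.bus = m.a
	case srcRAM:
		m.bus = m.ram
	}
	if ctrl&dstB != 0 { // who reads from the bus?
		m.b = m.bus
	}
	if ctrl&dstOut != 0 {
		m.out = m.bus
	}
}

func main() {
	m := &machine{a: 42}
	m.step(srcA | dstOut) // "copy A to OUT" is just the pattern 0x12
	fmt.Println(m.out)    // 42
}
```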
By @JR1427 - 6 months
I really love projects like this, where you learn a lot by just getting stuck in.

I built a simple 8-bit computer using a Z80 chip. You can read about it a bit more here https://www.jake-reich.co.uk/zx-jakey

By @naberhausj - 5 months
Love to see people doing CPU projects!

I just recently wrote a JavaScript emulator [1] for a simple 8-bit CPU I had built with logic gates. Both were fun projects, but one took a weekend whereas the other took about 4 months. Having skimmed through the author's code, I totally understand why it took them so much longer to get theirs working. It's implemented at the level of individual simulated logic gates, which is an impressive achievement. It's so much easier to just emulate whole components like I did.

[1] https://naberhausj.com/miscellaneous/8-bit-computer/page.htm...
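The "emulate whole components" shortcut looks something like this (my sketch, with made-up operation names, not code from either project): where a gate-level build would ripple a carry through eight simulated full adders, a component-level emulator collapses the whole ALU into one function call.

```go
// Component-level emulation: the ALU is a single function rather than
// a network of simulated gates.
package main

import "fmt"

// alu returns the 8-bit result and the carry-out flag.
func alu(op string, a, b byte) (result byte, carry bool) {
	switch op {
	case "ADD":
		sum := uint16(a) + uint16(b) // widen so we can see the carry
		return byte(sum), sum > 0xFF
	case "AND":
		return a & b, false
	}
	return 0, false
}

func main() {
	r, c := alu("ADD", 200, 100) // 300 overflows 8 bits
	fmt.Println(r, c)            // 44 true
}
```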

By @vismit2000 - 6 months
By @russellbeattie - 6 months
> "The only cheat bit is to get the keyboard input and display output working I had to hook up go channels to speak to the outside world via GLFW..."

What I learned from Ben Eater is that a non-cheaty solution would have been to write a simple UART serial "device" and then interact with the CPU via serial communication with a terminal.
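The memory-mapped UART idea can be sketched briefly. The addresses and register layout below are invented for illustration: the CPU does ordinary loads and stores, the "device" intercepts the ones aimed at its addresses, and the host terminal sits on the other side.

```go
// Toy memory-mapped UART: a data register to transmit through and a
// status register to poll before writing.
package main

import "fmt"

const (
	uartData   = 0xF0 // write a byte here to transmit it
	uartStatus = 0xF1 // bit 0 set = transmitter ready
)

type uart struct{ sent []byte }

// store intercepts writes to the UART's addresses; everything else
// would fall through to ordinary RAM in a full emulator.
func (u *uart) store(addr, val byte) {
	if addr == uartData {
		u.sent = append(u.sent, val)
	}
}

func (u *uart) load(addr byte) byte {
	if addr == uartStatus {
		return 1 // always ready in this toy model
	}
	return 0
}

func main() {
	u := &uart{}
	for _, ch := range []byte("hi") {
		for u.load(uartStatus)&1 == 0 { // poll until ready to send
		}
		u.store(uartData, ch)
	}
	fmt.Println(string(u.sent)) // hi
}
```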

By @rob74 - 6 months
Articles like this remind me of that one time I wrote a 6502 simulator in Turbo Pascal (complete with a primitive text mode disassembler where you could view and edit the machine code stored in memory) as a semester break project. It's of course very far from the complexity of today's CPUs, but basically they still work the same, and all of the optimizations of modern CPUs (pipelining, various cache levels, parallel execution, prefetch, branch prediction etc. etc. etc.) should be transparent to the user - of course, if you want to eke out the last bit of performance (targeting a specific CPU), it helps to be aware of all of that, but you don't need it for understanding how a CPU works.
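The "basically they still work the same" claim boils down to the fetch-decode-execute loop, which can be shown in a few lines. The opcodes here are invented for illustration, not real 6502 encodings:

```go
// Minimal fetch-decode-execute loop: the program counter fetches an
// opcode, the switch decodes it, and each case executes it.
package main

import "fmt"

const (
	opLDA = 0x01 // load the next byte into A
	opADD = 0x02 // add the next byte to A
	opHLT = 0x00 // stop and return A
)

func run(mem []byte) byte {
	var a byte // accumulator
	pc := 0    // program counter
	for {
		op := mem[pc] // fetch
		pc++
		switch op { // decode + execute
		case opLDA:
			a = mem[pc]
			pc++
		case opADD:
			a += mem[pc]
			pc++
		case opHLT:
			return a
		}
	}
}

func main() {
	program := []byte{opLDA, 5, opADD, 7, opHLT}
	fmt.Println(run(program)) // 12
}
```

Pipelining, caches, and branch prediction all change how fast this loop runs, not what it computes, which is why they can stay transparent to the programmer.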
By @vhodges - 6 months
Neat, I've seen a few Logisim systems. After watching Ben Eater's videos (and not having enough time to tinker with hardware) I built

https://git.sr.ht/~vhodges/cputhing

By @P_I_Staker - 5 months
It can be fun to look at the various "achievements" or things that are generally possible for determined people (no geniuses allowed).

CPU just happens to be one of them. Fascinating that such a seemingly complex marvel of innovation is just something that everyone can do.

Of course, there's lots of groundwork and you need to know "the trick" (e.g. that you can use transistors to make computation machines)... but it's easily within your grasp.

By @paranoidrobot - 6 months
I was once shown a DOS-based CPU simulator back in the mid/late 90s.

From memory it showed instruction decode, execution, cache and memory.

Unfortunately I've never been able to find it, because all the Google results are about running DOS games and/or DOSBox.

By @rohansood15 - 5 months
This is cool. Reminds me of my internship days when I had to build a custom 8086 emulator in Java, and I went down the rabbit hole of trying to figure out the best way to represent the architecture.
By @leoh - 6 months
If, for fun, I wanted to train an ML model on a ton of CPU instructions (with each predicted state/label being the state of the registers), does anyone have any clue how to gather that kind of data?
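One plausible answer (my suggestion, not an existing dataset): since a CPU's next state is a pure function of the instruction and the current state, you can generate unlimited labeled pairs by running random instructions through a small emulator and recording each instruction alongside the register state it produces.

```go
// Generate (instruction, resulting register state) training pairs from
// a toy two-register machine. The instruction set is invented.
package main

import (
	"fmt"
	"math/rand"
)

type sample struct {
	op, arg byte    // the input: one encoded instruction
	regs    [2]byte // the label: register state after executing it
}

// generate runs n random instructions and records the machine state
// after each one; a fixed seed makes the dataset reproducible.
func generate(n int, seed int64) []sample {
	rng := rand.New(rand.NewSource(seed))
	var regs [2]byte
	out := make([]sample, 0, n)
	for i := 0; i < n; i++ {
		op := byte(rng.Intn(2)) // 0 = load r0, 1 = add arg to r1
		arg := byte(rng.Intn(256))
		if op == 0 {
			regs[0] = arg
		} else {
			regs[1] += arg
		}
		out = append(out, sample{op, arg, regs})
	}
	return out
}

func main() {
	data := generate(1000, 1)
	fmt.Println(len(data)) // 1000
}
```

For a real ISA the same idea works with an existing emulator (or QEMU-style tracing) in place of the toy machine.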
By @sam_perez - 6 months
Reminds me of some scenes from the first book of the Three-Body Problem trilogy.
By @luxuryballs - 6 months
I also don’t know and thinking about this makes me claustrophobic because I enjoy programming.