July 26th, 2024

Ask HN: Weirdest Computer Architecture?

The traditional computing stack includes hardware and software layers, but alternatives like quantum and neuromorphic computing introduce new models, changing components and architectures beyond classical Turing machines.

The concept of "the stack" in computing typically refers to the layers of abstraction that define how hardware and software interact. While the traditional stack includes physical components like electronics and logical constructs such as Turing machines, there are alternative computing paradigms that can modify or replace some of these components.

For instance, quantum computing introduces a different computational model that does not rely on classical Turing machines. Instead, it uses quantum bits (qubits) and quantum gates, which fundamentally change the way computation is performed. Similarly, neuromorphic computing mimics the neural structure of the human brain, using spiking neural networks instead of traditional logic gates and transistors.

In terms of software architecture, while Unix and similar systems dominate, there are other operating systems and architectures that can be employed, such as those designed for specific hardware like embedded systems or real-time operating systems.

Moreover, user interfaces can vary widely, with alternatives to traditional screens and input devices emerging, such as virtual reality interfaces or brain-computer interfaces.

In summary, while the traditional stack is well-defined, there are indeed alternative stacks that rely on different computational theories and architectures, particularly in the realms of quantum and neuromorphic computing, which change the stack's components from the computational model upward.

56 comments
By @runjake - 6 months
Here are some architectures that might interest you. Note these are links that lead to rabbit holes.

1. Transmeta: https://en.wikipedia.org/wiki/Transmeta

2. Cell processor: https://en.wikipedia.org/wiki/Cell_(processor)

3. VAX: https://en.wikipedia.org/wiki/VAX (It was unusual for its time, but many of its concepts have since been adopted.)

4. IBM z/Architecture: https://en.wikipedia.org/wiki/Z/Architecture (This stuff is completely unlike conventional computing, particularly the "self-healing" features.)

5. IBM TrueNorth processor: https://open-neuromorphic.org/blog/truenorth-deep-dive-ibm-n... (Cognitive/neuromorphic computing)

By @jecel - 6 months
"Computer architecture" is used in several different ways and that can lead to some very confusing conversations. Your proposed stack has some of this confusion. Some alternative terms might help:

"computational model": finite state machine, Turing machine, Petri nets, data-flow, stored program (a.k.a. Von Neumann, or Princeton), dual memory (a.k.a. Harvard), cellular automata, neural networks, quantum computers, analog computers for differential equations

"instruction set architecture": ARM, x86, RISC-V, IBM 360

"instruction set style": CISC, RISC, VLIW, MOVE (a.k.a TTA - Transport Triggered Architecture), Vector

"number of addresses": 0 (stack machine), 1 (accumulator machine), 2 (most CISCs), 3 (most RISCs), 4 (popular with sequential memory machines like Turing's ACE or the Bendix G-15)

"micro-architecture": single cycle, multi-cycle, pipelines, super-pipelined, out-of-order

"system organization": distributed memory, shared memory, non uniform memory, homogeneous, heterogeneous

With these different dimensions for "computer architecture" you will have different answers for which was the weirdest one.
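
For the "0 addresses" end of that axis, here is a minimal sketch in C (invented opcodes, nothing from a real ISA) of a stack machine evaluating (2 + 3) * 4: the operators name no registers or addresses at all, because both operands and the result live implicitly on the stack.

    /* 0-address (stack machine) sketch: operands are implicit. */
    #include <stdio.h>

    enum { PUSH, ADD, MUL, HALT };

    int main(void) {
        /* (2 + 3) * 4 in postfix: push 2, push 3, add, push 4, mul */
        int prog[] = { PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT };
        int stack[16], sp = 0;

        for (int pc = 0; ; ) {
            switch (prog[pc++]) {
            case PUSH: stack[sp++] = prog[pc++]; break;
            case ADD:  sp--; stack[sp - 1] += stack[sp]; break;
            case MUL:  sp--; stack[sp - 1] *= stack[sp]; break;
            case HALT: printf("%d\n", stack[sp - 1]); return 0;   /* prints 20 */
            }
        }
    }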

By @defrost - 6 months
Setun: a three-valued (ternary) logic computer instead of the common binary: https://en.wikipedia.org/wiki/Setun

Not 'weird' but any architecture that doesn't have an 8-bit byte causes questions and discussion.

E.g. the Texas Instruments DSP chip family for digital signal processing: they're all about deeply pipelined FFT computations with floats and doubles, not piddling about with 8-bit ASCII. There are no hardware-level bit operations to speak of, and the smallest addressable memory size is either 32 or 64 bits.
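
A related detail: C still runs on such parts, because the language only promises that a "byte" is the smallest addressable unit, whose width is CHAR_BIT. A small sketch; the printed values depend on the target (8 on mainstream hosts, and reportedly 16 or 32 on some TI DSP toolchains):

    /* CHAR_BIT is the width of the smallest addressable unit, not
     * necessarily 8. sizeof() counts in those units, so sizeof(int)
     * can legitimately be 1 on a word-addressed DSP. */
    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        printf("bits per char: %d\n", CHAR_BIT);
        printf("bits per int : %zu\n", sizeof(int) * CHAR_BIT);
        return 0;
    }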

By @mikewarot - 6 months
BitGrid is my hobby horse. It's a Cartesian grid of cells, each with 4 bits in and 4 bits out via LUTs (look-up tables), latched in alternating phases to eliminate race conditions.

It's the response to the observation that most of the transistors in a computer are idle at any given instant.

There's a full rabbit hole's worth of advantages to this architecture once you really dig into it.

Description https://esolangs.org/wiki/Bitgrid

Emulator https://github.com/mikewarot/Bitgrid
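
A minimal sketch of a single cell as described above, assuming the 4-in/4-out behavior can be held as one 16-entry table of 4-bit results (the table contents here are arbitrary, and the grid wiring and alternating latch phases are left out):

    /* One BitGrid-style cell: 4 input bits index a 16-entry lookup
     * table, and the selected entry supplies the 4 output bits. */
    #include <stdio.h>
    #include <stdint.h>

    static uint8_t cell_step(const uint8_t lut[16], uint8_t in4) {
        return lut[in4 & 0xF] & 0xF;            /* 4 bits in -> 4 bits out */
    }

    int main(void) {
        uint8_t lut[16];
        for (int i = 0; i < 16; i++)
            lut[i] = (uint8_t)((i + 1) & 0xF);  /* arbitrary demo table */

        printf("in=0x5 out=0x%X\n", cell_step(lut, 0x5));
        return 0;
    }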

By @jy14898 - 6 months
Transputer

> The name, from "transistor" and "computer", was selected to indicate the role the individual transputers would play: numbers of them would be used as basic building blocks in a larger integrated system, just as transistors had been used in earlier designs.

https://en.wikipedia.org/wiki/Transputer

By @amy-petrik-214 - 6 months
There was some interesting funk in the 80s. Lisp machine: https://en.wikipedia.org/wiki/Lisp_machine (these were very hot in 1980s-era AI). Connection Machine: https://en.wikipedia.org/wiki/Connection_Machine (a gorillion-monobit-processor supercluster).

let us also not forget The Itanic

By @ithkuil - 6 months
CDC 6000 was a barrel processor: https://en.m.wikipedia.org/wiki/Barrel_processor

Mill CPU (so far only patent-ware, but interesting nevertheless): https://millcomputing.com/

By @sshine - 6 months
These aren't implemented in hardware, but they're examples of esoteric architectures:

zk-STARK virtual machines:

https://github.com/TritonVM/triton-vm

https://github.com/risc0/risc0

They're "just" bounded Turing machines with extra cryptography. The VM architectures have been optimized for certain cryptographic primitives so that you can prove properties of arbitrary programs, including the cryptographic verification itself. This lets you e.g. play turn-based games where you commit to make a move/action without revealing it (cryptographic fog-of-war):

https://www.ingonyama.com/blog/cryptographic-fog-of-war

The reason why this requires a specialised architecture is that in order to prove something about the execution of an arbitrary program, you need to arithmetize the entire machine (create a set of equations that are true when the machine performs a valid step, where these equations also hold for certain derivatives of those steps).
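
A toy illustration of what "arithmetize the machine" means here, nothing like the actual Triton VM or RISC Zero constraint systems: each row of an execution trace records the state of a made-up two-instruction accumulator machine, and a transition constraint must evaluate to zero between every pair of consecutive rows for the trace to count as a valid run. Real systems state these as polynomial identities over a finite field (with selector polynomials rather than the branch below) so the check itself can be proven.

    /* Toy "execution trace + transition constraints" sketch. */
    #include <stdio.h>

    enum { OP_ADD1, OP_DBL };                 /* hypothetical 2-instruction ISA */

    struct row { int pc, acc, op; };          /* one trace row = machine state  */

    /* both constraints must be zero for the step cur -> nxt to be valid */
    static int step_ok(struct row cur, struct row nxt) {
        int pc_c  = nxt.pc - (cur.pc + 1);                  /* pc advances by 1 */
        int acc_c = (cur.op == OP_ADD1)
                  ? nxt.acc - (cur.acc + 1)                 /* ADD1: acc += 1   */
                  : nxt.acc - (cur.acc * 2);                /* DBL : acc *= 2   */
        return pc_c == 0 && acc_c == 0;
    }

    int main(void) {
        struct row trace[] = {
            {0, 3, OP_ADD1}, {1, 4, OP_DBL}, {2, 8, OP_ADD1}, {3, 9, 0},
        };
        int ok = 1;
        for (int i = 0; i + 1 < 4; i++)
            ok &= step_ok(trace[i], trace[i + 1]);
        printf(ok ? "trace satisfies the constraints\n" : "invalid trace\n");
        return 0;
    }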

By @mikewarot - 6 months
I thought magnetic logic was an interesting technology when I first heard of it. It's never going to replace semiconductors, but if you want to compute on the surface of Venus, you just might be able to make it work there.

The basic limits are the Curie point of the cores and the source of the clock drive signals.

https://en.m.wikipedia.org/wiki/Magnetic_logic

By @drakonka - 6 months
This reminds me of a talk I went to at the 2020 ALIFE conference, in which the speaker presented an infinitely scalable architecture called the "Movable Feast Machine". He suggested relinquishing hardware determinism - the hardware can give us wrong answers and the software has to recover, and in some cases the hardware may fail catastrophically. The hardware is a series of tiles with no CPU. Operations are local and best-effort, determinism not guaranteed. The software then has to reconcile that.

It was quite a while ago and my memory is hazy tbh, but I put some quick notes here at the time: https://liza.io/alife-2020-soft-alife-with-splat-and-ulam/

By @CalChris - 6 months
Intel's iAPX 432. 1975. Instructions were bit-aligned, stack based, 32-bit operations, segmented, capabilities, .... It was so late+slow that the 16-bit 8086 was created.

https://en.wikipedia.org/wiki/Intel_iAPX_432
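
A sketch of what bit-aligned instructions imply for the decoder (the field widths and bit ordering below are invented; only the idea matters): the next instruction need not start on a byte boundary, so fetching means pulling an N-bit field from an arbitrary bit offset.

    /* Fetch an arbitrary-width field at an arbitrary bit offset
     * (LSB-first bit numbering within each byte, chosen arbitrarily). */
    #include <stdio.h>
    #include <stdint.h>

    static unsigned fetch_bits(const uint8_t *mem, unsigned bitpos, unsigned width) {
        unsigned v = 0;
        for (unsigned i = 0; i < width; i++) {
            unsigned b = bitpos + i;
            v |= (unsigned)((mem[b / 8] >> (b % 8)) & 1u) << i;
        }
        return v;
    }

    int main(void) {
        uint8_t code[] = { 0xB5, 0x3C, 0x7E };                    /* 24 bits of "code" */
        unsigned pos = 0;
        unsigned opcode  = fetch_bits(code, pos, 6);  pos += 6;   /* 6-bit opcode   */
        unsigned operand = fetch_bits(code, pos, 10); pos += 10;  /* 10-bit operand */
        printf("opcode=%u operand=%u next instruction starts at bit %u\n",
               opcode, operand, pos);
        return 0;
    }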

By @muziq - 6 months
The Apple ‘Scorpius’ thing they bought the Cray in the ’80s for emulating: RISC, multi-core, but it could put all the cores in lockstep to operate as pseudo-SIMD. Or, failing that, the 32-bit 6502 successor, the MCS65E4: https://web.archive.org/web/20221029042214if_/http://archive...
By @mac3n - 6 months
FPGA: non-sequential programming

Lightmatter: matrix multiply via optical interferometers

Parametron: coupled oscillator phase logic

rapid single flux quantum logic: high-speed pulse logic

asynchronous logic

https://en.wikipedia.org/wiki/Unconventional_computing

By @nailer - 6 months
The giant global computers that are Solana mainnet / devnet / testnet. The programs are compiled from Rust into (slightly tweaked) EBPF binaries and state updates every 400ms, using VDFs to sync clocks between the leaders that are allowed to update state.
By @yen223 - 6 months
A lot of things are Turing-complete. The funniest one to me is PowerPoint slides.

https://beza1e1.tuxen.de/articles/accidentally_turing_comple...

https://gwern.net/turing-complete

By @metaketa - 6 months
HVM, which uses interaction nets as an alternative to Turing-machine-style computation, deserves a mention. Google: HigherOrderCompany
By @0xdeadbeer - 6 months
I heard of counter machines on Computerphile https://www.youtube.com/watch?v=PXN7jTNGQIw
By @AstroJetson - 6 months
Huge fan of the Burroughs Large Systems Stack Machines.

https://en.wikipedia.org/wiki/Burroughs_Large_Systems

They had an attached scientific processor to do vector and array computations.

https://bitsavers.org/pdf/burroughs/BSP/BSP_Overview.pdf

By @GistNoesis - 6 months
https://en.wikipedia.org/wiki/Unconventional_computing

There is also soap bubble computing, and various forms of annealing computing (like quantum annealing or adiabatic quantum computation), where you set up your computation so that its answer is the optimal value of a physical system you can define.

By @elkekeer - 6 months
A multi-core Propeller processor by Parallax (https://en.wikipedia.org/wiki/Parallax_Propeller) in which multitasking is done by cores (called cogs) taking turns: first, code is executed on the first cog, then, after a while, on the second, then on the third, etc.
By @vismit2000 - 6 months
How about a water computer? https://youtu.be/IxXaizglscw
By @yencabulator - 6 months
Just the operating system, but I like Barrelfish's idea of having a separate kernel on every core and doing message passing. Each "CPU driver" is single-threaded, non-preemptible (no interrupts), shares no state, bounded-time, and runs to completion. Userspace programs can access shared memory, but the low-level stuff doesn't do that. Bounded-time run to completion kinda makes me think of seL4, if it was designed to be natively multicore.

https://en.wikipedia.org/wiki/Barrelfish_(operating_system)

https://barrelfish.org/publications/TN-000-Overview.pdf

By @Joker_vD - 6 months
IBM 1401. One of the weirdest ISAs I've ever read about, with basically human readable machine code thanks to BCD.
By @29athrowaway - 6 months
The Soviet Union water integrator: an analog, water-based computer for computing partial differential equations.

https://en.m.wikipedia.org/wiki/Water_integrator

By @trealira - 6 months
The ENIAC, the first computer, didn't have assembly language. You programmed it by fiddling with circuits and switches. Also, it didn't use binary integers, but decimal ones, with 10 vacuum tubes to represent the digits 0-9.
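
A sketch of that decimal storage scheme: each digit sat in a ring counter, ten devices of which exactly one is "on" at a time, and counting rotates which one (assuming the usual description of ENIAC's ring counters; the carry to the next digit fires on the 9-to-0 wraparound).

    /* One decimal digit held one-hot across ten positions. */
    #include <stdio.h>

    int main(void) {
        int ring[10] = { 1 };                       /* position 0 lit = digit 0 */
        int digit = 0;

        for (int pulse = 0; pulse < 7; pulse++) {   /* add 7 by sending 7 pulses */
            ring[digit] = 0;
            digit = (digit + 1) % 10;               /* carry would fire on 9 -> 0 */
            ring[digit] = 1;
        }
        printf("digit now %d, one-hot:", digit);
        for (int i = 0; i < 10; i++) printf(" %d", ring[i]);
        printf("\n");
        return 0;
    }
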
By @jareklupinski - 6 months
Physical Substrate: Carbon / Water / Sodium

Computation Theory: Cognitive Processes

Smallest parts: Neurons

Largest parts: Brains

Lowest level language: Proprietary

First abstractions of programming: Bootstrapped / Self-learning

Software architecture: Maslow's Theory of Needs

User Interface: Sight, Sound

By @variadix - 6 months
From the creator of Forth https://youtu.be/0PclgBd6_Zs

144 small computers in a grid that can communicate with each other

By @RecycledEle - 6 months
Using piloted pneumatic valves as logic gates blew my mind.

If you are looking for strangeness, the 1990s to early 2000s microcontrollers had I/O ports, but every single I/O port was different. There was no standard that would let you (for example) plug in a 10-pin header and connect the same peripheral to any of the I/O ports on a single microcontroller, much less to any microcontroller in a family.

By @mbfg - 6 months
I've got to believe x86 is in the running. We don't think of it because it is the dominant architecture, but it's kind of crazy.
By @PeterStuer - 6 months
In the '80s our lab lobbied the university to get a CM-1. We failed and they got a Cray instead. The Connection Machine was a really different architecture aimed at massively parallel execution: https://en.wikipedia.org/wiki/Connection_Machine
By @dwrodri - 6 months
If you really want to see some esoteric computer architecture ideas, check out Mill Computing: https://millcomputing.com/wiki/Architecture. I don't think they've etched any of their designs into silicon, but very fascinating ideas nonetheless.
By @jacknews - 6 months
Of course there are things like the molecular mechanical computers proposed/popularised by Eric Drexler etc.

I think Transport-triggered architecture (https://en.wikipedia.org/wiki/Transport_triggered_architectu...) is something still not fully explored.

By @BarbaryCoast - 6 months
Look at the earliest computers, that is, those around the time of ENIAC. Most were electro-mechanical, some were entirely relay machines. I believe EDSAC was the first _electronic_ digital computer.

As for weird, try this: ENIAC instructions modified themselves. Back then, an "instruction" (they called them "orders") included the addresses of the operands and destination (which was usually the accumulator). So if you wanted to sum the numbers in an array, you'd put the address of the first element in the instructions, and as ENIAC repeated that instruction (a specified number of times), the address in the instruction would be auto-incremented.
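
A miniature simulation of that trick (not ENIAC's actual order code): with no index registers, the program walks an array by bumping the operand address stored inside its own add instruction on every pass.

    /* The "order" carries its own operand address, and the loop
     * rewrites that address field each time around. */
    #include <stdio.h>

    struct order { int opcode, addr; };         /* operation + operand address */

    int main(void) {
        int mem[8] = { 5, 7, 11, 13 };          /* the array to be summed */
        struct order add_order = { /*ADD*/ 0, /*addr*/ 0 };
        int acc = 0;

        for (int pass = 0; pass < 4; pass++) {
            acc += mem[add_order.addr];         /* execute ADD at its current address */
            add_order.addr += 1;                /* the instruction modifies itself */
        }
        printf("sum = %d\n", acc);              /* 5 + 7 + 11 + 13 = 36 */
        return 0;
    }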

Or how about this: a computer with NO 'jump' or 'branch' instruction? The ATLAS-1 was a landmark of computing, having invented most of the things we take for granted now, like virtual memory, paging, and multi-programming. But it had NO instruction for altering the control flow. Instead, the programmer would simply _write_ to the program counter (PC). Then the next instruction would be fetched from the address in the PC. If the programmer wanted to return to the previous location (a "subroutine call"), they'd be obligated to save what was in the PC before overwriting it. There was no stack, unless you count a programmer writing the code to save a specific number of PC values, and adding code to all subroutines to fetch the old value and restore it to the PC. I do admire the simplicity -- want to run code at a different address? Tell me what it is and I'll just go there, no questions asked.
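
A sketch of that style of control flow with an invented mini-ISA (not the real Atlas order code): the program counter is just memory cell 0, a jump is an ordinary store to that cell, and a "call" is the programmer saving the old value somewhere by hand before overwriting it.

    /* Control flow purely by writing the PC, which lives in mem[0]. */
    #include <stdio.h>

    enum { SETPC, SAVEPC, LOADPC, PRINT, HALT };
    struct insn { int op, arg; };
    #define PC 0                                 /* mem[0] holds the program counter */

    int main(void) {
        int mem[16] = {0};
        struct insn prog[] = {
            /* 0 */ { PRINT,  1 },   /* "in main"                               */
            /* 1 */ { SAVEPC, 8 },   /* save return address into mem[8]         */
            /* 2 */ { SETPC,  5 },   /* "call": overwrite the PC, jump to 5     */
            /* 3 */ { PRINT,  3 },   /* back in main after the "return"         */
            /* 4 */ { HALT,   0 },
            /* 5 */ { PRINT,  2 },   /* subroutine body                         */
            /* 6 */ { LOADPC, 8 },   /* "return": copy the saved value into PC  */
        };

        for (;;) {
            struct insn i = prog[mem[PC]++];             /* fetch, advance PC */
            switch (i.op) {
            case SETPC:  mem[PC] = i.arg;            break;
            case SAVEPC: mem[i.arg] = mem[PC] + 1;   break;  /* return past the SETPC */
            case LOADPC: mem[PC] = mem[i.arg];       break;
            case PRINT:  printf("step %d\n", i.arg); break;
            case HALT:   return 0;
            }
        }
    }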

Or maybe these shouldn't count as "weird", because no one had yet figured out what a computer should be. There was no "standard" model (despite Von Neumann) for the design of a machine, and cost considerations plus new developments (spurred by wanting better computers) meant that the "best" design was constantly changing.

Consider that post-WWII, some materials were hard to come by. So much so that one researcher used a Slinky (yes, the toy) as a memory storage device. And had it working. They wanted acoustic delay lines (the standard of the time), but the Slinky was more available. So it did the same job, just with a different medium.

I've spent a lot of time researching these early machines, wanting to trace the path each item in the now-standard model of an idealized computer took to get there. It's full of twists and turns, dead ends and unintentional invention.

By @supercoffee - 6 months
I'm fascinated by the mechanical fire control computers of WW2 battleships.

https://arstechnica.com/information-technology/2020/05/gears...

By @ChristopherDrum - 6 months
Mythic produces an analog processor https://mythic.ai/

There is also the analog computer The Analog Thing https://the-analog-thing.org/

By @sshb - 6 months
This unconventional computing magazine came to my mind: http://links-series.com/links-series-special-edition-1-uncon...

Compute with mushrooms, compute near black holes, etc.

By @t312227 - 6 months
hello,

great collection of interesting links - kudos to all! :=)

idk ... but isn't the "general" architecture of most of our computers "von neumann"!?

* https://en.wikipedia.org/wiki/Von_Neumann_architecture

but what i miss from the various lists is the "transputer" architecture / ecosystem from INMOS - a concept of heavily networked arrays of small cores from the 1980s

about transputers

* https://en.wikipedia.org/wiki/Transputer

about INMOS

* https://en.wikipedia.org/wiki/Inmos

i had the chance to take a look at a "real life" ATW - atari transputer workstation - back in the day at my university / CS department :))

mainly used with the Helios operating-system

* https://en.wikipedia.org/wiki/HeliOS

to be programmed in occam

* https://en.wikipedia.org/wiki/Occam_(programming_language)

the "atari transputer workstation" ~ more or less a "smaller" atari mega ST as the "host node" connected to an (extendable) array of extension-cards containing the transputer-chips:

* https://en.wikipedia.org/wiki/Atari_Transputer_Workstation

just my 0.02€

By @dsr_ - 6 months
There are several replacements for electronic logic; some of them have even been built.

https://en.wikipedia.org/wiki/Logic_gate#Non-electronic_logi...

By @solardev - 6 months
Analog computers, quantum computers, light-based computers, DNA-based computers, etc.
By @osigurdson - 6 months
I'm not sure what the computer architecture was, but I recall the engine controller for the V22 Osprey (AE1107) used odd formats like 11 bit floating point numbers, 7 bit ints, etc.
By @joehosteny - 6 months
The Piperench runtime reconfigurable FPGA out of CMU:

https://research.ece.cmu.edu/piperench/

By @ranger_danger - 6 months
9-bit bytes, 27-bit words... middle endian.

https://dttw.tech/posts/rJHDh3RLb

By @vapemaster - 6 months
since this is a bit of a catch-all thread, i'll toss Anton into the ring: a whole bunch of custom ASICs to do Molecular Dynamics simulations from D.E. Shaw Research

https://en.wikipedia.org/wiki/Anton_(computer)

By @dongecko - 6 months
Motorola used to have a one-bit microprocessor, the MC14500B.
By @ksherlock - 6 months
The tinker toy computer doesn't even use electricity.
By @gjvc - 6 months
rekursiv