October 7th, 2024

The computer built to last 50 years

The #ForeverComputer is designed for a 50-year lifespan, emphasizing durability, simplicity, and sustainability with replaceable parts, low energy use, and a focus on essential tasks like writing and reading.


The concept of a computer designed to last 50 years, referred to as the #ForeverComputer, emphasizes durability, simplicity, and sustainability. Drawing inspiration from typewriters, which have proven to be long-lasting and functional, the proposal suggests creating a computer that focuses on timeless activities such as writing and reading. Unlike modern devices that require frequent upgrades and are often discarded, the ForeverComputer would be built with sturdy, easily replaceable parts, allowing for repairs and modifications over time. It would prioritize low energy consumption, potentially utilizing e-ink screens and offline functionality to reduce distractions and enhance user focus. The design would encourage a deeper connection between the user and the device, fostering a sense of purpose and commitment. By limiting its capabilities to essential tasks, the ForeverComputer aims to provide a reliable tool for communication and information access without the need for constant updates or internet connectivity. This approach not only addresses environmental concerns but also challenges the consumerist culture surrounding technology.

- The #ForeverComputer aims for a lifespan of 50 years, focusing on durability and sustainability.

- It emphasizes essential functions like writing and reading, avoiding unnecessary features.

- The design includes easily replaceable parts and low energy consumption.

- E-ink screens and offline functionality are proposed to enhance user focus.

- The concept encourages a deeper connection between users and their devices.

44 comments
By @gchadwick - 6 months
I find it odd that the author spends lots of time talking about vintage typewriters but then fails to consider vintage computers, which give some real-life examples of computers that are still usable almost 50 years on from their original release. E.g. the Commodore 64: lots of working examples are still around, and it's now 42 years since first release.

Certainly a C64 is highly restrictive compared to a modern machine, and were one to specifically build a computer to last 50 years it's not where you'd start, but surely a machine that has actually lasted almost 50 years and remains usable has things to teach you about long-lasting computer design.

In particular, it's interesting to see how open source fits in. The modern C64 ecosystem has plenty of tools and utilities that do use open source software and hardware (e.g. the Kung Fu Flash cartridge: https://github.com/KimJorgensen/KungFuFlash), but plenty of the core software that actually runs on the machine is proprietary software whose source is long gone. It's still around because of archivists and pirates, and can continue to be used because the original copyright holders don't care to enforce their copyrights. So is open source actually a core item, as the author asserts, or just a nice-to-have? Having the software archived and easily available later was the key. Along with simplicity: you just run the monolithic binary, there are no dependencies, and the software is sufficiently simple that hacking around with the raw binary is perfectly feasible.

By @topherPedersen - 6 months
I have several computers that are 40 years old. I think the reason the old 80s microcomputers last so long is that they don't have any moving parts like disk drives that go bad (I've had really bad luck with the external disk drives I've purchased). Unfortunately, I think the reason computers and phones don't last a long time now is that the companies designing the phones, computers, and operating systems WANT them to quit working.

The reason computers slow down and stop working worth a damn has nothing to do with the hardware; it's the operating systems receiving "updates" that make them quit working. I have a TRS-80 Color Computer running the Microsoft BASIC "operating system" that Bill Gates wrote himself, and it still works great 40 years later.

And then the big issue with phones is the batteries. The phone manufacturers know that the batteries go bad, so they glue them into the phones so you can't replace them. Obviously, if you wanted the phones to last a long time, you'd make it so you can put a new battery in the damn thing. They also know that the screens break, so they'd make those easy to replace yourself as well if they cared.

It is nice that you can take phones to those little repair places, and they seem to do a nice job replacing screens and batteries, but they could probably design a phone where you can do it yourself.

By @t-3 - 6 months
Haven't we basically already built them? They're just slow and not supported by software vendors, so nobody wants to use them. Other than replacing capacitors and real-time clock batteries every 20 years or so, plus dusting and replacing fans when bearings go bad (assuming it's not a passively-cooled design), most computers should basically last beyond a human lifetime (I've read that chips made on processes below ~20nm will go bad over time as traces lose atoms and eventually fail, but older processes should be fine).
By @falcolas - 6 months
An excuse to link one of my favorite NASA/Honeywell slideshows:

https://c3.ndc.nasa.gov/dashlink/static/media/other/Observed...

The long story short is that there are Byzantine failure modes which prevent a 50-year computer. A sample:

- Capacitors can act as bullets

- Forced air cooling creating water

- The smaller the parts, the greater the chance they'll transmute into another part. Even, or especially, in solid-state parts.

- Digital isn't (i.e. 1 isn't really full voltage, and 0 isn't really no voltage).

- Thermal expansion matters, even for ICs on a board.

- Wire length, and the position of sensors on that wire, matters.

A 50-year computer would probably have to be one in which each part can be, and is, replaced on a schedule. And the faster the computer is, the more often parts would need to be replaced. Additionally, if we want 100% uptime, there would also have to be sufficient redundancy to ensure that the computer could continue operating during failures or replacements of components.
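
As a rough illustration of that redundancy arithmetic (the numbers below are assumptions for the sake of example, not from the NASA slides):

```latex
% Availability of n independent replicas, each individually available a fraction A of the time:
A_{\text{system}} = 1 - (1 - A)^{n}
% e.g. three hot-swappable replicas at A = 0.99 each:
A_{\text{system}} = 1 - (0.01)^{3} = 0.999999
```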

By @dhosek - 6 months
I kind of feel like we're rapidly approaching an end-of-history point in computing. The joke in the late 90s/early 00s was that your computer became obsolete on the way home from the store. My computing upgrade cycles have been getting longer and longer. Same with phones. I last upgraded my iPhone in 2022 not because I needed to (it was three years old), but because I wanted some of the newer features. What used to be a 2-year cycle like clockwork has stretched to 3 or 4 years. My laptop cycle has gone from 3 years to 5 years, and the last upgrade only happened because the display stopped working (it's now running headless in my music studio). The limiting factor has become less one of functionality and more one of durability, and while there's work to be done there, right now the economic factors don't make sense. As revenue shifts from hardware to services, though, I expect to see a greater emphasis on long-lasting computers, until the expectation is that a computer, phone or tablet will have a 10–15 year lifespan.
By @dave333 - 6 months
In the 1980s AT&T was designing cabinet-sized minicomputers that would have less than 2 hours of downtime in 40 years, and went to great lengths to enable software updates without reboot (functions accessed via transfer vectors) and the ability to survive and continue running through earthquakes. These are still running, I gather, as part of various phone switching systems (4ESS, 5ESS), although the hardware has been "reengineered." https://en.wikipedia.org/wiki/3B_series_computers
By @jmrm - 6 months
I think we have reached a point in tech where there isn't a huge benefit to changing computers every 3 years like in the past.

You can have a 10-year-old computer that runs modern OSs and software without being incompatible or too slow, something totally impossible 20 years ago.

If you do AI-related development or play video games, you would need at least a new GPU, but outside of that, I think the only couple of things (pretty major, IMO) making those computers less useful are newer video formats that can't be decoded in hardware, and the vast amount of code some web apps use (try using YouTube or Twitter on an old laptop).

By @adrian_b - 6 months
Most modern MOS circuits are no longer designed to last 50 years, unlike most integrated circuits and discrete semiconductor devices of 50 years ago. There is no chance for any up-to-date CPU or memory module to work for 50 years.

Nevertheless, it is quite easy to use a modern computer for 50 years: just get 10 computers that do not contain components that age even when they are not used (e.g. batteries or electrolytic capacitors), use one computer until it breaks, and keep the others in storage until you must replace the current work computer.

Such a set of modern computers would be faster, cheaper and smaller than a single computer in the style of PDP-11 or VAX, made by using low-density components that can work for 50 years.

By @VyseofArcadia - 6 months
The web sure is convenient, but for the actual work I do, I could in fact work on a 30 to 35 year old computer. I mostly code, and occasionally I process words or spread sheets. All things I could do on a DOS machine, or even something like an Apple //e. I'd certainly be fine on an Amiga. I'd be on cloud nine with a NeXTCube. I don't know that I'm willing to go older than early 80s, though. I need my computer to at least handle both uppercase and lowercase.

So arguably we've already built computers that last 40 years. Another decade doesn't seem crazy.

By @kbrecordzz - 6 months
Computers continue to work "forever" if you only use their own closed system, like writing Word documents on the hard drive. It's the complexity of the internet that makes hardware obsolete today. The internet consists of too many parts working together for it to be profitable to focus on longevity and stability; the focus of the internet is instead flexibility and broad usage. And it's mostly the security standards that force us to buy new hardware in the end. From SSL to TLS, then to TLS 1.2 & 1.3, almost all sites upgraded to the new standards and made old web browsers unable to browse the internet anymore. And if the newest web browser your computer supports is one from before 2014 (before TLS 1.2), your computer is dead, because it can't visit the internet. So it's mostly the software layer of the internet that keeps us from getting "forever computers", and therefore "we" software people maybe are the ones with the power to make a change here?
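
A quick way to see that cutoff in practice is to probe which TLS versions a server will still negotiate. A minimal sketch in Python (the target host is a placeholder; note that very old protocol versions may also be refused by your local OpenSSL build rather than by the server):

```python
# Probe which TLS protocol versions a server will still negotiate.
import socket
import ssl

HOST, PORT = "example.com", 443  # placeholder target

for name, version in [("TLS 1.0", ssl.TLSVersion.TLSv1),
                      ("TLS 1.2", ssl.TLSVersion.TLSv1_2),
                      ("TLS 1.3", ssl.TLSVersion.TLSv1_3)]:
    ctx = ssl.create_default_context()
    ctx.minimum_version = version   # pin the handshake to exactly one version
    ctx.maximum_version = version
    try:
        with socket.create_connection((HOST, PORT), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST):
                print(f"{name}: accepted")
    except (ssl.SSLError, OSError, ValueError):
        # a failure here can also mean the local TLS library refuses this version
        print(f"{name}: rejected")
```
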
By @theodric - 6 months
E-ink maybe isn't it, anyway. A few years ago I built an e-ink clock/gimmick that refreshed every few seconds with some different text on a given part of the screen, and within 1.5 years the sides, up to about 2cm in (but not the parts being constantly refreshed!), had more stuck/weak pixels than not. A halo of rot. 50 years is a long time, much longer than 1.5.
By @Clamchop - 6 months
> Lots of writers keep using [typewriters], they became trendy in the 2010s and, to escape surveillance, some secret services started to use them back. It’s a very niche but existing market.

At first blush, this sentiment appears to also be true of old computers. There is growing "trendy" interest in them, and they're otherwise still fit for purpose for some tasks, like gaming, writing, or driving long-unsupported hardware or software. The community around them has been rather industrious in servicing old machines, particularly Macs.

But they cannot satisfy all the requirements we have of a modern computer, and neither can a typewriter. However, the length of time a computer has before being truly obsolete seems much longer now than it used to be. You could easily get a decade or more if you can control the itch for new and shiny and have modest performance needs.

Might need to replace the battery, if the device has one. There's some luck involved with getting the longest support window possible from MS or Apple. Google and co are famously a lot worse on this front, if we're talking phones.

By @makeitdouble - 6 months
This feels like a really interesting idea that would benefit from a real world use case.

As it is now, the main motivation is "save your attention, your wallet, your creativity, your soul and the planet", which to me sounds like no specific purpose, making it hard to imagine whether the result will have any use in 50 years.

As a comparison point, this story about warning signs [0] makes the challenges a lot more tangible, and that's how I'd see any chance of success for a product design.

And that makes me wonder how many actual long-lasting computer projects already exist in the world, for instance to control nuclear reactors, to activate water pumping stations, to control emergency valves, etc.

[0] https://99percentinvisible.org/article/beyond-biohazard-dang...

By @gladiatr72 - 6 months
https://hackaday.com/2019/12/06/visiting-the-facom-128b-1958...

I was quite impressed to learn about the 66-year-old computer that is still in use with the Japanese transit system.

By @johnklos - 6 months
The oldest computer that I run regularly is an Amiga 3000. It's now 33 years old, yet the system has everything it needs to run lots of excellent and modern software: 16 megs of memory, a SCSI bus that can take drives of any size, decent resolution graphics that can be used on many modern monitors and TVs, a processor with an MMU, and a good keyboard and mouse.

Add an ethernet card and perhaps a Zorro III RAM card, and it's usable even on the modern Internet: modern TLS works, and for sites that are too complex for AmigaDOS browsers, there are public proxies that can help.

While I wouldn't suggest anyone try to get serious work done on the modern Internet using an original, unaccelerated Amiga 3000, it makes an excellent example of how things really haven't changed, aside from speed and size, since we moved to 32-bit CPUs with MMUs.

Something like this could easily be used for non-Internet heavy tasks for fifty years. We just need to be aware of the things that typically fail, such as bearings and capacitors.

By @themadturk - 6 months
I couldn't help thinking of the AlphaSmart[0] while reading this. The writer's primary need seems to be an offline, lasts-forever writing device, though no version of the AlphaSmart meets all the criteria. But it is (or was) an offline-only device that was limited to writing and a few educational applications. The keyboard was excellent, text could be transferred between device and computer via cable, and the AA batteries would last for literally hundreds of hours.

[0] https://en.wikipedia.org/wiki/AlphaSmart

By @recursivedoubts - 6 months
I like to think about thought experiments like this: what if electronics/large consumer goods were all bar-coded and, when they are disposed of, scanned in, with the original manufacturer charged some fee for their recycling/disposal? Make "repairing with minimal waste" the recurring revenue that product companies shoot for, rather than the new new thing.
By @anthk - 6 months
> But this permanent connectivity is a choice. We can design a computer to be offline first. Once connected, it will synchronise everything that needs to be: mails will be sent and received, news and podcasts will be downloaded from your favourite websites and RSS, files will be backuped, some websites or gemini pods could even be downloaded until a given depth. This would be something conscious. The state of your sync will be displayed full screen. By default, you would not be allowed to use the computer while it is online. You would verify that all the sync is finished then take the computer back offline. Of course, the full screen could be bypassed but you would need to consciously do it. Being online would not be the mindless default.

Offpunk. Slrn with slrnpull and mutt +mbsync/msmtp. Heaven.

Offpunk:

https://sr.ht/~lioploum/offpunk/
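
A minimal sketch of what that conscious sync cycle could look like with the tools mentioned above (the offpunk and nmcli invocations are assumptions; check them against your own setup):

```python
# Offline-first sync cycle: go online, sync everything, go back offline.
# The exact commands/flags below are assumptions for illustration.
import subprocess

SYNC_STEPS = [
    ["nmcli", "networking", "on"],    # consciously go online
    ["mbsync", "-a"],                 # fetch mail for mutt
    ["offpunk", "--sync"],            # refresh cached web/gemini content
    ["nmcli", "networking", "off"],   # offline is the default state again
]

for step in SYNC_STEPS:
    print("running:", " ".join(step))
    if subprocess.run(step).returncode != 0:
        print("step failed, stopping sync")
        break
else:
    print("sync finished -- the computer is offline again")
```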

By @nickpsecurity - 6 months
When I studied ASIC design, it taught me that complexity increases and reliability decreases every time they shrink a process node. There's increased variation, degradation, etc. One paper on the move from 130nm to 90nm said they had to build digital monitoring or correction into the analog parts, while cell phones were mixing in analog to save energy (mixed-signal).

The point is that having the reliability of older computers might require the older, simpler hardware they were using. More breakage might be an inherent tradeoff of using modern processes, at least up to a point.

I still have always wanted to see someone mix low-cost components with the older, patent-free NonStop architectures. The desktops would look like the dual-motherboard SGIs with pluggable CPUs, etc. Just replace what breaks, with the system chugging along using the other components.

By @dave333 - 6 months
Desktop PC hardware is sufficiently modular and easy to upgrade. It would be nice if upgrades were add-ons rather than throwing away the replaced module(s), but HW changes so fast it's almost never worth it. Software could be improved to make things longer lasting, such as making a clean install trivial, with good separation of user vs. system data. I recently added an SSD and made my old HDD the G: drive, but the new instance of Windows on the SSD did not consider the new SSD user ID with the same name as the old HDD user ID to be the same user, so accessing the old files became a file-sharing nightmare. Also, the old HDD started taking forever for reboot file system checks and I had to just disconnect it. So now I am wading through all my old backups trying to figure out what is what.
By @loloquwowndueo - 6 months
I have a Tandy TRS-80 Model 100, which is at least 40 years old. Not so far from the 50-year mark.
By @jwrallie - 6 months
I have been thinking about this topic for some time, but my focus was a bit different. I wanted to know what it would take to make a computer where the state is always preserved, meaning reboots are never required and, most importantly, whatever you have been working on is always exactly the way you left it.

Somehow I think the current culture of updates, which is linked to security requirements and ultimately to the fact that your computer is always connected to the internet, is fundamentally incompatible with that, so this kind of computer would also need to be offline. But certainly there are tricks, like live kernel updates, that could be employed to extend uptime as long as possible.

By @d_silin - 6 months
A good laptop will last for 5-10 years, about as long as a car, I guess. 20-year-old laptops (Thinkpads, mostly) are still around.

If average laptop lifetime is about 5 years (for all reasons), then about 0.1% will make it to 50 years and remain operational.
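
For what it's worth, that 0.1% is what you get if you treat 5 years as a median lifetime, i.e. a half-life (an assumption, since the comment doesn't state a model); a constant-hazard model with a 5-year mean would leave far fewer survivors:

```latex
% 5-year half-life: fraction still operational after 50 years
P(T > 50) = \left(\tfrac{1}{2}\right)^{50/5} = 2^{-10} \approx 0.001 \approx 0.1\%
% Constant-hazard (exponential) model with mean lifetime 5 years
P(T > 50) = e^{-50/5} = e^{-10} \approx 4.5 \times 10^{-5} \approx 0.005\%
```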

By @K0balt - 6 months
I think the trick here is going to be using, at a minimum, only automotive-rated parts or better: FRAM storage if possible, tantalum and ceramic caps, things like that.

Speculatively, with AI moving along as it is, a “computer” might be very much like a typewriter, primarily a device for creating documents, getting them to peripherals, etc for the human user and their AI ghost/API.

With thermodynamic neural nets, especially if we can get them working at room temperature, we could easily see a situation where it would be more cost and power efficient to have local generative AI simulate standard computer architecture than to actually build a Von Neumann computer from discrete components.

If we can get the thermodynamic wells down to the size of flash cells, that could mean running 1024b models locally on chips the size of an SD card, peaking at around 20 watts at full utilisation.

I could easily see using MCU-scale compute to run a stripped-down system to provide the wireframe on which the GAI could hang the pixels and pizzazz, helping the user to stay on the rails of a strictly deterministic system while decorating it with GAI. A "50 year computer" might be useful standalone, but would basically be an interface device when combined with generative AI running on devices that would probably be much more needful to keep current.

By @FuriouslyAdrift - 6 months
MOCAS is still going from 1958... Hardware has been updated a few times, and it currently runs on an IBM 2098 model E-10 mainframe (2008?)

https://www.technologyreview.com/2015/08/06/166822/what-is-t...

By @kccqzy - 6 months
The Voyager spacecraft are almost fifty years old.
By @tony-allan - 6 months
This is a great thought experiment!

The design goal is to build a computer that lasts 50 years. To me this implies a design that is modular and repairable and possibly not based on something you can buy today. I don't want to base my computer on the products that existed 50 years ago or the products I can buy today.

What would I give up in order to get a computer (hardware and software) that lasts 50 years? Size, weight, speed, complexity. Sure.

We now know a lot about change, so I need a device that accounts for the fact that almost every technology I use today will have evolved significantly. So I need some long-term features.

I want to think in terms of modules, which may be independent physical things. I also want a case to put it all in.

Over the next 50 years I (and my grandchildren) need to be able to repair and replace any part that breaks and continue to evolve the modules that I use, the case and the way the modules interact with each other. My needs will continue to evolve. The rest of the world will continue to evolve around me and I still want to interact with it and its services.

I think some things are constant. I need power; a way to input data; process and store it; usefully share it with others; and a way to output that data.

My modules may therefore include a keyboard, some sort of pointing device and potentially other input devices in the future; a power supply; a bunch of CPUs for various purposes in one or more modules; a set of storage and archive devices; networking; and one or more output devices, perhaps a screen or two.

Perhaps the most important thing is an idea, philosophy and a clear idea of what I want the device to do. The article talks about typewriters which are clear on each of these points. I also like the idea that I will need an emotional investment in whatever I end up with.

If I wanted to experiment today I would start with a bunch of Raspberry Pis and their kindred microcontrollers. Each of my modules would contain one or more of these devices. I would pick a set of connection standards. I don't know where the ideas go from there, but it would be fun to find out!
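
One hypothetical flavour of such a connection standard is "plain text over a boring transport": newline-delimited JSON that any future replacement module could still speak. A minimal sketch in Python (the module names and message fields are invented for illustration):

```python
# Hypothetical "module bus": one JSON message per line over any byte stream.
import json
import socket

def encode(sender: str, kind: str, payload: dict) -> bytes:
    """One message per line, so a future tool (or a human) can still parse it."""
    return (json.dumps({"from": sender, "kind": kind, "payload": payload}) + "\n").encode("utf-8")

def decode(line: bytes) -> dict:
    return json.loads(line.decode("utf-8"))

# Loopback demo: a "keyboard" module sends a keypress to a "storage" module.
kbd, storage = socket.socketpair()
kbd.sendall(encode("keyboard-1", "keypress", {"key": "a"}))
print(decode(storage.makefile("rb").readline()))
# -> {'from': 'keyboard-1', 'kind': 'keypress', 'payload': {'key': 'a'}}
```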

By @S_A_P - 6 months
I’m going on 7 years with my iMac Pro now and it’s still more than enough for my uses. (Audio recording/production) I am hoping to get 3 more years out of it if possible. We’ll see if Apple lets that happen.
By @everyone - 6 months
Voyager 1 and 2 are still functioning, and they were built on the cheap. They certainly weren't supposed to last 47 years, but they did.
By @hyperman1 - 6 months
We're regressing. My last computer was a ThinkPad X220T, with a life of 10 years. After that I bought an IdeaPad. It is now in its 3rd year, and the hinges between body and screen have already broken off and been reglued in 3 different places. That last repair was once too many, and I see myself buying a new laptop in the near future.
By @ted_dunning - 6 months
Surprisingly, nobody has mentioned missile silos.

Talk about a killer app.

https://www.defensenews.com/air/2019/10/17/the-us-nuclear-fo...

By @causality0 - 6 months
> By contrast, we have to change our laptops every three or four years. Our phones every couple of years. And all other pieces of equipment (charger, router, modem, printers, …) need to be changed regularly.

This is such a horseshit statement. We change those things because of social pressure, not because they wear out. My mother is still using her first-generation iPhone SE, eight years later. It still FaceTimes and texts and watches Netflix just like it did in 2016. The Nighthawk R7000 router I bought 11 years ago still isn't fully saturated by my network traffic. I have USB chargers in use that came with phones I bought in 2009. My HP printer/scanner is from 2005 and they still make cartridges for it.

By @Apreche - 6 months
The Apple IIGS exists. I have one. I think it's going to make it to 50, no problem.
By @rjakobsson - 6 months
I really vibe with the author’s vision: an offline-first computer, made to last.
By @flobosg - 6 months
(2021)
By @miohtama - 6 months
Voyager is still going, and has a computer by a very early definition.
By @kva - 6 months
Someone needs to bring up the indestructible Nokia.
By @speedbird - 6 months
PDP-11