Why should anyone boot *you* up?
Onur Solmaz's blog post discusses the philosophical and economic challenges of future brain emulation technology, questioning the rationale for reviving individuals whose skills may become obsolete over time.
The blog post by Onur Solmaz explores the philosophical and practical implications of future brain emulation technology. It presents a hypothetical scenario where an individual's brain is scanned and stored after death, but the technology to emulate this brain becomes available only a millennium later. The author questions the rationale behind reviving such a brain, considering the potential costs and the likelihood that the individual would be unable to adapt to a drastically changed society. Skills and knowledge may become obsolete, rendering the revived person helpless and dependent on others for education and survival. The discussion also touches on the challenges of updating a brain scan with contemporary knowledge and whether such an updated version would still represent the original individual. Furthermore, the post highlights the immense data storage requirements for a complete brain scan and the potential loss of information during the scanning process. Ultimately, Solmaz argues that the most significant barrier to digital immortality may not be technical but rather economic, as there may be little incentive to revive individuals from the past.
- The feasibility of reviving a brain scan in the future raises economic and philosophical questions.
- Skills and knowledge may become obsolete over time, making revived individuals dependent on others.
- Updating a brain scan with contemporary knowledge challenges the notion of personal identity.
- The data storage requirements for a complete brain scan are immense, and the scanning process itself may lose information (a rough sizing sketch follows this list).
- Economic incentives may be the primary barrier to the implementation of brain emulation technology.
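The storage point is easy to sanity-check with a back-of-envelope estimate. The sketch below is purely illustrative and uses assumed round numbers (roughly 86 billion neurons, 1,000 to 10,000 synapses per neuron, 8 bytes of state per synapse); none of these figures come from the post itself, and it ignores compression, neuron state, and any dynamic activity a faithful scan might also need.

```python
# Back-of-envelope estimate of raw storage for a synapse-level brain scan.
# Every figure here is an assumption for illustration, not a value from the post.

NEURONS = 86e9                          # commonly cited count of neurons in a human brain
SYNAPSES_PER_NEURON = (1_000, 10_000)   # assumed low and high bounds
BYTES_PER_SYNAPSE = 8                   # assumed: connectivity, weight, a little state

def scan_size_bytes(synapses_per_neuron: float) -> float:
    """Raw size of a connectome-style scan, ignoring compression and dynamic state."""
    return NEURONS * synapses_per_neuron * BYTES_PER_SYNAPSE

for label, spn in zip(("low", "high"), SYNAPSES_PER_NEURON):
    petabytes = scan_size_bytes(spn) / 1e15
    print(f"{label} estimate: {petabytes:.1f} PB")

# Prints roughly:
#   low estimate: 0.7 PB
#   high estimate: 6.9 PB
```

Even under these generous simplifications, the raw connectome alone lands in the petabyte range per person, which is consistent with the post's point that storage, and whatever gets discarded to fit within it, is a real constraint.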
BORING!
The work I do is appreciated by those around me. It isn't world changing, it doesn't need to be. I am happy doing my work, other people are happy with the work I do. When I am done with work I am done with work and do the mediocre stuff that also makes me happy. When the time comes, I'll shut down and that will be enough. Thank you very much.
The scenario presupposes that humans would need to contribute some kind of value in order to justify their existence. I personally doubt any human being would be able to generate enough meaningful value to justify the operating expense of their existence. Computing resources that could be used to run human.exe would almost certainly be better used to run some other program. So in such a world, no human brain would be booted up; it would be entirely populated by other programs that are busy contributing whatever value the misaligned system requires.
Once we dispense with the assumption that a human's value would need to outweigh the cost of their existence, we can more easily answer the original question. Humans would clearly have some right to exist in this scenario; we just need to make sure that it extends to humans who are already dead but sufficiently scanned to be recreated.
Eventually some rich guy thaws him out so he can learn more about his interviews and he goes on from there trying to cure-and-revive his wife.
IIRC he keeps freezing and thawing himself throughout millennia... can't recall the name of it though! Arg.
Nobody is going to have any practical use for your brain. The only reason you'd be brought back is if we build and maintain a society that values human life enough to breathe it back into your decrepit, worthless neurons.
You can't even rely on languages being somewhat similar (French, Spanish, Portuguese and Italian are close enough that, given the amount of change, you probably can get away with learning just one of them; if you were really reductionist you could take one Indo-European language, one Turkic language, etc). Even being maximally reductive, the number of languages you need to learn is still 9, 10 if you go by the upper estimates (the Tai-Kadai family, represented by Thai, being optional). That's a lot, especially given that this metric lumps languages as separate as English, Bengali, and Farsi together, or Hausa, Arabic, and Amharic (and Maltese, for what it's worth). And that's just to have a decent shot at understanding a thousand-year-old version of the lingua franca!
> Given that running a brain scan still costs money in 1000 years, why should anyone bring you back from the dead? Why should anyone boot you up?
I don't think that's really the crucial question. In my view, the crucial question is why would you want to be booted up?
* Some people will use cryonics services that set aside and invest money in order to pay for reviving people, so specific resources could be available for reviving those people
* Future people may be curious for archaeological and historical purposes (I'd certainly like to be able to interview that medieval stonemason!)
* Future people might see it as a form of charity (a way of voluntarily helping helpless people)
Of course these are all assuming a future in which a revival technology exists and is meaningfully applicable to people whose brains were preserved in the present day.
- Proof: At first, dead people will be resurrected just to prove that it can be done.
- Research: The dead person was a witness to / participant in a historically significant event about which historians want to learn more. Or is a semi-random choice for learning about life at a particular time and place.
- Fame: The dead person is a historically significant figure and people want to meet them / profit from them.
- Connection to a living person: The dead person is identified as a direct ancestor of someone alive and they are curious to meet them, or they feel a responsibility to bring them back.
- Connection to the resurrected: Resurrected people will want to bring back their family and friends, and may push those in control to make this happen.
- Ethics: Some people and cultures may come to think they have a moral obligation to resurrect as many people as possible.
- Just because: Someone with the means to do so thought it would be cool to bring a random person back and see their reaction.
An easy way to verify this is to remove the death condition. Do you think you are suddenly experiencing both realities? The computer version can clone itself too. It need not experience the realities simultaneously, and almost certainly won't. Though that isn't to say it couldn't recombine the knowledge (or even upload it to the human). But none of that is "you".
But as to the article, I'm sure random people would be booted back up. Though I'm pretty sure that if it's an accurate representation of me, it wouldn't want to exist as just a tool for historians (a proxy for interviewing people of the past). So maybe be careful booting "me" back up.
I would personally love to boot up all my ancestors, going as far back as I could reasonably afford. China has a long tradition of "filial piety", and I think there is a little bit of this inclination in everyone.
Also I could easily see a type of "chain effect" where each generation wants to keep its parents around as long as possible. For example: I want to boot up my dead mom because I love my mom, my mom wants me to boot up her mom because she loves her mom, and on and on. As long as humans keep their affinity for their family I can see a desire to boot up minds.
It's similar to the desire that most people have to see their grandkids grow up. Most people want to keep as many members of their direct lineage in their life as they can.
These days it appears that Idiocracy was optimistic with its 500-year timeline, and we'll get to the dysfunctional civilization much sooner and without the sci-fi advances that would make life comfortable. Of course, without that sci-fi science there would be no way to boot people up, so the point is moot.
But that requires a few massive assumptions, including that the kind of novelty desirable to a future society might be found in the brains of individuals from today, and that such novelty could not be generated in some other, cheaper manner.
I suspect that if we can recreate a human based on data, we can also teach them pretty quickly. Take a language course?
Also, brains are complex systems in action, with vast quantum data and momentum that would be infeasible to measure, let alone store or recreate.
A brain scan, no matter how detailed, is unlikely to provide the information necessary to reproduce a mental state.
Two works that come to mind:
- the videogame Soma
- "Accelerando," by Charles Stross
As to the article’s premise that this technology constitutes resurrection, I’m not convinced that a post-neural-cloning society in the year 3000 would lack the automated resource access to flip the question to “why not?”
We’re already seeing population growth slowing or reversing in populations with mere modern levels of abundance, and we don’t even have The Good Matrix yet! And that’s basically paired with the tech that enables digital mind clones - fully digital interfaces between mind and sensory data / bodily functions, so you can make a virtual world to exist in instead of wiring up a bunch of prosthetics. So you won’t need any non-fungible material resources to satisfy your every material desire.
2. To laugh at you
3. To find out things about the period in which you lived
4. To experience this thing called feelings
5. For lulz (if that's the right term)
6. To get the PIN number
If progress continues, laws will be developed to mandate minimum compute resources available for simulants, like a basic right to life for biological people.
The basic premise of the article is: "Given that running a brain scan still costs money in 1000 years, why should anyone bring you back from the dead? Why should anyone boot you up?"
The answer is that, if so, something has gone awfully wrong. Understanding cognition and the architecture of human minds has several parallel tracks, some of which should result in what amounts to brute-forcing the problem. It's not unreasonable to expect several of those tracks to converge on a solution within the next five decades - breakthroughs could significantly speed up that timeline, and molecular-scale imaging is being pursued for many reasons beyond brain science.
This is a bit like asking "what if we only have ICE cars in 1000 years?"
Technology - especially compute - is on a persistent trajectory, and within 150-200 years, the atomic wizards and tamers of rock lightning should have us roughly halfway to the Landauer limit in terms of compute efficiency. It's hard to imagine things will come to a screeching halt and we'll just live with early 2020s chip technology, give or take, for the rest of human history. We could even optimistically hope that we'd have overcome our collective nimbyist anti-nuclear idiocy and were on totally renewable and nuclear energy, with a thriving, nascent spaceborne civilization working out in the solar system.
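For what it's worth, the Landauer limit mentioned here is straightforward to compute. The sketch below is a rough, assumption-laden comparison: the Boltzmann-constant formula is standard physics, but the "current hardware" figure of about 1e-12 J per 32-bit operation is an assumed ballpark plugged in for scale, not a number from this thread.

```python
import math

# Landauer limit: minimum energy required to erase one bit at temperature T.
K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assume roughly room temperature, K

landauer_j_per_bit = K_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer_j_per_bit:.2e} J per bit erased")
# ~2.87e-21 J per bit

# Assumed ballpark for present-day hardware (illustrative only):
# on the order of 1e-12 J per 32-bit operation, i.e. ~3e-14 J per bit.
current_j_per_bit = 1e-12 / 32
print(f"Gap to the limit: ~{current_j_per_bit / landauer_j_per_bit:.0e}x")
# roughly seven orders of magnitude of efficiency headroom left
```

The point is only that, if you accept the comment's framing, there is enormous headroom between today's chips and that thermodynamic floor, so per-simulant compute costs would be expected to keep falling.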
You'd fire people up because you could - because there will be people who just like to explore the old stuff, and because, unless something goes wrong, booting up brain scan simulants is the moral and humane thing to do. Boredom, curiosity, legislative mandate, slave labor to populate your torment nexus, friends to hang out with in your virtual paradise - there are as many reasons as there are humans. Imagine the heists and treasure hunting possibilities... just spin up the bank CEO and get his passwords and knowledge of the system, or spin up the space pirate for the coordinates of the golden asteroid.
I think a better question would be "since it cost time and effort to scan a brain, and this person underwent the procedure willingly, isn't the only moral answer to boot them up?" By the time it becomes a meaningful question, the operational overhead should be trivial. Strange things will come up, like how many copies you're allowed to run, globally and locally, what and how you are to contribute back to society, if at all, and what the rights and responsibilities and citizenship and political valence are, and all those things. I can't fathom a future where it's ever a matter of "hey, there's a 1000 year old copy of a person here, but nah, we're so bored and jaded, why even bother."
Related
A Model of a Mind
The article presents a model for digital minds mimicking human behavior. It emphasizes data flow architecture, action understanding, sensory inputs, memory simulation, and learning enhancement through feedback, aiming to replicate human cognitive functions.
Artificial consciousness: a perspective from the free energy principle
The article explores artificial consciousness through the free energy principle, suggesting the need for additional factors beyond neural simulations to replicate consciousness in AI. Wanja Wiese emphasizes self-organizing systems and causal flow's role in genuine consciousness.
China Brain
The "China brain" thought experiment questions if a collective of individuals could simulate a human brain's function, raising debates on consciousness, artificial intelligence, and the nature of mental states.
A.I. Will Fix the World. The Catch? Robots in Your Veins
Ray Kurzweil's "The Singularity Is Nearer" predicts AI will surpass human abilities by 2029, discusses nanobots for health enhancement, and raises ethical concerns about technology's impact on humanity and mortality.
Don't Enslave Digital Minds
Robin Hanson discusses the future of digital minds, highlighting potential cultural evolution, concerns about control resembling slavery, and the need for a Malthusian approach to promote flourishing for all beings.