Human neuroscience mustn't forget its human dimension
Human neuroscience is advancing with AI, with a focus on studies of patients who are awake during brain surgery. The emphasis is on ethics, consent, and data privacy: the aim is personalized treatment while upholding ethical standards and participant well-being in research.
Human neuroscience is advancing with innovative technologies such as artificial intelligence, allowing for detailed studies of the human brain. The research emphasizes the importance of considering the human dimension in studies involving awake individuals during brain surgery. The field aims to better understand speech production, cognitive abilities, and neurological disorders. Ethical considerations, such as consent procedures and data privacy, are crucial in human neuroscience research. Researchers are urged to uphold ethical standards, ensure participant involvement, and address the implications of AI and machine learning in data analysis. The focus is on developing personalized treatments while respecting the dignity and rights of the individuals involved. The authors call for improved data ethics, shared responsibility among stakeholders, and ongoing informed-consent processes to safeguard participants' well-being and privacy. Human neuroscience is poised for significant advances, but success hinges on maintaining ethical practices and prioritizing the human aspect of research.
Related
I don't think we live in any dystopian future or Black Mirror episode yet, but I do think that if a character in one of those episodes had a flashback to pre-cyberpunk life, it'd look quite a bit like 2024.
Are consumer protections even the right way to navigate this? Could the American government reasonably handle this? Presumably it'd be the NIH and its peer organizations doing this, and I trust them a reasonable amount.
The sentiment is admirable but there isn't a govt or company in the world that sees mind control devices and thinks "how can we use this to protect individual rights?"
I'm kind of surprised by this take and ultimately think this article wasn't particularly interesting.
Here's my take on the cross-section between neuroscience and AI: https://bower.sh/who-will-understand-consciousness
Ultimately, I don't think we are going to discover consciousness in chaotic biological systems. Rather, we will find it in silicon.
https://humanrights.gov.au/our-work/technology-and-human-rig...
>>Without a doubt, human neuroscience is entering a new and important era. However, it can fulfil its goals of improving human experiences only when study participants are involved in discussions about the future of such research.
OK, now for what I'm really curious about. I would ask that we take "AI is a big deal now that the frame problem is tractable, and LLMs are shockingly good at encoding human brain activity" for granted in this discussion, if possible:
Is anyone else way more scared than they've ever been about (science) news, and feel unable to read stories that are objectively interesting and important to you because they remind you of the uncertainty to come? Even if we abstract away all the economic automation, aesthetic and cultural conflict, and ethics/safety concerns of the new tech… we are reading people's minds. Sure, it's mostly in fMRIs for now, and it's clearly in its infancy (they've only had since early 2023 to work on it in earnest), but I think it's moving at lightning speed for the life sciences. For example, look into the explosion of consumer/prosumer/freelancer BCI tech, such as EEG headsets (the g.tec Unicorn being my hacker fave, $1000 for a modular Bluetooth headset) and fNIRS visors (Muse being the big player in the consumer space; the prosumer space is still coming down in price to feasible levels). For another example, this paper is what first alerted me to this underreported situation (though now that there's a whole issue of Nature on it, I imagine it'll gain prominence):
https://arxiv.org/abs/2309.14030v2
So, anyone have advice? Again, other than "you're wrong, it's not a big deal", please :)
I refer back to my comment from a few days earlier: