June 24th, 2024

Human neuroscience mustn't forget its human dimension

Human neuroscience is advancing with AI, with a focus on studies of awake individuals during brain surgery. The article emphasizes ethics, consent, and data privacy, and aims for personalized treatments while upholding ethical standards and participant well-being in research.

Human neuroscience is advancing with innovative technologies such as artificial intelligence, allowing detailed studies of the human brain, including research on awake individuals during brain surgery. The field aims to better understand speech production, cognitive abilities, and neurological disorders, but the article stresses that the human dimension of such studies must not be overlooked. Ethical considerations, such as consent procedures and data privacy, are crucial: researchers are urged to uphold ethical standards, ensure participant involvement, and address the implications of AI and machine learning in data analysis. The goal is to develop personalized treatments while respecting the dignity and rights of research participants. The article calls for improved data ethics, shared responsibility among stakeholders, and ongoing informed-consent processes to safeguard participants' well-being and privacy. Human neuroscience is poised for significant advances, but success hinges on maintaining ethical practice and prioritizing the human aspect of research.

14 comments
By @sleepingreset - 4 months
Cybernetic enhancement of human minds and bodies is, reasonably, a decade or two away from becoming mainstream. Leading up to that, I wonder if there will be enough effort put into consumer protections so that it doesn't turn into a knock-off cyberpunk novel.

I don't think we live in any dystopian future or Black Mirror episode yet, but I do think that if a character in one of those episodes had a flashback to pre-cyberpunk life, it'd look quite a bit like 2024.

Are consumer protections even the right way to navigate this? Could the American government reasonably handle this? Presumably, it'd be the NIH and its peer organizations doing this, and I trust them a reasonable amount.

By @JohnMakin - 4 months
Since the article mentions Leborgne's story as a preface but fails to describe what is significant about it, here it is: https://www.sciencedirect.com/science/article/abs/pii/S18788...
By @will1am - 4 months
I think it is especially vital to maintain a strong focus on ethical considerations in this field.
By @staplers - 4 months
The same companies that test self-driving cars on local populaces before getting consent?

The sentiment is admirable but there isn't a govt or company in the world that sees mind control devices and thinks "how can we use this to protect individual rights?"

By @qudat - 4 months
Based on the title, I was waiting to see mention of ML/AI, but it quickly went down a path I didn't expect: the potential downsides of having AI de-anonymize data that was previously anonymized.

I'm kind of surprised by this take and ultimately think this article wasn't particularly interesting.

Here's my take on the cross-section between neuroscience and AI: https://bower.sh/who-will-understand-consciousness

Ultimately, I don't think we are going to discover consciousness in chaotic biological systems. Rather, we will find it in silicon.

By @somesortofthing - 4 months
Seems like this article is putting the cart before the horse - what recent breakthroughs have there been to justify saying that we're entering some new era of neuroscience, or that AI has anything to do with that new era?
By @abarker - 4 months
The Australian Human Rights Commission discussed many of these issues in its March report, "Protecting Cognition: Background Paper on Neurotechnology":

https://humanrights.gov.au/our-work/technology-and-human-rig...

By @amatic - 4 months
Where is the author's name? Strange editorial. Neuroscientists should "involve participants"? What if they cannot speak, like the man in whom Broca's area was found?

>>Without a doubt, human neuroscience is entering a new and important era. However, it can fulfil its goals of improving human experiences only when study participants are involved in discussions about the future of such research.

By @bbor - 4 months
First, a disclaimer: this is a great article on an important moral issue, and it reminded me of the fantastic museum the Germans set up at the Charité in Berlin, a longstanding medical institution that obviously played a significant role in murderous Nazi “science”, and which still deals with people sacrificing their well-being to science via testing and tissue donation to this day. Definitely do not skip it if you’re ever in town.

Ok now what I’m really curious about — I would ask that we take “AI is a big deal now that the frame problem is tractable, and LLMs are shockingly good at encoding human brain activity” for granted in this discussion, if possible:

Is anyone else way more scared than they’ve ever been about (science) news, and feeling unable to read stories that are objectively interesting and important to you because they remind you of the uncertainty to come? Even if we abstract away all the economic automation, aesthetic and cultural conflict, and ethics/safety concerns of the new tech… we are reading people’s minds. Sure, it’s mostly in fMRIs for now; it’s clearly in its infant stages, and they’ve only had since early 2023 to work on it in earnest, but I think it’s moving at lightning speed for the life sciences. For example, look into the explosion of consumer/prosumer/freelancer BCI tech, such as EEG headsets (g.tech Unicorn being my hacker fave, $1000 for a modular Bluetooth headset) and fNIRS visors (Muse being the big player in the consumer space; the prosumer space is still coming down in price to feasible levels). For another example, this paper is what first alerted me to this underreported situation (though now that there’s a whole issue of Nature on it, I imagine it’ll gain prominence):

https://arxiv.org/abs/2309.14030v2

So, anyone have advice? Again, other than “you’re wrong it’s not a big deal”, please :)

By @Razengan - 4 months
As with all knowledge and technology, people will find ways to use it for money and power, until there is a fundamental shift in human society.
By @seydor - 4 months
I think mind uploading will happen first, and tbh I don't care about my carbon copy.
By @jampekka - 4 months
The previous era showed that dead fish get emotional when they see pictures of humans. Can't wait to see what will be shown in the new era, where you throw in an additional billion or so parameters to predict a dozen observations.
By @vunderba - 4 months
The wild and unsubstantiated claims in this thread, from "cybernetic enhancements" to "mind uploading", are so hand-wavy we could start an entire jazz-hands quartet.

I refer back to my comment from a few days earlier:

https://news.ycombinator.com/item?id=40733615#40734638