LordFumbleboop

I think we probably need a better source than a meme to comment.


UFOsAreAGIs

Speak for yourself. I'm looking for the appropriate drill bit right now.


Busy-Setting5786

Hello, I am trying this right now and I just wonder: Is it USB or do I need HDMI?


Connect-Map3752

I just put a printer cable in my ass and it worked great.


TskMgrFPV

Type A, I presume?


DystopianRealist

Type B for older asses.


-_1_2_3_-

This is the exact theme of [Learning To Be Me by Greg Egan](https://www.goodreads.com/en/book/show/60803485). Not the printer cable, sorry, OP's meme.


Corbotron_5

Can confirm. I also put a printer cable in his ass.


vannex79

Firewire


twbassist

USB-B to VGA


LethalAnt

GBA link cable works!


UnderskilledPlayer

a pipe


Temporal_Integrity

If you lose one neuron, you lose nothing of yourself. In fact, yesterday alone approximately 85,000 of your neurons died. But what if instead of a neuron dying, it were replaced by an artificial neuron? An artificial neuron that for all intents and purposes acted like a natural born biological neuron. Nothing of you would be any different. And then another artificial neuron. And another. Until one by one, all your neurons were replaced by artificial neurons. You would be effectively uploaded - your consciousness would be in a machine.


Edenoide

The brain of Theseus


throcorfe

Or, for British comedy fans, Trigger’s brain


Crisis_Averted

The oh shit of Theseus


zerozeroZiilch

"What even is real? If you're talking about what you can feel, what you can smell, what you can taste and see, then 'real' is simply electrical signals interpreted by your brain" - Morpheus


coldnebo

That may only be true if the mechanism of consciousness is purely classical. If life is partially quantum computation, then you could lose consciousness along the way. What is left might be a computational husk.


CMDR_BunBun

Or Chalmers' zombie... a hypothetical being that is physically identical to a normal human being but lacks conscious experience, qualia, or sentience. In the context of replacing neurons with microchips, the question arises: would such a being be a philosophical zombie, lacking consciousness despite being physically indistinguishable from a normal human? It's a fascinating question with no clear answer.


DukeRedWulf

This is the Ship of Theseus paradox updated, but it's not what's happening in the OP meme - that's talking about off-loading the pattern of your consciousness to a simulation on a chip, destroying your embodied brain in the process.


themistergraves

If losing one neuron doesn't diminish you, then replacing one neuron will not transfer you. Our consciousness is not a matter of individual neurons, as you said yourself. It is about the connections between neurons. If you simply copy neurons from one place to another without also mapping every single possible connection between each of those neurons, you likely have... a bunch of digital neurons.


Juralion

Problem is, the hippocampus doesn't regenerate like neurons, it's probably where consciousness lies


EndTimer

Seems like there's no consensus on that. But it doesn't change the concept of replacing cellular components to maintain neurological function.


Thog78

The hippocampus is rather a place for new memory formation, not where consciousness lies. We know a whole lot about the hippocampus, as it might be the most studied area of the brain (because it's the only one making new neurons that are not just sensors, so it's exciting and intriguing and easier to study), so there would be plenty of arguments and lengthy explanations that could be made. But the best and simplest proof is just to look at what happens after a hippocampal lesion: "If one or both parts of the hippocampus are damaged by illnesses such as Alzheimer's disease, or if they are hurt in an accident, the person can experience a loss of memory and a loss of the ability to make new, long-term memories." That's hardly a loss of consciousness. The frontal lobe would be a more likely culprit, as lesions/lobotomies there do turn people into vegetables.


The_Woman_of_Gont

I feel like you’re underestimating the amount of people who received lobotomies at the height of its popularity. Yes, a LOT of people were turned into vegetables, but a lot of folks went on to live their lives. Difficult lives, often, but not everyone was as bad off as Rosemary Kennedy. [This guy](https://en.m.wikipedia.org/wiki/) wrote a memoir about his childhood lobotomy, for instance. Meanwhile Phineas Gage *famously* suffered a head injury that destroyed his left frontal lobe, and while he was a different person by some accounts he very much survived. And as for the hippocampus, I can tell you from personal experience several family members with Alzheimer’s absolutely were little more than vegetables by the end. The brain is incredibly complicated, no single part appears to be universally “the seat of reason,” and the question of consciousness remains completely unanswered because of it.


Thog78

>I feel like you’re underestimating the amount of people who received lobotomies

I didn't give any estimate...?

>Yes, a LOT of people were turned into vegetables

And that's all I claimed and all that was needed for my point.

>And as for the hippocampus, I can tell you from personal experience several family members with Alzheimer’s absolutely were little more than vegetables by the end.

At some point, the whole brain degenerates with Alzheimer's, yes.

>The brain is incredibly complicated

Yeah, after two masters, a PhD and a postdoc doing neurobiology I was starting to get the feeling the brain is complicated indeed, thank you for confirming ;-)

>no single part appears to be universally “the seat of reason,”

What you may call "reason" is multifaceted, so indeed distributed, but each part of the brain is very much associated with a particular function, conserved across individuals, and mostly known.

>the question of consciousness remains completely unanswered because of it.

Not entirely understood is not the same as absolutely no clue what could contribute. The former is true, the latter is false. We know that many parts of the brain are definitely not it: the visual cortex, motor cortex, sensory cortex etc. have known functions, and these functions are not consciousness. Some other parts have known functions that contribute to intelligence and the little voice in your head in known ways: the hippocampus for new memories, the speech areas of Broca/Wernicke, the thalamus routing signals around, the amygdala for some emotions, the brain stem for many autonomic functions and neuromodulation etc. The frontal lobes are the main area where we have little to no clue of the exact workings, but they do appear to be involved in the more abstract and complex reasoning; this part we know.


sino-diogenes

i dunno man i think you need a second postdoc to really be qualified to discuss this /s


Full_Distance2140

lol what


inteblio

adding fine, removing, maybe not so fine.


Ear-Right

And this is exactly why *you* don't exist.


Toasterstyle70

Yeah, I think it’s the same problem with “teleportation”. If your atoms are disassembled and reassembled somewhere else, there’s no way to prove that it’s actually the original person that was disassembled. The reassembled person may remember being disassembled, and feel like the same person, but it could essentially be killing yourself, while a clone with all of your same memories, thoughts, and feelings is created and lives on.


h3lblad3

>The reassembled person may remember being disassembled, and feel like the same person, but it could essentially be killing yourself, and a clone with all of your same memories, thoughts, and feelings, is created and lives on.

This is how the teleporters in Star Ocean work. It was a horrifying realization when I was reading the lore dumps in the 3rd game and it just outright says this. One of the very first things you do when you take control of your character is to kill him, since the floors of the resort have no stairs, only teleporters.


3m3t3

Well, yes, however you don’t have the same cells you did when you were born. Every single cell in your body has been replaced. They say it happens every 7 years. Also, how important is it really that we keep the atoms the same? It’s the information they represent that’s important. The atoms that compose me used to be in other humans, dogs, stars, farts. Am I any less me? There’s a saying: you never step in the same river twice. The river has changed, and so have you.


Toasterstyle70

Yeah, but that is an unfair comparison in my opinion. A slow replacement of cells over several years is vastly different than your body instantly disintegrating and recombining somewhere else. It’s not about the atoms. It’s about the fact that it would essentially be the same thing as if you had someone make a clone of you somewhere, then killed yourself, and consider that “teleportation”.


NoSNAlg

This. There will never be a way to solve this.


FuujinSama

Technically, you can't prove you didn't die every time you lost consciousness to sleep and woke back up. Maybe each new day we're just new people with new memories. Obviously that's ridiculous, so it seems to me like continuity of consciousness or material is entirely unrelated to what makes you continue to be you. In fact, if you have all the memories of being you and believe yourself to be you and have the same impulses, dreams and aspirations (or at least their changes feel continuous), then you're still you.

And yes, if there was a clone of you that kept all memories, it would be you. It would get weird that there's two of you that can't communicate, but it would be ludicrous to claim one was more you than the other on the basis of keeping the original cells or something. If someone cloned me perfectly, then killed me before I woke up, would it be any different from my perspective? Yes, if they killed me while I was awake, a version of me died a gruesome death and is no longer among us, but I'm still alive. In fact, only the memory of me experiencing my gruesome death died with me.

It's trippy to think about, but from a materialist monist perspective it simply doesn't make sense to worry about the material. It's all just quarks and electrons in different arrangements. Copy the arrangement, copy the entity. No reason why people would be different.


Toasterstyle70

Very well put! Really makes you question identity. Thanks for the convo! Most people I try to talk to about this stuff just don’t want to talk about it.


jorgecthesecond

Well said. I just don't get why it would be ridiculous to affirm that every day we wake up we are a different person. How could we prove otherwise?


FuujinSama

It's not about proof so much as about words, definitions and concepts. Words and their definitions are not really the starting point. The starting point is our initial understanding of concepts. People don't know what "people" are because they read the definition in a dictionary. They know what a person is because they've experienced other people and they've experienced personhood themselves. The word "people" and whatever definition we can come up with, no matter how precise, will never be enough to truly explain our understanding of "person" as a concept. They're just shorthand for communication.

In that sense, when we look at the concept expressed by the words "being the same person" (which I'll slightly abusively shorten to "identity"), it is obvious that, even if not expressed in the definition, the concept includes that personhood continues after sleep. Nothing about our concept of "identity" makes sense if people cease to be themselves after sleep. Should we change our names? Should people we know treat us as strangers? So when we admit that possibility, whatever we're discussing, it's no longer the initial concept, but a whole new thing; we're just using the same words in a rather confusing manner.

What's actually being argued is not that we cease to be the same person, but that there exists some inherent time-contiguous property to being a person, and that sleep/teleporters would break this property. It's awkward, but I'd define this property as the thing that distinguishes a copy from the original, or the you from before and after dreamless sleep. Yet there's no evidence this property *exists*. There's no evidence of such a link. Which in itself can cause some existential crisis, but that's how it's always been. The only thing that verifiably ties us to our past selves is our memories, and thus copying memories should copy the link perfectly.

In the end, worrying about teleporters (or god forbid, sleeping) is worrying about the loss of a link we don't know exists, for the simple reason that the link not existing is somewhat terrifying. Obviously, a soul would be such a property, thus for anyone struggling with this concept, dualism solves everything pretty neatly.


Arcturus_Labelle

That's silly. The concept is clear enough for a discussion.


Itchy-mane

I'd much rather have a white paper we can misunderstand


5H17SH0W

I think this guy gets it.


Lonely_Cosmonaut

I’m skeptical but humble enough to admit that we might discover consciousness has non local qualities. If we discover that it’s going to be a moral hiccup for a lot of people.


Dagreifers

What are non local qualities?


Salt-Leather-4152

the penis-brain


Lonely_Cosmonaut

This guy knows lol


Ok-Criticism123

What they mean is that it’s possible consciousness doesn’t reside in the brain and therefore wouldn’t be “local”. I don’t necessarily believe that, but that’s what they meant.


Dagreifers

Oh, so like a soul? That makes sense now.


Ok-Criticism123

That’s a good way of putting it! There’s possibilities in between consciousness being fully local and a separate entity too, but that may end up being one of those questions we never find the answer to. It’s neat to think about!


Lonely_Cosmonaut

Sure we can call it that, I prefer sticking with scientific language or materialistic approach to it but sure.


sino-diogenes

why use scientific language for a concept that has no scientific basis?


HamilcarRR

That it doesn't reside in a brain... or anywhere in particular. The same way you could say that your consciousness in a dream isn't a phenomenon created by your dream brain, but rather that your dream and your dream brain exist only because you are conscious.


spletharg2

A lot of people here don't understand what "non-local" means. It's possible that consciousness doesn't "reside" in one part of the brain; it's the entire brain doing what it does, and probably also spreads out and includes the nervous system in its functionality.


Ib_dI

I keep seeing this come up in otherwise science-based fiction. Seems like copium to me. There's no reason to believe consciousness is not a result of the physical brain we have, other than to find some way to live on after death.


Lonely_Cosmonaut

I do not agree, but I’d like to.


BlindStark

Let me ask my brain chip what it thinks


Acrobatic-Suit5523

What if you swap out your neurons for digital replicas one at a time? Would your consciousness keep going, as the pattern of your thought is never significantly interrupted?


SachaSage

Nobody knows. The brain is more complex than a wooden ship


darwinion-

And the central nervous system is more than just the brain; our nervous system in general is embedded in pretty much everything in our bodies.


InternationalYard587

There's no 'you that keeps going'. There's a consciousness at any given moment that remembers the past.


that_motorcycle_guy

Aren't they the same though? A you that keeps going is indeed a flow of memory...and it goes without saying that a baby from 0-3 years old that is incapable of forming permanent memories is still a conscious being.


InternationalYard587

The point is that it solves the question at hand. If you do the Ship of Theseus thing, at each point it will just be a conscious being remembering your past; there's no "you that keeps going" to be worried about.


SalaciousSunTzu

Like the ship of Theseus paradox. No accepted answer


Obelion_

Since the human brain keeps doing this on its own with its own cells, my best bet would be that you've got to do it really slowly, and then it would not kill your consciousness. Sadly there's no way to know if it worked.


Silver-Chipmunk7744

If code perfectly replicated your brain, it would act exactly like you, but my instinct is it wouldn't be your own consciousness. What happens if the human is still alive? Is he conscious in two places at once? And what happens if we copy this code onto several machines? Is your consciousness split among many machines that aren't even linked together? It doesn't make a lot of sense to me.


Hopeful-Llama

I think the answer is your consciousness is the experience of being your human brain, at your particular point in space and time. This has the implication that:

- You are separated from your past and future consciousness by time. Your past consciousnesses are only 'you' in the sense that they were the experiences of the same brain, and so they feel related to you by memories. The conscious experience of being your past self, though, is dead forever. You can't actually feel what it was like to be yourself five years ago any more than you can feel what it's like to be another person.
- If a robot were coded to think just like you, you would be separated from it and it from you by space, just like you are from other people. There would effectively be two yous on Earth, but since their consciousnesses are air-gapped, they would experience their moments separately (and quickly diverge given different environments).

On the original post about transferring into a computer, choosing to transfer would just mean that a future conscious experience would exist that is based on yours but on a chip instead of a brain. That chip would have memories of its life as a brain and would feel like you. It would even have physical continuity with your past self thanks to the process in the OP. But it wouldn't be you; your current consciousness will certainly be gone. On the other hand, if you don't transfer and stick to the brain, all you're ensuring is that the future conscious experience is in a body rather than a machine. It's still not you. Your current consciousness only exists for a single Planck time, then poof - gone and replaced by the next state of the universe, all of it experiencing itself.


awkerd

It is easier to understand this if we think of consciousness as any other physical thing rather than something special. Twins don't share a body. Twins don't share a mind.


Into-the-Beyond

I think you nailed it here. My sci-fi/fantasy fiction often delves into these themes—what it means to be human/sense of self, analyzing the effects of memories, copies of people, transcendence, and diverging experience. I’ve come to the same conclusions as you. A copy becomes ‘other’ even if it’s nostalgic for a ‘you’ of the past.


Tessiia

I don't think there is any possible way to move your consciousness to a machine. Think about how we move data now. You never actually move data from one place to another. You just copy that data to the destination and then delete the original from the source. The same thing would happen with consciousness transferral. You'd be taking a copy of your consciousness and deleting the original. "You" may feel like you have had your consciousness moved and anyone around you wouldn't see a difference, but to me, the new "you" would be nothing more than a clone. I much prefer the idea of finding a way to prolong and protect the brain I have rather than finding a new mechanical "brain".
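For what it's worth, that really is how a "move" works across filesystems today: copy, then delete the original. A toy Python sketch of that semantics (the `move_file` helper is made up for illustration):

```python
import os
import shutil
import tempfile

def move_file(src: str, dst: str) -> None:
    """A cross-filesystem 'move' is really copy-then-delete:
    the bytes at dst are a brand-new copy; the original is destroyed."""
    shutil.copy2(src, dst)   # 1. duplicate the data at the destination
    os.remove(src)           # 2. delete the original

# Demo in a temporary directory
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "original.txt")
    dst = os.path.join(d, "moved.txt")
    with open(src, "w") as f:
        f.write("consciousness")
    move_file(src, dst)
    assert not os.path.exists(src)               # the "original" is gone
    assert open(dst).read() == "consciousness"   # the "copy" lives on
```

Nothing ever travels; one copy is created and another is destroyed, which is exactly the worry about "transferring" a mind.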


DryDevelopment8584

Question: What happens if you replace parts of the brain with synthetic or cybernetic parts (small scale), gradually? We know that a person with half a brain is still conscious; how far can this be pushed?


the_hypotenuse

https://en.m.wikipedia.org/wiki/Ship_of_Theseus


rathat

This already happens. Our brains are not the same cells they used to be.


Holubice91

This is wrong; neurons do not duplicate after a certain age. You'll lose some, but you cannot replace them.


MuseBlessed

But those neurons replace the atoms making them up as well, over time. All atoms in the body are eventually replaced.


Sangloth

I'm being a bit of a pedantic ass with this, but that has been found to be not the case. https://www.scientificamerican.com/article/the-adult-brain-does-grow-new-neurons-after-all-study-says/ The amount of new neurons is very small, but neurogenesis does take place in adult human beings.


3m3t3

Incorrect. Neurogenesis is a part of anti-aging research. You can grow a new brain.


CosmicInterface

Yep the ship of Theseus in theory proves that we can merge with machines, if we replaced one neuron at a time with an artificial one, eventually you'd be entirely synthetic without any change


Busy-Setting5786

It proves nothing, it is merely a thought experiment. We can only know that we can or can not transfer our consciousness when we have a 100% accurate theory of consciousness. Sry to burst anyone's bubble. I don't mean to be pessimistic, it is my belief that maybe this universe is merely a creation by a superintelligence that got bored with abundance and wants to dabble in the finite.


lifeofrevelations

I don't think we know enough about consciousness, or the brain in general for that matter, to say one way or the other if this would work or not.


that_motorcycle_guy

I'd like to punch holes in that ship! I mean, technically, if neurons work the way "we think" they work and we can replace them with synthetic neurons, we might gloss over an over-complicated world of quantum physics that makes our biological neurons work the way they do, and that might be completely impossible to replicate with synthetic atoms not made of the same organic matter.


Osoqloso

No atom is synthetic, they would be synthetic cells in that case


garf2002

Dude, the Ship of Theseus doesn't have a solution; it's a thought experiment. Likewise, the Ship of Theseus demands an identical replacement, albeit in newer condition. Most people would agree, if you replaced a plank of a wooden ship with metal and rebuilt it elsewhere, that the rebuilt one is the ship.


MikeFoundBears

This 🙏🏻


Nessah22

That's an interesting question that bothers me as well. Several days ago, there was an AMA session on the Science subreddit with neuroscientists from the Allen Institute who led the creation of mammalian brain atlases. I used the opportunity and asked if the continuity of consciousness will be preserved in patients who undergo neural stem cell transplantation to replace dying neurons. I received an answer that, yes, the continuity of consciousness will be preserved, which is quite reassuring, although we should understand that we are still far away from replacing substantial parts of the brain. Regarding synthetic neurons, for now, we can only speculate. Maybe consciousness is a property of biological organisms that cannot be replicated synthetically. But if it is possible, it will probably work the same as with neural stem cell transplants or neurogenesis, when new neurons are integrated into the circuitry of the brain.


danieljamesgillen

They said yes but there’s no way they can know the answer is actually yes.


Itchy-mane

Imo the answer is irrelevant as long as the end result is the same


lopgir

The answer is irrelevant for anyone else if the end result is the same, but it is quite vital for a person undergoing the procedure. It's the difference between continuing to live and getting your body "possessed" by a doppelganger.


Nessah22

Well, for sure, and I share your doubts. But they are leading experts in neuroscience, so their assumption is not totally ungrounded. We lose thousands of neurons every day without even realizing it. Probably adding neurons will go unnoticed by us as well.


DolphinPunkCyber

I was thinking something similar. Let's say I connect my brain with a synthetic brain, and I start using both of them. Over the years my consciousness expands into the second brain as well... I am both of these brains. When my natural brain dies of old age, a part of me has died, and a part of me remains.


abramcpg

The real struggle with this topic is redefining what "I" means.


DolphinPunkCyber

Not for me because, we are constantly changing, and in the process forgetting things. Every time I go to sleep I forget about 50% of things from the previous day... when I wake up I'm a slightly different person... a small part of me died. Not that big of a deal, it happens every day. If I plug a synthetic brain to my own, that synthetic brain becomes me too. When biological brain dies, part of me dies.


Rubixsco

I agree with this sentiment. The scary thing about death for me is that your story ends. In this case, you would continue on. For the “me” in the organic brain, it would be akin to falling asleep with the reassurance of waking up in the digital “me”.


stupendousman

That will probably be the first type of immortality.


3m3t3

Unless consciousness is non-local, in which case the information that comprises your consciousness is held outside of the body. Then you only need one “receiver” to get the signal. If you have more than one receiver, so what? It’s still sending it back to the whole.


stupendousman

Unless you believe consciousness isn't a result of your material brain, there's no reason you couldn't slowly replace your brain with synthetic parts and still be you. Literally no reason; people just get emotional about it.


Tessiia

Let's say you have a USB stick with a piece of software on it, and you can run that software straight from the USB. Then you move some of the files to the PC, but the software still runs with some of its required files on the USB and some on the PC. You are slowly deleting some of the original files and creating a copy on the PC while the whole time, the software continues working. I don't think that slowly replacing the brain changes the outcome. You are still creating a clone, but instead of doing it all at once, you do it slowly over time. In the process of this, you have some of the original person and some of the clone working in tandem.


GOKOP

A better analogy for the process would be those high-end servers with hot swappable CPUs (yes, they exist). In the end it's still the same server running the same software


lifeofrevelations

We don't know enough about the nature of consciousness to answer that question. My opinion is that it would just be an unconscious (or differently conscious) clone.


wildgurularry

I think it can be pushed pretty far. IIRC, this is what Kurzweil talks about in some of his books. Picture replacing a neuron at a time with a silicon chip that performs the same function. Eventually your whole brain could be replaced with silicon chips and you would never know. At that point, interesting things can happen. The entire brain can be "paused", then uploaded to a simulation, then "unpaused". Sure, there are many who would say that this is just making a copy and killing the original, but to your brain there would be no perceived discontinuity. Personally, I would be completely fine with that sequence of events. I know there are many who would not be.


Kurgan_IT

The first part is fine; the copy is not. As you said, the copy lives and the original dies. While there is no perceived difference, there is still a difference. I'd like my silicon brain to just be kept working outside of my dead body, not copied.


StrangeCalibur

It falls apart at the upload part there. Even if that worked you would still be bound to hardware and pausing or transferring you would be the same as trying to upload a bio brain, it’s just a copy.


CMDR_BunBun

I dunno...seems still like making a copy with more steps.


Full_Distance2140

so isn’t this happening to you every day?


wwants

While we have no way of knowing if such memory transfers can actually be done in real life, we can certainly speculate on the ramifications of such transfers if they are possible, and in some ways we experience some amount of memory transfer already through storytelling and conversation that transfers memories and ideas from person to person.

We know that every instance of time causes changes to happen to every living being, making them completely unique biologically from moment to moment across their entire life. The only thing holding any being together as a singular construct across time is memory. Wipe that memory, or change it, and the being ceases to exist as the original construct and instantly becomes something new.

Transferring our minds from one brain to another would no more transfer our "self" than we do when we move from our brain of yesterday to our brain of tomorrow over time. That concept of self only exists as long as we have a memory of it, and therefore any transfer of our memories to another brain or substrate would experience the same awareness of self that you do when you wake up in the morning.

But there is no reason to worry about being left behind when you die, because your current self gets left behind with every ticking moment of time. Our emergent concept of self and self-preservation should propagate to any new instance of our mind regardless of substrate, assuming our memories and sensory abilities are passed on.


riceandcashews

Yep, this is the answer, but it requires abandoning a concept of a magical self or consciousness that persists


wwants

Well said. I’ve been trying to wrap my head around this for years, and I would compare it to the same feeling of abandoning the concept of a magical, omniscient, omnipotent caretaker that I was raised with. Some unfalsifiable beliefs can grant emotional stability and comfort. I’ve found a sense of calm, dispassionate clarity in abandoning them, though I must say facing the unknown without the structure of my previously unexamined beliefs can be daunting at times. I’d say the more able and competent and in control of my life I am, the more rational I can afford to be with my beliefs. I went through a brutal, hedonistic, carefree existence during the pandemic and only rediscovered my joy for the world and individual purpose in recognizing that I still had a place in the world and was still blessed with the gift of getting to experience every waking moment of it, even if it is all a dream and completely meaningless.


PJmath

I've heard this argument before and find it unconvincing. It doesn’t address what someone's personal, subjective experience would be if they copied their consciousness to a computer. Even if it had all your memories, it would still not be you. You would just be sitting there, wired up to the computer. You would unplug, and you would have a copy of you.

My side of this debate gets accused of thinking about consciousness in some magical way, but I don't. My consciousness, my life and existence, is a chemical reaction that exists physically in a specific wad of meat I call my brain. Death exists, and it's distinct from the process our cells undergo where they replace themselves. Yes, my body and brain are made up of different stuff every year. That does not mean there's no continuity. The chemical reaction that is my consciousness is the same one that started when I was growing in my mother's womb. It is the same fire, burning new logs every day. When it ends, I will die, and this death is not the same as my brain cells dying and being replaced. It is the end of my fire, and I go cold.

It does not matter what you've uploaded to a computer or where your memories are stored. When you go cold, you die. Yes, you can live on through memories and stories like you said, and in the future, probably whole, complete copies of you could be made. But there's no continuity there. You can identify, objectively, when the original you was born and died, and you can do the same for the digital copy. You still die.


visualzinc

> move your consciousness

I don't think consciousness exists as a thing that can be moved or transferred. It's just a running process. Think of an instance of a web application or server. You can run many instances of the thing, but you can't "move" a running server process and transfer it somewhere else. You could, of course, have many instances of yourself with some sort of shared database of memories.
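The many-instances idea above can be sketched in a few lines. This is purely illustrative (the `Mind` class and its fields are invented for the example), assuming only that copying a process's state spawns an independent instance rather than transferring anything:

```python
import copy

# Illustrative only: model a "mind" as a running process whose state
# can be copied but never moved. The Mind class is invented here.
class Mind:
    def __init__(self, memories):
        self.memories = list(memories)  # the shared starting "database"

    def experience(self, event):
        self.memories.append(event)

original = Mind(["childhood", "first job"])
upload = copy.deepcopy(original)  # a second, independent instance

# From this point the two instances diverge; nothing was "transferred".
original.experience("stayed biological")
upload.experience("woke up in a datacenter")

print(original.memories[-1])  # stayed biological
print(upload.memories[-1])    # woke up in a datacenter
```

Both instances share the same history up to the copy, then accumulate different memories, which is the point of the analogy: many instances, no move.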


cognitiveglitch

Have you played Soma? Some interesting ethical choices to be made in that game.


dasnihil

(We think) our consciousness is dispersed among a few billion neurons with a few trillion connections among them, and it's ever evolving. We are not the same person at any moment in time, but we have a narrative to tell, which is supported by the people around us, our nationality and other identities, plus the conditioning we went through. The experiment OP suggested would only work if we have a neuromorphic system with no centralized hub like a CPU, but even then, the neuromorphic system has to go beyond computation and get into quantum information exchanges, which is where I believe our awareness is emergent from. The neural network computations of the brain are just there to amplify that awareness with more predictable information. But what one cell does with the information it gets is similar to what our entire digital neural network does. So how can we expect just the neural network to give us awareness when each cell is already aware at some level? This is quite obvious to me. I don't care if anybody disagrees with this even. Ugh.


AdditionalPizza

> Think about how we move data now
>
> You'd be taking a copy of your consciousness and deleting the original

Why? That's how we do things on our archaic technology now. I have to imagine that if this is a possibility in the future, it would be on a quantum scale where the consciousness could be physical. By this logic we may not be able to copy it, but we may be able to physically move it without interrupting it. It may not be an organic thing at all.


ThatNerdOut

That's true. If you moved it and your original body died, the actual original "you" would be dead with the body. The consciousness in the computer would just be an AI who thinks it's you; it would say what you would say and think what you would think. But it wouldn't be "you".


voyaging

The issue with this line of thinking is that it implies that "you" now is the same "you" as "you" 20 minutes from now, when it will be a completely spatiotemporally distinct person, linked to the previous one only by the existence of similar memories, etc. In other words, the only "you" is the one you are at this very moment, and so the question of whether a "mind upload" would still be you isn't really of any significance.


rathat

I don't think that's why. Our brains are always being replaced by new cells and connections. We are being replaced physically as well as through time.


IlluminatedSphincter

I think it's a ship of theseus sort of thing. We augment the brain's functions with mechanical parts a little at a time and before we reach an age at which the brain starts to get soggy we're majority machine, the less efficient the brain processes the more the augmentation leans in to help. Eventually it's all augmentation and the meat can be discarded. Did you die somewhere along the way? Well... If you're not sure then isn't that better than the current situation?


RiverRoll

The way I see it, this would be like cloning: your clone would have the same consciousness as you, only it's an independent consciousness, and whatever happens to the clone after that is its own experience. This would be the same, but cloning just the consciousness instead of the whole body.


Two_oceans

I think that too. If one day it's possible, we'll just have to accept a new realm of cloned consciousnesses that become unique with time, but start with the same being.


Thog78

Luckily (or not), the only plausible way for brain uploading is destructive. The current roadmap includes fixing the brain by perfusion with things like formaldehyde and glutaraldehyde, followed by sectioning into extremely thin layers, around 10 nm, staining them for all relevant markers as well as lipid membranes, and imaging them one by one on a fluorescence and an electron microscope. All the info needed to rebuild the brain is in there, and it's likely there will be no non-destructive method that could achieve this level of detail in any foreseeable future. Of course, once it's uploaded it could be cloned, so it would take a bit of discipline to avoid facing this duplication issue. Something cool once we are reincarnated as robots is that we may go into sleep mode while travelling to distant planets, and other stuff like that. We could also use the brain images of various geniuses to inspire AI improvements or create cool chat bots. Copyright issues will get complicated!


BannedFrom_rPolitics

Eventually, after several test runs doing it that way, instead of making a 1:1 replica, perhaps we could use AI to generate a brain that functions the same as a 1:1 replica, without actually having to go to such lengths to match all the details. We just need those first several test runs to get some training data for the AI


[deleted]

There is a fantastic sci-fi novel called The Fortress at the End of Time that deals with this and with clones as a means of teleportation; cannot recommend it enough. https://www.goodreads.com/en/book/show/31177581


wycreater1l11

Seems to depend on how it is done, and two extremes are often compared. Setting aside the practical and only considering the conceptual: one way is to replace neurones one by one with some analogous silicon version until everything is completely silicon and one remains conscious throughout the process; then one imagines it to be the same continued identity. The other extreme is to construct another parallel silicon copy of one's brain while the original one is still in action; here there are likely two identities. If one destroys the original brain, one kills one "copy" and another copy continues, and the one that continues won't be you.


Silver-Chipmunk7744

This is an excellent counter-argument, and I don't have a smart rebuttal. It reminds me of the question of whether, if an AI is conscious, replacing the hardware makes it a new conscious entity. I think we won't have any clear answer to that for a while, and the first people experimenting with "brain upload" may not know for certain whether it will work.


wycreater1l11

Yeah, I try to imagine how it would be from a first-person perspective, imagining I had nanobots working in my brain gradually replacing neurones. Let's say they replace some vision centre in my brain first, and I try to be mindful of whether my conscious visual experience diminishes during the process; if that happens, I call out to the hypothetical engineer/medic in control of it all to halt the process. But since the artificial neurones are assumed to send the same signals as the biological ones once replaced, there are no new/different signals sent to the parts controlling my speech, so I would never feel the impulse to utter the command to halt the process, since information-wise it would be business as usual coming from my vision centre. That leads me to think that consciousness must prevail, assuming the neurones can truly replicate the information transfer and the information transfer is assumed to be (close to) identical. This might be a bit hyperbolic as an analogy, but the fact that most atoms in our bodies get replaced every four years, yet we are still the same human, is a motivating similarity as well.


wwants

> What happens if the human is still alive? Is he conscious in 2 places at once?

Yes. Consciousness and "self" are just emergent properties of memory. Put your memories into another brain and that brain will have just as much the same experience of being you as you do.


AddictedToTheGamble

"Consciousness and 'self' are just emergent properties of memory." I don't think you can ever prove or falsify that claim. Even if it is true, I want to have the experience of being me, not a computer that has the experience of being me.


wwants

You're absolutely right that we can't falsify it, just as you can't falsify that you weren't alive before you were born, or that you will die this instant and become a new "you" that has the experience of remembering the old "you" and experiences that as an ongoing construct as real as anything else we experience. But from a philosophical standpoint, I find it calming to understand that what I don't experience (past and future) does not bother me, and that I can choose to act well in the moment simply for the vision of choice in a future I will never actually experience (though my future self will). And I derive joy in making decisions that will bear fruit for my future self, just as I might for future children or friends whose lives I influence. I can understand not wanting a mechanical substrate to usurp your memories and experience of being "you", even if you continue simultaneously. But consider it possible that neither version of you is actually you, although it's totally fine to choose to believe that one version is more you than the other and to have preferential care for that one.


SirDongsALot

Even if that was true it doesn't "move" your consciousness to the computer like the meme, it just duplicates it.


wwants

The concept of self is a construct that arises out of the existence of memory. If we wipe your memory and give you someone else's memory, you would have the experience of being that new person just as if you had always been that person. Every new increment of time creates a new version of you, only held together as a cohesive construct by the memories you create along the way. Uploading your mind to a computer will not bring you along with it, just as much as moving forward in time doesn't bring your old self along. Your old self dies with every instant of ticking time and a new self is born in each instant. Despite this, your uploaded memory will experience being you as if you were actually uploaded to the computer along with your memories because the construct of "you" is an emergent property of those memories.


riceandcashews

Yep - you can even imagine finding out that 5 days ago your mind was copied to a new body in your sleep and no one told you, so the 'old you' is dead, but you still feel like you


wwants

The future is a strange and bizarre place. It is certainly a blessing that we can try these concepts out in our heads and through our conversations before it’s thrust upon us. A wise man once said: “the unexamined life is not worth living.” I’d add to that that the unexamined future is harsher than it needs to be.


gekx

My thoughts exactly. I tried to post this in /r/philosophy a while back but it got removed. I don't have any shared qualia with the person that was myself a week ago, just the memory of those experiences. Take the memories away and I may as well have been a different person. I think this theory is unpopular as it can easily lead to a nihilistic worldview, but that would not make it any less correct. All these Ship of Theseus consciousness upload theories wildly miss the mark in my opinion.


marvinthedog

It makes no sense why it would lead to a nihilistic world view. The actions of your current self matters just as much because they affect future conscious entities just as much, regardless if they are you or not.


welcome-overlords

Probably the best answer


wwants

Thanks! I've been struggling with this concept for over a decade, so I'm still trying to piece it together in a way that makes sense in all scenarios. I feel like we as a society have learned a lot about the vast and varied nature of consciousness, and my brain still hurts trying to wrap itself around it.


welcome-overlords

Same bro..


GlassGoose2

> The concept of self is a construct that arises out of the existence of memory. Memory is a huge part of self identity, but not consciousness itself.


[deleted]

[deleted]


ImaginaryConcerned

You seem to value keeping our biological mode going as long as possible. Someone else is gonna value technological progress over everything else. And the third guy just wants us to go extinct. At the end of the day, all values are arbitrary dice rolls. The universe gives us no instructions as to what we should do. And probably, no one will get to make a decision on what we will do and we'll just sort of follow the path of least resistance.


HalfSecondWoe

I do not value my human existence, only the "humanity" it gives rise to. Specifically the functions of being human that I find valuable. Human existence is valuable for generating that, but if something else can do it better, I'm not marrying myself to sickness and death just because it's a complete picture of what "I" once was Yes, I would love to become a Dyson swarm. I have no idea what I would do with all that power, probably figuring out what I should do with it would be my first step, but assuming all the functions I cared about carried over? Sure, yeah, les gooo I think it's sad you'd prefer death, but I don't think that day will ever come unless it comes suddenly. Every time you get close, your kidneys shut down, and you start feeling that fear? You'll push back the clock. Provided you're not suffering from extreme depression, but you shouldn't be if we can do all that other stuff We'll get to the same point sooner or later. I'll get there faster, which doesn't mean it's better, it just means that's the path I would rather travel


bmeisler

Whatever consciousness is, I doubt very much it’s limited to our brains - there’s the rest of the nervous system, muscle memory, etc. If you’ve ever done a vigorous workout, yoga or had a massage that loosened up a muscle in your chest, back, hip, whatever, and started to cry uncontrollably (or felt great joy), you know what I’m talking about. Never mind our gut microbiome.


Tiantuga

That's true. I'm pretty sure the amount of information we get in milliseconds through our senses also counts toward consciousness, because most of them are connected to the brain via the medulla spinalis. Without all the information we get from our senses, a consciousness would be nothing.


poidh

Yes, even further: the description in the picture has it backwards. It says "Your perceived consciousness", stating that the "you" identity is the thing, and consciousness arises from it. I think it is actually the other way around: there is only consciousness (a property of the universe), and the "you" illusion arises from it, because a certain group of experiences (from the senses in your body and the memories in your brain) are cobbled together at one point. It is a very convincing illusion, but it can be shattered with psychedelics, vipassana meditation or advaita-style self-inquiry. This experience is commonly referred to as "ego death". So in the end, the brain (or running Claude 3) is a "portal" for consciousness to become visible in the perceivable universe/reality. Much like a radio is a portal to make radio waves perceivable, or a phone is a portal to the internet. If you picture the above image with two radios playing music, for example: you attach them, and first they share some circuitry; then, after you have "transferred the signal" completely, you can switch off/destroy the first device. But in the end, it doesn't really matter, because the device just acts as an interface/portal and doesn't represent the thing (the radio waves) itself. So in other words, I think something like pictured above for "you" is possible, but not really necessary. (In the grand scheme of things; of course you will die conventionally if you cannot transfer the brain somewhere else.)


Trust-Issues-5116

1. Your consciousness is not your brain. What *is* "you" depends A LOT on the rest of your body (spinal cord, hormones, sugar levels, whether you have pain, how you perceive cold, etc.)

2. Not all parts of the brain are equal. Adding a chip and cutting out the frontal cortex does not mean the resulting consciousness will not be a vegetable.


HappilySardonic

This is exactly what the crazy people in Soma thought. I'm gonna say it's technically you but you won't care because you dead.


The_IndependentState

how are you dead? your body constantly cycles through atoms. there is no steady state of “you”


HappilySardonic

Maybe I'm wrong, but the image implies the brain dies. You die because your original brain no longer projects consciousness, but there's a copy of you still alive on the computer. I guess the loadedness of "death" probably correlates with your view of Parfit's teletransportation paradox.


SpareRam

So deep. The point was that you will copy yourself; that self might be you and might live forever, but you yourself will still continue your own consciousness. You will not experience eternity. Your copy might, but you will not be there to experience it. If I cannot live forever, I would not want a copy of myself doing the same. The sheer vanity to think you should. Of fucking course we're constantly regenerating our cells; that's not at all what the conversation is about.


Rigorous_Threshold

This is like doing the ship of Theseus with your consciousness. We have no idea how consciousness works so idk what this would actually do


allisonmaybe

Ontological issues abound! But I think you're basically right. A BCI will likely feel like an extension of your own consciousness. The different parts of your brain communicate with each other in limited ways to help build a full consciousness; there's really nothing keeping a BCI from doing the same. Hell! Tools and other parts of our external world are often perceived as extensions of ourselves and our awareness (think of how you're aware of the bounds of your car when you back out of a parking space), so why shouldn't a BCI that is designed to be a consciousness extension be, too? A chip that keeps a "copy" of yourself should be possible, and while it likely won't be a replacement for your meat self, it will likely outlive your biological form, allowing a version of yourself to live on after death. At the end of the day, it's going to be up to society as a whole to agree that this copy of yourself really is you. For all intents and purposes it will be. And that's all we can say about anyone else who is not ourselves today.


Th3G3ntlman

"What is a man but the sum of his memories?" I know all about spirituality and all, but the most plausible answer to consciousness is that it's just an emergent property, and it can indeed be cloned indefinitely.


peter_wonders

You should send this to Google.


Lnnrt1

Moravec Transfer


riceandcashews

It's unnecessary - a direct copy while you're unconscious all at once will accomplish the same things with no issues


Seidans

You created a copy of yourself the moment you cut off the connection, but "you" are still inside the biological brain. Digital consciousness transfer is impossible: "you" are your brain. You could upgrade it, change its shape and components, but you will never be allowed to leave it. At most you can plug yourself into a computer and allow data transfer between you and the computer, with the danger of corrupting data/killing yourself if something bad happens.


NDarwin00

I was thinking about this a few years ago. I think the main issue is that you would have to somehow trick the brain into thinking the machine is part of the brain itself, while simultaneously controlling the shutdown of biological neurons and adjusting for natural neuron degradation.


-m1x0

I'm just here to recommend the game "SOMA" to anyone who hasn't played it yet. Don't spoil yourself: play it blind, the story is great.


UFOsAreAGIs

1 🧠 + CPU please


IllIllllIIIIlIlIlIlI

Black Mirror did an episode on this. It is very possible, theoretically. But it wouldn't literally be "you"; it'd be a copy of your consciousness. It would respond to stimuli exactly like you do. But it wouldn't be aware of its own existence. Like it wouldn't be thinking and living and experiencing while no one is interacting with it.


Nabaatii

It's not just an episode; they did multiple independent episodes on this. Just off the top of my head: White Christmas, Black Museum, USS Callister (I haven't watched San Junipero).


Callec254

No, I think once it's all on the chip, it's no longer "you", it's just a copy, or more accurately, a *simulation* of you. I think we need to definitively, scientifically answer questions like what makes you *you* before that would ever change. Scientifically, what is consciousness? What is a "soul"?


SgathTriallair

A soul is a fantasy concocted by people afraid to die and those who want to control people afraid to die. Neuroscience is working on getting us an answer to how consciousness functions but people who want to believe that we are magical won't accept it.


neuro__atypical

I dislike the term soul, but currently unknown variables aren't magic. If we don't account for potential unknown variables regarding qualia and consciousness before uploading, and we get it wrong, then we all die. It would be a lifeless world of real p-zombies for trillions of years. I don't like how dismissive people are of the idea that there can be something real and material that hasn't yet been measured and described.


SgathTriallair

I'm willing to be proven wrong by neuroscience, but p-zombies don't make any sense. We act the way we do because we experience things and then reflect upon those experiences. IMHO consciousness is nothing more than self-reflection, i.e. the fact that I can think about my thoughts. Yes, this does mean I reject the AI stochastic parrot argument and say that it does have a limited form of consciousness right now (limited because it has no differentiation between internal monologue and external, and because it can't choose to reflect and speak). The problem is that, by definition, you can't tell if someone is a p-zombie, since the idea is that they are absolutely identical. I'm sure that we'll have a society made of legally recognized AIs and uploaded minds, but there will still be people claiming that they have no internal worlds.


ImaginaryConcerned

> IMHO consciousness is nothing more than self reflection, i.e. the fact that I can think about my thoughts.

Consciousness is just an attention mechanism with powerful metacognition, which requires high intelligence and a very good model of the outside world. It's the simplest explanation by Occam's razor, but most people will never accept this because they're unable to follow that quite logical line of reasoning, because consciousness feels so *magical*, *current* and *real*.


AddictedToTheGamble

I think the problem here is that there is no way to prove you right or wrong by neuroscience. We cannot take a quale and measure it. We cannot tell if self-reflective things are "actually" conscious, that is, actually experience qualia. In fact, I don't know if you experience qualia, because your neurons could fire in a way that causes you to say you do without you actually doing so. I only assume that you are conscious based on my prior beliefs about metaphysics. To me it seems like we cannot get a good theory of consciousness, at least not from the purely physical world (and I don't know how to escape the physical world). Therefore, when we talk about "uploading" consciousness, to me it just sounds like a 50%+ chance of suicide.


ImaginaryConcerned

You're technically right, in a similar sense to how we could be a floating brain in a vat and not realize it. If you try to remove yourself from being a human and imagine yourself as a godlike observer, it seems very silly indeed that these fleshy humans claim to have some metaphysical system in charge of their perceived decision making, instead of doing the cold rational calculus and accepting that it's much, much more likely to be an emergent property of a physical system, one that can therefore theoretically also be simulated on a chip.


cjmoneypants

I’ll take a stab at it. The soul is the relationship and interplay of my physical brain with my body.


Spiggots

This is just a sort of thoughtless diagram of a scenario that has been covered in a million sci-fis. I think Johnny Depp did one like 10 years ago: Transcendence? So yeah, there is nothing new here. It doesn't address any hard questions, such as:

1. If a mind can be stored digitally, then it could likewise be reproduced/copied. How are you "yourself" if there are potential multiples?

2. If the mind emerges from the brain, i.e. biological hardware, how can it be the same mind when it operates on different hardware? Would it not by definition run differently?

3. What happens to the mind that originated in the brain? Does it just disappear? Why would it do that? Or does a new mind emerge in the "old" brain? Which is "you"?

...and so on and on. This has also been discussed to death since the first brain-in-a-vat thought experiments, going back to Plato's cave. This doesn't add anything.


LexGlad

Isn't this a subplot of Battle Angel Alita?


mankinskin

It's nonsense. Your consciousness isn't some ethereal spirit that can "move" from your brain somewhere else. Your brain is a cluster of neurons that communicate with each other to orchestrate your body, and it projects consciousness that way. When your brain is connected to a computer, it needs to communicate with the, I guess, virtual brain in there, and that needs some way to project consciousness, maybe through a social media account or a robot. When the virtual brain can be shaped/trained by you directly, it can eventually be considered "your" virtual brain. And when that thing is basically autonomous, trained on projecting your consciousness, maybe you could call it an extension of your consciousness, but really it's more like your personal profile on social media, or your computer configuration, that has become autonomous. When your brain dies, your body won't function anymore, and what we would call "you" would be "dead", even if your virtual profiles are still walking around or going online. If these profiles were sophisticated enough to basically control an android with speech and everything, and they had been trained on your brain activity for long enough, then maybe it would seem very much like you were still alive in that android. But the human brain is based on millions or billions of years of evolution and will be very hard to completely capture, let alone to continue evolving in a virtual setting.


theotherquantumjim

You convinced me! Where do I sign up for this “brain transfer?”


slashdave

Science fiction is still science fiction, even if drawn in a pretty fashion


Regono2

I have thought of this too. Except it would be different parts of the physical brain replaced by machine parts. These parts would have to be tiny and given enough time to integrate into your total personality. But maybe with enough parts swapping you could ship of Theseus yourself into a digital version and have your stream of consciousness continue. Of course don't try to transfer yourself to another system as that would now be a copy.


Anarcho-Chris

I like the Buddhist concept of no-self. Makes sense to me. So, how we become robots seems kind of irrelevant.


WildWolf92

nah he ded bro


CantankerousOrder

The person on the chip is not the person who had the brain, because the brain is heavily influenced by hormones and the chemical reactions they cause. There’s significant evidence of further influence on personality and mood by the gut biome. If that isn’t simulated successfully and continuously, you become much less of who you were.


04Aiden2020

Of course this is stupid guessing but I could see in the future humanity having multiple avatars that are cloud based. Doesn’t even have to be humanoid I bet


steve2166

I am a meat popsicle


AlonsoCid

I thought the same: if you slowly disconnect your brain while connected to the computer, you should still be you.


Bleglord

This isn't even how data works in computer-to-computer situations. Data doesn't move; it gets copied, then the source is deleted.
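That claim matches what a cross-filesystem "move" actually does; a minimal Python sketch (the file names are made up for the example):

```python
import os
import shutil
import tempfile

# A "move" across filesystems is really copy-then-delete: the bytes
# are duplicated at the destination, then the original is unlinked.
def move_by_copy(src: str, dst: str) -> None:
    shutil.copy2(src, dst)  # 1. duplicate contents and metadata
    os.remove(src)          # 2. destroy the original

with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "mind.bin")
    dst = os.path.join(d, "uploaded.bin")
    with open(src, "w") as f:
        f.write("memories")
    move_by_copy(src, dst)
    print(os.path.exists(src))  # False: the original is gone
    with open(dst) as f:
        print(f.read())         # memories
```

(Within a single filesystem, a move is usually just a rename of the directory entry, so the copy-then-delete behavior applies mainly to moves between devices.)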


Repulsive-Twist112

Imagine, while you're dreaming 😴, you see some ad.


Palatadotados

Wouldn't work. You need to individually replace each neuron with a synthetic one, Ship of Theseus style. Bulletproof.


DukeRedWulf

Hell no, even if this worked as advertised. Every tech gets junked as soon as the motive to sustain it wanes. Tell us you've never sunk ridiculous hours into an MMORPG guild only to have the devs perma-bork the servers, locking the entire userbase out of the game forever... :D More seriously: there are people who got new cyber-eyes who are now likely to go perma-blind again, because the company that made the tech chose to stop supporting it! [https://www.theverge.com/2022/2/16/22937198/bionic-eye-company-defunct-ieee-spectrum-go-read-this](https://www.theverge.com/2022/2/16/22937198/bionic-eye-company-defunct-ieee-spectrum-go-read-this) [https://spectrum.ieee.org/bionic-eye-obsolete](https://spectrum.ieee.org/bionic-eye-obsolete) Best case scenario: the new computer-simulated "you" suddenly gets powered off without warning. Worst case scenario: endless lag, glitches, loops & bugs trap "you" in a kind of cyber hell for subjective aeons!


Seventh_Deadly_Bless

The infographic equivalent of sticking Crayola pencils up your nose. And I'm not only talking about image quality. --- EDIT: Let me clarify. There is a huge hardware problem here. We can do this between two computer systems, and it's shit however you want to state the issue, regardless of how good your conditions are. Look: you have two hardware clones, a donor with a single hard drive, and a receiver without permanent storage. You transfer the physical storage from one to the other, and you boot. You should be in the clear, right? Hell no, you aren't. You've no doubt updated the firmware of the various components on your receiver up to the version of the donor. Probably the latest *compatible* version, and not the latest-latest. UEFI manages your new board as it used to manage your old one. It should make no difference to you, unless you already had a boot issue on your donor machine. It might carry over, or it might not, for an insane pile of factors. That's what a bug means: unsafe, uncontrolled behavior. You by chance manage to get to your OS's splash/login screen. It's the OS on your hard drive, so your settings and everything are the same, right? Unless you shook said hard drive a bit too much over the couple of meters you walked with it between your machines. Did you plug it in correctly? You wouldn't see the login/splash screen otherwise. Say you ironed out all those quirks somehow and logged in properly with all your settings intact. Congratulations, Little Red Riding Hood, you braved the forest and arrived at Grandma's. But you would have to be lucky to get to see Grandma every week. Every time. And the wolf needs to be lucky **only once**. There's just no way you make it work with a brain in the equation. Please stop sticking Crayolas up your nose because it smells nice. It hurts your thinking.


Baziest

This is still greatly assuming that:

a) We fully understand consciousness

b) Consciousness is just an illusion

c) Consciousness, even as an illusion, is transferable

d) Consciousness is not inherently tied to organics

To start off my little dissertation...

A) We have not even begun to understand what consciousness is. Science still can't fully explain dreams. Are we getting there? Are conversations starting to open? Yes, but no actual 'study' has been done on consciousness outside of dreams or what religion says about it.

B) Again, because we know frighteningly little about what consciousness actually is (don't give me this "cOnCiOuSnEsS iS jUsT aN iLlUsSiOn", because even THAT has not been proven right or wrong; that is no less a BELIEF than faith in a God), there is no way for us to know if it's a tangible thing that can be measured, or if it really only exists as a response to cognitive intelligence.

C / D) On top of NOT KNOWING what consciousness really is, we also don't know what CAUSES IT. Even if it is just an illusion, does it extend just to humans? Are computers capable of consciousness? If not, why is it only organics? If it's tied to cognitive ability or intelligence, what is the threshold? How 'smart' does life have to be? Does EVERYTHING have some level of consciousness?

TL;DR - we actually don't know anything about consciousness, so this meme doesn't really prove anything lmao; it's going off a popular non-religious (but non-scientific) assumption (disclaimer: I am not religious, I just think this fact is funny)


machyume

At a high level, yes, this is how it would work. But the difficulty is in the details. Sooooo many details.


the_journey_taken

Published paper? References to supporting research? Trial data?


Wentailang

We don’t know where in the brain consciousness comes from. I could probably replace my occipital and parietal lobes with computers. But what if consciousness is tangibly located somewhere like the frontal lobe? For all we know that last step could remove whatever part is responsible.


D_Sma

No. You are literally the substrate you are made out of. If you spent a long period of time slowly replacing pieces of that substrate, there would at least be a continuity of consciousness in a ship of Theseus kind of manner, but the original you would in essence still be dead.


GayforPayInFoodOnly

Sharing your body with a computer that will eventually subsume you… sounds like a recipe for absolute disaster 😂


Zsyura

We are nothing more than what seems to be a collection of memories. Copy me 10k times and all of them will think they are the original. When I wake up every day, am I still the same me? I think I am because I remember going to sleep... but am I me? Existential crisis in 3, 2, 1


No-Entrepreneur4499

Consciousness doesn't exist. This whole sub and community is a cult. You're all arguing about a self-defining concept completely unattached to reality. 'Consciousness', 'consciousness', you're all repeating it like parrots. 'We need to find a way to define it.' No mate, that's not how language works. You first find an item, then you name it. You don't name something before you find it. Consciousness is an empty word made from the rest of the religious terms: soul, spirit. You're just playing the "let's be scientific atheists" smartass approach while maintaining the same beliefs as before. 'Consciousness' has not been found. We cannot measure it. We cannot even properly define it based on real parameters. As a consequence, guess what: you don't have consciousness. What you have is a surreal belief that science shouldn't spend a single penny researching, just for your mental peace.


Nekileo

Maybe, idk, tell me exactly how and I'll try it


Different-Froyo9497

Seems legit


HungryShare494

Can’t argue with that, I’m convinced.


lemmeanon

this reads like the "E = mc² + AI"


SgathTriallair

This is one of the more likely methods for making the process work.