Bay 12 Games Forum


Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - LordBucket

Pages: [1] 2 3 ... 373
1
General Discussion / Re: Thoughts on Technological Immortality
« on: March 13, 2016, 12:18:24 am »
every time you do, the steam condenses on a sheet of metal above my bucket and the water drips in. One handful at a time, until your bucket is empty and mine is full.

But you're not doing that in an upload process. There is no "moving" of things from one place to another. How do you scoop up a handful of consciousness and move it from one place to another?

Quote
Though honestly, to me, all metaphors for consciousness from your bucket of water to the Ship of Theseus to the Grandfather's Axe break down, because none of them have a sense of self. There's no ship-mind wondering if it's still the same ship (the closest you get would be the crew, and in that case, yes it is the same ship because it is the same crew. So what happens if you get a new crew? See? It breaks down)

Yes, that problem also exists. Yes, it is not illustrated by the water bucket analogy. *shrug* There are all sorts of problems with the "copy the brain to transfer consciousness" concept, and still others we haven't even talked about. For example, why do the people who subscribe to the "self as pattern" notion not worry about consciousness entanglement? If you're really a "pattern" that can exist in multiple places in multiple formats, wouldn't you expect to be observing from those multiple places all at once? Or for that matter, if "self is pattern" then don't you die every millisecond of every day? The configuration of your brain is different NOW than it is NOW. What's so special about that one particular configuration at the moment you uploaded? Isn't it going to die the millisecond after you do? Or are you planning to maintain the same configuration forever?

That one example was not intended to address everything wrong with the brain-copy-upload concept.

Quote
Now how an ACTUAL slow transition that's not just really slowly killing a person would "teleport" a mind into a computer (i.e. the "Bolt on hard-drives" method, or the "replace neurons with nanomachines" method), it's because there is a... I guess the best term is "temporal gradient" over which it's impossible to pick out a single moment where the mind is NOW machine when the previous moment it was organic.

That method is interesting, but it actually avoids the transition. Let's imagine you replace neurons one at a time, let's say it works, and so now your entire brain is nanobots. Now what? You're still not software. Maybe you can do something with that, and if your goal is exclusively the thread title rather than what people typically mean when they discuss uploading, maybe that's all you need. Your bones can use potassium just as well as they can use calcium. If you swap out calcium for potassium, they're still your bones. If you're happy replacing your meat brain with a metal brain...maybe that could work. I would still have concerns, because as mentioned previously in the thread, I don't think that "you are your brain." But this method does address some of the issues.

If you're like my original example, light shining through stained glass windows...if you replace the silicate glass windows one at a time with plexiglass, it's still the same light shining through them.

2
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 11:18:27 pm »
So that's an argument in favor of the "slow transition" premise?

No. "Slow transition" is just hand-waving to avoid the issue.

Imagine that you and I have identical buckets. Imagine that your bucket is full of water, but mine is empty. Imagine that you pour the water from your bucket onto hot coals and it becomes steam. Imagine that I fill my bucket with water from the sink. Is it fair to say that the water from your bucket teleported into my bucket? No, of course not.

Now, imagine that we have the exact same setup, but rather than pouring your bucket of water onto hot coals, you scoop it out one handful at a time and pour only a handful onto the coals. And every time you do, I take one handful of water from the sink and put it into my bucket. One handful at a time, until your bucket is empty and mine is full.

Does doing it this way cause the water to teleport? Of course not.
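The handful-at-a-time version can be sketched in a few lines. This is only an illustrative toy (bucket contents modeled as Python lists, a made-up "handful" unit), not a claim about minds: however gradually you copy the contents across, the destination remains a second, distinct container.

```python
# Toy model of the bucket analogy: drain one container while filling another,
# one "handful" per step. Gradualness never merges the two objects.

source = ["water"] * 10   # your full bucket
target = []               # my empty bucket

while source:
    source.pop()            # one handful poured onto the coals
    target.append("water")  # one handful added from the sink

print(target == ["water"] * 10)  # True: my bucket now holds what yours held
print(source is target)          # False: they are still two distinct buckets
```

The point of the sketch is the last line: equality of contents (`==`) is not identity (`is`), no matter how slowly the transfer happens.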

So why would slow transition result in your consciousness being uploaded?

The whole "copy your brain to transfer consciousness" idea is predicated on the completely arbitrary assumption that consciousness is an emergent property of a configuration. Not "of" anything, just "a configuration."

For example, think of a book. How about the Velveteen Rabbit. That's "a book." But when we say that the Velveteen Rabbit is "a book" we don't actually mean that it's a physical bunch of paper. Really, it's "a story." For example, if you have a copy of the Velveteen Rabbit on your nightstand and I have a copy of the Velveteen Rabbit on my nightstand, we'd both probably agree that both your copy and my copy were "the" Velveteen Rabbit. And, if you burned your copy and downloaded it to an electronic tablet, we'd both still agree that the electronic copy on your tablet is still "the" Velveteen Rabbit.

"The Velveteen Rabbit" is a configuration. It doesn't really matter if that configuration manifests as ink on paper or electrons on a screen or words being spoken by a storyteller talking over a campfire or table entries in a database, or whatever. It's still "the" Velveteen Rabbit. Even if you translated the words into Chinese and encoded paragraph and page breaks with HTML and stuck in on a web page, resulting in all the individual characters and words being completely different, it would still be "the Velveteen Rabbit." Right?

The "copy the brain to upload" crowd apparently believes that consciousness works the same way. For some reason.

I see no reason to believe that.

3
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 07:45:44 pm »
The water's the same, but now you have two water sources, not one. Interactions with one of the bodies of water do not alter the other. They're now effectively two buckets of water.

Ok. And the various trillions of cells in your brain are effectively trillions of cells rather than one brain. And when you cut a pizza in half it's effectively two pieces of pizza. And you and I are effectively two different people. And if you carve a statue from a hunk of stone, it's effectively the statue over here and a whole pile of stone shavings over there. And if you partition off pieces of your brain so that they don't talk to each other, it's effectively two different personalities.

But if you remove that board, it's going to go back to being effectively one bucket of water. And if you dump that bucket into the ocean it's going to be effectively one ocean, and not "the ocean over there and that bucket of water right there."

And if you linked up every human brain together, the idea of "different people" might seem just as odd as thinking of your brain as those neurons on the left and those neurons on the right.

Identity is very flexible.


4
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 06:52:56 pm »
So a sense of self-awareness is seamless between changes is what you're saying? That still doesn't deal with the full split-brain case. If there's no communication between either hemisphere, they are acting independently. One is not aware of the other. So where does our self-awareness go?

I don't understand the question. If you have water in a bucket, then split the water down the middle with a board, where did the water go? It's still there. It just has a board separating it now. Adding a board does not destroy water and removing the board does not create it.







5
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 06:45:12 pm »
So, all sub-components of a brain have their own sense of self?

That's looking at the situation from a peculiar angle.

Imagine that you have a whole pizza that has not been cut. How many slices are there? The question doesn't really make sense. There aren't any slices until you cut the pizza. Saying that the components of the brain "have their own sense of self" is similarly a weird way of looking at it. Yes, if you were to partition your brain, the partitioned pieces would perceive themselves as "self" but with a whole brain, it doesn't make a lot of sense to think of the various pieces as having their "own sense of self."

Awareness is non-discrete. Simple example: vision. Think about what you see right now. It's "what you see." Now close one eye. What you see is diminished, but it's still "what you see." Now close the other eye. It's still "what you see." Sure, each eye is individually feeding different data to you, but when they're both open you still perceive it as a singular "what you see" rather than two separate data feeds.

6
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 05:14:34 pm »
If you split a person's brain, what happens to their sense of self?

Unsubstantiated opinion:

Consciousness is awareness. It doesn't particularly matter of what. The "you" that you are/perceive is not a discrete entity. The "entity" that is being perceived as self is fluid.

For example, imagine a building with a bunch of stained glass windows. Light is shining through the windows. It doesn't particularly matter where one window ends and the next begins, light will shine through them regardless. If you have:

(window) (wall) (window)

...you, as an outside observer, would tend to think of that arrangement as being "two windows." But if you knock out the wall and put in more stained glass between the two, it now appears to be only one window. If you put up more wall in the middle of a window:

(win)(wall)(dow) (wall) (window)

...it now looks like there are two small windows next to one big window. The light shining through all of this doesn't care. If you cut up a brain into smaller pieces, you'd simply have smaller windows. If you were to network every human brain together, for example, you'd simply have a great big window.

From the point of view of the windows, each window would always perceive itself as "self." When you learn a new fact, or switch from closed eyes to open eyes, your greater perception/awareness doesn't typically cause you to perceive yourself as a different entity. Water in a bucket is still water in a bucket regardless of whether you add more water to the bucket. If you put some water into a bucket, and then look at it and say "this is some water in a bucket," and then add some more water into the bucket, you don't look at the water and conclude that it's two different waters. It's "water in a bucket." And if you take some of the water out, you still have "water in a bucket," not "half a water" in the bucket.

I propose that consciousness is similar.


7
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 03:43:36 pm »
Even if they end up just as a shadow, would that be better than simply ceasing to exist

The shadow is them "ceasing to exist." Go stand in the sunlight. Look at your shadow. Now make a piece of cardboard shaped enough like you that it casts the same-shaped shadow. Congratulations! You've now created a shadow. If you now kill yourself, does the fact that the shadow from the piece of cardboard still exists make it "better" than simply killing yourself without making the piece of cardboard?

Quote
or be in the torment of being trapped in their body?

If you want to suicide to escape pain, I have no problem with that. But don't point to your Facebook page and claim that so long as that exists, it's ok for you to jump into an incinerator because that Facebook page "is you." It isn't.

I dispute the idea that duplicating brain patterns in software would result in the transfer of your consciousness into the computer running the software. But a lot of people apparently believe that it will. What it might do, however, is create a mindless Siri clone with conversational patterns that resemble how you speak. And if people see that mindless thing and conclude that "uploading works" and then start suiciding in large numbers so they can live in digital immortality...I think that would be a very unfortunate result.

8
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 01:42:51 am »
Neither do I.

9
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 12:53:38 am »
I didn't mean to imply that you agreed with me about everything, just that you agreed with the idea that us being unable to bridge the mind-machine gap is incorrect. Apologies. (Though to ALSO be clear, I don't think duplicating your brain in software will do it either, and I'm rather saddened that you think that's what I meant.)

No, no worries. I just wanted to be clear. There are certain positions people take on this topic that I think may eventually lead to perhaps millions of human beings suiciding to create electronic zombies that superficially resemble them, so it's an important topic to be clear on. Some of the statements I made happen to overlap on a Venn diagram with statements that might conceivably also be made by somebody in the philosophical camp that I believe will result in millions of deaths...so, again...it was very important to me to be clear.

peace/namaste/no worries

10
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 12:37:34 am »
I guess what I mean is that while the definition terms or the concepts me and you have for 'consciousness' and 'free-will' are really the same thing.

You can't have the same idea about life and how you experience if you didn't at least have the illusion of free-will. That you are doing things because you want them to happen. It is hard to think that you are merely 'witnessing' your hands type something in a box in response to the dumb arguments of an internet stranger.

If your consciousness was one where you you could truly perceive and understand that you had no control over your actions, that were a passive observer of everything you did in your body, you would really challenge traditional arguments over the nature of free-will.

The fact that it is impossible for you to perceive yourself as a brain, or a thinking meat-computer is already a sort of unworking of the idea of consciousness as a discrete metaphysical concept with any root in reality.

At least to me this makes sense. I hope you are not just quibbling over definitions and talking about some hypothetical reality thing where reality describes and defines word and not the other way around.

Sorry, but would you clean up the grammar in your post? There are several sentences in there where I can't tell what you're trying to say.



The point of mine (and LB's) post was that

To be clear, my position is that while "uploading" is a thing that might be possible in the sense of "running" "you" on a silicon computer instead of a meat computer, I dispute that the method of duplicating your brain in software will accomplish that. For a couple reasons, which if I recall correctly were discussed at great length previously in the thread.

11
General Discussion / Re: Thoughts on Technological Immortality
« on: March 12, 2016, 12:01:57 am »
if consciousness is defined as a passive observer to the functions of a deterministic computer (made out of meat), it doesn't really follow any philosophical or scientific literature I've ever read.

https://en.wikipedia.org/wiki/Consciousness

"Consciousness is the state or quality of awareness, or, of being aware of an external object or something within oneself"

https://en.wikipedia.org/wiki/Free_will

"Free will is the ability to choose between different possible courses of action."

They're different things.

Quote
I'm not saying we do not have consciousness, but only that consciousness is an unreality, an illusion.

That doesn't make sense though. If it's an illusion...who is viewing the illusion? If you have an observer, there is consciousness.

Quote
Which is kinda weird, since we're all typing and dragging our eyes across text to say we have no free-will and no consciousness, but

Well, personally I dispute the "no free will" interpretation. I suspect that "free will" is a bit more fluid than yes/no. I don't typically choose when I blink, for example. When I see a kitten I don't "choose" to find it adorable. Certainly this device that is my body and brain is capable of running functions on autopilot. And some people probably spend a larger portion of their overall behavior on autopilot than others. But the demonstration that non-volitional functions do occur doesn't mean that volitional functions don't occur.

But free will isn't necessary for consciousness, so I didn't particularly feel the need to dispute the point very hard.


12
General Discussion / Re: Thoughts on Technological Immortality
« on: March 11, 2016, 11:35:36 pm »
There is no technology that directly connects a person's brain to any artificial computer right now.

I don't believe such a connection is actually possible, until someone shows me otherwise.

...erm, ok.

 * Braingate video from 2008
 * Experiments from 1999 where they extracted video from a cat brain
 * Video of monkeys controlling robotic arms via brain implant
 * Here's a guy playing World of Warcraft via a brain control interface

Do you want more, or is this sufficient? Here's a wikipedia article: Brain computer interface



to go from these technologies to expanded mental capacity. I'm betting it's less of a technological hurdle and more of a will/funding thing; it's one thing to give sight back to the blind or let a double-amputee open a door on their own, and another to make someone smarter, you invite a lot of nay-sayers and doomsday preachers with that kind of thing, so people are less apt to fund it, at least publicly.

http://www.darpa.mil/news-events/2015-01-19

http://www.darpa.mil/program/our-research/darpa-and-the-brain-initiative

13
General Discussion / Re: Thoughts on Technological Immortality
« on: March 11, 2016, 11:06:31 pm »
Consciousness and free will are illusions.
I don't understand what this means. Would you care to elaborate? I've never seen this used in a way other than as a catchphrase.

https://en.wikipedia.org/wiki/Neuroscience_of_free_will

Current understanding of neuroscience suggests the brain arrives at a decision before you are consciously aware that you have made a decision. Consider that if someone throws pocket-sand at you, your eyes will close before you actively thought to do so. Same thing with every other decision you've ever made in life: your brain makes a decision, then 'you' are aware of it and you think you made the decision. There is a non-trivial time-gap. This suggests free-will and consciousness are just illusions or by-products of how the brain functions or how it evolved to function, for humans at least.

It's all theory, but there's nothing around besides philosophical schools to debunk it.

That's an argument against free will, not consciousness.

Analogy: Just because you don't get to decide what happens in a movie doesn't mean you're not watching it.

14
General Discussion / Re: Thoughts on Technological Immortality
« on: March 11, 2016, 09:34:34 pm »
I wouldn't get digitally uploaded, but I would try and mess with everyone who did.

Apologies for the personal promotion, but since we're discussing consciousness and uploading again...it happens that I recently published a story on fimfiction that is relevant to this topic. Story assumes the reader is familiar with the Optimalverse.

It does not have a happy ending.

15
General Discussion / Re: Thoughts on Technological Immortality
« on: March 11, 2016, 09:28:19 pm »
Going to dig this back up due to a fascinating article that I read that... really, really shows some light into the content of this thread. Article's right here. Seems to sum up this thread really well, actually, in that the reason the discussion was so frustrating was because nothing that we were discussing was... well, concrete.

I basically agree. I think I said it previously in the thread: this idea that the brain produces consciousness is cargo cult science. "There's a brain! It's doing stuff! There is consciousness! Clearly the brain doing stuff results in consciousness!" Just like people on a desert island watching people build runways and towers and assuming that building runways and towers causes cargo planes to land.

It's bad thinking.
