  LongeCity
              Advocacy & Research for Unlimited Lifespans





The Transition


21 replies to this topic

#1 Matt

  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 22 May 2005 - 08:20 PM


I've been wondering how it would feel to go through the transition and augmentation of neural implants, and whether there would be any difference between a biological brain and silicon. More importantly, how would I still be 'me' when I had completed the transition? How do I make the transition and keep my conscious self alive, rather than just creating a copy of me (destructive scanning)?

I was thinking of a process that could be the safest possible way of ensuring that I end up on the other side of the transition as 'me'. At some point in the future, if people like Ray Kurzweil are right, we should have the capability to scan the brain with some sort of advanced nanotechnology, make a copy of it, and port it over to a machine. But I was thinking, wouldn't this just be a copy of you? Your consciousness would still be in your biological mind, or non-existent if you underwent destructive scanning.

I thought of a process that would ensure that at the end of the transition I would be 'me', but not in my biological brain. Here are a few simplified steps:

1. Start off with your brain
2. Augment the brain using neural implants that take over, mimic and enhance specific areas of the brain
3. Both neural implants and brain work together and become one
4. You always change as a person and continually build an identity on these new implants. Consciousness spreads, as it's not located in any one particular place
5. Degradation and eventual removal of the biological brain

At first we will start with brain augmentation, using neural implants that enhance our natural capabilities. With knowledge of the brain, we can get these implants to become just like our biological brain and to work together with it as one. Stronger connections will be made in the neural implants, so that 'you', meaning your experiences, memories and so on, will come to be stored in the neural implants rather than entirely in your biological brain. Eventually the biological sections of the brain will be used less and less, and soon the neural implant takes complete control of specific functions without you even noticing the transition that took place.
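The gradual handover described above can be caricatured in a toy model (purely an illustration; the unit count and per-step fraction are arbitrary assumptions, not neuroscience): treat the 'self' as a pool of functional units, and at each step migrate only a small fraction of what remains biological, so that no single step changes more than a couple of percent of the whole.

```python
def gradual_transition(n_units=10_000, per_step=0.02):
    """Move functional units from biology to implants a little at a time.

    Returns the fraction of the whole 'self' that changed at each step.
    The point of the gradual route is that every individual step is tiny,
    even though the cumulative result is a complete transfer.
    """
    biological = n_units          # units still hosted biologically
    step_changes = []
    while biological > 0:
        # migrate at most per_step of what remains (at least one unit)
        moved = min(biological, max(1, int(biological * per_step)))
        biological -= moved
        step_changes.append(moved / n_units)
    return step_changes

steps = gradual_transition()
# No single step ever changes more than 2% of the self, yet after a few
# hundred steps everything has moved over.
```

Under this caricature, each step is no larger than ordinary day-to-day change, which is the intuition the gradual approach appeals to.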

When some old memory is triggered, the neural implant could instantly store the memory you just recalled. If a memory is important enough it will stay, but now stored in the implants. If something is not important, you naturally lose that memory.

When a person gets one of their brain hemispheres removed because of medical problems, they are still the same person, right? They are just left with a few disabilities or problems that can be overcome to a certain degree. We can assume that there isn't one particular part of the brain that actually stores consciousness; rather, it is the brain working together. Because of the continuous transition, the new and old memories would be stored on the neural implants. We can assume that the biological part of the brain might eventually not amount to much, since the neural implants take over certain tasks as time goes by.

As a person ages, and when the biological parts of the brain stop functioning properly, they can be removed after first assessing the activity and damage in those areas. You will then have reached the point where your memories, experiences and your 'conscious self' have all been stored on the implants (think of consciousness spreading). This is a better approach than, say, the destructive scanning method, because it would be a slow and seamless transition. I believe it raises fewer questions about whether you actually make the transition.

We are always changing as people; we are never the same person we were a few years ago. It is our experiences, feelings, memories, relationships and other things that make us who we are now. As the transition takes place, all the things that make you up will be stored on the implants. In the end your conscious self exists entirely on the artificial implants that mimic the biological brain. The cool thing is, you will never notice that the transition ever took place (other than a few good enhancements)!

You create your future self by what you do today; it's a process that we go through all our lives, and it continually changes who we are. When thinking about making the transition over to a machine, the way to go is to slowly build up our identity over time, so that this identity or consciousness transfers over to the machine seamlessly, in a similar way to what I described, rather than the cut-and-paste approach, which leaves you dead and only a copy of you on the machine.


+ updated 19/11/05 +

Edited by Matt, 19 November 2005 - 05:34 AM.


#2 susmariosep

  • Guest
  • 1,137 posts
  • -1

Posted 22 May 2005 - 10:29 PM

The root of identity.


How do I make the transition and keep my conscious self alive, not just create a copy of me (destructive scanning)?

...scan the brain with some sort of nanobots and literally make a copy of it, then port it over to a machine. But wouldn't this just be a copy of you? And your consciousness would still be in your biological mind.

. . .

When a person gets part of their brain removed because of problems, they are still the same person, right?

. . .

You have reached the point where your memories, experiences and your conscious self have all been stored on the silicon implants. Memories that weren't important would have died, and important memories remain.


I think you are talking about the concept of identity and the reality of identity.

How do you know that you are you? And how do people know that you are you?

There are then two kinds of identity: the one by which you know yourself to be you and the one by which others know you to be you.

And there are two yous: the one that is you to yourself and the one that is you to others.

How then do you know that you are the you that you know? And how do others know that you are the you that they know?

And how do both you and others get to know that they know the same you that you and they respectively know to be you: you to yourself, and they the you to themselves?

Simple: compare what you are to the memory data you have of yourself: you from your memory of yourself in your brain/mind, and they from their memory of you in their brain/mind.


We are talking here about identity, and you are relating it to the machine that will take our place as a copy of ourselves: either when we are gone, so that the machine is the only copy, or while we are still around, so that the machine is one copy, a machine copy, and we are another copy, a biological copy.

I invite you to think about the identity of the you and of the me and of the he or she, namely, of the self.

Whether it is a machine copy or the original biological copy, and whether there is only one of them or two or more copies (machine ones, or biological ones made by sexual reproduction or by cloning), which copy is the one with the true identity? Or, if there are several copies, do they all have the same, and therefore the true, identity?

In college philosophy there was a sort of dictum I learned, which I must confess I understood only very shallowly, namely: being is one, is true, is good, and is beautiful (ens est unum, verum, bonum, et pulchrum). That was taught in what we might now call perennial or classical philosophy, which is still taught in Catholic institutes of learning.


To make a long story short, what I think makes up identity is the possession of memory data by a person, whereby he can recognize himself when he looks at himself in the mirror.

These memory data are what can be found in a police dossier that will enable the police to find you and not be mistaken about the real you. The memory data can be reduced to two big categories: one covering your physical description, the other covering your moral description, which is divided into two sub-categories, namely the personal and the social.

When you lose your memory data, as in total amnesia, you have lost your identity to yourself: you don't know who you are. You have to ask people if they know you and who you are. If no one knows you in the place you happen to find yourself, then you will not know yourself any more, unless and until you get to a place where people can recognize you.

But where people do not recognize you, today you can still take an image of yourself and post it on the internet, together with your physical details and present circumstances, asking anyone who sees it and recognizes you to contact you and tell you who you are.

Or again, you might report to the nearest police station for help, presenting yourself before the personnel there and asking them to identify you, to help you find out who you are. The police now have enormous access to worldwide databases of people, which they can use to find out what your identity is and who you are.


Now suppose there are several copies of you. Which one is the real you? Here is where that dictum of philosophy comes in: being is one, is true, is good, and is beautiful.

Every copy is a version of you, so two copies make two versions, and two beings: a copy of you is a being, and every being is one; therefore two copies are two beings, and they are distinct. Number one is not number two.

The conclusion here is that when there are several copies of you with your identity, there is not only one identity but several, for the simple reason that each of the copies is arithmetically distinct from the others. As I said, #1 is not #2, and not #3, and not #4...

But we can imagine, and even make, copies which are only numerically distinct from one another but otherwise physically and morally similar. Take careful note of the word: similar.

So, before you go about making copies of yourself that are essentially similar among themselves, like products mass-produced on an assembly line, you have to plan ahead for one copy to be in effective charge of all the other copies.

There is in law what we call the 'alter ego', the 'other I'. The possibility of having several copies of oneself is a good way to get a lot of things done for yourself by the copies, by your alter egos, which you cannot get done in person.


Yes, I welcome the possibility of having copies of myself; then I can spend a lot of time doing what I enjoy doing, and consign to my copies what I have to do but which is an inconvenience to me, if nothing else because it takes time away from the things I enjoy.



I almost forgot: in the event that your identity cannot be found at all, and you don't have a copy of yourself, you can start acquiring a new identity, which will be a new person, a new self. But metaphysicians and physicists will tell you that there is still a continuum between your new self and your old self; it is only that neither you nor anyone else can access the memory data of your old self, yet it still exists in the metaphysical universe, and somewhere in the physical universe, at least as an objective impersonal memory datum in the cosmos of existence.


Susma

#3 Matt

  • Topic Starter
  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 24 May 2005 - 09:03 PM

susmariosep, if at some point in the future all this becomes possible, but only by 'destructive scanning', a bit like 'Cut & Paste', would you go ahead and do it?

I personally don't know if I could go ahead with that kind of process. Although everyone else wouldn't notice any difference, and the copy of me in the machine would be almost identical and would claim to be 'me', I don't see how I actually make it into the machine myself when the brain is destroyed. Therefore I would be much happier with the procedure I mentioned above, if it is feasible.

Do you think my idea of porting the mind over to a machine is more or less feasible than other ideas?
Do you prefer another method?

Anyone else got any ideas? Please share them.


#4 susmariosep

  • Guest
  • 1,137 posts
  • -1

Posted 25 May 2005 - 09:12 PM

Right away, sure!


I welcome copies of myself, the more the merrier, however they are produced, provided I maintain my primacy of power and honor -- or better still: I determine which copy is the best to be in charge and to take my place, so that I am the effective first and original by legal fiction.

Scanning and copying by cut and paste need not be destructive, not with our advanced technology today. We used to do it with scissors and adhesives; now we do it with a computer program. I do it all the time, when I select all the text in this message box and drag it to the collector's link provided by my GreenBrowser browser (get a free copy, find it with Google; it's a browser of less than one meg in size, but awesome).

I am happy to see the day when I can scan my brain and produce a better copy, which I can use as a replacement for the whole brain, by of course improving the copy first before replacing the old one with it.

We have foods that sport the tag 'New and Improved', as do many everyday appliances and devices for home and office. Consider the computer: every few months we have new and better models.

When I have a new and improved brain, I can sport this inscription on my shirt front: New and Improved Susma, 052505, completely revised but totally backward compatible and user friendly.

Susma

#5 stevethegreat

  • Guest
  • 34 posts
  • 0
  • Location:Doesn't really matter

Posted 26 May 2005 - 08:27 AM

Will it ever be possible to make something better than our own brain (in all aspects)? I mean, how can our brain make something superior to its very self?

OK, silicon has a lot more capability than neurons in some respects, but these are the known ones. How can we be sure about silicon being better in all aspects than neurons?

... Just a few questions

#6 susmariosep

  • Guest
  • 1,137 posts
  • -1

Posted 28 May 2005 - 03:05 AM

Dream away!



Will it ever be possible to make something better than our own brain (in all aspects)? I mean, how can our brain make something superior to its very self?

I might sound nutty, but history tells us that in fact everything that is better came from versions, or even totally different entities, that were not as good.

From the abysmal darkness of the cosmos appeared life, which by my evaluation is better than non-life.

Then from life came consciousness, another step forward and upward from a less perfect version or model.

From the natural eye we can now see much farther and much closer with an attachment, even into the molecular scale of substances and into the light-years distant galaxies.

The brain can produce something of more prowess than it possesses in its original nature. The brain can do multiplication, but the calculator, and more advanced still, the computer, can do mathematics faster, better, and in far greater quantity and quality than the brain can ever match. And these are devices devised by the brain.

Evolution from all appearances means going upward and forward in enhancement.


But is there a disaster facing us with our advances by brain power into more and more superior inventions to better the functions of the brain and even to replace the brain itself?

Yes: if we don't employ our brain power and the products of our brain, science and technology, to control our primitive emotions of aggression, greed, envy, hatred and ambition, then it is certain that we will destroy ourselves totally and irreversibly.

War as a phenomenon is waged only by humankind and is peculiar to humankind, and therein lies the seed of man's extinction -- unless and until we employ our science and technology to control our emotions.

And don't let anyone convince you that controlling man's emotions would be contrary to human nature or whatever will of God there may be. That kind of perverted reasoning is the essence of evil in mankind.

Susma

#7 stevethegreat

  • Guest
  • 34 posts
  • 0
  • Location:Doesn't really matter

Posted 28 May 2005 - 09:13 AM

Yes, but we don't fully know what the brain is capable of, do we? It surely isn't just a mathematical device. Emotions are connected with the older part of our brain, the one that has remained from our beast ages; it doesn't seem to work in a mathematical way, and it can be easily deceived.

There have been lots of supernatural events attributed to single men. We will never know if those events ever happened, but we (I) cannot turn down the chance that those things were/are happening. Einstein said mass is one expression of energy. Then we ARE energy; what if the brain can handle this energy of ours better than machines can? If our brain becomes an HD, we will just collect useless knowledge, won't we?

In fact I want to come to my main question: what makes us human, knowledge or emotions?

#8 susmariosep

  • Guest
  • 1,137 posts
  • -1

Posted 30 May 2005 - 03:05 PM

Here is my naive opinion.


In fact I want to come to my main question: what makes us human, knowledge or emotions? -- Steve


My stock knowledge recalls that lesson in introduction to philosophy in which man is defined as a rational animal.

I am not sure whether everyone accepts that definition of man.

But I can understand humanity much better with that definition than with any other kind of description of man.


Emotions belong not only to man but also to animals -- no offense to animals, though.

My daughter wants to keep many cats and dogs in the house; she knows more about them than I do. She told me that these pets have emotions like man's.

You can see that, can't you? For example, there was a news story once about a big dog which killed a couple's newborn baby, seemingly out of envy, because until that baby came, the dog was the 'baby' of that couple.

If you are not aware of emotions in animals, emotions exactly like ours, then take time to observe them. They even have personalities like us. Among the cats and dogs we have had, one is lazy, another troublesome; one eats excessively, one is fond of playing all the time; and there was one which I could consider paranoid.

But as humans we also have reason; that is what is peculiar about humanity: we can think and make judgments, and consciously pursue what we have determined and decided we want to be in life, aside from staying alive and reproducing.

So, what makes us human, knowledge or emotions? My answer is that knowledge makes us human; emotions assimilate us to animals -- nothing against animals, and in many instances it would be better to be animals than possessed of knowledge, whereby we can be demonic, diabolical, satanic, ruthlessly wicked, through the misuse and abuse of knowledge.

Death and destruction wrought by man on fellow man: isn't that due to man's emotions, and not man's reason and knowledge?

Consider this quote, whose writer I can't recall:

There is enough on earth for everyone's needs, but not sufficient for anyone's greed.

That is one dictum attempting to convince people about the superiority of knowledge to emotions.

Here are the seven deadly sins or vices, which when you consider them carefully are all emotions -- not in any order of importance or viciousness: anger, envy, greed, gluttony, lust, pride, sloth.

And here are the four cardinal virtues (cardinal, from Latin -- cardo, cardinis, meaning the hinge which supports the door and enables it to open and close): justice, temperance, prudence, and fortitude; which are all founded on reason and knowledge, and which are intended to control the emotions.


The philosophical ideas are from my school days various introduction courses to philosophy.


Susma

#9 Matt

  • Topic Starter
  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 19 November 2005 - 05:25 AM

Just updated my first post so it makes a bit more sense...

Does anyone think that a slow transition is a much better way to go than "destructive scanning"?

#10 bugmenot.com

  • Guest
  • 31 posts
  • 0

Posted 21 November 2005 - 07:46 AM

After sleep, is the person that awakens really "you"? Prove it.

On a strangely related note, be aware that this account is... how shall I say... public. So previous messages aren't "me" and any subsequent messages may or may not be from "me" [sfty]

#11 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 21 November 2005 - 08:42 AM

Anyone think that a slow transition is a much better way to go than "destructive scannning"

As I've explained in other discussions, I don't think it matters for transfer to physically-identical substrates.

What I do want to point out is that you may be confusing the problems of duplication and uploading. This is very common, and unfortunate, because they are philosophically distinct questions. Whether your mind can be transferred to another brain that is physically identical to your brain (the duplication question) is a separate question from whether your mind can be transferred to another "brain" that performs the same logical operations on completely different hardware (uploading).

Actually, upon re-reading your first post, I take it back. You are not confusing uploading and duplication. In fact you distinctly say that you wonder whether there is a difference between biology and silicon. (Although I think uploading literally to silicon is as likely as uploading to vacuum tubes.) But you then say you are MORE concerned with the duplication problem (as manifested by destructive scanning).

I think you have it backwards. The idea of supporting human personal identity and experience on completely different processing hardware is much more radical than duplication. So, yeah, you are certainly wise to go with a gradual process: not because of the duplication problem, IMHO, but because of the philosophical problem of uploading.

---BrianW

#12

  • Lurker
  • 0

Posted 21 November 2005 - 10:31 AM

justin:

Do you believe that a copy of you (down to the last spin state) is subjectively you?


http://www.imminst.o...205

In Nate's response to the biological duplication question, he answers (with my emphasis included):

3. No. One of the necessary attributes for personhood – along with memories and self-referential observer-states – is self-observable numerical identity. 'Self-observable numerical identity' refers to the capacity of a being that can utter the statement (though not necessarily the sentence), "I am here and I am not there." If I was standing next to my copy, it would be a duplicate, and the copy and I would be qualitatively identical (e.g. A is very similar to B) but not numerically identical (e.g. A is B). Each would have its own self-observable numerical identity. Although each would be able to utter "I am here and I am not there," 'I' in each statement would have a referent that is different from the other. Since natural laws easily distort the notion of numerical identity with regard to personhood (e.g. if I am A when I go to sleep and B when I wake up, strictly speaking, A is not B, only very similar to B), personhood, as long as one cares about personhood (and about babies between non-sentient stage and personhood stage!), necessarily has a 'self-observable numerical identity' function.


Further explanation follows here.

You don't seem to mention "self-observable numerical identity" or anything akin to it in your posts, bgwowk. Do you consider it significant? If so, does that jeopardize your belief (however tentative) in the retention of personhood* after biological duplication?

* personhood as defined by Nate, as quoted earlier

#13 psudoname

  • Guest
  • 116 posts
  • 0

Posted 25 November 2005 - 06:32 PM

1. Start off with your Brain
2. Augmentation of the brain using neural implants that take over, mimic and enhance specific areas of the brain
3. Both Neural Implants and brain work together and become one.
4. You always change as a person and continually build an identity on these new implants. Consciousness spreads, as its not located in any one particular place
5. degradation and eventual removal of biological brain


I agree that something along these lines seems a better idea than copy and pasting. However, I would not think of this as a slow approach, as anything happening around the singularity (although some neural implants might start before the singularity) is likely to happen very quickly, and the brain augmentation would therefore probably be done very quickly.

#14 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 01 December 2005 - 08:12 AM

You don't seem to mention "self-observable numerical identity" or anything akin to it in your posts, bgwowk. Do you consider it significant? If so, does that jeopardize your belief (however tentative) in the retention of personhood* after biological duplication?

Sorry for answering this late; the question only just came to my attention. Specifically, the question is: what is the relevance of the observation "I am here, and not there" to biological duplication?

The answer is that without a high-bandwidth communications link, an identity can only expect to exist in one place at one time. So two simultaneously conscious duplicates are two distinct identities that will see themselves as distinct entities, despite both being valid continuations of a single original.

Look at it this way: every time you do a duplication experiment that involves the existence of two brains at time T + deltaT that are substantially identical to one brain at time T, that brain at time T should expect to subjectively wake in either duplicate brain. Indeed, the proof of this is that if you perform a branching tree of duplication experiments, and ask each person at the end what their subjective experience was, almost all of them will report: "I remember that each time I was duplicated, which brain I woke up in was random." This is because most branch tips are connected by a random-looking path back to the trunk. QED

---BrianW

P.S. The Born Interpretation of quantum mechanical probability neatly derives from MWI by this same mechanism.
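BrianW's branching-tree argument can be illustrated with a short combinatorial check (an editor's sketch, not part of the original post; the function name and depth are arbitrary): in a depth-n tree of binary duplication events, the fraction of endpoints whose path back to the trunk contains a near-even mix of the two outcomes tends to 1 as n grows, which is why almost every endpoint would describe its history as random.

```python
from math import comb

def fraction_near_even(depth=20, tol=0.2):
    """Fraction of the 2**depth endpoints of a binary duplication tree
    whose path back to the trunk deviates from an even 50/50 mix of the
    two branch outcomes by at most tol (i.e. 30%..70% at tol=0.2)."""
    total = 2 ** depth
    near_even = sum(comb(depth, k) for k in range(depth + 1)
                    if abs(k - depth / 2) <= tol * depth)
    return near_even / total

share = fraction_near_even()
# At depth 20 over 95% of endpoints already see a near-even, i.e.
# random-looking, history, and the share only grows with depth.
```

This is just the law of large numbers applied to the branch choices: the overwhelming majority of leaves sit at the end of statistically typical paths.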

#15 signifier

  • Guest
  • 79 posts
  • 0

Posted 01 December 2005 - 07:38 PM

My general version is:

1. Replace one neuron with a 100% perfect nanomachine copy.
2. Do the same for every other neuron.
3. Stretch this process over as long a time as you like.

I don't think uploading, as traditionally envisioned, is a good idea.

#16 johnuk

  • Guest
  • 35 posts
  • 0

Posted 17 December 2005 - 05:34 PM

My general version is:

1. Replace one neuron with a 100% perfect nanomachine copy.
2. Do the same for every other neuron.
3. Stretch this process over as long a time as you like.

I don't think uploading, as traditionally envisioned, is a good idea.


How about this: create a software model of what that neuron was doing, e.g. what its input and output were linked to and what it did to the data flowing into it, and then have it import and export data to and from the same points it was connected to within the tissue.
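A minimal sketch of that idea (my own illustration; the class name, weights and sigmoid transfer function are simplifying assumptions, since real neuron dynamics are far richer): a software unit that remembers its input connections and reproduces an input-to-output mapping at the points the original neuron occupied.

```python
import math

class NeuronStub:
    """A software stand-in for one neuron: it records the strengths of the
    connections feeding it, applies a transfer function to the incoming
    data, and exposes its output where the original neuron's axon did."""

    def __init__(self, weights, bias=0.0):
        self.weights = weights  # one strength per upstream connection
        self.bias = bias

    def fire(self, inputs):
        # Weighted sum of inputs, squashed to a 0..1 firing rate
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1.0 / (1.0 + math.exp(-total))

# A stub wired to three upstream units:
stub = NeuronStub(weights=[0.5, -1.2, 2.0])
rate = stub.fire([1.0, 0.2, 0.7])  # a firing rate between 0 and 1
```

Swapping such a stub in for each cell, one at a time, is the software counterpart of the nanomachine replacement in the quoted list.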

Imagine speeding up your idea, swapping millions or billions of neurons for nanomachine copies per second. Within a few seconds the entire tissue volume would be exchanged.

The only difference being timescale.

To argue that this would destroy consciousness suggests that some form of conscious entity, an ether or volume of some sort, is present and needs time to move (or to grow and reorient, if you want to be scientific and explain it as a cellular phenomenon).

I'm doing my best to reduce as much of the human mind as possible to purely mechanical processes. Memories, for example, would be only clusters of synaptically tuned reflex arcs, shaped by the stimuli experienced at the time of the memorable event (or related to it in some way) so as to represent what was experienced when the memory was made. Relevant stimuli, lots of orange signals on the optic nerve for instance, then trigger a 'reflex cluster' (memory) that has grown to fire in a manner representing the stimuli that make up the memory: the smell of an orange, perhaps?

I think what people have a problem with, when it comes to the idea of uploading their consciousness, is the disconnection from their 'real world' experience of life through their nervous system. I must admit, I am struggling to find an answer to this that I'm 100% happy with as well.

Again, as I've suggested in my other posts, an interesting thing to think about when considering this is what happens to you when you go to sleep or are anaesthetised with a general anaesthetic. I've chosen these as examples because they involve massive reductions in the quantity and quality of the information getting from the external world to your conscious experience of it: you 'lack consciousness'. In effect, for this period of time, you're existing without a sensory nervous system (body) and relying on your own memory to provide you with a stand-in sensory experience of life.

The output of the reflex clusters (memories) seems to acquire a much lower priority than the data usually coming in through your body's sensory nervous system, and so only becomes vivid when that data level is reduced, as you sleep. Memories can also be recalled while awake during problem solving; that need not be numbers-on-paper problem solving, it could be social interaction, spatial and/or temporal orientation, etc. They may also be recalled when your sensory nervous system isn't being particularly stimulated, in the form of daydreaming. I believe your memory acts as a buffer cushion while you sleep, creating a source for the stimuli that you experience as dreams.

Nor do I fully trust that you are actually unconscious, since to experience dreams you must be conscious; unless you only experience memories of them on waking, memories that the dreams themselves created whilst you slept. I suspect that consciousness remains throughout sleep; it is merely that the data coming from external stimulation through your sensory nervous system, and the access to the integration pathways of your memory, are limited, so you don't experience much of your external world, and a lot of what you do experience isn't remembered. But you are conscious.

This all seems to make sense to me. Think about what happens as you drift off to sleep. You become less and less aware of feeling things in the environment around you, distant sounds for instance. You also begin daydreaming, and you might 'wake up' slightly from this state not remembering what you've just been thinking about, but remembering that you were thinking about something and that you weren't asleep. This suggests that something is happening with regard to the data in your sensory nervous system and your memory's permeability to its input.

I began considering this path when considering the 'is it the same consciousness in the morning' question. Either you lose consciousness, in which case you can't be sure in the morning, or the consciousness is continuous through the sleep cycle, and it's just that you don't remember much more of it than fragments of dream memories, themselves created from other memories. In fact, I also suspect that this sleep time may be very close to a requirement of your memory for some 'quiet time', during which there is a greatly reduced level of sensory input, so that it can integrate pathways and connections without being disturbed by memory recalls.

These effects of losing contact with, and/or memories of, your external world could be occurring simply because you lose consciousness, but if you go that way you end up back at 'who's to say the awaking consciousness is the original?'. The characteristics of falling asleep are also a good match for the requirements of my model.

It gets harder as you begin trying to separate the memories from the thing experiencing them, if indeed that separation can even be made. We know so little about what is happening in the brain at these levels of resolution that we're left predicting something we can't experience directly. We experience the output of memories and know it appears similar to the original stimuli that created them, and we know what those original stimuli are like because we know how our axons transmit signals to our brain tissue as neural impulses; so we can make a rough guess at how memories might function on a mechanical level, even though we can't easily stick electrodes into living brains to measure such things. However, we can't experience consciousness itself directly, only its effect, and we have nothing else to reference this effect against, let alone a reference we understand.

Indeed, I have decided to classify consciousness purely as experience. I derived this from 'I think, therefore I am', which I believe to be wrong, as you can't be totally sure you're actually thinking, only that you experience the result of something like thinking: a delay between initiation and output. As a simple example, your thoughts could be pre-recorded memories that you're experiencing. The only thing you can be most sure of is that you are experiencing. What it is you're experiencing is still open to question! [lol]

I'm still confused as to what 'I' am. My memory must be presenting its output to some form of processing, and the result of this processing can then fall back into the memory. Perhaps 'I' am located somewhere within this loop. A much more commonly accepted idea is that we aren't our memories, that simply copying memories across isn't enough. Less commonly accepted is that we aren't the thing doing the processing of our memory's output. Perhaps the answer is more complex than simple eithers and ors. Perhaps the effect we experience is produced in the loop of ultra-short-term memory and the resultant processing, refreshing from millisecond to millisecond; the cache of our mind, if you like. In this sense, the memory and processing exist more as a single unit that operates so rapidly we have nothing else to observe it by: we are it. As the memories being processed stream out and fall into our longer-term memory, this 'conscious unit' can recall them and believe itself to exist because it has memories of experiencing.

Still a very complex and confusing idea, but when I think about it for long enough I do begin to consider attributing 'myself' to such a system. A very strange feeling!
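The buffer-and-processing loop described above can be caricatured in a few lines of code. This is purely a toy illustration of the information flow (short-term buffer feeds a processing step, whose output falls back into the buffer and streams into long-term memory), not a claim about neuroscience; every name here (`process`, `conscious_loop`, the buffer size) is invented for the sketch.

```python
# Toy sketch of the 'conscious unit' loop: a small rolling buffer
# (the millisecond-scale "cache") feeds a processing step, and each
# result falls back into the buffer and into long-term memory.
from collections import deque

def process(percept, buffer):
    """Stand-in for whatever processing the buffer's contents receive."""
    return f"experience({percept}+{len(buffer)} buffered items)"

def conscious_loop(stimuli, buffer_size=3):
    short_term = deque(maxlen=buffer_size)   # ultra-short-term memory
    long_term = []                           # memories recallable later
    for percept in stimuli:
        result = process(percept, short_term)
        short_term.append(result)            # result falls back into the loop
        long_term.append(result)             # ...and streams into long-term memory
    return long_term

memories = conscious_loop(["light", "sound", "touch", "warmth"])
print(memories[-1])
```

The point of the sketch is only that 'memory' and 'processing' here are not separable observers of one another; the system's record of having experienced is just the trail the loop leaves behind in `long_term`.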

Hugs and kisses,
John

Edited by johnuk, 17 December 2005 - 06:22 PM.


#17 Matt

  • Topic Starter
  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 27 April 2010 - 10:35 PM

an article on transitioning the biological brain to an artificial 'immortal' one. I was just speculating back in 2005 about how it would happen, and thought the best approach would be a gradual and seamless transition to an artificial type of brain. Now, as this article describes, the transition takes place using nanobots. I guess you could get the nanobots to replicate and replace the existing neurons so that there is a transition you wouldn't even become aware of. And the debate over whether consciousness can be transferred is, I think, a non-issue, because you aren't making a copy of yourself, you're replacing yourself. Your old memories are copied; your new memories are, from the start, non-biological.

http://memebox.com/f...dying-in-future

"A daily pill would supply materials and instructions for nanobots to format new neurons and position them next to existing biological cells to be replaced. These changes would be unnoticeable to us, but within six months, we would be enjoying the benefits of a powerful new brain.

As we begin to use our new brain, we would not be aware that our mind had been switched from one set of brain cells to another. And when our mind is transferred from a destroyed body to a new one, this move will also be unnoticeable."
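As a rough sanity check on the quoted timeline, the arithmetic of gradual replacement is simple: if a constant fraction f of the remaining biological neurons is replaced each day, the biological share after d days is (1 - f)^d. The six-month horizon and the 99.9% replacement threshold below are assumptions chosen for illustration, not figures from the article.

```python
# Toy arithmetic for the gradual-replacement timeline: what constant
# daily replacement fraction would leave only 0.1% of the original
# biological neurons after roughly six months (180 days)?

def daily_fraction(days=180, remaining_share=0.001):
    """Daily fraction f such that (1 - f) ** days == remaining_share."""
    return 1 - remaining_share ** (1 / days)

f = daily_fraction()
print(f"replace about {f:.1%} of remaining biological neurons per day")
print(f"biological share after 180 days: {(1 - f) ** 180:.4f}")
```

Under these assumptions the required rate works out to a few percent of the remaining neurons per day, which is at least consistent with the article's claim that each individual daily step could pass unnoticed.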

#18 hotamali

  • Guest
  • 49 posts
  • 2

Posted 27 April 2010 - 11:46 PM

Talk about long dead thread resurrection.

#19 Matt

  • Topic Starter
  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 28 April 2010 - 03:48 AM

I have a good memory :D

#20 Teixeira

  • Guest
  • 143 posts
  • -1

Posted 16 May 2010 - 03:16 PM

Everything about you, everything about your consciousness, is related to your biological body. So what makes you think that a machine can be a substitute for your biological body? Who is going to be immortal, you or the machine? Machines sometimes malfunction, what about that? You know how good it is to be inside your body, but what about a machine?
To tell you the truth, I don't know what you are talking about. It is as simple as that...

#21 Elus

  • Guest
  • 793 posts
  • 723
  • Location:Interdimensional Space

Posted 16 May 2010 - 04:18 PM

Everything about you, everything about your consciousness, is related to your biological body. So what makes you think that a machine can be a substitute for your biological body? Who is going to be immortal, you or the machine? Machines sometimes malfunction, what about that? You know how good it is to be inside your body, but what about a machine?
To tell you the truth, I don't know what you are talking about. It is as simple as that...


A biological body's propensity for malfunction is extremely high. By replacing biological substrates with non-biological ones, we would be able to do incredible things and be far more resilient to aging and disease. Not to mention the incredible leap in intelligence.


I also fully agree with the premise of gradually replacing neurons so that we don't lose our consciousness. This type of process would be incredibly complex, so I would be curious to know when it might actually become feasible.

Edited by Elus Efelier, 16 May 2010 - 04:20 PM.


#22 Kolos

  • Guest
  • 209 posts
  • 37
  • Location:Warszawa

Posted 16 May 2010 - 06:50 PM

To be honest, I don't like the idea of living in my biological body for thousands of years. I would say that improvement in our lifestyle would be more important than longevity itself, since it's more important to have a happy life than just a long one.



