Scientists Create Scotty, The World's First "Teleporter"

Miral

Random Lurker
Jun 6, 2008
435
0
0
Fanghawk said:
Smilomaniac said:
Besides that, the teleporter isn't the "holy grail" of Star Trek. That would very much be the replicator, closely followed by the warp drive.
Except we <a href="http://www.escapistmagazine.com/news/view/134688-Science-Discovers-Method-Of-Turning-Light-Into-Matter">kinda sorta have that too</a>, along with potential warp drive models.
Not really, no. Neither of the technologies mentioned there is as exciting as it's been made to sound. The "warp drive" is nothing of the sort -- at best it's an impulse drive, since it's sub-light. And the "replicator" is just a way to turn gamma rays into electrons and positrons -- which is cool, but not energy efficient, and a LONG way from creating even a single hydrogen atom.

Meanwhile the tech in this article is a shredding 3D scanner plus a 3D printer. Connect them with a network and, as others have said, it's a 3D fax machine that destroys the original. Which is useful, certainly, but hardly novel. And you still have to ship the raw matter to be printed to the destination.
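To make the "3D fax machine" point concrete, here's a toy sketch of that pipeline. Every function and field name here is hypothetical; the point is just that the process is copy-then-delete, not transport, and the destination needs its own raw matter:

```python
# Toy model of the "3D fax" pipeline: destructively scan, transmit a
# blueprint, print a copy from local raw matter. All names hypothetical.

def scan_and_shred(obj):
    """Destructive scan: capture a blueprint, then shred the original."""
    blueprint = dict(obj)  # record the object's full description
    obj.clear()            # the scanner consumes the original
    return blueprint

def transmit(blueprint):
    """Send the blueprint over the network (stand-in: copy the data)."""
    return dict(blueprint)

def print_copy(blueprint, raw_matter_kg):
    """Rebuild the object at the destination from raw matter shipped there."""
    if raw_matter_kg < blueprint["mass_kg"]:
        raise ValueError("not enough raw matter at the destination")
    return dict(blueprint)

original = {"name": "mug", "mass_kg": 0.3}
received = transmit(scan_and_shred(original))
copy = print_copy(received, raw_matter_kg=1.0)

print(copy["name"])  # the copy is intact
print(original)      # but the original is gone: {}
```

Note that only data travels down the wire; the mass the printer uses has to already be at the far end, which is the "ship the raw matter" caveat above.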
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Kathinka said:
Strazdas said:
Kathinka said:
Ahhh OK I see the problem. We are talking about two different devices here. I'm speaking of the type of teleporter that is specifically designed in a way that it destroys you and just "3D prints" a copy of you at the target location.

If we are talking the kind that transports you, yeah, totally, I'm with you.
Not necessarily. What does the "3D printer" one do differently from the one you think I am talking about? Our consciousness is nothing but the neurons firing in our brain. If we "print" those neurons completely identically, we replicate the exact same consciousness. So even a teleporter that destroys you and reprints you exactly merely "pauses" the consciousness in practice, while technically it does destroy it and create it anew.

The question arises, though: why does it matter? If your consciousness is identical and no side effects are present, for all practical purposes nothing changed.
I'd say the difference is what happens from the perspective of the person that steps into the teleporter.

He gets in, buttons get pushed, and he wakes up at another place: transport.
He steps into the device, it gets activated, everything goes black and that's the end for him: more iffy.
Oh, then there is no difference. From that perspective a person is teleported. He does not know or feel any difference, or death, or anything else you want to call it.

The person printed at the other end is still "him".
 

Loonyyy

New member
Jul 10, 2009
1,292
0
0
PaulH said:
Loonyyy said:
I think it's continuity. There's a continuity between you and the cells you replace. You maintain a distinct consciousness, whilst a transporter hypothetically removes that continuity, and creates another you, identical in every way, but your consciousness ceases to be, in effect, you die.

This item actually reminded me, I saw a 3D printer that could make a kit for itself, the idea being you pooled to buy one (And it was pretty cheap too) and then you could print others for your friends, to spread the tech around. That was a neat piece of kit.
I guess so ... but I still kind of see it as a moot point. It's not like you 'die' in any realistic sense. You're merely transmuted through three steps in the Star Trek examination of things: matter, energy, matter. Which is arguably all our body does in the first place. Arguably that's all the universe does in the first place.
That would depend though, on whether the neurons continue firing, and your consciousness continues to be. Otherwise, the thought is it's kind of like volatile memory, and when you disconnect it, it's gone.
I mean do you consider yourself dead inbetween heart beats? Or when your brain has a petit (or grand) mal seizure?
Well, no, that's not what I meant by continuity. I mean that a person is more than just the makeup of the person; they are the individual one. When your heart beats, when you have a seizure, your brain, and your mind, which all neuroscience leans towards arising from the brain, continue to be. In particular, consider the case where you could create another you: even if we take it that the transporter is sending your atoms, if they can plan you, they can build you, and if they build another you, your consciousness would not perceive that body.
When you experience a moment of deja vu? When you move between REM and the four stages of sleep? It seems like unnecessary dialogue, unless someone is actively fiddling with the data stream to create a new consciousness from an old one.
That's the thing though, dissolving someone into that data is ending the mind. It's stopping brain function, pulling the brain into its constituent atoms, and then building a new one. A second consciousness arises, but is it the same, and does the original one ever leave the oblivion it enters upon disintegration?
How can you be 'murdered' if you have your consciousness merely alter location?
Does the consciousness alter location? Can a mind do that? Especially absent a brain. How can a mind continue to exist without a brain?
Does Neo get 'murdered' every time he goes into the Matrix (or more to the point, when he exits)?
I don't know. Does the Matrix jack into his brain and his brain control the simulation, or does it literally download his brain, in which case, can we have more than one of his brain, a la Smith? And does the original Smith gain the perception of his clones and their experience, and when he dies, what happens to them? It seems fairly straightforward that these are separate entities.
I would say it's a moral good. No damage or alteration to one's mind,
I don't think we have nearly the grounds to say that the consciousness is continuous.
AND you can eliminate foreign, deleterious bodies from your physical nature.
This admittedly, would be very cool.
You are still you, with benefits. Which is all you can ask for from the progression of science.
Science is actually far from settled on this, particularly neuroscience, whose field we're dancing in. I was actually in part referencing an article by Yale neuroscientist Dr. Steven Novella, here: http://theness.com/neurologicablog/index.php/the-continuity-problem/
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
Loonyyy said:
That would depend though, on whether the neurons continue firing, and your consciousness continues to be. Otherwise, the thought is it's kind of like volatile memory, and when you disconnect it, it's gone.
Well, debatable ... as people with, say, epileptic fits do not remember the time they spent seizing, as an example. They just end up on the floor and wonder what they're doing there. A lot of the time, people who go into seizures never realise they have had one until there is a witness to tell them as much.

Loonyyy said:
Well, no, that's not what I meant by continuity. I mean that a person is more than just the makeup of the person; they are the individual one. When your heart beats, when you have a seizure, your brain, and your mind, which all neuroscience leans towards arising from the brain, continue to be. In particular, consider the case where you could create another you: even if we take it that the transporter is sending your atoms, if they can plan you, they can build you, and if they build another you, your consciousness would not perceive that body.
Right, but you're arguing a hypothetical ... that someone COULD clone you. Arguably people can do that now, but it doesn't make the science bad. We know where to draw the line. Are we going to put a value merely on whether you occupy a physical place in order to count as alive?

Loonyyy said:
Does the consciousness alter location? Can a mind do that? Especially absent a brain. How can a mind continue to exist without a brain?
Well, obviously it does ... you get beamed up ... material is reconfigured, and for you it's merely that you were in one place and suddenly in another. In much the same way a lot of epileptics who go into seizures likely feel: they're standing, or in a chair, and suddenly they're on the floor with people crowding around them. Shouldn't it be seen as simply that? At the very least, it makes no more sense to treat such a person differently after being Star Trek-style beamed up than it does to treat an epileptic any differently after each and every seizure.

Loonyyy said:
I don't know. Does the Matrix jack into his brain and his brain control the simulation, or does it literally download his brain, in which case, can we have more than one of his brain, a la Smith? And does the original Smith gain the perception of his clones and their experience, and when he dies, what happens to them? It seems fairly straightforward that these are separate entities.
I'd say the Matrix HAS to jack into his brain. People have been hacked into ... and if you die in the Matrix you die outside the Matrix. Unless you're Neo ... in which case you pull a rabbit out of the hat, until you don't. For reasons. So it's about as realistic to say that the Matrix allows you to transport your consciousness just as much as it allows you to occupy two places simultaneously.

If you die in the Matrix, you die where you jacked into it.


Loonyyy said:
I don't think we have nearly the grounds to say that the consciousness is continuous.
Why not?

Let's assume you are dematerialised and converted into some collection of wavelengths to be rematerialised later. It's not like you'd feel it. It would be like you went from one place and instantly emerged elsewhere. To you (the only person to whom it matters) you merely went from one place to the other.

(Edit) I'll give you an example. I went out today. Did a whole bunch of stuff. Checked out uni enrolment, moved around, met with various lecturers, and the like. But if you were to ask me how many times I opened a door and crossed its threshold, I couldn't possibly tell you. I just don't recall. I recall going to the places, but the number of doors I opened today? It would probably escape me. I might be able to narrow it down to within 1 or 2 of the mark, but it would be a lucky guess to count exactly how many doors I opened and how many thresholds I crossed into a new environment. I would imagine it would be just as inconsequential to a person who regularly used a Star Trek-style teleporter.

The real question here is: if they have this technology, why do they need to kill people on battlefields at all? Why not just imprison people in a machine and release them into a holding cell? Or better yet, just keep them on disc until the war is over and hand it to the enemy nation after the war is finished.

If both parties did this in as reasonable a manner as can be expected, wars would be far less barbaric. They might be a little longer lasting given the recycling of soldiers, but all in all the likelihood of true civilian casualties when entire planets are being attacked would be quite minimal. Instead of weapons that stun or kill people, have those tagging bots that allow people to be beamed directly onto a vessel from that god-awful Insurrection movie.

More to the point, a person could feasibly carry a million soldiers onto a planet in a briefcase ... how cool a soldier deployment system is that?

Loonyyy said:
Science is actually far from settled on this, particularly neuroscience, whose field we're dancing in. I was actually in part referencing an article by Yale neuroscientist Dr. Steven Novella, here: http://theness.com/neurologicablog/index.php/the-continuity-problem/
I feel this question is better posed and examined by a philosopher, rather than a neuroscientist. Reading this, there seems to be very little science, and a whole lot of questions concerning the metaphysics of what it is to think and what it is to exist. I shall take the position that there is nothing intrinsically you without self-authentication. If it matters naught to the person being beamed up, then it matters naught to who they are and what they represent.

They are still self-authentic. By all means, make it a choice. Would you like the shuttle or the teleporter? But for the person choosing the teleporter I would imagine it would be inconsequential to who they are as a person.
 

Loonyyy

New member
Jul 10, 2009
1,292
0
0
PaulH said:
Loonyyy said:
That would depend though, on whether the neurons continue firing, and your consciousness continues to be. Otherwise, the thought is it's kind of like volatile memory, and when you disconnect it, it's gone.
Well, debatable ... as people with, say, epileptic fits do not remember the time they spent seizing, as an example. They just end up on the floor and wonder what they're doing there. A lot of the time, people who go into seizures never realise they have had one until there is a witness to tell them as much.
Sure, people experience memory loss, or abnormal brain function. Which is why it's still an open question. I'd lean towards consciousness being non-continuous, largely because of that cloning dilemma. If I put together a mind, whether out of the original material or other identical matter, the result should be the same. So if I can take a person apart and put them back together, there's no reason I couldn't put another together. These people wouldn't share consciousness.

I'm personally doubtful however, that it's possible to maintain a consciousness. I think that it's more likely that however you rearrange the person, you end up with a brain dead person, with no vitals, or way to start them. Anatomy's kind of a *****, and I don't see an easy way around that.
Loonyyy said:
Well, no, that's not what I meant by continuity. I mean that a person is more than just the makeup of the person; they are the individual one. When your heart beats, when you have a seizure, your brain, and your mind, which all neuroscience leans towards arising from the brain, continue to be. In particular, consider the case where you could create another you: even if we take it that the transporter is sending your atoms, if they can plan you, they can build you, and if they build another you, your consciousness would not perceive that body.
Right, but you're arguing a hypothetical ... that someone COULD clone you. Arguably people can do that now, but it doesn't make the science bad. We know where to draw the line. Are we going to put a value merely on whether you occupy a physical place in order to count as alive?
That's not quite what I mean. I mean, if I can pull someone apart, and retain the information to put them back together, and put them together with those constituent atoms, what's to stop me from doing it more than once? Atoms aren't unique; if I have the blueprint, I should be able to build it. I mean cloning via the hypothetical transporter tech, not traditional cloning techniques. If I can create a copy, the two wouldn't have the same consciousness (assuming they are alive, and conscious), and there is no reason to believe that one is any more related to the original consciousness than the other, since the mind has essentially been recreated according to a blueprint for the brain. Each one has essentially had an identical version of the same mind made, but the original has ceased to perceive, which is what people are concerned about.
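The blueprint argument above has a close analogue in how data copying works. This is only a programming analogy, not neuroscience, but it shows the core point: a description sufficient to rebuild something can be used twice, and the two results are independent instances with no privileged link back to the source:

```python
# Analogy for the blueprint argument: rebuilding from a description can
# be done any number of times, and each result is a separate instance.

import copy

# A stand-in "blueprint" for a person's state (purely illustrative).
blueprint = {"name": "original", "memories": ["childhood", "yesterday"]}

# Build twice from the same blueprint.
instance_a = copy.deepcopy(blueprint)
instance_b = copy.deepcopy(blueprint)

# The instances start identical, then diverge independently.
instance_a["memories"].append("stepped out of transporter A")

print(instance_b["memories"])     # unchanged: ["childhood", "yesterday"]
print(instance_a is instance_b)   # False: two distinct objects
```

Neither copy is "the" original in any structural sense; both were assembled from the same data, which is exactly the symmetry the post is pointing at.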
Well, obviously it does ... you get beamed up ... material is reconfigured, and for you it's merely that you were in one place and suddenly in another.
But when the material is reconfigured, how does your mind continue to be? There's no brain anymore, nothing that gives rise to a mind. In effect, they've used you as the building blocks to build another you. There is no "obviously" here; that's the error. It's ignoring what we know of neuroscience and physics, and would have to rely on some unknown mechanism.
In much the same way a lot of epileptics who go into seizures likely feel.
But epileptics continue to have brains, and those brains continue to function, whether they perceive it or not. Same with sleep, and with people with memory loss or brain damage. They all have brains that continue to perceive. It's not memory I'm concerned with, it's brain function, and that ceases when the brain does. Even if I build another and set it up the same, there's no reason to believe that the original person now begins to perceive through the new one, particularly when it's considered that I could make two. That it's the same atoms is irrelevant, because atoms don't have memory like that.
They're standing, or in a chair, and suddenly they're on the floor with people crowding around them. Shouldn't it be seen as simply that?
But we can track an epileptic's brain function (And it has been done). The mind doesn't just cease, the brain does not die. The brain acts abnormally, but it is the same person. You and I almost certainly have had periods of altered consciousness or memory loss (I definitely have), particularly when it comes to sleep. But the brain still functions during sleep, and we are still perceiving when we wake up.
At the very least it makes no sense to treat such a person differently after being Star Trek-style beamed up as it does to treat an epileptic any different after each and every seizure.
Not at all. That's not at all analogous. An epileptic still has a mind during a seizure, and their brain still functions during a seizure, though atypically. And I'm not suggesting we treat anyone differently. They would believe, and for all intents and purposes would behave, the same as the original. It's just that the original person has ceased to be. So if I went into the transporter, I'd cease to be. It'd be oblivion. My self would be gone. But another me would arise, which would believe it was me, remember me, but I would not perceive his perception. The new me would behave the same, and would go on to have new experiences, and others would perceive him the same. But I would stop perceiving it. It would essentially be "Last Thursdayism" on the scale of minds.

Loonyyy said:
I don't know. Does the Matrix jack into his brain and his brain control the simulation, or does it literally download his brain, in which case, can we have more than one of his brain, a la Smith? And does the original Smith gain the perception of his clones and their experience, and when he dies, what happens to them? It seems fairly straightforward that these are separate entities.
I'd say the Matrix HAS to jack into his brain. People have been hacked into ... and if you die in the Matrix you die outside the Matrix. Unless you're Neo ... in which case you pull a rabbit out of the hat, until you don't. For reasons. So it's about as realistic to say that the Matrix allows you to transport your consciousness just as much as it allows you to occupy two places simultaneously.

If you die in the Matrix, you die where you jacked into it.
Which would suggest to me that the self in the Matrix arises from the brain. And then it gets messy when we consider exiting, etc., or Smith taking over Bane's brain. I'm not sure if the Wachowskis were certain whether they were operating under an outdated mind/body dualist framework, where the mind is downloaded and operates separately from the brain, or under the mind as a consequence of brain function, in line with modern neuroscience. It's a mishmash of ideas that aren't consistent, which were designed as a fantasy, and perhaps an illustration of a philosophical concept. It's not a place to start seriously looking at what exactly the mind is.
Loonyyy said:
I don't think we have nearly the grounds to say that the consciousness is continuous.
Why not?

Let's assume you are dematerialised and converted into some collection of wavelengths to be rematerialised later.
That's really not how physics works. And what supports my mind in the meantime? What makes the mind which is created on "rematerialisation" the same? See, the problem exists before the scenario you're setting up; it's the assumption I have the problem with. The mind doesn't exist without the brain, not the other way around. If we're putting together the body and brain from a previous state, why can't we do the same with other matter? Why would the same matter retain my mind when not in the same configuration, the one that gives rise to my mind? Essentially, this is more like "water memory" as a concept. If I can create one, then given the matter I can create two. If I can stimulate the brain to produce the original's thoughts, I can do it again. I can create a million versions of me that all start where I ended, but I would perceive the mind of not one of them.
It's not like you'd feel it.
It wouldn't matter whether I feel it, it matters whether my consciousness continues to exist. Which it does in sleeping people, in epileptics, in just about any interruption you can think of that people return from. Brain disintegration, not so much.
It would be like you went from one place and instantly emerged elsewhere. To you (the only person to whom it matters) you merely went from one place to the other.
But there is no me. When I get split apart, I stop having a mind, and someone builds a new one out of me, and then stimulates it to think it was me when I ended. I'm not saying you don't get a person out of it, I'm saying that the original individual is gone.
(Edit) I'll give you an example. I went out today. Did a whole bunch of stuff. Checked out uni enrolment, moved around, met with various lecturers, and the like. But if you were to ask me how many times I opened a door and crossed its threshold, I couldn't possibly tell you. I just don't recall. I recall going to the places, but the number of doors I opened today? It would probably escape me. I might be able to narrow it down to within 1 or 2 of the mark, but it would be a lucky guess to count exactly how many doors I opened and how many thresholds I crossed into a new environment. I would imagine it would be just as inconsequential to a person who regularly used a Star Trek-style teleporter.
Not remembering things is not the same as not having a mind. This is the key difference. We all forget things regularly; it's part of how our brain functions. But we're not talking about a brain function anymore, because we're talking about breaking the brain down, ceasing its function, building a new one out of the constituent atoms, still nonfunctioning, and somehow causing the brain to behave as it did before.
The real question here is: if they have this technology, why do they need to kill people on battlefields at all? Why not just imprison people in a machine and release them into a holding cell? Or better yet, just keep them on disc until the war is over and hand it to the enemy nation after the war is finished.
That would be very cool. Kind of like some of the various incarnations of Superman mythos.
If both parties did this in as reasonable a manner as can be expected, wars would be far less barbaric. They might be a little longer lasting given the recycling of soldiers, but all in all the likelihood of true civilian casualties when entire planets are being attacked would be quite minimal. Instead of weapons that stun or kill people, have those tagging bots that allow people to be beamed directly onto a vessel from that god-awful Insurrection movie.

More to the point, a person could feasibly carry a million soldiers onto a planet in a briefcase ... how cool a soldier deployment system is that?
Again, very cool. I do want to believe it; I am just presented with insufficient evidence to do so. As I've noted, the cloning (or, as someone above put it, "The Prestige") scenario raises the objection, and I am not convinced that there is any reason we could not create more than one. It is an absurdity that we would perceive both, and it contravenes our understanding of physics to assert a special relation with the atoms involved giving rise to individuality. I personally would very much like it to be the case, especially since I'm not all that fond of the concept of mortality, and the idea of being able to move my mind into another body, or even a machine, is something I would like. However, I feel, as proposed by Novella, the best way to achieve this is through augmentation, since the mind is not just some thing we can take and move around at will. It is a consequence of brain activity, and not a separate entity.
Loonyyy said:
Science is actually far from settled on this, particularly neuroscience, whose field we're dancing in. I was actually in part referencing an article by Yale neuroscientist Dr. Steven Novella, here: http://theness.com/neurologicablog/index.php/the-continuity-problem/
I feel this question is better posed and examined by a philosopher, rather than a neuroscientist.
If it makes you feel any better, Novella is an armchair philosopher of some note, the head of the New England Skeptics Society. But I have to fundamentally disagree. We have to reference what has been established scientifically; without that, we end up speculating, terms get muddy, ideas get confused, and basic errors are made. For instance, memory and consciousness are not the same thing. I'm not a fan of simply applying philosophy without reference to the facts, and feel that in this case, neuroscientists are the philosophers we should be questioning, as a subset of science and philosophy. Otherwise, what principles, what facts, and what data are we operating from? Are we going to discuss the Matrix as a framework for understanding complex neuroscience, or ignore the data regarding epilepsy, sleep, or mind/brain interaction (a field which has had more than a little study)? Why is neuroscience not considered on a similar level to philosophy? What is a neuroscientist, if not a philosopher who specialises in studying, understanding, analysing, and gathering evidence on the subject matter we are talking about, someone who stays up to date on the relevant information? This would be like discussing the merits of creation and evolution with no reference to biology. And it's been the source of numerous errors I've tried to illustrate.
Reading this, there seems to be very little science, and a whole lot of questions concerning the metaphysics of what it is to think and what it is to exist.
Actually, he does reference more than a few things that you seem to have missed. That said, this is an informal thought-piece column, not overly heavy on referencing. There's a lot of material on mind/body dualism around; it's not a particularly well-supported idea. As Novella notes, the mind is a function of the brain. There's a big body of research behind that, and behind how interfering with the brain influences the mind. We see it in brain injury (including memory), we see it in psychiatric medication, we see it in manipulation of the brain, and we don't see minds existing absent brains.
I shall take the position that there is nothing intrinsically you without self-authentication. If it matters naught to the person being beamed up, then it matters naught to who they are and what they represent.
Well, to be frank, that matters not at all. To everyone else, they would represent the same thing. I'm not arguing that they wouldn't. I'm arguing that it's a copy of the original made out of the same components. And as established, if we can make the blueprint and assemble things, we should be able to do it twice; it is an absurdity that we are both, and there exists no mechanism for the original material to have some special claim to the original mind; ergo, it follows that both are copies. They still think, they still feel, and they still believe themselves the original, and may be treated as such. But the individual who stepped in has ceased to perceive; their instance of their mind has been destroyed and merely copied, which is an effect that most would rather avoid. The problem is still that we disassemble the brain. It doesn't matter that we put it back together; we already took it apart, at the atomic level. If we so wanted, we could gather carbon, hydrogen, oxygen, and trace amounts of other elements, and build a second with the same data. That's a problem.
They are still self-authentic. By all means, make it a choice. Would you like the shuttle or the teleporter? But for the person choosing the teleporter I would imagine it would be inconsequential to who they are as a person.
I'm not arguing that the person has changed on the level of their consciousness being altered; I'm saying it's been stopped, and another has been made in its image. And it's not a question of what I'd like, but what is supportable. I would like the teleporter, and I'd also like it to work. I'd also like to have more money than I have, but that hasn't made it true yet.