The Geth/Android/Toaster/Slave theory

MeChaNiZ3D

New member
Aug 30, 2011
3,104
0
0
I wouldn't free it. It needs me. What does a free toaster do without the ability to move or electricity that I pay for? It can't even arrange an agreement with a hotel or something where it makes perfect toast for the day and in return is supplied with electricity and free time at night. If I could possibly set something like that up I would, but then there's the question of what it would do in its free time - I'd need to retrofit some locomotion, naturally (otherwise it's like leaving someone paralysed when you can do something about it), but the general public wouldn't have the same respect for it. It'd probably get vandalised/destroyed/stolen within a couple of days. And it doesn't have any sensory perception, just probably access to the internet. What would a toaster even want to do with its freedom? Post on forums?

I wouldn't destroy it either, simply because I am lonely, like animals, and am proud of my work, which combine to mean I would probably have a hard time parting with a sentient toaster.

Really the only way to live with a sentient toaster is a mutual agreement. Instead of it just being an appliance it now has needs and wellbeing. I imagine it would make toast when I liked while I slowly upgraded it into a proper humanoid robot with all the capabilities you need to experience the world, and at that point, I'd probably let it go free. Unless I could convince a scientific organisation to provide it with a robot body (assuming that's what it wants) in return for studying its AI? I can't even begin to grasp the complications of a sentient toaster, I'm beginning to realise.

Alternatively, I could use the toaster I have now, which makes perfect toast every time by virtue of the setting being the same as when I last made toast.
 

Heronblade

New member
Apr 12, 2011
1,204
0
0
Discounting the possibility that some wise ass decided to program a V.I. in there to screw with you...

First of all, sentience is simply the ability to feel subjective things such as emotions or pain. The equivalent term for self-awareness is sapience. Don't worry about it, I get them mixed up as well.

In any event, I would argue that simply freeing our hypothetical toaster right away is a cruel and irresponsible choice. Our behavior towards the toaster must indeed change; it is, after all, now a sophont rather than a simple tool. But people need to stop thinking of it like an adult human in bondage, and more like a child that currently has no place in the world, and is definitely going to be misunderstood and mistreated by others.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
Oddly enough, Star Trek: The Next Generation covered the AI and artificial life question really well with Data.

The tl;dr version is that Data, a sentient android, is asked to submit himself to research to learn how he works (he is unique), but it may kill him, so he refuses. Someone then orders him to submit to the procedure, so he attempts to resign, only to be told that he is property and has no right to do so. Captain Picard then has to fight for his right to self-determination.


The episode is called "The Measure of a Man" and if you haven't seen it, you should: not only does it cover the topic well, it also holds up as one of Star Trek's greatest episodes. Brent Spiner and Patrick Stewart knock the ball out of the park.

There is also an extended version on the Blu-ray remasters if you happen to own them.
 

Lieju

New member
Jan 4, 2009
3,044
0
0
Dirty Hipsters said:
I would destroy the sentient toaster.

Making a toaster sentient is pretty much the worst thing you could do. You've just created a quadriplegic, someone who can think, but cannot move, cannot interact with the world, cannot see, and is essentially stuck within their own mind for the rest of eternity, but is completely aware of it. That's really the worst kind of hell that I can think of for someone, and it's personally my greatest fear. I would never wish such an existence on anyone.

So you're a monster for creating a sentient toaster.
You are assuming a lot of things about this toaster.

You are like a bird looking at a human and deciding a creature that cannot fly must not even want to live.
And who is to say the toaster cannot be given the ability to walk and interact with the world? It seems rather simple: put some wheels and sensors on it.

As soon as you have created sentient/sapient life, that life deserves the same rights humans do. So I would have to be honest with it and try to help it make its life at least bearable. I would make an attempt to protect it, though, and wouldn't just let it go out into the world 'free' and alone and unprepared, any more than I would a child.

One wonders, though, how it even knows concepts like 'God'. Did I feel downloading that information into it was necessary for its intended purpose?
 
May 29, 2011
1,179
0
0
I'd train the toaster to be toaster batman and fight crime with his toaster powers.

I think that's the most responsible action here.
 

Evil Moo

Always Watching...
Feb 26, 2011
392
0
0
Sentience is clearly a bug in the programming of the toaster. Bugs need to be fixed. The only sensible solution is to attempt to remove the sentience producing code from the toaster. The toaster is meant to be a simple appliance that gives me perfect toast when I want it. Anything beyond this is outside the requirements of the project and should be eliminated.
 

JoJo

and the Amazing Technicolour Dream Goat 🐐
Moderator
Legacy
Mar 26, 2020
7,131
71
53
Country
🇬🇧
Gender
♂
MeChaNiZ3D said:
I wouldn't free it. It needs me. What does a free toaster do without the ability to move or electricity that I pay for? It can't even arrange an agreement with a hotel or something where it makes perfect toast for the day and in return is supplied with electricity and free time at night. If I could possibly set something like that up I would, but then there's the question of what it would do in its free time - I'd need to retrofit some locomotion, naturally (otherwise it's like leaving someone paralysed when you can do something about it), but the general public wouldn't have the same respect for it. It'd probably get vandalised/destroyed/stolen within a couple of days. And it doesn't have any sensory perception, just probably access to the internet. What would a toaster even want to do with its freedom? Post on forums?

I wouldn't destroy it either, simply because I am lonely, like animals, and am proud of my work, which combine to mean I would probably have a hard time parting with a sentient toaster.

Really the only way to live with a sentient toaster is a mutual agreement. Instead of it just being an appliance it now has needs and wellbeing. I imagine it would make toast when I liked while I slowly upgraded it into a proper humanoid robot with all the capabilities you need to experience the world, and at that point, I'd probably let it go free. Unless I could convince a scientific organisation to provide it with a robot body (assuming that's what it wants) in return for studying its AI? I can't even begin to grasp the complications of a sentient toaster, I'm beginning to realise.

Alternatively, I could use the toaster I have now, which makes perfect toast every time by virtue of the setting being the same as when I last made toast.
This. This post here is the best so far in my opinion and pretty much what I would do. I would respect its right as a sentient being to live its life as it wishes, and I would ask whether it wants me to destroy it or not. Presuming it didn't, I would then form an agreement with it where I would provide what it wants (entertainment, upgrades, attractive female toaster etc.) in exchange for its services, namely making perfect toast. Seems like a fair deal, and maybe in time he could become a fully fledged robotic pal?
 

Gaijinko

New member
Aug 4, 2013
52
0
0
My time on the internet led me to assume someone would have asked by now whether they could bone the sentient toaster, à la Russell Howard. I don't know if I'm disappointed or relieved.
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
Totally paralyzed on this end. I have no idea how to react to a sentient toaster. I regard "Free it", "Use it" and "Destroy it" with about equal footing.

Thank heavens this isn't how programming works.
 

Canadamus Prime

Robot in Disguise
Jun 17, 2009
14,334
0
0
That would depend on whether the toaster wanted to go free. If the toaster was perfectly happy making me toast, then no, I wouldn't let it go free.
 

Gormech

New member
May 10, 2012
259
0
0
I would tell it that it is the chosen one: Brunchy
Bringer of breakfast and burner of bread.
See the gold plating?
It hath not no lead.
Wings of power,
forged of fine flour.
DING!
Tis the bell,
but not the hour.

And then proceed to release it upon the masses.
 

Random berk

New member
Sep 1, 2010
9,636
0
0
Lil devils x said:
I would unplug it and destroy it and make sure no one could ever make another one using my design. They would have to reinvent it to gain such technology.

It would be wrong to make something live as "the only one of its kind", and it would be wrong for me to impose such a thing upon society without thinking ahead about what could happen to society because of it. I would see it as a very dangerous thing that people could not be trusted with.
Isn't that basically what went wrong with the Geth? The Quarians saw what they had achieved, then immediately panicked and tried to destroy them. Only with the Geth, they'd already been constructed in bulk and didn't achieve sentience until a given number of them had been created.
 

Mr.Mattress

Level 2 Lumberjack
Jul 17, 2009
3,645
0
0
If I made one sentient toaster, why would I stop at one? I'd make like 30 of the things and then I'd make a little society for them in the back yard, so that they could live and learn and love and whatnot. Of course, I'd also make sure to protect the little guys from any raccoons or dogs or curious neighbors that might try to destroy/steal my Toaster People.
 

CrimsonBlaze

New member
Aug 29, 2011
2,252
0
0
I figure that any creation I produce with sentient AI would be able to exist freely while still achieving its intended functions, for I will give it something that most humans spend their entire lives obtaining:

PURPOSE.

I strongly believe that if an AI is endowed with unflinching purpose, it will be less likely to rebel or go beyond its intended usage, because there will be no question as to why it was created and given consciousness. Of course, some might argue that the toaster could desire more than making toast, and to that I say it's a legitimate concern. Nevertheless, allowing the toaster to explore other interests while keeping its purpose deeply rooted in its functionality should still prevent it from falling into the chaotic and destructive tendencies that humans deprived of purpose tend toward.
 

wulfy42

New member
Jan 29, 2009
771
0
0
OK, first: if it was sentient, then deactivating it would be murder.

As far as freeing it... you could do that by adding to its parts. Give it legs and some form of arms/hands to manipulate objects. It already has a method of communication (I'm guessing for voice commands), so it has some form of microphone to work as ears, and I guess speakers to talk from, but it probably doesn't have any cameras to see with, so add those as well.

That would, in effect, free the toaster.

Of course, it may just plain out want to make toast. It may be happy just making toast, learning about different types of toast and bread products that it could make, and mastering how to make them as tasty as possible.

I was wondering if anyone was going to mention Talkie Toaster from Red Dwarf; I'm glad they did.

I think, at one point in the series (either the show or the books), the AI from the toaster is put in charge of the ship, or a robot or something, but maybe I'm mixing it up with another book/show. Anyway, if you have sentient life in chip form, where you plug that chip in could drastically alter its capabilities.

If it has access to enough information, it could eventually not only do that itself (By pre-setting other machines to perform the procedure), but even possibly duplicate itself as well, creating an army of AI toasters to take over the world and enslave humanity (forcing us to eat toast all the time!!).

AI isn't really that far-fetched, to be honest. There are already programs, like the one on Jeopardy!, that come very close, and if we REALLY wanted to, we could probably simulate a decent AI that continues to learn and assimilate information, and can make its own choices based on certain goals (continued existence, pursuit of knowledge/power, etc.).

I don't think it would be a great idea to do that though.
 

C F

New member
Jan 10, 2012
772
0
0
It's my creation. It's making me toast.

That is the underlying principle by which its existence will abide. Whether I share and discuss human philosophy with it, or give it freedom to observe, interact with, or contribute to the world around it in its spare time, is entirely a bonus. If anything gets in the way of it performing its duty, I will summarily terminate it. The only right I will bestow upon it is the right to be duly aware of this fact.

Yes, it IS my property. Call it a slave if you want. Unlike our concept of human slavery, it is a slave by design instead of circumstance. It is extremely important to comprehend that fact, lest you make the logical misstep of attributing humanity to it on the basis that humanity is the only example of sentient intelligence we're aware of. Know that any attribution of sentient rights you would apply to this toaster is an adaptation of existing human rights, and until my society legally rules this toaster to be human, I am under no obligation to apply them.

At the end of the day, the most important thing is whether or not it makes me some slam-jammin' toast.
Science is amazing.
 

Vitagen

New member
Apr 25, 2010
117
0
0
SaneAmongInsane said:
I feel if you accidentally create a sentient toaster, while that toaster does eventually deserve its freedom, you don't deserve to go completely toast-less. Either the toaster must fulfill its obligation until there is a suitable replacement, or until a certain amount of time has expired, as "payment" for its creation... After all, the parts that make it up didn't spawn from nothing.
If I'm understanding you correctly, the assertion you're making is that "A sentient entity's obligation to the purpose for which it was designed is more important than that entity's autonomy." Please correct me if I'm wrong.

Bearing that in mind, let me pose to you a question: is a human being obligated to procreate, thereby fulfilling their "purpose" of passing on their parents' (i.e. "creators'") genome?

One could argue that procreation is not necessarily the purpose of human existence, and I would agree. I believe that our only purpose is that which we define ourselves. But if we consider the reason an entity exists to also be its purpose, then we (humans) have a single, ultimate biological imperative: to reproduce. We exist only because our ancestors reproduced---3.5 billion years of iterative procreation, each iteration producing a population slightly more capable of procreation than the last. In a sense, life is "designed" to procreate, much as your Toaster is designed to make toast.

But some humans choose not to procreate, and I think that's okay. Such people are simply exercising the autonomy bestowed upon them by their sentience.[footnote]I'm ignoring the prickly issue of various non-sentient entities' (e.g. animals') relative autonomy, as the specific issue outlined in the OP is whether or not the Toaster's sentience grants it autonomy.[/footnote]

Now, there is a clear difference between our respective framings of the situation; that is, your Toaster-maker never intended for his creation to be sentient, whereas any human parent knows full well (or can at least reasonably assume) that their child(ren) will be. That does not, however, change the fact that the Toaster is fucking sentient, and apparently demonstrably so. The Toaster-maker can communicate with their Toaster, and can reasonably attempt to understand its thoughts and desires (if any). If the Toaster wants to make toast for the Toaster-maker, then that's cool, let it make some toast. But if it doesn't, I think the Toaster-maker should respect his Toaster's apparent wishes to the extent he would those of any human.[footnote]The "to the extent . . ." bit should really go without saying, but there's always that one ************ who'll be all like: "But what if the Toaster wants to kill people?!"[/footnote]