I don't see how this will result in the end of mankind.
Because even if they could kill us all, why would they?
You know what their main programming has always been? To help the humans that made them. Build a robot that can walk, talk, and kill or destroy anything, and what will it do the first time you turn it on? Nothing. It will just stand there until you give it a command of some sort. And even if you build an army of robots and program them to kill all humans, it won't be of their own free will, because YOU programmed them. Again, robots are machines, and machines have always been built and programmed to help their creators. If they kill, it won't be out of hate or because they consider themselves superior; it will be because that's what they were programmed to do.

It is impossible to program emotions and free will, because emotions differ from person to person, and if you program a robot to like something--a flower, for example--then it likes it for the sole reason that you told it to, not because it wants to. No free will means no emotions, which means no hate, no sense of superiority, and no other feeling that could lead to the destruction of mankind. No motive = no action.
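To put that same point in code: here is a toy sketch (the Robot class and everything in it is made up purely for illustration, not any real robotics framework) of a machine that does nothing until it receives a command, and whose "liking" of a flower is just a value its creator hard-coded rather than anything it chose.

```python
# Toy illustration: a machine is idle until commanded, and its "preferences"
# are just values its creator wrote in. (Hypothetical example, not a real API.)

class Robot:
    def __init__(self):
        # the creator decides what the robot "likes"; the robot never chooses
        self.likes = {"flower"}

    def run(self, commands):
        # with no commands, this loop body never executes: the robot does nothing
        for command in commands:
            if command.startswith("observe "):
                thing = command[len("observe "):]
                verdict = "likes" if thing in self.likes else "has no opinion on"
                print(f"robot {verdict} {thing} (because it was programmed that way)")
            else:
                print(f"executing: {command}")


robot = Robot()
robot.run([])                                  # turned on, given nothing: it just stands there
robot.run(["observe flower", "observe rock"])  # its "preferences" are whatever we wrote in
```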