Poll: will machines ever be more intelligent than people?

Tipsy Giant

New member
May 10, 2010
1,133
0
0
Yes, and it won't be far off.
Technology is advancing so fast that I don't think we'll be waiting longer than 20 years before we have the first robots inventing things on their own.
 

Saladfork

New member
Jul 3, 2011
921
0
0
Computers are only capable of doing what you program them to do.

You can't program them to do something you can't think of yourself.

Therefore I assert that computers cannot become more intelligent than their creators. It might be theoretically possible for them to become as intelligent, though.
 

Frission

Until I get thrown out.
May 16, 2011
865
0
21
In pure "intelligence", when it comes to things such as memorizing, machines have already surpassed us. The only thing we have on them is our ability to create, and machines are being programmed to be able to do that. Somewhat. There was a pretty nifty article in The New York Times about a machine creating completely new classical music, but I can't find it anymore.

Oh well, there's still this.
http://www.liftagency.com/2008/09/metamorphosis-computer-created-art.html

So far they can't become more intelligent than the collective intelligence of all humanity, but who knows? Progress and all that.
 

Alcamonic

New member
Jan 6, 2010
747
0
0
Depends on how you define actual intelligence.
I see it as how well you can adjust to your surroundings and solve a large spectrum of different problems. Being creative and trying out new ways of solving these issues is a large part of the general cleverness that has allowed us to survive this long and adapt to most areas of the world.

But all this creativity comes from earlier data input: you can think ahead to what will happen if you throw a rock against a window because of your past experience in the same or similar situations.

When a machine (or AI rather, as the machine is only the body) can effectively set up a goal and learn from mistakes, but also improve designs of already existing tools (more effective weapons or transportation systems) without the need for human interaction, then it will most likely surpass our intelligence. The only downside would probably be the hardware, but if it has reached the point where it can make improvements of its own, that becomes a non-issue.

But this "incoming Matrix rebellion of the machines!" is something I find ridiculous. This fearmongering has been around for the past 70 years. Should we go back further, people actually thought that electricity was something the devil had bestowed upon us, and that global reckoning would surely follow.

Also, give me a talking toaster. "I toast, therefore I am!"
 

Akytalusia

New member
Nov 11, 2010
1,374
0
0
None of you will see it. Your AI research is currently heading in the wrong direction. Sentient AI will never be on the horizon until you realize this and adjust your course.
 

ImperialSunlight

New member
Nov 18, 2009
1,269
0
0
Since "intelligence" as a concept is extremely vague, no. Machines will never be objectively more intelligent than humans, since the level of an object's intelligence cannot be defined objectively.
 

Bobic

New member
Nov 10, 2009
1,532
0
0
I could be an ass and point out that we could probably design a computer now that could pass an IQ test with near-perfect accuracy. But that's clearly not what you meant, so I'll move on to a better, non-stupid answer.

If something as unfocused as evolution can create something as complex as the human brain, I'm sure that, given the ridiculous amount of time it'll take for humanity to go the way of the dodo (assuming no dipshit disasters like full-blown nuclear war or, dare I say it, some ancient Mayan prophecy deciding to make everyone look daft by actually coming true), people specifically trying to make more advanced AI will be able to match it. Do I think it will happen soon? Hell no. But I'm sure it'll happen eventually.
 

Dakkagor

New member
Sep 5, 2011
59
0
0
Saladfork said:
Computers are only capable of doing what you program them to do.

You can't program them to do something you can't think of yourself.

Therefore I assert that computers cannot become more intelligent than their creators. It might be theoretically possible for them to become as intelligent, though.
That assumes you only have baseline people using just their brain meats to program computers. How smart is a room full of programmers working on the same program, with top-of-the-line machines and a lot of good code already laid down? What happens when you have computers and people working together to make smarter computers?

I think we may have intelligent machines within our lifetimes. The problem then becomes: what do we do with them? What do we let them do? If we let one design other machines, other intelligent machines, we will have truly opened Pandora's box and unleashed the technological singularity. Trying to wrestle it back into the box may be impossible.

I don't think the decision will be ours to make. I'll be in a senile-cube painting the metal models I bought back in the early '00s, because I still haven't got around to finishing my orks. We can only pray our children make the right decision.
 

Surpheal

New member
Jan 23, 2012
237
0
0
For a machine to have even near-human intelligence, it would need to be programmed to recognize and implement common sense, and just enumerating everything common sense consists of is an immense task. Try it yourself: think of everything you know that required very little to no teaching at all...

Difficult, isn't it? Now imagine putting that, and everything you overlooked, into usable code.
 

Owyn_Merrilin

New member
May 22, 2010
7,370
0
0
Point of order: learning computers are not only possible, they already exist. Look up a little robot called "Kismet," or if you want a more hands-on (but also much more limited) demo, get a copy of the original PS2 release of Virtua Fighter 4 and take a look at the train-a-fighter mode.

As for intelligence, it depends on your definition of intelligence. For deterministic things, they're arguably already smarter. Think raw number crunching, chess, and Jeopardy -- in all of which computers have beaten the best humans. If by intelligence you mean the ability to adapt to new skills and situations, and the ability to work on things that can't be broken down into algorithms, well, we're a long way off from that, but I wouldn't write it off as completely impossible; it's just a matter of time.
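For what it's worth, chess really is the kind of deterministic problem that reduces to an algorithm. Here's a toy sketch of the minimax idea behind chess engines; the game-state functions are placeholders you'd supply yourself, not any real engine's API:

```python
def minimax(state, depth, maximizing, evaluate, moves, apply_move):
    """Score a game state by exhaustively searching `depth` plies ahead.

    `evaluate` scores a state, `moves` lists legal moves, and
    `apply_move` returns the state after a move -- all supplied by the caller.
    """
    if depth == 0 or not moves(state):
        return evaluate(state)
    scores = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      evaluate, moves, apply_move)
              for m in moves(state))
    # The maximizing player picks the best outcome, the opponent the worst.
    return max(scores) if maximizing else min(scores)
```

Real engines add pruning and clever evaluation on top, but the core is just this brute-force lookahead, which is why "deterministic" games fell to computers first.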
 

miketehmage

New member
Jul 22, 2009
396
0
0
In my experience, and I'm in my first year of a Software Engineering degree, computers are pretty fucking stupid. Don't get me wrong, they're great at number crunching and such. However, they aren't too good at learning. Yes, it's possible, and yes, they are learning, but they aren't even close to being as good as us at adapting to new situations and learning from them. They are an extremely useful tool, though.

Do I think they'll ever be more intelligent than us? Maybe. But it's gonna take one hell of a smart guy to enable them to be.

Also here's a mindfuck: If we create intelligence greater than or equal to our own, can it be considered the creation of life? Granted it isn't organic, but does it have to be?
 

Zen Toombs

New member
Nov 7, 2011
2,105
0
0
the random said:
i dont think so because to become more intelligent than people machines have to be able to learn and i dont think machines will ever be able to learn. Because to make something learn you have to reward good behavior and punish bad behavior (like training a dog) and you cant reward or punish an emotionless machine (and i dont think machines will never be able to have emotions)
Only Skinner thinks that's exactly how learning works, just so you know. 'Sides, emotion is just electrical impulses, which could be easily simulated in a machine.

How one would make a machine more intelligent than people would work like this:

Fact 1: we create machines to solve problems, and they get better at it every day.
Fact 2: a problem we can try to solve is to create a better machine.[footnote]Corollary: it is not yet possible to have a machine carry out this task, but it will be in the future.[/footnote]

Once we have a machine that solves that problem, we use the newly created machine 2.0 to create a better machine; repeat ad nauseam.
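The two facts above can even be sketched as a toy, if you squint: treat a "machine" as a single design parameter and a made-up score function as the problem it solves. Purely illustrative, obviously:

```python
import random

def build_better_machine(machine, score, tries=50):
    """Machine 2.0: propose small design tweaks, keep only improvements."""
    best = machine
    for _ in range(tries):
        candidate = best + random.gauss(0, 0.1)
        if score(candidate) > score(best):
            best = candidate
    return best

# Each generation designs the next one; repeat ad nauseam.
design = 0.0
for _ in range(10):
    design = build_better_machine(design, score=lambda m: -(m - 3.0) ** 2)
```

Each pass hands the improved design to the next pass, and the loop converges on the optimum without anyone hand-designing it; the open question in the thread is whether this scales from one number to general intelligence.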

white ninja text

The real question is not if machines will be more intelligent than people, but when will machines be more intelligent than people.
 

freaper

snuggere mongool
Apr 3, 2010
1,198
0
0
Yeah, but not anywhere soon. It'll probably be by accident too: an algorithm of sorts that happens to add "more intelligence than expected".
 

FarleShadow

New member
Oct 31, 2008
432
0
0
Of course they are going to get progressively smarter until they are smarter than we are; we're bound by the limits of biology and neurotransmitters, while computer minds would be bound only by the limits of whatever medium they run on (so they'll be crazy-fast).

That said, a lot of people have seen too many 'KILL ALL HUMANS' movies. I suspect the first thing an AI will do is analyse us for a bit, then make plans for a benign entry into society.

A good, easy example: if an AI mothership arrived in orbit, instead of getting involved in a costly war, they might decide to trade bits and pieces of their lowest technology (like fusion reactors) to us in return for... I dunno, water and metal ores? That way, humanity feels like it's benefiting and the AI gets whatever it wants.

And, at the end of the day, if you're trading the odd bit of garbage tech for kilotons of processed metal (instead of doing all the work yourself), it's a win-win!
 

Kriptonite

New member
Jul 3, 2009
1,049
0
0
I think, completely, that machines can be smarter than humans. Easy.
I just don't think humans can make that (those) machine(s).
 

ecoho

New member
Jun 16, 2010
2,093
0
0
Yes, they will end up being smarter than us, but I think at that point they will be part of us as well.
 

NoeL

New member
May 14, 2011
841
0
0
the random said:
i dont think so because to become more intelligent than people machines have to be able to learn and i dont think machines will ever be able to learn.
Machines are already capable of learning. Last semester I built a neural network that learns a bunch of patterns and is then able to recognise them.
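For readers wondering what "learns a bunch of patterns and then recognises them" looks like, the classic classroom example is a Hopfield network: Hebbian learning stores ±1 patterns in a weight matrix, and a noisy probe settles back onto the nearest stored pattern. This is a generic sketch, not NoeL's actual assignment:

```python
import numpy as np

def train(patterns):
    """Hebbian rule: strengthen weights between units that fire together."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, probe, steps=10):
    """Update units until a noisy probe settles on a stored pattern."""
    s = np.array(probe)
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s
```

Feed `recall` a stored pattern with a few bits flipped and it snaps back to the original, which is exactly the "recognition" behaviour described above.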
 

manaman

New member
Sep 2, 2007
3,218
0
0
the random said:
i dont think so because to become more intelligent than people machines have to be able to learn and i dont think machines will ever be able to learn. Because to make something learn you have to reward good behavior and punish bad behavior (like training a dog) and you cant reward or punish an emotionless machine (and i dont think machines will never be able to have emotions)
Machines can learn. [http://en.wikipedia.org/wiki/Machine_learning]

What you are basically saying up there, at least what I get out of it, is that you believe there is some spark of life that we cannot grasp and that cannot be explained or duplicated.

I don't believe that's true at all. At some point we will know exactly how the brain functions, and when we know that, we can duplicate it. Once we can duplicate it, we can improve upon it.
 

NoeL

New member
May 14, 2011
841
0
0
Saladfork said:
Computers are only capable of doing what you program them to do.

You can't program them to do something you can't think of yourself.
Incorrect. Computers are ALREADY being used to formulate solutions nobody has thought of. Since a computer "thinks" in a different way to humans, the solutions they churn out can be completely counterintuitive, yet still work. In some cases we don't even understand HOW the computer's solution works - it just does.
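The best-known real case is evolutionary design (NASA's evolved spacecraft antenna being the poster child): the computer searches a design space and lands on shapes no engineer would have drawn. The underlying search loop is surprisingly small; here's a toy version over bit-strings with a stand-in fitness function:

```python
import random

def evolve(fitness, pop_size=30, generations=40, length=10):
    """Keep the fittest half each generation, plus one mutated child each."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(length)] ^= 1  # flip one random bit
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

Nothing in the loop knows what a good solution looks like; selection pressure alone finds one, which is exactly why the results can be counterintuitive to the humans reading them afterwards.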

All we need to do to create a machine more intelligent than humans is to create an AI that's capable of writing code for itself. It will start off dumber than a bacterium, but it will constantly learn and update itself over time, rewrite inefficient functions, write functions that allow for more functionality, and somewhere down the track it will surpass humans in terms of intellect - and only continue to learn and operate faster than ever before. If you give the damn thing a pair of hands, there's no telling what would come of it. Most likely Skynet.

I should add that the machine will probably never resemble a human (unless it was initially coded to mimic humans). It will almost certainly be some weird alien intelligence that we don't quite understand, since the machine would experience the world in a completely different way.
 

SnakeoilSage

New member
Sep 20, 2011
1,211
0
0
FarleShadow said:
Of course they are going to get progressively smarter until they are smarter than we are, we're bound by the limits of biology and neurotransmitters. Computer minds would be bound only by the limits of whatever medium they run on (So they'll be crazy-fast).
The most powerful computer right now has less than 1/6th of our brain's memory capacity and performs less than 1/10th of the calculations our brains process every second. I don't have the exact data on me but the math was done recently.

Don't assume that because lifting your arm seems effortless to your consciousness, your brain isn't working through countless calculations to perform the task. Not to mention the programming capacity necessary for cognitive self-awareness: the most advanced artificial intellects and robots have nothing, absolutely nothing, like it. The closest they get is the most basic instincts of insects. The idea that packing a computer full of knowledge will somehow allow it to explode into consciousness is laughable, because the hardware and software do not exist to regulate that kind of sentience. Every computer in existence has the capacity to do just one task, and one task only:

To do what we tell it to do. We program it to respond to our commands: if x then y; if the human using my keyboard tells me to do a task, I shall attempt to do it according to my human-created programming, and if I do not have that programming, I cannot perform the task. No compromise, no problem-solving, no awareness that a problem exists beyond the responses it was programmed to provide. By humans.

People keep expecting some kind of singularity: that just because a chess program has been programmed with thousands of potential moves to react to human actions, AI will somehow explode out of the Internet. It won't. No amount of information uploaded into a computer or series of computers will give it the capacity to perceive that information as anything but data to be utilized by the humans accessing it. There is no "awareness" examining the information, no computer attempting to process the info for itself.

The fact that we can't even define sentience ourselves proves that it is a state of being that we cannot emulate no matter how advanced our technology is, because our biological brains with millions of years of programming cannot comprehend its meaning yet.