There is a debate as to when computers will match our intelligence, or whether intelligence will always be exclusive to humans.
Here are some quotes from the language discussion:
ruprecht wrote:But natural intelligence is far advanced over artificial intelligence. If they can make a computer that breaks its commands, and can make mistakes and learn from them, then it's a whole new ball game.
The computer that won at chess did not win by artificial intelligence. It needed its programmers to reprogram it in order to win, based on the opponent's previous winning methods.
snooziums wrote:However, there are "neural network" computers, which function by multi-point networks that have to learn everything they know. For instance, I saw a demonstration of one that was learning to say a sentence. Someone would say it, and it would try to repeat it. Every time, it got a bit closer to the original human speaker.
Computers are becoming more complex every year. Human evolution is at an end; our brain size has not increased in over 100,000 years. Within a couple of decades, computers will surpass the human brain in complexity; that is inevitable.
Already, they have in memory. The human brain cannot store more than about 50 DVDs' worth of information. Supercomputers already have that capacity, and it will be commonplace before long. Fifteen years ago, 40 MB hard drives were the limit. Now we have 1 TB hard drives, and there seems to be no current limit to the rate of increase.
"Deep Blue," the computer that defeated the chess champion, had to be reprogrammed; however, there are current programs that learn and adapt. Many programs in the works do this. It is not hard to do, and it is even in some games.
Within 30 years there will be computers that can learn as well as humans can, and reason at the same rate or better.
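The "repeat and get closer each time" loop described in the post above is easy to illustrate. Here is a minimal sketch (not the actual demo system, just a toy analogy): the learner's output starts random and is nudged a fraction of the way toward a fixed target on every attempt, so its error shrinks each iteration. The target values and learning rate are arbitrary choices for the demo.

```python
import random

# A toy "learn by imitation" loop: the guess is corrected toward the
# target on every attempt, mimicking the speech demo where each
# repetition got a bit closer to the human speaker.

target = [0.2, 0.8, 0.5, 0.9]              # stands in for the human's sentence
guess = [random.random() for _ in target]  # learner's first, random attempt
rate = 0.3                                 # how strongly each attempt corrects

def error(a, b):
    """Sum of squared differences between two attempts."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

for attempt in range(20):
    # move each component a fraction of the way toward the target
    guess = [g + rate * (t - g) for g, t in zip(guess, target)]

print(error(guess, target))  # tiny after 20 attempts: the imitation converged
```

Real neural-network learners adjust internal weights from an error signal rather than copying the target directly, but the shape of the process (attempt, measure error, correct, repeat) is the same.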
ruprecht wrote:I'm not sure that learning is the same as natural intelligence, but certainly in "quantity" AI can be "bigger". From what I understand, it is the ability to break rules and learn from mistakes that is of interest in this area.
ruprecht wrote:Here is the article... I think... I didn't find it in this PC's bookmarks, so hopefully (pretty sure) it is this one. It is really interesting for its wide scope.
http://www.google.com/search?q=meaning+ ... =utf-8&oe= utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a
It's number one... meaning-based intelligence vs. info-based.
snooziums wrote:"Meaning-Based Intelligence"? Now we have to define meaning, and whether it exists, though some here will argue that it does not.
As for the "specialness" of bacteria and rat neurons, that is just adaptability to an environment. However, software can be written the same way, and some neuro-"hardware" is also designed to do this.
I am not saying that the living organism is not impressive; however, machines, both hardware and software, are being designed and built that work under the same principles, and will someday be able to replicate everything that humans do.
After all, if we are really so "special," then creating something that is superior to us should not be that hard. Otherwise, we are not that far "above" the rest of the animals.
snooziums wrote:ruprecht wrote:
On the issue of creating an intelligence superior to ours, that might be something to be applied to god bleevers' arguments, eh?
No, not at all. It is the reverse.
If we create a superior intelligence to ourselves, then it works to disprove any "God" theory, since it will prove that we can create something as complex as we are. But if we cannot create something as complex as we are, and since it is argued that we are so far advanced beyond anything else in the animal kingdom, then one must question why we are so unique.
When we create intelligence, we will realize that it does not take some "divine" force to do so.
How could it be possible that machines could match or outstrip our intelligence, when we don't even understand some finer points of our own functioning, and therefore can't program a machine with such capacity, even if it can learn?
We do not know how cats purr either, but we can replicate the sound. And some here would argue that we do know most of how the brain functions; look at the discussions about whether the "soul" exists or not. The argument is that since we know how the brain functions, we must rule out the "soul" idea.
And while we do not know all of the little details of how our brain functions, unless we are designing something that functions the exact same way, that is not relevant. As long as it can learn and adapt in thought as we can, at our rate, how it does so need not mirror our own functioning.