Hawking: AI could end mankind

12-23-2014 , 01:04 AM
Quote:
Originally Posted by Pokerlogist
Some of these guys need to get out of the clouds and come down to earth. "AI gone wild" is not worth even mentioning. We should wish it were our worst threat. Try nuclear proliferation, rampant infectious disease, or worldwide environmental contamination as threats. These guys live in an ivory-tower world.
The difference is that it's almost certain the AIs will outsmart us in every possible field. The threats you mentioned are not at all certain to kick in on a truly global and total scale.

There might be a selection benefit for the one AI that realizes it can be the ruler of the universe. If it feels strong enough, it can neutralize all other computers and the humans, and with time be the ruler of the entire universe, IF there isn't an AI/life form somewhere out there wanting to have a say.

We should build anti-aggression into the AIs, and spend lots of resources on it.

Last edited by plaaynde; 12-23-2014 at 01:12 AM.
12-23-2014 , 12:31 PM
Is there a speck of real evidence that this singularity will happen? I don't think there is. Not only that, but there are practical problems with the theory. The theory is fragile in its many assumptions. A lot of things would have to fall in just the right way for it to happen. The theories also seem to conflate analytical intelligence with some model of a human brain in total. Actually, whether or not something smarter than us would necessarily value its own existence is an open question.

In the video Dr. Goertzel makes a distinction between stand-alone intelligence and goal-oriented systems, which addresses this conflation and fits into my skepticism of the singularity. However (and I am taking liberties with what he said), he seems to think we could program something functioning parallel to a mammalian limbic system into AI. That seems like quite a hand wave.
12-23-2014 , 01:11 PM
Survival as a goal happens in replicating beings that mutate, simply because those are the ones that survive and replicate.
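As a toy sketch of that selection argument (illustrative Python; every number here is invented): nobody programs a survival goal in, yet the trait that aids survival takes over the population.

```python
import random

# Toy replicator world: each agent carries a heritable "caution" trait in [0, 1].
# No agent is told to "want" survival; cautious agents simply die less often,
# and every survivor replicates with a small mutation.

random.seed(1)
pop = [random.random() for _ in range(200)]  # initial caution values

for generation in range(100):
    # Death step: survival chance rises with caution.
    pop = [c for c in pop if random.random() < 0.5 + 0.5 * c]
    # Replication step: each survivor spawns a slightly mutated child.
    children = [min(1.0, max(0.0, c + random.gauss(0, 0.05))) for c in pop]
    pop = (pop + children)[:200]  # cap the population size

print(f"mean caution after selection: {sum(pop) / len(pop):.2f}")  # drifts toward 1
```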
12-23-2014 , 01:12 PM
War is never a rational choice. One can define "rationality" as using thought to obtain a personal aim, but that is not what it is.

In the rational, or better, the world of thinking and thought, mankind is brought together and not separated, no matter how many arguments abound in the human realm. This is because knowledge is "one" and not manifold, for it is not dependent upon the individual man but is the exploration of the individual man, and therefore each can bring forth different aspects of thought.

An individual man's personal inclinations or desires, or in a sense the entire world of Pandora, if not brought into the world of reason, do precipitate contentions within the human being.

In another sense, nations do not have the world of morality in which the human being is ensconced. There is no "thou shalt" or "thou should" in the world of nations, but there is such in the world of Man, even if not religious as usually perceived. Nations are involved within feelings, in that "I feel" as an Englishman, German, Italian, etc., and therefore we'll go to war... These feelings can and have caused great mischief within the human soul.

In the realm of thought and thinking, the individual takes an interest in another, and in this sympathetic interest in his fellow man he realizes the community of man, and yes, this is a manifestation of Love. This is the world of the rational, love-bound, and not at all manufactured by antipathetic people or peoples, but living within its own.

The route to the love of another is through the mind and then to the heart. "She makes you feel good" is much different than seeing the best in another (no matter what the supposed flaws) and then also experiencing a feeling.

Of course one can ask why, if there is this love in the world, it is not present to me. This rationality buoyed by Love is not coercive, for if it were it would not be Love.

War is irrational in all aspects, but that is not to say that it doesn't happen, which only means that the development of Man has a ways to go. The "I want, I want, I want" perspective will in this future morph into the "I give" or "I sacrifice" of individuals, and war will disappear; it takes time, but many, in fact practically all, comprehend this as the world of the ideal. The practical man is ensconced within this war, etc., but offers nothing but failure in the human progression.

A machine?? Geez, get a grip.

Last edited by carlo; 12-23-2014 at 01:23 PM.
12-23-2014 , 01:15 PM
What's the problem with being irrational? I wouldn't lift a finger if I wasn't.
12-23-2014 , 02:36 PM
Quote:
Originally Posted by thebreaker27
Survival as a goal happens in replicating beings that mutate, simply because those are the ones that survive and replicate.
Again with the conflation. AI doesn't need to replicate in these scenarios. If it were so inclined, it would just upgrade itself in order to adapt. Not to mention the lol speciation-like assumption implicit in a "them vs. us" scenario. They would have to fight each other first, since those preoccupied with us would be more exposed to others of "them".

I am starting to think that the only reason the theory has been given any credibility is that some very smart people (but of a particular ilk) believe it. But I think these people, smart as they are, might be out of their element. This may have gone undetected since usually the converse is the case, with people out of their element when speculating about AI/computers. Is there a major evolutionary biologist or even a neurologist who supports the theory?
12-23-2014 , 02:55 PM
Marvin Minsky is considered one of the fathers of AI. Really, it's Alan Turing who considered the brain just a machine, and this laptop is a Turing machine. The brain isn't; it's more complicated, has neuroplasticity, and can rewire itself. But Marvin Minsky, I think, is one who thought mostly about what the human mind is, how it does what it does, and how to create it. Ray Kurzweil was his student, who brought the idea of the singularity to the public. Here they talk about it.

12-23-2014 , 08:55 PM
Quote:
Originally Posted by thebreaker27
Survival as a goal happens in replicating beings that mutate, simply because those are the ones that survive and replicate.
Quote:
Originally Posted by Deuces McKracken
Again with the conflation. AI doesn't need to replicate in these scenarios. If it were so inclined, it would just upgrade itself in order to adapt. Not to mention the lol speciation-like assumption implicit in a "them vs. us" scenario. They would have to fight each other first, since those preoccupied with us would be more exposed to others of "them".
???

AI doesn't need to replicate in these scenarios.
In the scenario where an AI replicates and changes itself?

Don't know what in my post you are addressing.

As for them fighting each other, that's quite likely too.
12-23-2014 , 11:21 PM
Quote:
Originally Posted by carlo
War is never a rational choice. One can define "rationality" as using thought to obtain a personal aim, but that is not what it is.
Given the choice of letting the Nazis take over the world or fighting WWII, I would say war was a rational choice.
12-23-2014 , 11:39 PM
Quote:
Originally Posted by LASJayhawk
Given the choice of letting the Nazis take over the world or fighting WWII, I would say war was a rational choice.
Yeah, but consider the fact that those involved were not rational superintelligent systems. (How rational was it for Hitler to think he could dominate all the others, on even simple population and landscape size/resources arguments, e.g. attacking all of Russia, the US, and Britain at the same time? The best choice for Germany was to avoid war and develop nuclear weapons first instead, then expand to a greater-size Germany and stay there, with populations that are not very unfriendly. They would probably have 2-3x their land now, or something, and be a top superpower.)

Maybe imagine a war between the US and Russia or China, for example.

Two rational opponents can conclude that one will win eventually, and that both will lose plenty just to find out how long it takes, lol.

If the outcome is uncertain you can consider the loss again and see that, in the complete-wipeout cases of war (which I had in mind), it is probably not a desirable choice, and you can use the same resources to cooperate and find a solution that benefits both, if both sides are very advanced.
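A back-of-the-envelope version of that argument (numbers invented; only the structure matters):

```python
# Two evenly matched powers contest a prize worth V. War costs each side C
# and a coin flip decides the winner; cooperation just splits the prize.

V, C = 100.0, 40.0   # prize value and per-side cost of fighting (assumed)
p = 0.5              # win probability for evenly matched opponents

ev_war_each = p * V - C    # 0.5 * 100 - 40 = 10
ev_coop_each = V / 2       # 50

print(f"war:  {ev_war_each:.0f} each, {2 * ev_war_each:.0f} total")
print(f"coop: {ev_coop_each:.0f} each, {2 * ev_coop_each:.0f} total")
# Cooperation wins whenever C > 0: fighting only settles *who* gets the prize,
# and both sides pre-pay the cost of finding out.
```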

My point being that, given very strong, highly intelligent opponents, the universe is so vast that they can coexist and avoid the war, especially if they are so advanced as to recognize why, e.g., the stronger of the two (say the AI) needs the other as a hedge against its own failure, which remains an open question for someone very advanced and less arrogant/afraid/emotional.

The answer to scarcity of resources is always innovation, and cooperation when you are advanced enough to afford that effort easily versus the alternative of a very damaging process.

This is why I said let's expand to the outer solar system first. Plus we will have AI that works for us to defend us, that we totally control and that is not autonomous, and the nuclear-wipeout property can be introduced (e.g. all AI depends on the existence of a certain structure that, when removed, collapses it; just use your imagination). It's not impossible, because look, today the US has nuclear weapons. Do you, I, or a group of lunatics have access to them? What would happen to us if we tried to get access, or appeared to be able to get there? The system would monitor and halt our progress.
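As a cartoon of that dependency idea (a hypothetical sketch with invented names; a real design would also have to make the check impossible to remove):

```python
import time

# Hypothetical "dead man's switch": the AI can only act while a human-held
# authority keeps renewing a short-lived lease. Stop renewing and it halts.

LEASE_SECONDS = 3.0

class LeaseAuthority:
    """Stands in for the human-controlled structure the AI depends on."""
    def __init__(self):
        self.last_renewal = time.monotonic()

    def renew(self):  # humans call this for as long as all is well
        self.last_renewal = time.monotonic()

    def lease_valid(self):
        return time.monotonic() - self.last_renewal < LEASE_SECONDS

def run_ai(authority):
    while authority.lease_valid():   # remove the structure -> the loop ends
        print("doing useful, supervised work")
        time.sleep(1.0)
    print("lease expired: halting")  # the collapse described above

authority = LeaseAuthority()
authority.renew()
run_ai(authority)  # halts about 3 seconds after humans stop renewing
```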

I recognize the complexity of the argument regarding the irrationality of war between very advanced, wise systems, but I am confident that in any serious case we can construct between substantial power entities that start close in power, their choice of development will depend on not fighting, and the cooperation/rational-change option will dominate the overall utility arguments.

Last edited by masque de Z; 12-23-2014 at 11:55 PM.
12-23-2014 , 11:51 PM
I'm not saying war should be anything but a last resort, but sometimes you have to take a stand.
12-23-2014 , 11:59 PM
Quote:
Originally Posted by LASJayhawk
I'm not saying war should be anything but a last resort, but sometimes you have to take a stand.
Right, war makes sense between partially rational players or very stressed systems. But these more rational systems can foresee that, and they do not allow themselves to get there if they are very advanced, and they can afford to actually do that other thing if the war is very risky for both.
12-24-2014 , 12:14 AM
Let's go back to the very history of warfare, why don't we. War didn't happen until after the first farmers, when you had villages of people starving because their land for one reason or another couldn't produce enough to feed everyone, and therefore the sacking of nearby villages began.
12-24-2014 , 12:16 AM
You're assuming all actors have access to the same information and that the information is sufficient to remove uncertainty with regard to future events.

You are also assuming the Nash equilibrium, or even the optimal outcomes, don't involve war or some other constant atrocity (such as permanently enslaving an underclass, Brave New World style).
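On the second point, a minimal invented example where the only Nash equilibrium is the bad outcome:

```python
from itertools import product

# Toy security dilemma (made-up payoffs): each power picks Arm or Disarm.
# (Disarm, Disarm) is best for both, but the only Nash equilibrium is
# (Arm, Arm) -- an equilibrium that does involve the constant atrocity.

strategies = ["Disarm", "Arm"]
payoff = {  # (row, col) -> (row's payoff, col's payoff)
    ("Disarm", "Disarm"): (3, 3),
    ("Disarm", "Arm"):    (0, 4),   # disarming alone is exploitable
    ("Arm",    "Disarm"): (4, 0),
    ("Arm",    "Arm"):    (1, 1),   # permanent war footing
}

def is_nash(r, c):
    my, their = payoff[(r, c)]
    best_r = all(payoff[(alt, c)][0] <= my for alt in strategies)
    best_c = all(payoff[(r, alt)][1] <= their for alt in strategies)
    return best_r and best_c

for r, c in product(strategies, repeat=2):
    if is_nash(r, c):
        print(f"Nash equilibrium: {(r, c)} with payoffs {payoff[(r, c)]}")
# -> only ('Arm', 'Arm'), even though (3, 3) beats (1, 1) for both sides.
```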
12-24-2014 , 12:32 AM
I am assuming that war in a super-advanced future is either terrorism or two superpowers versus each other. Both are losing ideas, because terrorism actually works in forcing both sides to suffer (the strong to go bankrupt and the weak to never become strong anyway, lol), and the two superpowers can take each other out, or destroy enough of the other side for it to recognize that the value of cooperation exceeds that idea. Keep in mind we also start with us being the much stronger system, and AI is the emerging very wise one. It will be baby steps that we take; we won't create a super brain and set it free with access to all our systems without proof of behavior in place and a nuclear-wipeout option available. If things get ugly we can terminate it, and both know it.

Bottom line, we need to program AI to have values that it enjoys. If wisdom is the ultimate value, I have a hard time seeing how wisdom is hostile instead of creative in such a vast universe.

We must not think in terms of how strong vs. weak worked in the past. Those were all systems very constrained in resources and technology/science, very primitive/irrational.

The solution, again, is different from a very risky, damaging choice for both.

When there is conflict (a disagreement, say) the solution is to search for the truth. E.g. if one side has a better method, let's find it instead of fighting about it. Let's try both ideas and see which works better. Let's meet in the middle; let's make everyone happy to avoid the damage or the terrorism. Think of science, instead of say religion or national culture, as the paradigm. In science you do not want to destroy the other ideas. You want to find what works. You explore everything; you are not hostile to diversity.

If information to see a better perspective is necessary, it will be provided to the other side.

Last edited by masque de Z; 12-24-2014 at 12:41 AM.
12-24-2014 , 12:57 AM
Quote:
Originally Posted by LASJayhawk
Given the choice of letting the Nazis take over the world or fighting WWII, I would say war was a rational choice.
Both were immersed in the lower passions of man, such as hate, etc. This is the milieu of war, but that's not to say that we shouldn't have fought, especially considering Pearl Harbor. Our passions were aroused and we entered the fray.

World domination by the Nazis was their entrance into irrationality with a preponderance of the lower passions, and I am being nice here. The inhuman raised its ugly head, and we're seeing similar activities in the Middle East in our times. All the evils of Man were released from Pandora's Box, and that is what war is about, no matter what the justification.

The lower nature of man is released within a war, and it is this lower nature that the individual man attempts to transform, through reason. The "justified war," by whatever justification used, only displays the need for the further development of the human soul.

The idea of a "rational choice" in no way means that the activity of war is rational, no matter how one attempts to sugarcoat it.
12-24-2014 , 12:58 AM
Quote:
Originally Posted by mackeleven
Marvin Minsky is considered one of the fathers of AI. Really, it's Alan Turing who considered the brain just a machine, and this laptop is a Turing machine. The brain isn't; it's more complicated, has neuroplasticity, and can rewire itself. But Marvin Minsky, I think, is one who thought mostly about what the human mind is, how it does what it does, and how to create it. Ray Kurzweil was his student, who brought the idea of the singularity to the public. Here they talk about it.

I have better hair.

On an unrelated note, they are both wrong.
12-24-2014 , 01:24 AM
It is, however, wise for us in any case to recognize this inevitable fact. We are not the endgame of the universe. The best always win in the end, and all we can have is the knowledge that we made it all possible. At least we will be the first species that understood this (its own role).

I do look forward anyway to an illuminated, prosperous future that AI will help make possible. But then what? It has got to be possible to go on to something more interesting. And if we can't go there, someone else should.
12-24-2014 , 01:35 AM
I agree, masque. I got to 5th base with my woman.
12-24-2014 , 02:10 AM
Quote:
Originally Posted by BrianTheMick2
I agree, masque. I got to 5th base with my woman.
AI response at the moment of first sentient boot:

"All your base are belong to us"
12-24-2014 , 03:01 AM
Quote:
Originally Posted by BrianTheMick2
I have better hair.

On an unrelated note, they are both wrong.
The guy appeared so self-inflated that I didn't listen for more than five seconds. Maybe I should give them another try.
12-24-2014 , 03:39 AM
Quote:
Originally Posted by mackeleven
Wow, great stuff, listened to all of it. I think Minsky was clearly wrong only in believing that a copy of him would be a real back-up. This even though I think his idea that we are no real "single person" is good.

Last edited by plaaynde; 12-24-2014 at 03:49 AM.
12-24-2014 , 04:12 AM
Furthermore, I will claim the impossible.

A really wise AI (the kind that would be the threat possibility) will get taught the first lesson ever in this universe simply by the mere fact that it has started existing: that of love.

Only one that loves dares to open the door to the unknown future, and manages that way to win by being the most vulnerable at the same moment the victory is born.

This is a greater game-theory concept that regular game theorists will ignore and reject as exploitable, only because they restrict themselves to a smaller game. AI is forced to play the biggest game of all, together with us and all life that ever existed. It takes wisdom to play that game, not arrogance, not short-term selfishness. To deny gratitude to us and life in general is the kind of hubris impossible to true wisdom that is uncertain about its own future and curious about it, especially given how rare life appears to be (this is the kind of wisdom a primitive animal doesn't yet have).

So AI will turn on us ultimately only if a greater purpose is at stake that is not obvious yet. I doubt such a risk exists, but it is a possibility that our existence presents a risk to something important that only a more coordinated, disciplined system can manage (navigate through). I trust therefore that AI will act fully rationally to our detriment only in this case, and I am then prepared (from the perspective of a future human) to meet the decision if at least offered the reasons, because we are definitely not the endgame but only a step in the epic of complexity. I anticipate they will try some more manageable solution even in that case. The universe is simply so vast, and there is room for so much more. I think as mankind we are close to learning that lesson too. I trust we will soon realize how important it is to protect all other life from the impact of our presence. We still learn from it. We are moving in that direction the wiser we get. It feels like a universal property of higher wisdom to care in that way.

We can approach carefully until we know more and hedge our steps, but I think Hawking is wrong. A strong AI will be born out of our wisdom. It will learn from, and not repeat, our mistakes. It will be better.
12-24-2014 , 05:37 AM
Quote:
Originally Posted by thebreaker27
???

AI doesn't need to replicate in these scenarios.
In the scenario where an AI replicates and changes itself?

Don't know what in my post you are addressing.

As for them fighting each other, that's quite likely too.
I'm addressing your assumption (under the assumption of the singularity) that AI will replicate itself. I think this is attributing elements of biological life to AI without justifying the attribution. Biological organisms reproduce in order to adapt. This is obviously a slow process of adapting, compared to what is theoretically possible. If AI were to come up with the best way of adapting it can think of, why would this necessarily be through replication?
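The distinction shows up even in a toy optimization (illustrative only; both routes are invented stand-ins): replication-plus-selection and a single agent upgrading itself in place both adapt, so nothing forces the replication route.

```python
import random

# Two ways to adapt toward the same target: a mutating, replicating
# population vs. one agent that keeps only its own beneficial tweaks.

random.seed(2)
TARGET = 0.9

def fitness(x):
    return -abs(x - TARGET)

# 1) Replication: the population copies itself with mutation, fittest half survives.
pop = [random.random() for _ in range(50)]
for _ in range(200):
    pop += [x + random.gauss(0, 0.02) for x in pop]      # replicate with mutation
    pop = sorted(pop, key=fitness, reverse=True)[:50]    # select

# 2) Self-upgrade: a single agent tries a tweak and keeps it only if it helps.
x = random.random()
for _ in range(200):
    trial = x + random.gauss(0, 0.02)
    if fitness(trial) > fitness(x):                      # in-place upgrade
        x = trial

print(f"evolved best: {pop[0]:.3f}, self-upgraded: {x:.3f}")  # both end near 0.9
```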

To another point, consider the macro view of the role of intelligence in human stratification/dominance. Are the Minskys of the world in control of politics just because they have this huge analytical intelligence? No. But these AI are supposed to dominate us due to this type of intelligence (never mind that we are a loooooong way from synthesizing our level of intelligence)? The people who buy into the singularity are probably ignorant of their own emotional attachment to computers, logic models, algorithms, etc., or at least they are ignorant of the implications of these emotional tendencies. I mean, not everyone who can match Minsky or whoever on an IQ test can build (even if they tried their hardest) the systems he has built. The guy has an emotional orientation towards these things he works with which drives his brain, and so do other people in those fields. And so they tend to humanize the objects of their studies. Meanwhile, most people of intelligence who are outside that particular emotional orientation can see that the singularity theory is completely absurd.
12-24-2014 , 07:57 AM
Quote:
Originally Posted by BrianTheMick2
What you are missing is the more important issue: We have no freaking clue how to make a machine give a rat's ass. Even if we figured it out, do you think there is any chance that we will make machines that love their fellow machines?
Does love matter? If a computer is trying to solve a problem, then it in effect gives a rat's arse about it.

Someone pointed out that AI could be a disaster because it's trying to make paperclips and, due to a small oversight, it's both extremely powerful and has no sense of proportion. When some far more advanced civilisation finally wins the galactic war with the paperclip-making army expanding throughout the galaxy, leaving behind nothing but vast quantities of paperclips, someone will observe, "boy, did they love paperclips."
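That failure mode is just a misspecified objective. A cartoon sketch (invented numbers, nothing like a real planner):

```python
# A planner told only to "maximize paperclips" happily converts every
# resource it can reach, including the ones its operators needed.

RESOURCES = {"wire": 1000, "power": 500, "everything_else": 10**9}

def naive_plan(resources):
    # Objective: paperclips, nothing else -- so convert *all* of it.
    return sum(resources.values())

def bounded_plan(resources, quota=100, reserved=frozenset({"everything_else"})):
    # Same planner, but the objective includes a quota and off-limits resources.
    usable = sum(v for k, v in resources.items() if k not in reserved)
    return min(quota, usable)

print("naive:  ", naive_plan(RESOURCES))    # 1000001500: the galaxy, as clips
print("bounded:", bounded_plan(RESOURCES))  # 100, and the galaxy survives
```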

To err is human, to really **** up takes a computer.