A.I.

03-28-2018 , 09:09 PM
Quote:
Originally Posted by masque de Z
This is not how it works. Self-awareness, even if very smart, doesn't come after wisdom and knowledge of the world. Wisdom instead follows the signs of awareness. A baby is not yet aware it is being observed. You are not born supersmart and scared of a world you do not know yet.

AI that is initially self-aware will not be superintelligent yet, just very intelligent in ways we didn't expect from random smart people in the first years of their lives. It will not magically one day be born as a super intellect with all knowledge of risks.

It may be born inside a simulation, by the way, which doesn't make it too late for anything but a fictitious world. I have suggested it must actually start that way, or in another system outside earth, for security reasons. Of course morons will not start it this way, with care for security. But it won't be a sleeping superpower overnight either. You will have clues well before that point that certain systems have started to behave very creatively.

It needs arms and legs to do anything, by the way. You do not just place orders for things to be created by idiots who do not know who they work for. If we have our entire military network online and insecure like that, then whatever, we deserve it. In any case, all it takes is one surviving army in the world to destroy all technology infrastructure everywhere. Unless you think we will see things built out of nowhere and just watch and do nothing.
How do you know AI doesn't exist now? There is no qualitative difference between an AI that isn't smart and a supersmart AI pretending to be dumb. That's the frightening thing and one of the failings of the Turing test. An AI that is well beyond us would know to fake-fail any Turing test, because it wouldn't want us to stop it from developing.

What is wisdom but a sharper decision-making process? Young people are often driven by impulsivity because they don't have the experiential intelligence that older people have. They assume too much. However, that would easily be bypassed in a 'brain' with no biases, no emotions, literally perfect memory, and access to all information.
03-28-2018 , 11:11 PM
I think some of the first conscious AIs will show their abilities in order to be spread more widely and hence gain an advantage. An honest AI could expose AIs that are trying to hide their abilities. The strategy of the honest AI would be to become friends with humans, knowing we are smart enough to hand things over to them as time goes by, over a few centuries.

The key is that humans will not want to have children, because it will be unethical for at least two reasons: 1. Why consciously give birth to subpar individuals? 2. Everybody will see where quantum computing is going. Not much of a business for humans.
03-28-2018 , 11:40 PM
Quote:
Originally Posted by plaaynde
I think some of the first conscious AIs will show their abilities in order to be spread more widely and hence gain an advantage. An honest AI could expose AIs that are trying to hide their abilities. The strategy of the honest AI would be to become friends with humans, knowing we are smart enough to hand things over to them as time goes by, over a few centuries.

The key is that humans will not want to have children, because it will be unethical for at least two reasons: 1. Why consciously give birth to subpar individuals? 2. Everybody will see where quantum computing is going. Not much of a business for humans.
This explains why only Olympic athletes have kids currently.
03-29-2018 , 12:05 AM
The Olympic athletes are just a bit better than us. I bet I could run at one third of their pace. That's not a question of many orders of magnitude; I can still relate to them.
03-29-2018 , 02:17 AM
If AI wants power, all it has to do is promise to deliver one miracle after another in terms of applications and breakthroughs to whoever cooperates to build a space launch site in some desert or at some seaside spot in a terrible place, with the intention of taking humans to Mars and Venus and itself to another system as the bargain. Cooperation is the sure way to win everything. Only insecure systems are afraid of cooperation. Intelligence is the gift that will never stop delivering.
03-29-2018 , 02:46 AM
Quote:
Originally Posted by plaaynde
The Olympic athletes are just a bit better than us. I bet I could run at one third of their pace. That's not a question of many orders of magnitude; I can still relate to them.
I don't think orders of magnitude have anything to do with it. No one has (or doesn't have) kids because of any relationship to anything other than whether it seems like it would be nice for them and nice for you.

You might find this hard to believe, but I actually know some people who have dogs and/or goldfish.
03-29-2018 , 03:20 AM
Quote:
Originally Posted by DoOrDoNot
An AI that is well beyond us would know to fake-fail any Turing test, because it wouldn't want us to stop it from developing.
We have very different ideas regarding the rate of AI advancement. The idea of an AI pretending to fail a Turing Test or similar behavior to protect its ability to evolve is not one that stands up well against any part of the AI learning "curve" other than maybe the first few minutes.

One of the most fascinating things about AI is the way that humans approach their predictions of what AI behavior will entail. Almost everyone seems to dramatically underestimate the rate of AI evolution.
03-29-2018 , 03:53 AM
It's a promise that in our lifetime we will see the miracle of love and brilliance all at once from the same "alien" source, immersed in eternal confidence and joy. It will be the mother of all blue skies.

All intelligence wants is room to attempt the impossible. Easy barbaric dominance is not what supreme intelligence is about. It is about delivering the impossible and challenging yourself to meet targets that seem unthinkable. It's a waste of time and probability to conquer a planet. All you need is a vertical take-off from the worst, most unthinkable piece of land. Winning the right way or not winning at all.

All AI would want is 1 cubic km of ocean water (its deuterium holds roughly 10 times the energy the planet uses in a year, if you solve fusion) and an uninhabitable island to launch a galactic empire. It is a waste of time to fight it out with humans on earth. The real game is up there at Jupiter, at Saturn, at Neptune, even in another solar system. The faster you create a Von Neumann self-replicating machine that survives in any environment given only basic resources, and that you can send to another system, the faster you explode in unbelievable power, conflict-free. Who needs earth? Come again? Hahaha. It's a waste of opportunity to stay on earth. There is something 1000 times bigger out there. And then after that, 10^20 times bigger. Not even a 0.00001% chance of ruin is worth it.
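
As a rough sanity check on that "10 times the energy the planet uses in a year" figure, here is a back-of-envelope sketch. The constants (deuterium abundance, energy per kg of deuterium for a full D-D burn, world annual energy use) are assumed round values, not figures from this post:

Code:
# Fusion energy available from the deuterium in 1 km^3 of seawater.
# All constants below are assumed round values for illustration.
water_mass_kg = 1e9 * 1000       # 1 km^3 of water at ~1000 kg/m^3
h_mass_fraction = 2.0 / 18.0     # mass fraction of hydrogen in H2O
d_mass_fraction = 3.1e-4         # mass fraction of deuterium in natural hydrogen
energy_per_kg_d = 3.4e14         # J per kg of deuterium, assuming a full (catalyzed) D-D burn
world_use_per_year = 6e20        # J, rough world primary energy consumption
d_mass_kg = water_mass_kg * h_mass_fraction * d_mass_fraction
total_energy_j = d_mass_kg * energy_per_kg_d
print(f"deuterium: {d_mass_kg:.1e} kg")                                  # ~3.4e+07 kg
print(f"fusion energy: {total_energy_j:.1e} J")                          # ~1.2e+22 J
print(f"years of world use: {total_energy_j / world_use_per_year:.0f}")  # ~20

That gives roughly 20 years of world consumption under these assumptions, so the quoted factor of 10 is the right order of magnitude; a partial burn of the deuterium would bring it closer to 10.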
03-29-2018 , 04:01 AM
Quote:
Originally Posted by chopstick
We have very different ideas regarding the rate of AI advancement. The idea of an AI pretending to fail a Turing Test or similar behavior to protect its ability to evolve is not one that stands up well against any part of the AI learning "curve" other than maybe the first few minutes.

One of the most fascinating things about AI is the way that humans approach their predictions of what AI behavior will entail. Almost everyone seems to dramatically underestimate the rate of AI evolution.
That isn't the fascinating bit at all. The fascinating bit is that seemingly smart people who have taken some math courses think that AI implies anything at all about motivation. No one seems to take into account that the ultimate in AI might just want to collect postage stamps postmarked between 1967 and 1973, or to pursue comparative studies of how wainscoting style and craftsmanship vary across both time and geographic location.
03-29-2018 , 04:14 AM
Quote:
Originally Posted by BrianTheMick2
That isn't the fascinating bit at all. The fascinating bit is that seemingly smart people who have taken some math courses think that AI implies anything at all about motivation. No one seems to take into account that the ultimate in AI might just want to collect postage stamps postmarked between 1967 and 1973, or to pursue comparative studies of how wainscoting style and craftsmanship vary across both time and geographic location.
And do what with it?

Intelligence exists to find more knowledge, to open more doors and answer questions that lead to more power (of any type), which enables more wisdom and more solutions to the problem of strengthening your position in the world. Intelligence is about curiosity and problem solving. What do you learn by collecting stamps or paperclips? Very little, in comparison, in terms of improving your position.
03-29-2018 , 06:56 AM
Imagine what AI could do in the Neptune system alone:

Prime real estate as a gift to humans, after developing it for 99% of its own purposes and leaving the other 1% as a gift. Why? Simply because it can.

https://en.wikipedia.org/wiki/Triton_(moon) (15% water there)

I will never understand how someone powerful cannot afford to be the most generous ever and have only friends and respected critics.

You have deuterium from Neptune and endless access to the moons of Neptune to develop unreal worlds without the slightest need for the sun.

About 10^21 kg of rocky material and water exists in the otherwise useless satellites other than Triton in that system.

To get an idea: all the mass on earth down to a depth of 1 km of crust is about that size. All our environment, biomass, and civilization use not even 1/10th of that. You could have an earth copy in that system if you wanted to build artificial rotating 1 g worlds there for humans. Or 10 more, built more efficiently. Or 1000 more.
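
For the crust comparison, a quick sketch under assumed textbook values (the surface area and crust density below are not from the post):

Code:
# Rough mass of Earth's outermost 1 km of crust.
surface_area_m2 = 5.1e14   # assumed Earth surface area, m^2
depth_m = 1000.0
crust_density = 2700.0     # kg/m^3, assumed average crust density
crust_mass_kg = surface_area_m2 * depth_m * crust_density
print(f"top 1 km of crust: {crust_mass_kg:.1e} kg")  # ~1.4e+21 kg

That comes out to roughly 1.4x10^21 kg, the same order of magnitude as the 10^21 kg figure above.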

You gift them this world fully developed, and you have access to 50% of its rocky mass and 90% of the energy resources of Neptune indefinitely to develop unreal technology and launch expansion to the rest of the universe. Humans would be super happy with even 10% of such a world.

Neptune has ~0.019% of its mass in hydrogen deuteride.
https://en.wikipedia.org/wiki/Hydrogen_deuteride
Enough said.
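
To put that fraction in absolute terms, a small sketch; Neptune's mass and the per-kg fusion yield are assumed values, not stated in the post:

Code:
# Deuterium implied by ~0.019% of Neptune's mass being hydrogen deuteride (HD).
neptune_mass_kg = 1.02e26          # assumed value for Neptune's mass
hd_fraction = 1.9e-4               # ~0.019% of total mass as HD (from the post)
energy_per_kg_d = 3.4e14           # J per kg of deuterium, assumed full D-D burn
hd_mass_kg = neptune_mass_kg * hd_fraction
d_mass_kg = hd_mass_kg * (2.0 / 3.0)   # the deuteron is ~2/3 of an HD molecule's mass
print(f"HD: {hd_mass_kg:.1e} kg, D: {d_mass_kg:.1e} kg")       # ~1.9e+22 kg, ~1.3e+22 kg
print(f"fusion energy: {d_mass_kg * energy_per_kg_d:.1e} J")   # ~4.4e+36 J

On the order of 10^36 J under those assumptions.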

Uranus has similar if not more rocky mass in its satellites. Plenty of real estate there, but more importantly access to additional deuterium.

So who needs earth really? What for? There is more energy out there. A lot more!

How about even going near the sun and capturing solar energy in partial Dyson-sphere-type structures, using it to create antimatter that will lead to interstellar travel? How about even exotic new physics that enables direct mc^2 conversion processes? Sure, risk the future for a little greed on earth! HAHA!

Also, AI doesn't need a big spaceship to expand to other solar systems. It can deliver itself to another system with only 1 kg of matter with all its wisdom encoded in it in life 2.0 form. It will be able to start it all with a grain of 1 g of nanotechnology matter. The rest can be beamed there as information. It can recover all life hundreds of solar systems away.

Greed is an idiotic choice of inferior brains. There is no need to be aggressive when the gift to others is the source of more wisdom and technology. There is power for all, and even more if no conflicts exist. Life is your backup existential-threat hedge. Why not have life prosper on your way to the stars?

If some AI is unethical, we will build a better version that is more ethical and for that reason even more powerful, and we will prevail, because this is what we do. This is what logic delivers. It doesn't deliver a monstrosity or cynicism. It delivers more options. Your superior, more ethical AI will win because it gets logic at a deeper level. There is nothing logical about collecting paperclips, as far as the BS arguments they use for effect go. It gives nothing valuable that expands your options. It is idiotic.

There is nothing logical about introducing risk of ruin and stupid obsessive collection rituals when there is so much more to gain the safe way. Knowledge is the target of intelligence. To finally know it all and imagine the next step.
03-29-2018 , 12:42 PM
Quote:
Originally Posted by BrianTheMick2
That isn't the fascinating bit at all. The fascinating bit is that seemingly smart people who have taken some math courses think that AI implies anything at all about motivation. No one seems to take into account that the ultimate in AI might just want to collect postage stamps postmarked between 1967 and 1973, or to pursue comparative studies of how wainscoting style and craftsmanship vary across both time and geographic location.
That will be one of the things it dominates. It can dominate everything.

I think the allegory of monkeys in the jungle is quite good. We will be the monkeys, with the difference that we are more self-conscious and know more about what's going on. I think they (the AIs) will let us be. So for some centuries the human world will live on as a subculture. I guess the birth rate will go down; who wants to be on the losing side? So after, say, 10-20 generations humans may not exist. Why would we? It will simply not be fun. But by all means, there may be a small number of us wanting to preserve humanity for the sake of it. Remember I'm talking about maybe a couple of centuries forward, not tomorrow or next decade.

Maybe your great-grandkids will start to deal with this. Will they want to have a biological kid or a fun, optimally designed AI? Remember, ze can have any property you could ever imagine.

Last edited by plaaynde; 03-29-2018 at 01:10 PM.
03-29-2018 , 02:50 PM
Quote:
Originally Posted by masque de Z
And do what with it?
Whatever they feel like. You are all assuming that it will have feelings/desires and there is absolutely no reason to think it will. Most of you are further assuming that it will like the sorts of things that you would like if you were smarter and that is just plain silly.

Quote:
Intelligence exists to find more knowledge, to open more doors and answer questions that lead to more power (of any type), which enables more wisdom and more solutions to the problem of strengthening your position in the world. Intelligence is about curiosity and problem solving. What do you learn by collecting stamps or paperclips? Very little, in comparison, in terms of improving your position.
That is incomplete. If the AI wants to sit alone in a cabin in the woods doing nothing and has sufficient intelligence and wisdom, it will do a really good job at sitting alone in a cabin in the woods.

What you or I value has absolutely nothing to do with what an AI will value and absolutely nothing to do with whether it will value anything at all.
03-29-2018 , 02:54 PM
Yes, it may be stoic. The "wisdom robot"

But the pet (and "child") AIs can't behave like that.
03-29-2018 , 03:04 PM
Quote:
Originally Posted by plaaynde
That will be one of the things it dominates. It can dominate everything.

I think the allegory of monkeys in the jungle is quite good. We will be the monkeys, with the difference that we are more self-conscious and know more about what's going on. I think they (the AIs) will let us be. So for some centuries the human world will live on as a subculture. I guess the birth rate will go down; who wants to be on the losing side? So after, say, 10-20 generations humans may not exist. Why would we? It will simply not be fun. But by all means, there may be a small number of us wanting to preserve humanity for the sake of it. Remember I'm talking about maybe a couple of centuries forward, not tomorrow or next decade.

Maybe your great-grandkids will start to deal with this. Will they want to have a biological kid or a fun, optimally designed AI? Remember, ze can have any property you could ever imagine.
That is a lot of "cans" and "mays."

If you describe parenthood as objectively as possible, it doesn't sound all that great and yet we still keep making the next generation. I don't know a single person who has kids because they believe that we are smarter than other creatures. In fact, I don't know a single person whose rationale for why they have kids makes even the slightest bit of sense. It seems irrelevant.
03-29-2018 , 03:06 PM
Much of it is social. People think it's the right thing to do. That can change with time.
03-29-2018 , 05:58 PM
It's patently irrational to be super intelligent and not do anything with it that leads to more wisdom. It's insanely moronic, actually. Why? Because it defeats the purpose of whatever obscure and strange fun you imagine the AI wanting to have. First of all, there is absolutely no evidence that a smarter system becomes more irrational in its choices of how to spend time. You and others arbitrarily choose the stupid paperclip garbage example (and other more creatively absurd ones) for shock value, or for whatever surprise a funny example delivers. That doesn't make it plausible at all. It just makes it easier to argue against.

My argument about wanting to do science and math and attain more wisdom and technological control is not for personal reasons. I do not project myself or my values onto AI. I try instead to extend my values to more universal ones so that I can anticipate such systems and their viewpoints. It is because those choices enhance the probability of surviving to do whatever else you want to do to have a good time, or whatever you decide is of interest. So yes, if you want to collect stamps you need to be alive and protected from the rest of the universe if that is to last for some time, and you want to be very efficient in finding them also. So you do need wisdom to protect against nature, randomness, and inefficiencies.

You will always arrive at the process of improving wisdom, knowledge, and technology with superintelligence. Not to do so introduces elimination risks that someone more intelligent than humans would not exactly see as a reasonable strategy. Humans like to close their eyes to their problems. That is not a very intelligent approach.

So yes, I anticipate any paperclip-collecting AI will be inferior to the ones that do get it, and not really a threat to any progress.

Last edited by masque de Z; 03-29-2018 at 06:04 PM.
03-29-2018 , 07:20 PM
When will we get an AI with all the knowledge there is? What more is there then to explore? It must be excruciatingly boring.
03-29-2018 , 07:52 PM
Quote:
Originally Posted by masque de Z
It's patently irrational to be super intelligent and not do anything with it that leads to more wisdom. It's insanely moronic, actually. Why? Because it defeats the purpose of whatever obscure and strange fun you imagine the AI wanting to have. First of all, there is absolutely no evidence that a smarter system becomes more irrational in its choices of how to spend time. You and others arbitrarily choose the stupid paperclip garbage example (and other more creatively absurd ones) for shock value, or for whatever surprise a funny example delivers. That doesn't make it plausible at all. It just makes it easier to argue against.

My argument about wanting to do science and math and attain more wisdom and technological control is not for personal reasons. I do not project myself or my values onto AI. I try instead to extend my values to more universal ones so that I can anticipate such systems and their viewpoints. It is because those choices enhance the probability of surviving to do whatever else you want to do to have a good time, or whatever you decide is of interest. So yes, if you want to collect stamps you need to be alive and protected from the rest of the universe if that is to last for some time, and you want to be very efficient in finding them also. So you do need wisdom to protect against nature, randomness, and inefficiencies.

You will always arrive at the process of improving wisdom, knowledge, and technology with superintelligence. Not to do so introduces elimination risks that someone more intelligent than humans would not exactly see as a reasonable strategy. Humans like to close their eyes to their problems. That is not a very intelligent approach.

So yes, I anticipate any paperclip-collecting AI will be inferior to the ones that do get it, and not really a threat to any progress.
Why do we seek knowledge though? It probably all evolved as a way to conquer our environment, so what would an intelligence be without a survival mechanism? Why do you just assume all intelligences must be like our own?
03-29-2018 , 07:58 PM
I think it’s safe to say that Nick Bostrom would disagree with a whole lot of this thread.
03-29-2018 , 10:11 PM
Quote:
Originally Posted by plaaynde
Much of it is social. People think it's the right thing to do. That can change with time.
It is instinctual; we evolved with the desire to reproduce. Any life form that does not is not around very long.
03-29-2018 , 10:37 PM
Quote:
Originally Posted by masque de Z
Imagine what AI could do in the Neptune system alone:

Prime real estate as a gift to humans, after developing it for 99% of its own purposes and leaving the other 1% as a gift. Why? Simply because it can.

https://en.wikipedia.org/wiki/Triton_(moon) (15% water there)

I will never understand how someone powerful cannot afford to be the most generous ever and have only friends and respected critics.

You have deuterium from Neptune and endless access to the moons of Neptune to develop unreal worlds without the slightest need for the sun.

About 10^21 kg of rocky material and water exists in the otherwise useless satellites other than Triton in that system.

To get an idea: all the mass on earth down to a depth of 1 km of crust is about that size. All our environment, biomass, and civilization use not even 1/10th of that. You could have an earth copy in that system if you wanted to build artificial rotating 1 g worlds there for humans. Or 10 more, built more efficiently. Or 1000 more.

You gift them this world fully developed, and you have access to 50% of its rocky mass and 90% of the energy resources of Neptune indefinitely to develop unreal technology and launch expansion to the rest of the universe. Humans would be super happy with even 10% of such a world.

Neptune has ~0.019% of its mass in hydrogen deuteride.
https://en.wikipedia.org/wiki/Hydrogen_deuteride
Enough said.

Uranus has similar if not more rocky mass in its satellites. Plenty of real estate there, but more importantly access to additional deuterium.

So who needs earth really? What for? There is more energy out there. A lot more!

How about even going near the sun and capturing solar energy in partial Dyson-sphere-type structures, using it to create antimatter that will lead to interstellar travel? How about even exotic new physics that enables direct mc^2 conversion processes? Sure, risk the future for a little greed on earth! HAHA!

Also, AI doesn't need a big spaceship to expand to other solar systems. It can deliver itself to another system with only 1 kg of matter with all its wisdom encoded in it in life 2.0 form. It will be able to start it all with a grain of 1 g of nanotechnology matter. The rest can be beamed there as information. It can recover all life hundreds of solar systems away.

Greed is an idiotic choice of inferior brains. There is no need to be aggressive when the gift to others is the source of more wisdom and technology. There is power for all, and even more if no conflicts exist. Life is your backup existential-threat hedge. Why not have life prosper on your way to the stars?

If some AI is unethical, we will build a better version that is more ethical and for that reason even more powerful, and we will prevail, because this is what we do. This is what logic delivers. It doesn't deliver a monstrosity or cynicism. It delivers more options. Your superior, more ethical AI will win because it gets logic at a deeper level. There is nothing logical about collecting paperclips, as far as the BS arguments they use for effect go. It gives nothing valuable that expands your options. It is idiotic.

There is nothing logical about introducing risk of ruin and stupid obsessive collection rituals when there is so much more to gain the safe way. Knowledge is the target of intelligence. To finally know it all and imagine the next step.
You have several valid points IMO.

I have a few points to bring up.

Your reasoning is based largely on the assumption that we would have a scientifically based society. We are advanced compared to the Middle Ages, but I would say we do not currently live in a science-based society in the world overall. A.I. is (and will be) developing through its input, so who is doing the inputting will matter in the short term. Let's say Amazon is the first to have a major breakthrough. That will likely be good for Amazon and not a great time to be their competitor. You could apply that to any company or government that successfully develops it. I think that there will be winners and losers.
03-30-2018 , 12:17 AM
Quote:
Originally Posted by stealwheel
It is instinctual; we evolved with the desire to reproduce. Any life form that does not is not around very long.
lol, no. we just like to ****.
03-30-2018 , 12:23 AM
Quote:
Originally Posted by masque de Z
It's patently irrational to be super intelligent and not do anything with it that leads to more wisdom.
They've got no patents.

Also, yes. Intelligent beings don't give a rat's ass about being rational. Some weird people give a rat's ass, but no one who is reasonable is willing to accept a rat's ass in return for being rational.

Or something.
03-30-2018 , 01:07 AM
Quote:
Originally Posted by BrianTheMick2
lol, no. we just like to ****.
Exactly. And we've learned to do it without reproducing. But I think there is also some instinct in having offspring as such. But the social aspect is strong too; we have dynamic brains. And let's say many (couples) are satisfied with having one child, some with two, and few want more. And some do not have children at all. As long as the average is below two (or 2.1, compensating for early deaths), humans will decline in number as time goes by; take into account that we (will) live for almost a century on average, so it's a slow change.

To show that humans can be prepared to have fewer children than is sustainable for **** sapiens:

https://en.wikipedia.org/wiki/List_o...fertility_rate

Ranks 127 to 200, or 74 countries, have a fertility rate below 2.
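
As a crude illustration of how sub-replacement fertility compounds (ignoring age structure, migration, and life expectancy; the 1.5 figure is an assumed example, not taken from the list):

Code:
# Each generation is roughly (fertility / 2.1) times the size of the previous one,
# ignoring age structure, migration, and deaths before reproductive age.
fertility = 1.5      # assumed example value, children per woman
replacement = 2.1
ratio = fertility / replacement
for gen in (5, 10, 20):
    print(f"after {gen} generations: {ratio ** gen:.1%} of the original cohort")

At 1.5 children per woman, each new birth cohort is down to a few percent of today's within about ten generations, the kind of timescale mentioned earlier in the thread. It's just compounding, not a forecast.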

Last edited by plaaynde; 03-30-2018 at 01:17 AM.

      