Quote:
Originally Posted by John21
Maybe, or maybe it wouldn’t have any volition or will to power at all. But you’re somewhat conflating that sort of AI with full technological automation of the sort that replaces the bulk of human labor. I don’t think the latter necessitates the former. Nor do I think its emergence would necessarily be monopolistic. The way the model of full automation runs out (in my head) — outside of human life, creativity, artistic expression, social interaction, etc., the only thing of intrinsic value left is land. Basically, the value of everything else including housing reduces to near zero, i.e., free.
Well, my scenario doesn't require AI volition, because I'm talking about the will to power of the AI's controllers. I said the AI could overcome its creators, but that's mostly irrelevant for the masses. Anyway, we don't need to discuss that: even if the AI is entirely without will, its controllers do have will.
Yes, it's not necessarily monopolistic, but it's very hard to see how it won't be. For strong AGI not to be monopolistic, it would have to be developed at the same time by different groups with the same power/potential; if one group's AGI is even slightly better, its lead becomes exponential, because the first task you set it to is improving itself (toy sketch below).
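Here's a toy numerical sketch of that compounding argument. Everything in it is an assumption I made up for illustration (the constant K, the starting 1% gap, the "runaway" threshold); it's not a forecast, just a way to see how a tiny head start runs away once each round of self-improvement scales with current capability:

# Toy model: two labs whose AI improves itself each round, and the size of
# each improvement scales with current capability (the smarter it is, the
# faster it gets smarter). All constants are arbitrary illustrative choices.
K = 0.01  # self-improvement efficiency (made-up constant)

def step(c):
    # Next capability: current capability plus a boost proportional to c squared,
    # i.e. a more capable system makes bigger improvements to itself.
    return c + K * c * c

a, b = 1.01, 1.00  # lab A starts just 1% ahead of lab B
for round_no in range(1, 1000):
    a, b = step(a), step(b)
    if a >= 1e6:  # stop once lab A crosses an arbitrary "runaway" threshold
        break

print(f"After {round_no} rounds: lab A = {a:.3g}, lab B = {b:.3g}, lead = {a/b:.1f}x")

The point isn't the specific numbers: in any model where capability feeds back into the rate of improvement, the slightly better system pulls away instead of staying 1% ahead.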
Yes, I am conflating AGI with full automation of human tasks, because that's what you need to fully automate human tasks.
Think of construction: even if the manual labor (bricklaying and similar tasks) is automated, that's mostly irrelevant for human labor and costs (after some rearrangement). If you still need all the rest (architects, planners, people to arrange financing, people to sell the houses after you build them, etc.), automating the physical building part doesn't destroy demand for jobs.
Full automation comes only when all the intellectual tasks can be automated as well, and the only human interaction is "I want a three-story house there": after clicking some buttons, construction starts with no humans involved until it's completed.
Still, please be careful NOT to think that solves scarcity, because it doesn't. That three-story house still has value, in the sense of the resource cost to make it, and those costs don't disappear: the fully automated process still consumes energy and materials, which aren't free and aren't infinite.
But anyway, nothing even close to full automation can come without strong AGI, or without dedicated "local" AIs that are better than humans at specific tasks, for every single task where we now use humans, everything included (regulatory compliance and so on).
It's not impossible that we never get AGI but instead get thousands of dedicated AIs, each better than humans at some task. In a practical sense, those thousands of AIs in aggregate would be an AGI. But that outcome helps a lot with the monopoly problem you mentioned before: if that's the model, the risks are lower, because ownership can be spread across many different groups.
Btw, if we reach the point where an AI (whether AGI or dedicated to a single task) is better than humans at planning a building, it's almost impossible that we won't also have a dedicated (or AGI) AI that is better than humans at social interaction and creativity.
Dunno why we still think those are "impossible to replicate" tasks.
So, wrapping up:
1) Full automation, in the sense of human labor becoming mostly irrelevant (this includes fully automated repairs), requires AGI or thousands of dedicated AIs
2) Full automation doesn't solve resource scarcity (unless and until the AGI does it for us)
3) Whoever controls the AGI has absolute power over humanity, unless there are several competing AGIs at almost identical levels of intelligence, or the thousands of dedicated AIs are owned by many different groups
4) Someone having absolute power over humanity, in a world where resource scarcity isn't solved, is imho GG for most of humanity.
Last edited by Luciom; 09-23-2018 at 04:46 AM.