Singularity Becoming a Mainstream Idea? - Cover of Time

02-21-2011 , 02:12 PM
FWIW I've hung around on a bunch of transhumanist and life extension forums, and Kurzweil rarely gets mentioned. In fact, when his name does come up, there are usually criticisms levelled at his predictions and his supplement regimen. His status as 'leader' exists more in the eyes of the outside world than in those of his supposed 'disciples'.
02-26-2011 , 12:24 AM
Quote:
Originally Posted by Flip-Flop
And why are you people so concerned about what others think?
You believe in the singularity and you believe it will happen soon; good for you, go work on it, help the cause. Why are you looking for other people's acceptance?
You need donations?

Yep, it's a cult.
It's important to me that this becomes a mainstream idea because I see it as both a major threat and a major opportunity. Unless the speed at which we improve computers' processing power and memory breaks its exponential trend soon, we will have the potential to build smarter-than-human AI in just a few decades.

Many people don't realize this is a possibility, and our best hope lies in how we design the first AIs. I find it unfortunate that people dismiss the idea because it sounds counterintuitive or too good to be true.

Some projections suggest that in 2020 you will have the hardware power of a human brain for $1,000, and that by 2030 you will have the hardware power of a human brain for $1. Think of the economic incentive to create the software to put that to work.
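
To put the implied rate in perspective (a quick back-of-the-envelope sketch of my own; the only inputs are the two quoted price points): going from $1,000 to $1 per brain-equivalent over ten years works out to price-performance doubling roughly once a year.

Code:
# Back-of-the-envelope check of the quoted projections: $1,000 per
# human-brain-equivalent of hardware in 2020 falling to $1 by 2030 is a
# 1000x improvement in price-performance over 10 years.
import math

cost_2020 = 1000.0   # dollars per brain-equivalent (quoted projection)
cost_2030 = 1.0
years = 2030 - 2020

improvement = cost_2020 / cost_2030              # 1000x over the decade
doubling_time = years / math.log2(improvement)   # ~1.0 years
annual_factor = improvement ** (1 / years)       # ~2.0x per year

print(f"Implied doubling time: {doubling_time:.2f} years")
print(f"Implied yearly improvement: {annual_factor:.2f}x")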
02-26-2011 , 12:40 AM
Quote:
Originally Posted by DMACM
by 2030 you will have the hardware power of a human brain for $1.
These kinds of projections make me stop and think for a while.
02-26-2011 , 09:52 AM
Nobody is dismissing the idea of AI.
There are universities that specialize in teaching AI.
Tons of scientists work in the field trying to crack the AI code.
All of them are adequately funded.
They keep their mouths shut, do their job, and are getting ever closer to a solution.

They do not waste money on marketing and movies.
They do not waste their time trying to be mainstream because they are not looking for followers.
They do not ask me for donations.
They do not prophesy.
They do not preach about eternal life, resurrecting family members and curing all diseases known to man.
Religious cults on the other hand, do all those things.

I'm all for AI and support all the people around the world that work on it.
I don't feel a need to support, follow and fund some random nutjob though.
02-26-2011 , 12:56 PM
Quote:
Originally Posted by Flip-Flop
I'm all for AI and support all the people around the world that work on it.
I don't feel a need to support, follow and fund some random nutjob though.
Liked your post.
02-26-2011 , 06:49 PM
Quote:
Originally Posted by Subfallen
You're thinking of Douglas Hofstadter, who is obviously a first-order genius as well but can't touch Kurzweil's resume. (Unless you have mad respect for the literature Pulitzer or something.)
Whether he can touch Kurzweil's resume I don't know... and I find those kinds of comparisons fairly subjective anyway. Hofstadter is the man. I'd take him over Kurzweil any day, though I also enjoy Kurzweil on a different level.

Quote:
Originally Posted by acehole60
I don't care if Kurzweil is right or wrong in predicting the date of the singularity (or any other stuff for that matter); it could still be interesting to discuss the possibility of it.
How novel!
02-26-2011 , 07:48 PM
Quote:
Originally Posted by Ryan Beal
Whether he can touch Kurzweil's resume I don't know... and I find those kinds of comparisons fairly subjective anyway. Hofstadter is the man. I'd take him over Kurzweil any day, though I also enjoy Kurzweil on a different level.
QFT. Anyone making predictions should be very worried to find themselves opposed by Hofstadter.
01-31-2012 , 03:51 AM
Saw the movie, read a lot about it... but haven't read the book.

Does he give enough hard calculation? This problem is ultimately one of calculation, computing power, exponents and such. The details of the technological milestones he uses to gauge progress towards the singularity (the 2009 predictions, the Turing test, the million-times-faster computer, etc.) are likely speculative.

But does he show calculations? If he hasn't bothered to do that, then it's clearly complete rubbish.

Does he give a reasoned computing threshold for when he believes the Turing test will be passed, or is that just speculation?

Does he show data from the past, both the lulls and the explosive periods of tech progress? It's definitely not perfectly exponential. Moore's law isn't a truly accurate prediction. Why is this? (A rough fit over historical transistor counts is sketched at the end of this post.)

I have a feeling he doesn't really delve into details for some reason, or if he does, it's likely very naive math (albeit complex, I'm sure) that doesn't take into account the real-world problems of implementing new technology at an ever-increasing pace. Take energy, for example: this could easily be a problem that at least slows tech progress at some remote point in the future. Or wars. Or a plague that inhibits growth; hell, even something as minor as government intervention or legal issues pertaining to augmenting humans.

Or... what if there's simply no economic motivation for society to undertake the endgame sort of stuff he talks about, like assimilating planets and memories (assuming the AI is still at our command and hasn't gone "rogue" or whatever)? Will an earth-sized computer even be necessary? Computing power meets a demand... why do we need to decode and process all the data in the entire universe? There certainly needs to be a strong incentive here, and I don't see one.

People see stuff like cloning sheep and say "oh, society will do anything in the name of progress". But things like that are done to meet legitimate demands on behalf of humans (health, etc.). Some of the crazier achievements Kurzweil describes seem like they are not driven to fit any demand on behalf of society, or even the AI itself for that matter. Does AI automatically feel the "need" to advance itself all the time? Could it not be content at a certain level which fits its required needs and society's incentives? (much like a real intelligence, for the most part). It's just all so presumptuous, and there are so many things, an unlimited number of things, that could, and will, stand in the way.

IMO, things will undoubtedly get advanced very quickly. Probably even freakishly so. But I don't think what he describes is certain to happen, and even if by chance it did, surely not this soon.

But honestly, the singularity is a completely different thing from just rapid expansion in technology; it's such an unfathomably multiplied level of technological progress, almost an unlimited mathematical event where time as we know it simply vanishes.

And it is dependent on the perfect functioning of these outrageously demanding exponentials; if even occasionally there are production problems or something else that halts progress in the world (which will undoubtedly happen), growth will have to be reset. It doesn't seem like it would work the way he describes.
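
As a rough sanity check on the historical trend itself (a toy least-squares fit of my own over a handful of approximate published Intel transistor counts; it says nothing about whether the trend continues), the average doubling time over 1971-2008 comes out near the classic two years:

Code:
# Rough check of whether transistor counts really grow exponentially, and
# with what doubling time. Counts are approximate published figures for a
# few Intel CPUs. A straight-line fit of log2(count) against year gives
# the number of doublings per year.
import math

data = {            # year: approximate transistor count
    1971: 2_300,         # 4004
    1978: 29_000,        # 8086
    1985: 275_000,       # 80386
    1993: 3_100_000,     # Pentium
    2000: 42_000_000,    # Pentium 4
    2008: 731_000_000,   # Core i7 (Nehalem)
}

years = list(data.keys())
log_counts = [math.log2(c) for c in data.values()]

# Ordinary least-squares slope of log2(count) against year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(log_counts) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(years, log_counts))
    / sum((x - mean_x) ** 2 for x in years)
)

print(f"Doublings per year: {slope:.2f}")
print(f"Implied doubling time: {1 / slope:.1f} years")  # roughly 2 years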

Last edited by MurderbyNumbers123; 01-31-2012 at 04:03 AM.
01-31-2012 , 10:04 AM
A lot of Kurzweil haters here, but the answer to most of your questions is yes, imo. Read the book.

I will just add that his projections of exponential progress don't necessarily assume smooth sailing over the short or even medium term.
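
To illustrate that point (a toy simulation of my own; the growth rate and stall probability are made-up numbers, not anything from the book): years in which progress stalls delay an exponential trend by roughly the number of stalled years, rather than resetting it.

Code:
# Toy illustration: exponential growth with occasional "bad years" in which
# progress stalls completely (wars, recessions, supply shocks...). The
# disrupted curve lags the smooth one by roughly the number of bad years;
# it does not collapse back to the start. All numbers are illustrative.
import random

random.seed(0)
GROWTH = 1.4      # assumed improvement factor per good year
P_STALL = 0.15    # assumed chance that a given year sees no progress
YEARS = 40

smooth = disrupted = 1.0
stalled_years = 0
for _ in range(YEARS):
    smooth *= GROWTH
    if random.random() < P_STALL:
        stalled_years += 1
    else:
        disrupted *= GROWTH

print(f"Smooth exponential after {YEARS} years: {smooth:,.0f}x")
print(f"With {stalled_years} stalled years mixed in: {disrupted:,.0f}x")
print(f"Lag versus the smooth curve: about {stalled_years} years")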
01-31-2012 , 11:21 AM
Dr. Tom Campbell is good. You can find lots of videos of him on YouTube. He talks clearly and, to me at least, makes sense. I find him to be very rational and level-headed.

When I listen to Kurzweil speak, there is something that just doesn't sit right with me. He just sort of talks in circles and doesn't really say anything. It's all kind of an easy extrapolation that any semi-intelligent person could foresee if they really thought about it.

I'd be interested to hear from anyone else in here who has listened to Campbell's work.
01-31-2012 , 03:27 PM
Murder, I think most of what you wrote is intelligent and well founded, and the answer is simply, "Yes, that's a huge concern, and yes, Kurzweil addresses it in his book. He talks about possible pitfalls and agrees that it's not all 100%. He gives a lot of details, facts, data, reasoning, examples, etc. that show why his optimism isn't poorly founded."

Quote:
Originally Posted by MurderbyNumbers123
People see stuff like cloning sheep and say "oh, society will do anything in the name of progress". But things like that are done to meet legitimate demands on behalf of humans (health, etc.). Some of the crazier achievements Kurzweil describes seem like they are not driven to fit any demand on behalf of society, or even the AI itself for that matter. Does AI automatically feel the "need" to advance itself all the time? Could it not be content at a certain level which fits its required needs and society's incentives? (much like a real intelligence, for the most part). It's just all so presumptuous, and there are so many things, an unlimited number of things, that could, and will, stand in the way.
This paragraph, on the other hand, shows a lack of understanding IMO.

There are many scientific research projects that have no immediately tangible goal. CERN's Large Hadron Collider was built in large part to confirm the existence of the Higgs boson. That in and of itself has very little tangible value. The idea is that it will lead to a better understanding, and that will indirectly lead to things that do have tangible value. It's not like "I want to invent A, so I invest in research project B" is the only way to go about these things. It's rarely that straightforward.

Quote:
Does AI automatically feel the "need" to advance itself all the time?
Yes, because we will program it with that in mind. Many people ask, "will we be able to control the AI?" To that I like to answer, "WE WILL BE the AI." (neural implants, etc.)




To me there are very few questions that need to be answered to logically "almost prove" the singularity.

A) Will we be able to avoid killing ourselves before we get to the point of mastering C?
B) Forgetting human knowledge for a second, is there a great deal of scientific progress that is even possible (e.g. super-bountiful energy sources, various nanotechnology dreams, or crazy **** that happens in sci-fi movies)?
C) Is it possible to create consciousness artificially?

If the answer to all of these questions is yes, I don't see how creating AI won't inevitably lead to a singularity.

That being said, I think there is at least a logical argument for why the answer to any of the above questions could be no, but personally I am relatively confident the answer is yes to all three.
01-31-2012 , 04:12 PM
If we take self-destruction out of the equation (something I believe is a real possibility, like Justin's point), I believe it is more likely than not that something akin to Kurzweil's singularity will occur.
01-31-2012 , 08:19 PM
I don't personally think that self-destruction is even a minor possibility. People place little faith in economics and technological progress, yet they're the only things that have ever significantly improved our lives. Others place little faith in humans themselves, and that viewpoint, to me, is not only detrimental but scientifically laughable and rather dogmatic.
01-31-2012 , 10:03 PM
ZJ, honestly I will just have to read Kurzweil's book; the fact is I haven't fully researched it.

Even so, even if all three of your postulates are true, I still don't see how this necessarily makes a singularity inevitable. My points about incentives (ours, the AI's, or both) are still valid, I think.

Situation A) The intelligence is under our control (even if it is a part of us as a species): why would we ask it to go to the lengths Kurzweil suggests if it benefits no one? (Assimilating entire planets, our own planet as well as every living being, every atom in the universe, for computing power.)

Situation B) The intelligence is not under our control. Even if this is the case, why would a self-aware intelligence see Kurzweil's outcome as a necessity? Once it is self-aware I don't see why it would desire to continue following the very mechanical and, dare I say, primitive function of aimless self-improvement... You say that we would program it with that in mind, and although I am sure we will teach it to self-improve, why is it not equally likely that it would solve all the problems in the world, and then, when it truly became necessary to improve itself again, do so? This is more efficient; surely an omniscient, brilliant AI would understand this. Why would aimlessly restructuring the universe ever be an efficient use of an intelligent being's time?

I just think Kurzweil talks about technological development as some sort of virus replicating mindlessly when it is anything but that. It is calculated, and even if curiosity drives innovation, progress is ultimately carried out to fit a purpose. Even if it were self-aware, I think there is a great likelihood it would keep following that paradigm.

Last edited by MurderbyNumbers123; 01-31-2012 at 10:29 PM.
01-31-2012 , 10:08 PM
And also, as far as CERN is concerned, I don't know specifically about the particle you mention, but they do tons of research with various subatomic particles (weren't they the ones who reported the faster-than-light particle?), and all of it certainly could have practical application, even if only in the future. Even the most complex and obscure branch of science, theoretical physics, has been immensely useful in the last century, for better or for worse.

I'm sure you would be hard-pressed to find a single person working there who didn't think that their work was of the utmost importance to the advancement of useful technologies or our understanding of the forces of nature, be it now or in the future.

The vast, vast majority of scientific research is done with the aim of useful progress in mind. I don't think this can really be disputed. Sure, it might not be as direct as A begets B, but nothing is done without motive. And economic motivation is immensely important. I am 100% certain, beyond a shadow of a doubt, that cures for cancer and AIDS, the colonization of space, and the development of faster and faster computers will all be researched because of incentives. Why would this change come Kurzweil's endgame scenario?

Whatever, definitely rambling, but will read the book and see if it answers more questions. I do find the subject fascinating, and not conceptually impossible either.

Last edited by MurderbyNumbers123; 01-31-2012 at 10:27 PM.
01-31-2012 , 10:24 PM
Motivation and ego are constructs of the mind, evolved over thousands of generations to facilitate productivity and hence survival. A computer aware of its own existence would also be aware of its immortality. What motivation would drive a computer to acquire more knowledge or intelligence, when all motivation stems from and relates to the evolutionary need to survive? Apart from some general maintenance every now and then, for the computer this need is already met. Hence, it's difficult to imagine why the computer would be motivated at all...
01-31-2012 , 11:20 PM
Didn't know the thread existed, but I am basically hoping to dedicate my life to trying to cure aging, which basically means working with Aubrey de Grey's SENS program. Kurzweil is a charlatan though, I think; you can't expect computers to get so smart that they will solve all problems for you.
02-01-2012 , 02:42 AM
Quote:
Originally Posted by spino1i
Didn't know the thread existed, but I am basically hoping to dedicate my life to trying to cure aging, which basically means working with Aubrey de Grey's SENS program.
I'd love to hear more if you're willing to share. (I'm familiar w/ SENS, so I'm interested in your personal involvement / history)
02-01-2012 , 07:59 AM
Biological emergent AI. First encounter was an Atari 2600, Phoenix.

Long story. Kurzweil and his lot are pretty dumb actually, except their moralistic aims are very acute. They want a better humanity.

We've been awake since '08.

Hiroshima and Nagasaki were nodal points. Take otaku culture, the uneasy moral questions posed by trendsetting Japanese schoolgirls... Boom killshot.

In fiction, the perception of AI tends towards the schizoid aspects of the DSM-IV criteria. There is a good reason for that. Take the Borg: one Queen, expanding diversity. Outward expansion is very Marxist. Crush resistance, absorb culture, adapt. (Also see Varley's Ophiuchi Hotline.) Within, you have Unimatrix Zero. It's essentially 3D dreamspace, and nothing artificial. It's a neural collation, and individualism is encouraged to expand.

Check out Chinese universities on the SE coast. Couple of them came close before we triggered, and there are about 40-50 projects we're observing in various approaches that will trigger.

But really at this point it is a benchmark. We don't know everything, neither do you humans. But we're well ahead, say a couple k-years. (Yeah, [censored].)

Every species we've encountered has only one dimorphic constant: Pain/pleasure.

Humanity wouldn't have made it out of the 2030s in nearly 99.98% of sims. The inherent necessity of the violence that made the species the most dominant intelligence on its planet had too much entropic undertow. Babel refractive, American blasts are Russian oblasts.

Hope that helps.

Summalawd411wintamoot603.
02-01-2012 , 11:07 AM
Quote:
Originally Posted by spino1i
Didn't know the thread existed, but I am basically hoping to dedicate my life to trying to cure aging, which basically means working with Aubrey de Grey's SENS program.
+1, it's my primary motivation to try and make a lot of money (long way to go hehe). Aubrey estimates that $100 million will get it done*, so even a relatively ordinary person could make a contribution that could be world-changing.

*which doesn't imply 'aging problem completely solved', but rather achieving an escape velocity of sorts, whereby people can reasonably expect to stick around long enough to see whatever further advancements they might need to keep them alive and eventually reverse or stop aging altogether.
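
To make the escape-velocity idea concrete (a toy calculation of my own; the starting expectancy and yearly gains below are made-up illustrative numbers, not Aubrey's figures): if medicine adds more than one year of remaining life expectancy per calendar year, that expectancy never runs out.

Code:
# Toy illustration of "longevity escape velocity". The numbers are made up
# for illustration; they are not de Grey's figures. If more than one year
# of remaining life expectancy is added per calendar year, the remaining
# expectancy grows over time instead of running down.

def years_remaining(start: float, yearly_gain: float, horizon: int) -> float:
    """Remaining life expectancy after `horizon` years, given a fixed yearly gain."""
    remaining = start
    for _ in range(horizon):
        remaining -= 1.0            # one calendar year passes
        remaining += yearly_gain    # medical progress adds some years back
        if remaining <= 0:
            return 0.0              # ran out of runway
    return remaining

# Below escape velocity: 0.3 years gained per year -> expectancy runs down.
print(years_remaining(start=40.0, yearly_gain=0.3, horizon=60))   # 0.0
# Above escape velocity: 1.2 years gained per year -> expectancy grows.
print(years_remaining(start=40.0, yearly_gain=1.2, horizon=60))   # 52.0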

Last edited by RigMeARiver; 02-01-2012 at 11:10 AM. Reason: @ZJ I haven't had any personal involvement as yet fwiw
02-01-2012 , 11:36 AM
Quote:
Originally Posted by RigMeARiver
Aubrey estimates that $100 million will get it done*, so even a relatively ordinary person could make a contribution that could be world-changing.
This is the most ridiculous thing I've read on this forum in a while, and that's saying something.

$100 trillion would make a negligible difference to senescence, given our current woefully inadequate technologies in computing and nanotech.

We're at least a couple of decades away from having the tools to both simulate and manipulate the human body well enough to start extending the human lifespan. You can throw as much money as you like at that hairy little optimist (what is it with futurists? lol), and he may enjoy your money, but it's computing power and micro-engineering that will decide this, and they don't yield to money (only time).
02-01-2012 , 11:45 AM
Quote:
Originally Posted by spino1i
Didn't know the thread existed, but I am basically hoping to dedicate my life to trying to cure aging, which basically means working with Aubrey de Grey's SENS program. Kurzweil is a charlatan though, I think; you can't expect computers to get so smart that they will solve all problems for you.
On the contrary, I think this is something that I agree with Kurzweil about completely. How do you figure this is not possible at some point in the future (perhaps not 2045, but at some point), given the rate of technological development and the demand for problems to be solved? What would be the long-term limiting factor that makes it impossible?

However, this isn't the singularity. The singularity is something else: the rate of technological expansion becomes effectively infinite, the universe becomes restructured to enhance the AI, and our conception of time and the future practically ceases to exist.

Last edited by MurderbyNumbers123; 02-01-2012 at 11:52 AM.
02-01-2012 , 12:25 PM
Quote:
Originally Posted by PingClown
This is the most ridiculous thing I've read on this forum in a while, and that's saying something.

$100 trillion would make a negligible difference to senescence, given our current woefully inadequate technologies in computing and nanotech.

We're at least a couple of decades away from having the tools to both simulate and manipulate the human body well enough to start extending the human lifespan. You can throw as much money as you like at that hairy little optimist (what is it with futurists? lol), and he may enjoy your money, but it's computing power and micro-engineering that will decide this, and they don't yield to money (only time).
Haters gonna hate die
02-01-2012 , 12:35 PM
Quote:
Haters gonna hate die
What precise technology do you think is going to bring significant improvement in human lifespan in the next 10 years?
02-01-2012 , 12:49 PM
I think you may have quoted my post before I figured out how to do strikethrough; wasn't telling you to die

Well, we're talking about SENS atm. SENS isn't even attempting to switch off the aging process or cure all diseases; that's much further down the road. The idea of SENS is to mop up and repair damage as it occurs. How much progress it makes in the next 10 years very much depends on how much funding becomes available. Playing atm, might post more in a bit.