"The Singularity Is Near" by Ray Kurzweil, How Close?? "The Singularity Is Near" by Ray Kurzweil, How Close??

08-19-2010 , 07:12 PM
Plancer,

Great post. That had to take a lot of time.

The parts relevant to me

Quote:
[*]Kurzweil's predictions had a different probability of occurring than his prediction of the singularity (Max Raker, MM).
This was only a meta-argument against people saying that because Kurzweil believes in the singularity and has a good track record, we should believe in it too. That is obv not a very good argument to begin with, but it was made.

Quote:
[*]Computational complexity will be a relevant hurdle to arriving at the singularity (Max Raker)
Quote:
(I disagree)
I don't see how anyone could disagree with that. Computational complexity provides rigorously defined benchmarks for what computers can do now and, more importantly, for what is even possible for computers, people, or any combination of the two to ever do. Even if you think the hurdles are irrelevant, computational complexity gives us limits on what even the singularity can do.
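To put a number on the kind of limit being talked about, here's a toy sketch (my own illustration, not from this thread): brute-force search over subsets takes 2^n steps in the worst case, so every extra item doubles the work, and no hardware speedup changes that growth rate.

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Try every subset of nums; the worst case examines all 2**len(nums) of them."""
    checks = 0
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            checks += 1
            if sum(combo) == target:
                return True, checks
    return False, checks

# No subset of these five numbers sums to 100, so every
# subset gets examined: 2**5 = 32 checks.
found, checks = subset_sum_bruteforce([1, 2, 4, 8, 16], 100)
print(found, checks)  # False 32
```

At n = 5 that's 32 checks; at n = 100 it's about 1.3e30, which no plausible speedup covers. That exponential wall is the flavor of limit complexity theory makes rigorous.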
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-19-2010 , 07:45 PM
Quote:
Originally Posted by Plancer
Sorry, that's my mistake - didn't realize you were silent regarding the singularity.

I do not think the evidence you gave supports the claim of intellectual dishonesty. You pointed to Kurzweil's response to criticism:
The defense that I gave in post 368. That's creationist-level nonsense.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-19-2010 , 08:39 PM
Quote:
Originally Posted by Plancer
I wrote Cliff notes just for you, but apparently those are too long as well. So here are Cliff's Cliffs.

PZ Myers = bag of ****
Searle's argument = applied to a drastically different case
Survivorship bias = doesn't apply when trials are unlikely
Everything else = ad hominem
Well thanks for the summary of the summary.

PZ Myers is just amusement.

I don't know Searle.

Survivorship bias may or may not have merit here, and I do think poker players are inclined to fail to recognize it, but since I have pretty much dismissed the ethos-based arguments from the start, I don't really care.

Lumping everything else as ad hominem is disingenuous. I don't care about Kurzweil as a person. Any ridicule I typed was for his ideas.

I just think that taking his predictions more seriously than "the DOW will hit 15,000 by 2012!" is not supported by evidence.

Could it happen?

Sure!

So what?

I believe Strong AI may or may not be possible. If someone says, "development of Strong AI is possible in the next 50 years!", my answer is, "yep". If someone says, "development of Strong AI is inevitable in the next 50 years!", my answer is, "pfffffft!".
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-19-2010 , 09:08 PM
Quote:
Originally Posted by jb9
I believe Strong AI may or may not be possible. If someone says, "development of Strong AI is possible in the next 50 years!", my answer is, "yep". If someone says, "development of Strong AI is inevitable in the next 50 years!", my answer is, "pfffffft!".
This is a balanced view imo. Of course, some problems could arise along the road, delaying progress. We might have to wait until the function of the brain is described in detail, down to the 'Heisenberg scale'. My personal estimation of the odds is maybe 95-98% in favor of having 'Strong AI' 50 years from now, and accordingly 2-5% for not having it. This takes into account possible political restrictions too (the probably upcoming 'BAN AI' movement ).

Last edited by plaaynde; 08-19-2010 at 09:14 PM.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-19-2010 , 11:49 PM
Then again, consider how confident computer scientists were with machine translation 50 years ago. In this particular onion, there are problems behind every problem.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 12:45 AM
To those who think so, is Kurzweil a conspicuously bad futurist or are there no good futurists like there are no good astrologers?
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 05:05 AM
Quote:
Originally Posted by jb9
There is a lot of room above astrology where one can safely mock.
Any room above that?
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 10:02 AM
Quote:
Originally Posted by smrk
To those who think so, is Kurzweil a conspicuously bad futurist or are there no good futurists like there are no good astrologers?
The latter.

Quote:
Originally Posted by smrk
Any room above that?
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 10:56 AM
The only thing worse than a futurist is an economist.

Why? Because people actually suffer when the latter is wrong (which is basically always).
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 01:18 PM
Quote:
Originally Posted by simplicitus
"Ray Kurzweil does not understand the brain

There he goes again, making up nonsense and making ridiculous claims that have no relationship to reality. Ray Kurzweil must be able to spin out a good line of bafflegab, because he seems to have the tech media convinced that he's a genius, when he's actually just another Deepak Chopra for the computer science cognoscenti."

http://scienceblogs.com/pharyngula/2...t_understa.php
Ray Kurzweil Responds to “Ray Kurzweil does not understand the brain”
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 02:04 PM
A lot of it is BSing around the topic, but a few quotes caught my attention:

Quote:
We have sufficiently high-resolution in-vivo brain scanners now that we can see how our brain creates our thoughts and see our thoughts create our brain.
Really?

Quote:
The goal of reverse-engineering the brain is the same as for any other biological or nonbiological system – to understand its principles of operation. We can then implement these methods using other substrates other than a biochemical system that sends messages at speeds that are a million times slower than contemporary electronics. The goal of engineering is to leverage and focus the powers of principles of operation that are understood, just as we have leveraged the power of Bernoulli’s principle to create the entire world of aviation.
This is the kind of stuff that makes me roll my eyes. The 1st and 3rd sentences and the second half of the 2nd sentence contribute nothing to his argument. The bolded portion is the only non-duh thing he is saying here, but he offers no reason to believe it.

And he can't offer a reason to believe it because, as he correctly notes, we do not fully understand the principles of how the brain operates, therefore we cannot be sure that we can replicate its functioning in "other substrates".
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 05:11 PM
I'm not qualified to discuss the science part of this with you guys, but since this is also a philosophy forum, I put it to you: in what way is it not pragmatic to invest hope in the likelihood of radical advances happening in our lifetimes? Personally I'm more interested in radical life extension than the singularity per se (though of course AI might be able to solve some of our problems in that area and so would be welcomed) since if we have indefinite time then the technology will inevitably come. Speaking for myself, if death is a certainty then nihilism is the only option, and believe me, nihilism is no fun. Take that certainty away (even just via cryonics) and life gains infinitely more meaning.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 05:43 PM
When the opportunity cost of hoping for an unlikely future outweighs the prospective gains.

That was hard.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 05:45 PM
Sorry, there's an opportunity cost of holding something inside one's brain?
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 05:50 PM
If it replaces focus on something more productive, yes. Attention is a scarce resource; long-term room for information is not (there is evidence that there's no upper bound on memory storage... at least for feasible levels of acquisition).
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 11:13 PM
Quote:
Originally Posted by jb9
The latter.

I don't have any set views, so I'm not being all Socratic to make any point, but can you say what the common problem is with futurists? What about the nature of making certain claims (edit: a certain class of claims) about the future makes it impossible for any futurist to escape your threshold of loldom?

Last edited by smrk; 08-20-2010 at 11:23 PM.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-20-2010 , 11:35 PM
For me it's that it's nearly impossible to demonstrate that a successful one isn't the result of survivorship bias.

Personally, I'd prefer to place more focus on solutions to problems that exist NOW. Neat solutions to famine or drought that we can implement for cheap now...if only more people knew of them. That would be a wonderful thing to write about.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 12:18 AM
With Kurzweil, it isn't that many of his predictions for the near term (1999 predicting 2009 and 2019, 2019 from here http://en.wikipedia.org/wiki/Predict..._Kurzweil#2019) are technologically absurd- if aliens came down and said we had to get some specific prediction done or die, we could get most of them done. It's that he has either absolutely no understanding of how the world works, and its forces of inertia, or he has a hopelessly naive utopian fantasy where economic scarcity won't exist and everybody will just sit around inventing the coolest **** possible, want to implement it all for everybody, and do it for no economic cost. He just has laundry lists of predictions that could be (or could have been) done but won't be/weren't done for non-technological reasons. And then instead of admitting his mistakes and learning from them, we'll hear how baby steps in his direction meant he was really correct all along.

It takes a wide range of skills to be a good futurist, and Kurzweil is clearly lacking in some areas. And there's a strong incentive for anybody who really is that good to just STFU and spend his effort getting filthy rich instead of giving it all away in a book.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 12:23 AM
All I know is when the singularity DOES come, TomCowley will be the first organic to be put to the wall.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 12:30 AM
Nah, I'll be a guinea pig for the genome-specific killer nanobots before then.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 12:35 AM
Come to think of it, the fact that you're even posting can be taken as proof that time travel is impossible.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 12:57 AM
TomCowley will end up in a canteen playing chess with a glass of bad gin in his hand, loving Big Brother.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 01:52 AM
Quote:
Originally Posted by TomCowley
With Kurzweil, it isn't that many of his predictions for the near term (1999 predicting 2009 and 2019, 2019 from here http://en.wikipedia.org/wiki/Predict..._Kurzweil#2019) are technologically absurd- if aliens came down and said we had to get some specific prediction done or die, we could get most of them done. It's that he has either absolutely no understanding of how the world works, and its forces of inertia, or he has a hopelessly naive utopian fantasy where economic scarcity won't exist and everybody will just sit around inventing the coolest **** possible, want to implement it all for everybody, and do it for no economic cost. He just has laundry lists of predictions that could be (or could have been) done but won't be/weren't done for non-technological reasons. And then instead of admitting his mistakes and learning from them, we'll hear how baby steps in his direction meant he was really correct all along.

It takes a wide range of skills to be a good futurist, and Kurzweil is clearly lacking in some areas. And there's a strong incentive for anybody who really is that good to just STFU and spend his effort getting filthy rich instead of giving it all away in a book.
I'm pretty sure he's already filthy rich by most metrics of filthiness. I'm pretty sure he's more worried about working to make these predictions come true so he can live forever to enjoy it than about getting a tad more filthy.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 05:03 AM
Quote:
Originally Posted by RigMeARiver
Sorry, there's an opportunity cost of holding something inside one's brain?
There's a recent essay by Paul Graham which makes a decent argument (imo) that there is. I found it interesting, so I thought it might be worth sharing.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-21-2010 , 05:40 AM
Quote:
Originally Posted by -moe-
There's a recent essay by Paul Graham which makes a decent argument (imo) that there is. I found it interesting, so I thought it might be worth sharing.
Heh, well the money matters thing is kinda true in my case since I usually think about poker in the shower. I'm ok with that though. Anyway, it misses the point, because a belief system (I don't 'believe' anything, but there isn't really a better word) isn't something you think about much, if ever, other than during conversation. It rather modifies the backdrop of one's thoughts. I suppose what I'm really interested to know is how thoughtful people can accept the certainty that they will die and that humanity will remain in kindergarten, and not be thoroughly depressed by it. But that's getting off the topic of this thread. I would posit that for scientists the opportunity cost argument is valid and skepticism takes precedence, while the rest of us function best via pragmatic optimism.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
