"The Singularity Is Near" by Ray Kurzweil, How Close?? "The Singularity Is Near" by Ray Kurzweil, How Close??

08-02-2010 , 07:33 PM
Quote:
Originally Posted by durkadurka33
SURVIVORSHIP BIAS

Wow people...come on.

It's the same thing with financial advisors picking mutual funds. Take 1000 futurists and statistically speaking, just from them guessing, we'll expect a few of them to show remarkable success. Then, it will naturally seem as though they're really really talented but it was just luck.
Disagree. First, there are more bankers than futurists. Second, stocks only go up or down, while the set of all possible futures is much larger, so a collection of accurate predictions is much less likely to be statistical blind luck.
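The survivorship-bias point is easy to check with a quick Monte Carlo sketch. All the numbers below are illustrative assumptions, not data about actual forecasters: give 1000 coin-flipping forecasters ten binary calls each and count how many end up looking "remarkable" by luck alone.

```python
import random

random.seed(0)

N_FORECASTERS = 1000   # hypothetical pool of futurists
N_PREDICTIONS = 10     # binary yes/no calls each
THRESHOLD = 9          # a "remarkable" track record: 9+ correct

# Every forecaster guesses at random, so each call is right
# with probability 1/2; no skill anywhere in this model.
lucky = sum(
    1 for _ in range(N_FORECASTERS)
    if sum(random.random() < 0.5 for _ in range(N_PREDICTIONS)) >= THRESHOLD
)
print(lucky)  # expect roughly 1000 * 11/1024, i.e. about a dozen, by chance
```

Whether that pure-luck baseline also explains a futurist's record depends on how large the space of possible predictions is, which is exactly the disagreement in this exchange.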
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 07:33 PM
Quote:
Originally Posted by ZeeJustin
This is not a good argument. It's not like mutual funds where you have millions of scam emails increasing the sample size of "guesses" to ridiculous proportions.

There are only so many people in the world well versed in information technologies. Out of those, only so many are actually qualified and intelligent enough to make real predictions, and out of those, only so many do make predictions.

Kurzweil is one of few extremely qualified, extremely intelligent people, and he's been at the forefront of this game for many, many rounds of guesses. He has proven himself as a brilliant inventor decades ago, has earned tons of awards, and has earned the respect of some of the smartest, most qualified people in the world, ranging from Bill Gates to Bill Clinton.

People make arguments about not being able to extrapolate Moore's Law, but if you read the Singularity is near, there is a ridiculous amount of discussion about this. I'm not well versed in this area, so forgive me in advance, but he goes into detail discussing all the possible improvements we can have to our current "flat" computer chips, as there is indeed a limited amount of improvement in those specifically. He talks about more 3d designs, the possible different materials we can use, and even the potential of quantum computing.

The truth is that Kurzweil makes predictions that span a wide array of fields, and the experts in those fields generally agree with him (or possibly fed him the predictions in the first place if you want to be cynical).

If you were to research Aubrey De Gray and read his books, you would understand that Kurzweil isn't just guessing, and that all his data is verifiable.

The people that disagree with Kurzweil are guys like Mitch Kapor of Lotus Software (have you heard of it? I hadn't), and a whole bunch of authors.

The guys that agree with Kurzweil are world leaders in their fields, like the aforementioned Bill Gates.

Maybe I'm biased because of all the TED videos I watch, but I can't tell you how many times I've watched a TED video and gone, "Wow, Kurzweil predicted this back in the mid-'90s, and now we basically have proof that it's currently possible in early stages, and will be on the market within a decade".
There is no reason to treat all Kurzweil predictions as the same. The prior probabilities of the things he successfully predicted (before we found out he predicted them) were much, much higher than the singularity stuff. Or put another way: because of Kurzweil's prediction maybe you raise the probability of it happening from 10^-50 to 10^-20, but I see no reason to go all the way to 10^-1.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 08:34 PM
You singularity guys just don't get it. Repeating over and over and over how Kurzweil is a genius, and a great inventor, doesn't prove his theory is a fact.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 08:39 PM
Quote:
Originally Posted by Max Raker
There is no reason to treat all Kurzweil predictions as the same. The prior probabilities of the things he successfully predicted (before we found out he predicted them) were much, much higher than the singularity stuff. Or put another way: because of Kurzweil's prediction maybe you raise the probability of it happening from 10^-50 to 10^-20, but I see no reason to go all the way to 10^-1.
The reason he's making the prediction about the singularity is because he thinks it's going to happen, just like all his other predictions. YOUR prediction might be that the singularity is very unlikely to happen, but the prediction of Kurzweil, well known for having accurate predictions, is that it is very likely to happen. You cannot simply go back to his predictions that turned out to be correct and say "well, those things were very likely to happen, it's obvious" (and then conclude that he isn't so great at making predictions since they were easy to predict) - at the time it wasn't obvious...
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 08:51 PM
Quote:
Originally Posted by Karganeth
The reason he's making the prediction about the singularity is because he thinks it's going to happen, just like all his other predictions. YOUR prediction might be that the singularity is very unlikely to happen, but the prediction of Kurzweil, well known for having accurate predictions, is that it is very likely to happen. You cannot simply go back to his predictions that turned out to be correct and say "well, those things were very likely to happen, it's obvious" (and then conclude that he isn't so great at making predictions since they were easy to predict) - at the time it wasn't obvious...
Wat? Of course you can do what I said. Nobody would have put the odds of anything Kurzweil has successfully predicted at anywhere near, say, 10^-20, which a lot of reasonable people put for the odds of the singularity happening in 2040 or whatever. That is hugely important, unless you just want to turn Kurzweil into a demigod that we should always think is correct no matter what.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 08:54 PM
Peter Schiff! Peter Schiff!
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:06 PM
Quote:
Originally Posted by Max Raker
Wat? Of course you can do what I said. Nobody would have put the odds of anything Kurzweil has successfully predicted at anywhere near, say, 10^-20, which a lot of reasonable people put for the odds of the singularity happening in 2040 or whatever.
You really think that no one said that some of Kurzweil's predictions would never happen (such as a chess AI beating the top human players)?...
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:11 PM
Quote:
Originally Posted by Karganeth
You really think that no one said that some of Kurzweil's predictions would never happen (such as a chess AI beating the top human players)?...
We are talking about "never" having totally different meanings. I will "never" sleep with Megan Fox but that is still 10000000 times more likely than other things like the singularity. And nobody who knew anything would say that computers being better at chess than all humans in the 80s was the same odds as the singularity now.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:30 PM
Quote:
Originally Posted by Max Raker
And nobody who knew anything would say that computers being better at chess than all humans in the 80s was the same odds as the singularity now.
So what you're saying is that Kurzweil knows nothing (since he would say they are roughly the same odds)? If he doesn't qualify as knowing anything, who does?...
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:36 PM
Quote:
Originally Posted by Karganeth
So what you're saying is that Kurzweil knows nothing (since he would say they are roughly the same odds)? If he doesn't qualify as knowing anything, who does?...
No, I am saying that if Kurzweil says something totally ridiculous that is very, very unlikely to happen, it doesn't magically become a coin flip. If Kurzweil ever said those things were the same odds, he's a moron. Paging David Sklansky..... this sort of stuff seems right in his wheelhouse.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:41 PM
bayesians itt
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 09:58 PM
You have to remember that nature developed the brain with the extremely crude mechanism of evolution. Maybe you could build better thinking devices by really planning them: more self-aware, conscious, dynamic, communicative, peaceful, not to mention smarter and with better memory. And with somewhat smaller egos.

Last edited by plaaynde; 08-02-2010 at 10:07 PM.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 10:11 PM
He claims 109 (or something) predictions about 2009. Does anybody have a link to these in one place? His own account of his success rate would embarrass even Baghdad Bob, so I'm curious about giving it a less ridiculous grading and seeing how he stands up.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 10:17 PM
Quote:
Originally Posted by Max Raker
No, I am saying that if Kurzweil says something totally ridiculous that is very, very unlikely to happen, it doesn't magically become a coin flip.
Probability is all about the information that's available to you. The more information you have, the closer you can get to its actual probability. For example, if I had all the information, I would know with certainty whether the coin I'm about to flip will land on heads or tails (in a deterministic universe). I'm saying this because you seem to think you know the 'true' probability of the singularity happening (that it is very unlikely). You don't know. No one does. Recognise that it is simply your own prediction that it's very unlikely.

After buying a lottery ticket and believing you have a 1 in 10 million chance of winning, you look at the results of the draw and all the numbers match. You now believe that the chance you have won is just under 1 in 1. Dramatic changes in estimates of something happening can happen. If I didn't know much about computers and technology, I would think that AIs will definitely not be smarter than humans within 50 years - I would put it at less than 1 in a trillion. But then if I learned that a world-class inventor, someone awarded a medal by the president and someone known for the accuracy of his predictions, said that it is very likely to happen, my estimate of how likely it is would change dramatically. It would change a lot because I'd believe that someone who knows a lot about predicting future trends in computing would be more accurate than me, so I would use his prediction instead of sticking with my old one. It's not magic.
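The kind of updating being argued about here can be made concrete with Bayes' rule in odds form. A minimal sketch, where the prior odds and the likelihood ratio assigned to the expert's testimony are both made-up illustrative values, not anyone's actual estimates:

```python
# Odds-form Bayes update: how much an expert's claim should move you
# depends entirely on the likelihood ratio you assign their testimony.
# All numbers below are illustrative assumptions, not measurements.

def update_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds * P(evidence | true) / P(evidence | false)."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    """Convert odds to a probability."""
    return odds / (1 + odds)

prior = 1e-12    # assumed: a skeptic's prior odds on the singularity
lr = 1e3         # assumed: the expert is 1000x more likely to assert it if true
posterior = update_odds(prior, lr)
print(odds_to_prob(posterior))  # about 1e-9
```

On these assumed numbers, a strong forecaster's testimony moves the estimate by three orders of magnitude yet still leaves it nowhere near a coin flip, which is compatible with both sides of this exchange: big updates are legitimate, but testimony alone doesn't get you to 50/50.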
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 10:19 PM
Quote:
Originally Posted by plaaynde
You have to remember that nature developed the brain with the extremely crude mechanism of evolution. Maybe you could build better thinking devices by really planning them: more self-aware, conscious, dynamic, communicative, peaceful, not to mention smarter and with better memory. And with somewhat smaller egos.
The biggest thing is speed and consolidation of computing power. Evolution produced "thinking" beings that are very very bad at sharing information even compared to what we already have been able to build.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 10:26 PM
Quote:
Originally Posted by Karganeth
Probability is all about the information that's available to you. The more information you have, the closer you can get to its actual probability. For example, if I had all the information, I would know with certainty whether the coin I'm about to flip will land on heads or tails (in a deterministic universe). I'm saying this because you seem to think you know the 'true' probability of the singularity happening (that it is very unlikely). You don't know. No one does. Recognise that it is simply your own prediction that it's very unlikely.

After buying a lottery ticket and believing you have a 1 in 10 million chance of winning, you look at the results of the draw and all the numbers match. You now believe that the chance you have won is just under 1 in 1. Dramatic changes in estimates of something happening can happen. If I didn't know much about computers and technology, I would think that AIs will definitely not be smarter than humans within 50 years - I would put it at less than 1 in a trillion. But then if I learned that a world-class inventor, someone awarded a medal by the president and someone known for the accuracy of his predictions, said that it is very likely to happen, my estimate of how likely it is would change dramatically. It would change a lot because I'd believe that someone who knows a lot about predicting future trends in computing would be more accurate than me, so I would use his prediction instead of sticking with my old one. It's not magic.
Nobody has given any reasons why the singularity is remotely possible other than "cuz Kurzweil said". If that is all it takes for you to ramp a probability up from 0 to 1, obv people are going to laugh at you. If you think the idea has merits independent of Kurzweil, you can argue them. But if not, you have just elevated him to a status on par with cult leaders while making some pretty basic probability errors to boot.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-02-2010 , 10:36 PM
Quote:
Originally Posted by plaaynde
You have to remember that nature developed the brain with the extremely crude mechanism of evolution. Maybe you could build better thinking devices by really planning them: more self-aware, conscious, dynamic, communicative, peaceful, not to mention smarter and with better memory. And with somewhat smaller egos.
Edit: and more artistic, musical, feeling
and more compassionate, tolerant
and humorous

Scary in a way, but not necessarily, if you work on finding the best and most balanced mixes.

Last edited by plaaynde; 08-02-2010 at 10:56 PM.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 04:55 AM
Quote:
Originally Posted by Max Raker
Nobody has given any reasons why the singularity is remotely possible other than "cuz Kurzweil said". If that is all it takes for you to ramp a probability up from 0 to 1, obv people are going to laugh at you. If you think the idea has merits independent of Kurzweil, you can argue them. But if not, you have just elevated him to a status on par with cult leaders while making some pretty basic probability errors to boot.
Max, have you read the book? There's clearly some odd stuff in there, and people are all a bit cult-like and uncritical about him, but it's really good, and really well argued.

The argument seems to be mainly this.

1. Computing power is growing exponentially. (Moore's Law)
  • This has happened through several computing paradigms, not just silicon chips.
  • The most pessimistic outlook has silicon chips carrying on at this rate for 15 years or so using the current tech paradigm.
  • There are numerous plausible technologies in the pipeline which could carry this further: 3D chips, quantum computing, reversible computing, carbon nanotubes.
  • The % of global resources going into computing chips is also going up.
  • The number of people with advanced technical degrees is going up. (China/India)

2. If there is a limit to computational power, we are nowhere close to it.

RK goes into depth on this, and I couldn't critically evaluate much of it due to lack of knowledge in the area, but he presents a good case for why we could theoretically build computers tens of orders of magnitude more powerful than we have now.

3. The brain is a physical process which carries out computation or something similar to it.

4. Computers can in theory, with enough grunt, do what the brain does.

5. We can estimate, very roughly to within 3-4 orders of magnitude, the level of computational power in the brain.


Conclusion A. Based on the above, at some point we will be able to create computers more powerful, and therefore smarter, than a human brain. The date depends on the estimate from 5, but due to the exponential rate of tech growth, not by much.

Conclusion B. If this is the case, by definition, they will be better at designing computers than we will and will go from there.

I'm still not 100% convinced by the two conclusions, but I think his broader point about people underestimating how different the future will be, and how difficult it is for people to get their heads around exponential growth, is pretty solid.
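The timing claim in Conclusion A, that uncertainty in point 5 doesn't move the date much, can be sketched numerically. Assuming an 18-month doubling time, a 2010 baseline of ~1e13 ops/sec, and a brain estimate somewhere between 1e15 and 1e19 ops/sec (all assumed figures for illustration, not Kurzweil's exact numbers):

```python
import math

# Rough sketch of the timing argument: if compute doubles every
# ~18 months, when does it cross an assumed brain-scale estimate?
DOUBLING_YEARS = 1.5     # assumed doubling time
START_YEAR = 2010
START_OPS = 1e13         # assumed machine-scale ops/sec in 2010

def crossing_year(target_ops):
    """Year at which exponential growth from the baseline reaches target_ops."""
    doublings = math.log2(target_ops / START_OPS)
    return START_YEAR + doublings * DOUBLING_YEARS

# Span 4 orders of magnitude of uncertainty in the brain estimate.
for estimate in (1e15, 1e16, 1e18, 1e19):
    print(estimate, round(crossing_year(estimate), 1))
# A 10,000x uncertainty in the brain estimate shifts the crossing
# date by only about 20 years under sustained exponential growth.
```

This is the sense in which the conclusion is "not much" dependent on the estimate: errors that are huge in linear terms are small in doubling terms.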
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 05:38 AM
There are all kinds of background assumptions there about the inevitability of technological progress.

But let's just assume that there are no geopolitical or environmental or ideological disruptions in the next 50 years and Moore's law continues unabated, and that there is no intrinsic limit to the geometric growth of computational power.

There's still this whole issue of jumping from faster and faster computers to computers that have the properties of brains and are self-conscious, self-willing, and perhaps most importantly have the ability to autonomously create norms for themselves (i.e. to create their own fundamental values, which is either extremely rare or impossible for humans depending on your particular philosophical or religious beliefs).

I don't understand why this leap should follow inevitably from an increase in complexity or computational ability. And without the leap I don't see how the singularity happens - computers designing computers with ever greater efficiency doesn't seem to have any intrinsically dire consequences to me. Stuff will just get faster, within the physical limits of the materials available to us.

You could respond by arguing that this very leap from computing machine to autonomous thinking machine is exactly what happened to man. But there is so much we don't know about the functioning of the brain that it may easily turn out that the analogy of the human brain to a semiconductor-based computer is fatally flawed. Perhaps there are properties of the materials involved or of their organization, perhaps the emergence of consciousness under evolutionary pressures is somehow not replicable by intelligent design. I have nfi.



And one short note: this idea that because of geometric technological advancement the future is "becoming unpredictable" seems like a fallacy to me.

The future has *always* been unpredictable.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 05:43 AM
Quote:
Originally Posted by JammyDodga
but I think his broader point about people underestimating how different the future will be,

I agree with this, but it has likely been the case for all of human history.

Also, for what it's worth, futurists and science fiction writers, though they tend to miss the unforeseeable major technological breakthroughs of the future, also tend to extrapolate far too aggressively from the known technological breakthroughs of their recent past. (For example, consult any '50s sci-fi writer on the state of space travel by the year 2010.)

Kurzweil feels like he's operating in this vein to me.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 06:18 AM
Jammy's post and Micturition Man's response to it are both very good posts.

IMO Micturition Man brings up the only reasonable argument against the singularity.


I always think of the brain as a whole bunch of chemical reactions and data processing. With the way I perceive the brain, it just seems natural that it can all be imitated by a powerful computer program. If this isn't the case, and there's something special about consciousness that can't be replicated, then the singularity won't happen as predicted (or at the very least will take MUCH longer to happen).



At this point, it's pretty much universally accepted that, in the near future, computers will be able to process information as quickly and efficiently as the human brain. 3-4 years after this, computers will be an order of magnitude better than humans at this.

The real question is, can we take this computing power, and turn it into intelligence? If the answer to that question is yes, then how can you possibly think the singularity won't happen? (of course, many people will say no, and I don't really have a reasonable argument against that view)


Personally, I think we will be able to reverse engineer the human brain in the next 30 years. I also think this will likely be a very inefficient way of creating computer intelligence, and we'll find something better and faster before we are capable of reverse engineering the human brain.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 06:29 AM
Quote:
Originally Posted by JammyDodga

3. The brain is a physical process which carries out computation or something similar to it.

4. Computers can in theory, with enough grunt, do what the brain does.

5. We can estimate, very roughly to within 3-4 orders of magnitude, the level of computational power in the brain.

Many people quibble with the assumption that a giant Turing machine can simulate the brain's processes. Penrose is an obvious example of someone who has eloquently objected to this, through the argument that humans intelligently doing maths do not blindly follow algorithms.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 06:46 AM
Quote:
Originally Posted by river_tilt
Many peopel quibble with the assumption that a giant Turing machine can simulate the brain's processes. Penrose is an obvious example of someone who has eloquently objected to this, through the argument that humans intelligently doing maths do not blindly follow algorithms.
True, RK gets into this in some depth. You can model neural networks with a digital computer so that your calculations don't blindly follow algorithms, and even if chemical, not Turing, elements are required, there's no reason we can't incorporate them.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 08:19 AM
Quote:
Originally Posted by JammyDodga
3. The brain is a physical process which carries out computation or something similar to it.

4. Computers can in theory, with enough grunt, do what the brain does.
4 is the huge leap here. And even with 3's "something similar to it" (computation), 4 becomes an even bigger leap.

This assumes the reductionist, materialist approach to the brain, and says that the brain is nothing but an organic computer. There's so much that we don't know about the brain; it boggles me when people claim that computers will eventually do what the human brain can do.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote
08-03-2010 , 08:22 AM
Quote:
Originally Posted by ZeeJustin
At this point, it's pretty much universally accepted that, in the near future, computers will be able to process information as quickly and efficiently as the human brain. 3-4 years after this, computers will be an order of magnitude better than humans at this.
It already can do some of it faster. I wish I could retrieve my long-term memories the same way I can sift through a hard drive, or do 1,000 decimal calculations in fractions of a second.
"The Singularity Is Near" by Ray Kurzweil, How Close?? Quote

      
m