Reshared post from Singularity Utopia

I left a comment on +Singularity Utopia's post that I'm copying here for archiving purposes. I've been frustrated with the discussions surrounding the Singularity for a long time, and I've found the philosophical and theoretical foundations for the discussion of technology to be significantly lacking. I tend to take it out on SU because they do a good job highlighting the "mainstream" Singularity view, so I mean no disrespect and I'm not trying to troll. I'm talking about these issues because I think they are serious and important.

I am still baffled why anyone thinks the singularity is an "event", or if they do, why they would put it off into the future.

Technological progress is already accelerating faster than our human ability to keep up, and it is already having dramatic and devastating consequences for ourselves and our planet.

We are already surrounded by a variety of intelligent machines, each of which performs tasks that baffle, dazzle, and amaze us, and which few (if any) of us understand completely. Some of these machines are responsible for maintaining critical aspects of human well-being and social practices, and we've become dependent on their operation for our very being.

Although both changes are definitely happening, and at an accelerating pace, I'm not sure what breaking point the Singularity theorists expect to distinguish some future state from the existing ones. If the claim is that there is some qualitative distinction between the pre- and post-Singularity world, I would offer that such changes have already occurred, as part of the Digital Revolution. The Digital Age begins in the late '70s, but doesn't really kick off full blast until the last decade, and particularly with the introduction of Google. The Digital Age is going strong, and shows no signs of stopping, but there's no reason to expect some future qualitative shift that will dramatically change the terms of the game.

The Digital Age is the realization of the cybernetic dreams of cyborg theorists like Donna Haraway, where our machines are "made of sunshine" and everything is ethereal bits in information space. On this story, we are already post-Singularity; we are a few decades into the transition, and we have a long way to go before it is entirely complete. But on this story, there is also no future left to predict; there is just the working out of the implications already available in the present.

For what it's worth, Google is already an artificially intelligent system with intellectual capacities that far outstrip any human brain. It's not perfect, but what intelligence is? However much you respect its abilities, Google is undoubtedly a mechanical mind that speaks to each of us in the slightly broken language of a non-native speaker; its language is meaningful and important, and we all listen to what it has to say. As an unavoidable consequence of the collective task of teaching it to be smarter, our conversations with it have irreversibly altered our very nature, in a massive symbiotic feedback loop characteristic of ecological development.

The point is that this merger, this transition into a new mode of human being, is already happening, and has been happening for a while, and there is absolutely no reason to think that this development will reach some breaking point that fundamentally changes the trajectory we are already on. It is extraordinarily rare for a biological system to hit a point of singularity; I'm not sure there is any precedent for it. The ironic thing about all the Singularity discussion of Moore's law is that a J-curve is entirely continuous: there are no points of discontinuity, no points of singularity, anywhere along the curve.
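The continuity point can be made concrete with a small sketch (the functions, doubling time, and blow-up date below are illustrative assumptions, not anything from the post): an exponential J-curve of the Moore's-law sort is finite and smooth at every point in time, whereas a function with a genuine mathematical singularity, such as hyperbolic growth, diverges at a finite time.

```python
def exponential(t, doubling_time=2.0):
    """Moore's-law-style J-curve growth: finite and continuous for every t.

    The doubling time of 2 (years) is an illustrative assumption."""
    return 2 ** (t / doubling_time)

def hyperbolic(t, t_star=100.0):
    """Growth with a genuine finite-time singularity at t = t_star.

    The blow-up date t_star = 100 is likewise an illustrative assumption."""
    return 1.0 / (t_star - t)

# The exponential stays finite no matter how far out we evaluate it...
assert exponential(1000) < float("inf")

# ...whereas the hyperbolic curve shoots toward infinity as t nears t_star.
values = [hyperbolic(t) for t in (90, 99, 99.9, 99.99)]
assert all(later > earlier for earlier, later in zip(values, values[1:]))
```

The contrast is the whole point: accelerating growth alone never produces a singularity in the mathematical sense; only a curve with a finite-time pole does.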

What biological systems tend to experience instead are periods of rapid growth followed by periods of decline and decay; the biological analogue of a singularity is an extinction event. Unfortunately, just such an event, hitting in the 2030s, is being predicted across scientific fields, as the simultaneous product of economic and environmental collapse. But surely the Singularists aren't predicting an extinction event. Thus I am left perpetually confused as to what exactly they are predicting, and I am certainly convinced that the basic framework for discussing technology has very poor theoretical foundations.

They are predicting a transition that they believe is yet-to-come. In a sense, the Singularists have bought into the fable that was supposed to prepare us for the present day, and have simply not yet realized that the time to apply those lessons has already arrived.

Singularity Utopia originally shared this post:

If you don't know what the Singularity is, here's a good explanation:

Human intelligence evolved slowly. Artificial Intelligence evolves at an astonishingly rapid speed. Natural procreation is a slow method to improve intelligence therefore human generations progress slowly. Human minds evolved over millions of years whereas Artificial Intelligence needs the tiniest fraction of the human evolutionary period to reach then surpass human intellectual capacity. Each year new generations of computers help scientists and technologists to solve problems beyond ordinary human brainpower. When intelligent computers are able to think for themselves they will quickly create newer computers with increasingly bigger intellects; a positive feedback. Our increasing level of expertise allows us to cram more progress into our current year than we did the previous year. Based on the 2001 rate of progress we will see at least 20,000 years of progress between 2001 and 2100. Explosive 2045 is coming.

The site updates are nearly complete, should be finished on the 20th April.
