You are currently browsing the archives for October, 2009.
If the Singularity proponents are right, the world is going to get really weird…soon!
If you’re not quite sure what this Singularity thing is all about, you’re not alone.
In 1965, I. J. Good first wrote of an “intelligence explosion”, suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to a cascade of self-improvements and a sudden surge upward to superintelligence.
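To make the cascade concrete, here is a toy numerical sketch (my own illustration, not Good's model — the 5% coefficient and the 1000x "superintelligence" threshold are arbitrary assumptions). Each round's gain is proportional to the current intelligence level, so improvement that starts out tiny soon snowballs:

```python
def rounds_to_superintelligence(start=1.0, rate=0.05, target=1000.0):
    """Toy model of Good's feedback loop: each design round adds a gain
    proportional to the current level, so smarter systems improve faster.
    All numbers here are arbitrary illustrative assumptions."""
    level, rounds = start, 0
    while level < target:
        level += rate * level * level  # improvement scales with intelligence
        rounds += 1
    return rounds, level

rounds, level = rounds_to_superintelligence()
print(rounds, round(level))
```

With these made-up numbers the first rounds add almost nothing, yet the threshold is crossed within a few dozen rounds — whereas a fixed gain of 0.05 per round would take roughly twenty thousand rounds. That contrast is the "explosion" in Good's intelligence explosion.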
In 1982, Vernor Vinge proposed that the creation of smarter-than-human intelligence represented a breakdown in our ability to model the future, for the same reason that authors cannot write realistic characters much smarter than human: if we knew what smarter-than-human intelligences would do, we would be that smart ourselves. Vinge named this event “the Singularity” in an analogy to how then-current models of physics broke down when they tried to model the gravitational singularity at the center of a black hole. In 1993, Vinge associated the Singularity more explicitly with I. J. Good’s intelligence explosion, and tried to project the arrival time of artificial intelligence using Moore’s Law, which thereafter came to be associated with the “Singularity” concept.
Vinge has said,
“Within thirty years (2023), we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.”
By “superhuman intelligence,” Vinge meant one of four different outcomes:
1) We make machines that are intelligent;
2) Our computer networks “wake up” as an intelligent entity;
3) Human-computer interfaces become so intimate and powerful that the combination forms a superhuman intelligence; or…
4) We bio-engineer ourselves with smarter brains.

In this model, once we have an example of a greater-than-human intelligence, that in turn gives us the means to make even greater intelligence, and so forth. And because the so-much-greater-than-human intelligences will be able to figure out how to do spectacular things, the world that they will make (and remake) will very rapidly become something utterly unrecognizable to mere human minds.
Futurist Ray Kurzweil generalizes “singularity” to apply to the sudden growth of any technology, not just intelligence; he argues that a “singularity” in the sense of sharply accelerating technological change is inevitably implied by a long-term pattern of accelerating change, one that generalizes Moore’s Law to technologies predating the integrated circuit and includes material technology (approaching nanotechnology), medical technology, and others. Aubrey de Grey has coined the term “Methuselarity” for the point when medical technology improves so fast that expected lifespan increases by more than one year per year.
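A back-of-the-envelope sketch of the Methuselarity threshold (the starting numbers below are my assumptions, not de Grey's figures): if medicine adds `gain` years of expected lifespan per calendar year, then each year of aging changes remaining life expectancy by `gain - 1` — below the threshold (`gain < 1`) it shrinks, past it (`gain > 1`) it grows:

```python
def remaining_expectancy(years, gain, start_remaining=50.0):
    """Track remaining life expectancy over `years` calendar years.
    Each year costs 1 year of life, but medical progress adds `gain`.
    The 50-year starting figure is an arbitrary illustrative assumption."""
    remaining = start_remaining
    history = [remaining]
    for _ in range(years):
        remaining += gain - 1.0
        history.append(remaining)
    return history

before = remaining_expectancy(30, gain=0.2)   # slow progress: expectancy falls
after = remaining_expectancy(30, gain=1.2)    # past the Methuselarity: it rises
print(before[-1], after[-1])
```

The point is not the specific numbers but the sign flip: once progress outpaces aging, waiting a year leaves you with *more* expected life than you started with.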
Robin Hanson, taking “singularity” to refer to sharp increases in the exponent of economic growth, lists the agricultural and industrial revolutions as past “singularities”. Extrapolating from such past events, Hanson proposes that the next economic singularity should increase economic growth between 60 and 250 times. An innovation that allowed for the replacement of virtually all human labor could trigger this event.
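To see what multipliers like that mean concretely (the 15-year baseline doubling time below is my assumption, not Hanson's figure): multiplying the growth exponent by a factor k divides the economy's doubling time by k.

```python
current_doubling_years = 15.0  # assumed baseline doubling time of the world economy

for factor in (60, 250):
    months = current_doubling_years * 12 / factor
    print(f"{factor}x faster growth: the economy doubles every {months:.1f} months")
```

Under these assumptions, Hanson's range turns a doubling every decade and a half into a doubling every three months to every three weeks.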
Eliezer Yudkowsky has suggested that many of the different definitions that have been attached to the word “Singularity” are mutually incompatible rather than mutually supporting. For example, Kurzweil extrapolates current technological trajectories past the arrival of self-improving AI or smarter-than-human intelligence, which Yudkowsky argues represents a tension with I. J. Good’s proposed discontinuous upswing in intelligence, and with Vinge’s thesis on unpredictability.
Some prominent technologists such as Bill Joy, co-founder of Sun Microsystems, have voiced concern over the potential dangers of the Singularity.
Some support the design of “friendly artificial intelligence”, meaning that the advances which are already occurring with AI should also include an effort to make AI intrinsically friendly and humane.
Isaac Asimov’s Three Laws of Robotics are among the earliest examples of proposed safety measures for AI:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. *Ok, I could live with that.
2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law. *That doesn’t sound so bad.
3. A robot must protect its own existence as long as such protection does not conflict with either the First or Second Law. *Wait a minute, can it imprison human beings to “protect” human beings from themselves?
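The ordering is the whole point: each law yields to the ones before it. A minimal sketch (my own encoding, not Asimov's) of that priority as a veto chain:

```python
def law_permits(harms_human, disobeys_order, self_destructive,
                obeying_would_harm_human=False,
                self_preservation_would_break_laws=False):
    """Toy encoding of the Three Laws as an ordered veto chain:
    a lower-numbered law always overrides the ones after it.
    The flag names are my own hypothetical framing."""
    if harms_human:
        return False  # First Law: no harm, no exceptions
    if disobeys_order and not obeying_would_harm_human:
        return False  # Second Law: obey, unless obeying breaks the First
    if self_destructive and not self_preservation_would_break_laws:
        return False  # Third Law: self-preserve, unless that breaks First/Second
    return True

# A robot ordered to harm someone may refuse: the First Law outranks the Second.
print(law_permits(harms_human=False, disobeys_order=True,
                  self_destructive=False, obeying_would_harm_human=True))
```

Nothing in this sketch resolves the worry in the aside above — whether "preventing harm through inaction" licenses imprisoning people for their own good depends entirely on how `harms_human` is defined, which is exactly the hard part.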
Superhuman intelligences may have goals inconsistent with human survival and prosperity. AI researcher Hugo de Garis suggests AIs may simply eliminate the human race, and humans would be powerless to stop them.
Berglas (2008) argues that, unlike human intelligence, computer-based intelligence is not tied to any particular body, which would give it a radically different world view. In particular, a software intelligence would essentially be immortal and so have no need to produce independent children that live on after it dies. It would thus have no evolutionary need for love.
Somewhere in the middle are those who take the intelligence explosion concept and use it as the engine for all sorts of ultra-tech fun: brain uploads, “computronium,” endless digital lives lived in virtual worlds, and the like. These folks, unlike the critics above, tend to see the Singularity as something generally desirable. They will, of course, acknowledge the potential for Bad Things to happen, but that’s not the thrust of their arguments. The Singularity as presented in Ray Kurzweil’s books falls into this category.
Despite the presence of the Singularity concept within various (largely online) sub-cultures, it remains on the edges of common discussion. That’s hardly a surprise; the Singularity concept doesn’t sit well with most people’s visions of what tomorrow will hold (it’s the classic “the future is weirder than I expect” scenario).
“Have you ever stood and stared at it, marveled at its beauty, its genius? Billions of people just living out their lives, oblivious. Did you know that the first Matrix was designed to be a perfect human world, where none suffered, where everyone would be happy? It was a disaster. No one would accept the program, entire crops were lost. Some believed we lacked the programming language to describe your perfect world, but I believe that, as a species, human beings define their reality through misery and suffering. The perfect world was a dream that your primitive cerebrum kept trying to wake up from. Which is why the Matrix was redesigned to this, the peak of your civilization. I say your civilization, because as soon as we started thinking for you it really became our civilization, which is of course what this is all about. Evolution, Morpheus, evolution. Like the dinosaur. Look out that window. You had your time. The future is our world, Morpheus. The future is our time.” – Agent Smith
LAST DAY TO PLAY TO WIN!
Countdown to Halloween with a Gommi Arcade “DEF KISS” Halloween T-Shirt!
This is the moment of truth! Every day this week we counted down to Halloween with a series of ghostly questions, and one of you lucky ghouls will win a special edition prize pack from GOMMI ARCADE!
Remember: to be eligible to win, you MUST be a GOMMI ARCADE Facebook Friend or Fan, a GOMMI ARCADE ARMi Group Member, or a GOMMI ARCADE Follower on Twitter!
Last Question: Question #5 – Friday, October 30, 2009:
5. In Star Wars Episode VI: Return Of The Jedi, who is the voice of the infamous Darth Vader?
A. George Lucas
B. Frank Oz
C. James Earl Jones
Post your answers in the Comment Box!
*Winner will be announced on Saturday (Halloween!). Have fun!
**Winner must answer 3 out of 5 correctly!
Inspired by the legendary New York City heavy metal rock band and the groundbreaking rap record label that took Hip Hop from Hollis to Hollywood, Gommi decided to get festive for the Holidays with a “DEF KISS” Halloween T-Shirt!
The shirt features a Heavy Metal Horror Band Starring Gommi Arcadian.
* Special Edition.
* Very Limited Quantities.
* 100% Cotton.
* Size (Men’s): M, L, XL, 2XL, 3XL.
* Available Colors: Black, White.
* Buy Now: $32.00 USD
* FREE Shipping.
There will not be any re-orders or re-stocking of this item. First Come, First Served Only. While Supplies Last.