Re: [unrev-II] Hofstadter's Saturday, April 1st, Special: WILL SPIRITUAL ROBOTS REPLACE HUMANITY BY 2100?

From: Paul Fernhout
Date: Thu Mar 30 2000 - 08:17:01 PST


    Henry van Eyken wrote:

    > The idea of downloading brains has been around a long time. One of the places
    > they were dreaming about this sort of thing was MIT.
    > Computer modelling of human thought processes, even notions of free will (oh, how
    > brave to speak of the free!) is one thing. (Ref. Johnson-Laird, The Computer and
    > the Mind, 1988!) Direct electronic communication with loci in the brain is
    > another. But down/uploading "whole brains," ... humbug.

    I agree but for slightly different reasons.

    Here is something I posted on Slashdot about this:

    Many of Kurzweil's points are similar to Bill Joy's. The difference is
    the conclusion. Kurzweil has a rosy view that we will be able to
    download ourselves into the network. If you find that implausible,
    then his book describes an extinction scenario much like the one in
    Bill Joy's comments.

    In my opinion, Kurzweil's analysis of the evolutionary dynamics of a
    world-wide web of downloaded humans is flawed because it ignores
    fundamental aspects of ecology and evolution. Specifically, there are
    two problems with his conclusion:
    a) it assumes humans in a different environment will still act human,
    with classical human motivations (as opposed to dissolving into an
    unrecognizable set of bits or simply locking into a pleasure loop),
    even though environment to a large extent elicits behavior, and
    b) it ignores evolution and its implications in the digital realm
    (especially the enhanced pace of evolution in such a network and the
    implications for survival).
    Of these, the most important is (b).

    Evolution is a powerful process. Humans have evolved to fit a niche in
    the world -- given a certain environment which includes a 3D reality and
    various other organisms (including humans). Humans have immune
    systems (both mental and physical) capable of dealing with common
    intellectual and organismal pathogenic threats in their environment.
    There is no easy way to translate this into success in a digital
    environment, because the digital environment will imply different
    rewards and punishments for various behaviors, and will evolve
    predators and parasites to which these immune systems have never
    been exposed.
    Human style intelligence is valuable in a human context for many reasons
    -- but sophisticated intelligence is not necessarily a key survival
    feature in other niches (say, smaller ones the size of roaches, hydra or
    bacteria). In short, the human way of thinking will be inadequate for
    survival in the digital realm. Even augmented minds that are connected
    to the network will face these threats and likely be unable to survive
    them. Kurzweil discusses the importance of anti-viral precautions in his
    book, but I think he is rosily optimistic about this particular aspect.

    At best, one might in the short term construct digital environments for
    digital humans to live in, and defend these environments. However, both
    digitized human minds and immensely larger digitized human worlds will
    be huge compared to the smallest amount of code that can be
    self-replicating. These digital "bacteria" will consume these digital human
    minds and worlds because the human minds and worlds will be constructed,
    not evolved. Human minds will be at a competitive disadvantage with
    smaller, quicker replicating code. Nor will there be any likelihood of a
    meaningful merger of human mind with these evolved and continually
    evolving patterns.
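    The competitive disadvantage described above can be sketched as a toy
    population model (all numbers here are hypothetical, chosen only for
    illustration): two self-replicating patterns compete for a fixed pool
    of memory cells, and copies per round are proportional to 1/size.

```python
# Toy population model (all parameters hypothetical): a huge
# constructed "mind" pattern and a tiny "digital bacterium" compete
# for a fixed pool of memory cells. Replication rate is proportional
# to 1/size, so the small replicator overruns the pool.

POOL = 1_000_000       # total memory cells available
MIND_SIZE = 100_000    # cells occupied by one digitized-mind pattern
GERM_SIZE = 100        # cells occupied by one minimal replicator
SPEED = 1_000          # copying-bandwidth factor (assumption)

def germ_share_after(rounds: int) -> float:
    """Fraction of the pool held by the small replicators after
    `rounds`, culling both populations proportionally whenever the
    pool overflows (indiscriminate loss of copies)."""
    minds, germs = 1.0, 1.0
    for _ in range(rounds):
        minds += minds * SPEED / MIND_SIZE   # slow: big footprint
        germs += germs * SPEED / GERM_SIZE   # fast: tiny footprint
        used = minds * MIND_SIZE + germs * GERM_SIZE
        if used > POOL:                      # pool full: cull both
            minds *= POOL / used
            germs *= POOL / used
    return germs * GERM_SIZE / POOL

print(germ_share_after(5))
```

    Within a handful of rounds the small replicators hold almost the
    entire pool, even though both populations started with one copy each.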

    I could endlessly elaborate on this theme, but in short -- I find it
    highly unlikely that any mind designed to work well in meatspace will be
    optimal for cyberspace. It will be overwhelmed and quickly passed by in
    an evolutionary sense (and consumed for space and runtime). It is likely
    this will happen within years of digitization (but possibly minutes or
    hours or seconds). As an example experiment, create large programs
    (>10K) in Ray's Tierra and see how long they last! ra/tierra.html
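    A back-of-envelope version of that Tierra experiment (the slice size
    and copying cost below are assumptions for illustration, not Tierra's
    actual parameters): if every creature gets an equal CPU slice per
    scheduler pass and copying costs roughly one instruction per byte,
    replication time grows linearly with genome size.

```python
# Back-of-envelope sketch (parameters assumed, not taken from Tierra):
# equal CPU slices per scheduler pass, copying ~1 instruction/byte,
# so a large genome needs proportionally more passes per offspring.

SLICE = 50  # instructions executed per scheduler pass (assumption)

def passes_to_replicate(genome_bytes: int) -> int:
    """Scheduler passes one creature needs to finish a single copy."""
    return -(-genome_bytes // SLICE)  # ceiling division

tiny = passes_to_replicate(80)       # a minimal self-replicator
huge = passes_to_replicate(10_000)   # a ">10K" program
print(tiny, huge)
```

    Under these assumptions the 10K program takes two orders of magnitude
    longer per copy, so it is outbred long before it can adapt.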

    Our best human attempts at designing digital carriers (even using
    evolutionary algorithms) will fail because of the inherent
    uncompetitiveness of clunky meatspace brain designs optimized for one
    environment and finding themselves in the digital realm. For a rough
    analog, consider how there is an upper limit on the size of active
    creatures in 3D meatspace for a given ecology. While something might survive
    somehow derived from pieces of a digitized person, it will not resemble
    that person to any significant degree. This network will be an alien
    environment and the creatures that live in it will be an alien life
    form. One might be able to negotiate with some of them at some point in
    their evolution citing the commonality of evolved intelligence as a bond
    -- but humanity may have ceased to exist by then.

    In short, I agree with the exponential theme in Kurzweil's book and the
    growth of a smart network. We differ as to the implication of this. I
    think people (augmented or not) will be unable to survive in that
    digital world he predicts for any significant time period. Further,
    digital creatures inhabiting this network may be at odds with, or
    indifferent to, human survival, yet human civilization will likely
    develop in such a way that it is dependent on this network. The best
    one can hope for in
    the digital realm is "mind children" with little or no connection to the
    parents -- but the link will be as tenuous as a person's relation to a
    well-cultivated strain of Brewer's yeast, since the most competitive
    early digital organisms will be tiny.

    Once you start working from that premise -- the impossibility of people
    surviving in the digital world of 2050 -- then Kurzweil's book becomes a
    call to action, just like Bill Joy's comments. I don't think it is
    possible to stop this process, for all the reasons both people mention.
    It is my goal to create a technological alternative to this failure
    scenario. That alternative is macroscopic self-replicating (space)
    habitats.
    However, they are no panacea. Occupants of such habitats will have to
    continually fight the self-replicating and self-extending network jungle
    for materials, space, and power. (Sounds like the making of a sci-fi
    thriller...) And they may well fail against the overwhelming odds of an
    expanding digital network without conscience or morality. Just look at
    Saberhagen's Berserker series or the Terminator films.

    It will be difficult for Kurzweil to change his opinion on this because
    he has been heavily rewarded for riding the digital wave. He was making
    money building reading machines before I bought my first computer -- a
    KIM-1. But I think the contradiction of believing the road to spiritual
    enlightenment can come from material competition (a point in his book
    which deserves much further elaboration) may someday become apparent.
    To the extent material competition drives the development of the
    digital realm, the survival of humanity is in doubt.

    -Paul Fernhout
    Kurtz-Fernhout Software
    Developers of custom software and educational simulations
    Creators of the Garden with Insight(TM) garden simulator


    This archive was generated by hypermail 2b29 : Thu Mar 30 2000 - 08:23:19 PST