Eric Armstrong wrote:
> Henry van Eyken wrote:
> > As things stand, it is my strong impression that people do not wish
> > to face potential facts and prepare for them. In my own lifetime,
> > many heard Hitler's rantings, learned about crystal night, saw troop
> > movements along the fabulous Autobahns and, like Chamberlain,
> > believed in peace forever after. In my lifetime, many have for years
> > observed the rapid melting of the polar ice caps, perhaps discussing
> > that calamity over drinks that heat up rapidly once the ice in them
> > has melted. In my lifetime, the deteriorating effect of free
> > radicals on the ozone layer has been known for many decades (very
> > early in the seventies, I happened to see an English translation
> > of a Russian paper on the subject). Etc.
> Damn good observation. The dangers are real, but we are going to
> optimistically underestimate them, because that is the way we are wired.
Much agreement to both comments.
Also, there are spiritual robots abroad now -- they are called
corporations. Bill Joy has a quasi-symbiotic relation with one -- as
long as it thinks it needs him.
If it decided it did not, he would be removed as was Steve Jobs from
Apple (till Apple decided it needed him again.)
> And, as one of the newspaper correspondents who evaluated Joy's
> arguments said: relinquishment is not an option, because doing that
> would mean a totally coordinated effort among all the people on this
> planet, and when has *that* ever happened? So I think we can safely rule
> out relinquishment as impractical, as well as undesirable (given
> Merkle's arguments).
I agree relinquishment is not an option. Besides, while I doubt this
motivated Bill Joy, it is easy for a successful and wealthy person in
today's economy to suddenly notice the consequences of his or her
actions and say, let's keep things as they are (with me near the top.)
Why wasn't Bill Joy reading Langdon Winner's book "Autonomous
Technology", written two decades ago, which discusses technics out of
control? Even the simplest technology adopted by corporations has many
aspects of an evolutionary arms race as they use it to gain a
competitive edge.
Arms races are "resolved" by:
* One faction winning (meaningless here, because even if one corporation
or country won, it might split internally, leading to more arms races...
That's what would likely happen if nanotech "gray goo" covered the
planet, destroying all else -- in time the goo would evolve into more
complex things which again competed with each other.)
* Symbiosis (Lewis Thomas was an early proponent of the importance of
this in his book "Lives of a Cell".)
* Refugia -- one side can always hide and survive in places the other
side can't get at it (and the arms race can persist)
* Scale -- there is just so much environment that "statistical" refugia
become possible and the arms race can persist in a certain balance
* anything else?
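As a toy illustration of the refugia and winning outcomes above (the model and all its rates are my own illustrative assumptions, not anything from the literature), consider two competing populations where the stronger side A inflicts heavier losses on B, but a fraction of B can hide in a refuge:

```python
# Toy arms-race model: faction A has the stronger "weapons" (a higher
# kill rate against B), but a fraction of B can hide in a refuge.
# All rates below are illustrative assumptions, not empirical values.

def simulate(refuge=0.0, steps=200):
    a, b = 1.0, 1.0          # initial population sizes
    grow = 0.10              # logistic growth rate for both sides
    kill_ab = 0.05           # losses A suffers from B
    kill_ba = 0.15           # losses B suffers from A (A is stronger)
    for _ in range(steps):
        next_a = a + grow * a * (1 - a) - kill_ab * b * a
        # only the exposed fraction of B can be attacked
        next_b = b + grow * b * (1 - b) - kill_ba * a * b * (1 - refuge)
        a, b = max(next_a, 0.0), max(next_b, 0.0)
    return a, b

# With no refuge, B is driven toward extinction (A "wins"); with half
# of B hidden, both sides persist and the race continues unresolved.
print(simulate(refuge=0.0))
print(simulate(refuge=0.5))
```

The point of the sketch is only that a small protected fraction changes the qualitative outcome from one faction winning to indefinite persistence.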
I didn't list coexistence as a separate resolution, because except as
symbiosis (interdependence) I am not sure coexistence is a long-term
option. However, perhaps it is possible with newer techniques of living
and relating that intelligence is capable of devising. In human
relations, barriers like walls, rivers, myths, and
fears have kept competition in check (as well as positive human beliefs
-- compassion, morality, etc.). So perhaps a DKR on new ways to
transcend arms races might be a good one...
Arms races happen all the time in nature. What we see at the moment is
the outcome of endless historic biological arms races among species --
ending in extinction, symbiosis, or refugia. The only "problem" with
this technological arms race is that our destructive potential now
outweighs the biosphere's generative capacity (on a human time scale).
To me, the major mission of the U.S. Department of Defense should be to
defend the USA from arms races. It is too bad the DOD instead seems to
view its mission more as winning them (which statistically just
ensures eventual destruction if nothing greater intervenes).
> What does that leave? Personally, I'm starting to think we should maybe
> be praying for the second coming of the big comet. (I happen to think
> there is evidence for the plausibility of that scenario.)
Well... Maybe if it orbited the Earth and we used it to build space
habitats.
> There is an
> outside chance, though, that if we focus on rapid-reaction defense using
> super-powerful computers, coupled with reasonable security controls on
> access to information,
Perhaps... But except on a local scale, such defense will not be
possible, because of the nature of arms races.
I'd argue more strongly for increasing the scale of the living network
(space habitats mainly, but also decoupled terrestrial infrastructures
widely dispersed in the ocean, arctic, deserts, and underground).
> and if we intently focus on removing any cause
> for bitterness on the part of individuals and countries throughout the
> world,
Yes -- this should be a major part of any defense strategy.
Unfortunately, current US defense strategy (like in Iraq) is creating
millions of (malnourished == brain damaged) terrorists abroad, and at
home, U.S. social policies like the "drug war" are creating millions of
criminals. In both cases, the effort should instead be to create
productive citizens of a democracy (or the local equivalent of such).
> then we might be able to muddle through to the end of the century.
> If we get that far, we're probably good for the long haul.
Agreed. The next century (even the next two decades) will likely be the
turning point in deciding how (and if) humanity and technology coevolve.
To put things in perspective, it is important to realize billions of
people will die over the next millennium from competition with technology
(technology that is either autonomous or augmented humanity, used for
war or just for sport or economics). This is probably not preventable.
Billions of other people will attempt to transcend into something beyond
humanity -- and it is not clear what to make of the resulting robot
claiming success as it looks over the human corpse. This too is probably
not preventable. Humans have always had the option to become something
other than human (leaving behind things like corpses, memories, books,
and maybe angels) -- it's called "suicide" if it's voluntary, "death" if
it's not.
What is doable is to create options so that the people who want to
remain human and have human children (etc.) over the next millennium can
do so if they choose.
Developers of custom software and educational simulations
Creators of the Garden with Insight(TM) garden simulator
Community email addresses:
Post message: unrev-II@onelist.com
List owner: unrev-IIfirstname.lastname@example.org
This archive was generated by hypermail 2b29 : Tue Apr 11 2000 - 07:26:44 PDT