An offering, in the spirit of the season...
Several things resonate:
From: Eric Armstrong <email@example.com>
Date: Mon, 11 Dec 2000 15:29:56 -0800
Subject: [unrev-II] Tech Startup "How To" notes
From: Jack Park <firstname.lastname@example.org> [...]
Date: Thu, 14 Dec 2000 16:28:56 -0500
Subject: Re: [unrev-II] Use Cases and Ontologies
So, we begin to think of the
common use cases as the 'roots' of, eventually, a forest of specialized
use cases. The common use cases represent the basis for interoperability
among the specialty domains.
Now, just substitute the term 'ontology' for the term 'use case' and you
have the mapping. Bingo. Get the ontology right, and the rest falls out.
From: "Henry van Eyken" <email@example.com>
Date: Mon, 18 Dec 2000 06:27:58 -0000
Subject: [unrev-II] Hyperscoping. It's the natural thing to do
Henry discusses Specification, Organization, and Content, and says:
arousals (emotions) are organizing agents, but, unless one is awake,
agents with little regard for reality. [...] Reality comes with the waking state which knits those
factuals together so as to let us interpret our daily experiences to
ourselves - to make sense of the events around us, i.e. to experience
"reality," and in the process add to our store of factuals.
Actors = factuals; relations = organization; state = specification
(selecting object of our affection) or hyperscope.
Why do I think this is important to contemplate? Because chances are
that augmenting the human mind will work better if it is in step with
the very way the natural mind works anyway.
We may just be on a very good track.
From: Paul Fernhout <firstname.lastname@example.org>
Date: Mon, 18 Dec 2000 21:17:10 -0500
Subject: [unrev-II] Is "bootstrapping" part of the problem?
I understand the desire to be neutral on the ends to which
"bootstrapping" is applied to attract broad support, but ultimately (in
my opinion) many organizations (large corporations or other
bureaucracies) in today's world effectively are already machine
intelligences (somewhat like ant colonies) working towards their own
exponential ends (in an economic framework). [...] So in this sense, I see the machine intelligences
already to an extent "shading out" efforts like the Bootstrap Institute
or the Humanities library.
First let me summarize: there is more to living than "intelligence".
Intelligence doesn't call one to act, "desire" does that. "Intelligence"
doesn't define why one should do one thing rather than another, unless
one already has "values". One can make a rational choice, but the desire
and values that cause that choice to be made and acted on are to a large
extent outside of the realm of "intelligence". As an outgrowth of
"intelligence", knowledge management will neither lead to choices nor
cause actions in the absence of "values" or "desire". We are talking
about putting ever more powerful "intelligence" in the hands of
organizations that have already shown themselves capable of building
50,000 nuclear warheads, letting close to a billion people starve, and
dumping PCBs in water bodies and resisting attempts to clean them up.
From: Henry van Eyken <email@example.com>
Date: Tue, 19 Dec 2000 18:26:20 -0500
Subject: Re: [unrev-II] Is "bootstrapping" part of the problem?
Educators purport to prepare young people for a lifetime, i.e. with a horizon of more than half a century. For most of human existence, the future was not unlike the past. But that has changed drastically. We experience significant events, including dangerously critical events, with shorter and shorter intervals between them, which raises the question: just what are we trying to prepare young people for? Will reading, writing and 'rithmetic remain the constants that will serve them all their lives? What are the constants, so we may pay especial attention to them? The closest answer I can come up with is: human nature.
Requesting focus on goals/purpose is an extraordinarily good thing,
in my view. And it beggars the imagination to think that we really
haven't done so.
This archive was generated by hypermail 2b29 : Tue Dec 19 2000 - 20:04:58 PST