11.0123 present & future of computing

Humanist Discussion Group (humanist@kcl.ac.uk)
Sun, 22 Jun 1997 10:31:55 +0100 (BST)

Humanist Discussion Group, Vol. 11, No. 123.
Centre for Computing in the Humanities, King's College London
<http://www.princeton.edu/~mccarty/humanist/>
<http://www.kcl.ac.uk/humanities/cch/humanist/>

[1] From: Willard McCarty <Willard.McCarty@kcl.ac.uk> (43)
Subject: we have our moments

[2] From: "Prof. Jose Gomes Filho" <gomes@dpx.cnen.gov.br> (79)
Subject: Re: 11.0101 future of computing

--[1]------------------------------------------------------------------
Date: Fri, 20 Jun 1997 09:42:59 +0100 (BST)
From: Willard McCarty <Willard.McCarty@kcl.ac.uk>
Subject: we have our moments

Tagging may be characterised as a collection of moments in each of which an
observation about primary data is followed by its encoding in some
metalanguage (like TEI/SGML). Then, at intervals, the collection of tags is
compiled, individual items perhaps corrected, after which the resulting
metatext becomes the basis for analysis. The great virtue of tagging as the
basis for study of primary material is that unaided we humans seem at best
to be very good at making such momentary observations but rather less good
at summing them up without overlooking or forgetting the troublesome ones.
To the degree such oversight occurs, we get out of our primary material more
or less what we expected, wanted to find.
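To make the moment concrete: a single such observation might be
recorded in TEI markup roughly as follows. This is a minimal sketch
only; <l> and <seg> are standard TEI elements, but the attribute
values and the bracketed text are placeholders invented for the
illustration.

    <!-- one momentary observation, encoded at the point of reading -->
    <l n="42">[the line of verse being read]
      <seg type="irony" resp="#WM">[the phrase that prompted the
      observation]</seg>
    </l>

Once hundreds of such moments have accumulated, the scattered <seg>
elements can be collected and counted mechanically, troublesome cases
included, which is precisely what the unaided reader fails to do.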

Consider, then, the following meditation on our kind, from Philip Gerrans,
"Is it catching?", rev. of Dan Sperber, <cite>Explaining culture: A
naturalistic approach</cite> (Oxford: Blackwell), in TLS 4916 for 20 June
1997, p. 5:

"Rationality is a precarious achievement, almost instantly swamped in
day-to-day life by other influences on human psychology.... Furthermore, it
seems that rationality, even in situations where it is recognised as
desirable, such as planning and prediction, is not an automatic part of the
human inferential repertoire. Humans are very bad at employing the norms of
rationality, such as logical inference, probabilistic reasoning and planning
beyond the immediate future, but they are very good at convincing themselves
that they have sound reasons for what they do. Perhaps the most tenacious
aspect of mind is not rationality, but rationalization, the reinterpretation
of evidence to support a belief rather than revising it in the face of
counter-evidence and counter-argument. Indeed, Stuart Sutherland's recent
book, <cite>Irrationality</cite>, demonstrates that even those areas of
human culture consecrated to the norms of rationality, the academic
professions, operate, in some cases, according to non-principles of
confabulation, improbability and counter-induction."

If the development of human mentality in essence involves, as some have
argued, the externalization of mind through our creations and inventions,
then is the computer a materialization of our desire to be rational? Are we
in effect giving ourselves the means to be rational? Is this a good thing?

Comments welcome.

Yours,
WM

----------
Dr. Willard McCarty
Senior Lecturer, Centre for Computing in the Humanities
King's College London
Strand
London WC2R 2LS
+44 (0)171 873 2784 voice; 873 5081 fax
http://www.kcl.ac.uk/humanities/cch/wlm/

--[2]------------------------------------------------------------------
Date: Fri, 20 Jun 1997 12:56:37 +0000
From: "Prof. Jose Gomes Filho" <gomes@dpx.cnen.gov.br>
Subject: Re: 11.0101 future of computing

About these ideas and the future of computing, I would go a little
further and add that, in the future, when science has achieved the
complete "artificialization" of the human brain (and, obviously, of
the whole body), we will have evolved into a new species, the *homo
cyber*, which among other characteristics will have a quasi-eternal
life, since it will possibly possess something like a "black box"
making it possible to reinstall the individual in a new body in case
of accident. The Japanese have already begun to think about
miniaturizing such a body, using something like
nanoneurocybernetics... For success in this direction we also need
the movie business to help, presenting cybernetic (and cloning)
science and fiction in a more *user-friendly* way, with the
cybernetic creatures showing a little more emotion too.

> Date: Fri, 13 Jun 1997 09:46:55 +0100 (BST)
> Reply-to: humanist@kcl.ac.uk
> From: Humanist Discussion Group <humanist@kcl.ac.uk>
> To: Humanist Discussion Group <humanist@lists.Princeton.EDU>

> Humanist Discussion Group, Vol. 11, No. 101.
> Centre for Computing in the Humanities, King's College London
> <http://www.princeton.edu/~mccarty/humanist/>
> <http://www.kcl.ac.uk/humanities/cch/humanist/>
>
> Date: Thu, 12 Jun 1997 11:05:23 -0400
> From: Hope Greenberg <hope.greenberg@uvm.edu>
> Subject: Re: 11.0098 challenges & other Online matters
>
> > Soon like Star Trek: "Computer, locate all the
> > places in English literature where...."?
>
> In the original Star Trek series, the computer occasionally had a voice,
> but was usually a box with blinking lights, toggle switches, and a slot
> into which one fed small rectangular plastic chips (rather like 3.5 inch
> floppies). In the more recent incarnations of Star Trek the computer is
> usually a voice, an omnipresence with which one communicates through
> speech, though the occasional manual buttons under backlit panels are
> also much in evidence. The former seems rather laughably limited, the
> latter seems much more wonderfully useful, but we must remember that
> both are simply reflections of the times in which they were created, and
> both share a certain conceptual framework.
>
> In both cases the computer is definitely the Other, omnipresent,
> perhaps, but still an external entity. But in our own reality will the
> computer remain an external entity, easily identified as the Other or
> will it become much more invisible, more integrated into the world
> around us? Will the computer be outside us or will we live in and with
> the computer? I am not going so far as suggesting that in five years we
> will be "jacking in" a la Gibson. But take an example: You get dressed
> in the morning. As you walk to a nearby restaurant where you will be
> meeting someone, you are charging your shoes, which then supply power
> to your hat. Part of your hat is collecting visual data from around you,
> while another portion displays information to you at eye level. When you
> meet your party you shake hands, which initiates a data connection,
> passing her resume, research information, or other data to you. After
> the meeting you continue to your part-time office. Your hand on the
> doorknob initiates a security check which unlocks the door for you and
> delivers your e-mail...
>
> Well, I could go on. For many years we have been bombarded with, and
> indeed have had fun laughing at, visions of wondrous future technology.
> Oddly enough, the things I describe above are already happening (see
> MIT's Wearable Computing pages at:
> http://lcs.www.media.mit.edu/projects/wearable/ for examples).
>
> So, what happens to education and universities (which are not the same
> thing) in a world where computing is much more ubiquitous than it is
> now?
>
> - Hope
>
> ------------
> Hope Greenberg
> University of Vermont
> http://www.uvm.edu/~hag
>