3.1198 discomfortable questions (68)

Willard McCarty (MCCARTY@vm.epas.utoronto.ca)
Wed, 21 Mar 90 22:02:24 EST

Humanist Discussion Group, Vol. 3, No. 1198. Wednesday, 21 Mar 1990.


(1) Date: Tuesday, 20 March 1990 2015-EST (16 lines)
From: KRAFT@PENNDRLS
Subject: E-mail Course in Programming

(2) Date: 21 March 1990 (34 lines)
From: Willard McCarty <MCCARTY@vm.epas.utoronto.ca>
Subject: klatu baranga nicto!

(1) --------------------------------------------------------------------
Date: Tuesday, 20 March 1990 2015-EST
From: KRAFT@PENNDRLS
Subject: E-mail Course in Programming

Does anyone else find it strange that the contemplated
course on PROGRAMMING FOR THE HUMANITIES, posted by Eric
Johnson to test interest, lists BASIC as a basic programming
language and fails to list Pascal or C (object C)? My gut
reaction lies somewhere between horror and disbelief, and
after a couple of deep breaths my considered reaction is to
wonder whether the problem is me. What is the point of
teaching BASIC at all, or in any event of teaching BASIC
without the widely used languages Pascal and C? Wouldn't
that be a backward step for humanists with programming
ambitions?

Bob Kraft
(2) --------------------------------------------------------------------
Date: 21 March 1990
From: Willard McCarty <MCCARTY@vm.epas.utoronto.ca>
Subject: klatu baranga nicto!

I had the opportunity yesterday to attend a very interesting lecture,
`Beyond Electronic Mail', given by Thomas Malone (Coordination Science,
MIT). He described an electronic mail system, Information Lens, that,
by means of a filtering system consisting in part of semi-autonomous
`agents', is able to act on various clues in e-mail messages and do
various things with them. Examples would be: route all messages from
one's dean into an `urgent' folder; delete all messages from an annoying
enemy; watch for and select all messages on a certain subject. Very
interesting, very high-octane stuff.
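
To make the idea concrete, here is a minimal sketch, in Python, of the
kind of rule-based filtering `agent' Malone describes: each rule pairs a
test on an incoming message with an action such as routing it to a
folder or deleting it. All the names below (Message, Rule, the sample
addresses) are hypothetical illustrations and are not taken from the
Information Lens implementation itself.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Message:
        sender: str
        subject: str
        body: str
        folder: str = "inbox"
        deleted: bool = False

    @dataclass
    class Rule:
        # A semi-autonomous "agent": a test on a message plus an action.
        condition: Callable[[Message], bool]
        action: Callable[[Message], None]

    def route_to(folder: str) -> Callable[[Message], None]:
        # Build an action that files a message in the given folder.
        def act(msg: Message) -> None:
            msg.folder = folder
        return act

    def delete(msg: Message) -> None:
        msg.deleted = True

    # Rules corresponding to the examples in the paragraph above.
    rules = [
        Rule(lambda m: m.sender == "dean@university.edu", route_to("urgent")),
        Rule(lambda m: m.sender == "enemy@elsewhere.org", delete),
        Rule(lambda m: "conference" in m.subject.lower(), route_to("conference")),
    ]

    def filter_message(msg: Message) -> Message:
        # Apply every matching rule; stop once a message is deleted.
        for rule in rules:
            if rule.condition(msg):
                rule.action(msg)
                if msg.deleted:
                    break
        return msg

    if __name__ == "__main__":
        m = filter_message(Message("dean@university.edu", "Budget", "Reply today."))
        print(m.folder)  # -> urgent

The point of the sketch is only that the `agent' is nothing more than an
encoded definition of one's interests and allegiances, which is exactly
what the worry below turns on.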

My problem with such a system is really a problem inherent in all
devices, by which I mean all human inventions to which we grant even a
semi-autonomous role. As Northrop Frye points out, we tend to forget
that what we have devised is our invention; we objectify it, then adore
it or allow ourselves to become ruled by it as if it were some sort of
god. His usual examples are the circle, which turns into a tyrannical
image of fate, and the codex or scroll, which becomes the subject of a
nightmare in which after death we are presented with a record of all our
evil deeds carefully recorded in some great book. See also the carpenter
in Isaiah. We used to be haunted by a form of this nightmare
that involved a great central computer. Now I wonder if there isn't a
rather nasty problem created by a semi-autonomous system into which you
have encoded a definition of your interests, allegiances, and so forth,
and then simply let it go about its automatic work. If there is a
problem here, what do we do about it?


Yours, Willard McCarty