17.558 rupture, shifts & humanities computer science

From: Humanist Discussion Group (by way of Willard McCarty willard.mccarty@kcl.ac.uk)
Date: Thu Jan 22 2004 - 04:21:30 EST


<x-flowed>
               Humanist Discussion Group, Vol. 17, No. 558.
       Centre for Computing in the Humanities, King's College London
                   www.kcl.ac.uk/humanities/cch/humanist/
                        www.princeton.edu/humanist/
                     Submit to: humanist@princeton.edu

   [1] From: Wendell Piez <wapiez@mulberrytech.com> (107)
         Subject: Re: 17.552 rupture and the "self-study" route

   [2] From: DrWender@AOL.COM (46)
         Subject: Re: 17.553 paradigm shift of textuality

   [3] From: Willard McCarty <willard.mccarty@kcl.ac.uk> (86)
         Subject: Re: 17.549 humanities computer science

--[1]------------------------------------------------------------------
         Date: Thu, 22 Jan 2004 08:45:26 +0000
         From: Wendell Piez <wapiez@mulberrytech.com>
         Subject: Re: 17.552 rupture and the "self-study" route

Willard,

Sometimes I see threads converge, but rarely if ever like this. I'd like
here to respond to three threads at once.

In '17.549 humanities computer science', Prof Thaller laments

>in Humanities Computing as it stands today there is a tacit and sometimes
>vocal assumption, that there is a body of secure knowledge, produced by CS,
>which just needs to be applied to the Humanities to lead to wonderful
>results. This point of view turns Humanities scholars into consumers of
>knowledge produced by others.

Meanwhile, Julia Flanders writes in '17.552 rupture and the "self-study"
route':

>... keeping open the "self-study" route is essential--not just because
>such a route will continue to be a source of creative, intelligent influx
>to the humanities computing community, but also because it reminds us to
>evaluate things (and people) by looking at their merits rather than
>through the codified symbology of credentials.
...
>Even though we may find it desirable and useful to design educational
>programs that formalize what we know and cultivate a new generation of
>colleagues, we should make sure that we don't start mistaking those
>programs for The Way.

A position with which I can agree whole-heartedly, while recognizing
the paradox in it.

This comes just when you have cited (in '17.546 new places of knowledge?') a
very interesting and provocative 1997 essay by Thomas Bender, in which he
writes (in part)

> >Fascinating research recently reported by an international team of
> >sociologists in The New Production of Knowledge (1994) argues that more
> >and more knowledge will be developed outside of universities, in
> >opportunistic and transdisciplinary settings. The intellectual style in
> >these places is different from that associated with the university.
> >Theory is much closer to the point of use than is the case of
> >university-based knowledge, and that interplay, even near assimilation,
> >of theory and practice may be a source of both vitality and invention.
> >The process of making knowledge coincides with the process of
> >dissemination, thus dissolving the old categorical distinction between
> >production and popularization, theory and practice.

This really strikes home since I have found myself in exactly this kind of
place, on the outside of the academy yet with no lack of opportunities to
help "develop [new] knowledge". What a friend of mine in these realms names
the "internetwork topology" has something to do with this (and as a
freelancing electronic text specialist in the financial industry, and a
student of Homeric poetry, he is an exemplar of the type): being on the
outside no longer means the isolation that it used to; and if I have no
research library as such, I have something scholars in earlier ages would
hardly have dared imagine for themselves. But what Flanders and Thaller
have each warned about and pointed to, on the other side, also has a great
deal to do with it: the institutionalization of certain boundaries within
academic discourse that sap our power to contribute meaningfully from
within those structures -- whether because we have no scope or mandate to
reach beyond the settled, known and approved ("How can I 'develop new
knowledge' if I have to get tenure?") or, if we plunge ahead nonetheless,
when we face the inevitable questions of authority or relevance (what does
a humanist know of computers, or a computer scientist know about sign
systems?). Thus, ironically, because academics seek authority and relevance
among their peers, they cede authority and relevance in the larger
sphere. In this light, the consumeristic mindset that Prof Thaller
complains of -- humanities specialists may "use" computers, but can't and
shouldn't seek to contribute to their development -- is a symptom of the
problem as well as a cause. And Julia, to follow on, warns that formalizing
our own discourses may tend to worsen the situation, not improve it.

This seems like a dire predicament, with all of us on either side of the
cloister walls wishing we were on the other: I yearn for an academic
calendar, a research library, a conference budget and students and
colleagues -- effectively, a "department"! -- to help me whet my brain,
even while my academic friends envy my freedom to chart my own intellectual
course without having to answer to departments and committees. Yet across
these walls, what happens?

We chafe at the difficulties ... and yet we make advances with astonishing
speed. Consider, as Prof Thaller does, the web. He fairly derides it as an
impoverished, stunted form of hypertext, and remarks on the irony of a
hypertext medium being so widespread without even gesturing towards one of
the classic desiderata of hypertext, namely revision tracking. Yet for all
its weaknesses, the web demonstrates vividly how powerful is the
combination of the academic spirit of inquiry with the extra-academic
attitudes of "can do" and "less is more" ... a combination found perhaps
only in odd niches, inside and outside the academy (when he prototyped the
web, Tim Berners-Lee was at CERN, but he was not researching nuclear
physics), but potentially explosive where it occurs. And in its turn the
web is, what? Markup technologies, which themselves were developed not by
computer scientists but by ... by readers of this list, among others,
people largely on the fringes and boundaries, who understood very well the
relevance of the humanities to computing and computing to the humanities,
even while those in more established positions were blinkered by settled
notions of who was supposed to know what.

Does this represent failure? Is it failure even if the innovators fail to
get tenure? I don't think so -- or if it is a failure, it is a failure only
of the institutions, academic and corporate, that cannot hold onto their
best, while the boundary-crossers move on to something else. Bender's
essay, which I recommend to all (at http://www.acls.org/op40ben.htm), has
much to say about reasons for and possible remedies of the current academic
malaise. But meeting the transgressors at conferences like ALLC/ACH, or
Extreme, or hiding in the corners at MLA or XML 200x (now a tradeshow for
an entire new industry), and hearing their passions and preoccupations --
which don't have much to do with institutional politics -- sometimes I
wonder if our uneasy in-betweenness is not an enviable golden age after all.

Best regards,
Wendell

======================================================================
Wendell Piez mailto:wapiez@mulberrytech.com
Mulberry Technologies, Inc. http://www.mulberrytech.com
17 West Jefferson Street Direct Phone: 301/315-9635
Suite 207 Phone: 301/315-9631
Rockville, MD 20850 Fax: 301/315-8285
----------------------------------------------------------------------
    Mulberry Technologies: A Consultancy Specializing in SGML and XML
======================================================================

--[2]------------------------------------------------------------------
         Date: Thu, 22 Jan 2004 08:49:16 +0000
         From: DrWender@AOL.COM
         Subject: Re: 17.553 paradigm shift of textuality

In an e-mail of 21.01.04, 10:08:15 (CET, Central European Time),
willard@lists.village.virginia.edu writes:

>lists of ingredients. And I thought: "yeah, that's strange. The menu
>doesn't behave as it should. You can not press a Ctrl-F-Key and look for
>"cream" in all the dishes. The text doesn't highlight all occurrences of
>"cream". The text doesn't answer to my questions. This isn't a normal text,
>it's not interactive, it doesn't communicate. It's a dead thing.
>
>To me, at that moment, it seemed to be an outdated, old-fashioned kind of
>text, an antiquarian text, as from an archive. Surely the
>printed-book-people had a similar experience for hundreds of years, when
>they encountered manuscripts. They surely often will have thought: this
>isn't a normal text. It's too hard to decipher and where is the table of
>contents which directs me into the book and to some certain page numbers?
>Where are the headlines and footnotes? Why is it so inconvenient to use
>this kind of text?

That said: I can't dispute the feelings. But I would question the
presupposition: is the 'answering menu' dreamed of there an example of a
"paradigm shift of textuality"? Patrick Sahle mentions new features
of text presentation in the Gutenberg era in order to demonstrate an
earlier 'paradigm shift' (the last before the present one?). I will take
a simpler example (though for non-Germans it perhaps needs explaining:
DER SPIEGEL and FOCUS are both German weekly news magazines; the
former, as most will know, has enjoyed the best reputation since the
early 1950s, the latter was founded, I think, some forty years later).
The newcomer took advantage of the new desktop-publishing
facilities and the new production possibilities in the publishing
industry: it appeared with a 'modern design', highlighting essentials,
with term explanations in the margins and an overall simpler writing
style, in a somewhat 'vivid' layout, especially compared with the
impression made by the old-fashioned SPIEGEL with its famous house
language, heavily loaded with rhetoric.
(Apologies for my dictionary English...)
Look at an early printing of the Luther Bible, or look at glosses
such as Montaigne's. You will find the 'new' features of FOCUS to be
old ones. And vice versa: go to your favourite pizzeria, where you
have known the pizza chef for years, and ask him for a dish with
cream -- isn't his answer itself text, and isn't it like highlighting
the requested topics in the standard text of his menu?
(Are you really waiting for the talking laptop in the McDonald's
restaurant where you will go with your son?)
But beyond the pizzeria example: what is meant by a
"paradigm shift of textuality"?

Herbert Wender

--[3]------------------------------------------------------------------
         Date: Thu, 22 Jan 2004 08:57:43 +0000
         From: Willard McCarty <willard.mccarty@kcl.ac.uk>
         Subject: Re: 17.549 humanities computer science

I'm personally grateful to Manfred for articulating the basis for his fine
programme in Cologne at such length. He raises a number of issues worth
chewing on, but for now I'd like to concentrate on two in particular.

The first is raised, again, by the terminology he uses as this is rendered
into English. I apologise for a rather linguistically specific point,
which, I suspect, would need to be argued very differently
in each of the several other languages our diverse community
speaks. Forgive me also for the repetition of something I've said
before, possibly more than once :-). But I do have a strong
objection to use of the word "science" in English for exactly the reason
John Searle cites in Minds, Brains and Science (Penguin, 1984), p. 11.
Commenting on the three main words in his title, he notes the inadequacy of
the vocabulary we have for discussing the related problems, objecting first
to "mind" and then taking up our word:

>The situation with the word 'science' is even worse. I would gladly do
>without this word if I could. 'Science' has become something of an
>honorific term, and all sorts of disciplines that are quite unlike physics
>and chemistry are eager to call themselves 'sciences'. A good rule of
>thumb to keep in mind is that anything that calls itself 'science'
>probably isn't - for example, Christian science, or military science, and
>possibly even cognitive science or social science. The word 'science'
>tends to suggest a lot of researchers in white coats waving test tubes and
>peering at instruments. To many minds it suggests an arcane infallibility.
>The rival picture I want to suggest is this: what we are all aiming at in
>intellectual disciplines is knowledge and understanding. There is only
>knowledge and understanding, whether we have it in mathematics, literary
>criticism, history, physics, or philosophy.

What does it matter? In at least two ways that I can think of. First, practically
it matters because of the need for university administrations, funding
agencies and colleagues to categorize activities, e.g. to find the right
sort of person to judge whether they are being well done -- or should be
done at all. Second, the cultural power of the word is so great that it
inhibits clear thinking, as Searle complains. In the sub-genre of computer
science literature devoted to agonizing over whether CS is a discipline
and if so where its allegiances lie, one commonly finds otherwise very
bright (but as Bill Wulf says, slightly paranoid) people lunging for the
physical sciences as if they were a singular monolith with a
unitary epistemological method. Because they aren't and (as is now commonly
recognized) there is no such unitary method, the effort is wasted, and
besides it directs attention away from the important matters in CS.

In brief, I am arguing that (a) words have meaning, (b) it's important to
get the right one for the job and (c) "science" does us more harm than good.

Now for the second, related point. Manfred says,

>The difference is - leaving all anecdotal references aside - that in
>Humanities Computing as it stands today there is a tacit and sometimes
>vocal assumption, that there is a body of secure knowledge, produced by CS,
>which just needs to be applied to the Humanities to lead to wonderful
>results. This point of view turns Humanities scholars into consumers of
>knowledge produced by others.
>
>My understanding of CS as a field, which is currently amalgamating implicit
>ways to handle information taken from various knowledge domains and
>defining itself by the experience, precludes me from assuming that
>"something is out there, which we just have to adapt". I do not know, what
>these these new paradigms of handling information will ultimately lead to
>and whether they will be called CS: But influencing these paradigms seems
>to me to be of vital importance for all fields of study, if they want to
>keep their status. Humanities who bring their abilities to handle specific
>types of information into such an emerging field of study and contribute to
>the emerging body of knowledge on how to handle that information, will in
>my opinion be taken incomparably more
>serious than Humanities which just consume what "CS" has produced
>(shrink-wrapped by Microsoft afterwards).

I could not agree more strongly that once knowledge is treated as a
commodity, like just so much breakfast cereal, the game has been lost. (See
Terry Winograd's identical criticism, ironically applied to mainstream AI
research, in "Thinking Machines: Can There Be? Are We?", in The Boundaries
of Humanity: Humans, Animals, Machines, ed. James J. Sheehan and Morton
Sosna, Berkeley, 1991, pp. 198-223.) I join Manfred in condemning the tacit
or vocal assumption he identifies. But it does not follow that programmes
and research "in Humanities Computing as it stands today" necessarily make
that assumption. Our programmes at King's College London, which are in
humanities computing, do not, for example.

I think Manfred is making a good point but that it is confused by an
oversight. What is being overlooked is the crucial element of critical,
self-aware application of standard tools. (To return to the breakfast
cereal metaphor, our parents could criticize us not merely for playing
with our food but more seriously for encouraging all the other kids to
do the same!) Pedagogically there's good reason, especially at the
undergraduate level, to teach students how to use the standard tools
critically and intelligently. In a crowded programme there's only so
much time. There are other practical considerations as well. I think
Manfred's quite right that there's a danger here, however: the danger of
falling into bad habits -- to say nothing of missing things because of
inferior tools. Training and experience with building the tools is or
should be central to what we do. But humanities computing can be
a broad church, with many ways.

Yours,
WM

Dr Willard McCarty | Senior Lecturer | Centre for Computing in the
Humanities | King's College London | Strand | London WC2R 2LS || +44 (0)20
7848-2784 fax: -2980 || willard.mccarty@kcl.ac.uk
www.kcl.ac.uk/humanities/cch/wlm/
</x-flowed>



This archive was generated by hypermail 2b30 : Fri Mar 26 2004 - 11:19:38 EST