21.524 wolf! wolf!

From: Humanist Discussion Group (by way of Willard McCarty <willard.mccarty_at_kcl.ac.uk>)
Date: Tue, 5 Feb 2008 06:57:02 +0000

               Humanist Discussion Group, Vol. 21, No. 524.
       Centre for Computing in the Humanities, King's College London
  www.kcl.ac.uk/schools/humanities/cch/research/publications/humanist.html
                        www.princeton.edu/humanist/
                     Submit to: humanist_at_princeton.edu

         Date: Sun, 03 Feb 2008 23:06:41 +0000
         From: Willard McCarty <willard.mccarty_at_kcl.ac.uk>
         Subject: the decline-and-fall myth

Lubomir Dolezel begins his article "Possible Worlds of Fiction and
History" (New Literary History 29.4, 1998) as follows:

>The contemporary researcher is engaged in a losing struggle with the
>information explosion. The struggle is especially desperate in
>interdisciplinary research, where no one can master all the
>published literature in all the special fields. As interdisciplinary
>investigations become more and more necessary, they become more and
>more difficult. (p. 785)

Except perhaps for the threat to interdisciplinarity, Dolezel's
statement must remind nearly everyone here of some other occasion
when someone has declared an imminent Information Deluge. Vannevar
Bush said more or less the same thing in his famous piece in Atlantic
Monthly, "As We May Think" (July 1945), in which he imagined
information technology not as the cause of the problem but as its solution:

>There is a growing mountain of research. But there is increased
>evidence that we are being bogged down today as specialization
>extends. The investigator is staggered by the findings and
>conclusions of thousands of other workers -- conclusions which he
>cannot find time to grasp, much less to remember, as they appear.
>Yet specialization becomes increasingly necessary for progress, and
>the effort to bridge between disciplines is, correspondingly,
>superficial. Professionally our methods of transmitting and reviewing
>the results of research are generations old and by now are totally
>inadequate for their purpose.... The summation of human experience
>is being expanded at a prodigious rate, and the means we use for
>threading through the consequent maze to the momentarily important
>item is the same as was used in the days of square-rigged ships.

A few days ago I listened to a lecture by a noted researcher
specializing in mobile telephony, who did not construct a Deluge from
an unquestioned surfeit of information (whatever that is) but spoke
of our radically increasing connectedness as a similar Force
threatening all sorts of things, such as peace of mind. I am using
capitalization here, signifying personification, because mingled with
the facts of the matter and propelling them along, like the water of
a torrent carrying along rocks, tree branches and detritus of all
kinds, I hear the panic of a decline-and-fall, hell-in-a-handbasket
myth, which requires some god or superhuman force of destiny. I
suspect that there are facts to be got at, though every attempt I've
heard to calculate how much information there is and to predict how
much there will be never gets beyond measurable quanta (now commonly
reckoned in petabytes) to human perception and the psychology of
attention. It never questions *what information is*. I also find
myself wondering about a possible relationship between the supposed
decline-and-fall of the world and the certain biological
decline-and-fall of the doom-sayer.

Be that as it may, I wonder if a better response to Dolezel (and
mutatis mutandis to the others) is to point out that the real problem
in his formulation of the threat to interdisciplinarity is his
assumption that one should "master all the published literature in
all the special fields", as if the knowledge worth having were the
linear product of a cumulative process. Isn't it wiser to respond by
revising our idea of what knowledge is and to rethink our strategies
for getting it? Isn't it better to observe what we are in fact doing,
e.g. with Google's 20 petabytes of data per day
(http://www.niallkennedy.com/blog/2008/01/google-mapreduce-stats.html)
and figure out how to do it well?

Comments?

Yours,
WM

Willard McCarty | Professor of Humanities Computing | Centre for
Computing in the Humanities | King's College London |
http://staff.cch.kcl.ac.uk/~wmccarty/. Et sic in infinitum (Fludd 1617, p. 26).
Received on Wed Feb 06 2008 - 17:57:56 EST

This archive was generated by hypermail 2.2.0 : Wed Feb 06 2008 - 17:57:59 EST