15.474 minds and bodies

From: Humanist Discussion Group (by way of Willard McCarty <w.mccarty@btinternet.com>)
Date: Sat Jan 26 2002 - 04:21:45 EST


                   Humanist Discussion Group, Vol. 15, No. 474.
           Centre for Computing in the Humanities, King's College London
                   <http://www.princeton.edu/~mccarty/humanist/>
                  <http://www.kcl.ac.uk/humanities/cch/humanist/>

       [1] From: "McCullers, Jeff" <JeffM@lee.k12.fl.us> (220)
             Subject: RE: 15.467 minds and bodies

       [2] From: Willard McCarty <w.mccarty@btinternet.com> (22)
             Subject: artificial minds and bodies

    --[1]------------------------------------------------------------------
             Date: Sat, 26 Jan 2002 09:17:42 +0000
             From: "McCullers, Jeff" <JeffM@lee.k12.fl.us>
             Subject: RE: 15.467 minds and bodies

    Dear Willard,

    I thank you so very much for including me in your list and then posting this
    fine conversation as my initiation into this topic. It is with trepidation
    that I post at all, being that I am both a newcomer to your community and
    quite the dilettante in such matters (after all, I'm more a bureaucrat than
    a scholar). However, I have been delightfully provoked, and so I blithely
    risk humiliation in the interest of bettering my knowledge.

    I enthusiastically agree with Charles Ess' comment that "these technologies
    indeed help us expand (sometimes dramatically) our sense of connection with
    a larger world" especially when they are thoughtfully and purposefully
    developed to reduce the awareness of the interface between the machine and
    the user.

    Although I certainly lack a full appreciation for the details of the
    mind-body "problem" which appears to be a fundamental issue, I can certainly say that
    my own experience with well-developed tools includes some
    evolutionary/behavioristic aspects. What I am trying to say is that when I
    use a tool that has a particularly intimate and transparent interface (such
    as a computer keyboard), there is an amplification of ability (of power?) to
    which my mind and body respond. The keyboard allows me to transcribe my
    thoughts almost as nearly as I can think them, which means that the act of
    constructing any sentence is completely recursive in the way that has so
    fascinated Hofstadter--in fact, I began writing this very sentence without
    any notion that by the time I got halfway through it, it would include this
    self-referential clause. When writing in my own hand with a pen or pencil,
    the laboriousness of that interface inhibits recursiveness and discovery and
    my sentences tend to be shorter, more predictable, and with diminished
    import. This means that the keyboard, while being manipulated by my body
    with such intimacy that it feels like it IS part of my body, has actually
    become a primary factor in the construction of abstract thought. In this
    sense, the keyboard (a physical tool) works to no small extent in the same
    way that language works as an innate tool to focus (and limit) the
    construction of abstract thought.

    All of this is no doubt elementary to subscribers of this list, so I beg
    your indulgence as I finally get around to making my point: When Charles Ess
    notes that these tools expand our connection with a larger world, I concur
    in the sense that these tools become so much a part of us that there is an
    evolutionary/behavioristic effect. We receive positive and powerful
    reinforcement from the use of this tool in a biological sense. To me, that
    is a fundamental "connection with a larger world" although I may be twisting
    his intended meaning somewhat for my own purposes.

    A secondary thought: a lot of the computing I do is text-based, so the
    QWERTY keyboard comes to mind as my first example, but it is not my only
    one. Other non-text, non-language-related tools achieve this same intimacy,
    and thus the same evolutionary reinforcement. If you'll pardon a low-tech
    example, my grandfather's ancient woodworking tools are astonishingly light,
    quick, and pleasing in my hand, to such a degree that I might use them
    simply for the pleasant sensation of doing so. This raises an interesting
    question: what do we make of a tool that has such a successful interface
    that we desire to use it purposelessly? Is that tool now better seen as a
    recreational drug, an evolutionary dead-end, a toy, or what?

    My thanks again for letting me watch you good people think out loud.

    Kind regards,
    Jeff
    _________________________________________

    J. F. "JEFF" MCCULLERS
    Program Administrator
    Department of Grants and Program Development

    School District of Lee County
    2055 Central Avenue
    Fort Myers, Florida USA 33901-3988

    -----Original Message-----
    From: Humanist Discussion Group (by way of Willard McCarty
    <w.mccarty@btinternet.com>) [mailto:willard@LISTS.VILLAGE.VIRGINIA.EDU]
    Sent: Wednesday, January 23, 2002 3:52 AM
    To: humanist@Princeton.EDU

                     Humanist Discussion Group, Vol. 15, No. 467.
             Centre for Computing in the Humanities, King's College London
                     <http://www.princeton.edu/~mccarty/humanist/>
                    <http://www.kcl.ac.uk/humanities/cch/humanist/>

               Date: Wed, 23 Jan 2002 08:50:50 +0000
               From: Charles Ess <cmess@lib.drury.edu>
               Subject: minds and bodies

    [The following exchange between Charles Ess and me I thought should be
    circulated to members of Humanist, and happily Charles agrees. More on this
    subject would be most welcome. --WM]

    > From: Willard McCarty <w.mccarty@btinternet.com>
    > Date: Tue, 22 Jan 2002 09:53:46 +0000
    > To: Charles Ess <cmess@lib.drury.edu>
    >
    > Charles,
    >
    > My first thought is to put your note out on Humanist, where this
    > discussion should really be taking place. That is, I'd like our
    > conversation to be overheard, and I would hope thereby to provoke
    > others. Let me know what you think about this.
    >
    > I am grateful to be pointed to the Engelbartian sense of mind-and-body
    > in the GUI. Where I think the mind/body problem comes up is in the
    > prevailing "impression of information" (as Geoffrey Nunberg brilliantly
    > calls it) which seems all over the abstract discussions of the
    > technology. The builders of systems (to extend a point made by Jerry
    > McGann) are conducting a much more interesting and intelligent
    > metatheoretical discourse in their acts of construction. Philosophy in
    > the building of things. Or, as Ian Hacking says, in intervening in the
    > world.
    >
    ....
    > Yours,
    > W
    >
    > At 02:20 22/01/2002, you wrote:
    >> Dear Willard:

    I'm not sure I'd say that the mind/body problem is central to _everything_ we do
    - or if it is, it would be so in an equivocal way.

    That is: what is wonderful about the Bardini text on Engelbart is that it
    makes clear that the devices and general interface that you and I and most
    computer users take for granted - i.e., the mouse, the GUI, even the
    keyboard - emerge from Engelbart's commitment to a kind of symbiosis and
    co-evolution between human and machine. Specifically: his interest in the
    kinesthetic aspects of human knowing - that began, in some measure, with his
    work as a radar technician in WWII - represented a _non-Cartesian_
    epistemology, i.e., the "mind-and-body" notion that Barbara Becker so nicely
    labels with the German neologism "BodySubject" (_LeibSubjekt_).

    This approach was clearly and consciously in conflict with the then
    prevailing approach, especially in AI, that basically took off from the
    Cartesian view - i.e., a mind as divorced from body. Hence all the
    attention to mind as a mechanism that could be - ostensibly, though now we
    are considerably more modest on this point - reproduced in a computational
    machine whose interface with the larger world ran from nonexistent (no
    sensory inputs) to user-hostile: minimal read-outs and user inputs. The
    thought was apparently that once an AI was so constituted, it would take off
    down its own, putatively superior evolutionary road, leaving us puny humans
    deservedly in the dust.

    FWIW: while I enjoy tinkering with operating systems such as DOS, UNIX,
    Linux, etc., that bypass the less demanding GUI-based elements to let the
    user "speak" somewhat more directly to the machine - Engelbart's more
    democratic notion of making computing machines accessible to the many is
    admittedly closer to my own sense of the better possible uses of these
    devices. That is: somewhat as Pirsig wrote convincingly in his famous _Zen
    and the Art of Motorcycle Maintenance_, _contra_ the romantics who believe
    we are more genuine and more genuinely in touch with "nature" when we strip
    away our technologies - at least up to a point, it seems to me that these
    technologies indeed help us expand (sometimes dramatically) our sense of
    connection with a larger world. At least insofar as they are designed along
    "Engelbartian" lines - i.e., precisely so as to minimize the sense of
    "interface" between user and machine by devoting as many computational
    resources as possible to help the machine "fit" the natural/cultural human
    ways of knowing and acting in the world as embodied beings and kinesthetic
    knowers. The more "user-friendly" the interface, the more of us will be
    able to enjoy and take advantage of this symbiotic evolution. In this latter
    direction, then, rather than assume a mind-body split/problem
    - we seem to assume the identity of human beings as "mind-and-body" - one
    capable of closely interacting with at least well-designed machines, and in
    ways that might mutually enhance the development of each.

    (Addendum: this raises an interesting counterpoint to Albert Borgmann's
    important analysis of "focal activities" in his 1984 volume on _Technology
    and the Character of Contemporary Life_. Such focal activities (gardening,
    cooking and eating together, etc.) are offered as something of an antidote
    to the ways in which technological devices offer increasing convenience -
    but at the cost of greater internal complexity that quickly defeats the
    ability of users to understand, much less manipulate or repair the device,
    thus making us ever more dependent on technologies and their supporting
    infrastructures. What Pirsig suggests - and Borgmann later exemplifies with
    his discussion of the Altair 8800 computer as a device so basic that the
    user's required understanding of binary, logic switches, and programming
    allowed for an experience he characterizes in terms of coherence, intimacy,
    transparency, and comprehensibility [_Holding on to Reality_, 165] - is that
    we can also experience something of the sense of skill mastery and
    engagement with others and our environment _through_ technologies, not
    always _against_ technologies. Similarly, our engagement as
    "minds-and-bodies" with well-designed machines in what Engelbart
    characterizes as symbiotic co-evolution would perhaps stand as an example of
    a "technologically-mediated" focal activity?)

    In light of this, I guess I would say that the mind-body problem perhaps
    lurks behind all we do with the delightful little beasties - but as we
    pursue this second direction, it seems to presume that there _is_ no
    (significant) mind-body problem: rather, we seek to address both
    mind-and-body by taking up considerable amounts of computing resources to
    help the machine interface with an embodied knower - ultimately, through all
    of our senses. (Hence the excitement with multi-media?)

    Thanks for providing me the occasion to tap out something I've been mulling over
    but haven't really worked up yet. Your thoughts and comments would be most
    welcome!

    >> Charles Ess
    >> Director, Interdisciplinary Studies Center
    >> Drury University
    >> 900 N. Benton Ave. Voice: 417-873-7230
    >> Springfield, MO 65802 USA FAX: 417-873-7435
    >> Home page: http://www.drury.edu/ess/ess.html
    >> Co-chair, CATaC 2002: http://www.it.murdoch.edu.au/~sudweeks/catac02/
    >> "...to be non-violent, we must not wish for anything on this earth
    >> which the meanest and lowest of human beings cannot have." -- Gandhi
    >>
    >>> From: Willard McCarty <w.mccarty@btinternet.com>
    >>> Date: Mon, 21 Jan 2002 06:46:08 +0000
    >>> To: Charles Ess <cmess@lib.drury.edu>
    >>> Subject: minds and bodies
    >>>
    >>> Charles,
    >>>
    >>> I hope you were knowingly cooperating with my editorial persona, whom
    >>> I almost always allow to speak on a rather thin diet of knowledge but
    >>> copious amounts of boldness :-). Just the sort of thing I was hoping
    >>> that such a message would provoke -- a reading list for the
    >>> incompletely trained. Thanks very much indeed. Would you mind sending
    >>> me an offprint when your article sees the light of day & the darkness
    >>> of night?
    >>>
    >>> BTW, do you think (as I can only suspect) that the mind/body problem
    >>> is central to everything we do with our beloved machines?
    >>>
    >>> Yours,
    >>> W
    >>>
    >>>
    >>> Dr Willard McCarty, Senior Lecturer,
    >>> Centre for Computing in the Humanities, King's College London,
    >>> Strand, London WC2R 2LS, U.K.,
    >>> +44 (0)20 7848-2784, ilex.cc.kcl.ac.uk/wlm/,
    >>> willard.mccarty@kcl.ac.uk, w.mccarty@btinternet.com

    --[2]------------------------------------------------------------------
             Date: Sat, 26 Jan 2002 09:18:06 +0000
             From: Willard McCarty <w.mccarty@btinternet.com>
             Subject: artificial minds and bodies

    The latest issue of the Times Literary Supplement (25 January, no. 5156),
    contains an interesting review of Victoria Nelson, The Secret Life of
    Puppets (Harvard), on the artificial bodies of human simulacra, including
    robots, androids and cyborgs. If anyone here has read this book, I for one
    would appreciate comments on it. The reviewer, Edward Sidelsky, argues that
    "Idolatry -- and indeed all forms of spiritualism -- mistakes spirit for a
    kind of thing, an object that can be grasped and manipulated. The spirit is
    seen as something existing alongside the body, as a 'subtle' or 'astral'
    body. This leads to the superstition that it can somehow be separated from
    the body, or transferred without loss to another body (an idol, mannikin,
    robot or computer).... The ban on idolatry in Judaism, Christianity and
    Islam springs from a fundamental religious insight: the spirit is not
    static and thing-like, and so cannot be housed in any thing." He goes on to
    comment that the apparent revival of religious interest in recent sci-fi
    films -- where artificial bodies, thanks to computer technology, become
    instruments of redemption -- is really a revival of idolatrous paganism.

    Yours,
    WM

    Dr Willard McCarty, Senior Lecturer,
    Centre for Computing in the Humanities, King's College London,
    Strand, London WC2R 2LS, U.K.,
    +44 (0)20 7848-2784, ilex.cc.kcl.ac.uk/wlm/,
    willard.mccarty@kcl.ac.uk, w.mccarty@btinternet.com



    This archive was generated by hypermail 2b30 : Sat Jan 26 2002 - 04:34:11 EST