7.0320 MLA Computing Forum December 1993 (1/1008)

Elaine Brennan (EDITORS@BROWNVM.BITNET)
Mon, 29 Nov 1993 21:33:41 EST

Humanist Discussion Group, Vol. 7, No. 0320. Monday, 29 Nov 1993.

Date: Sun, 28 Nov 1993 19:48:58 -0500 (EST)
From: ian@epas.utoronto.ca (Ian Lancashire)
Subject: mla forum on computing







MLA FORUM: Reconfiguring the Discipline in the Electronic Age:


An Information Package

November 1993



Available from ian@epas.utoronto.ca

or

WordPerfect (binary) file: infopapr.wp5
DOS (ASCII) file: infopapr.dos
available
by anonymous ftp from
epas.utoronto.ca (/pub/cch/mla)
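
For readers new to anonymous ftp, here is a minimal retrieval sketch,
assuming only Python and its standard ftplib module; the host, directory,
and file names are those listed above, and everything else is illustrative.

    # Minimal sketch: fetch the information package by anonymous ftp.
    from ftplib import FTP

    ftp = FTP("epas.utoronto.ca")      # archive host named above
    ftp.login()                        # anonymous login
    ftp.cwd("/pub/cch/mla")            # directory holding the package

    # ASCII transfer for the DOS text version.
    with open("infopapr.dos", "w") as f:
        ftp.retrlines("RETR infopapr.dos", lambda line: f.write(line + "\n"))

    # Binary transfer for the WordPerfect version.
    with open("infopapr.wp5", "wb") as f:
        ftp.retrbinary("RETR infopapr.wp5", f.write)

    ftp.quit()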

Contents:

1. Purpose and Schedule of Forum and Workshops

2. Forum Speakers' Abstracts

3. Personal Statements by Forum and Workshop Chairs

4. Select Bibliography

1. PURPOSE AND SCHEDULE OF FORUM AND WORKSHOPS


(a) Forum

A forum on `Reconfiguring the Discipline in the Electronic Age' is being
held at the 1993 Modern Language Association Convention in Toronto
because many MLA members believe that information technology has
truly revolutionized our profession, not just by changing the mechanics
by which we teach students and disseminate our research, but
substantively by altering our knowledge itself. Computing has
challenged theories of intertextuality, narrative, reading and authorship,
shifted our understanding of the `canon' by merging different kinds of
text and media electronically, fused the teaching of idiomatic language
with the teaching of culture, and given the critic, scholar and editor new
tools for analysis and publication.

The purpose of the Forum and Workshops is to acquaint the profession
with the new potential of technology and to ask the questions that its
emergence poses. Computing offers better tools to the social sciences
and the humanities alike, but for the MLA membership it is
fundamentally reshaping our knowledge, on one hand because it is being
used to model writing and reading processes of common interest to all,
on the other because it is breaking down the distinction between text and
film, language and culture, and writing, thinking and speaking.
Electronic media are `reconfiguring' our disciplines.

Participants in the forum and two workshops, one on teaching and the
other on research, will discuss the intellectual impact of new technologies
on the profession. The forum and two workshops will not include live
demonstrations (these will be handled in the convention's demonstrations
room) but will present supporting information in brief hand-outs or short
videotapes. Our focus will be on arguments about the effects of
technology on teaching and research in language and literature, not on
existing software.

Many MLA members believe that the profession is currently
experiencing a turning point in humanities computing that makes this
forum of vital interest. The computer medium has matured to a new
level of usefulness in text- and language-based teaching and research.
For a long time, computing was synonymous with bibliographies,
language drills, quantitative frequency lists of words and parts-of-speech,
word processing, and desktop publishing. Recently, computer
applications have moved from these marginal or supporting activities
towards issues more central to the profession, in part because the
enabling technology has become more capable, flexible, and native to the
needs of sophisticated literary and language studies. Several technical
developments have produced this more plastic and expressive medium.
Computers have become more powerful, faster, and less costly in the
past decade, making it possible to do complex processing easily, to store,
share and retrieve large bodies of texts and images, and to provide
language laboratories, writing centers, and classrooms with advanced
equipment at a fraction of the early costs. Software has become more
sophisticated and also inexpensive; hypertext, large text database
management systems, literary works in electronic form, foreign character
sets, text-analysis packages, computer-aided design programs (such as
can create `walk-through' models of theatres), digital image-analysis
techniques (for studying handwriting and typeface in analytic
bibliography), and multimedia systems that combine moving images with
text are available in off-the-shelf form. The Internet, with its gateways
to library catalogues, online academic journals, academic and
government databases, and to the MLA membership itself, has enabled
members of the profession to `work at' distant institutions and to
conference with collaborators elsewhere daily, not just semi-annually.
Networks have also enabled teachers to extend their student seminars
outside the classroom. Scholarly communication has been explosively
democratized, at little or no personal cost to MLA members.

As computers reveal cognitive processes of reading and writing in
literature, so they nurture them in helping students to think and write.
Word processing can facilitate revision, but in writing instruction the
computer-supported power to annotate and store text while linking
reviewers also allows the modelling and study of revision strategies
themselves. The inclusion of sound and graphics in texts calls for new
rhetorical understandings of structure, coherence and argumentation. In
foreign languages, communicative language-learning methods have found
a new home in a multimedia environment in which language can be
presented as it is used among native speakers rather than as a form of
sterile `teacher talk.' Dense linguistic and cultural material can be
presented on the computer because the student can replay segments and
call up subtitles, glossaries, and cultural notes rather than being passively
overwhelmed, as one is by raw videotape. In teaching, literary works are
being presented in new electronic editions with multiple texts, critical
collation, film versions, and apparatus ... with almost no size limitations.

In research, hypertext and multimedia -- and the `virtual edition' they
make possible -- have empowered reading to the point where it goes
well beyond human cognitive and perceptual constraints, challenging
intertextuality theory to propose workable models. As well, artificial
intelligence, computational and corpus linguistics, neuropsychology and
cognitive studies are approaching consensus on how people process
language. This consensus has an immediate bearing on how our authors
think, read and write. Computational stylistics has broad-based evidence
that writing exhibits national, dialectal, period, idiolectal and even gender
`signatures' of which authors are largely unaware. In practice, online
text libraries are also opening up entire literatures, as in women's
studies, that have long lain obscured.

In creative writing, the more capable computer is an aesthetic medium,
host to new literary forms made possible by freedom in the electronic
presentation of text and image. Hypertext and interactive fiction are
good examples.

Forum speakers appeal to the profession in its widest definition:
researchers and teachers of literature, language and writing. Norman
Holland (University of Florida) has popularized psychoanalytic and
cognitive approaches to both readers and authors in English, Michael
Riffaterre (Columbia University) has elaborated structural stylistics and
discourse analysis in French studies, and Cynthia Selfe (Michigan
Technological) has persuasively shifted discussion of the theory and
practice of computers in writing from the margins of the association into
its center.

The forum chairs are:

Ian Lancashire (Toronto)
Janet Murray (MIT)

The topics will be:

The Cybercritic (Norman N. Holland, University of Florida,
Gainesville)

The Problem of Intertextuality (Michael Riffaterre, Columbia
University)

Politicizing and Inhabiting Virtual Landscapes as Educational Spaces
(Cynthia L. Selfe, Michigan Technological University)


(b) Workshops

The two workshops will explore the impact of recent technical advances
on what we think and do.

The workshop on Teaching in the Electronic Age will explore these
issues by having a number of scholars discuss the types of classroom
applications created by the new technologies. Computers and related
technologies offer challenges to instruction in the college study of foreign
languages, literature and writing. Computer-supported assignments can
make students' involvement with their studies more active and authentic while
also changing the relationship between student and teacher. These
speakers will describe innovative projects in the context of theory in their
field.

Chairs: Helen Schwartz (Indiana University-Purdue University at
Indianapolis)
James J. Sosnoski (Miami University)

1. The Computer as Context in Foreign Language Teaching. Peter C.
Patrikis (Consortium for Language Teaching and Learning, New Haven).

2. Modelling Narrative Theory. Peter Havholm (College of Wooster) and
Larry Stewart (College of Wooster).

3. Multimedia Computer-based Instruction in the Teaching of Writing.
Mark Ferrer (University of California at Santa Barbara).

Dr. Patrikis, Executive Director of the Consortium for Language
Teaching and Learning and a member of the MLA Advisory Committee
on Foreign Language and Literature, will discuss how multimedia can
simulate the context of the target culture and language and how it can
create a context of language acquisition and use. He will relate the
capabilities of multimedia to recent theoretical advances in language:
contextualization of utterance, sociolinguistic codes, and the rhetoric of
learning. This presentation will review some recent efforts to
contextualize and to alter classroom teaching and learning.

Illustrating how computers can involve students in testing and applying
literary theory are two educators honored for Distinguished Curricular
Innovation by an EDUCOM/NCRIPTAL award in 1989. Peter Havholm
and Larry Stewart will discuss a computer program they have developed
which students use to model the operation of narrative theory. The
program allows students to create a generator that operationalizes the
theory's criteria and produces results (in this case skeletal narratives) that
test both the theory and the students' understanding of it.
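
The program itself is not reproduced here, but the sketch below suggests,
in Python and purely as a hypothetical illustration (not Havholm and
Stewart's actual software), how a theory's criteria can be encoded as
preset variables and operationalized as a generator of skeletal narratives.

    # Hypothetical sketch of a rule-based skeletal-narrative generator.
    # The lists stand in for criteria a student derives from a theory.
    import random

    heroes = ["the youngest daughter", "a landless knight"]
    lacks = ["a lost ring", "a stolen name"]
    donors = ["an old beggar woman", "a talking fox"]
    endings = ["the lack is liquidated", "the hero is recognized"]

    def skeleton():
        """Build one skeletal narrative from the preset variables."""
        return (f"{random.choice(heroes)} suffers a lack ({random.choice(lacks)}); "
                f"{random.choice(donors)} supplies a magical agent; "
                f"finally, {random.choice(endings)}.")

    for _ in range(3):
        print(skeleton())

Running such a generator lets students see at once whether the criteria
they have encoded actually produce well-formed narratives.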

Mark Ferrer will speak on `The Present State and Future Role of
Multimedia in the Teaching of Writing'. He will show how
constructivist theories of writing are supported by multimedia's power to
enhance communication and collaboration and to increase motivation and
learning. The talk will be illustrated by
students' computer-based course work.

The workshop on Textual Research in the Electronic Age will explore
how `virtual editions' implicitly modify the nature of reading, thinking
about, and remodeling the text in the Electronic Age. Intertextuality and the
empowerment of readers, users, and viewers to `annotate' the text with
variant versions, critical commentary, performance moments with sound
and full-motion video (in the case of Shakespeare's plays), and their own
comments make a virtual text that challenges the notion of canonicity.
The audience will be invited to consider what impact this new set of
textual dynamics will have upon the profession as well as to see new
convergences and combinations of critical methods.

Chair: Joel Goldfield (Plymouth State College)

1. Intersections of Teaching and Research: The Construction of the
<it>Charrette</it> Database. Karl D. Uitti (Princeton University),
Gina L. Greco (Portland State University), Toby Paff (Princeton
University)

2. The Shakespeare Interactive Video Archive: Integrating Textual and
Performance Study in the Age of Multimedia. Peter S. Donaldson
(Literature, MIT)

3. Electronic Texts and Tools: What Do They Do and Are They Better
than Books? Elaine Brennan (Brown University).

Karl Uitti, Gina Greco, and Toby Paff will speak on `Intersections of
Teaching and Research: the Construction of the Charrette Database.'
The medieval vernacular poem <it>Lancelot</it> (1180) exists in a
fluid, even problematic, state in eight variant manuscripts. No single
manuscript is the work. Since the poem, written mainly by Chrétien de
Troyes but partially also by scribes through their alteration of the text,
is actually a `virtual' one, Karl Uitti treats it as such, displaying
variants and commentary seamlessly according to critical needs.
Employing a team of graduate students, post-doctoral researchers, and
database professionals, Uitti is constructing a database to answer
linguistic, editorial, codicological, and rhetorico-poetic inquiries.
Exercises also permit students to come to grips, in direct ways, with the
realities of medieval French textuality.

Peter Donaldson will speak on `Theory and Development of the
Multimedia Shakespeare Projects.' His NEH projects seek to provide
literary and performance-based `presence' for both students and scholars
of Shakespeare. His computer-based projects on <it>Hamlet</it>,
<it>Romeo and Juliet</it>, and other Shakespearean plays will give
access to performance moments, commentaries, lexica, on-line tutorials,
multimedia notetaking or montage, and text archives, thereby linking the
textual with the theatrical, and empowering the user to create multimedia
documents for critical commentary. The theoretical underpinnings
related to reception studies, hypermedia, and intertextuality will be
explored.

Elaine Brennan, a member of the MLA Committee on Computers and
Emerging Technologies, will speak on `Electronic Texts and Tools: What
Do They Do and Are They Better than Books?' As editor of the Women
Writers' Project at Brown, and as moderator of the electronic discussion
group <it>Humanist</it>, she has a critical perspective on the use
students and scholars make of networked texts, including browsers,
indexers, and hypertext linkage among texts. She will report on needs,
actual applications and recommendations related to her activities and
tools.

2. FORUM SPEAKERS' ABSTRACTS

(a) Norman N. Holland. `The Cybercritic'.

Computers are deeply changing the premises of our profession, but not
in the ways usually said. Hypertext and multimedia simply improve
(dazzlingly, to be sure!) our familiar technology of footnotes,
bibliography, tables, and illustrations. Similarly, those interactive
fictions that only allow readers to choose among different plot sequences
do not fundamentally change the existing relation of an active reader to
an essentially passive text. Hypertext and multimedia do not confirm the
erroneous (neo-Saussurean) claims by theorists that texts cause mental
events.

One computer technology does lead to truly active literary works, texts
that do things. These are the programs embodying rule-based characters
and plots, programs like <it>Eliza</it>, <it>Parry</it>,
<it>Conversations</it>, <it>Oz</it>, and some interactive
fictions. The literary `pre-creation' consists of variables that the author
inserts in the program. When the reader talks to the program, it
responds to each reader's unique initiatives in ways that are equally
unique but bounded by the variables preset by the writer. The final,
physical text that results is created partly by the writer and partly by the
reader. It will differ for each reading. Such programs take literature
into forms it has never had before, radically challenging today's concepts
of text, reader, and writer.
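
A minimal sketch, in Python, of the kind of rule-based responder Holland
has in mind; it is not the code of <it>Eliza</it> or of the other
programs named, only an illustration of how author-preset variables bound
what each reader's input can elicit.

    # Hypothetical rule-based 'character': the author presets the rules,
    # the reader supplies the input, and the resulting text is co-created.
    import re

    RULES = [
        (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
        (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
        (re.compile(r".*", re.S), "Go on."),
    ]

    def respond(reader_line):
        """Answer a reader's line with the first matching authorial rule."""
        for pattern, template in RULES:
            match = pattern.search(reader_line)
            if match:
                return template.format(*match.groups())

    print(respond("I feel lost in this library"))
    # -> Why do you feel lost in this library?

Each reading produces a different physical text, bounded by the rules the
writer set but shaped by what the reader chooses to say.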

(b) Michael Riffaterre. Not yet available.

(c) Cynthia Selfe. Not yet available.

3. PERSONAL STATEMENTS BY FORUM AND WORKSHOP
CHAIRS

(a) Joel D. Goldfield
Plymouth State College
New Hampshire

As a teacher and researcher in the fields of French language/culture and
literature first, as a foreign language methodologist second, and as an
educational technologist third, I view technology in our
language/literature fields as a force and vehicle for merging media and
thus ideas in teaching and research. The effervescent synergy in our
profession parallels that in the realm of popular culture where we are
witnessing the melding of telephony, television, information,
transactional needs and entertainment. Just as many members of this
latter group may shy away from the complexity necessary to master
information access and manipulation, so may many in our profession.
We need colleagues who will facilitate our research and teaching goals
through helpful technological implementations of communications tools,
databases and their querying tools, and through the creation of
high-quality materials for computer-assisted language/culture learning.
But what information managers might term our "databases" are really far
more than, say, clients' names and addresses for us: they are actually
primary and secondary texts, our encyclopaedias and thesauri as well.
They have a history of circulation and debate. In book form they have
a smell and feel. Will future generations be immaterial enough to forego
these last aspects?

Nostalgia for the physical properties of books aside, we need to take a hard
look at the way we read, the ways we would like our students to read,
the authority of the text, and the appropriate uses of various analytical
tools to derive meaning from the text. It is time to carry out an
epistemological survey of the profession. Most of us probably believe
that the avenues to pursue any and all types of knowledge should remain
free from constraint. Freer access to various media, greater ease in their
synchronous or sequential connection, and necessarily new methodologies
for skillful syntheses, interpretations and criticism will lead ever more of
us to be interdisciplinary and beholden to no narrow critical approach.

We are likely to find new collegial strengths in our teaching and
research, as many of us are already finding in integrative core curricula,
panel sessions and grant applications. The small steps we are taking in
many technology-facilitated applications enable us to start extrapolating
our paths of exploration. There are computer-based textual analysis tools
in common use among undergraduates, graduate students and faculty;
electronic, bilingual readers with highly useful querying tools and modes
of literary and morpho-syntactical commentary economically or
technically unfeasible for hardcopy text; computer- and videodisc-based
teaching and research tools for Shakespearean drama and Classics;
foreign language writing environments complete with grammar primers,
functional/notional cross-referencing, transactional language video clips,
etc. We can expect students and faculty to carry the knowledge thus made
available into courses and forums beyond the ones for which it was created.

Hypermedia-based authoring tools allow researchers and teachers to link
textual, graphic, audio and video materials as a presentation aid in the
classroom and to create an "intelligent tutor" with the ability to interact
with the self-paced individual or groups of users located on- or
off-campus. Interactive television already allows two-way communication
for pay-per-view classes in a virtual campus, paving the way for
unknown additional numbers and types of students to participate in what
might be space-constrained or otherwise restricted classrooms. I suspect
that this powerful combination of new information channels and
interactive modes will contribute to an alteration of the economic and
pedagogical assumptions regarding the way we teach and engage in
research, as well as of how these activities are evaluated by students,
administrators and our communities.

The political ramifications within the profession regarding access to, use
and assessment of these new information sources (electronic texts,
archives, etc.), tools, equipment, research methodologies, pedagogical
implementations and other service-related applications will be significant.
As we debate the roles for all these, we must guide and support the
professional organizations which represent our best interests and those of
our students in the concomitant reconfiguration of the profession.

(b) Ian Lancashire
Department of English
University of Toronto

Literary scholarship, perhaps like archaeology, restores the work of past
cultures by enabling us to understand their writings as contemporaries
might have. Computer database, text-retrieval and analysis software,
electronic text corpora, and an Internet capable of linking researchers
to each other and to online libraries empower individual researchers, in
this process of restoration, to reach out to far more texts, reference
materials, and colleagues in five years than would have been possible in
a lifetime of hard work only a decade ago. Modern languages that in
1985 watched with astonishment as the entire remains of ancient
languages like Classical Greek and Old English were made available
online are now seeing their own major authors, discipline bibliographies,
and even period collections of texts and historical dictionaries appear in
electronic form. With new standards emerging for electronic text
encoding, scholarly editors of literary texts are seeing that the electronic
editions now being prepared in their studies can form the living basis for
every edition of that text in the future. A renewed battle of the books,
paper and electronic, is at hand.

The information of information technology will then make scholarship
more accurate and comprehensive, but just as exciting are the new
models for thinking about texts and authors that emerge at the boundary
points where computers bring together different disciplines. Literary text
analysis and cognitive psychology both use electronic texts, the former
to study style and content, the latter to gauge human memory.
Computational linguistics devises parsing techniques for representing
meaning unambiguously in machine translation systems, while language
teachers create software that enables students to study translation in
process interactively. Whatever theoretical perspective we begin from,
we are going to learn a great deal about authors and texts in the next
decades. The diversity of approaches, and the common choice of tools,
increasingly force us to re-examine our assumptions about language and
meaning.

Information technology invites fundamental questions about our subject.
What is a text when it forms part of a hypertext, or of holdings on the
Internet? What happens when texts have electronic addresses rather than
being physical artifacts? How do electronic text libraries affect
word-meaning? If a critical edition derives from all principal early texts
of a work, should the textual collation be done automatically and
interactively, by software, on the basis of electronic copies of those texts
rather than as manual apparatus to a single conflated text? Given the
capabilities of information technology, how many kinds of scholarship
will have to be redone from scratch?
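
As a minimal illustration of what automatic collation of electronic copies
might look like (a sketch only, using Python's standard difflib and two
made-up witnesses of a well-known line, not a real apparatus):

    # Align two electronic witnesses word by word and report divergences.
    import difflib

    witness_a = "Was this the face that launched a thousand ships".split()
    witness_b = "Was this the face that launcht a thousand shippes".split()

    matcher = difflib.SequenceMatcher(None, witness_a, witness_b)
    for tag, a1, a2, b1, b2 in matcher.get_opcodes():
        if tag != "equal":
            print(f"{tag}: A reads {witness_a[a1:a2]}, B reads {witness_b[b1:b2]}")

    # replace: A reads ['launched'], B reads ['launcht']
    # replace: A reads ['ships'], B reads ['shippes']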

For the past 30 years, a quite small number of researchers have
computerized texts, developed software, established standards in
encoding, and most of all experimented at the junctions of the humanities
and computer science. My own work has been in directing the
development of, and experimenting with, <it>Textual Analysis
Computing Tools</it> (<it>TACT</it>) at the Centre for
Computing in the Humanities at Toronto, and in compiling <it>The
Humanities Computing Yearbook</it>. Since the early 1980s, when
most of us acquired microcomputers and became our own personal
data-processing professionals, the entire profession has become part of
this group. This social and intellectual change is important in itself and
will continue as more academic products (such as academic journals and
textbooks) become electronic, as we face issues such as proper citation
methods for electronic materials, and as we discover new ways of
extending the classroom, the library, and the study into cyberspace.

(c) Janet Murray
Massachusetts Institute of Technology
Cambridge, Mass.

In my work as a Victorianist and fiction specialist and as the director of
a laboratory for humanities computing projects, I am witnessing the
effects of advanced computing on literary studies, the creation of
narrative, and the field of my major collaborators -- language studies.

Interactive video is changing the ways in which language can be studied,
allowing the introduction of authentic language from the beginning levels
and promising to supplant the textbook as the central medium for
presenting the language. It is also changing the teacher's role from that
of informant and sole representative of fluent (but not always native)
language to that of task-designer, shaping the tasks that students can
bring to the computer-based materials which can serve as a database of
authentic language.

In the field of literary research, the computer allows us to create a
reference and teaching environment in which multiple primary texts,
criticism, and visual materials (such as performances of a play) can all
be accessed, cross-referenced and excerpted. Provided we develop the
right structures for electronic scholarship, the synthetic and concrete
nature of the medium promises to increase the range of things we can
think clearly about at the same time. One look at a cramped and
cryptically annotated variorum edition of a major author is enough to
convince us that paper is an inadequate medium for the complexity of the
humanist's hypertexted thoughts. The new electronic media offer us the
chance to sort things out in a space whose dimensions more closely
approximate our multi-planar minds. We can refer with precision to parts
of texts and chart intersections among various texts, while still preserving
the context and integrity of each individual referent. The criticism that
results from this more capacious and plastic medium could allow us to
synthesize areas of discourse -- e.g. close textual analysis and film
criticism -- with a lucidity and accountability not possible before. The
danger of course is that we will surrender to the seductive plenitude and
get lost in the raptures of cyberspace.

As a medium for literature, the computer allows writers to write
procedurally, to create not just word-objects but a system of presentation,
related to theater, in which the conditions under which words appear are
also specified. It also allows for a literature in which the "reader" can
participate, making choices that affect the story or co-creating a
conversation with an interlocutor. It will ultimately allow us to think
freshly about the patterns of meaning in human life, just as any art form
offers new structures for human experience.

As a teaching, research, and artistic medium the electronic environment
is coming into a new maturity and proving that it is not merely additive
-- not just text plus moving pictures -- any more than cinema is merely
photographed plays. The structures that emerge for this new medium
will be new representations of human thought processes. The classic
processes of humanists -- e.g. the gathering of large variora, the need for
precise reference, the need for contextualization of sources, the emphasis
on multiple points of view -- are ideally suited to shaping the new medium,
making it crucial that we secure access to the technology necessary to
fashion our own tools.

(d) Helen Schwartz
Department of English
Indiana University-Purdue University at Indianapolis

Computers support greater ease and capacity for students to discover and
create through writing. Word processors make revision easier to
accomplish; students still need instruction to learn how to revise, but the
ease of revision reduces the commitment level required. Previously, a
substantial prison term seemed the best incentive for constant reading,
writing and revision, but computer access to research texts, storage and
search capacity and revision tools reduce the difficulty of literary
activities for novice writers. Electronic storage and communication also
support the paradigm of learning and creation of knowledge as a
conversation in a community of scholars. Hard copy has clear
boundaries -- manuscripts pile one atop the other, books in a library sit
next to each other with no visible interpenetration except through
cross-references. But electronically stored text questions the done-ness,
the separateness of text: what constitutes a draft? -- a printout? a
precipitation into publication (paper or electronic)? what are the degrees
of collaboration (co-author, editor, commenter, inspiration)? The lack
of a sense of closure resembles continuing conversation with its open
invitation to add, but also its lack of authority. Computers in writing can
also raise political and ethical questions of equitable access and the ways
deserved authority can be negotiated in such an open space.

(e) James J. Sosnoski
Executive Director
Alternative Educational Environments
Department of English
Miami University (Ohio)

As a teacher of English literature whose research has concerned literary
theory, I believe that emerging technologies will reshape what literature
professors do. The courseware now becoming available to teachers
seems likely to make electronic textbooks more common than printed
ones in the next decade; the explosion of interest in the Internet seems
likely to result in a correlative increase in online research and
publication; and technological advances such as hypertext software seem
likely to necessitate changes in our theories of the production, reception,
and distribution of texts.

Our society is on the verge of becoming a technoculture. This
development has profoundly affected our schools. Card catalogues are
being replaced by computer terminals. Campus mail is now often
e-mail. Texts are electronic. It's faster, easier, and more efficient to
search a bibliographic database than its print equivalent. Information
technology will soon be a pervasive feature of most university campuses.
In sum, as we shift from print environments to electronic ones the
changing conditions of our work invite a reconfiguration of what we do.

The reconfiguration most characteristic of the electronic revolution is
convergence. As technologies merge, the activities associated with them
converge. As phones, TVs, stereos, VCRs, tape recorders, CDs, and
computers merge, our daily activities converge. Games become
education; learning geography metamorphoses into a game; family
pictures turn into TV programs; and home computers transform
themselves into checkbooks that pay bills automatically over phone lines.
Similar effects can be forecast for the future of education as classrooms,
libraries, computers, and telecommunication systems merge. Similar
possibilities exist for departments, many of which are being networked.
However, the effects of such "convergences" will only be as beneficial
as we make them. The persons using technology have the greatest
impact on its cultural effects.

David Downing and I have been studying what can be done to facilitate
literary-critical practices by taking advantage of new technologies.
During the past three years, we have been working on a series of
experiments under the rubric of the Cycles project which investigates the
implications of telecommunications for the conduct of literary criticism
and scholarship. In these experiments, teaching and research tend to
converge, collapsing the traditional divisions of our work. Various
academic sites (such as classrooms, libraries, presses, and scholarly
societies housing newsletters or journals) are linked through
telecommunication into a single networked cycle of critical exchanges.
Whereas in a print environment, scholars might first propose their ideas
to students in a classroom, then present them for debate among their
colleagues at a scholarly conference organized by the relevant
professional society, and finally submit them in publishable form to a
university press, the Cycles project turns this traditional pattern into an
ongoing cycle of critical exchange in which the research conducted is
presented in dialogical form and made available to other researchers
without the time lag required of print processes. In sum, the Cycles
project integrates the functions of a seminar, a textbook, a conference,
a symposium, a newsletter, and a journal through scholarly
correspondence conducted via electronic media following a set of
protocols which facilitate the ongoing dialogue by channeling it through
various stages into its publication in a database as well as in print.

The Modern Language Association could be a leader in these new
directions much as the Association for Computers and the Humanities has
been. We need the MLA, an organization that historically has been
dedicated to our concerns, to explore and assess the implications of
changes in technology which we daily witness. MLA could become a
telecommunication center for its members that allows them to share
documents across local area and wide area networks, a kind of MLAnet,
so to speak. MLAnet could be a vehicle to enable a wide range of
knowledge workers to share their ideas and comments on projects. It
could act as a cooperative textbase, tracking its own development. It
could also gather and archive input from departments, pulling together
material for large-scale brainstorming and grant seeking. Many of the
services which the MLA convention now provides could be more
effectively handled via telecommunication. MLA could provide
guidance, assistance, and even services for language and literature
scholars whose home institutions cannot afford to match the technological
benefits of better endowed universities.

Times have changed and so have we. Just a few years ago, few persons
working in literature departments had e-mail addresses. Recently, a
colleague remarked to me that he was embarrassed to say that he did not
yet have an e-mail address. Let us hope that the post-print era is a
significant period in the history of literary study. Indeed, accessing the
Internet in the 1990s could have more significance for the future of
literary study than becoming a member of the MLA had in the 1890s.
I hope that this forum will be an inaugural event, inspiring an
MLA/ACH coalition (if not a merger) which will allow scholars and
critics to benefit from the resources that technology is making available
to us.

4. SELECT BIBLIOGRAPHY FOR COMPUTING IN THE MODERN
LANGUAGES

This handlist is based on my <it>Humanities Computing
Yearbook</it> (1991) and on additions made to it since then at the
Centre for Computing in the Humanities.

Ian Lancashire, Toronto

(a) Journals and Serials

<it>The CALICO Journal.</it> Quarterly. Frank Borchardt,
Executive Director. Contact: Eleanor M. Johnson, Mgr., 014 Language
Center, Duke University, Durham, NC 27706.

<it>CETH Newsletter.</it> Center for Electronic Texts in the
Humanities, Princeton University and Rutgers The State University of
New Jersey. Director: Susan Hockey. 1992-. Address:
ceth@zodiac.rutgers.edu.

<it>Computers and Composition.</it> Fort Collins, Colorado,
1983--. Three times a year, edited by Gail Hawisher and Cynthia Selfe.
Available from Dept. of Humanities, Michigan Technological University,
Houghton, MI 49931.

<it>Computational Linguistics. Journal of the Association for
Computational Linguistics</it>. 1984--.

<it>Computers and the Humanities.</it> 6 times per year. [The
official journal of the ACH, membership in which includes subscription
to this journal. Contact Charles Bush, Humanities Computing Center,
Brigham Young University.]

<it>Computers and Philosophy Newsletter</it>. Three times per
year. Carnegie Mellon University, CDEC Bldg. B, Pittsburgh, PA
15213.

CTI Centre for Textual Studies. <it>Resources Guide.</it> Ed.
Caroline Davis, Marilyn Deegan, and Stuart Lee. Oxford University
Computing Services, 1992. Address: ctitext@vax.ox.ac.uk.

Hewlett, Walter B., and Eleanor Selfridge-Field. <it>Directory of
Computer Assisted Research in Musicology.</it> Menlo Park, CA:
Centre for Computer Assisted Research in the Humanities.

<it>ICAME News</it>. Newsletter of the International Computer
Archive of Modern English (ICAME). The Norwegian Computing
Centre for the Humanities, Harald Hårfagres gate 31, PO Box 53, N-5014
Bergen, Norway.

<it>The Humanities Computing Yearbook 1989-90.</it> Ed. Ian
Lancashire. Oxford: Oxford University Press, 1991. [A comprehensive
reference work listing articles, books, resources, and software related to
the application of the computer in the humanities. One of the first places
to look for more information about the field.]

<it>Literary and Linguistic Computing.</it> Quarterly. Journals
Subscription Department, Oxford University Press, Pinkhill House,
Southfield Road, Eynsham, Oxford OX8 1JJ, UK

<it>Postmodern Culture.</it> Online electronic journal.
Anonymous ftp address: ftp.ncsu.edu.

<it>Processing Arabic</it>. Reports 2--4. Ed. Everhard Ditters and
Nelleke Oostdijk. Katholieke Universiteit, Nijmegen: Instituut voor Talen
en Culturen van het Midden-Oosten, 1987. [Available from the Editors,
TCMO -- Nijmegen University, PO Box 9103, 6500 HD Nijmegen, The
Netherlands.]

<it>REACH: Research and Education Applications of Computers in
the Humanities Newsletter.</it> Anonymous ftp site:
ucsbuxa.ucsb.edu.

<it>ReCALL Journal.</it> Hull, UK: CTI Centre for Modern
Languages. University of Hull, HU6 7RX, cti.lang@hull.ac.uk.

<it>ReCALL Software Guide.</it> Hull, UK: CTI Centre for
Modern Languages, 1990.

<it>Revue: Informatique et Statistique dans les Sciences
Humaines.</it> Centre Informatique de Philosophie et Lettres (CIPL),
place du 20-Août, 32, B-4000 Liège, Belgium.

<it>Scholar: An Online Listserv for Text Analysis and Natural
Language Applications.</it> Ed. Joseph Raben. Address:
listserv@cunyvm.cuny.edu (or jqrqc@cunyvm.edu).

<it>TEXT Technology.</it> Ed. Eric Johnson. Madison, SD:
Dakota State University, 1991-.


(b) Online resources

<it>ARTFL.</it> The American and French Research on the
Treasury of the French Language, Department of Romance Languages
and Literatures, University of Chicago, 1050 E. 59th Street, Chicago, IL
60637; voice: (302) 702-8488, artfl@artfl.uchicago.edu. [Online
collection of about 2000 French literary, historical, philosophical, and
scientific works published between the 17th and 20th centuries.]

<it>Computer Bulletin Boards for Individual Languages, or, The List
of Language Lists.</it> Comp. by Bernard Comrie and Michael
Everson. Jan. 1993-. Ftp address: irlearn.ucd.ie (in /everson
subdirectory).

<it>Corpora.</it> A list-serve on the Internet for corpus linguistics
sponsored by ICAME (the International Computer Archive of Modern
English) and located at the Norwegian Centre for Computing in the
Humanities; not limited to the English language. Address:
corpora@x400.hd.uib.no.

<it>The Dartmouth Dante Database.</it> Over 60 commentaries on
the <it>Divina Commedia</it> that span over 600 years of tradition
and are written in Latin, Italian and English. Robert Hollander, Dept. of
Comparative Literature, Princeton University, Princeton. Telnet address:
library.dartmouth.edu.

<it>Directory of Scholarly Electronic Conferences.</it> Comp.
Diane Kovacs. Anonymous ftp address: ksuvxa.kent.edu.

<it>ETEXTCTR.</it> Discussion group on Electronic Text Centers.
CETH, Princeton and Rutgers. Address: etextctr@rutvm1.rutgers.edu.
Join by sending the message "subscribe etextctr firstname lastname" to
listserv@rutvm1.rutgers.edu.

<it>Catalogue of Projects in Electronic Text.</it> Director:
Michael Neuman. Centre for Text and Technology, Georgetown
University. [Covers all languages.] Anonymous ftp address:
guvax.georgetown.edu.

Gilster, Paul. <it>The Internet Navigator: The Essential Guide to
Network Exploration for the Individual Dial-up User.</it> New York:
John Wiley, 1993. [A good introduction to resources on the Internet.]

<it>Humanist</it>, an international electronic seminar for computing
humanists. Inquiries may be sent to Allen Renear and Elaine Brennan,
Brown University, editors@brownvm.brown.edu. See Willard McCarty,
`HUMANIST: Lessons from a Global Electronic Seminar,'
<it>Computers and the Humanities</it> 26 (1992): 205-22.

<it>Institute for Psychological Study of the Arts.</it> List-serve,
moderated by Norman Holland, University of Florida. Address:
psyart@nervm.bitnet.

<it>InterNIC Directory of Directories.</it> Anonymous ftp address:
ds.internic.net. [Information on access, directories, databases,
whitepages, and mail services.]

Oxford Text Archive. Lou Burnard, Oxford Text Archive, Oxford
University Computing Service, 13 Banbury Road, Oxford OX2 6NN,
UK. E-mail address: archive@vax.oxford.ac.uk. Anonymous ftp address:
black.ox.ac.uk. [A repository for machine-readable texts; a database of
information about texts at other centres.]

Strangelove, Michael, and Diane Kovacs. <it>Directory of Electronic
Journals, Newsletters and Academic Discussion Lists.</it> Ed. Ann
Okerson. 3rd edn. Washington, DC: Association of Research
Libraries, 1993.

Text Encoding Initiative (TEI). Michael Sperberg-McQueen, University
of Illinois at Chicago. Listserve on all issues relating to text encoding
and SGML. Anonymous ftp site: sgml1.ex.ac.uk. Discussion list:
tei-l@uicvm.bitnet.

Text Software Initiative (TSI). Coordinated by Nancy Ide
(ide@cs.vassar.edu) and Jean Veronis (veronis@grtc.cnrs-mrs.fr).


(c) Bibliographies

<it>ACM Guide to Computing Literature.</it> New York:
Association for Computing Machinery, 1960-.

<it>Computers -- Linguistics -- Communications Bibliographical
Database.</it> 67,000+ references. Conrad F. Sabourin, Montreal.
Address: sabourco@ere.umontreal.ca.

Davis, D. <it>Computer Applications in Music: A
Bibliography.</it> Madison, WI: A-R Edition, 1988.

<it>German and Medieval Scandinavian Computer Research Ongoing
in North American Universities</it>. Comp. Evelyn S. Firchow,
Director, Computer Clearing House Project, German Dept., 219 Folwell
Hall, 9 Pleasant St. SE, University of Minnesota, Minneapolis, MN
55455.

<it>ICAME (International Computer Archive of Modern English)
Online Bibliography.</it> Comp. Bengt Altenberg. FTP address:
nora.hd.uib.no.

Stevens, Vance, Roland Sussex, and Walter Tuman. <it>A
Bibliography of Computer-Aided Language Learning.</it> New York:
AMS Press, 1986.


(d) Monograph and Essay Series

<it>Research in Humanities Computing 1</it>. Selected Papers from
the ALLC/ACH Conference, Toronto, June 1989. Ed. Susan Hockey,
Nancy Ide, and Ian Lancashire. Oxford: Clarendon Press, 1991.

<it>The Society of Text: Hypertext, Hypermedia, and the Social
Construction of Information.</it> Ed. E. Barrett. Cambridge, Mass.:
MIT Press, 1989.

<it>Hypermedia and Literary Studies.</it> Ed. Paul Delaney and
George P. Landow. Technical Communications. Cambridge, MA: MIT
Press, 1991.

<it>The Digital Word: Text-Based Computing in the Humanities.</it>
Ed. George P. Landow and Paul Delaney. Technical Communication and
Information Systems. Cambridge, MA: MIT Press, 1993.

CCH Working Papers. Occasional, 1991-. Centre for Computing in the
Humanities, Univ. of Toronto. [Bilingual, on computer-assisted textual
studies. Vol. 1, <it>A TACT Exemplar</it> (1991); vol. 2,
<it>Historical Dictionary Databases</it> (1992); vol. 3,
<it>Computer-Based Chaucer Studies</it> (1993).]


(e) Monographs and Collections

Andrews, Derek, and Michael Greenhalgh. <it>Computing for
Non-Scientific Applications.</it> Leicester: Leicester University Press,
1987.

Biber, Douglas. <it>Variation across Speech and Writing.</it>
Cambridge: Cambridge University Press, 1988.

Brunet, Étienne. <it>Le Vocabulaire français de 1789 à nos jours
d'après les données du Trésor de la Langue Française.</it> 3 vols.
Genève: Slatkine, 1981.

Burrows, J. F. <it>Computation into Criticism: A Study of Jane
Austen's Novels and an Experiment in Method.</it> Oxford: Oxford
University Press, 1987.

<it>Computer Applications to Medieval Studies.</it> Ed. Anne
Gilmour-Bryson. Studies in Medieval Culture, 17. Kalamazoo, MI:
Medieval Institute Publications, Western Michigan University, 1984.

<it>Computers and Written Texts.</it> Ed. Christopher S. Butler.
Applied Language Studies. Oxford: Blackwell, 1992.

<it>Computers and the History of Art</it>. Ed. W. Vaughan and A.
Hamber. London: Mansell Publishers, 1989.

<it>Directions in Corpus Linguistics: Proceedings of Nobel
Symposium 82, Stockholm, 4-8 August 1991.</it> Ed. Jan Svartvik.
Berlin: Mouton de Gruyter, 1992.

<it>Editing, Publishing and Computer Technology: Papers given at the
twentieth annual Conference on Editorial Problems, University of
Toronto, 2--3 November 1984</it>. Ed. Sharon Butler and William P.
Stoneman. New York: AMS Press Inc., 1988.

<it>English Language Corpora: Design, Analysis and
Exploration.</it> Ed. Jan Aarts, Pieter de Haan, and Nelleke
Oostdijk. Amsterdam and Atlanta: Rodopi, 1993.

Fortier, Paul A. <it>Décor et dualisme: L'Immoraliste d'André
Gide.</it> Stanford French and Italian Studies, 56. Saratoga, CA:
Anma Libri, 1988.

Gould, Constance C. <it>Information Needs in the Humanities: An
Assessment.</it> Stanford, CA: The Research Libraries Group, 1988.

Grishman, Ralph. <it>Computational Linguistics: An
Introduction.</it> Cambridge: Cambridge University Press, 1986.

Hockey, Susan. <it>A Guide to Computer Applications in the
Humanities</it>. London: Duckworth; Baltimore: Johns Hopkins,
1980.

Hockey, Susan. <it>SNOBOL Programming for the Humanities.</it>
Oxford: Clarendon Press, 1985.

<it>Humanities and the Computer: New Directions.</it> Ed. David
S. Miall. Oxford: Clarendon Press, 1990.

Ide, Nancy M. <it>Pascal for the Humanities.</it> Philadelphia:
University of Pennsylvania Press, 1987.

Kenny, Anthony. <it>The Computation of Style.</it>
New York: Pergamon, 1982. [Probably the best introductory text for
humanists.]

<it>Literary Computing and Literary Criticism: Theoretical and
Practical Essays on Theme and Rhetoric.</it> Ed. Rosanne G. Potter.
Philadelphia: University of Pennsylvania Press, 1989.

<it>Looking Up: An Account of the COBUILD Project in Lexical
Computing.</it> Ed. J. M. Sinclair. London: Collins ELT, 1987.

Nagao, Makoto. <it>Machine Translation: How Far Can It Go?</it>
Trans. Norman D. Cook. Oxford: Oxford University Press, 1989.

Nardocchio, Elaine, ed. <it>Reader Response to Literature: The
Empirical Dimension.</it> Berlin: Mouton de Gruyter, 1993.

Oakman, Robert L. <it>Computer Methods for Literary
Research</it>. 1980; repr. Athens, Georgia: University of Georgia
Press, 1984.

<it>The Politics of the Electronic Text.</it> Ed. Warren Cherniak,
Caroline Davis, and Marilyn Deegan. Office for Humanities
Communication Publications, Nr. 3. Oxford: Office for Humanities
Publications, Oxford University Computing Services, 1993.

Sinclair, John. <it>Corpus, Concordance, Collocation. Describing
English Language.</it> Oxford: Oxford Univ. Press, 1991.

Smith, Peter D. <it>An Introduction to Text Processing.</it>
Cambridge, Mass.: MIT Press, 1990.

<it>Theory and Practice in Corpus Linguistics.</it> Ed. Jan Aarts
and Willem Meijs. Amsterdam: Rodopi, 1990.

Tribble, Chris and Glyn Jones. <it>Concordances in the Classroom:
A Resource Book for Teachers.</it> Harlow: Longman, 1990.


(f) Some Electronic Texts and Text Series

<it>British Library, General Catalogue of Printed Books to 1975 on
CD-ROM.</it> Chadwyck-Healey Ltd., Cambridge, UK.

<it>English Poetry Full-Text Database.</it> CD-ROM in progress.
Chadwyck-Healey, Ltd.

<it>FRANCIS</it> (Fichier de recherches bibliographiques
automatisées sur les nouveautés, la communication et l'information en
sciences sociales et humaines). [Includes Art et Archéologie, Droit
Antiques, Histoire et Sciences de la Littérature, Histoire et Sciences des
Religions, Philosophie, Préhistoire et Protohistoire, Répertoire d'Art et
d'Archéologie, and Sciences du Langage. Vendor:
Télésystèmes-Questel, 83-5 boulevard Vincent Auriol, 75013 Paris,
France.]

<it>Le Robert Électronique.</it> Chadwyck-Healey, Ltd.

Modern Language Association. <it>Bibliography.</it> 1981-.

<it>Oxford English Dictionary.</it> 2nd edn. Oxford Electronic
Publishing.


(g) Articles

Barnard, D. T., C. A. Fraser, and G. M. Logan. `Generalized Markup
for Literary Texts.' <it>Literary and Linguistic Computing</it> 3
(1988): 26-31. [Describes the Standard Generalized Markup Language
(SGML).]

Frye, Northrop. `Literary and Mechanical Models.' In <it>Research
in Humanities Computing 1</it>. Selected Papers from the
ALLC/ACH Conference, Toronto, June 1989. Ed. Ian Lancashire.
Oxford: Clarendon Press, 1991.

Lancashire, Ian. `Back to the Future: Literary and Linguistic Computing
1968-1988.' In <it>Computers in Literary and Linguistic Research:
Literary and Linguistic Computing 1988</it>. Paris-Genève:
Champion-Slatkine, 1990. Pp. 36-47.

Potter, Rosanne G. `Literary Criticism and Literary Computing: The
Difficulties of a Synthesis.' <it>Computers and the Humanities</it>
22 (1988): 91-7.

Schwartz, Helen J., and others. `Computers in Writing Instruction:
Blueprint for Progress.' <it>Computing Across the Curriculum:
Academic Perspectives.</it> Ed. William H. Graves. EDUCOM
Strategies Series on Information Technology. McKinney, Texas:
Academic Computing Publications, 1989.

Slatin, John M. `Text and Hypertext: Reflections on the Role of the
Computer in Teaching Modern American Poetry.' In <it>Humanities
and the Computer: New Directions.</it> Ed. David S. Miall. Oxford:
Clarendon Press, 1990. Pp. 123-35.

Sperberg-McQueen, Michael. `Text in the Electronic Age: Textual Study
and Textual Encoding, with Examples from Medieval Texts.'
<it>Literary and Linguistic Computing</it> 6 (1991): 34-46.