
Humanist Discussion Group



Humanist Archives: Feb. 17, 2019, 7:43 a.m. Humanist 32.467 - the McGann-Renear debate

                  Humanist Discussion Group, Vol. 32, No. 467.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org


    [1]    From: C. M. Sperberg-McQueen 
           Subject: who invented computing? (60)

    [2]    From: Desmond Schmidt 
           Subject: Re: [Humanist] 32.463: the McGann-Renear debate (147)


--[1]------------------------------------------------------------------------
        Date: 2019-02-17 07:37:04+00:00
        From: C. M. Sperberg-McQueen 
        Subject: who invented computing?


In Humanist 32.457, Desmond Schmidt writes:

     ... we do indeed follow business ...: else why would we talk about
     Big Data, Ontologies, Mobile technologies, XML, SGML and even
     computing itself, which was a business invention?

The formulation of this question suggests that at least one of us is
profoundly confused about the origins of several of the topics named.

In particular -- does DS really believe that computing is "a business
invention"?

Which company did Alan Turing work for when he wrote "On computable
numbers"?  Or when he worked on the cryptographic machines at
Bletchley Park?  Or when he designed the ACE? Or when he worked in one
of the UK's first computing installations?  Where did J. Presper
Eckert and John Mauchly work when they designed and built ENIAC?  Who
was their customer?  What company was paying John von Neumann's salary
when he published "First draft of a report on EDVAC"?

It's true that once computers had been invented, they were promptly
commercialized.  It's also true that large organizations like the
U.S. Department of Defense and its various branches, and their
analogues in other countries, often outsource research and development
to outside organizations, and that those organizations are sometimes
organized on a for-profit basis.

In the preceding two paragraphs I have taken "computing" to mean
computing with an electronic stored-program binary digital computer.
If we broaden the definition to include punched-card equipment and
calculators, I don't believe that the picture changes.  Hollerith,
Babbage, Pascal -- giants in the development of ... business?  Not so
much, I think.

On the larger question of the relation between digital humanities and
commercial work, I will only remark that

   - Denying that digital humanities *must* follow where commercial
     entities lead does not entail the claim that nothing that happens
     in business can ever be of interest to DH.

   - None of the examples I cited (Hockey, Huitfeldt, Ott, Packard,
     Thaller) involved anything that can plausibly be described as
     "customization of business software".

My apologies to those readers of Humanist who see in the current
discussion primarily a tedious accumulation of evidence for the rule
that misrepresentations, misstatements, errors, and tendentious
argument are usually better ignored than responded to (a rule
sometimes summarized as "Please don't feed the trolls").  They may be
right.

********************************************
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
cmsmcq@blackmesatech.com
http://www.blackmesatech.com


--[2]------------------------------------------------------------------------
        Date: 2019-02-16 08:05:00+00:00
        From: Desmond Schmidt 
        Subject: Re: [Humanist] 32.463: the McGann-Renear debate

Hugh,

let me just clarify a few points about the versions and layers
representation. In my experience it takes more explaining than we have
room for here to get this concept across to those who are used to
thinking about textual variation in XML terms. But I'll try to outline
it.

A layer is NOT a version. If you look carefully, the non-final layers
are greyed out. They represent a tracing of the text from start to
finish at a given temporal level to show how the text evolved locally
in time. Layers don't record the spatial relations between
text-fragments. For that you have the facsimile, to which it is
linked, and you have demonstrated by your comments that you CAN
discern what went on in the line to produce those effects without the
need for any markup. Changes at two revision-sites that are related go
into the same layer. Unrelated local changes also go into the same
layer if their relative temporal positions match.

A version is a single global state in which the author or scribe left
the text at some point in time. Usually this is the final state, but
it can be an intermediate one if it can be discerned, for example by
pen-colour. All this is explained in the introduction.
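To make the distinction concrete, here is a minimal sketch in Python. The
field names and structure are my own illustration, not the edition's actual
data model; the sample strings are the two readings of line 2 discussed
later in this thread.

```python
# Illustrative sketch only: the names and shapes below are my own
# invention, not the Charles Harpur edition's actual data model.

# A layer is a complete tracing of the text at one temporal level;
# earlier (non-final) layers record how the text evolved locally in time.
# A version is a named global state in which the author left the text.

document = {
    # layers, ordered from earliest to latest temporal level
    "layers": [
        "Keeps munching still the corn of the tall crop",  # layer 1
        "Will cease not to devour the tall-eared corn,",   # layer 2 (final)
    ],
    # versions: discernible global states, usually just the final one
    "versions": {
        "final": "Will cease not to devour the tall-eared corn,",
    },
}

def final_layer(doc):
    """Return the latest layer, i.e. the text's final local state."""
    return doc["layers"][-1]
```

On this model, related changes at two revision-sites simply land in the
same plain-text layer, so no variant markup is needed to record them.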

Let me just say this for the general audience although I'm sure you'll
disagree. The huge win here is that there is no need for any markup to
represent variation. The text of even the most complex document
reduces to a set of simply marked-up layers and versions, trivial to
represent and trivial to edit. So long as an editor of the edition
understands the basic principle of layers and versions, he/she can work
on the edition in collaboration with others. There is no longer any
tension between different interpretations of what the tags mean, how
textual phenomena are to be marked up, or different levels of
technical expertise.

As for your assertion that you could represent this in TEI, you are
right. But you have to tie yourself in knots doing it. Is it worth it
if the only thing you can do with the result is to try to display it
in a diplomatic view? Because comparing two documents marked up like
this is in my view impossible. I know of no comparison algorithm that
can cleanly and provably compare texts containing embedded variants.
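By contrast, comparing two plain-text layers is mechanically
straightforward. A minimal sketch using Python's standard difflib (the
sample strings are the two readings of line 2 quoted later in this thread):

```python
import difflib

# Two plain-text layers. Because each layer is just text, a standard
# sequence comparison applies directly; no embedded variant markup has
# to be parsed or untangled first.
layer1 = "Keeps munching still the corn of the tall crop"
layer2 = "Will cease not to devour the tall-eared corn,"

# Word-level comparison: each opcode says how a slice of layer1 maps
# onto a slice of layer2 (equal, replace, insert, or delete).
matcher = difflib.SequenceMatcher(None, layer1.split(), layer2.split())
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    print(tag, layer1.split()[i1:i2], "->", layer2.split()[j1:j2])
```

This only shows what "comparing layers" can mean mechanically; it does
not, of course, settle the separate question of whether a word-level diff
reflects what actually happened on the manuscript page.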

You said earlier that an edition could take a long time. I think
Fredson Bowers once said it could take a lifetime, but he was talking
about print editions. What is the point of digital editions if they
don't make that task easier? The fact is we (never more than 3 of us
at once) have nearly finished this edition in around 5 years. I
started working on it in 2014. I think that is pretty quick for an
edition of 5,225 pages in 27 manuscripts, 8 print books of 250 pages,
55 newspapers with cuttings containing 675 poems and notes on 775
pages and a wealth of biographical and historical information. It's a
big thing, and I don't think you could have achieved that with XML
encoding in that length of time with the rich interface we have been
able to develop on top of it. Someone in one of our presentations said
that getting rid of the XML freed us up to do more. I would agree with
that: it took off the shackles of restrictive hierarchies, because in
my view we have no need of them.

Desmond Schmidt
eResearch
Queensland University of Technology

>     [2]    From: Hugh Cayless 
>            Subject: Re: [Humanist] 32.452: the McGann-Renear debate (120)
>
>
>
> --[2]------------------------------------------------------------------------
>         Date: 2019-02-15 14:36:46+00:00
>         From: Hugh Cayless 
>         Subject: Re: [Humanist] 32.452: the McGann-Renear debate
>
> Re Desmond's post in 32.452:
>
>> It is quite possible for a TEI encoding of holograph manuscripts to be
>> so complex that it is practically, although not literally, impossible
>> to edit. That is, it is just as likely to be damaged as improved by
>> any attempt to edit it. If it is shared by a group of editors this
>> level of complexity is reached much sooner. The problem then becomes:
>> how do you communicate your understanding of the "howling wind-storm"
>> of tags that results to your colleagues so they may share your
>> interpretation of the textual phenomena being described?
>
> I'd say two things about this, the first, that this class of manuscripts
> became easier to encode with TEI after the addition of the  and
> associated elements to the Guidelines. I don't know the relative
> chronologies, and it's a bit hard to investigate right now, with the TEI
> Vault being unavailable due to a server outage at ADHO. Git tells me that
> the genetic encoding elements were added in late 2011, but I'm not sure
> precisely when they were first released. Second, that the workflow you
> describe sounds complex indeed, and it might not be practical to do it with
> TEI. That doesn't invalidate TEI; it just means it might not work well for
> your circumstances. That also doesn't rule out that it might be possible to
> adjust it to work for your circumstances.
>
>> Here is a moderately difficult example. A succession of hired
>> transcribers simply refused to encode this for us. I wonder how
>> hierarchies help us here?
>>
>> http://charles-harpur.org/corpix/english/harpur/A87-2/00000131a.jpg
>>
> Undoubtedly that's a tough one. I feel quite certain it can be done in
> TEI, but of course that's dependent on time, funding, and access to local
> expertise. I can't and won't fault you for making different decisions than
> I might have.
>
>
>> Breaking it down into separate layers as we have done is close to the
>> method Michael describes, and renders the editorial task perfectly
>> manageable.
>>
>>
>> http://charles-harpur.org/View/Twinview/?docid=english/harpur/poems/h509&version1=/h509b/layer-final
>
>
> I have a couple of concerns about this. Firstly, I'd strongly recommend
> using a visual indicator of change that goes beyond color, as readers with
> impaired color perception will have trouble with it. Secondly, I worry that
> this method might give a false impression of what's going on. If we look at
> line 2, comparing "layer 1" and "layer 2", we see something like:
>
> layer 1:
>  Keeps munching still the corn of the tall
> crop
>
> layer 2:
> Will cease not to devour the tall-eared
> corn,
>
> which is what running a `diff` operation on the two lines might give you.
> But this is not at all what has happened in the text, as the image shows
> us. First, the whole line was canceled, and then the line of layer 2 was
> written above it. I also wonder about the "layer" concept. Do we know that
> each layer represents a single editorial stage? That "Even as the mighty
> son of Telamon" was canceled at the same time as the original line 2?
>
> Of course, I'm wholly ignorant here and this might be a perfectly
> representative model of the poet's editing process. It's exceedingly hard
> (dare I say impossible) to generalize across temporal, genre, and
> disciplinary differences.
>
> -----




_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php


Editor: Willard McCarty (King's College London, U.K.; Western Sydney University, Australia)
Software designer: Malgosia Askanas (Mind-Crafts)

This site is maintained under a service level agreement by King's Digital Lab.