Humanist Discussion Group


                  Humanist Discussion Group, Vol. 34, No. 88.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org




        Date: 2020-06-07 08:55:41+00:00
        From: C. M. Sperberg-McQueen 
        Subject: Re: [Humanist] 34.85: notation, computation, liberation (?)

In Humanist 34.85, Herbert Wender suggests that data structures would be
a more pertinent level for digital humanists to worry about than
processor architecture; I am inclined to think he is right.
While some occasionally express impatience with the perpetual
discussion of problems of text and document representation (which
has been going on now for something over thirty years), it does
seem more obviously relevant to DH concerns than the choice
between a von Neumann architecture and some other way of
organizing a machine.

HW also poses a challenge which may be difficult to meet:

     As a textual scholar, I also do not understand what is bad
     about the state/transition concepts that Backus is fighting
     against. Who will explain the reasons for his resistance?

Backus's reasons run, as I understand it, along something like
the following lines: the complex state assumed by conventional
programming languages is error-prone and complicates both the
task of the programmer (especially the programmer who would like
to construct a persuasive argument that the program is correct)
and the task of a language processor (which would like to make
the program run as fast as possible, often by transforming it to
give it a different structure and sequence of operations).  He
does not offer much in the way of concrete
evidence that programmers and language processors have problems
dealing with the complexity of conventional languages -- he
rather suggests that anyone can confirm his claims by looking at
the state of software and software development practices.  He
does suggest that the sheer size of conventional languages is a
symptom that they lack usefully powerful organizing principles
which could make programming simpler and optimization by a
compiler easier.
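
To make the optimization point concrete, here is a small XQuery
sketch (my illustration, not Backus's).  Both let-bindings are
pure expressions -- nothing is assigned to -- so a processor is
free to evaluate them in either order, in parallel, or lazily,
without any risk of changing the result:

    (: two independent, side-effect-free computations :)
    let $a := count(tokenize("the quick brown fox", "\s+"))
    let $b := sum(1 to 1000000)
    return $a + $b

In a language built on assignment, a compiler would first have to
prove that neither computation writes anything the other reads;
here the language itself guarantees it.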

He notes that in systems with more powerful general principles --
algebraic expressions in conventional numerical calculation are a
good example -- we do not have the same frequency of errors or
the same difficulty recognizing that two different expressions
have the same value.  As early as the 17th century, numerical
calculation was well enough understood that Leibniz could use it
as an example of an area where questions were normally decidable
and where one could expect any two competent practitioners to
produce compatible results.  (So I understand his imagined
invitation:  Calculemus!)
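
An XQuery analogue (my example, not Backus's): because such
expressions obey the ordinary algebraic laws, two visibly
different formulations can be recognized as denoting the same
value without tracing any sequence of state changes.

    (: doubling each term and then summing ... :)
    sum((1 to 10) ! (. * 2)),
    (: ... equals summing and then doubling :)
    2 * sum(1 to 10)

Both expressions evaluate to 110, by the distributive law alone.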

Surely having an interpretation of a formal system in terms of
states and state transitions is better than having no
reproducible notion of its meaning at all; perhaps Backus would
agree.  But even if state- and transition-based semantics are
not bad in themselves, it can be better to have semantics defined
in terms of higher-level concepts (I would say also: concepts
nearer to those of the intended domain of application).

That, at least, is how I understand Backus.  My own experience
with programming in different languages inclines me to agree with
Backus as I understand him: problems that take me a long time to
solve in languages in which the only control structures are GOTO
statements are solved much more quickly and much more reliably in
languages like Pascal or C with conditional statements and
higher-level looping constructs.  Programs that I find hard to
write in Pascal or C are simple to formulate in SQL, or Prolog,
or XSLT, or XQuery -- partly because of the higher-level data
structures and partly because of the higher-level semantics they
bring with them.  It is, I suppose, no accident that all of these
languages do without mutable variables, or that there are formal
semantics for SQL and XSLT and XQuery that allow great freedom to
the language processor to restructure the program to try to make
it faster or to make it operate in less space.
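
As a small illustration of that freedom (my sketch; the file
bib.xml is hypothetical), the FLWOR expression below says only
what is wanted, not how to find it; a processor may reorder the
join, push the filter inward, or consult an index, so long as the
result is unchanged:

    (: authors and titles of post-2000 books :)
    for $b in doc("bib.xml")//book[@year > 2000],
        $a in $b/author
    return <entry>{ $a/text() }: { $b/title/text() }</entry>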

When in the late 1970s Susan Hockey and others created the Oxford
Concordance Program (OCP), it took a year or more to develop the
program.  When I teach a three-day introductory course in XQuery,
we sometimes write simple functions for word frequency lists and
concordances as an exercise on the third morning.  Our functions
are not, to be sure, nearly as full-featured as OCP.  But still:
how likely is it that the programmer who wrote OCP had a working
prototype of the word frequency list routine running by noon of
his first morning on the project?  One does not need to think
that Fortran is a bad language in itself, in order to think that
we can write software for many purposes faster and more reliably
if we use a different kind of language.
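
For the curious, here is roughly the kind of word-frequency
function such a third-morning exercise might produce -- a minimal
sketch of my own, far short of OCP's features:

    declare function local:word-frequencies($text as xs:string)
      as xs:string* {
      (: split on non-letters, group the words, and list each
         with its count, most frequent first :)
      for $w in tokenize(lower-case($text), '[^\p{L}]+')
      where $w ne ''
      group by $key := $w
      order by count($w) descending, $key
      return $key || ' ' || count($w)
    };

    local:word-frequencies("How likely is it?  It is likely.")

The group by clause does the work of the hand-built table of
counters one would maintain in a conventional language, and the
absence of mutable variables is precisely what leaves the
processor free to choose its own strategy for building the
groups.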


********************************************
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
cmsmcq@blackmesatech.com
http://www.blackmesatech.com
********************************************




