
Humanist Discussion Group



Humanist Archives: June 5, 2020, 8:02 a.m. Humanist 34.83 - notation, computation, liberation (?)

                  Humanist Discussion Group, Vol. 34, No. 83.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org


    [1]    From: C. M. Sperberg-McQueen 
           Subject: Re: [Humanist] 34.82: liberated? (166)

    [2]    From: C. M. Sperberg-McQueen 
           Subject: Re: [Humanist] 34.76: hardware architecture and programming style? (65)

    [3]    From: C. M. Sperberg-McQueen 
           Subject: what notations allow us to do and what notations require us to do (27)

    [4]    From: Allen Riddell 
           Subject: Re: [Humanist] 34.82: liberated? (18)


--[1]------------------------------------------------------------------------
        Date: 2020-06-05 06:53:57+00:00
        From: C. M. Sperberg-McQueen 
        Subject: Re: [Humanist] 34.82: liberated?

> On 3 Jun 2020, at 11:42 PM, in Humanist 34.82, Willard McCarty wrote,
> under the subject line 'liberated?':
>
> ...
>
> Thanks to Tim Smithers for putting me to rights with his note about
> Backus 1978,

I respectfully disagree with both TS and WMcC; there are probably
Turing Award lectures that are more often cited than Backus’s, and
it certainly deserves to be read widely, but I think it is really
stretching things to call it 'neglected'.

> which I've read but it seems with insufficient attention.
> There remains a question, however. Tim says, referring to the undeserved
> obscurity of Backus' paper:
>
>> This neglect may have more to do with Moore's law, and the
>> massive increase in computational power and capacity of real
>> von Neumann machines since 1978, so much so that we now build
>> virtual computers on these von Neumann machines having
>> different architectures, and not notice any slowness or
>> significant loss of performance.  Of course, these virtual
>> machines don't change the von Neumann machine underneath them, but
>> they do allow us to completely forget about it, and thus,
>> returning to the Backus question, liberate our programming
>> from the von Neumann style.  Liberation has been won by
>> building up and away from the von Neumann basement, and not by
>> moving to a new basement, as John Backus envisaged we would
>> need to do, is how I see what's happened.
>
> Can we "completely forget" about the von Neumann machine underneath it
> all? Do we "not notice" for reasons similar to those responsible for the
> ease with which we have accepted digitally reproduced music?

You don’t say what you think those are, so I don’t know how to
answer that question.  One reason for the popularity of digitally
reproduced music is that the degradation over time so audible in
some analog representations like vinyl disks is eliminated;
another is the increased dynamic range; a third (relevant when
CDs were introduced, probably no less relevant now, though less
visible) is the greater capacity of the physical media in
commercial use: CDs could hold a lot more music than LPs.
Another contributing factor was probably the convenience with
which digital formats allow music to be copied and redistributed
without direct charges (whether this is legally piracy, sharing,
or a subscription service).  Since some of these (like cost and
physical convenience) were also influential in the rise of
transistor radios and other analog technologies, it's not clear
that they are specific to digital reproduction of music.

Perhaps you mean to refer to the fact that some people profess to
hear a difference between music reproduced by digital and analog
means, and some of them believe the analog product is superior.
But the people I have known in that latter class have never given
the impression of being able to forget whether the music they were
listening to was being reproduced digitally or by analog means,
so in my experience the set of factors that allow people not to notice
appears to be the empty set.

I don't, at first glance, see any direct application of any of
these factors to issues of computer architecture.

Perhaps you mean only that many people are willing to overlook a
degradation in quality of reproduction if it is accompanied by an
improvement in other areas of the experience.  There may be
something in that: lots of people, including programmers who
claim to be interested in speed above all else, have preferred to
write programs in high-level languages even though compiler
output has historically almost never come close to matching
careful hand-written assembler: portability and some protection
against some kinds of errors have counted for more than speed and
direct interaction with the machine.


> I suppose
> we can forget anything we choose to forget, not notice out of culpable
> insensitivity, so let me rephrase. Is the architecture of this machine
> absolutely undetectable in the machines we have now?

Can you define detectability?  By whom, and by what means?

There are plenty of programming languages that make certain
properties of the underlying machine accessible to programs; it
would take a very clever language specification to guarantee that
the machine so made accessible is a physical machine and not a
virtual machine.  (A recent article in CACM pointed out that the
machine exposed by modern C compilers is really much less like
the machines now on people's desktops than like the machines on
which C was first designed and implemented.  So even if there is
nothing explicit by way of a virtual machine, and even in a
language some of whose practitioners like it because it is "close
to the machine", the machine exposed to programmers is not
necessarily to be identified with the physical machine running
the program.)
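
To give one concrete (if minimal) illustration of that first group,
sketched here in OCaml rather than C: the standard library's Sys
module exposes a few properties of the machine a program runs on,
though of course what it describes may be a virtual machine rather
than the physical one.

    (* A minimal sketch: querying machine properties that OCaml's Sys
       module exposes.  These describe the compiler's target, which may
       be a bytecode interpreter or other virtual machine rather than
       the physical processor. *)
    let () =
      Printf.printf "word size (bits): %d\n" Sys.word_size;
      Printf.printf "big-endian: %b\n" Sys.big_endian;
      Printf.printf "backend: %s\n"
        (match Sys.backend_type with
         | Sys.Native -> "native code"
         | Sys.Bytecode -> "bytecode"
         | Sys.Other s -> s)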

Similarly, there are plenty of programming languages (the set
overlaps with the first group) which do their best to mask
certain properties of the underlying machine.  Often but not
always this may take the form of a virtual machine.

I suspect it will occur to many people to suggest that a virtual
machine is likely to run more slowly than the physical machine
it's running on.  True, but I am not certain that it means that
no one can ever forget the hardware they are running on.  If I am
a running Lisp program, and I detect that I am running slowly,
can I conclude that I am running on stock hardware instead of on
a Lisp machine?  Or might it just be that I happen to be running
on a slow Lisp machine?

But it must be admitted that sometimes a determined program can
overcome the efforts of a computing environment to mask certain
information about the environment.  Sometimes, that is,
information about the underlying implementation leaks through and
is accessible to a program.  (Readers who remember the Spectre
vulnerability may recall that it involved speculative execution
of code in modern CPUs, which turned out to have observable side
effects, which allowed a program not only to know that it was not
running in a machine dedicated to it alone, but also to glean
information useful in subverting password security in
multi-process environments.  That counts as a failure of
information masking.)

From outside a program, it may be even easier to gain information
about the environment in which the program is running.

But for the most part, I think there is a useful distinction to
be drawn between allowing a programmer, or a user, to ignore (or
forget) some properties of the underlying device, and ensuring
that a programmer determined to detect properties of the
underlying device cannot ever do so.
That is, I think "absolutely undetectable" is probably the wrong
test to apply here, no matter how you define it.

If one’s goal is to liberate oneself from what Backus calls the von
Neumann style, it suffices to have a virtual machine in which
one need not program in that style; only if we have some rather
different goal would it be necessary for the virtual machine to
treat the programmer and the program as adversaries that
must at all costs be kept from knowledge of the environment.
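
A small illustration, sketched in OCaml on the assumption that any
conventional implementation will do: the two definitions below compute
the same inner product, the first in the word-at-a-time, cell-updating
style Backus complained of, the second in the applicative style of the
inner-product example in his lecture, composing a fold with a pairwise
product and naming no storage cells at all.  The second requires no new
machine underneath, only a language in which one need not write the first.

    (* Von Neumann style: an explicit storage cell, updated one element
       at a time.  Assumes the two arrays have the same length. *)
    let inner_product_imperative (xs : float array) (ys : float array) : float =
      let sum = ref 0.0 in
      for i = 0 to Array.length xs - 1 do
        sum := !sum +. (xs.(i) *. ys.(i))
      done;
      !sum

    (* Applicative style, in the spirit of Backus's FP example: a fold
       of addition composed with pairwise multiplication; no named
       cells, no sequencing beyond the data dependencies. *)
    let inner_product_functional (xs : float list) (ys : float list) : float =
      List.fold_left ( +. ) 0.0 (List.map2 ( *. ) xs ys)
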
> Is its conceptual
> baggage without any consequence whatsoever? (Note the unmarked
> table-thumping, please.)
>
> Let us say that we have indeed been completely liberated to do better.
> Why, then, is the imperative style still so much in evidence? What is
> the appeal? Is there variation of appeal across the disciplines?

Why does it need to concern us what other people think and do?

Yes, there are plenty of people who like to program in imperative
languages.  For more or less understandable reasons, many of them are
programmers and many programmers are people who
like to program in imperative languages.

Must those of us who prefer to program in declarative languages,
functional languages, or logic languages convert everyone else to our
way of thinking?  Why?  Why don't we just get on with the work we want
to do?


********************************************
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
cmsmcq@blackmesatech.com
http://www.blackmesatech.com


--[2]------------------------------------------------------------------------
        Date: 2020-06-05 06:53:18+00:00
        From: C. M. Sperberg-McQueen 
        Subject: Re: [Humanist] 34.76: hardware architecture and programming style?

On Tue, 2020-06-02 at 10:53 +0000, Humanist wrote:
>
>         Date: 2020-06-01 09:26:47+00:00
>         From: Willard McCarty 
>         Subject: hardware architecture and programming style
>
>
> ... Other
> styles,
> notably the functional one pioneered by McCarthy in Lisp, can be
> accommodated by translating them via a compiler or interpreter into
> imperative code.

Lisp a non-imperative functional language?  Really?
It's possible to write functional programs in Lisp, and most
Lisps have had functions as first-class objects for a while.
And in Scheme, the names of non-functional primitives for actions like
destructive assignment are typically marked with an
exclamation point, so they are easy to see (and can the more
readily be avoided if one's goal is to write functional code).

But I would be surprised if there were any language processor
calling itself Lisp that lacked imperative semantics and destructive
assignment (and even more surprised if anyone else agreed that such
a language processor counted as a Lisp implementation).  Am I wrong?
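
For what it may be worth, the same mixture shows up in other languages
of broadly functional temperament.  A sketch in OCaml, offered as an
analogy rather than a claim about any Lisp: destructive assignment is
available, but it must be introduced explicitly, so, much as with
Scheme's exclamation-point convention, it is easy to see in the source
and easy to avoid.

    (* Functional style: an immutable record; "updating" builds a new value. *)
    type tally = { count : int }
    let bump (t : tally) : tally = { count = t.count + 1 }

    (* Imperative style: the mutable field and the <- assignment mark
       the destructive update plainly in the source. *)
    type cell = { mutable hits : int }
    let bump_in_place (c : cell) : unit = c.hits <- c.hits + 1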

As for the influence of architecture -- when Lisp implementations on
stock hardware began to outperform
Lisp running on Lisp machines, the days of Lisp machines
as a commercially viable proposition were numbered (and
with a relatively small integer).  Any serious account of
the influence of architecture on programming needs to supply a coherent
story explaining how that does not
undermine the central thesis about architecture.

It also needs to explain why virtual machines don't
count as counter-examples.  And microcode -- surely if
physical realization determines all, then machines
which behave the same only by virtue of microcode must
attract different kinds of programs.  But I am pretty
sure that most code written for System/360 machines really
does not care whether a given instruction is implemented
in hardware or microcode.

Given the weasel words "more or less", it is of course true, or at least
not falsifiable, that all languages are "more or less determined" by
architecture.  But whether in any case the influence is more or is less
will be determined in part by the goals of the language designers.  Some
languages want to make it easier for us to give instructions to our
machines, and some programmers want to feel that they are working
close to the machine (though those who talk most about it are
C programmers, and the machine they are closest to is the
PDP-11, lovingly recreated by modern hardware and compilers,
not the one their code is actually running on).  Some languages
want to make it easier for humans to specify computations
and to hide details of the underlying machine from the
programmer wherever possible.  Some languages in the latter
class are imperative, some not.

The Sapir/Whorf hypothesis is tempting, but does not appear
to be any simpler or more direct in operation in programming
languages than in natural languages.

-CMSMcQ


--[3]------------------------------------------------------------------------
        Date: 2020-06-05 06:52:26+00:00
        From: C. M. Sperberg-McQueen 
        Subject: what notations allow us to do and what notations require us to do

In Humanist 34.76, Willard McCarty seems to regard Lisp as a
functional language, possibly on the grounds that one can write
functional programs free of side effects in Lisp, even though
Lisp programs are not in general guaranteed to be functional or
free of side effects.

In Humanist 34.82, on the other hand, Willard regards with
skepticism the suggestion by Tim Smithers that virtual machines
allow us (as programmers, I think) to forget about whether the
underlying physical machine uses the von Neumann architecture or
something else.  The reason for his skepticism appears to be his
suspicion that it might be possible for a programmer working in a
virtual machine to detect by some means fair or foul the nature
of the physical machine.

Perhaps I have failed to grasp the point, but I do not understand
why in the one case it should suffice that a programmer can, if
they wish, do something, while in the other case the demand is
that the programmer not be able to do anything else.


********************************************
C. M. Sperberg-McQueen
Black Mesa Technologies LLC
cmsmcq@blackmesatech.com
http://www.blackmesatech.com


--[4]------------------------------------------------------------------------
        Date: 2020-06-04 13:28:31+00:00
        From: Allen Riddell 
        Subject: Re: [Humanist] 34.82: liberated?

> Can we "completely forget" about the von Neumann machine underneath it
> all? Do we "not notice" for reasons similar to those responsible for the
> ease with which we have accepted digitally reproduced music? I suppose
> we can forget anything we choose to forget, not notice out of culpable
> insensitivity, so let me rephrase. Is the architecture of this machine
> absolutely undetectable in the machines we have now? Is its conceptual
> baggage without any consequence whatsoever? (Note the unmarked
> table-thumping, please.)

If we consider the community of programmers who write using a functional
programming language such as OCaml or Haskell, I believe it is safe to
say that at least 99.9% of the community can completely forget about the
VN machine since the compiler abstracts away the details.

But the people who write and maintain the compiler must, of course, deal
with the underlying hardware architecture on a daily basis.
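
A last small sketch of that division of labour, assuming nothing beyond
the OCaml standard library: nothing in the definition below mentions
registers, addresses, word sizes, or instruction ordering; data
representation, calling conventions, and the mapping onto whatever
hardware lies underneath are the compiler writer's problem, not the
programmer's.

    (* A pure function over a list: the programmer's side of the bargain.
       How lists are laid out in memory and how the fold is executed are
       the compiler's business and are not visible here. *)
    let mean (xs : float list) : float option =
      match xs with
      | [] -> None
      | _  -> Some (List.fold_left ( +. ) 0.0 xs
                    /. float_of_int (List.length xs))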




_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php

