
Humanist Discussion Group



Humanist Archives: May 29, 2020, 7:36 a.m. Humanist 34.68 - on notation

                  Humanist Discussion Group, Vol. 34, No. 68.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org




        Date: 2020-05-29 03:10:44+00:00
        From: Douglas Knox 
        Subject: Iverson on notation as a historical struggle that begins in hand-waving

Willard,

I appreciate your provocations on ":=" and the subsequent discussion, which
led me to look around a bit in the Wexelblat volume from the 1978 History
of Programming Languages conference.

Ken Iverson, the inventor of the programming language APL, cared a lot
about notation, understood math to be embedded historically in language and
human culture, and characterized himself as a "renegade mathematician." His
closing talk at the conference seems well matched to the generously broad
spirit of your provocation while perhaps resisting some premises. I'll sign
off here and quote Iverson at length below.

Doug Knox

What follows is from Kenneth E. Iverson, transcript of presentation, in
Wexelblat (1981), pp. 681-682:

"In talking about the development of all programming languages, not just
APL, I think we tend to make a mistake in thinking of them as starting, ab
initio, in the 1940s when machines first came along. And I think that that
is a very serious mistake because, in fact, the essential notions are
notions that were worked out in mathematics, and in related disciplines,
over centuries. I mean, just the idea of a formula, the idea of a procedure
-- although in mathematics we never had a really nice formal way of
specifying this -- it's clear that this is what we do. In fact, in a
lecture, you sort of give the kernel of the thing and then you wave your
arms and that replaces the looping and what-not. But the idea of algorithms
and so on is very central to mathematics. And so I think it's a mistake to
think of the development as starting from the computer. I think it's
worthwhile trying to examine this in the following sense: if we look at
mathematical notation, say, at the time of the development of the computer
we had some very important notations. If you don't think they're important,
I would recommend strongly that you take a look at some source like
[Florian] Cajori's History of Mathematical Notations; it's just incredible
the time and effort that went into introducing what we now just take as
obvious. For example, the idea of a variable name: a very painful thing. At
the outset you wrote calculations in terms of specific quantities. Then it
was recognized that you really wanted to make some sort of a universal
statement and so they actually wrote out things like 'universalis' which,
because we are all lazy, gradually got abbreviated, and I suppose by the
time it reached 'u', somebody said, 'Aha! We've got a lot of letters like
that. And now we can talk about several universals,' and so on.

"But the serious thing is that this is a very important notion. The idea of
identifying certain important functions like addition, multiplication, and
assigning symbols to them. These symbols are not as old as you think. Take
the times sign for example. It's only about 200 years old; it's too young
to die. That's why we resist the things like the asterisk and so on, that
happen to be on tabulating machines because of the need for check
protection.

"There were other things like grouping. It took a long time for what seems
now the obviously beautiful way of showing grouping in an expression by
parentheses. It was almost superseded by what is called the 'vinculum' --
where you drew a line over the top of the subexpression -- linearization
again. I think the main reason that the parentheses won out was because
when you have what would now just be a series of parentheses you got lines
going up higher and higher on the page, and that was awkward for printing.

"The question of representation -- certainly mathematics had long since
reached the stage where you don't think about how the numbers are
represented. When you think about multiplication, division, and so on, you
don't think, 'Ah, -- that's got to be carried out in roman numeral, or in
arabic, or in base-2' or anything like that. You simply think of it as
multiplication. Representation is subordinated.

"And of course, the idea of arrays has been in mathematical notation and
other applications for something like 150 years. They still haven't really
been recognized, but they've been there.

"Now, what happened when the first computer came along? For perfectly good
practical reasons every one of these notions disappeared. You couldn't even
say 'A + B.' You had to say, 'Load A into accumulator; load B into
so-and-so' -- you couldn't even say that. You had to say, 'Load register
800 into the so-and-so and so on.' So all of these things disappeared, for
perfectly practical reasons. The sad thing is that 30 years later we're
still following that same aberration..."

https://archive.org/details/historyofprogram0000hista
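
To make concrete the contrast Iverson ends on, here is a minimal
sketch in Python of a hypothetical accumulator machine. The register
numbers and instruction names below are invented for illustration
(not taken from any real instruction set): on such a machine,
computing 'A + B' takes three imperative steps.

    # Toy accumulator machine (hypothetical, for illustration only).
    registers = {800: 6, 801: 7}   # operands live at numbered addresses
    accumulator = 0

    def load(addr):
        # LOAD: copy a register's contents into the accumulator.
        global accumulator
        accumulator = registers[addr]

    def add(addr):
        # ADD: add a register's contents into the accumulator.
        global accumulator
        accumulator += registers[addr]

    def store(addr):
        # STORE: copy the accumulator out to a register.
        registers[addr] = accumulator

    # 'A + B' becomes three imperative steps against numbered registers:
    load(800)
    add(801)
    store(802)
    assert registers[802] == 13

    # Versus the notation mathematics had already earned:
    a, b = 6, 7
    assert a + b == 13

    # And, echoing Iverson's point that representation is subordinated:
    # the base in which the literals are written is irrelevant to '+'.
    assert 0b110 + 0x7 == 13

The final assertion ties back to his earlier paragraph: once notation
abstracts over representation, addition is simply addition, whether
the operands happen to be written in binary, hexadecimal, or decimal.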



_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php

