
Humanist Discussion Group



Humanist Archives: June 6, 2020, 10:16 a.m. Humanist 34.85 - notation, computation, liberation (?)

                  Humanist Discussion Group, Vol. 34, No. 85.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org


    [1]    From: Bill Benzon 
           Subject: What Kind of Thing Is a Number? | Edge.org (68)

    [2]    From: Dr. Herbert Wender 
           Subject: Re: [Humanist] 34.83: notation, computation, liberation (?) (20)

    [3]    From: Robert Delius Royar 
           Subject: Re: [Humanist] 34.83: notation, computation, liberation (?) (85)


--[1]------------------------------------------------------------------------
        Date: 2020-06-05 19:11:27+00:00
        From: Bill Benzon 
        Subject: What Kind of Thing Is a Number? | Edge.org

Willard and fellow humanists:

It seems to me that this informal conversation has some bearing on the much
vexed question of the relationship between mathematics and software. Here's a
bit of it:

REUBEN HERSH: What is a number? Like, what is two? Or even three? This is sort
of a kindergarten question, and of course a kindergarten kid would answer like
this: (raising three fingers), or two (raising two fingers). That's a good
answer and a bad answer.

It's good enough for most purposes, actually. But if you get way beyond
kindergarten, far enough to risk asking really deep questions, it becomes: what
kind of a thing is a number?

Now, when you ask "What kind of a thing is a number?" you can think of two basic
answers -- either it's out there some place, like a rock or a ghost, or it's
inside, a thought in somebody's mind. Philosophers have defended one or the
other of those two answers. It's really pathetic, because anybody who pays any
attention can see right away that they're both completely wrong.

A number isn't a thing out there; there isn't any place that it is, or any thing
that it is. Neither is it just a thought, because after all, two and two is
four, whether you know it or not.

Then you realize that the question is not so easy, so trivial as it sounds at
first. One of the great philosophers of mathematics, Gottlob Frege, made quite an
issue of the fact that mathematicians didn't know the meaning of One. What is
One? Nobody could answer coherently. Of course Frege answered, but his answer
was no better, or even worse, than the previous ones. And so it has continued to
this very day, strange and incredible as it is. We know all about so much
mathematics, but we don't know what it really is.

Of course when I say, "What is a number?" it applies just as well to a triangle,
or a circle, or a differentiable function, or a self-adjoint operator. You know
a lot about it, but what is it? What kind of a thing is it? Anyhow, that's my
question. A long answer to your short question. …

Mathematics is neither physical nor mental, it's social. It's part of culture,
it's part of history, it's like law, like religion, like money, like all those
very real things which are real only as part of collective human consciousness.
Being part of society and culture, it's both internal and external. Internal to
society and culture as a whole, external to the individual, who has to learn it
from books and in school. That's what math is.

https://www.edge.org/conversation/reuben_hersh-what-kind-of-thing-is-a-number

As for me, I've been wondering whether or not we should distinguish between
mathematics and computation. It seems to me that computation is ALWAYS a
physical process of some kind, whether it takes place as someone makes marks on
a piece of paper, moves counters on a set of parallel wires (as with an abacus),
moves a slide rule, ticks things off with fingers and toes, or, for that matter,
uses some software on a digital computer. But mathematics is none of those
things, though we can develop a mathematical understanding of those things and
use mathematics to guide and discipline our use of such things.  Mathematics
isn't a physical process, though physical processes are involved in writing
down a proof and in reading one.
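
A minimal sketch in C of that distinction (both functions are my own
illustration, nothing from the conversation quoted above): one mathematical
fact, 2 + 2 = 4, realized by two physically different procedures.

    #include <stdio.h>

    /* Addition as repeated increment: one physical procedure,
       the fingers-and-toes method. */
    static unsigned add_by_counting(unsigned a, unsigned b)
    {
        while (b > 0) { a++; b--; }   /* tick items off one by one */
        return a;
    }

    /* Addition as carry propagation: a different physical procedure,
       closer to what an adder circuit does. */
    static unsigned add_by_carries(unsigned a, unsigned b)
    {
        while (b != 0) {
            unsigned carry = a & b;   /* positions that generate a carry */
            a = a ^ b;                /* sum without the carries */
            b = carry << 1;           /* carries move one place left */
        }
        return a;
    }

    int main(void)
    {
        /* Two distinct computations, one mathematical fact. */
        printf("%u %u\n", add_by_counting(2, 2), add_by_carries(2, 2));
        return 0;
    }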

Bill Benzon
bbenzon@mindspring.com

917-717-9841

http://new-savanna.blogspot.com/ 
http://www.facebook.com/bill.benzon 
http://www.flickr.com/photos/stc4blues/
https://independent.academia.edu/BillBenzon
http://www.bergenarches.com 

--[2]------------------------------------------------------------------------
        Date: 2020-06-05 18:35:10+00:00
        From: Dr. Herbert Wender 
        Subject: Re: [Humanist] 34.83: notation, computation, liberation (?)

Rhetorically most remarkable in Backus's 1977 Turing Award lecture: the word
'bottleneck' (24x) and the expression 'word-at-a-time' (18x). In the early
years of the WWW - if I remember correctly - a similar magic word was
'bandwidth'. Both problems, I guess, were solved by time, as was the
controversy between reasoning and so-called power approaches in the
programming of chess-playing machines in the 1980s.
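
For readers without the lecture to hand, a minimal sketch of what Backus
meant by 'word-at-a-time' (the C below is my own illustration, not his FP
notation): the first function threads one word of state at a time through
the von Neumann bottleneck; the second expresses the same sum as the
application of a general combining form, approximated here by a fold.

    #include <stdio.h>

    /* Word-at-a-time, imperative style: explicit state transitions. */
    static int sum_imperative(const int *a, int n)
    {
        int total = 0;            /* the state */
        for (int i = 0; i < n; i++)
            total = total + a[i]; /* one word per step through the bottleneck */
        return total;
    }

    /* Applicative style, approximated in C: the stepping is hidden
       inside a general fold, and the program is fold(plus, 0, a). */
    static int fold(int (*f)(int, int), int init, const int *a, int n)
    {
        return n == 0 ? init : fold(f, f(init, a[0]), a + 1, n - 1);
    }

    static int plus(int x, int y) { return x + y; }

    int main(void)
    {
        int a[] = { 1, 2, 3, 4 };
        printf("%d %d\n", sum_imperative(a, 4), fold(plus, 0, a, 4));
        return 0;
    }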

I do not understand at all why the question of processor architectures should
worry DH before the debate on data structures (and the operations on them)
suited to solving DH problems has reached a point where actual processors can
be shown to be an obstacle. Perhaps it would be better in DH contexts to speak
of higher-level manipulation languages rather than of programming languages;
and then, should we really be afraid of conceptualizing solutions in an
'imperative style'? As a textual scholar I also do not understand what is bad
about the state/transition concepts against which Backus was fighting. Who
will explain the reasons for his resistance?

Regards, Herbert



--[3]------------------------------------------------------------------------
        Date: 2020-06-05 16:06:20+00:00
        From: Robert Delius Royar 
        Subject: Re: [Humanist] 34.83: notation, computation, liberation (?)

I enjoyed reading Humanist 34.83 (the digest form). It brought back memories
from nearly 40 years ago. The postings from Sperberg-McQueen and Riddell
crystallized some of what I had been thinking while following these two
interrelated (recent) threads but had not been able to articulate. Below, I offer
some idle musings.

On 2020-06-05 06:53:57+00:00 in Humanist 34.83 C. M. Sperberg-McQueen wrote

> A recent article in CACM pointed out that the machine exposed by modern C
compilers is really much less like the machines now on people's desktops than
like the machines on which C was first designed and implemented.

When I still write C, I sometimes think about how my code interacts with the 8-core,
multithreaded chip that keeps the machine together. My meager abilities in my
self-taught coding of plain-old C do not include an understanding (even at the
level of metaphor) of the way the instructions will be run. This differs
greatly from the early 1980s when I often recoded functions in mc68K assembler
mnemonics to assure fewer cycles in screen refresh, video display buffering, or
waiting for keyboard entry. I could imagine those operations and could predict
what the Alcyon compiler would create from the short lines of C code I
wrote -- what tweaks would help the mc68K handle re-display more quickly and the
ways that ancient code found in so many open-source programs needed rewriting so
as not to become caught in loops from which only a hard reset could free it.
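
As an illustration of the kind of tweak involved -- reconstructed from the
habits of that era, not from Royar's actual code -- a simple compiler of the
time would typically emit a hardware multiply (MULU, costing dozens of cycles
on the mc68K) for a frame-buffer address calculation, where shifts and adds
were far cheaper:

    #include <stdio.h>

    /* Byte offset into a 320-pixel-wide frame buffer, written the
       obvious way; a naive compiler emits a hardware multiply. */
    unsigned long offset_slow(unsigned y, unsigned x)
    {
        return y * 320ul + x;
    }

    /* Hand-tuned: 320*y == 256*y + 64*y, so two shifts and two adds
       replace the multiply -- the kind of rewrite one found by reading
       the compiler's assembler output. */
    unsigned long offset_fast(unsigned y, unsigned x)
    {
        unsigned long yl = y;
        return (yl << 8) + (yl << 6) + x;
    }

    int main(void)
    {
        printf("%lu %lu\n", offset_slow(100, 7), offset_fast(100, 7));
        return 0;
    }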

On 2020-06-05 06:53:18+00:00 in Humanist 34.83 C. M. Sperberg-McQueen wrote

> Some languages want to make it easier for us to give instructions to our
machines, and some programmers want to feel that they are working close to the
machine (though those who talk most about it are C programmers, and the machine
they are closest to is the PDP 11, lovingly recreated by modern hardware and
compilers, not the one their code is actually running on).

“Lovingly” I do recall the joy of programming during that early period once I
could translate an idea about an operation into commands that worked on the
computer. Learning to design structures, how they were stored in the computer’s
memory, why manipulating pointers is more efficient and “elegant” than copying
structures, how flipping bits in bit fields can represent a host of complex
values in a small space, &c. Coding felt like writing stories, essays, seminar
papers.
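
A small reconstruction of those two habits (the struct and flag names here
are hypothetical): passing a pointer moves a fixed-size address instead of
copying the whole structure, and single bits in one word stand in for a
host of boolean values.

    #include <stdio.h>

    /* Illustration only; the record layout is invented. */
    struct record {
        char     name[64];
        unsigned flags;            /* many yes/no values in one word */
    };

    #define F_VISIBLE  (1u << 0)
    #define F_LOCKED   (1u << 1)
    #define F_DIRTY    (1u << 2)

    /* Pass by pointer: only an address travels, not the 68+ bytes
       a copy of the structure would require. */
    static void mark_dirty(struct record *r)
    {
        r->flags |= F_DIRTY;       /* flip one bit in place */
    }

    int main(void)
    {
        struct record r = { "lamp", F_VISIBLE };
        mark_dirty(&r);
        printf("dirty? %s\n", (r.flags & F_DIRTY) ? "yes" : "no");
        return 0;
    }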

…

> The Sapir/Whorf hypothesis is tempting, but does not appear to be any simpler
or more direct in operation in programming languages than in natural languages.

Learning Basic, Prolog, Lisp, C, mc68K and z80 assembler, DCL, Perl, and coding
my own macro language and terminal shell to help me proof and print my
dissertation infected my head with ways of seeing the physical world that were
different from the ways I conceived before learning these programming tools (C
and assembler especially). I can recall a few years after completing my doctoral
work when I often imagined how I would code the properties of objects (for
example a “lamp”) or words (for example “lamp”) using a structure, including
contemplating the storage requirements for the structure and the TYPEDEF I might
need to assure that the Platonic *lamp* could contain instances that differed
from the ideal, or how *lamp* could become a part of another TYPEDEF known as
*furnishings*. At another level there would need to be a TYPEDEF for the concept
of “lamp” in human language which allowed for the prototypical instantiation of
*lamp* to be encompassed in the definition of *lamp*.
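
One way that musing might look in actual C, with every name hypothetical: a
typedef for the Platonic *lamp*, a prototype instance from which concrete
lamps may deviate, and a larger *furnishings* typedef that encompasses it.

    #include <stdio.h>

    /* Hypothetical types: a prototype plus room for deviant instances. */
    typedef struct lamp {
        const char *name;
        unsigned    lumens;
        int         is_lit;
    } Lamp;

    /* The Platonic prototype that every instance starts from. */
    static const Lamp LAMP_IDEAL = { "lamp", 800, 0 };

    /* A lamp is one kind of furnishing; the concepts nest. */
    typedef struct furnishing {
        enum { F_LAMP, F_CHAIR, F_TABLE } kind;
        union { Lamp lamp; } as;
    } Furnishing;

    int main(void)
    {
        Furnishing f = { F_LAMP, { LAMP_IDEAL } };
        f.as.lamp.lumens = 60;     /* this instance differs from the ideal */
        printf("%s: %u lumens\n", f.as.lamp.name, f.as.lamp.lumens);
        return 0;
    }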

So, I believe that learning programming (like the study of Old English, German,
and Greek) gave me a lens (if not a cataract) through which I often filtered my
view of the “*world*”.

On 2020-06-04 13:28:31+00:00 in Humanist 34.83 Allen Riddell wrote

> But the people who write and maintain the compiler must, of course, deal with
the underlying hardware architecture on a daily basis.

The old Alcyon C compiler created a series of discrete intermediate files as it
translated C into a final executable. There were supposed to have been different
front ends for the compiler that would compile other high-level languages (such
as Pascal), but I never saw evidence that those front ends were ever available.
It was the compiler system used by DRI to create CP/M-68K and GEM, and later
Atari TOS. I taught myself to write 68K assembler using a thick book for the
Motorola chip and the code that the Alcyon compiler created to send to the
assembly program. You could learn a good deal about the way the resulting
program ran by comparing how identical lines of C led to different snippets of
assembly based on the context in which the C appeared.
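
A modern analogue of that exercise (exactly what a given compiler emits will
vary, but the principle holds): the identical line of C compiles to different
code depending on the context the compiler sees around it.

    #include <stdio.h>

    /* Here a compiler can typically keep 'total' and 'p' in registers
       for the whole loop. */
    static int sum_fast(const int *p, int n)
    {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += *p++;         /* same source line as below */
        return total;
    }

    /* The volatile qualifier forces a real memory access on every
       iteration, so the identical line now costs a load each pass. */
    static int sum_slow(volatile const int *p, int n)
    {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += *p++;         /* same source line as above */
        return total;
    }

    int main(void)
    {
        int a[] = { 1, 2, 3 };
        printf("%d %d\n", sum_fast(a, 3), sum_slow(a, 3));
        return 0;
    }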

--
 Caught in the web since 1984.






_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php


Editor: Willard McCarty (King's College London, U.K.; Western Sydney University, Australia)
Software designer: Malgosia Askanas (Mind-Crafts)

This site is maintained under a service level agreement by King's Digital Lab.