
Humanist Discussion Group

Humanist Archives: July 26, 2020, 10:04 a.m. Humanist 34.187 - on surprise

                  Humanist Discussion Group, Vol. 34, No. 187.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                Submit to: humanist@dhhumanist.org

        Date: 2020-07-25 11:14:32+00:00
        From: John Naughton 
        Subject: Fwd: Surprises

Dear Willard

Re your question: Where do lasting surprises come from?

One illustration comes from the history of the Internet (i.e. the
network based on the TCP/IP family of protocols that we use today).

  Design work on what was called the "internetworking" project began in
the Fall of 1973, following the completion of the Arpanet.  The design
brief was to devise a way of seamlessly linking the various
packet-switched networks that had come into being while the Arpanet was
being built.  The design challenges were (i) that these other networks
were 'independent' (i.e. had been built and were owned by non-US
entities -- the NPL in Britain, the Cyclades network in France, for
example); and (ii) that the new 'internetwork' should -- unlike the
analog telephone network -- not be optimised for any particular
application because this could make it obsolete as new applications emerged.

  From these considerations, two design axioms emerged for the new network.

1. No central ownership or control
2. A network that did only one thing: take data-packets in at one edge
   and make best efforts to deliver them to their destinations at the other.

The second axiom was the critical one.  It became known as the
end-to-end principle, and colloquially as the "stupid network, smart
applications" principle.  The idea was to leave all the ingenuity to
users at the edges of the network.  If you had an idea that could be
realised using data-packets, and had the necessary skills to write the
code, then the new network would do it for you, no questions asked.  And
no permission to use the network was required; provided your computer
ran the TCP/IP stack then you were in.
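By way of illustration (my sketch, not anything from the original design
documents), the "stupid network" idea can be seen in miniature with
Python's standard socket API: the network layer only carries datagrams
between addresses, while every scrap of application meaning lives in the
code at the two edges. Here both edges run on the local machine, and the
message and port are arbitrary choices of mine.

```python
import socket

# One edge of the network: a listener bound to a local address.
# The network is not told, and does not care, what the packets mean.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))   # let the OS pick a free port
port = receiver.getsockname()[1]

# The other edge: any program able to emit packets may speak --
# no registration with, or permission from, the network is required.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello from the edge", ("127.0.0.1", port))

# All interpretation of the bytes happens here, at the receiving edge.
data, addr = receiver.recvfrom(1024)
print(data.decode())

sender.close()
receiver.close()
```

The point of the sketch is what is absent: nothing in the middle
inspects, approves, or optimises for this application, which is exactly
the agnosticism the second axiom demands.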

The network was thus from the outset agnostic about the purposes of
users and their applications.  All it dealt with was the routing and
forwarding of data-packets.  And it turned out that users had lots of
ideas for applications, and the skills to implement them in code, and so
the network proved to be amazingly generative.  The explosion of
creativity that it enabled was a consequence of its design axioms.  Vint
Cerf and Bob Kahn and their colleagues had designed what Barbara van
Schewick later described as "an architecture for permissionless
innovation".  A more colloquial way of putting it is that the TCP/IP
network became a global machine for springing surprises.

Some of those surprises were pleasant ones -- Voice over IP (VoIP) and
streaming media, for example, and -- most of all -- the Web.  When it became
clear that CERN had no organisational interest in Tim Berners-Lee's
design for the Web, he simply put the code onto CERN's Internet server
and the network did the rest.  No permissions needed or sought.  Napster
and file-sharing technologies were also a surprise, though not so
pleasant for some industrial incumbents.  Facebook was also a surprise:
having been punished by Harvard for illicit use of the university's
network with his original 'Facemash' venture, Zuckerberg borrowed a
thousand bucks from his friend Eduardo Saverin and put the Facebook code
onto an Internet server. Again, no permission needed.

There was, however, one significant difference between Zuckerberg's
surprise and Berners-Lee's.  Tim adhered to the generative, permissive
spirit of the original design axioms.  Anyone could build
anything they wanted to on the platform that he created.  Zuckerberg --
and the other entrepreneurs who have followed the same corporate
approach -- did not believe in permissionless innovation, only in
innovation that they approved and regulated.  (With occasional lapses,
as with Cambridge Analytica -- though in fact that happened because of
Facebook's decision in 2007 to turn itself into a 'platform' for other
developers willing to adhere to the T&Cs set by the platform owner.)  So
there is a certain irony in the fact that it came as a surprise -- both
to the owners of the platforms and to the democracies in which they
operate -- that the sensationally profitable targeting machines of
proprietary social media platforms could be exploited for malign
purposes by foreign and domestic agents!



Professor John Naughton
Director, Press Fellowship Programme
Wolfson College, Cambridge
blog: memex.naughtons.org
Observer column: https://www.theguardian.com/technology/series/networker
tw: @jjn1

Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php

Editor: Willard McCarty (King's College London, U.K.; Western Sydney University, Australia)
Software designer: Malgosia Askanas (Mind-Crafts)

This site is maintained under a service level agreement by King's Digital Lab.