
Humanist Discussion Group



Humanist Archives: April 22, 2020, 8:23 a.m. Humanist 33.790 - on academia.edu, proprietary & open

                  Humanist Discussion Group, Vol. 33, No. 790.
            Department of Digital Humanities, King's College London
                   Hosted by King's Digital Lab
                       www.dhhumanist.org
                Submit to: humanist@dhhumanist.org


    [1]    From: Francois Lachance 
           Subject: Thank you > Re: [Humanist] 33.785: on using academia.edu (36)

    [2]    From: Gabriel Egan 
           Subject: Re: [Humanist] 33.785: on using academia.edu (62)

    [3]    From: Ian Johnson 
           Subject: Re: [Humanist] 33.785: on preserving work (was: on using academia.edu) (36)


--[1]------------------------------------------------------------------------
        Date: 2020-04-21 13:28:59+00:00
        From: Francois Lachance 
        Subject: Thank you > Re: [Humanist] 33.785: on using academia.edu

Tim

Thank you for your posting to Humanist about the porous relation between
proprietary and open standards. You might be interested in my off-list reply to
Gabriel yesterday. This in particular resonates with your call to be catholic in
our tastes:

Unfortunately I created the animation in a time before SVG, and the Canadian
situation at the time was not able to emulate the fine example of the Arts and
Humanities Research Council, which rightly insists on Open everything (Open
Standards, Open Data, Open Access, Open Source) in all the projects.

Some day I hope to learn enough SVG to convert my little ditty ("Earth
Washing") or team up with some skilled converters.

I, like you (I suspect), don't think that emulation is the best way to go.
Wholesale translation is the way to make the stranded assets not only readable
but also accessible.

I think they might want to add Open Translators to their list.  Or should it be
Open Standards and Universal Translator?

Thanks for making my thinking about this clearer.

F

~  ~  ~  ~  ~  ~  ~
François Lachance
Scholar-at-large
Wannabe Professor of Theoretical and Applied Rhetoric
http://www.chass.utoronto.ca/~lachance
https://berneval.hcommons.org

to think is often to sort, to store and to shuffle: humble, embodied tasks


--[2]------------------------------------------------------------------------
        Date: 2020-04-21 12:58:38+00:00
        From: Gabriel Egan 
        Subject: Re: [Humanist] 33.785: on using academia.edu

Dear HUMANISTs

I agree with Tim Smithers that ". . . some good things do
arrive first as proprietary products, PDF being one of them".
But actually in the case of PDF I think the format became
a good thing even before it became an Open Standard. A
useful intermediate position for a format opens up when
it is still proprietary but lots of different software
manufacturers are making tools that are compatible with
it. The really dangerous situation arises when a proprietary
format is usable only with one software manufacturer's
products, as with Adobe's Shockwave and Flash formats.

An object lesson here is the British Broadcasting
Corporation's reliance on the proprietary Real Audio
sound format for its Internet broadcasting beginning
in 1996. It took a lot of time and money to unpick that
foolish decision. The WAV sound format is also proprietary,
I believe, but with so many software applications able
to read and write WAV files this hardly matters. If
you don't like one WAV-enabled tool, pick up another;
no one company dominates the market for these tools.

When PDF first became commonly used it functioned like
a kind of digital rights management system in the sense
that the creators of PDFs widely and correctly believed
that files made in this format could not be edited by
their recipients. This happened because Adobe made the PDF
reader software free for anyone to download and charged
a lot of money for the software, Acrobat, capable of
writing PDFs. The sense of privilege amongst those
capable of creating PDFs, notably publishers using
them for proofs, was palpable, and for about a
decade Adobe's business model for PDFs was built on
this feeling.

The greatest service done to the usefulness of PDF
occurred before its transition in 2008 to an Open
Standard, when in 2006 Microsoft's 'Word 2007'
software offered an optional 'Save as PDF' add-on,
which became a standard feature with 'Word 2010'.
This broke the illusion that only institutions and
corporations could create PDFs and, I suspect, it
hastened the format's transition to an Open Standard.

I detect a similar smug sense of superiority surrounding
the issuing of a DOI, which publishers seem to believe
confers respectability and longevity on a digital
object. In fact, even the scammiest of fake journals
is able to register DOIs for its articles. See, for
instance, DOI 10.33552/SJRR.2020.02.000540, the
hilarious fake article "What's the deal with birds?"
that Daniel T. Baldassarre managed to place with
the 'Scientific Journal of Research and Reviews'
(ISSN 2687-8097).

Regards

Gabriel Egan




--[3]------------------------------------------------------------------------
        Date: 2020-04-21 11:39:09+00:00
        From: Ian Johnson 
        Subject: Re: [Humanist] 33.785: on preserving work (was: on using academia.edu)

Francois, Gabriel, Tim,

Could I go further and suggest that the important thing is not open standards
per se but the separation of data and presentation? By all means use LockEmUpViz
to explore your data, even to present results, but don't use it as storage, even
if you hope one day to persuade its makers to open the format.

Manage data in a proper data management system, and use any tool you like to get
results (assuming you know and trust how it gets there), because research should
be pushing the boundaries of the possible. But when it comes to thinking about
the future, save the data in a well-documented format, document the workflows,
document the final look-and-feel, and in 50 years' time someone's AI-powered
personal assistant will be able to reproduce the product of your years of
painstaking work in seconds ...
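
To make that concrete, here is a minimal sketch of what "save the data in a
well-documented format and document the workflow" might look like in practice:
a plain CSV export accompanied by a small JSON codebook recording what each
column means and how the file was produced. The column names, codes and file
names are invented for illustration.

    # export_with_codebook.py : a minimal, hypothetical sketch of exporting a
    # dataset as plain CSV plus a JSON "codebook" documenting columns, units
    # and provenance. All names and values here are invented for illustration.
    import csv
    import json
    from datetime import date

    records = [
        {"site_id": "S001", "period": "EBA", "sherd_count": 142},
        {"site_id": "S002", "period": "MBA", "sherd_count": 87},
    ]

    # 1. The data itself, in the most boring and durable format available.
    with open("survey_data.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["site_id", "period", "sherd_count"])
        writer.writeheader()
        writer.writerows(records)

    # 2. A sidecar codebook: what the columns mean, and how the file was made.
    codebook = {
        "dataset": "survey_data.csv",
        "created": date.today().isoformat(),
        "columns": {
            "site_id": "Site identifier assigned during field survey",
            "period": "Phase code: EBA = Early Bronze Age, MBA = Middle Bronze Age",
            "sherd_count": "Number of diagnostic sherds recovered (integer)",
        },
        "workflow": "Exported from the project database; counts aggregated per site.",
    }
    with open("survey_data.codebook.json", "w", encoding="utf-8") as f:
        json.dump(codebook, f, indent=2)

Both files are plain text and self-describing, so the tool that produced them
can vanish without taking the data's meaning with it.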

To put that in perspective, an early 1970s database in my field was a flat
fixed-width file (or deck of cards) on a mainframe with coded columns and a
codesheet (inline or separate) into which you edited your analysis instructions,
perhaps producing line-printer character graphs (processing turnaround time 1-2
hours if you were not one of the lucky few with online disk storage and a
terminal). Within a couple of decades that would be no more than a simple data
file, easily migrated to an interactive analysis environment on the desktop,
coloured graphs, pixel printing, even some mapping. Footnote: documentation
standards often went downhill, because it became TOO easy to just create a new
column and shove in any old text ...
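
For readers who never met such a file, the encouraging part is that it remains
trivially readable today, provided the codesheet survived. A minimal sketch,
with an invented column layout rather than any real codesheet:

    # read_fixed_width.py : a minimal, hypothetical sketch of reading a
    # 1970s-style fixed-width ("coded columns") file using a codesheet that
    # gives each field's name and character positions. The layout is invented.

    # Codesheet: field name -> (start column, end column), 1-based and
    # inclusive, as such documents usually specified them.
    CODESHEET = {
        "site_id": (1, 4),
        "period_code": (5, 6),
        "sherd_count": (7, 11),
    }

    def parse_line(line: str) -> dict:
        """Slice one fixed-width record into named fields."""
        record = {}
        for name, (start, end) in CODESHEET.items():
            record[name] = line[start - 1:end].strip()
        return record

    sample = "S00103  142"   # one 11-character record, padded as on a card
    print(parse_line(sample))
    # {'site_id': 'S001', 'period_code': '03', 'sherd_count': '142'}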

Regards

Ian

Ian Johnson | Honorary Associate
Faculty of Arts and Social Sciences
Rm 445, Old Teachers College A22 | The University of Sydney | NSW | 2006
35, rue des Abbesses, Paris 75018
Mob: +33 6 95 34 14 66



_______________________________________________
Unsubscribe at: http://dhhumanist.org/Restricted
List posts to: humanist@dhhumanist.org
List info and archives at: http://dhhumanist.org
Listmember interface at: http://dhhumanist.org/Restricted/
Subscribe at: http://dhhumanist.org/membership_form.php


Editor: Willard McCarty (King's College London, U.K.; Western Sydney University, Australia)
Software designer: Malgosia Askanas (Mind-Crafts)

This site is maintained under a service level agreement by King's Digital Lab.