15.436 evaluating Web-sites

From: Humanist Discussion Group (by way of Willard McCarty willard.mccarty@kcl.ac.uk)
Date: Sat Jan 05 2002 - 05:24:18 EST

                   Humanist Discussion Group, Vol. 15, No. 436.
           Centre for Computing in the Humanities, King's College London
                   <http://www.princeton.edu/~mccarty/humanist/>
                  <http://www.kcl.ac.uk/humanities/cch/humanist/>

       [1] From: Michael Fraser <mike.fraser@COMPUTING- (32)
                     SERVICES.OXFORD.AC.UK>
             Subject: Re: 15.431 evaluating Web-sites, measuring learning

       [2] From: Steve Krause <skrause@ONLINE.EMICH.EDU> (40)
             Subject: Re: 15.431 evaluating Web-sites, measuring learning

    --[1]------------------------------------------------------------------
             Date: Sat, 05 Jan 2002 10:06:38 +0000
             From: Michael Fraser <mike.fraser@COMPUTING-SERVICES.OXFORD.AC.UK>
             Subject: Re: 15.431 evaluating Web-sites, measuring learning

    Willard,

    I missed the posting to which you refer, but you might be interested to
    know that our Cataloguing Guidelines for Internet resources include a
    short evaluation form which we use in training workshops and which
    cataloguers, if they wish, can use to take notes about a site prior to
    cataloguing it. Our cataloguing guidelines, which may also be of interest,
    contain an appendix of starting points for evaluating web sites.

    Evaluation form at http://www.humbul.ac.uk/about/humbul-evaluation.pdf
    Cataloguing Guidelines (appendix 3) at
    http://www.humbul.ac.uk/about/catalogue.html#evaluating

    I like the Arizona State Library, Archives and Public Records' "Collection
    Development Training for Arizona Public Libraries" Web site. The section
    on selecting library resources places the criteria for selecting
    Internet resources within the context of selection criteria for other
    resources (books, periodicals, audiovisual, etc.). There are few examples
    of criteria for the evaluation of books and articles (published or
    unpublished) but, as you note, many examples of criteria for Internet
    resources. Unless, of course, Research Assessment Exercise panels have
    checklists of criteria for measuring authority, bias, currency, audience,
    organisation, aesthetics etc.

    Best wishes, Michael

    ---
    Dr Michael Fraser
    Head of Humbul Humanities Hub
    Humanities Computing Unit, OUCS
    University of Oxford
    13 Banbury Road
    Oxford OX2 6NN
    Tel: 01865 283 343
    Fax: 01865 273 275
    Email: mike.fraser@oucs.ox.ac.uk
    http://www.humbul.ac.uk/
    

    --[2]------------------------------------------------------------------
             Date: Sat, 05 Jan 2002 10:06:58 +0000
             From: Steve Krause <skrause@ONLINE.EMICH.EDU>
             Subject: Re: 15.431 evaluating Web-sites, measuring learning

    Willard et al--

    For your and the group's consideration, I'd like to submit the web site for a presentation I gave at the Computers and Writing Conference in Muncie, IN last June. The URL for it is http://www.emunix.emich.edu/~krause/cw2001. Since it was a conference presentation that hasn't evolved any further than that (yet), it is still somewhat drafty in form, but I think it's interesting in this context.

    I showed students in an upper-level undergrad class and a grad class some web sites, most of which were "spoofs" or jokes but one of which was "real" or credible, and asked them to tell me why they thought a particular site was "real" or "fake." Of course, the distinction between "real" and "fake" is a problematic one, but if you see the sites, I think you get the idea.

    I reached a couple of very tentative conclusions. First, applying the traditional standards of clarity (like your very good site), it seemed to me that it was still possible to evaluate "fake" sites as "real" because they did a good job of "faking" credibility. Second, students (and most web readers, I would argue) didn't evaluate the credibility of web sites according to these criteria. When asked why they evaluated a site the way they did, most of them talked about how the site "looked" credible-- that is, a lot of their judgement was based on a sort of simple aesthetic judgement of "if it looks right, it must be right"-- and they talked about "common sense" knowledge, which I think really amounts to previous knowledge. They knew enough *before* they looked at the GenoChoice site (for example) to realise that it's impossible to clone a child over the Internet.

    And I guess a last conclusion (besides the idea that this would be interesting to study further) is that I think as teachers, we have to work harder at making students understand that they really REALLY can't judge a web site based on how it "looks." It's a pretty difficult process to make a professional-looking book or magazine-- that's not something that most of us could pull off in the basement. On the other hand, just about anyone with a few hours' experience with Photoshop and a few other pieces of software can make very credible, very "real"-looking web sites.

    Anyway, if anyone sees this site and has any other comments, I'd be interested in hearing them.

    --Steve

    --
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    Steven D. Krause * Assistant Professor, English
    614G Pray-Harrold Hall * Eastern Michigan University
    Ypsilanti, MI 48197 * 734-487-1363
    http://www.online.emich.edu/~skrause


