[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-21 Thread brentier
 of Liège, Belgium
 Le 19 sept. 2014 à 21:52, Richard Poynder ri...@richardpoynder.co.uk a 
 écrit :
 Dear Bernard,
 I have two questions if I may:
 1.   You say that Liège is getting close to 90% compliance. Can you 
 explain how you know that, and how you calculate compliance levels? I ask 
 this because the consistent theme coming through from UK universities with 
regard to compliance with the RCUK OA mandate is that they simply do not know 
 how many research outputs their faculty produce each year. If that is 
 right, what systems does Liège have in place to enable it to produce a 
 comprehensive list of research outputs that UK universities apparently do 
 not have?
 2.   Does Liège track the licences attached to the deposits in its 
 repository? If so, can you provide some stats, especially the number of 
 items that are available CC-BY (which we are now told is required before a 
deposit can be characterised as being open access)?
 Thank you.
 Richard Poynder
 From: goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] On Behalf 
 Of brent...@ulg.ac.be
 Sent: 19 September 2014 18:46
 To: Global Open Access List (Successor of AmSci)
 Subject: [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster
 Liège does not mandate anything, so far as I know; it only looks into the 
 local repository (Orbi) to see what is in it, and it does so to assess 
 performance or respond to requests for promotions or grant submissions. 
 (JC. Guédon)
 Oh no, Jean-Claude, Liège mandates everything.
 It is a real mandate and it took me a while to get almost every ULg 
 researcher to realise that it is to his/her benefit. 
 Linking the deposits to personal in-house assessment was the trick to get 
 the mandate enforced in the first place. As well as a few positive 
 incentives and a lot of time consuming persuasion (but it was well worth 
 it).
Last Wednesday, the Liège University Board put a final touch of wisdom on its mandate by adding "immediately upon acceptance, even in restricted access" to the official procedure. This is actually a nice but, to some extent, superfluous addition because, with time (the mandate was imposed in 
 2007), ULg authors have become so convinced of the increase in readership 
 and citations that two thirds of them make their deposits between the date 
 of acceptance and the date of publication. 
 All this explains why we are getting close to 90% compliance, an 
 outstanding result, I believe. 
 
 Le 18 sept. 2014 à 23:40, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca a écrit :
 A reasonably quick response as I do not want to go into discursive tsunami 
 mode...
 
 1. Stevan admits that his evaluation of compliance is an approximation, 
 easy to get, but not easy to correct. This approximation varies greatly 
 from one institution to another, one circumstance to another. For example, 
 he admits that language plays a role; he should further admit that the 
 greater or smaller proportion of SSH researchers in the research 
communities of various institutions will also play a role. In short, 
 comparing two institutions by simply using WoS approximations appears rash 
 and unacceptable to me, rather than simply quick and dirty (which I would 
 accept as a first approximation).
 
 The impact factor folly was mentioned because, by basing his approximation 
 on the WoS, Stevan reinforces the centrality of a partial and questionable 
 tool that is, at best, a research tool, not a management tool, and which 
 stands behind all the research assessment procedures presently used in 
 universities, laboratories, etc.
 
 2. Stevan and I have long differed about OA's central target. He limits 
 himself to journal articles, as a first step; I do not. I do not because, 
 in the humanities and social sciences, limiting oneself to journal articles 
 would be limiting oneself to the less essential part of the archive we work 
 with, unlike natural scientists. 
 
 Imagine a universe where a research metric would have been initially 
 designed around SSH disciplines and then extended as is to STM. In such a 
 parallel universe, books would be the currency of choice, and articles 
 would look like secondary, minor, productions, best left for later 
assessments. Then, one prominent OA advocate named Stevan Harnad might 
 argue that the only way to proceed forward is to focus only on books, that 
 this is OA's sole objective, and that articles and the rest will be treated 
 later... Imagine the reaction of science researchers... 
 
 3. Liège does not mandate anything, so far as I know; it only looks into 
 the local repository (Orbi) to see what is in it, and it does so to assess 
 performance or respond to requests for promotions or grant submissions. If 
 books and book chapters are more difficult to treat than articles, then 
 place them in a dark archive with a button. This was the clever solution 
 invented by Stevan and I agree with it.
 
 4. To obtain mandates, you need either faculty

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-21 Thread brentier
I agree, Richard, but we are not really looking for accuracy here; we are 
looking for a general trend. The method is approximate and, as Jean-Claude 
rightly points out, it suffers from a severe language and domain bias. In other 
words, it is plagued by substantial underestimation.
Whether ORBi's compliance level is 70, 80 or 90% is not a major concern to me 
(even though I would love it to reach 100%!), I must admit. I am satisfied to 
observe that it is very high and not in the 15-30% range, which is what happens 
when a mandate is not enforced through a link to assessment procedures.

Bernard


 Le 21 sept. 2014 à 10:51, Richard Poynder ri...@richardpoynder.co.uk a 
 écrit :
 
 As a layperson I would certainly be interested to know what margin of error 
 levels we can assume the “Web of Science and/or in Scopus” approach has. I am 
 conscious, for instance, that some of the reports by UK universities into 
 RCUK compliance mention using Web of Science, but they all appear keen to 
 stress that they have serious concerns about data accuracy.
  
 A list of RCUK compliance reports, by the way, can be found here: 
 http://goo.gl/Yi3twT
  
 There is also a very informative blog post on the topic of monitoring open 
 access mandates/policies by Cameron Neylon here: http://goo.gl/Y02S87
  
 Richard Poynder
  
  
  
  
 From: goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] On Behalf Of 
 Jean-Claude Guédon
 Sent: 20 September 2014 23:27
 To: Global Open Access List (Successor of AmSci)
 Subject: [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster
  
 Extremely good answer, Bernard!
 
It is also very good to clarify that the 90% figure is calculated 
against the baseline of a combined WoS/Scopus search. However, and this was 
 part of my difficulties with Stevan's argument, I suspect that in SSH, in a 
 French-speaking university, many publish in French-language journals that do 
 not appear in either list. This means that, for Liège, the baseline works 
 from one year to the next, but if you want to compare Liège's mandate and its 
 effectiveness (which, once again, I agree, is - from common sense - the best) 
 with another kind of mandate in an English-speaking university, the baselines 
 will not be comparable.
 
If, furthermore, you imagine two universities that not only differ 
linguistically but also differ in the relative weight of disciplines in their 
research output - say, one heavily slanted towards STM and the other towards 
SSH - this too will affect the baseline, simply because SSH publications are 
not as well covered by WoS and Scopus as STM publications are.
 
In conclusion, the baseline is fine for comparing a mandate's effectiveness 
longitudinally, or for comparing two successive but different mandates, 
assuming the institution remains pretty much the same over time in its mix of 
research emphases; it is far more questionable across institutions, especially 
(but not only) when different languages are involved.
 
 Incidentally, what proportion of papers deposited in ORBI do not appear in 
 either WoS or Scopus? That too would be interesting to know as it might help 
 Stevan refine his baseline and thus make it more convincing.
 
Finally, given that all universities require an annual assessment of 
performance, including a bibliography of publications for the completed year, 
would it be difficult to compare the repository's holdings against the 
publications declared by the researchers themselves? Knowing researchers, 
every last little scrap of paper will be minutely listed in the yearly 
assessment forms...
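
Such a comparison could in principle be scripted. A rough sketch, under the assumption that each researcher's declared annual bibliography and the repository's holdings can be reduced to sets of normalised identifiers (the function name and data structures below are hypothetical, not an existing ORBi or assessment-form export):

def declared_vs_deposited(declared_by_researcher, repository_ids):
    # declared_by_researcher: {name: set of identifiers from the annual assessment form}
    # repository_ids: set of identifiers held in the repository
    # Returns, per researcher, the fraction of declared items found in the repository.
    return {
        name: (len(items & repository_ids) / len(items) if items else 1.0)
        for name, items in declared_by_researcher.items()
    }

# The declared items missing from the repository would also answer the question above:
# how much output, including material not indexed in WoS or Scopus, escapes the baseline.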
 
 --
  
 Jean-Claude Guédon
 Professeur titulaire
 Littérature comparée
 Université de Montréal
 Le samedi 20 septembre 2014 à 19:10 +0200, Bernard Rentier - IMAP a écrit :
 
 Dear Richard,
  
 
 Here are the answers:
  
 
 1. ORBi, the Liège University Repository, will soon (I believe) reach 90% 
 compliance. It is our target for 2014 and I hope we make it.
This figure is the percentage of ULg papers that can be found in Web of Science 
and/or Scopus and that are also deposited in ORBi (see method in 
http://eprints.soton.ac.uk/340294/).
 It concerns one year at a time and it is not cumulative. Last May, the 
 compliance level for the publications of 2013 was already 73% and our figure 
 for 2012 is in the 80% range.
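
A minimal sketch of that calculation, assuming hypothetical lists of DOIs exported from WoS, Scopus and ORBi for a given publication year (the function name and the DOI-based matching are illustrative assumptions, not the actual ORBi procedure):

def compliance_rate(wos_dois, scopus_dois, orbi_dois):
    # Baseline: papers of the year indexed in Web of Science and/or Scopus.
    baseline = set(wos_dois) | set(scopus_dois)
    # Compliance: the fraction of that baseline also deposited in the repository.
    deposited = baseline & set(orbi_dois)
    return len(deposited) / len(baseline) if baseline else 0.0

# Illustrative figures only (not ULg data): a baseline of 1,000 indexed papers
# with 730 of them also in ORBi gives 0.73, i.e. a compliance level of 73%.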
  
 
 2. Only a small proportion of ULg papers are in CC-BY.
 This is simply because, in order to publish in the journal of their choice (I 
 haven’t tried to do anything against that!), our authors, in the great 
centuries-old tradition, give away their rights to the publisher. We have no 
control over that.
Later on, there is no way for them to release the same text under CC-BY (in 
fact, we are preparing ORBi 2.0, which will offer a CC-BY option).
 For now, we are aiming at free access and we are not yet fighting hard for 
 re-use rights. We shall move

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-20 Thread Stevan Harnad
On Fri, Sep 19, 2014 at 3:52 PM, Richard Poynder ri...@richardpoynder.co.uk
 wrote:

 Dear Bernard,



 I have two questions if I may:

 1.   You say that Liège is getting close to 90% compliance. Can you
 explain how you know that, and how you calculate compliance levels? I ask
 this because the consistent theme coming through from UK universities with
 regard to compliance with the RCUK OA mandate is that they simply do not know
 how many research outputs their faculty produce each year. If that is
 right, what systems does Liège have in place to enable it to produce a
 comprehensive list of research outputs that UK universities apparently do
 not have?

I'll let Bernard give the definitive answer, but let me mention right away
that last year we did a study of the Liege, Minho, Surrey and Lancaster
repositories, using exactly the methodology described here (determine the
full yearly output from WoS, then what percentage of it is deposited), and
the percentage for Liege was already over 80%.

See also the data on Liege's deposit latencies (i.e., how soon papers were
deposited, and how soon deposits were made OA), which are also quite
impressive.

Gargouri, Yassine, Larivière, Vincent & Harnad, Stevan (2013) Ten-year
Analysis of University of Minho Green OA Self-Archiving Mandate (in E
Rodrigues, A Swan & AA Baptista, Eds. *Uma Década de Acesso Aberto na
UMinho e no Mundo*). http://eprints.soton.ac.uk/358882/

Stevan Harnad



 2.   Does Liège track the licences attached to the deposits in its
 repository? If so, can you provide some stats, especially the number of
 items that are available CC-BY (which we are now told is required before a
  deposit can be characterised as being open access)?



 Thank you.





 Richard Poynder





 *From:* goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] *On
 Behalf Of *brent...@ulg.ac.be
 *Sent:* 19 September 2014 18:46
 *To:* Global Open Access List (Successor of AmSci)
 *Subject:* [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster



 *Liège does not mandate anything, so far as I know; it only looks into
 the local repository (Orbi) to see what is in it, and it does so to assess
 performance or respond to requests for promotions or grant submissions.*
 (JC. Guédon)



 Oh no, Jean-Claude, Liège mandates everything.

 It is a real mandate and it took me a while to get almost every ULg
 researcher to realise that it is to his/her benefit.

 Linking the deposits to personal in-house assessment was the trick to get
 the mandate enforced in the first place. As well as a few positive
 incentives and a lot of time consuming persuasion (but it was well worth
 it).

  Last Wednesday, the Liège University Board put a final touch of wisdom on its mandate by adding *immediately upon acceptance, even in restricted access* to the official procedure. This is actually a nice but, to some extent, superfluous addition because, with time (the mandate was imposed in
 2007), ULg authors have become so convinced of the increase in readership
 and citations that two thirds of them make their deposits between the date
 of acceptance and the date of publication.

 All this explains why we are getting close to 90% compliance, an
 outstanding result, I believe.






 Le 18 sept. 2014 à 23:40, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca a écrit :

 A reasonably quick response as I do not want to go into discursive tsunami
 mode...

 1. Stevan admits that his evaluation of compliance is an approximation,
 easy to get, but not easy to correct. This approximation varies greatly
 from one institution to another, one circumstance to another. For example,
 he admits that language plays a role; he should further admit that the
 greater or smaller proportion of SSH researchers in the research
  communities of various institutions will also play a role. In short,
 comparing two institutions by simply using WoS approximations appears rash
 and unacceptable to me, rather than simply quick and dirty (which I would
 accept as a first approximation).

 The impact factor folly was mentioned because, by basing his approximation
 on the WoS, Stevan reinforces the centrality of a partial and questionable
 tool that is, at best, a research tool, not a management tool, and which
 stands behind all the research assessment procedures presently used in
 universities, laboratories, etc.

 2. Stevan and I have long differed about OA's central target. He limits
 himself to journal articles, as a first step; I do not. I do not because,
 in the humanities and social sciences, limiting oneself to journal articles
 would be limiting oneself to the less essential part of the archive we work
 with, unlike natural scientists.

 Imagine a universe where a research metric would have been initially
 designed around SSH disciplines and then extended as is to STM. In such a
 parallel universe, books would be the currency of choice, and articles
 would look like secondary, minor, productions, best left

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-20 Thread Stevan Harnad
On Fri, Sep 19, 2014 at 4:38 PM, Jean-Claude Guédon 
jean.claude.gue...@umontreal.ca wrote:

part of the 30% (however it is calculated - is it 30% of WoS articles?)
 comes from the Gold road, and, therefore, falls under a different kind of
 argument.


Yes, it's based on WoS articles (hence an underestimate of the total) and
includes both Green and Gold OA.

Here are some data from a couple of years ago, when Green OA was about
20% and Gold OA was about 2%:

Gargouri, Yassine, Larivière, Vincent, Gingras, Yves, Carr, Les and Harnad,
Stevan (2012) Green and Gold Open Access Percentages and Growth, by
Discipline. In: *17th International Conference on Science and Technology
Indicators (STI)*, 5-8 September 2012, Montreal, Quebec, Canada.
http://eprints.soton.ac.uk/340294/

Stevan Harnad

On Fri, Sep 19, 2014 at 4:38 PM, Jean-Claude Guédon 
jean.claude.gue...@umontreal.ca wrote:

  I will let readers evaluate whether Stevan's answers are satisfactory or
 not. Except for the Liège mandate where I did not express myself
 sufficiently precisely, I disagree with points I-III and V-VI.

 I agree that point VII deserves being studied more precisely.

 For point VIII, part of the 30% (however it is calculated - is it 30% of
 WoS articles?) comes from the Gold road, and, therefore, falls under a
 different kind of argument. This said, I believe that Liège's solution is
 the best one presently available,* if you can get it*. In countries where
 university autonomy is far from being the norm (e.g. France), the clout of
 in-house assessments of performance is perforce very limited.

 Promoting the Liège solution is also what I do, and I do so everywhere,
 but promoting OA publishing platforms (such as Redalyc and, with some
 caveats, Scielo) that are both free and gratis is also what I do. IMHO,
 this is superior to promoting only and exclusively the Green road: it adds
 to the Green road without subtracting  anything from it. This was also the
 spirit of BOAI.

 Finally, I do not need any fancy statistical footwork to agree that the
 ways and means of the Liège mandate are the best. Common sense is enough
 for me.

 Let us get the Liège form of mandate wherever we can (which I am presently
 trying to do in my own university), and let us also do all we can to
 promote OA for all (including all disciplines).

 And I will stop this thread here.

   --

 Jean-Claude Guédon
 Professeur titulaire
 Littérature comparée
 Université de Montréal



   Le vendredi 19 septembre 2014 à 13:17 -0400, Stevan Harnad a écrit :

 *I.* A Web-of-Science-based estimate of Green OA mandate effectiveness —
 i.e., of *the annual percentage of institutional journal article output
 that is being self-archived in the institutional repository *— is fine.
 So is one based on SCOPUS, or on any other index of annual journal article
 output across disciplines.



  *II.* The fact that books are more important than journals in SSH
 (social science and humanities) in no way invalidates WoS-based estimates
 of Green OA mandate effectiveness. *The mandates apply only to journal
 articles.*



  *III. *Green OA mandates to date apply only to journal articles, not
 books, for many obvious reasons.



  *IV.* Jean-Claude writes: *“Liège does not mandate anything, so far as I
 know.” *



   *Cf:*  *“The University of Liege policy is mandatory… the
 Administrative Board of the University has decided to make it mandatory for
 all ULg members: - to deposit the bibliographic references of ALL their
 publications since 2002; - to deposit the full text of ALL their articles
 published in periodicals since 2002…*” http://roarmap.eprints.org/56/



  *V.* The fact that research metrics are currently mostly journal-article
 based has nothing to do with the predictive power of estimates of Green OA
 mandate effectiveness.



  *VI*. The WoS-based estimate of Green OA mandate effectiveness has
 nothing to do with “impact factor folly.”



  *VII.* Jean-Claude writes: “SSH authors are less interested in depositing
 articles than STM researchers.”



   As far as I know, there is not yet any objective evidence supporting
 this assertion. In fact, we are in the process of testing it, using the WoS
 data.



  *VIII*. *Status quo*: OA to journal articles is around 30% today. Our
 practical solution: Green OA mandates (and tests for which kinds of mandate
 are most effective) so they can be promoted for adoption. Other practical
 solutions?



  Stevan Harnad




  On Thu, Sep 18, 2014 at 5:17 PM, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca wrote:

  A reasonably quick response as I do not want to go into discursive
 tsunami mode...

 1. Stevan admits that his evaluation of compliance is an approximation,
 easy to get, but not easy to correct. This approximation varies greatly
 from one institution to another, one circumstance to another. For example,
 he admits that language plays a role; he should further admit that the
 greater or smaller proportion of SSH 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-20 Thread Heather Morrison
Bernard Rentier raises an important point about external evaluations here. At 
the University of Ottawa, we have a really good collective agreement on the 
topics of tenure and promotion, and any reliance, in tenure and promotion 
decisions, on things like the impact factor of the journals you publish in is 
in contravention of the collective agreement. However, as long as external 
evaluators of both tenure and promotion files and research grant proposals 
continue to focus on things like impact factor, a faculty member ignores these 
things at their peril.

A potential refusal of tenure - which generally means a loss of employment in a 
very tough job market - after many years spent completing a doctorate and a number 
more spent developing a tenure-worthy dossier is a very harsh punishment. It is my 
perspective that the OA movement needs to keep this in mind. We need tenured 
faculty to push forward on open access, and for emerging scholars this means 
making sure that their support for OA does not hinder their prospects for 
obtaining tenure.

best,

Heather Morrison


On 2014-09-20, at 1:10 PM, Bernard Rentier - IMAP wrote:

 Dear Richard,
 
 Here are the answers:
 
 1. ORBi, the Liège University Repository, will soon (I believe) reach 90% 
 compliance. It is our target for 2014 and I hope we make it.
 This figure is the percentage of ULg papers that can be found in Web of Science 
 and/or Scopus and that are also deposited in ORBi (see method in 
 http://eprints.soton.ac.uk/340294/).
 It concerns one year at a time and it is not cumulative. Last May, the 
 compliance level for the publications of 2013 was already 73% and our figure 
 for 2012 is in the 80% range.
 
 2. Only a small proportion of ULg papers are in CC-BY.
 This is simply because, in order to publish in the journal of their choice (I 
 haven’t tried to do anything against that!), our authors, in the great 
 centuries-old tradition, give away their rights to the publisher. We have no 
 control over that.
 Later on, there is no way for them to release the same text under CC-BY (in 
 fact, we are preparing ORBi 2.0, which will offer a CC-BY option).
 For now, we are aiming at free access and we are not yet fighting hard for 
 re-use rights. We shall move progressively in this direction of course, as 
 publishing mores evolve…
 In other words, I agree that we have free access, not full-fledged open 
 access yet. It is not a failure; our objective is to gain confidence first.
 Unfortunately, even though we have established in-house rules for evaluation, 
 external evaluations are still based on traditional indicators such as the 
 highly (and rightly) criticized but widely used Impact Factor and the like. 
 Under these conditions, we cannot today sacrifice our researchers — particularly 
 the young ones — in the overall competition for jobs and funds, on the altar 
 of « pure » Open Access.
 
 Best wishes
 
 Bernard Rentier
 Rector, University of Liège, Belgium
 
 
 
 
 Le 19 sept. 2014 à 21:52, Richard Poynder ri...@richardpoynder.co.uk a 
 écrit :
 
 Dear Bernard,
  
 I have two questions if I may:
  
 1.   You say that Liège is getting close to 90% compliance. Can you 
 explain how you know that, and how you calculate compliance levels? I ask 
 this because the consistent theme coming through from UK universities with 
 regard to compliance with the RCUK OA mandate is that they simply do not know 
 how many research outputs their faculty produce each year. If that is right, 
 what systems does Liège have in place to enable it to produce a 
 comprehensive list of research outputs that UK universities apparently do 
 not have?
  
 2.   Does Liège track the licences attached to the deposits in its 
 repository? If so, can you provide some stats, especially the number of 
 items that are available CC-BY (which we are now told is required before a 
 deposit can be characterised as being open access)?
  
 Thank you.
  
  
 Richard Poynder
  
  
 From: goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] On Behalf 
 Of brent...@ulg.ac.be
 Sent: 19 September 2014 18:46
 To: Global Open Access List (Successor of AmSci)
 Subject: [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster
  
 Liège does not mandate anything, so far as I know; it only looks into the 
 local repository (Orbi) to see what is in it, and it does so to assess 
 performance or respond to requests for promotions or grant submissions. 
 (JC. Guédon)
 
 
 Oh no, Jean-Claude, Liège mandates everything.
 It is a real mandate and it took me a while to get almost every ULg 
 researcher to realise that it is to his/her benefit. 
 Linking the deposits to personal in-house assessment was the trick to get 
 the mandate enforced in the first place. As well as a few positive 
 incentives and a lot of time consuming persuasion (but it was well worth it).
 Last Wednesday, the Liège University Board put a final touch of wisdom 
 on its mandate by adding immediately upon

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-19 Thread Stevan Harnad
*I.* A Web-of-Science-based estimate of Green OA mandate effectiveness —
i.e., of *the annual percentage of institutional journal article output
that is being self-archived in the institutional repository *— is fine. So
is one based on SCOPUS, or on any other index of annual journal article
output across disciplines.

*II.* The fact that books are more important than journals in SSH (social
science and humanities) in no way invalidates WoS-based estimates of Green
OA mandate effectiveness. *The mandates apply only to journal articles.*

*III. *Green OA mandates to date apply only to journal articles, not books,
for many obvious reasons.

*IV.* Jean-Claude writes: *“Liège does not mandate anything, so far as I
know.” *

*Cf:*  *“The University of Liege policy is mandatory… the Administrative
Board of the University has decided to make it mandatory for all ULg
members: - to deposit the bibliographic references of ALL their
publications since 2002; - to deposit the full text of ALL their articles
published in periodicals since 2002…*” http://roarmap.eprints.org/56/


*V.* The fact that research metrics are currently mostly journal-article
based has nothing to do with the predictive power of estimates of Green OA
mandate effectiveness.

*VI*. The WoS-based estimate of Green OA mandate effectiveness has nothing
to do with “impact factor folly.”

*VII.* Jean-Claude writes: “SSH authors are less interested in depositing
articles than STM researchers.”

As far as I know, there is not yet any objective evidence supporting this
assertion. In fact, we are in the process of testing it, using the WoS data.


*VIII*. *Status quo*: OA to journal articles is around 30% today. Our
practical solution: Green OA mandates (and tests for which kinds of mandate
are most effective) so they can be promoted for adoption. Other practical
solutions?

Stevan Harnad


On Thu, Sep 18, 2014 at 5:17 PM, Jean-Claude Guédon 
jean.claude.gue...@umontreal.ca wrote:

  A reasonably quick response as I do not want to go into discursive
 tsunami mode...

 1. Stevan admits that his evaluation of compliance is an approximation,
 easy to get, but not easy to correct. This approximation varies greatly
 from one institution to another, one circumstance to another. For example,
 he admits that language plays a role; he should further admit that the
 greater or smaller proportion of SSH researchers in the research
 communities of various institutions will also play a role. in short,
 comparing two institutions by simply using WoS approximations appears rash
 and unacceptable to me, rather than simply quick and dirty (which I would
 accept as a first approximation).

 The impact factor folly was mentioned because, by basing his approximation
 on the WoS, Stevan reinforces the centrality of a partial and questionable
 tool that is, at best, a research tool, not a management tool, and which
 stands behind all the research assessment procedures presently used in
 universities, laboratories, etc.

 2. Stevan and I have long differed about OA's central target. He limits
 himself to journal articles, as a first step; I do not. I do not because,
 in the humanities and social sciences, limiting oneself to journal articles
 would be limiting oneself to the less essential part of the archive we work
 with, unlike natural scientists.

 Imagine a universe where a research metric would have been initially
 designed around SSH disciplines and then extended as is to STM. In such a
 parallel universe, books would be the currency of choice, and articles
 would look like secondary, minor, productions, best left for later
  assessments. Then, one prominent OA advocate named Stevan Harnad might
 argue that the only way to proceed forward is to focus only on books, that
 this is OA's sole objective, and that articles and the rest will be treated
 later... Imagine the reaction of science researchers...

 3. Liège does not mandate anything, so far as I know; it only looks into
 the local repository (Orbi) to see what is in it, and it does so to assess
 performance or respond to requests for promotions or grant submissions. If
 books and book chapters are more difficult to treat than articles, then
 place them in a dark archive with a button. This was the clever solution
 invented by Stevan and I agree with it.

 4. To obtain mandates, you need either faculty to vote a mandate on itself
 (but few universities have done so), or you need administrators to impose a
 mandate, but that is often viewed negatively by many of our colleagues.
 Meanwhile, they are strongly incited to publish in prestigious journals
 where prestige is measured by impact factors. From an average
 researcher's perspective, one article in Nature, fully locked behind
 pay-walls, is what is really valuable. Adding open access may be the cherry
 on the sundae, but it is not the sundae. The result? OA, as of now, is not
 perceived to be directly significant for successfully managing a career.

 On the other 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-19 Thread brentier
Liège does not mandate anything, so far as I know; it only looks into the 
local repository (Orbi) to see what is in it, and it does so to assess 
performance or respond to requests for promotions or grant submissions. (JC. 
Guédon)

Oh no, Jean-Claude, Liège mandates everything.
It is a real mandate and it took me a while to get almost every ULg researcher 
to realise that it is to his/her benefit. 
Linking the deposits to personal in-house assessment was the trick to get the 
mandate enforced in the first place. As well as a few positive incentives and a 
lot of time consuming persuasion (but it was well worth it).
Last Wednesday, the Liège University Board put a final touch of wisdom on its mandate by adding "immediately upon acceptance, even in restricted access" to the official procedure. This is actually a nice but, to some extent, superfluous addition because, with time (the mandate was imposed in 2007), ULg authors have 
become so convinced of the increase in readership and citations that two thirds 
of them make their deposits between the date of acceptance and the date of 
publication. 
All this explains why we are getting close to 90% compliance, an outstanding 
result, I believe. 



 Le 18 sept. 2014 à 23:40, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca a écrit :
 
 A reasonably quick response as I do not want to go into discursive tsunami 
 mode...
 
 1. Stevan admits that his evaluation of compliance is an approximation, easy 
 to get, but not easy to correct. This approximation varies greatly from one 
 institution to another, one circumstance to another. For example, he admits 
 that language plays a role; he should further admit that the greater or 
 smaller proportion of SSH researchers in the research communities of various 
 institutions will also play a role. In short, comparing two institutions by 
 simply using WoS approximations appears rash and unacceptable to me, rather 
 than simply quick and dirty (which I would accept as a first approximation).
 
 The impact factor folly was mentioned because, by basing his approximation on 
 the WoS, Stevan reinforces the centrality of a partial and questionable tool 
 that is, at best, a research tool, not a management tool, and which stands 
 behind all the research assessment procedures presently used in universities, 
 laboratories, etc.
 
 2. Stevan and I have long differed about OA's central target. He limits 
 himself to journal articles, as a first step; I do not. I do not because, in 
 the humanities and social sciences, limiting oneself to journal articles 
 would be limiting oneself to the less essential part of the archive we work 
 with, unlike natural scientists. 
 
 Imagine a universe where a research metric would have been initially designed 
 around SSH disciplines and then extended as is to STM. In such a parallel 
 universe, books would be the currency of choice, and articles would look like 
 secondary, minor, productions, best left for later assessments. Then, one 
 prominent OA advocate named Stevan Harnad might argue that the only way to 
 proceed forward is to focus only on books, that this is OA's sole objective, 
 and that articles and the rest will be treated later... Imagine the reaction 
 of science researchers... 
 
 3. Liège does not mandate anything, so far as I know; it only looks into the 
 local repository (Orbi) to see what is in it, and it does so to assess 
 performance or respond to requests for promotions or grant submissions. If 
 books and book chapters are more difficult to treat than articles, then place 
 them in a dark archive with a button. This was the clever solution invented 
 by Stevan and I agree with it.
 
 4. To obtain mandates, you need either faculty to vote a mandate on itself 
 (but few universities have done so), or you need administrators to impose a 
 mandate, but that is often viewed negatively by many of our colleagues. 
 Meanwhile, they are strongly incited to publish in prestigious journals 
 where prestige is measured by impact factors. From an average researcher's 
 perspective, one article in Nature, fully locked behind pay-walls, is what is 
 really valuable. Adding open access may be the cherry on the sundae, but it 
 is not the sundae. The result? OA, as of now, is not perceived to be directly 
 significant for successfully managing a career. 
 
 On the other hand, the OA citation advantage has been fully recognized and 
 accepted by publishers. That is in part why they are finally embracing OA: 
 with high processing charges and the increased citation potential of OA, they 
 can increase revenues even more and satisfy their stakeholders. This is 
 especially true if funders, universities, libraries, etc., are willing to pay 
 for the APC's. This is the trap the UK fell into.
 
 5. SSH authors are less interested in depositing articles than STM 
 researchers because, for SSH researchers, articles have far less importance 
 than books (see above), and, arguably, book 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-19 Thread Richard Poynder
Dear Bernard,

 

I have two questions if I may:

 

1.   You say that Liège is getting close to 90% compliance. Can you explain 
how you know that, and how you calculate compliance levels? I ask this because 
the consistent theme coming through from UK universities with regard to 
compliance with the RCUK OA mandate is that they simply do not know how many 
research outputs their faculty produce each year. If that is right, what 
systems does Liège have in place to enable it to produce a comprehensive list 
of research outputs that UK universities apparently do not have?

 

2.   Does Liège track the licences attached to the deposits in its 
repository? If so, can you provide some stats, especially the number of items 
that are available CC-BY (which we are now told is required before a deposit 
can be characterised as being open access)?

 

Thank you.

 

 

Richard Poynder

 

 

From: goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] On Behalf Of 
brent...@ulg.ac.be
Sent: 19 September 2014 18:46
To: Global Open Access List (Successor of AmSci)
Subject: [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

 

Liège does not mandate anything, so far as I know; it only looks into the 
local repository (Orbi) to see what is in it, and it does so to assess 
performance or respond to requests for promotions or grant submissions. (JC. 
Guédon)





Oh no, Jean-Claude, Liège mandates everything.

It is a real mandate and it took me a while to get almost every ULg researcher 
to realise that it is to his/her benefit. 

Linking the deposits to personal in-house assessment was the trick to get the 
mandate enforced in the first place. As well as a few positive incentives and a 
lot of time consuming persuasion (but it was well worth it).

Last Wednesday, the Liège University Board put a final touch of wisdom on its mandate by adding "immediately upon acceptance, even in restricted access" to the official procedure. This is actually a nice but, to some extent, superfluous addition because, with time (the mandate was imposed in 2007), ULg authors have 
become so convinced of the increase in readership and citations that two thirds 
of them make their deposits between the date of acceptance and the date of 
publication. 

All this explains why we are getting close to 90% compliance, an outstanding 
result, I believe. 





 


Le 18 sept. 2014 à 23:40, Jean-Claude Guédon jean.claude.gue...@umontreal.ca 
mailto:jean.claude.gue...@umontreal.ca  a écrit :

A reasonably quick response as I do not want to go into discursive tsunami 
mode...

1. Stevan admits that his evaluation of compliance is an approximation, easy to 
get, but not easy to correct. This approximation varies greatly from one 
institution to another, one circumstance to another. For example, he admits 
that language plays a role; he should further admit that the greater or smaller 
proportion of SSH researchers in the research communities of various 
institutions will also play a role. In short, comparing two institutions by 
simply using WoS approximations appears rash and unacceptable to me, rather 
than simply quick and dirty (which I would accept as a first approximation).

The impact factor folly was mentioned because, by basing his approximation on 
the WoS, Stevan reinforces the centrality of a partial and questionable tool 
that is, at best, a research tool, not a management tool, and which stands 
behind all the research assessment procedures presently used in universities, 
laboratories, etc.

2. Stevan and I have long differed about OA's central target. He limits himself 
to journal articles, as a first step; I do not. I do not because, in the 
humanities and social sciences, limiting oneself to journal articles would be 
limiting oneself to the less essential part of the archive we work with, unlike 
natural scientists. 

Imagine a universe where a research metric would have been initially designed 
around SSH disciplines and then extended as is to STM. In such a parallel 
universe, books would be the currency of choice, and articles would look like 
secondary, minor, productions, best left for later assessments. Then, one 
prominent OA advocate named Stevan Harnad might argue that the only way to 
proceed forward is to focus only on books, that this is OA's sole objective, 
and that articles and the rest will be treated later... Imagine the reaction of 
science researchers... 

3. Liège does not mandate anything, so far as I know; it only looks into the 
local repository (Orbi) to see what is in it, and it does so to assess 
performance or respond to requests for promotions or grant submissions. If 
books and book chapters are more difficult to treat than articles, then place 
them in a dark archive with a button. This was the clever solution invented by 
Stevan and I agree with it.

4. To obtain mandates, you need either faculty to vote a mandate on itself (but 
few universities have done so), or you need administrators

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-19 Thread Jean-Claude Guédon
Thank you, Bernard. I should have said, more precisely, that Liège does
not force anything; that it has a mandate and that it is backed up, as
you point out, by the procedures used for in-house research assessment. 

This form of enforcement is very different from that of directly
applying penalties for not conforming, or whatever else has been used
elsewhere. What you are doing, cleverly, is saying: if you do not comply,
you will suffer bad results in your personal research assessment.

I also believe that this mandate applies to more than journal articles,
or am I wrong? Books and book chapters, so very important for SSH
disciplines, cannot be easily disregarded, and assessing SSH personnel
purely on the basis of journal articles would be a (bad) joke. A dark
archive can take care of all difficulties, and the celebrated button
allows working around most difficulties.

And getting close to 90% is indeed outstanding.

Jean-Claude


-- 

Jean-Claude Guédon
Professeur titulaire
Littérature comparée
Université de Montréal



Le vendredi 19 septembre 2014 à 19:46 +0200, brent...@ulg.ac.be a
écrit :
 Liège does not mandate anything, so far as I know; it only looks into
 the local repository (Orbi) to see what is in it, and it does so to
 assess performance or respond to requests for promotions or grant
 submissions. (JC. Guédon)
 
 
 Oh no, Jean-Claude, Liège mandates everything.
 It is a real mandate and it took me a while to get almost every ULg
 researcher to realise that it is to his/her benefit. 
 Linking the deposits to personal in-house assessment was the trick to
 get the mandate enforced in the first place. As well as a few positive
 incentives and a lot of time consuming persuasion (but it was well
 worth it).
 Last Wednesday, the Liège University Board put a final touch of wisdom on its mandate by adding "immediately upon acceptance, even in restricted access" to the official procedure. This is actually a nice but, to some extent, superfluous addition because, with time (the mandate was
 imposed in 2007), ULg authors have become so convinced of the increase
 in readership and citations that two thirds of them make their
 deposits between the date of acceptance and the date of publication. 
 All this explains why we are getting close to 90% compliance, an
 outstanding result, I believe. 
 
 
 
 
 
 Le 18 sept. 2014 à 23:40, Jean-Claude Guédon
 jean.claude.gue...@umontreal.ca a écrit :
 
 
  A reasonably quick response as I do not want to go into discursive
  tsunami mode...
  
  1. Stevan admits that his evaluation of compliance is an
  approximation, easy to get, but not easy to correct. This
  approximation varies greatly from one institution to another, one
  circumstance to another. For example, he admits that language plays
  a role; he should further admit that the greater or smaller
  proportion of SSH researchers in the research communities of various
  institutions will also play a role. In short, comparing two
  institutions by simply using WoS approximations appears rash and
  unacceptable to me, rather than simply quick and dirty (which I
  would accept as a first approximation).
  
  The impact factor folly was mentioned because, by basing his
  approximation on the WoS, Stevan reinforces the centrality of a
  partial and questionable tool that is, at best, a research tool, not
  a management tool, and which stands behind all the research
  assessment procedures presently used in universities, laboratories,
  etc.
  
  2. Stevan and I have long differed about OA's central target. He
  limits himself to journal articles, as a first step; I do not. I do
  not because, in the humanities and social sciences, limiting oneself
  to journal articles would be limiting oneself to the less essential
  part of the archive we work with, unlike natural scientists. 
  
  Imagine a universe where a research metric would have been initially
  designed around SSH disciplines and then extended as is to STM. In
  such a parallel universe, books would be the currency of choice, and
  articles would look like secondary, minor, productions, best left
  for later assessments. Then, one prominent OA advocate named Stevan
  Harnad might argue that the only way to proceed forward is to focus
  only on books, that this is OA's sole objective, and that articles
  and the rest will be treated later... Imagine the reaction of
  science researchers... 
  
  3. Liège does not mandate anything, so far as I know; it only looks
  into the local repository (Orbi) to see what is in it, and it does
  so to assess performance or respond to requests for promotions or
  grant submissions. If books and book chapters are more difficult to
  treat than articles, then place them in a dark archive with a
  button. This was the clever solution invented by Stevan and I agree
  with it.
  
  4. To obtain mandates, you need either faculty to vote a mandate on
  itself (but few universities have done so), or you need
  administrators to 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-19 Thread Jean-Claude Guédon
I will let readers evaluate whether Stevan's answers are satisfactory or
not. Except for the Liège mandate where I did not express myself
sufficiently precisely, I disagree with points I-III and V-VI.

I agree that point VII deserves being studied more precisely.

For point VIII, part of the 30% (however it is calculated - is it 30% of
WoS articles?) comes from the Gold road, and, therefore, falls under a
different kind of argument. This said, I believe that Liège's solution
is the best one presently available, if you can get it. In countries
where university autonomy is far from being the norm (e.g. France), the
clout of in-house assessments of performance is perforce very limited. 

Promoting the Liège solution is also what I do, and I do so everywhere,
but promoting OA publishing platforms (such as Redalyc and, with some
caveats, Scielo) that are both free and gratis is also what I do. IMHO,
this is superior to promoting only and exclusively the Green road: it
adds to the Green road without subtracting  anything from it. This was
also the spirit of BOAI.

Finally, I do not need any fancy statistical footwork to agree that the
ways and means of the Liège mandate are the best. Common sense is enough
for me. 

Let us get the Liège form of mandate wherever we can (which I am
presently trying to do in my own university), and let us also do all we
can to promote OA for all (including all disciplines).

And I will stop this thread here.

-- 

Jean-Claude Guédon
Professeur titulaire
Littérature comparée
Université de Montréal



Le vendredi 19 septembre 2014 à 13:17 -0400, Stevan Harnad a écrit :
 I. A Web-of-Science-based estimate of Green OA mandate effectiveness —
 i.e., of the annual percentage of institutional journal article output
 that is being self-archived in the institutional repository — is fine.
 So is one based on SCOPUS, or on any other index of annual journal
 article output across disciplines. 
 
 
 II. The fact that books are more important than journals in SSH
 (social science and humanities) in no way invalidates WoS-based
 estimates of Green OA mandate effectiveness. The mandates apply only
 to journal articles.
 
 
 III. Green OA mandates to date apply only to journal articles, not
 books, for many obvious reasons.
 
 
 IV. Jean-Claude writes: “Liège does not mandate anything, so far as I
 know.” 
 
 
 Cf:  “The University of Liege policy is mandatory… the
 Administrative Board of the University has decided to make it
 mandatory for all ULg members: - to deposit the bibliographic
 references of ALL their publications since 2002; - to deposit
 the full text of ALL their articles published in periodicals
 since 2002…” http://roarmap.eprints.org/56/
 
 
 V. The fact that research metrics are currently mostly journal-article
 based has nothing to do with the predictive power of estimates of
 Green OA mandate effectiveness.
 
 
 VI. The WoS-based estimate of Green OA mandate effectiveness has
 nothing to do with “impact factor folly.” 
 
 
 VII. Jean-Claude writes: “SSH authors are less interested in depositing
 articles than STM researchers.” 
 
 
 As far as I know, there is not yet any objective evidence
 supporting this assertion. In fact, we are in the process of
 testing it, using the WoS data.
 
 
 VIII. Status quo: OA to journal articles is around 30% today. Our
 practical solution: Green OA mandates (and tests for which kinds of
 mandate are most effective) so they can be promoted for adoption.
 Other practical solutions?
 
 
 Stevan Harnad
 
 
 
 
 On Thu, Sep 18, 2014 at 5:17 PM, Jean-Claude Guédon
 jean.claude.gue...@umontreal.ca wrote:
 
 A reasonably quick response as I do not want to go into
 discursive tsunami mode...
 
 1. Stevan admits that his evaluation of compliance is an
 approximation, easy to get, but not easy to correct. This
 approximation varies greatly from one institution to another,
 one circumstance to another. For example, he admits that
 language plays a role; he should further admit that the
 greater or smaller proportion of SSH researchers in the
 research communities of various institutions will also play a
  role. In short, comparing two institutions by simply using WoS
 approximations appears rash and unacceptable to me, rather
 than simply quick and dirty (which I would accept as a first
 approximation).
 
 The impact factor folly was mentioned because, by basing his
 approximation on the WoS, Stevan reinforces the centrality of
 a partial and questionable tool that is, at best, a research
 tool, not a management tool, and which stands behind all the
 research assessment procedures presently used in universities,
 laboratories, etc.
 
 2. Stevan and I have long differed about OA's central target.
 He 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-18 Thread Stevan Harnad
On Wed, Sep 17, 2014 at 10:53 AM, Jean-Claude Guédon 
jean.claude.gue...@umontreal.ca wrote:

  Most interesting dialogue.

 I will focus on two points:

 1. *Using the Web of Science collection as a reference*: this generates
 all kinds of problems, particularly for disciplines that are not dominated
 and skewed by the impact factor folly. This is true, for example, of most
 of the social sciences and the humanities, especially when these
 publications are not in English.


The purpose of using WoS (or SCOPUS, or any other standardized index) as a
*baseline* for assessing OA repository success is to be able to estimate
(and compare) *what percentage of an institution's total annual refereed
journal article output has been self-archived.*

Raw total or annual deposit counts do not tell us (1) whether the deposits
are refereed journal articles, (2) when the articles were published, or
(most important of all) (3) what proportion of total annual refereed
journal article output is deposited.

Institutions do not even know their total annual refereed journal
article output. (One of the (many) reasons for mandating self-archiving is
in order to get that information.)

The WoS (or SCOPUS, or other) standardized database provides the
denominator against which the deposits of those articles provide the
numerator.

Once that ratio is known (for WoS articles, for example), it provides an
estimate of the proportion of total institutional article output deposited.

Anyone can then correct the ratio for their institution and discipline,
if they wish, by simply taking a (large enough) sample of total
institutional journal article output for a recent year and seeing what
percentage of it is in WoS! (This would obviously have to be done
discipline by discipline; and indeed the institutional totals should also
be broken down and analyzed by discipline.)

So let R = D/W be the ratio of deposited WoS articles (D) to total WoS-indexed
articles (W), and let c = w/s be the WoS-indexed portion (w) of an independent
sample (s) of total output. Then c can be used to scale W up to an estimate of
total institutional article output, and the WoS deposit ratio R can be compared
with the deposit ratio for the non-WoS portion of the sample (*a sample which
must not, of course, be derived from the repository, but obtained some other
way!*); call that non-WoS deposit ratio Rc.

My own prediction is that R and Rc will be quite similar, but if not, c can
also be used to correct R to better reflect both WoS and non-WoS output and
their relative sizes.
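
Written out explicitly (the weighted average in the last term is only one possible reading of "correct R to better reflect both WoS and non-WoS output", not a formula stated above):

\[
R = \frac{D}{W}, \qquad
c = \frac{w}{s}, \qquad
\hat{T} \approx \frac{W}{c}, \qquad
R_{\mathrm{overall}} \approx c\,R + (1 - c)\,R_c
\]

where D is the number of deposited WoS-indexed articles, W the total number of WoS-indexed articles, w the WoS-indexed portion of an independent sample of size s, \hat{T} the estimated total annual article output, and R_c the deposit ratio observed in the non-WoS part of the sample.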

But R is still by far the easiest and fastest way to get an estimate of
institutional deposit percentages.

(As far as I can see, none of this has much to do with impact factor folly.
For non-English-language institutions, however, the non-WoS correction may
be more substantial.)

Stevan has also long argued for limiting oneself to journal articles.
 I have my own difficulties with this limitation because book chapters and
 monographs are so important in the disciplines that I tend to work in.
 Also, I regularly write in French as well as English, while reading
 articles in a variety of languages. Most of the articles that are not in
 English are not in the Web of Science. A better way to proceed would be to
 check if the journals not in the WoS, and corresponding to deposited
 articles, are peer-reviewed. The same could be done with book chapters.
 Incidentally, if I limited myself to WoS publications for annual
 performance review, I would look rather bad. I suspect I am not the only
one in such a situation, despite leading a fairly honourable career in
 academe.


Authors are welcome to deposit as much as they like: articles, chapters,
books, data, software.

But OA's primary target (and also its primary obstacle) is journal
articles. Ditto for OA mandates.

All disciplines, including the social sciences and humanities, in all
languages, write journal articles. This discussion is about the means of
measuring the success of an OA self-archiving mandate. It applies to all
journal articles (and refereed conference articles) in all disciplines.

There are problems with mandating book deposit, or even book chapter
deposit, so that is being left for later.

Nothing is being said about performance review except that the way to
submit journal articles should be stipulated to be repository deposit.


 2. *The issue of rules and regulations.* It is absolutely true that a
 procedure such as the one adopted at the Université de Liège and which
 Stevan aptly summarizes as (with a couple of minor modifications): *henceforth
 the way to submit refereed journal article publications for annual
 performance review is to deposit them in the [appropriate] IR*.
 performance review is to deposit them in the [appropriate] IR *.


Liège does not mandate the deposit of books.


 However, obtaining this change of behaviour from an administration is no
 small task. At the local, institutional, level, it corresponds to a
 politically charged effort that requires having a number of committed OA
 advocates working hard to push the idea. Stevan should know this from his
 own 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-18 Thread Jean-Claude Guédon
A reasonably quick response as I do not want to go into discursive
tsunami mode...

1. Stevan admits that his evaluation of compliance is an approximation,
easy to get, but not easy to correct. This approximation varies greatly
from one institution to another, one circumstance to another. For
example, he admits that language plays a role; he should further admit
that the greater or smaller proportion of SSH researchers in the
research communities of various institutions will also play a role. In
short, comparing two institutions by simply using WoS approximations
appears rash and unacceptable to me, rather than simply quick and dirty
(which I would accept as a first approximation).

The impact factor folly was mentioned because, by basing his
approximation on the WoS, Stevan reinforces the centrality of a partial
and questionable tool that is, at best, a research tool, not a
management tool, and which stands behind all the research assessment
procedures presently used in universities, laboratories, etc.

2. Stevan and I have long differed about OA's central target. He limits
himself to journal articles, as a first step; I do not. I do not
because, in the humanities and social sciences, limiting oneself to
journal articles would be limiting oneself to the less essential part of
the archive we work with, unlike natural scientists. 

Imagine a universe where a research metric would have been initially
designed around SSH disciplines and then extended as is to STM. In such
a parallel universe, books would be the currency of choice, and articles
would look like secondary, minor, productions, best left for later
assessments. Then, one prominent OA advocate named Stenan Harvard might
argue that the only way to proceed forward is to focus only on books,
that this is OA's sole objective, and that articles and the rest will be
treated later... Imagine the reaction of science researchers... 

3. Liège does not mandate anything, so far as I know; it only looks into
the local repository (Orbi) to see what is in it, and it does so to
assess performance or respond to requests for promotions or grant
submissions. If books and book chapters are more difficult to treat than
articles, then place them in a dark archive with a button. This was the
clever solution invented by Stevan and I agree with it.

4. To obtain mandates, you need either faculty to vote a mandate on
themselves (but few universities have done so), or administrators to
impose a mandate, but the latter is often viewed negatively by many of our
colleagues. Meanwhile, they are strongly incentivized to publish in
prestigious journals where prestige is measured by impact factors.
From an average researcher's perspective, one article in Nature, fully
locked behind pay-walls, is what is really valuable. Adding open access
may be the cherry on the sundae, but it is not the sundae. The result?
OA, as of now, is not perceived to be directly significant for
successfully managing a career. 

On the other hand, the OA citation advantage has been fully recognized
and accepted by publishers. That is in part why they are finally
embracing OA: with high processing charges and the increased citation
potential of OA, they can increase revenues even more and satisfy their
stakeholders. This is especially true if funders, universities,
libraries, etc., are willing to pay for the APCs. This is the trap the
UK fell into.

5. SSH authors are less interested in depositing articles than STM
researchers because, for SSH researchers, articles have far less
importance than books (see above), and, arguably, book chapters.

6. I am not citing rationales for the status quo, and Stevan knows this
well. This must be the first time that I have ever been associated with
the status quo... Could it be that criticizing Stevan on one point could
be seen by him as fighting for the status quo? But that would be true only
if Stevan were right beyond the slightest doubt. Hmm!

I personally think he is right on some points and not so right on
others. 

Also, I am simply trying to think about reasons why OA has been so hard
to achieve so far, and, in doing so, I have come to two conclusions: too
narrow an objective and too rigid an approach can both be
counter-productive.

This said, trying to have a method to compare deposit rates in various
institutional and mandate circumstances would be very useful. I support
Stevan's general objective in this regard; I simply object to the
validity of the method he suggests. Alas, I have little to suggest
beyond my critique. 

I also suggest that a better understanding of the sociology of research
(not the sociology of knowledge) is crucial to move forward.

Finally, I expect that if I saw Stevan self-archive his abundant
scientific production, I would be awed by the lightning speed of his
keystrokes. But are they everybody's keystrokes?

Jean-Claude Guédon

 
-- 

Jean-Claude Guédon
Professeur titulaire
Littérature comparée
Université de Montréal



Le jeudi 18 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-17 Thread Richard Poynder
From: Stevan Harnad har...@ecs.soton.ac.uk 

 

On Sep 16, 2014, at 2:30 PM, Sue Gardner sgardn...@unl.edu wrote:

 

Stevan,

Apologies for a delayed response. I have been meaning to reply, and now have 
time.

You have asked some questions of us at UNL. Paul Royster may reply, as well. 
These are my thoughts.

(1) What percentage of Nebraska-Lincoln output of peer-reviewed journal 
articles (only) per year is deposited in the N-L Repository?
(Without that figure, there is no way of knowing how well N-L is doing, 
compared to other institutional repositories, mandated or unmandated.)

You are requesting a certain metric and claiming that it is the only valid one. 
We have approximately 75,000 items in our repository, almost all of which can 
be read freely by anyone with an Internet connection. We also have several 
dozen monographs under our own imprint, and we host several journals. We don't 
devote too much of our time to analyzing our metrics, in part because we are a 
staff of three (as of two weeks ago--before which we were a staff of two), and 
we spend much of our time getting content into the repository rather than on 
administrative activities. Personally, I welcome anyone to analyze our output 
by any measure and I will be interested to know the result, but that 
information won't change our day-to-day activities, so it would remain off to 
the side of what we're doing.

 

Sue, 

 

I mentioned it because UNL was being described as one of the biggest and most 
successful Institutional Repositories (IRs). This may be true if IR success is 
gauged by total contents, regardless of type. But if it is about success for 
OA’s target contents — which are first and foremost refereed journal articles — 
then there is no way to know how UNL compares with other IRs unless the 
comparison is based on the proportion of UNL’s yearly refereed journal 
article output that is being deposited in UNL’s IR (and when).

 

I might add that the question is all the more important as the success of UNL’s 
IR was being adduced as evidence that an OA mandate is not necessary for IR 
(OA) success.

 

Stevan Harnad




 

Here, I fear, we bump up against another of the many confusions and 
disagreements surrounding open access: what is an institutional repository, and 
what should be its aims and purpose?

 

I do not think the 2002 Budapest Open Access Initiative uses the term 
“institutional repository”; rather, it proposes that papers be deposited in 
“open electronic archives”. 

 

http://www.budapestopenaccessinitiative.org/read

 

Stevan Harnad’s 1994 “Subversive Proposal” urged researchers to archive their 
papers in “globally accessible local ftp archives”.

 

http://babel.hathitrust.org/cgi/pt?id=mdp.39015034923758;view=1up;seq=24

 

I would think the seminal text on institutional repositories was the paper 
written by Raym Crow in 2002 (“The Case for Institutional Repositories: A SPARC 
Position Paper”). 

 

Crow defined institutional repositories as “digital collections capturing and 
preserving the intellectual output of a single or multiple-university 
community.” 

 

Their role, he suggested, should be twofold. First: to “Provide a critical 
component in reforming the system of scholarly communication--a component that 
expands access to research, reasserts control over scholarship by the academy, 
increases competition and reduces the monopoly power of journals, and brings 
economic relief and heightened relevance to the institutions and libraries that 
support them”;

 

Second: to “serve as tangible indicators of a university’s quality and to 
demonstrate the scientific, societal, and economic relevance of its research 
activities, thus increasing the institution’s visibility, status, and public 
value.”

 

http://www.sparc.arl.org/sites/default/files/media_files/instrepo.pdf

 

But today I would think that when defining the term “institutional repository” 
most people (especially librarians) refer to a document authored by Clifford 
Lynch in 2003 (“Institutional Repositories: Essential Infrastructure for 
Scholarship in the Digital Age”).

 

Lynch described an institutional repository as “a set of services that a 
university offers to the members of its community for the management and 
dissemination of digital materials created by the institution and its community 
members. It is most essentially an organizational commitment to the stewardship 
of these digital materials, including long-term preservation where appropriate, 
as well as organization and access or distribution.”

 

http://www.arl.org/storage/documents/publications/arl-br-226.pdf

 

The above, for instance, is how Cambridge University defines an institutional 
repository, see:

 

http://www.lib.cam.ac.uk/repository/about/about_institutional_repositories.html

 

Speaking to me in 2006, Lynch said, “If all you want to do is author 
self-archiving, I suspect that there are 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-17 Thread Stevan Harnad

Begin forwarded message:

 From: Stevan Harnad har...@ecs.soton.ac.uk
 Subject: Re: The Open Access Interviews: Paul Royster
 Date: September 16, 2014 at 5:28:48 PM GMT-4
 To: jisc-repositor...@jiscmail.ac.uk
 
 On Sep 16, 2014, at 2:46 PM, Paul Royster proyst...@unl.edu wrote:
 
 At the risk of stirring up more sediment and further muddying the waters of 
 scholarly communications,
 but in response to direct questions posed in this venue earlier this month, 
 I shall venture the following …
 
 Answers for Dr. Harnad
 
 (1) What percentage of Nebraska-Lincoln output of peer-reviewed journal 
 articles (only) per year is
 deposited in the N-L Repository? About 3 months ago I furnished your 
 graduate student (at least he
 said he was your student) with 5 years of deposit data so he could compare 
 it to Web of Science
 publication dates and arrive at some data-based figure for this. I cautioned 
 him that I felt Web of
 Science to be a narrow and commercially skewed comparison sample, but I sent 
 the data anyway.
 So I expect you will have an answer to this query before I do. If the news 
 is good, I hope you will
 share it with this list; if not, then let your conscience be your guide. As 
 for benchmarking, I don’t believe
 it is a competition, and every step in the direction of free scholarship is 
 a positive one. I hope when
 they hand out the medals we at least get a ribbon for participation.
 
 Thanks for reminding me! It was my post-doc, Yassine Gargouri, and I just 
 called him to ask about
 the UNL results. He said he has the UNL data and will have the results of the 
 analysis in 2-3 weeks!
 
 So the jury is still out. But many thanks for sending the data. Apparently 
 Sue was not aware that UNL
 had provided those data (and I too had forgotten!).
 
 (2) Why doesn’t N-L adopt a self-archiving mandate? 
 I do not even attempt to explain the conduct of the black box that is my 
 university’s administration;
 so in short, I cannot say why or why not. I can only say why I have not 
 campaigned for adoption of
 such a mandate.  My reasons have been purely personal and idiosyncratic, and 
 I do not hold them
 up as a model for anyone else or as representing the thinking or attitude of 
 this university. Bluntly,
 I have not sought to create a mandate because I feel there are enough 
 regulations and requirements
 in effect here already. Instituting more rules brings further problems of 
 enforcement or compliance,
 and it creates new categories of deviance. There are already too many rules: 
 we have to park in
 designated areas; we have to drink Pepsi rather than Coke products; we have 
 to wear red on game
 days; we can’t enter the building through the freight dock; etc. etc. etc. I 
 simply do not believe in
 creating more rules and requirements, even if they are for our own good. The 
 Faculty Senate
 voted to “endorse and recommend” our repository; I have not desired more 
 than that. But I am
 concerned mainly with 1600 faculty on two campuses in one medium-sized 
 university town—not
 with a universal solution to the worldwide scholarly communications crisis. 
 I see discussions lately
 about “putting teeth” into mandated deposit rules, and I wonder—who is 
 intended to be bitten?
 Apparently, the already-beleaguered faculty.
 
 I agree that we are over-regulated! But I think that doing a few extra 
 keystrokes when a refereed
 final draft is accepted for publication is really very little, and the 
 potential benefits are huge. Also,
 there is some evidence as to how authors comply with a self-archiving mandate 
 — if it’s the right
 self-archiving mandate, i.e., if the mandate simply indicates that henceforth 
 the way to submit refereed
 journal article publications for annual performance review is to deposit them 
 in UNL’s IR (rather than
 however they are being submitted currently) then UNL faculty will comply as 
 naturally as they did
 when it was mandated that submissions should be online rather than in hard 
 copy. It’s just a technological upgrade.
 
 (3) Why do you lump together author-pays with author-self-archives?
 I was not aware that I did this, so perhaps you are responding to Sue’s 
 catalog of various proposed
 solutions—“author-pays OA, mandated self-archiving of manuscripts, CHORUS, 
 SHARE, and others”—as
 all being “ineffectual or unsustainable initiatives to varying degrees.” I 
 feel we are strong believers and
 even advocates for author self-archiving (so-called), and disdainful 
 non-advocates for author-pays models.
 But I think we have become aware of the divergence of interests between the 
 global theoretics of the
 open access “movements” on the one hand and the “boots-on-the-ground” 
 practicalities of managing
 a local repository, even one with global reach, on the other. Crusades for 
 and controversies about
 “open access” have come to seem far removed from what we actually do, and 
 now seem more of a
 distraction than a help or guide.
 
 I can understand 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-17 Thread Jean-Claude Guédon
Most interesting dialogue.

I will focus on two points:

1. Using the Web of Science collection as a reference: this generates
all kinds of problems, particularly for disciplines that are not
dominated and skewed by the impact factor folly. This is true, for
example, of most of the social sciences and the humanities, especially
when these publications are not in English. 

Stevan has also long argued for limiting oneself to journal
articles. I have my own difficulties with this limitation because book
chapters and monographs are so important in the disciplines that I tend
to work in. Also, I regularly write in French as well as English, while
reading articles in a variety of languages. Most of the articles that
are not in English are not in the Web of Science. A better way to
proceed would be to check whether the journals that are not in the WoS,
but that correspond to deposited articles, are peer-reviewed. The same could
be done with book chapters. Incidentally, if I limited myself to WoS
publications for annual performance review, I would look rather bad. I
suspect I am not the only one in such a situation, while leading a
fairly honourable career in academe.
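
As a sketch of how such a check might be scripted, assuming a CSV export of
the repository's deposits and a locally curated whitelist of peer-reviewed
ISSNs (compiled, for instance, from DOAJ and disciplinary sources); all file
and column names below are hypothetical, and the ISSN-based matching is only
an illustration of the idea, not anyone's actual workflow:

    import csv

    # Illustrative sketch only: find deposited articles whose journal is not
    # covered by the WoS export, then check those journals against a locally
    # maintained whitelist of peer-reviewed ISSNs. File and column names are
    # hypothetical.
    def issn_set(path, column):
        with open(path, newline='', encoding='utf-8') as f:
            return {row[column].strip() for row in csv.DictReader(f)
                    if row[column].strip()}

    wos_issns = issn_set('wos_journals.csv', 'issn')
    peer_reviewed = issn_set('peer_reviewed_whitelist.csv', 'issn')

    with open('repository_deposits.csv', newline='', encoding='utf-8') as f:
        deposits = list(csv.DictReader(f))        # columns: title, issn

    outside_wos = [d for d in deposits if d['issn'] not in wos_issns]
    vetted = [d for d in outside_wos if d['issn'] in peer_reviewed]

    print(f"{len(outside_wos)} deposits are in journals not covered by WoS;")
    print(f"{len(vetted)} of those are on the peer-reviewed whitelist.")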

2. The issue of rules and regulations. It is absolutely true that a
procedure such as the one adopted at the Université de Liège can work;
Stevan aptly summarizes it (with a couple of minor modifications) as:
"henceforth the way to submit refereed journal article publications for
annual performance review is to deposit them in the [appropriate] IR."
However, obtaining this change of behaviour from an administration is no
small task. At the local, institutional, level, it corresponds to a
politically charged effort that requires having a number of committed OA
advocates working hard to push the idea. Stevan should know this from
his own experience in Montreal; he should also know that, presently, the
Open Access issue is not on the radar of most researchers. In scientific
disciplines, they tend to be mesmerized by impact factors without making
the link between this obsession and the OA advantage, partly because
enough controversies have surrounded this issue to maintain a general
feeling of uncertainty and doubt. In the social sciences and humanities
where the citation rates are far less "meaningful" - I put quotation
marks here to underscore the uncertainty surrounding the meaning of
citation numbers: visibility, prestige, quality? - the benefits of
self-archiving one's articles in open access are less obvious to
researchers, especially if they do not adopt a global perspective on the
importance of the grand conversation needed to produce knowledge in an
optimal manner, but rather intend to manage and protect their career.

Saying all this is not saying that we should not remain committed to OA,
far from it; it is simply saying that the chances of success in reaching
OA will not be significantly improved by simply referring to huge
benefits at the cost of only a few extra keystrokes. This is rhetoric.
The last time I deposited an article of mine, given the procedure used
in the depository I was using, it took me close to half an hour to enter
all the details required by that depository - a depository organized by
librarians, mainly for information science specialists. All these
details were legitimate and potentially useful.  However, while I was
absolutely sure I was doing the right thing, I could well understand why
a colleague less sanguine about OA than I am might push this task to the
back burner. In fact, I did so myself for several months. Shame on me,
probably, but this is the reality of the quotidian.

In conclusion, I suspect that if Stevan focuses on such a
narrowly-defined target - journal articles in the STM disciplines - this
is because he gambles on the fact that making these disciplines fully OA
would force the other disciplines in the humanities and social sciences
to follow suit sooner or later. Perhaps, it is so, but perhaps it is
not. Meanwhile, arguing in this fashion tends to alienate practitioners
of the humanities and the social sciences, so that the alleged
advantages of narrowly focusing on a well-defined target are perhaps
more than offset by the neglect of SSH disciplines. Yet
the latter constitute about half, if not more, of the researchers in the
world.

-- 

Jean-Claude Guédon
Professeur titulaire
Littérature comparée
Université de Montréal



Le mercredi 17 septembre 2014 à 07:07 -0400, Stevan Harnad a écrit :

 
 
 Begin forwarded message:
 
 
  From: Stevan Harnad har...@ecs.soton.ac.uk
  
  Subject: Re: The Open Access Interviews: Paul Royster
  Date: September 16, 2014 at 5:28:48 PM GMT-4
  
  To: jisc-repositor...@jiscmail.ac.uk
  
  
  
  On Sep 16, 2014, at 2:46 PM, Paul Royster proyst...@unl.edu wrote:
  
  
  
   At the risk of stirring up more sediment and further muddying the
   waters of scholarly communications,
   but in response to direct questions posed in this venue earlier
   this month, I shall 

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster, Coordinator of Scholarly Communications, University of Nebraska-Lincoln

2014-09-04 Thread Heather Morrison
The gap here may be the service provided by the repository. How easy or 
difficult is it to deposit in your institutional or subject repository? In my 
experience this is often much more difficult than it needs to be. 

If faculty or students upload their work, make sure this takes no more than 
a few keystrokes, and give them the URL to share right away (that is a 
service we get from this), not days or weeks later, after you've checked 
the metadata and copyright. Make copyright the responsibility of the person 
doing the deposit, not the library or repository, or offer this service as an 
option with the delay this entails.

From this service perspective I can see the benefits of initiatives like 
PeerLibrary. In the long run we are all better off with professionally run 
open access archives to look after preservation and participation in relevant 
standards, but when institutional services are too hard to use there is a lot 
to be said for DIY.

A lot of my own informal scholarly work is posted on my blogs using Google 
Blogger or my new Wordpress blog. Neither Google nor Wordpress has any 
obligation to make sure that this work continues to be available, so this makes 
my work vulnerable, but at least it's a way to get the work out there.

My perspective is that libraries need to understand that this is the collection 
of the future and develop programs and services to support this work rather 
than trying to fit author self-archiving into traditional publishing. The 
question should not be, "are you allowed to deposit this in the IR given 
publisher copyright?" but rather, "are you allowed to transfer all copyright to 
publishers given your obligation to the public to share your work through the 
IR?" The strong institutional deposit mandate (as Stevan recommends) is a good 
way to change this question at every university. 

Green policies provide the groundwork for open access publishing to happen. 
Once you have incentive to look for publishers that provide good dissemination 
practices, you have incentive to choose open access journals (all else being 
reasonably equal).

best,

Heather Morrison


On 2014-09-03, at 9:40 PM, Stevan Harnad wrote:

 Begin forwarded message:
 
 From: Stevan Harnad har...@ecs.soton.ac.uk
 Subject: Re: The Open Access Interviews: Paul Royster, Coordinator of 
 Scholarly Communications, University of Nebraska-Lincoln
 Date: September 3, 2014 at 9:25:39 PM GMT-4
 To: jisc-repositor...@jiscmail.ac.uk
 
 Three questions for Nebraska-Lincoln (N-L) Libraries, in order of importance:
 
 (1) What percentage of Nebraska-Lincoln output of peer-reviewed journal 
 articles (only) per year is deposited in the N-L Repository?
 
 (Without that figure, there is no way of knowing how well N-L is doing, 
 compared
 to other institutional repositories, mandated or unmandated.)
 
 Simple way to estimate the above (but you have to keep track of both the 
 publication date and the deposit date): Sample total annual N-L output from
 WoS or SCOPUS and then test what percentage of it is deposited (and
 when). That can be benchmarked against other university repositories.
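
 As a sketch of how such an estimate might be computed, assuming a WoS or
 SCOPUS export of one year of the institution's articles and the repository's
 deposit log as CSV files matched on DOI (all file and column names below are
 hypothetical, not anyone's actual workflow):

    import csv
    from datetime import date

    # Illustrative sketch: compare an external index export against the
    # repository's deposit log to estimate the deposit rate and the delay
    # between publication and deposit. File and column names are hypothetical.
    def load_rows(path):
        with open(path, newline='', encoding='utf-8') as f:
            return list(csv.DictReader(f))

    indexed = load_rows('wos_2013_articles.csv')    # columns: doi, pub_date
    deposits = load_rows('ir_deposit_log.csv')      # columns: doi, deposit_date

    deposited_on = {r['doi'].lower(): date.fromisoformat(r['deposit_date'])
                    for r in deposits if r['doi']}

    matched, delays = 0, []
    for r in indexed:
        doi = r['doi'].lower()
        if doi in deposited_on:
            matched += 1
            delays.append((deposited_on[doi]
                           - date.fromisoformat(r['pub_date'])).days)

    if indexed:
        print(f"Deposited: {matched}/{len(indexed)} "
              f"({100.0 * matched / len(indexed):.1f}% of indexed output)")
    if delays:
        delays.sort()
        print(f"Median days from publication to deposit: {delays[len(delays)//2]}")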
 
 (2) Why doesn’t N-L adopt a self-archiving mandate? 
 
 The right mandate — immediate-deposit of all refereed final drafts 
 immediately upon acceptance for publication — plus the request-copy 
 Button during any allowable publisher embargo interval — works 
 (especially if librarians keep mediating during the start-up and if
 deposit is designated as the sole means of submitting articles for 
 performance-review). Try it.
 
 (3) Why do you lump together author-pays with author-self-archives?
 
 They’re opposites… Only one of them is objectively describable as
 “the author bearing the brunt” (and that’s having to shell out a lot
 of money — not just do a few extra keystrokes -- or else give up 
 journal-choice).
 
 Stevan Harnad
 
 On Sep 3, 2014, at 3:53 PM, Sue Gardner sgardn...@unl.edu wrote:
 
 As repository managers, many of us are having trouble envisioning getting 
 from where we are currently to what the original OA movement idealistically 
 proposed. This is due to the practical constraints we are faced with (such 
 as restrictive publishers’ policies including not allowing posting of 
 published versions even a decade and more after publication, lack of ready 
 access to authors’ manuscripts, etc.). The solutions being offered to move 
 toward the initial goal include author-pays OA, mandated self-archiving of 
 manuscripts, CHORUS, SHARE, and others, which are—from my standpoint as a 
 repository manager—one-and-all ineffectual or unsustainable initiatives to 
 varying degrees.
  
 In populating our repository within the varied constraints, and in offering 
 non-mandated, mediated deposit, at the University of Nebraska-Lincoln we 
 are taking a bottom-up approach to access (from the author to the reader) 
 and, as Paul Royster has pointed out, it leaves us in the odd position of 
 actually standing outside the