[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-21 Thread brentier


 On 21 September 2014, at 07:51, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca wrote:
 
 Finally, given that all universities require an annual assessment of 
 performance, including a bibliography of publications in the completed year, 
 would it be difficult to compare the repository's holdings against the 
 publications of the researchers as declared by them? Knowing researchers, 
 every last little scrap of paper will be minutely listed in the yearly 
 assessment forms...  

This should be very easy in Liège: since ORBi is the only resource for 
official assessment, everything the authors consider worth mentioning from 
their own production is in there, of course.
So your suggestion to evaluate how many items are in ORBi but not in 
WoS/Scopus is good advice.
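
A minimal sketch of that comparison in Python (the file names and field names 
below are assumptions for illustration, not ORBi's actual export format):

# Hypothetical sketch: compare one year of ORBi deposits against the combined
# WoS/Scopus records for ULg, using DOIs as the matching key.
import csv

def load_dois(path, doi_field="doi"):
    # Read a CSV export and return the set of normalised DOIs it contains.
    with open(path, newline="", encoding="utf-8") as fh:
        return {row[doi_field].strip().lower()
                for row in csv.DictReader(fh)
                if row.get(doi_field, "").strip()}

orbi = load_dois("orbi_2013.csv")            # everything deposited for 2013
indexed = load_dois("wos_scopus_2013.csv")   # WoS and/or Scopus records, 2013

orbi_only = orbi - indexed                   # declared output the indexes miss
compliance = len(orbi & indexed) / len(indexed) if indexed else 0.0

print("ORBi items not in WoS/Scopus:", len(orbi_only))
print("Deposit rate against the WoS/Scopus baseline: {:.1%}".format(compliance))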




 On 21 September 2014, at 07:51, Jean-Claude Guédon 
 jean.claude.gue...@umontreal.ca wrote:
 
 Extremely good answer, Bernard!
 
 It is also very good to clarify the fact that the 90% figure is calculated 
 against the baseline of a combined WoS_Scopus search. However, and this was 
 part of my difficulties with Stevan's argument, I suspect that in SSH, in a 
 French-speaking university, many publish in French-language journals that do 
 not appear in either list. This means that, for Liège, the baseline works 
 from one year to the next, but if you want to compare Liège's mandate and its 
 effectiveness (which, once again, I agree, is - from common sense - the best) 
 with another kind of mandate in an English-speaking university, the baselines 
 will not be comparable.
 
 If, furthermore, you imagine two universities that not only differ 
 linguistically, but also differ in the relative weight of disciplines in 
 research output - say, one heavily slanted toward STM and the other toward 
 SSH - this too will affect the baseline simply by virtue of the fact that SSH 
 publications are not as well covered by WoS and Scopus as are STM 
 publications.
 
 In conclusion, the baseline is OK for comparisons of a mandate's 
 effectiveness longitudinally, or for comparisons of two successive, 
 but different, mandates, assuming the institution remains pretty much the 
 same over time in terms of mix of research emphases; it is far more 
 questionable across institutions, especially when different languages are 
 involved (but not only).
 
 Incidentally, what proportion of papers deposited in ORBi does not appear in 
 either WoS or Scopus? That too would be interesting to know, as it might help 
 Stevan refine his baseline and thus make it more convincing.
 
 Finally, given that all universities require an annual assessment of 
 performance, including a bibliography of publications in the completed year, 
 would it be difficult to compare the repository's holdings against the 
 publications of the researchers as declared by them? Knowing researchers, 
 every last little scrap of paper will be minutely listed in the yearly 
 assessment forms... 
 
 --
 Jean-Claude Guédon
 Professeur titulaire
 Littérature comparée
 Université de Montréal
 
 
 On Saturday 20 September 2014 at 19:10 +0200, Bernard Rentier - IMAP wrote:
 Dear Richard,
 Here are the answers:
 1. ORBi, the Liège University Repository, will soon (I believe) reach 90% 
 compliance. It is our target for 2014 and I hope we make it.
 This figure is the percentage of ULg papers found in Web of Science and/or 
 Scopus that are also deposited in ORBi (see the method at 
 http://eprints.soton.ac.uk/340294/).
 It concerns one year at a time and it is not cumulative. Last May, the 
 compliance level for the publications of 2013 was already 73% and our figure 
 for 2012 is in the 80% range.
 2. Only a small proportion of ULg papers are in CC-BY.
 This is simply because, in order to publish in the journal of their choice 
 (I haven’t tried to do anything against that!), our authors, in the great 
 centuries-old tradition, give away their rights to the publisher. We have no 
 control over that.
 Later on, there is no way for them to release the same text under CC-BY (in 
 fact, we are preparing ORBi 2.0, which will offer a CC-BY option).
 For now, we are aiming at free access and we are not yet fighting hard for 
 re-use rights. We shall move progressively in this direction of course, 
 while the publishing mores evolve…
 In other words, I agree that we have free access, not full-fledged open 
 access yet. It is not a failure; our objective is to gain confidence 
 first.
 Unfortunately, even though we have established in-house rules for evaluation, 
 external evaluations are still based on traditional indicators such as the 
 highly and rightfully criticized but widely used Impact Factor and the like. 
 Under these conditions, we cannot today sacrifice our researchers — particularly 
 the young ones — in the overall competition for jobs and funds, on the altar 
 of "pure" Open Access.
 Best wishes
 Bernard Rentier
 Rector, University of Liège

[GOAL] Re: Fwd: The Open Access Interviews: Paul Royster

2014-09-21 Thread brentier
I agree, Richard, but we are not really looking for accuracy here; we are 
looking for a general trend. The method is approximate and, as Jean-Claude 
rightly mentions, it suffers from a serious language and domain bias. In other 
words, it is plagued by a strong underestimation.
Whether ORBi's compliance level is 70, 80 or 90% is not a major concern to me 
(even though I would love it to reach 100%!), I must admit. I am satisfied to 
observe that it is very high and not in the 15-30% range, which is what happens 
when a mandate is not enforced by a link to assessment procedures.

Bernard


 On 21 September 2014, at 10:51, Richard Poynder ri...@richardpoynder.co.uk 
 wrote:
 
 As a layperson I would certainly be interested to know what margin of error 
 we can assume the “Web of Science and/or Scopus” approach has. I am 
 conscious, for instance, that some of the reports by UK universities into 
 RCUK compliance mention using Web of Science, but they all appear keen to 
 stress that they have serious concerns about data accuracy.
  
 A list of RCUK compliance reports, by the way, can be found here: 
 http://goo.gl/Yi3twT
  
 There is also a very informative blog post on the topic of monitoring open 
 access mandates/policies by Cameron Neylon here: http://goo.gl/Y02S87
  
 Richard Poynder
  
  
  
  
 From: goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] On Behalf Of 
 Jean-Claude Guédon
 Sent: 20 September 2014 23:27
 To: Global Open Access List (Successor of AmSci)
 Subject: [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster
  
 Extremely good answer, Bernard!
 
 It is also very good to clarify the fact that the 90% figure is calculated 
 against the baseline of a combined WoS_Scopus search. However, and this was 
 part of my difficulties with Stevan's argument, I suspect that in SSH, in a 
 French-speaking university, many publish in French-language journals that do 
 not appear in either list. This means that, for Liège, the baseline works 
 from one year to the next, but if you want to compare Liège's mandate and its 
 effectiveness (which, once again, I agree, is - from common sense - the best) 
 with another kind of mandate in an English-speaking university, the baselines 
 will not be comparable.
 
 If, furthermore, you imagine two universities that not only differ 
 linguistically, but also differ in the relative weight of disciplines in 
 research output - say, one heavily slanted toward STM and the other toward 
 SSH - this too will affect the baseline simply by virtue of the fact that SSH 
 publications are not as well covered by WoS and Scopus as are STM 
 publications.
 
 In conclusion, the baseline is OK for comparisons of a mandate's 
 effectiveness longitudinally, or for comparisons of two successive, 
 but different, mandates, assuming the institution remains pretty much the 
 same over time in terms of mix of research emphases; it is far more 
 questionable across institutions, especially when different languages are 
 involved (but not only).
 
 Incidentally, what proportion of papers deposited in ORBi does not appear in 
 either WoS or Scopus? That too would be interesting to know, as it might help 
 Stevan refine his baseline and thus make it more convincing.
 
 Finally, given that all universities require an annual assessment of 
 performance, including a bibliography of publications in the completed year, 
 would it be difficult to compare the repository's holdings against the 
 publications of the researchers as declared by them? Knowing researchers, 
 every last little scrap of paper will be minutely listed in the yearly 
 assessment forms... 
 
 --
  
 Jean-Claude Guédon
 Professeur titulaire
 Littérature comparée
 Université de Montréal
 On Saturday 20 September 2014 at 19:10 +0200, Bernard Rentier - IMAP wrote:
 
 Dear Richard,
  
 
 Here are the answers:
  
 
 1. ORBi, the Liège University Repository, will soon (I believe) reach 90% 
 compliance. It is our target for 2014 and I hope we make it.
 This figure is the percentage of ULg papers found in Web of Science and/or 
 Scopus that are also deposited in ORBi (see the method at 
 http://eprints.soton.ac.uk/340294/).
 It concerns one year at a time and it is not cumulative. Last May, the 
 compliance level for the publications of 2013 was already 73% and our figure 
 for 2012 is in the 80% range.
  
 
 2. Only a small proportion of ULg papers are in CC-BY.
 This is simply because, in order to publish in the journal of their choice (I 
 haven’t tried to do anything against that!), our authors, in the great 
 centuries-old tradition, give away their rights to the publisher. We have no 
 control over that.
 Later on, there is no way for them to release the same text under CC-BY (in 
 fact, we are preparing ORBi 2.0, which will offer a CC-BY option).
 For now, we are aiming at free access and we are not yet fighting hard for 
 re-use rights. We shall move 

[GOAL] Estimating total annual institutional article output vs estimating percentage deposit

2014-09-21 Thread Stevan Harnad
Using total institutional Web-of-Science-indexed article output as the
baseline with which to compare institutional deposit percentages yields an
underestimate (as we always state with our findings) *of total
institutional article output*, of course, by definition.

But it is not at all clear that there is any reason to assume that this is an
underestimate of the *deposit percentage*. (Why would authors be more
compliant with a deposit mandate for non-WoS journal articles than with WoS
journal articles?)

As I have already replied to Jean-Claude, if someone believes that the
deposit percentage would be systematically different for non-WoS articles,
there are ways to test this. An obvious way is to specifically test SCOPUS
*non-WoS-indexed* articles, to see whether their deposit percentage turns
out to be any different. Other indices can be used too, to test non-ISI
articles, including non-English-language indices.
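
A toy sketch of such a check in Python, assuming the article sets can be
reduced to sets of DOIs (the DOIs below are invented purely for illustration):

def deposit_rate(dois, deposited):
    # Share of the given DOIs found among the repository's full-text deposits.
    return len(dois & deposited) / len(dois) if dois else float("nan")

wos = {"10.1/a", "10.1/b", "10.1/c"}               # WoS-indexed articles
scopus = {"10.1/b", "10.1/c", "10.1/d", "10.1/e"}  # Scopus-indexed articles
deposits = {"10.1/a", "10.1/b", "10.1/d"}          # repository deposits

scopus_only = scopus - wos                         # indexed by Scopus, not WoS
print("WoS-indexed deposit rate: {:.0%}".format(deposit_rate(wos, deposits)))
print("Scopus-only deposit rate: {:.0%}".format(deposit_rate(scopus_only, deposits)))

If the two rates come out close, the WoS baseline is not biasing the estimated
deposit percentage; a large gap would support the conjecture.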

But I do think that there is some conflation of total output with deposit
percentage here. For any institutional repository, we know the total
deposits, we just don't know what percentage of total institutional output
they reflect.

Now Liège has a second way to estimate this (*if they can be confident that
Liège authors have indeed at least been compliant in depositing the
bibliographic metadata for all their articles, as mandated*).

And the fact that the Liège repository (ORBi) has been designated as the
sole means of submitting publications for institutional performance review
makes it very likely that Liège authors have been faithfully depositing all
their metadata there. This does give Liège a way to improve its estimate of
the percentage deposit of full texts.
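
In back-of-the-envelope terms (the counts below are invented, not Liège's
actual figures), that estimate is simply:

metadata_records = 5200   # all records declared in the repository for one year
fulltext_records = 4300   # the subset carrying an accessible full text
print("Estimated full-text deposit rate: {:.1%}".format(
    fulltext_records / metadata_records))   # ~82.7%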

But this method (as I'm sure Bernard & Paul will agree) is a bit of a
bootstrap, as it is based on the assumption (not the evidence) *that all
authors are faithfully depositing the metadata for all their articles.*

The WoS estimate makes no such assumption. It gets the metadata from a
reliable source elsewhere. (But the comparison is certainly worth making.)

Other institutions, however, whose mandates do not yet include Liège's
all-important performance-review condition, will not be able to use this
second method to estimate the effectiveness of their deposit mandates.
Indeed, one of the rationales for adopting an institutional deposit mandate
is that *institutions currently have no way of knowing their total research
output* -- hence they can only find out by consulting external databases
such as WoS or SCOPUS!

One last point. It is certain that WoS underestimates total
non-English-language article output and hence it is conceivable that
authors at a non-English-language institution could have a higher deposit
rate for their non-English-language articles than their English ones.
Whether the WoS baseline thus generates a biased underestimate of overall
percentage deposit at non-English-language institutions can be tested
against a relevant non-English-language bibliographic index.

However, this definitely does not support Jean-Claude's parallel conjecture
that the WoS baseline might produce a biased estimate of the deposit
percentage of SSH (Social Sciences and Humanities) *journal articles*. It
is logically possible that SSH authors are selectively non-compliant with
deposit mandates for their journal articles -- but that, of course, could
already be tested with the WoS baseline data...

*Stevan Harnad*

On Sun, Sep 21, 2014 at 3:54 AM, Richard Poynder ri...@richardpoynder.co.uk
 wrote:

 As a layperson I would certainly be interested to know what margin of
 error we can assume the “Web of Science and/or Scopus” approach
 has. I am conscious, for instance, that some of the reports by UK
 universities into RCUK compliance mention using Web of Science, but they
 all appear keen to stress that they have serious concerns about data
 accuracy.



 A list of RCUK compliance reports, by the way, can be found here:
 http://goo.gl/Yi3twT



 There is also a very informative blog post on the topic of monitoring open
 access mandates/policies by Cameron Neylon here: http://goo.gl/Y02S87



 Richard Poynder









 *From:* goal-boun...@eprints.org [mailto:goal-boun...@eprints.org] *On
 Behalf Of *Jean-Claude Guédon
 *Sent:* 20 September 2014 23:27
 *To:* Global Open Access List (Successor of AmSci)
 *Subject:* [GOAL] Re: Fwd: The Open Access Interviews: Paul Royster



 Extremely good answer, Bernard!

 It is also very good to clarify the fact that the 90% figure is calculated
 against the baseline of a combined WoS_Scopus search. However, and this was
 part of my difficulties with Stevan's argument, I suspect that in SSH, in a
 French-speaking university, many publish in French-language journals that
 do not appear in either list. This means that, for Liège, the baseline
 works from one year to the next, but if you want to compare Liège's mandate
 and its effectiveness (which, once again, I agree, is - from common sense