Re: [GOAL] Re : Re: SSRN Sellout to Elsevier

2016-05-18 Thread Isidro F. Aguillo
community
would not have been successful without the commitment of so
many of you who have contributed in so many ways. I am proud
of the community we have created, and I invite you to
continue your involvement and support in this effort. 


The staff at SSRN are all staying (including Gregg Gordon,
CEO and myself), the Rochester office is still in place, it
will still be free to upload and download papers, and we
remain committed to “Tomorrow’s Research Today”. I look
forward to and am committed to a successful transition and to
another great 25 years for the SSRN community that rivals the
first. 

Michael C. Jensen 

Founder & Chairman, SSRN 






--
-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/- 
Ross Mounce, PhD 
Software Sustainability Institute Fellow 2016 
Dept. of Plant Sciences, University of Cambridge 
www.rossmounce.co.uk 
-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/-/- 





--


**************
Isidro F. Aguillo
Dr. Honoris Causa Universitas Indonesia
Dr. Honoris Causa National Research Nuclear University Moscow
Editor Rankings Web
Cybermetrics Lab - Scimago Group, IPP-CSIC
Madrid. SPAIN

isidro.agui...@csic.es
ORCID 0000-0001-8927-4873
ResearcherID: A-7280-2008
Scholar Citations SaCSbeoJ
Twitter @isidroaguillo
Rankings webometrics.info
**************




Re: [GOAL] Re : Re: SSRN Sellout to Elsevier

2016-05-17 Thread Isidro F. Aguillo
community of authors, researchers and institutions
>  that has made this all possible. I consider it one of my great
>  accomplishments in life. The community would not have been
>  successful without the commitment of so many of you who have
>  contributed in so many ways. I am proud of the community we have
>  created, and I invite you to continue your involvement and support
>  in this effort.
>
>
>  The staff at SSRN are all staying (including Gregg Gordon, CEO and
>  myself), the Rochester office is still in place, it will still be
>  free to upload and download papers, and we remain committed to
>  “Tomorrow’s Research Today”. I look forward to and am committed to
>  a successful transition and to another great 25 years for the SSRN
>  community that rivals the first.
>
>
>  Michael C. Jensen
>
> Founder & Chairman, SSRN


-- 
Isidro F. Aguillo, HonPhD
Cybermetrics Lab (3E14). IPP - CSIC
Albasanz, 26-28. 28037 Madrid. Spain

isidro.aguillo @ csic.es
www. webometrics.info



Re: [GOAL] Request Your Help for an open access study on non-English-language journals

2016-03-11 Thread Isidro F. Aguillo
You can use our list of portals of journals, with dozens of OJS 
implementations in Latin America and other regions, plus other in-house 
developments:


http://repositories.webometrics.info/en/top_portals


On 11/03/2016 16:29, Pierre Mounier wrote:

Revues.org in France : http://www.revues.org
Hrcak in Croatia : http://hrcak.srce.hr/
AJOL in Africa : http://www.ajol.info/


Best,


--
Pierre Mounier
Associate Director for international development - OpenEdition

EHESS
190-198 avenue de France
75244 Paris cedex 13
Bureau/Office 447
Mob. +33 (0)6 61 98 31 86
Twitter : @piotrr70
orcid.org/0000-0003-0691-6063


On Fri, Mar 11, 2016 at 3:57 PM, Jean-Claude Guédon 
<jean.claude.gue...@umontreal.ca> wrote:


Do not forget Redalyc in Mexico.


Jean-Claude Guédon

Full Professor
Comparative Literature
Université de Montréal

On Friday, 11 March 2016 at 12:04 +0200, Cenyu Shen wrote:

Dear recipient,

We have started a study to look at a subset of Open Access scholarly
journal publishing which we feel has been overlooked in much of the
published research, namely OA journals published in languages other
than English. We include both newly started electronic-only OA
journals and older print journals that have started to make the
e-version free. The vast majority of these probably don't charge
authors. There are several reasons why there have been few results
about the overall extent of such publishing. One is that many
leading researchers come from countries where English is the main
language, and many studies have from the start been restricted to
journals publishing in English, in order to facilitate the
gathering of data. Another is that non-English journals are likely to
be underrepresented in all the available indexes, including the DOAJ.

We have so far identified two easy ways to find information about
such journals. The first is using DOAJ and its search facilities. The
second is using the OA journal portals we are aware of, such as
SciELO, J-STAGE, doiSerbia etc. In addition there are many
countries which don't have such portals, and we would also like to get
information from those. For this purpose we are trying to contact experts
who we believe have good knowledge of the situation in their countries and
could provide us with links to lists of all reputable scholarly journals in
their country, lists of OA journals, etc.

The countries that interest us in particular are: Canada, most European
countries (except the UK) including Russia, Francophone countries
in Africa, and Middle Eastern and Asian countries.

We will use three ways to contact volunteers who can help us:

Contact with the management of DOAJ and its voluntary editorial staff
An email to The Global Open Access List (GOAL)
Direct e-mail to people we know

If you feel you are in a position to provide us with information, please
contact us by e-mail.

Cenyu Shen, Ph.D. Student
Principal researcher
cenyu.s...@hanken.fi

Bo-Christer Björk, Professor
bo-christer.bj...@hanken.fi

Mikael Laakso, Assistant Professor
mikael.laa...@hanken.fi

Information Systems Science
Dept. of Management and Organisation
Hanken School of Economics
P.O. Box 479, 00101 Helsinki, Finland




--


**************
Isidro F. Aguillo
Dr. Honoris Causa Universitas Indonesia
Dr. Honoris Causa National Research Nuclear University Moscow
Editor Rankings Web
Cybermetrics Lab - Scimago Group, IPP-CSIC
Madrid. SPAIN

isidro.agui...@csic.es
ORCID 0000-0001-8927-4873
ResearcherID: A-7280-2008
Scholar Citations SaCSbeoJ
Twitter @isidroaguillo
Rankings webometrics.info
**************




Re: Ranking of repositories

2011-08-04 Thread Isidro F . Aguillo
Dear colleagues,

In my country we sometimes say: the best is the enemy of the good. Certainly
there are far better tools for analyzing the OAI empirically, but at this moment
the key objective is IMPACT, and in my personal experience several
universities are promoting their institutional repositories to improve their
position in the Rankings. Perhaps the problem is not with the Rankings
themselves, but with authorities not applying quality criteria in the evaluation
of such classifications. Only in this way can it be explained that a lot of people
believe that the unethical Times Higher Education Ranking is very prestigious.

Best regards,



On 03/08/2011 at 17:52, Jean-Claude Guédon wrote:
  Personally, I regret these constant efforts to create rankings
  leading to the identification of excellence. They completely
  distort the quality issues which, IMHO, are far more important.
  Would it not be much better to create evaluation thresholds
  corresponding to quality levels? This would encourage lower-level
  repositories to try moving up a category, and then perhaps two?

  Some may object that category classifications are nothing more than
  rough, crude ranking. This is not false, but there is a distinction
  to be observed: quality thresholds do not put competition at the
  center of everything, and they do not rely on competition to
  identify quality.

  Some may think that competition is a good way to create quality, but
  this is not the case. Just to give an example: the US health system
  is largely dominated by competitive rankings of all kinds. This
  leads to two opposite results: the US has many of the best health
  centers in the world and a great many Nobel prizes in medicine; yet,
  the US ranks about 35th in the world for life expectancy, which is
  shockingly low. If one were to choose between having the medical
  champions of the world, versus having a population with a better
  general health, one would tend to prefer the latter. At least that
  would be my choice.

  In other words, fighting for excellence as the over-arching
  principle of quality creation leads to the concentration of quality
  at the very top, and it often leads to the neglect of overall
  quality.

  I believe science needs quality everywhere, and not just at the top.
  A bit of competition is also needed, but only at the very top, to
  stimulate the very best to go one step further. Competition
  everywhere does not work because those that cannot hope to come even
  close to the very best, the gold medals, simply give up.

  Incidentally, OA corresponds to a massive vote in favor of quality,
  as the many discussions about quality control and peer review that
  are appearing in its wake demonstrate. Excellence is all right if it
  is limited to the very top of science, where the paradigm shifts
  occur. But most of science is not about paradigm shifting, far from
  it. Let us value excellence, but let us keep it also in its proper
  place. Meanwhile, let us grow quality all over and Open Access is a
  powerful tool to that end.

  My two cents' worth.

  Jean-Claude Guédon

  On Wednesday, 3 August 2011 at 10:04 -0400, Peter Suber wrote:
[Forwarding from Isidro F. Aguillo, via the AmSci OA
Forum. --Peter Suber.]


The second edition of the 2011 Ranking Web of
Repositories was published at the end of July. It
is available from the Webometrics portal:

http://repositories.webometrics.info/


The number of repositories is growing fast, especially
in academic institutions from developing countries. As
in previous editions the subject repositories still
appear in the top positions, with large institutional
ones following them.


There are no relevant changes in this edition, but the
editors are making a plea to the Open Access community
regarding a few aspects related to intellectual property
issues.


The papers and other documents deposited in
institutional repositories are probably the main asset
of those institutions. As important as giving free
access to others is the proper recognition of the
authorship of the scientific documents. Unfortunately a
few institutions are hosting their repositories on
websites outside the main web domain of their organization,
and many repositories are recommending the use of systems
like Handle and other PURL-like URLs for citing
(linking) the deposited items. This means that moral
rights regarding institutional authorship are ignored,
relevant information about authors

Ranking of repositories

2011-08-03 Thread Isidro F . Aguillo

The second edition of the 2011 Ranking Web of Repositories was published at
the end of July. It is available from the Webometrics portal:

http://repositories.webometrics.info/

The number of repositories is growing fast, especially in academic institutions
from developing countries. As in previous editions the subject repositories
still appear in the top positions, with large institutional ones following them.

There are no relevant changes in this edition, but the editors are making a plea
to the Open Access community regarding a few aspects related to intellectual
property issues.

The papers and other documents deposited in institutional repositories are
probably the main asset of those institutions. As important as giving free
access to others is the proper recognition of the authorship of the scientific
documents. Unfortunately a few institutions are hosting their repositories on
websites outside the main web domain of their organization, and many
repositories are recommending the use of systems like Handle and other
PURL-like URLs for citing (linking) the deposited items. This means that moral
rights regarding institutional authorship are ignored, relevant information
about authors is lost, and the semantic possibilities of the web address are
not exploited.

Nowadays it is already common to add the URL of the full-text document
to the bibliographic references of published papers. Logically the link to
the full text in the institutional repository can be used for that purpose, but
researchers are faced with options that ignore their institutional affiliation,
use strange, meaningless codes prone to typos and other mistakes, and point to
metadata pages rather than to the full-text documents. Obviously it could be
more profitable for authors to host their papers on their personal pages
instead of in institutional repositories whose naming policies raise relevant
copyright issues.

Our position is that end-users should be taken into account, that web addresses
are going to play an important role in citing behavior, that citations are the
key tool for the evaluation of authors, that institutions are investing large
amounts of money in their repositories in exchange for prestige and impact, and
that providing a permanent address is the duty of the institution, not the
responsibility of external third parties.

Comments are welcome.



--
===

Isidro F. Aguillo, HonPhD

The Cybermetrics Lab
IPP-CCHS-CSIC
Albasanz, 26-28 (3C1)
28037 Madrid. Spain

isidro.aguillo @ cchs.csic.es

===




Re: Ranking Web of Repositories: July 2010 Edition

2010-07-12 Thread Isidro F . Aguillo
.

For the record, I completely agree with you about PDF / HTML /
XHTML. If only Microsoft Word (and LaTeX) had decent export
facilities that produced good semantic HTML.

--
Les Carr




--
===

Isidro F. Aguillo, HonPhD
Cybermetrics Lab (3C1)
IPP-CCHS-CSIC
Albasanz, 26-28
28037 Madrid. Spain


Editor of the Rankings Web
===




Re: Ranking Web of Repositories: July 2010 Edition

2010-07-12 Thread Isidro F . Aguillo
 HTML is superior to PDF for purposes of
 access and reuse, I self-archive in HTML rather than PDF whenever I can.
 For the record, I completely agree with you about PDF / HTML / XHTML. If
 only
 Microsoft Word (and LaTeX) had decent export facilities that produced good
 semantic HTML.
 Why wait for Microsoft? What has the open source community been doing on
 this front? What about OpenOffice? Any good open source NLM DTD conversion
 tools out there? Why has it taken so long?
No answer to that. I am only mirroring the current situation.

 Leslie (Chan)

 --
 Les Carr




-- 
===

Isidro F. Aguillo, HonPhD
Cybermetrics Lab (3C1)
IPP-CCHS-CSIC
Albasanz, 26-28
28037 Madrid. Spain


Editor of the Rankings Web
===



Re: Ranking Web of Repositories: July 2010 Edition

2010-07-09 Thread Isidro F . Aguillo
  Dear Stevan:

A lot of interesting stuff to think about. We are already working on 
some of those proposals but it is not easy. However, perhaps you will 
like this page we prepared for the University rankings, on UK 
universities' commitment to OA:

http://www.webometrics.info/openac.html

Thanks for your useful comments,



On 08/07/2010 at 18:34, Stevan Harnad wrote:
 On 2010-07-08, at 4:43 AM, Isidro F. Aguillo wrote:

 Dear Hélène:

 Thank you for your message, but I disagree with your proposal. We are not 
 measuring only contents but contents AND visibility on the web.
 Dear Isidro,

 If I may intervene with some comments too, as this discussion has some wider 
 implications:

 Yes, you are measuring both contents and visibility, but presumably you want 
 the difference between (1) the ranking of the top 800 repositories and (2) 
 the ranking of the top 800 *institutional* repositories to be based on the 
 fact that the latter are institutional repositories whereas the former are 
 all repositories (central, i.e., multi-institutional, as well as 
 institutional).

 Moreover, if you list redundant repositories (some being the proper subsets 
 of others) in the very same ranking, it seems to me the meaning of the 
 ranking becomes rather vague.

 Certainly HyperHAL covers the contents of all its participants, but the 
 impact of these contents depends on other factors. Probably researchers 
 prefer to link to the paper in INRIA because of the prestige of this 
 institution, the affiliation of the author or the marketing of their 
 institutional repository.
 All true, but perhaps the significance and usefulness of the rankings would 
 be greater if you either changed the weight of the factors (volume of 
 full-text content, number of links) or, alternatively, you designed the 
 rankings so the user could select and weight the criteria on which the 
 rankings are displayed.

 Otherwise your weightings become like the h-index -- an a-priori 
 combination of untested, unvalidated weights that many users may not be 
 satisfied with, or fully informed by...

 But here is a more important aspect. If I were the president of INRIA I would 
 prefer people to use my institutional repository instead of CCSD. No problem 
 with the latter, they are doing a great job and increasing the reach of 
 INRIA, but the papers deposited are a very important (the most important?) 
 asset of INRIA.
 But how much INRIA papers are linked, downloaded and cited is not necessarily 
 (or even probably) a function of their direct locus!

 What is important for INRIA (and all institutions) is that as much as 
 possible of their paper output should be OA, simpliciter, so that it can be 
 linked, downloaded, read, applied, used and cited. It is entirely secondary, 
 for INRIA (and all institutions), *where* their papers are OA, compared to 
 the necessary condition *that* they are OA (and hence freely accessible, 
 usable, harvestable).

 Hence (in my view) by far the most important ranking factor for institutional 
 repositories is how much of their full-text institutional paper output is 
 indeed deposited and OA. INRIA would have no reason to be disappointed if the 
 locus from which its content is searched, retrieved and linked is some other, 
 multi-institutional harvester. INRIA still gets the credit and benefits from 
 all the links, downloads and citations of INRIA content!

 (Having said that, locus of deposit *does* matter, very much, for deposit 
 mandates. Deposit mandates are necessary in order to generate OA content. 
 And, for strategic reasons that are elaborated in my reply to Chris 
 Armbruster, it makes a big practical difference for success in agreeing on 
 the adoption of a mandate that both institutional and funder mandates should 
 require convergent *institutional* deposit, rather than divergent and 
 competing institutional vs. institution-external deposit. Here too, your 
 repository rankings would be much more helpful and informative if they gave a 
 greater weight to the relative size of each institutional repository's 
 content and eliminated multi-institutional repositories from the 
 institutional repository rankings -- or at least allowed institutional 
 repositories to be ranked independently on content vs links.

 I think you are perhaps being misled here by the analogy with your sister 
 rankings (RWWU, http://www.webometrics.info/) of universities rather than their 
 repositories. In university rankings, the links to the university site itself 
 matter a lot. But in repository rankings links matter much less than *how 
 much institutional content is accessible*. For the degree of usage of that 
 content, harvester sites may be more relevant measures, and, after all, 
 downloads and citations, unlike links, carry their credits (to the authors 
 and institutions) with them no matter where the transaction happens to 
 occur...

 Regarding the other comments we are going to correct those

Re: Ranking Web of Repositories: July 2010 Edition

2010-07-08 Thread Isidro F . Aguillo
  Dear Hélène:

Thank you for your message, but I disagree with your proposal. We are 
not measuring only contents but contents AND visibility on the web. 
Certainly HyperHAL covers the contents of all its participants, but the 
impact of these contents depends on other factors. Probably researchers 
prefer to link to the paper in INRIA because of the prestige of this 
institution, the affiliation of the author or the marketing of their 
institutional repository.
But here is a more important aspect. If I were the president of INRIA I 
would prefer people to use my institutional repository instead of CCSD. 
No problem with the latter, they are doing a great job and increasing 
the reach of INRIA, but the papers deposited are a very important (the 
most important?) asset of INRIA.

Regarding the other comments, we are going to correct those with 
mistakes, but it is very difficult for us to detect that Virginia Tech 
University is padding its institutional repository with contents 
authored by external scholars.

Best regards,





On 07/07/2010 at 23:03, Hélène Bosc wrote:
 Isidro,
 Thank you for your Ranking Web of World Repositories and for informing 
 us about the best quality repositories!


 Being French, I am delighted to see HAL so well ranked and I take this 
 opportunity to congratulate Franck Laloe for having set up such a good 
 national repository as well as the CCSD team for continuing to 
 maintain and improve it.

 Nevertheless, there is a problem in your ranking that I have already 
 had occasion to point out to you in private messages.
 May I remind you that:

 Correction for the top 800 ranking:


 The ranking should either index HyperHAL alone, or index both 
 HAL/INRIA and HAL/SHS, but not all three repositories at the same 
 time: HyperHAL includes both HAL/INRIA and HAL/SHS.

 Correction for the ranking of institutional repositories:


 Not only does HyperHAL (#1) include both HAL/INRIA (#3) and HAL/SHS 
 (#5), as noted above, but HyperHAL is a multidisciplinary repository, 
 intended to collect all French research output, across all 
 institutions. Hence it should not be classified and ranked against 
 individual institutional repositories but as a national, central 
 repository. Indeed, even HAL/SHS is multi-institutional in the usual 
 sense of the word: single universities or research institutions. The 
 classification is perhaps being misled by the polysemous use of the 
 word institution.


 Not to seem to be biased against my homeland, I would also point out 
 that, among the top 10 of the top 800 institutional repositories, 
 CERN (#2) is to a certain extent hosting multi-institutional output 
 too, and is hence not strictly comparable to true single-institution 
 repositories. In addition, California Institute of Technology Online 
 Archive of California (#9) is misnamed -- it is the Online Archive of 
 California http://www.oac.cdlib.org/ (CDLIB, not CalTech) and as such 
 it too is multi-institutional. And Digital Library and Archives 
 Virginia Tech University (#4) may also be anomalous, as it includes 
 the archives of electronic journals with multi-institutional content. 
 Most of the multi-institutional anomalies in the Top 800 
 Institutional seem to be among the top 10 -- as one would expect if 
 multiple institutional content is inflating the apparent size of a 
 repository. Beyond the top 10 or so, the repositories look to be 
 mostly true institutional ones.


 I hope that this will help in improving the next release of your 
 increasingly useful ranking!


 Best wishes
 Hélène Bosc

 - Original Message - From: Stevan Harnad 
 har...@ecs.soton.ac.uk
 To: american-scientist-open-access-fo...@listserver.sigmaxi.org
 Sent: Tuesday, July 06, 2010 6:07 PM
 Subject: Fwd: Ranking Web of Repositories: July 2010 Edition



 Begin forwarded message:

 From: Isidro F. Aguillo isidro.agui...@cchs.csic.es
 Date: July 6, 2010 11:13:58 AM EDT
 To: sigmetr...@listserv.utk.edu
 Subject: [SIGMETRICS] Ranking Web of Repositories: July 2010 Edition

 Ranking Web of Repositories: July 2010 Edition

 The second edition of 2010 Ranking Web of Repositories has been 
 published the same day OR2010 started here in Madrid. The ranking is 
 available from the following URL:

 http://repositories.webometrics.info/

 The main novelty is the substantial increase in the number of 
 repositories analyzed (close to 1000). The Top 800 are ranked 
 according to their web presence and visibility. As usual the thematic 
 repositories (CiteSeer, RePEc, arXiv) lead the Ranking, but the 
 French research institutes (CNRS, INRIA, SHS) using HAL are very 
 close. Two issues have changed from previous editions from a 
 methodological point of view: the use of Bing's engine data has been 
 discarded due to irregularities in the figures obtained, and MS Excel 
 files have been excluded again.

 At the end of July the new edition of the Rankings of universities, 
 research centers and hospitals

Ranking Web of World Repositories

2009-01-27 Thread Isidro F . Aguillo
The January edition of the Ranking Web of Repositories has just been published.

http://repositories.webometrics.info/

The number of repositories is growing fast worldwide, but many of them
still do not have their own domain or subdomain, and for this reason it
is not possible to add them to our analysis. Some institutions maintain
several databases with completely different URLs, which penalizes their
global visibility.


We are still unable to add usage/download statistics but there are many
initiatives already working on standardization of the collecting
methods, so we expect that global data could be available soon.

Following several requests we now show two global Rankings: one that
covers all repositories as in previous editions (Top 300), and a new
one that focuses only on Institutional Repositories (Top 300
Institutional).

There is a minor change regarding the calculation of the number of rich
files, as in this new edition we are again using formats other than PDF
(doc, ppt, ps) to obtain the data. Contrary to the methodology used for
the other Rankings, the figures for rich files are combined and not
treated individually.

The French HAL central repository, and its subsets like INRIA, Social
Sciences and Humanities (HAL-SHS) or IN2P3, are at the top of the
institutional repository list.

Important repositories like PubMed Central, CiteSeerX and the
Smithsonian/NASA Astrophysics Data System do not use standard suffixes
to designate their contents (e.g. papers in Acrobat format with file names
whose extension is not .pdf). This is a bad practice, as it reduces the
visibility of these documents to search engines.

Our policy is not to include collectors or metarepositories, with one
exception, DiVA, the interface that Uppsala University provides to more
than 20 Nordic repositories. Many of these institutions do not have
their own systems but link their contents to the DiVA portal.
Unfortunately, this means that many of the papers are under different
domains and therefore do not contribute to DiVA's rank.

--
*
Isidro F. Aguillo
Cybermetrics Lab
CCHS - CSIC
Albasanz, 26-28, 3C1. 28037 Madrid. Spain

Ph. 91-602 2890. Fax: 91-602 2971

isidro.aguillo @ cchs.csic.es
www. webometrics.info
*


Re: University ranking

2008-08-19 Thread Isidro F . Aguillo
-ACCESS-FORUM Digest - 6 Aug 2008 to 7
 Aug 2008 (#2008-151)
 
 
 Hello all,
 
 This type of ranking is to me clearly a case of crank ranking.
 
 What is THAT supposed to mean? Quality? But of what? In fact it means not
 much of anything in terms of the academic reality that should be the basic
 focus of universities and of indicators related to their missions. The
 accompanying text of the message seems to imply that the changes in
 positions are related to some kind of improvement of a university, which
 is clearly NOT the case. Worse still, in terms of interpretation, the
 text notes that "the UNAM climbs up to position number 51, a relevant
 change from the previous number 59" (we underline). Well... In fact, this
 change of 8 positions among a thousand is in all probability (99.99...)
 due to a random change from year to year, given the large variance of the
 sample.
 
 Let us hope that those who like constructing indicators will use their
 time to find meaningful ones instead of trying to interpret small
 variations in pseudo common-sense indicators (here web hits) which are
 not really connected to universities' missions ...
 
 Have a nice day
 
 
 Yves Gingras
 
   

-- 

Isidro F. Aguillo
Laboratorio de Cibermetría
Cybermetrics Lab
CCHS - CSIC
Joaquin Costa, 22
28002 Madrid. Spain

isidro @ cindoc.csic.es
+34-91-5635482 ext 313



New Beta version of the Ranking Web of World Repositories

2008-05-26 Thread Isidro F . Aguillo
We have taken into account some of the suggestions regarding the Ranking Web
of World Repositories for the new Beta 2 version. We have changed the domain,
which is now autonomous from the Universities Ranking one:

http://repositories.webometrics.info/

This will allow us to be more flexible in the future and to enlarge and
diversify the coverage. We are now ranking the Top 300 instead of the first
200 as in the previous edition. New repositories have been added, like
PubMed Central, which was previously missing, and others have been deleted
because they are not focused on research papers.

The most important changes are related to the ranking methodology. For the
Rich Files ranking we are considering only Adobe Acrobat PDF files, as the
numbers for other formats were too low for ranking purposes. For the same
reason, only data extracted from Google and Yahoo were considered. The
Scholar ranking is now built from the mean of the total number of items and
the number of items published between 2001 and 2008, to increase the weight
of freshness.
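
A minimal sketch of that revised Scholar indicator in Python (the figures
are illustrative, not real data):

# Mean of the total item count and the 2001-2008 item count, so that
# recent items are effectively counted twice.
def scholar_indicator(total_items, items_2001_2008):
    return (total_items + items_2001_2008) / 2.0

# e.g. 10,000 items in total, 6,000 of them published between 2001 and 2008:
print(scholar_indicator(10000, 6000))   # 8000.0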

Finally, we are suggesting the use of Google Analytics as a common minimum
standard for obtaining usage data, clearly the weakest point of our current
approach.

Comments are welcome, as usual.

-- 

Isidro F. Aguillo
Laboratorio de Cibermetría
Cybermetrics Lab
CCHS - CSIC
Joaquin Costa, 22
28002 Madrid. Spain

isidro @ cindoc.csic.es
+34-91-5635482 ext 313



Re: New Ranking of Central and Institutional Repositories

2008-02-12 Thread Isidro F . Aguillo
 20%.
  
  If it is necessary to measure size, and it probably is, then I suggest a
  measure that counts the number of records with a publication date within
  the last five years. Choose 10 years if you want, but ancient
  record-keeping does not translate into impact.
  
  ACTIVITY
  It is quite clear from ROAR that deposit activity is a major measure of
  impact. There are three easy measures to derive.
  * The number of acquisitions in the last 12 months. Easily discovered
    from the OAI interface.
  * The number of acquisitions with a publication date in the last 12
    months. Easily discovered from the OAI interface. This measures
    currency as well as activity.
  * Some repositories are sporadic, some are continuous, the latter
    reflecting a deep-seated integration within the university's activity.
    A simple measure would be to derive a statistic from the traffic (see
    ROAR), such as
    * the number of days in the last 12 months with a deposit event
    * the Fourier spectrum of the last 12 months' deposit events having no
      component with a period longer than 7 days above 10% (I guess at what
      is significant and perhaps this can be turned into a score).
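
  A minimal Python sketch of the first and third measures, assuming the
  deposit datestamps have already been harvested from the repository's
  OAI-PMH interface (the harvesting step and the Fourier variant are
  omitted; all names are illustrative):

  from datetime import date, timedelta

  def activity_measures(deposit_dates, today):
      # deposit_dates: one datetime.date per deposit event
      window_start = today - timedelta(days=365)
      recent = [d for d in deposit_dates if window_start <= d <= today]
      return {
          "acquisitions_last_12m": len(recent),       # first measure
          "deposit_days_last_12m": len(set(recent)),  # days with a deposit
      }

  # Example: three deposits spread over two distinct days.
  dates = [date(2008, 1, 10), date(2008, 1, 10), date(2008, 2, 1)]
  print(activity_measures(dates, today=date(2008, 2, 12)))
  # {'acquisitions_last_12m': 3, 'deposit_days_last_12m': 2}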
  
  RICH TEXT
  This is a reasonable measure, though subject to error. For example we
  sometimes put a full-text that gives instructions on how to ask for
  access to the item concerned, or a bio of the creator of an artwork.
  
  
  DOWNLOADS
  I'd love to promote downloads as a measure of impact, but there is as
  yet no federated way to access this data.
  
  I'm happy to continue this dialogue.
  
  Arthur Sale
  Professor of Computer Science
  University of Tasmania
  
    -Original Message-
    From: American Scientist Open Access Forum
    [mailto:AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM@LISTSERVER.SIGMAXI.ORG]
    On Behalf Of Isidro F. Aguillo
   Sent: Monday, 11 February 2008 6:53 PM
   To: american-scientist-open-access-fo...@listserver.sigmaxi.org
   Subject: Re: [AMERICAN-SCIENTIST-OPEN-ACCESS-FORUM] New
   Ranking of Central and Institutional Repositories
  
   Dear all:
  
    Thanks for your interest in the Ranking of repositories, part
    of our larger effort for ranking the web presence of universities
    and research centers. A few comments on your messages:

    - Currently the Ranking of repositories is a beta version. We
    welcome comments, suggestions and criticisms. Information
    about missing repositories is warmly welcomed. After the feedback
    received during the last few days we are considering a new
    edition before the scheduled one in July.
    - Our ranking formula mimics PageRank in part, but our
    inspiration was in fact the impact factor. We maintain a 1:1
    ratio between visibility (impact) and size (activity), which is
    the basis of the IF. In order to take into account the
    diversity of web info we decided to split the size
    contribution according to additional criteria.
    - Freshness is a topic we are concerned about, not only for
    repositories but for the rest of the rankings too. We are
    considering taking it into account in the Scholar
    contribution, giving more weight to recent publications.
    - There are methodological problems in producing relative
    indicators: percentage of global output, or institution-size
    normalization. But you know rankings are usually built by GDP
    (US, Japan, Germany, ...) and not GDP per capita (Luxembourg,
    United Arab Emirates, ...)
    - Our position as a research group has been stated previously,
    but I am going to summarise again: the rankings are made with
    the aim of increasing the volume of academic information
    available on the Web, promoting the electronic publication of
    all the activities of the universities, not only the
    research-related ones. And especially from developing-country
    institutions.
  
   Best regards,
  
    Leslie Carr wrote:

     On 9 Feb 2008, at 21:36, Arthur Sale wrote:

      It looks as though the algorithm is the same as for university websites.

      Rank each repository for inward bound hyperlinks (VISIBILITY)
      Rank every repository for number of pages (SIZE)
      Rank every repository for number of 'interesting' documents, e.g. .doc,
      .pdf (RICH FILES)
      Rank every repository for number of records returned by a Google
      Scholar search (GOOGLE SCHOLAR)
      Compute (VISIBILITY x 50%) + (SIZE x 20%) + (RICH FILES x 15%) +
      (GOOGLE SCHOLAR x 15%)
      And then rank the repositories on this score.

      This is a poor measure in general. VISIBILITY (accounts for 50% of
      score!) is not necessarily useful for repositories, when harvesting
      is more important than hyperlinks. It will be strongly influenced by
      staff members linking their publications off a repository search.
      Both SIZE and RICH FILES measure absolute size and say nothing about
      currency or activity. Some of the higher placed Australian

Re: New Ranking of Central and Institutional Repositories

2008-02-11 Thread Isidro F . Aguillo
Dear all:

Thanks for your interest in the Ranking of repositories, part of our larger
effort for ranking the web presence of universities and research centers. A few
comments on your messages:

- Currently the Ranking of repositories is a beta version. We welcome
comments, suggestions and criticisms. Information about missing repositories
is warmly welcomed. After the feedback received during the last few days we are
considering a new edition before the scheduled one in July.
- Our ranking formula mimics PageRank in part, but our inspiration was in fact
the impact factor. We maintain a 1:1 ratio between visibility (impact) and size
(activity), which is the basis of the IF. In order to take into account the
diversity of web info we decided to split the size contribution according to
additional criteria.
- Freshness is a topic we are concerned about, not only for repositories but
for the rest of the rankings too. We are considering taking it into account
in the Scholar contribution, giving more weight to recent publications.
- There are methodological problems in producing relative indicators:
percentage of global output, or institution-size normalization. But you know
rankings are usually built by GDP (US, Japan, Germany, ...) and not GDP per
capita (Luxembourg, United Arab Emirates, ...)
- Our position as a research group has been stated previously, but I am going
to summarise again: the rankings are made with the aim of increasing the
volume of academic information available on the Web, promoting the electronic
publication of all the activities of the universities, not only the
research-related ones. And especially from developing-country institutions.

Best regards,

Leslie Carr wrote:
 
 On 9 Feb 2008, at 21:36, Arthur Sale wrote:
 
  It looks as though the algorithm is the same as for university websites.
  
  Rank each repository for inward bound hyperlinks (VISIBILITY)
  Rank every repository for number of pages (SIZE)
  Rank every repository for number of 'interesting' documents, e.g. .doc,
  .pdf (RICH FILES)
  Rank every repository for number of records returned by a Google Scholar
  search (GOOGLE SCHOLAR)
  Compute (VISIBILITY x 50%) + (SIZE x 20%) + (RICH FILES x 15%) + (GOOGLE
  SCHOLAR x 15%)
  And then rank the repositories on this score.
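
  A minimal Python sketch of this composite score, assuming the four
  component values have already been normalized into comparable scores
  (the normalization is not specified in the thread, so all names and
  figures here are illustrative):

  WEIGHTS = {"visibility": 0.50, "size": 0.20,
             "rich_files": 0.15, "google_scholar": 0.15}

  def composite(scores):
      # scores: one normalized value per component
      return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

  repos = {
      "A": {"visibility": 0.9, "size": 0.4, "rich_files": 0.7, "google_scholar": 0.5},
      "B": {"visibility": 0.6, "size": 0.8, "rich_files": 0.6, "google_scholar": 0.9},
  }
  # Sort descending by composite score; visibility's 50% weight dominates.
  print(sorted(repos, key=lambda r: composite(repos[r]), reverse=True))  # ['A', 'B']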
  
  This is a poor measure in general. VISIBILITY (accounts for 50% of
  score!) is not necessarily useful for repositories, when harvesting is
  more important than hyperlinks. It will be strongly influenced by staff
  members linking their publications off a repository search. Both SIZE
  and RICH FILES measure absolute size and say nothing about currency or
  activity. Some of the higher placed Australian universities have simply
  had old stuff dumped in them, and are relatively inactive in acquiring
  current material. Activity should be a major factor in metrics for
  repositories, and this could easily be measured by a search limited to a
  year (e.g. 2007), or by the way ROAR does it through OAI-PMH harvesting.
  
 I believe that the Webometrics (ghastly name!) ranking of repositories
 uses the same criteria as its ranking of universities, i.e. it is attempting
 to quantify the impact that the repository has had. This is very different
 to the size, deposit activity, or even used-ness of the repository and
 explains why the major contributing factor is VISIBILITY. The main issue
 for this league table is how much evidence there is in the public web
 that your active research and scholarly outputs are valued enough by your
 community of peers that they are linking to them. 
 This will probably seem entirely arbitrary to some people, and entirely
 obvious to others, depending on how much they see the web as a
 para-literature. It mimics Google's PageRank valuation of web pages
 according to how many 'votes' (links/quasi-citations) they get from other
 pages from independent sources.
 
  It is not possible to tell with any accuracy whether a University Website
 is a good website simply by looking at the University's place in the
 Webometrics Ranking of Universities. The website is simply a channel which
 delivers visibility-impact for the University (or not). Similarly for the
 repository.
 --
 Les Carr
 

-- 

Isidro F. Aguillo
Laboratorio de Cibermetría
Cybermetrics Lab
CCHS - CSIC
Joaquin Costa, 22
28002 Madrid. Spain

isidro @ cindoc.csic.es
+34-91-5635482 ext 313