Could you please tell me if you have any experience using http://netty.io/ with
Perl?
Thank you in advance
Greetings,
do you have a suggestion or have you had any experience with a digital
management system?
Thank you in advance,
(only open source suggestions please)
And my biggest question is how, with such a template, one could import the
logic, the rules behind a format. For instance, how would you handle a query
where the format says: if the date of publication is not stored in, e.g., the
--1 field, then get it from the --2 field...
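That fallback rule can be sketched as a simple lookup. This is only an illustration, assuming records are plain mappings; the --1/--2 tags are placeholders, not real UNIMARC tags:

```python
def publication_date(record, primary="--1", fallback="--2"):
    """Return the publication date from the primary field;
    if it is not stored there, get it from the fallback field.
    The tag names here are placeholders, not real UNIMARC tags."""
    return record.get(primary) or record.get(fallback)

print(publication_date({"--1": "2001", "--2": "1999"}))  # taken from --1
print(publication_date({"--2": "1999"}))                 # falls back to --2
```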
Thank you both for
On Tue, Sep 24, 2013 at 09:29:58AM +0100, dasos ili wrote:
for storage in mongodb and elasticsearch
then don't convert anything and just store the xml content as a value
(less code, less transformation, less computation, so more speed and
more reliability):
use IO::All;
use MongoDB;
my $xml = io
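The same store-it-raw idea, sketched in Python. The document shape, identifier scheme, and collection names are assumptions, and the actual MongoDB insert is shown only as a comment:

```python
# Build a document that stores the raw XML untouched, as suggested above:
# no parsing, no transformation, just the string plus minimal metadata.
xml_content = "<record><field tag='100'>data</field></record>"

doc = {
    "_id": "record-001",   # assumed identifier scheme
    "xml": xml_content,    # the whole record, stored as-is
}

# With pymongo, this document would be inserted unchanged, e.g.:
#   MongoClient().mydb.records.insert_one(doc)
print(doc["xml"])
```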
@LISTSERV.ND.EDU
Sent: 11:59 a.m. Tuesday, September 24, 2013
Subject: Re: [CODE4LIB] Re: [CODE4LIB] Perl module to transform XSL to JSON
On Tue, Sep 24, 2013 at 09:29:58AM +0100, dasos ili wrote:
for storage in mongodb and elasticsearch
then don't convert anything and just store the xml content
thank you very much for your help
From: Marc Chantreux m...@unistra.fr
To: CODE4LIB@LISTSERV.ND.EDU
Sent: 2:35 p.m. Tuesday, September 24, 2013
Subject: Re: [CODE4LIB] A Proposal to serialize MARC in JSON
On Tue, Sep 24, 2013 at 06:39:14AM -0400, Riley
My initial problem, though, with the marc-in-json approach is the complexity of
the JSON; I am looking for a simpler model so that my queries, in ES for
example, are simpler to implement.
If anyone has any examples of how to make use of this marc-in-json output in
order to use ES,
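One way to get a simpler model is to flatten the marc-in-json structure into plain tag/subfield keys before indexing. The input layout below follows the marc-in-json proposal (fields as a list of single-key dicts); the flattened shape itself is just one assumption, not a model recommended by the list:

```python
# Sketch: flatten a marc-in-json record into a simpler key/value shape
# that is easier to query in Elasticsearch.
def flatten(record):
    flat = {"leader": record.get("leader")}
    for field in record.get("fields", []):
        for tag, value in field.items():
            if isinstance(value, dict):  # data field with subfields
                for sub in value.get("subfields", []):
                    for code, text in sub.items():
                        flat.setdefault(f"{tag}{code}", []).append(text)
            else:                        # control field
                flat.setdefault(tag, []).append(value)
    return flat

record = {
    "leader": "00000nam a2200000 a 4500",
    "fields": [
        {"001": "12345"},
        {"245": {"ind1": "1", "ind2": "0",
                 "subfields": [{"a": "Example title"}]}},
    ],
}
print(flatten(record)["245a"])  # ['Example title']
```

Indicators are dropped here for brevity; keeping them would mean a slightly richer key scheme.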
Greetings,
could you please suggest a tool to transform an XSL file I have managed to
get from XML into JSON?
Thank you
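If the file to convert is actually the XML itself (the thread's subject mixes XSL and XML), a minimal standard-library-only sketch of an XML-to-JSON conversion looks like this; the dict layout (`#text` for element text, lists for repeated children) is an assumption, not a standard mapping:

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an ElementTree element to a plain dict."""
    node = dict(elem.attrib)
    text = (elem.text or "").strip()
    if text:
        node["#text"] = text
    for child in elem:
        node.setdefault(child.tag, []).append(element_to_dict(child))
    return node

xml_doc = "<record><field tag='100'>data</field></record>"
root = ET.fromstring(xml_doc)
print(json.dumps({root.tag: element_to_dict(root)}))
```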
It has been exactly three years, and no real progress has been made on this
proposal to serialize MARC in JSON:
http://dilettantes.code4lib.org/blog/2010/09/a-proposal-to-serialize-marc-in-json/
Meanwhile, new tools for searching and retrieving records have appeared, such as
Solr and
Could you please give us any suggestions on a data model example for a MARC
record? The goal is to store it in MongoDB in an efficient way, so as to get
results with the appropriate queries.
thank you in advance
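As a starting point, here is one possible document shape for a MARC record in MongoDB. It is a sketch under stated assumptions (control fields as plain strings, data fields grouped by tag with indicators and subfields), not a model endorsed by the list:

```python
# A candidate MongoDB document shape for one MARC record.
marc_doc = {
    "leader": "00000nam a2200000 a 4500",
    "control": {"001": "12345", "008": "130924s2013"},
    "fields": [
        {"tag": "245", "ind": "10",
         "subfields": {"a": "Example title", "c": "An Author"}},
    ],
}

# Queries then stay simple, e.g. find records by title subfield:
#   db.records.find({"fields": {"$elemMatch":
#       {"tag": "245", "subfields.a": "Example title"}}})
print(marc_doc["fields"][0]["subfields"]["a"])
```

A caveat: repeated subfield codes would need lists instead of a flat dict.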
thank you all for your quick replies
...@princeton.edu
On 07/05/2013 05:47 AM, dasos ili wrote:
Could you please give us any suggestions on a data model example for a MARC
record? The goal is to store it in MongoDB in an efficient way, so as to get
results with the appropriate queries.
thank you in advance
Dear all,
we are planning to import our data, currently in UNIMARC format (though we
would also appreciate input for MARC records), into a database. We have made a
transformation to get a summary of the data; the output is an XML file, which
we would then like to store in a database.
Dear all,
thank you for this very important information. We are now seeking statistics
on the quality of UNIMARC (or MARC) records; we would really appreciate it if
someone who has done something similar could give us a hint on what tools
they used.
By quality of bibliographic records, we
Greetings,
I would appreciate it if you could provide us with feedback on the way you
have adopted Elasticsearch and how you have used it with MARC records. I have
already read all the previous answers; if someone has something more detailed
(technically), it would be very helpful. I have
Greetings,
I would appreciate it if you could tell me whether there is any research,
output, or program that produces a statistical analysis of the LEADER, all
data fields, all subfields, and control fields. I would also appreciate your
input on ideas for what should be analysed, what would be
-v0.0.4/bin/marcstats.pl can
be useful, but generates no stats for leader or controlfields.
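The missing LEADER statistics can be sketched fairly simply once leaders are in hand. Here leaders are modelled as plain 24-character strings; reading them out of real MARC files (e.g. with a MARC parsing library) is assumed done and left out:

```python
from collections import Counter

# Sketch: frequency of values at each LEADER position across records.
def leader_stats(leaders):
    stats = {pos: Counter() for pos in range(24)}
    for leader in leaders:
        for pos, char in enumerate(leader):
            stats[pos][char] += 1
    return stats

leaders = [
    "00714cam a2200205 a 4500",
    "00612nam a2200193 a 4500",
]
stats = leader_stats(leaders)
print(stats[5])  # e.g. position 5 is the record status byte
```

The same per-position counting extends naturally to fixed-length control fields such as 008.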
Stefano
On 20/mag/2013, at 12.55, dasos ili wrote:
Greetings,
i would appreciate it if you could please tell me if there is any research,
or output, or any program that gets statistical analysis
Greetings dear all,
Let's say we have a file with many records in UNIMARC format, or MARC; it is
almost the same.
Could you please send me a hint on how to get all coded data values, for
example for the fields 100, 101, 102? For instance, we would like to have the
info: 100 appears 387 times, $a
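Counting those occurrences is straightforward once the records are parsed. In this sketch, records are modelled as simple (tag, {code: value}) tuples; parsing real UNIMARC/MARC files with a MARC library is assumed done beforehand:

```python
from collections import Counter

# Sketch: count how often each field tag and each subfield code appears.
def field_counts(records, tags=("100", "101", "102")):
    fields, subfields = Counter(), Counter()
    for record in records:
        for tag, subs in record:
            if tag in tags:
                fields[tag] += 1
                for code in subs:
                    subfields[(tag, code)] += 1
    return fields, subfields

records = [
    [("100", {"a": "19990101d1999"}), ("101", {"a": "eng"})],
    [("100", {"a": "20010101d2001"})],
]
fields, subfields = field_counts(records)
print(f"100 appears {fields['100']} times, $a {subfields[('100', 'a')]} times")
```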
. Stefano
On 16/mag/2013, at 12.01, dasos ili wrote:
Greetings dear all,
Let's say we have a file with many records in unimarc format, or Marc, it is
almost the same.
could you please send me a hint on how get all coded data values for example
for the fields 100, 101 102, for instance we
via code
Tool http://en.pusc.it/bib/MARCgrep can help you.
Bye. Stefano
On 16/mag/2013, at 12.01, dasos ili wrote:
Greetings dear all,
Let's say we have a file with many records in unimarc format, or Marc, it is
almost the same.
could you please send me a hint on how get all coded data
] How get coded data values in unimarc
format via code
Try the -e switch on MARCgrep.
-Tod
On May 16, 2013, at 5:38 AM, dasos ili dasos_...@yahoo.gr
wrote:
I suppose there isn't anything to retrieve the values of the subfields $a,
etc.
From: Stefano