Marc::XML with MARC21

2010-01-25 Thread Michele Pinassi
Hi all, I'm working on a Perl plugin for EPrints that lets users import from Aleph simply by supplying the system id. It uses Aleph's OAI-PMH service, which exports metadata in MARC21 format: OAI-PMH xsi:schemaLocation=http://www.openarchives.org/OAI/2.0/
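A minimal sketch of that harvest step might look like the following; the endpoint, identifier format, and metadataPrefix are assumptions to adapt to the local Aleph installation.

--- snippet ---
#!/usr/bin/env perl
# Fetch one record from an Aleph OAI-PMH endpoint by system id.
# The base URL and identifier scheme are hypothetical placeholders.
use strict;
use warnings;
use LWP::Simple qw(get);

my $base  = 'http://aleph.example.edu/OAI';   # hypothetical endpoint
my $sysid = '000123456';                      # Aleph system id
my $url   = "$base?verb=GetRecord&metadataPrefix=marc21"
          . "&identifier=oai:aleph.example.edu:$sysid";

my $xml = get($url) or die "OAI-PMH request failed: $url\n";
print $xml;  # MARC21 XML inside the OAI-PMH envelope
--- end snippet ---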

Re: Marc::XML with MARC21

2010-01-25 Thread Jon Gorman
my $file = MARC::Record->new_from_xml($marc->serialize(), 'UTF-8', 'MARC21');        $epdata = $plugin->EPrints::Plugin::Import::MARC::convert_input( $file ); and here come the troubles: only a few metadata fields are interpreted correctly, and a lot of data is lost. Ummm, so what metadata makes it through? I
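A standalone check of the parsing step, outside EPrints, helps show where the fields disappear. A minimal sketch, assuming the MARCXML arrives as a string on stdin (MARC::File::XML must be loaded for new_from_xml to be available):

--- snippet ---
#!/usr/bin/env perl
# Parse a single MARCXML record and dump every field, to see which
# data actually survives the MARC::Record round trip.
use strict;
use warnings;
use MARC::Record;
use MARC::File::XML ( BinaryEncoding => 'utf8' );

my $xml    = do { local $/; <STDIN> };  # slurp one MARCXML record
my $record = MARC::Record->new_from_xml( $xml, 'UTF-8', 'MARC21' );
print $record->as_formatted(), "\n";    # human-readable field dump
--- end snippet ---

If the dump here is complete, the loss is happening in the EPrints conversion step rather than in the XML parsing.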

Splitting a large file of MARC records into smaller files

2010-01-25 Thread Nolte, Jennifer
Hello- I am working with files of MARC records that are over a million records each. I'd like to split them down into smaller chunks, preferably using a command line. MARCedit works, but is slow and made for the desktop. I've looked around and haven't found anything truly useful- Endeavor's

Re: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Emmanuel Di Pretoro
Hi, A long time ago I wrote the following: --- snippet --- #!/usr/bin/env perl use strict; use warnings; use MARC::File::USMARC; use MARC::Record; use Getopt::Long; my $config = { output => 'input' }; GetOptions($config, 'input=s', 'chunk=s', 'output=s', 'max=s'); if (not exists
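The preview cuts off mid-script; a complete splitter along the same lines, with the option names taken from the snippet where their purpose is clear and the rest reconstructed as an assumption, might look like:

--- snippet ---
#!/usr/bin/env perl
# Split a large USMARC file into chunks of --chunk records each.
# Reconstructed sketch: only the option names come from the original snippet.
use strict;
use warnings;
use MARC::File::USMARC;
use Getopt::Long;

my $config = { chunk => 1000, output => 'chunk' };
GetOptions( $config, 'input=s', 'chunk=i', 'output=s' );
die "usage: $0 --input file.mrc [--chunk N] [--output prefix]\n"
    unless exists $config->{input};

my $file = MARC::File::USMARC->in( $config->{input} )
    or die "Cannot open $config->{input}\n";

my ( $count, $part, $out ) = ( 0, 0, undef );
while ( my $record = $file->next() ) {
    if ( $count % $config->{chunk} == 0 ) {   # start a new chunk file
        close $out if $out;
        my $name = sprintf '%s_%04d.mrc', $config->{output}, ++$part;
        open $out, '>:raw', $name or die "Cannot write $name: $!\n";
    }
    print {$out} $record->as_usmarc();        # raw ISO 2709 output
    $count++;
}
close $out if $out;
$file->close();
--- end snippet ---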

RE: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Smith,Devon
This isn't a Perl solution, but it may work for you. You can use the Unix split command to split a file into several other files with the same number of lines each. For that to work, you'll first have to use tr to convert the ^] record terminators into newlines. Then use tr to convert them all
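A Perl rendering of the same idea, sketched under the assumption that each record ends with the 0x1D terminator (^]) and contains no stray terminators:

--- snippet ---
#!/usr/bin/env perl
# Same idea as tr + split: treat the 0x1D record terminator as the input
# record separator and write a fixed number of records per output file.
use strict;
use warnings;

my ( $input, $per_chunk, $prefix ) = ( $ARGV[0], 10_000, 'chunk' );
die "usage: $0 file.mrc\n" unless defined $input;

open my $in, '<:raw', $input or die "Cannot open $input: $!\n";
local $/ = "\x1D";                 # MARC record terminator (^])

my ( $count, $part, $out ) = ( 0, 0, undef );
while ( my $record = <$in> ) {
    if ( $count % $per_chunk == 0 ) {
        close $out if $out;
        open $out, '>:raw', sprintf( '%s_%04d.mrc', $prefix, ++$part )
            or die "Cannot write chunk: $!\n";
    }
    print {$out} $record;          # readline keeps the terminator
    $count++;
}
close $out if $out;
close $in;
--- end snippet ---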

Re: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Ashley Sanders
Jennifer, I am working with files of MARC records that are over a million records each. I'd like to split them down into smaller chunks, preferably using a command line. MARCedit works, but is slow and made for the desktop. I've looked around and haven't found anything truly useful-

RE: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Walker, David
yaz-marcdump allows you to break a MARC file into chunks of X records. +1 --Dave == David Walker Library Web Services Manager California State University http://xerxes.calstate.edu From: Colin Campbell [colin.campb...@ptfs-europe.com]

Re: Marc::XML with MARC21

2010-01-25 Thread Ed Summers
Hi Michele: I copied and pasted the XML from your email and ran it through a simple test script (both attached), and the record seemed to be parsed OK. What do you see if you run the attached test.pl? //Ed <marc:record xmlns:marc="http://www.loc.gov/MARC21/slim"

Re: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Sébastien Hinderer
Hi, The yaz-marcdump utility may be what you are looking for. See for instance options -s and -C. hth, Shérab.
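As a usage sketch (the -s and -C flags are as named above; -o marc keeps the chunks in ISO 2709, and the file name and chunk size are examples; exact output naming can vary by yaz version):

--- snippet ---
# Split big.mrc into chunk0000000, chunk0000001, ... of 10000 records each
yaz-marcdump -o marc -s chunk -C 10000 big.mrc > /dev/null
--- end snippet ---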

Re: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Robert Fox
Assuming that memory won't be an issue, you could use MARC::Batch to read in the record set and print out separate files, splitting on every X records. You would have an iterative loop loading each record from the large batch, and a counter variable that would get reset after X
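A sketch of that loop, assuming raw USMARC input; the chunk size and output names are illustrative:

--- snippet ---
#!/usr/bin/env perl
# MARC::Batch splitter: iterate over the large batch, reset the counter
# and open a fresh output file after every $chunk_size records.
use strict;
use warnings;
use MARC::Batch;

my ( $input, $chunk_size ) = ( $ARGV[0], 10_000 );
die "usage: $0 file.mrc\n" unless defined $input;

my $batch = MARC::Batch->new( 'USMARC', $input );

my ( $counter, $part, $out ) = ( 0, 0, undef );
while ( my $record = $batch->next() ) {
    if ( $counter == 0 ) {                      # start a new output file
        close $out if $out;
        open $out, '>:raw', sprintf( 'part_%04d.mrc', ++$part )
            or die "Cannot open output: $!\n";
    }
    print {$out} $record->as_usmarc();
    $counter = ( $counter + 1 ) % $chunk_size;  # resets after X records
}
close $out if $out;
--- end snippet ---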

Re: Splitting a large file of MARC records into smaller files

2010-01-25 Thread Saiful Amin
I also recommend using MARC::Batch. Attached is a simple script I wrote for myself. Saiful Amin +91-9343826438 On Mon, Jan 25, 2010 at 8:33 PM, Robert Fox rf...@nd.edu wrote: Assuming that memory won't be an issue, you could use MARC::Batch to read in the record set and print out separate