Hello world,

Years ago, I wrote MARC::Template (https://github.com/eiro/MARC-Template)
to ease the process of migrating data to the Koha ILS. We often use it at
BibLibre (http://biblibre.com).

For a MARC-to-MARC migration that just makes CRUD manipulations on
fields, cleans some data and moves some fields around, an API is an awful
waste of time:
learning to manipulate plain Perl structures is much more efficient imho.
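
To make that concrete, here is a minimal (untested) sketch of the kind of
cleanup I have in mind, assuming records are stored as the nested arrays
proposed at the end of this mail (the 9xx rule is just a made-up example):

use strict;
use warnings;

# tag of a field, whatever its kind: a control field starts with the
# tag itself, a data field with an arrayref [tag, ind1, ind2]
sub tag { my ($f) = @_; ref $f->[0] ? $f->[0][0] : $f->[0] }

# hypothetical cleanup rule: drop every local 9xx field of a record
sub drop_local_fields {
    my ($record) = @_;
    @$record = grep { tag($_) !~ /^9/ } @$record;
    return $record;
}

No API to learn: just grep, map and friends.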

For a more complex migration mixing data coming from multiple data sources and
multiple formats, or even for a migration from MARC to a modern
biblio format, we're convinced that the job can be done better and faster
by adding a level of abstraction over MARC. What I mean by abstraction is
that the business programmer, like the librarian, doesn't care about the
999$x field: they care about authors, titles, year of edition ... That
can be partially done with YAML-driven Moose metaprogramming.
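
As a rough sketch of what I mean (the mapping, the class name and the field
coordinates below are only placeholders), something like:

use strict;
use warnings;
use Moose ();          # load Moose without importing its sugar here
use YAML qw(Load);

# hypothetical mapping between business names and MARC coordinates
my $mapping = Load(<<'YAML');
author: 200$f
title:  200$a
year:   210$d
YAML

# build a Moose class whose attributes are the business names, the MARC
# coordinates being a mere implementation detail of the loader
my $meta = Moose::Meta::Class->create(
    'Biblio::Record',
    superclasses => ['Moose::Object'],
);
$meta->add_attribute( $_, is => 'rw', isa => 'Str' ) for keys %$mapping;

# the business programmer (or the librarian) now talks about authors,
# not about tags and subfields
my $biblio = Biblio::Record->new( author => 'someone', year => '2012' );
print $biblio->author, "\n";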

Actually, I personally think it would be possible to write a complete GUI
driven ETL able to deal with MARC.

At the very end of the process, we turn everything into MARC::Record
objects to reuse the MARC::Record serialization, but Frederic's lib could
be a good output for ours as well. So is there a chance to specify a
library-agnostic data structure as a bridge between all your libs, a kind
of PSGI for MARC, so that everyone could import from and export to this
format and we could easily mix all of them?

A simple proposition is:

[ [qw/ 001 value  /] # example of control field
, [qw/ 005 value  /] # example of control field
, [ [qw/ 200 0 1  /] # example of data field
  , [ [qw/ a foo  /]
    , [qw/ b bar  /]
    , [qw/ a foo2 /]
    , [qw/ b bar2 /]
    ]
  ]
, [ [qw/ 200 0 1  /] # example of data field
  , [ [qw/ a foo  /]
    , [qw/ b bar  /]
    , [qw/ a foo2 /]
    , [qw/ b bar2 /]
    ]
  ]
]
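
To show that this structure could stay a good citizen of the existing
ecosystem, here is a possible (untested) bridge to MARC::Record, keeping
its serializers available at the end of the chain:

use strict;
use warnings;
use MARC::Record;
use MARC::Field;

sub to_marc_record {
    my ($fields) = @_;
    my $record = MARC::Record->new;
    for my $field (@$fields) {
        my ($head, $subfields) = @$field;
        if (ref $head) {    # data field: [ [tag, ind1, ind2], [subfields] ]
            $record->append_fields(
                MARC::Field->new( @$head, map { @$_ } @$subfields )
            );
        }
        else {              # control field: [tag, value]
            $record->append_fields( MARC::Field->new(@$field) );
        }
    }
    return $record;
}

Every lib providing its own import/export pair against this structure
could then be freely mixed with the others.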

regards,
-- 
Marc Chantreux
BibLibre, expert in free software for information and documentation
http://biblibre.com

