Along this line I’ve been struggling to find an authoritative data
dictionary for all of ASpace’s model classes and all the possible endpoint
parameters. The API documentation provides much of this, but seemingly not all,
especially mixins.
For example: I would like an application developer to mo
We use this field to record opaque (e.g. Voyager ILS bib record unique ID),
semantic (accession number/call number + sequential numbers), and mixed
identifiers (opaque ID + box/folder numbers + sequential numbers).
We generally try to prescribe that the Voyager ID be a component for all new
pro
This is fantastic. The EAD tag library springs to mind as a useful abstract
model, but ASpace would benefit from the additional implementation details.
My list of elements would include:
- Field description
- Allowable attributes
- Where fields/mixins are stored internally
- Data type and/or range restri
How about adding to the core a separate URI field for the full string?
John P. Rees
Archivist and Digital Resources Manager
History of Medicine Division
National Library of Medicine
301-827-4510
From: Brian Thomas [mailto:btho...@tsl.texas.gov]
Sent: Monday, January 08, 2018 1:02 PM
To: 'Archi
Has anyone tried writing a script to import batches of MODS XML to create
Digital Object records? I haven't found much on the interwebs.
Is it folly to attempt such a thing?
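For what it's worth, a script along these lines seems feasible with just the standard library: parse each MODS file, map the fields you care about onto ASpace's digital_object JSONModel, and POST the result to the backend. A minimal sketch follows; the sample MODS record and the two-field mapping are illustrative only, not a complete crosswalk:

```python
import json
import xml.etree.ElementTree as ET

MODS_NS = {"mods": "http://www.loc.gov/mods/v3"}

def mods_to_digital_object(mods_xml):
    """Map one MODS record to a minimal ASpace digital_object JSON."""
    root = ET.fromstring(mods_xml)
    title = root.findtext("mods:titleInfo/mods:title", namespaces=MODS_NS)
    identifier = root.findtext("mods:identifier", namespaces=MODS_NS)
    return {
        "jsonmodel_type": "digital_object",
        "title": title,
        "digital_object_id": identifier,
    }

sample = """<mods xmlns="http://www.loc.gov/mods/v3">
  <titleInfo><title>Sample article</title></titleInfo>
  <identifier>nlm:12345</identifier>
</mods>"""

record = mods_to_digital_object(sample)
print(json.dumps(record))
# Each record would then be POSTed to /repositories/:repo_id/digital_objects,
# with an X-ArchivesSpace-Session header obtained from /users/:user/login.
```

Batch-looping over a directory of MODS files and handling the session token is left out here, but that is the whole of the idea.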
John
We’ve been doing this for several years as an end-of-year task. We make brief
collection level MARC records first, then grab them all through MarcEdit and
spin them through the MARCXML-EAD conversion stylesheet to get EAD for our DLXS.
You can see them at
https://oculus.nlm.nih.gov/cgi/f/findai
We're migrating some journal article metadata to ASpace and are seeking
description advice.
The original metadata schema nicely records articleTitle, journalTitle, issue,
pagination in distinct elements. ASpace doesn't specifically offer the same
granularity in either Resource or Digital Object
We also have splunkd running across all our local PCs, so my local C: drive
aspace v3.3.1 runs on port 8087 (or any other port not listed in the config) to
get around the "cannot bind" error.
#
## The ArchivesSpace backend listens on port 8089 by default. You can set it to
## something else below.
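For anyone following along, the corresponding setting lives in config/config.rb; a change like this moves the backend off the default (8087 here only as an example of an otherwise unused port):

```ruby
# config/config.rb -- move the backend off the default port 8089
AppConfig[:backend_url] = "http://localhost:8087"
```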
Using the background job importer, I'm trying to import extent data from EAD
2002 schema XML, specifically the other extent data we record in parentheses,
like 15.0 linear feet
like 15.0 linear feet
(36 boxes + oversize folder)
I'd like these parenthetical statements to map to the Container Summary field.
Accordin
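As a pre-processing step before import, splitting the parenthetical off the extent string is straightforward. A sketch, where the function name and regex are mine and assume at most one trailing parenthetical:

```python
import re

def split_extent(statement):
    """Split '15.0 linear feet (36 boxes + oversize folder)' into the
    number/type portion and the parenthetical container summary."""
    match = re.match(r"^\s*(.+?)\s*\(([^)]*)\)\s*$", statement)
    if not match:
        return statement.strip(), None  # no parenthetical present
    return match.group(1), match.group(2)

extent, summary = split_extent("15.0 linear feet (36 boxes + oversize folder)")
print(extent)   # 15.0 linear feet
print(summary)  # 36 boxes + oversize folder
```

The second value could then be written into the extent subrecord's container_summary field, either in the source EAD before import or via the API afterward.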
Wow, this is super helpful.
We’ve been noodling on a similar use case. All our existing repository projects
leverage our MARC 035 field ids (Voyager ILS-supplied) to mint ids/Fedora PIDs,
but now we’re embarking on ASpace projects that don’t always have a Voyager
record, or have ID minting prac
We’ve been toying with the idea of using Events to record digitization and
metadata review/approval status information for a large data
migration/re-architecture project as well as record these elements going
forward. The customer’s old system tracked project and item status information,
metada
Lydia,
ArcLight completed an MVP work cycle last summer, and a few institutions have
done some local work since then, notably the Bentley and U. Albany. Mark
Matienzo is the Stanford POC.
You can take a look at the ArcLight Community Google Group, or request to join
the Slack channel.
G Group:
Hi all,
I administer a finding aids aggregation service that in part scrapes
HTML-source code as a data input and I am looking for some advice/start a
conversation.
Several of our contributing repositories with this data type moved to
ArchivesSpace in 2018 and we are not able to crawl ASpace's
sitemaps to
configure this sort of action, i.e.
Use this other URL to index this resource.
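For anyone experimenting with the crawler side of this, pulling the loc entries out of a sitemap is a few lines of stdlib Python, after which the harvester can substitute whichever alternate URL it prefers to index. The sample sitemap and URLs below are made up:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract all <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.org/repositories/2/resources/1</loc></url>
  <url><loc>https://example.org/repositories/2/resources/2</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```

Fetching the live sitemap over HTTP and rewriting each loc to the preferred indexing URL would bolt onto this directly.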
— Steve Majewski
On Jan 4, 2019, at 9:59 AM, Rees, John (NIH/NLM) [E] wrote:
Hi all,
I administer a finding aids aggregation service that in part scrapes
HTML-sou
We’d describe the sets’ nature in the unittitle and, like others, randomly
assign volume numbers from 1-N across the entire collection.
We insert a Permalife flag in each volume bearing the call number, arbitrary
volume number, and a barcode. We do this for all our rare books holdings and
book
We're trying to post-process digital objects and link existing subjects to them
via the API; however, there doesn't seem to be a call to do this.
Are we missing something? It seems doable for linked_agents and linked_events.
Or is the preferred approach to insert subjects?
https://archivesspace.
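In case it's useful: since there doesn't appear to be a dedicated link endpoint, one common pattern is to GET the digital object's full JSON, append a ref to its subjects array, and POST the whole record back to the same URI. A sketch of the in-memory step, where the helper name is mine:

```python
def add_subject_ref(digital_object, subject_uri):
    """Append a subject ref to a digital object's JSON unless already linked."""
    refs = digital_object.setdefault("subjects", [])
    if not any(ref.get("ref") == subject_uri for ref in refs):
        refs.append({"ref": subject_uri})
    return digital_object

do_json = {"jsonmodel_type": "digital_object", "title": "Example", "subjects": []}
add_subject_ref(do_json, "/subjects/42")
print(do_json["subjects"])  # [{'ref': '/subjects/42'}]
# In practice: GET /repositories/:repo_id/digital_objects/:id, modify the
# JSON as above, then POST the full record back to the same path.
```

The duplicate check makes the helper safe to re-run over records that may already carry the link.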
Hi all,
This question isn't aspace-centric, but for those of you that have multiple
deployment strings (dev, qa, production) and that use the PUI, what is your
best practice for entering new data or massaging existing data?
1. Do you test load/edit in QA then re-run the job in production?
Is it possible to import an EAD component that comprises multiple
parent-level top containers and have ASpace create two sibling top container
instances? Currently it seems ASpace will create a single instance with one top
container for the one positioned first, then create
children/grandchi
Hey all,
We'd like to re-baseline our production data periodically, and before
performing a version upgrade.
For those that have multiple prod/qa strings, is there a best practice for
re-baselining data different than following the backup and recovery
instructions?
Does this grab things like
Hi Dawne,
I'm doing pretty much what Kate laid out to edit source EAD, for all sorts of
different container conventions. You can see my ever-growing punch list at
https://github.com/John-Rees/aspace-migrations/issues?q=is%3Aopen+is%3Aissue+milestone%3Amigrations
I use a variety of regex and xsl
Hi all,
We're on v2.8.0 and are experimenting with the new Digital Object spreadsheet
importer to append DOs to existing archival objects.
In 2.8.0, I understand that the import will fail where another DO already
exists, which is a use case I'm testing. However, using the ASpace 2.8.1 sandbox I ge
Hi all,
Our cloud implementation team is having issues setting up the AS OAuth plugin
with AWS Cognito for centralized authentication.
If anyone has experience getting the plugin to work in this environment we'd
appreciate connecting for a conversation.
Thanks,
John
Hi all,
Experimenting with the human-readable URLs options, I notice that with our
numerical EADIDs, two underscores are inserted as a prefix for Resources, but
not for Archival Objects. I see the same behavior with autogenerate slugs on or
off and when running the background job.
This beh
Hi Sarah,
My path (pre-covid) used MarcEdit:
1. grab MARCXML via Z39.50
2. use MarcEdit's MARC-EAD transform XSL (w/ local edits)
3. import EAD into aspace
4. massage/publish agents, subjects, etc.
We do this annually for adding accession records as resources (we don't use the
aspace
Hi all,
We recently tried a hard re-index and discovered that for the PUI, subjects
linked only to digital objects were not re-indexed and dropped from the PUI. Is
this expected behavior?
Related, is there a table or tables in the database that tell Solr when to
index, that we could investigate, as o
Thanks. They weren't deleted records, but published-then-suppressed records.
Having once been published, their subject terms were published and then
persisted as ghost terms in the PUI with no linked records, until either the
subject term was touched (when we ran the full re-index) or the published st