For closure, I’ve solved the problem! Solr was not using my schema.xml at all.
I had to change solrconfig.xml to use the classic schema factory and comment
out the schema-adding update processor. My schema still didn’t work right, so I
took the managed-schema file, renamed it, and changed uniqueKey to uuid, among
other changes.
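For anyone hitting the same thing, a minimal sketch of what that solrconfig.xml change typically looks like. This assumes the standard Solr component names (ClassicIndexSchemaFactory and the stock add-unknown-fields update chain); the thread itself doesn't name them:

```xml
<!-- Read schema.xml instead of the mutable managed-schema -->
<schemaFactory class="ClassicIndexSchemaFactory"/>

<!-- Comment out the update processor chain that adds unknown fields
     to the managed schema on the fly:
<updateRequestProcessorChain name="add-unknown-fields-to-the-schema">
  ...
</updateRequestProcessorChain>
-->
```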
Yes, think of the starving orphan records…
Ours is an eCommerce system, selling mostly shoes. We have three levels of
nested objects representing what we sell:
- Product: Mostly title and description
- Item: A specific color and some other attributes, including price. Products
have 1 or more
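To make the nesting concrete, here is a hypothetical sketch of one such tree as a Solr nested document. `_childDocuments_` is Solr's JSON key for child documents; every field name here other than docType is an illustrative assumption, not taken from the thread:

```python
# Hypothetical Product -> Item -> Sku tree shaped as a Solr nested document.
# Field names besides "docType" are assumptions for illustration only.
product = {
    "uuid": "product-100",
    "docType": "Product",
    "title": "Trail Runner",
    "_childDocuments_": [
        {
            "uuid": "item-200",
            "docType": "Item",
            "color": "blue",
            "price": 79.99,
            "_childDocuments_": [
                {"uuid": "sku-300", "docType": "Sku", "size": "10"},
            ],
        },
    ],
}

# The Sku sits two levels below the Product.
print(product["_childDocuments_"][0]["_childDocuments_"][0]["docType"])  # → Sku
```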
David,
I don’t quite follow how your IDs are assigned, but beware that repeating a
uniqueKey value causes the former occurrence to be deleted. In a block join
index this corrupts the block structure: the parent can’t be deleted and
leaves its children orphans (.. so touching, I’m sorry). Just make sure that
the number of
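To illustrate the failure mode Mikhail describes: if a batch repeats a uniqueKey value, Solr silently replaces (deletes) the earlier document, which breaks block-join parent/child structure. A simple pre-index duplicate check avoids it. The helper below is a hypothetical sketch, not part of Solr:

```python
from collections import Counter

def find_duplicate_keys(docs, key="uuid"):
    """Return uniqueKey values that appear more than once in a batch.

    Sending a batch with repeated uniqueKey values makes Solr delete the
    earlier occurrence, which can orphan children in a block-join index,
    so it's worth checking before indexing.
    """
    counts = Counter(d[key] for d in docs)
    return sorted(k for k, n in counts.items() if n > 1)

docs = [{"uuid": "item-1"}, {"uuid": "item-2"}, {"uuid": "item-1"}]
print(find_duplicate_keys(docs))  # → ['item-1']
```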
Thanks for responding, Mikhail. There are no deleted documents. Since I’m
fairly new to Solr, one of the things I’ve been paranoid about is that I have
no way of validating my schema.xml, or of knowing whether Solr is even using it
(I have evidence it’s not; more below). So for each test, I’ve wiped
David,
Can you make sure your index doesn't have deleted docs? This can be seen
in the Solr Admin UI.
And can you merge the index to avoid having them in it?
On Thu, Feb 2, 2017 at 12:29 AM, David Kramer wrote:
Some background:
· The data involved is catalog data, with three nested objects:
Products, Items, and Skus, in that order. We have a docType field on each
record as a differentiator.
· The "id" field in our data is unique within a datatype, but not across
datatypes. We added a
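Since an earlier message in the thread mentions switching uniqueKey to a uuid field, one common way to make per-datatype ids globally unique is to prefix them with the docType. This is a hypothetical sketch of that approach, not necessarily what was actually added:

```python
def make_unique_key(doc_type, local_id):
    """Build a globally unique key from a per-datatype id.

    "id" is only unique within a docType (Product, Item, or Sku), so
    combining the two yields a value that is safe to use as Solr's
    uniqueKey across all three document types.
    """
    return f"{doc_type}-{local_id}"

print(make_unique_key("Product", 42))  # → Product-42
print(make_unique_key("Item", 42))     # → Item-42 (no collision with the Product)
```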