So I have tables like this:

Users
UserSales
UserHistory
UserAddresses
UserNotes
ClientAddress
CalenderEvent
Articles
Blogs

Just seems odd to me, jamming all these tables into a single index.  But I
guess the idea of using a 'type' field to qualify exactly what I am
searching for is a good idea, in case I need to filter for only 'articles'
or blogs or contacts etc.

But there might be 50 fields if I do this, no?
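Something like this is what I'm picturing, as a rough sketch (it assumes the
pysolr client and a local core; the URL and field names such as 'title' and
'city' are made up just for illustration). Each document carries only the
fields its source table actually has, plus the 'type' field, and a filter
query narrows results to one type:

    # Sketch: rows from different tables indexed into one Solr core, each
    # tagged with a 'type' field. Assumes pysolr and a local core; field
    # names are illustrative only.
    import pysolr

    solr = pysolr.Solr("http://localhost:8983/solr/mycore", timeout=10)

    docs = [
        # An article document only sets the fields it actually has...
        {"id": "article-42", "type": "article",
         "title": "Solr schema design", "body": "..."},
        # ...while an address document sets a different subset. Fields that
        # are not set simply do not exist in the Lucene index for that doc.
        {"id": "address-7", "type": "address",
         "city": "Berlin", "street": "Hauptstr. 1"},
    ]
    solr.add(docs, commit=True)

    # Search across everything, or narrow to one type with a filter query.
    all_hits = solr.search("solr")
    articles_only = solr.search("solr", fq="type:article")

So even with 50 optional fields declared in the schema, a given document only
pays for the handful it actually uses.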



On Fri, Jul 30, 2010 at 4:01 AM, Chantal Ackermann <
chantal.ackerm...@btelligent.de> wrote:

> Hi Ahmed,
>
> fields that are empty do not impact the index. It's different from a
> database.
> I have text fields for different languages, and per document only one of
> the languages is ever set (the text fields for the other languages are
> empty/not set). It all works very well, and fast.
>
> I wonder more about what you describe as "unrelated data" - why would
> you want to put unrelated data into a single index? If you want to
> search on all the data and return mixed results there surely must be
> some kind of relation between the documents?
>
> Chantal
>
> On Thu, 2010-07-29 at 21:33 +0200, S Ahmed wrote:
> > I understand (and it's straightforward) when you want to create an index
> > for something simple like Products.
> >
> > But how do you go about creating a Solr index when you have data coming
> > from 10-15 database tables, and the tables have unrelated data?
> >
> > The issue is that you would then have many 'columns' in your index, and
> > they will be NULL for much of the data, since you are trying to shove 15
> > db tables into a single Solr/Lucene index.
> >
> >
> > This must be a common problem, what are the potential solutions?
>
