Hi all,
We just tried MetaModel with MongoDB and ran a simple join (as in
SELECT * FROM t1 JOIN t2 ON (t1.id = t2.oid)) between two collections,
each containing roughly 10,000 documents. Using a developer setup on a
Mac, we did not get a result, as the system was more or less stuck.
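
For reference, the query was roughly of the following shape. This is
only a minimal sketch, not our actual code: the MongoDbDataContext
setup is omitted, the class and method names are placeholders, and I am
assuming DataContext.executeQuery(String) for the SQL-like query
string.

import org.apache.metamodel.DataContext;
import org.apache.metamodel.data.DataSet;
import org.apache.metamodel.data.Row;

public class MongoJoinExample {

    // dataContext is assumed to point at the database holding the two
    // collections t1 and t2 (e.g. a MongoDbDataContext; setup omitted).
    public static void runJoin(DataContext dataContext) {
        DataSet dataSet = dataContext.executeQuery(
                "SELECT * FROM t1 INNER JOIN t2 ON t1.id = t2.oid");
        try {
            while (dataSet.next()) {
                Row row = dataSet.getRow();
                System.out.println(row);
            }
        } finally {
            dataSet.close();
        }
    }
}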
A quick examination revealed that
MetaModelHelper.getCarthesianProduct(DataSet[] fromDataSets,
Iterable<FilterItem> whereItems) consumes most of the resources.
This implementation first computes the Cartesian product in memory and
then applies the filters to it.
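
To illustrate why that does not scale (a simplified sketch in plain
Java, not MetaModel's actual code): with 10,000 rows on each side,
about 100,000,000 combined rows are materialized before the join
predicate is evaluated even once.

import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Simplified illustration of a "materialize first, filter later" join.
public class CartesianJoinSketch {

    public static List<Object[]> join(List<Object[]> left,
                                      List<Object[]> right,
                                      Predicate<Object[]> joinCondition) {
        // Step 1: build the full Cartesian product in memory.
        List<Object[]> product = new ArrayList<>();
        for (Object[] l : left) {
            for (Object[] r : right) {
                Object[] combined = new Object[l.length + r.length];
                System.arraycopy(l, 0, combined, 0, l.length);
                System.arraycopy(r, 0, combined, l.length, r.length);
                product.add(combined);
            }
        }
        // Step 2: only now apply the join condition (t1.id = t2.oid).
        List<Object[]> result = new ArrayList<>();
        for (Object[] row : product) {
            if (joinCondition.test(row)) {
                result.add(row);
            }
        }
        return result;
    }
}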
I wonder what the rationale behind this implementation is, as it will
not scale well, even for selective joins.
Or am I using MetaModel wrong here, in the sense that the join should
never end up being computed by getCarthesianProduct()?
The problem appears to me to be a general one; the snippets above are
only illustrative, so I did not include our actual code.
Best,
Jörg