Github user meiercaleb commented on a diff in the pull request:

    https://github.com/apache/incubator-rya/pull/153#discussion_r134025772
  
    --- Diff: extras/indexing/src/main/java/org/apache/rya/indexing/entity/storage/mongo/MongoEntityStorage.java ---
    @@ -242,4 +283,84 @@ private static Bson makeExplicitTypeFilter(final RyaURI typeId) {
     
             return Stream.of(dataTypeFilter, valueFilter);
         }
    +
    +    private boolean detectDuplicates(final Entity entity) throws EntityStorageException {
    +        boolean hasDuplicate = false;
    +        if (duplicateDataDetector.isDetectionEnabled()) {
    +            // Grab all entities that have all the same explicit types
    +            // as our original Entity.
    +            final List<Entity> comparisonEntities = searchHasAllExplicitTypes(entity.getExplicitTypeIds());
    +
    +            // Now that we have our set of potential duplicates, compare them.
    +            // We can stop when we find one duplicate.
    +            for (final Entity compareEntity : comparisonEntities) {
    +                try {
    +                    hasDuplicate = duplicateDataDetector.compareEntities(entity, compareEntity);
    +                } catch (final SmartUriException e) {
    +                    throw new EntityStorageException("Encountered an error while comparing entities.", e);
    +                }
    +                if (hasDuplicate) {
    +                    break;
    +                }
    +            }
    +        }
    +        return hasDuplicate;
    +    }
    +
    +    /**
    +     * Searches the Entity storage for all Entities that contain all the
    +     * specified explicit type IDs.
    +     * @param explicitTypeIds the {@link ImmutableList} of {@link RyaURI}s
    +     * that are being searched for.
    +     * @return the {@link List} of {@link Entity}s that have all the
    +     * specified explicit type IDs. If nothing was found an empty
    +     * {@link List} is returned.
    +     * @throws EntityStorageException
    +     */
    +    private List<Entity> searchHasAllExplicitTypes(final ImmutableList<RyaURI> explicitTypeIds) throws EntityStorageException {
    +        final List<Entity> hasAllExplicitTypesEntities = new ArrayList<>();
    +        if (!explicitTypeIds.isEmpty()) {
    +            // Grab the first type from the explicit type IDs.
    +            final RyaURI firstType = explicitTypeIds.get(0);
    +
    +            // Check if that type exists anywhere in storage.
    +            final List<RyaURI> subjects = new ArrayList<>();
    +            Optional<Type> type;
    +            try {
    +                if (mongoTypeStorage == null) {
    +                    mongoTypeStorage = new MongoTypeStorage(mongo, ryaInstanceName);
    +                }
    +                type = mongoTypeStorage.get(firstType);
    +            } catch (final TypeStorageException e) {
    +                throw new EntityStorageException("Unable to get entity type: " + firstType, e);
    +            }
    +            if (type.isPresent()) {
    +                // Grab the subjects for all the types we found matching "firstType"
    +                final ConvertingCursor<TypedEntity> cursor = search(Optional.empty(), type.get(), Collections.emptySet());
    --- End diff --
    
    Instead of getting all of the TypedEntities in the database with the given
    Type, you should call the method Event.makeTypedEntity(...) for each typeId.
    Then use the Type and Property map of each TypedEntity to query the DB. This
    will provide a more constrained query that uses the actual property values.
    Finally, I think that you should add a compareTypedEntities method to your
    DuplicateDataDetector so that you can apply it to compare the returned
    TypedEntities with the TypedEntity that you created from the original Entity.
    This eliminates the need to re-query the DB to get the Entities that each
    TypedEntity is derived from.
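    The suggested flow can be sketched roughly as follows. Note that every class
    and method name below is a simplified stand-in for illustration, not the
    actual Rya API: a real implementation would work with Rya's `TypedEntity`,
    query Mongo by Type and property values, and apply the detector's tolerances
    instead of plain map equality.
    
    ```java
    import java.util.List;
    import java.util.Map;
    
    public class PerTypeDuplicateCheck {
    
        // Stand-in for a TypedEntity: a single type plus its property values.
        record TypedEntity(String typeId, Map<String, String> properties) {}
    
        // Stand-in for the proposed DuplicateDataDetector.compareTypedEntities(...):
        // here two TypedEntities are duplicates when they share a type and their
        // property maps are equal (a real detector would apply tolerances).
        static boolean compareTypedEntities(final TypedEntity a, final TypedEntity b) {
            return a.typeId().equals(b.typeId()) && a.properties().equals(b.properties());
        }
    
        // Derive one TypedEntity per explicit type of the new Entity, then compare
        // each against the stored TypedEntities of that same type. Any per-type
        // match flags the Entity as a duplicate.
        static boolean hasDuplicateTypedEntity(final List<TypedEntity> derived,
                final Map<String, List<TypedEntity>> storedByType) {
            for (final TypedEntity candidate : derived) {
                for (final TypedEntity stored :
                        storedByType.getOrDefault(candidate.typeId(), List.of())) {
                    if (compareTypedEntities(candidate, stored)) {
                        return true;
                    }
                }
            }
            return false;
        }
    }
    ```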
    Also, comparing all TypedEntities derived from a given Entity with all
    other TypedEntities in the database provides a stricter notion of duplicate
    data detection. For example, if an Entity contains the Types Person and
    Employee with associated properties, then the approach I'm describing would
    compare the Person TypedEntity and the Employee TypedEntity with all other
    Person and Employee TypedEntities in the DB. For the Entity to be deemed a
    non-duplicate, none of those TypedEntities could be duplicates. As it's
    currently implemented, if an Employee TypedEntity was ingested and derived
    from an Entity whose sole type was Employee, then an Entity with Types
    Person and Employee would not be considered a duplicate even if the
    Employee properties were exactly the same! So in effect, I think we should
    detect whether any TypedEntities derived from an Entity are duplicates, to
    avoid duplicating TypedEntities (I think that these are more meaningful and
    concrete than Entities, which are essentially just a bin containing many
    TypedEntities). As it's currently implemented, you are requiring that the
    given Entity match another Entity in types and property values (within the
    tolerance) in order to be considered a duplicate. I think that this
    requirement is too strict and could lead to lots of duplicate TypedEntities.
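    To make the Person/Employee failure mode concrete, here is a minimal sketch
    (again with stand-in types and plain map equality rather than real Rya code
    or tolerance-based comparison) contrasting whole-entity comparison with
    per-type comparison:
    
    ```java
    import java.util.Map;
    
    public class WholeVsPerTypeComparison {
    
        // Stand-in Entity: typeId -> property map for each of its explicit types.
        record Entity(Map<String, Map<String, String>> typedProperties) {}
    
        // Current behavior: two Entities are duplicates only when they agree on
        // ALL types and all property values.
        static boolean wholeEntityDuplicate(final Entity a, final Entity b) {
            return a.typedProperties().equals(b.typedProperties());
        }
    
        // Suggested behavior: a duplicate exists when ANY type shared by both
        // Entities has matching property values.
        static boolean perTypeDuplicate(final Entity a, final Entity b) {
            for (final Map.Entry<String, Map<String, String>> e :
                    a.typedProperties().entrySet()) {
                if (e.getValue().equals(b.typedProperties().get(e.getKey()))) {
                    return true;
                }
            }
            return false;
        }
    }
    ```
    
    With a stored Employee-only Entity and an incoming Person+Employee Entity
    carrying identical Employee properties, wholeEntityDuplicate reports no
    duplicate while perTypeDuplicate does.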

