[
https://issues.apache.org/jira/browse/ATLAS-503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15298058#comment-15298058
]
Hemanth Yamijala commented on ATLAS-503:
----------------------------------------
Started looking at this.
My primary focus for this bug will be to replicate, debug and fix it with an
HBase backend, as BerkeleyDB is not recommended for production use.
I have been trying to replicate the issue by following the steps [~ssainath]
reported, except with HBase as the backend. However, I have had no success so
far (tried importing 1000, 5000 and 10000 tables).
Talking to Sharmadha, I found that this specific problem does not occur with
HBase, only with BerkeleyDB. However, there are other scenarios where even
HBase throws a lock exception, for example:
* Multiple consumer threads and partitions while importing data into Atlas.
* Multiple threads creating tags.
I am assuming the underlying cause is the same, and will try to use one of
these scenarios to replicate the issue.
> Not all Hive tables are imported into Atlas when the import is interrupted
> by search queries.
> -------------------------------------------------------------------------------------------------------
>
> Key: ATLAS-503
> URL: https://issues.apache.org/jira/browse/ATLAS-503
> Project: Atlas
> Issue Type: Bug
> Reporter: Sharmadha Sainath
> Assignee: Hemanth Yamijala
> Priority: Critical
> Fix For: 0.7-incubating
>
> Attachments: hiv2atlaslogs.rtf
>
>
> On running a file containing 100 table creation commands using beeline -f,
> all Hive tables are created. But only 81 of them are imported into Atlas
> (HiveHook enabled) when queries like "hive_table" are run frequently
> while the import process for the tables is going on.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)