[ https://issues.apache.org/jira/browse/GOBBLIN-1845?focusedWorklogId=866146&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-866146 ]

ASF GitHub Bot logged work on GOBBLIN-1845:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 17/Jun/23 21:44
            Start Date: 17/Jun/23 21:44
    Worklog Time Spent: 10m 
      Work Description: codecov-commenter commented on PR #3706:
URL: https://github.com/apache/gobblin/pull/3706#issuecomment-1595868617

   ## [Codecov](https://app.codecov.io/gh/apache/gobblin/pull/3706?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=apache) Report
   > Merging [#3706](https://app.codecov.io/gh/apache/gobblin/pull/3706?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=apache) (a5a5924) into [master](https://app.codecov.io/gh/apache/gobblin/commit/ee17b1576759c0669bf7bc46c9faf6868544ceb9?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=apache) (ee17b15) will **decrease** coverage by `2.00%`.
   > The diff coverage is `n/a`.
   
   ```diff
   @@             Coverage Diff              @@
   ##             master    #3706      +/-   ##
   ============================================
   - Coverage     46.83%   44.84%   -2.00%     
   + Complexity    10797     2102    -8695     
   ============================================
     Files          2141      411    -1730     
     Lines         84410    17734   -66676     
     Branches       9383     2162    -7221     
   ============================================
   - Hits          39533     7952   -31581     
   + Misses        41272     8922   -32350     
   + Partials       3605      860    -2745     
   ```
   
   
   [see 1734 files with indirect coverage changes](https://app.codecov.io/gh/apache/gobblin/pull/3706/indirect-changes?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=apache)
   
   




Issue Time Tracking
-------------------

    Worklog Id:     (was: 866146)
    Time Spent: 20m  (was: 10m)

> Java parallel stream usage causes class loader conflict when run with spark
> ---------------------------------------------------------------------------
>
>                 Key: GOBBLIN-1845
>                 URL: https://issues.apache.org/jira/browse/GOBBLIN-1845
>             Project: Apache Gobblin
>          Issue Type: Task
>            Reporter: Vikram Bohra
>            Priority: Major
>          Time Spent: 20m
>  Remaining Estimate: 0h
>
> DatasetsFinderFilteringDecorator uses a parallel stream over datasets to 
> filter them on predicates. When this code runs in Spark, the system class 
> loader gets used to pick up the Hive jar instead of the current context 
> class loader, which leads to ClassNotFoundException errors.
> Stack trace:
> {code:java}
> Caused by: 
> MetaException(message:org.apache.hadoop.hive.metastore.HiveMetaStoreClient 
> class not found)
>       at 
> org.apache.hadoop.hive.metastore.MetaStoreUtils.getClass(MetaStoreUtils.java:1494)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
>       at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:98)
>       at 
> org.apache.gobblin.hive.HiveMetaStoreClientFactory.createMetaStoreClient(HiveMetaStoreClientFactory.java:100)
>       at 
> org.apache.gobblin.hive.HiveMetaStoreClientFactory.create(HiveMetaStoreClientFactory.java:106)
>  {code}
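The failure described above is a known hazard of Java parallel streams: by default they execute on `ForkJoinPool.commonPool()` worker threads, whose context class loader may differ from the caller's (e.g. Spark's) context class loader, so `Class.forName`-style lookups in libraries like the Hive metastore client can miss classes the caller can see. A minimal sketch of one common workaround (the class and method names here are hypothetical, not from the Gobblin patch): run the stream inside a dedicated `ForkJoinPool` whose worker threads are given the caller's context class loader.

```java
import java.util.List;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.ForkJoinWorkerThread;

public class ParallelStreamClassLoaderDemo {

    /**
     * Runs a parallel stream inside a dedicated ForkJoinPool whose worker
     * threads explicitly carry the given context class loader, and reports
     * whether every element observed that loader during processing.
     */
    static boolean allElementsSeeLoader(ClassLoader loader) throws Exception {
        ForkJoinPool pool = new ForkJoinPool(
                4,
                p -> {
                    // Unlike commonPool() workers, these threads get the
                    // caller's context class loader set explicitly.
                    ForkJoinWorkerThread t =
                            ForkJoinPool.defaultForkJoinWorkerThreadFactory.newThread(p);
                    t.setContextClassLoader(loader);
                    return t;
                },
                null, false);
        try {
            // A parallel stream whose terminal operation runs inside a
            // ForkJoinPool task is executed on that pool's workers.
            return pool.submit(() ->
                    List.of("a", "b", "c", "d").parallelStream()
                            .allMatch(x ->
                                    Thread.currentThread().getContextClassLoader() == loader)
            ).get();
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        ClassLoader caller = Thread.currentThread().getContextClassLoader();
        System.out.println(allElementsSeeLoader(caller));
    }
}
```

The simpler alternative, of course, is to drop `.parallelStream()` for a sequential `.stream()`, which keeps all work on the calling thread and its context class loader at the cost of parallelism.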



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
