racevedoo commented on issue #3620:
URL: https://github.com/apache/iceberg/issues/3620#issuecomment-982969999


   I may have found a solution (maybe just a workaround). Looking at the `iceberg-spark-3.2` POM, there is a version specification for `slf4j-api` (1.7.30, taken from [here](https://repository.apache.org/content/repositories/snapshots/org/apache/iceberg/iceberg-spark-3.2/0.13.0-SNAPSHOT/iceberg-spark-3.2-0.13.0-20211130.000831-33.pom)). Given that, the Spark `runtime` modules do not need a direct `slf4j-api` dependency at all, since they already include the respective `iceberg-spark` module and get `slf4j-api` from it transitively.
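
   A rough way to double-check that reasoning (just a sketch, not part of the proposed change; it assumes the runtime module applies the Java plugin and therefore has a `runtimeClasspath` configuration) is a throwaway task in the runtime module's build script that prints any `slf4j-api` artifacts that resolve transitively:
   
   ```groovy
   // Sketch: print slf4j-api artifacts resolved on the runtime classpath.
   // If the reasoning above holds, a single org.slf4j:slf4j-api:1.7.30
   // should show up, pulled in via the iceberg-spark dependency.
   tasks.register('printResolvedSlf4j') {
     doLast {
       configurations.runtimeClasspath.resolvedConfiguration.resolvedArtifacts
           .collect { it.moduleVersion.id }
           .findAll { it.name == 'slf4j-api' }
           .each { println "${it.group}:${it.name}:${it.version}" }
     }
   }
   ```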
   
   Applying the following patch and publishing a custom version to the local Maven repository allows `spark-shell` to start:
   
   ```patch
   diff --git a/build.gradle b/build.gradle
   index 38388747..b1645a01 100644
   --- a/build.gradle
   +++ b/build.gradle
   @@ -111,7 +111,6 @@ subprojects {
      targetCompatibility = '1.8'
    
      dependencies {
   -    implementation 'org.slf4j:slf4j-api'
        implementation 'com.github.stephenc.findbugs:findbugs-annotations'
    
        testImplementation 'org.junit.vintage:junit-vintage-engine'
   @@ -147,6 +146,12 @@ subprojects {
      }
    }
    
   +configure(subprojects.findAll{!(it.name.contains("spark") && it.name.contains("runtime")) }) {
   +  dependencies {
   +    implementation 'org.slf4j:slf4j-api'
   +  }
   +}
   +
    project(':iceberg-bundled-guava') {
      apply plugin: 'com.github.johnrengelman.shadow'
    
   
   ```
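   
   As a quick sanity check on the filter (again only a sketch, not part of the patch), a throwaway task in the root `build.gradle` can list which subprojects the predicate skips, so that only the Spark runtime bundles show up as excluded:
   
   ```groovy
   // Sketch: show which subprojects keep the explicit slf4j-api dependency
   // under the filter used in the patch above.
   tasks.register('printSlf4jSubprojects') {
     doLast {
       subprojects.each { p ->
         def isSparkRuntime = p.name.contains('spark') && p.name.contains('runtime')
         println "${p.name}: ${isSparkRuntime ? 'skipped (runtime bundle)' : 'explicit slf4j-api'}"
       }
     }
   }
   ```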
   
   If this makes enough sense, I'd be happy to open a PR.

