[ https://issues.apache.org/jira/browse/HBASE-18176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16042187#comment-16042187 ]

Sean Busbey commented on HBASE-18176:
-------------------------------------

{code}
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <!-- scala is ok in the assembly -->
    <execution>
      <id>banned-scala</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <skip>true</skip>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}

Are we shipping scala in the assembly? I guess I have more reviewing to do; my 
intuition is that we ought not be doing that because spark provides it.

Or is this because the same ban also covers hbase-spark? Can we break them out?
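If the idea is to let the assembly keep scala while still banning hbase-spark, one option would be to split the ban into two enforcer executions so a module can skip just one of them. Rough sketch only; the execution ids, rule wiring, and banned coordinates below are illustrative, not the actual patch:

{code}
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <!-- ban scala in its own execution so it can be skipped independently -->
    <execution>
      <id>banned-scala</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <excludes>
              <exclude>org.scala-lang:scala-library</exclude>
            </excludes>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
    <!-- ban hbase-spark separately -->
    <execution>
      <id>banned-hbase-spark</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <bannedDependencies>
            <excludes>
              <exclude>org.apache.hbase:hbase-spark</exclude>
            </excludes>
          </bannedDependencies>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
{code}

With that split, the assembly pom would only need {{<skip>true</skip>}} on the {{banned-scala}} execution (as in the snippet above) and would still inherit the hbase-spark ban.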

> add enforcer rule to make sure hbase-spark / scala aren't dependencies of 
> unexpected modules
> --------------------------------------------------------------------------------------------
>
>                 Key: HBASE-18176
>                 URL: https://issues.apache.org/jira/browse/HBASE-18176
>             Project: HBase
>          Issue Type: Improvement
>          Components: build, spark
>            Reporter: Sean Busbey
>            Assignee: Mike Drob
>             Fix For: 2.0.0
>
>         Attachments: HBASE-18176.patch
>
>
> We should have an enforcer plugin rule that makes sure we don't have scala 
> and/or hbase-spark showing up in new modules (based on prior discussions 
> about limiting the scope of where those things show up in our classpath, 
> especially given scala's poor history with binary compatibility).


