Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2318#issuecomment-54890315
  
    @srowen the problem is that maven is stup... err, peculiar.
    
    If you declare the plugin, with its executions, in the parent pom's build
section so that every child inherits it, Maven will also try to execute it
when building the parent pom itself. Since the parent pom has packaging "pom",
that execution fails: there are no jars to process. (Declaring it only in
"pluginManagement" doesn't help, since children would then have to reference
the plugin explicitly.) So you need a way to add the plugin to the build only
in certain situations.
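    
    Here's a minimal sketch of the failing setup (the shade plugin is just an
illustration; the same applies to any plugin with executions bound to a phase):
    
        <!-- parent pom.xml -->
        <project>
          <modelVersion>4.0.0</modelVersion>
          <groupId>org.example</groupId>
          <artifactId>parent</artifactId>
          <version>1.0</version>
          <packaging>pom</packaging>
          <modules>
            <module>child</module>
          </modules>
          <build>
            <plugins>
              <!-- Inherited by every child, but also executed when the
                   parent itself is built, where it fails because a "pom"
                   project produces no jar. -->
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <executions>
                  <execution>
                    <phase>package</phase>
                    <goals><goal>shade</goal></goals>
                  </execution>
                </executions>
              </plugin>
            </plugins>
          </build>
        </project>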
    
    The logical thing would be to have the plugin activated only for child
modules, but there's no option I can find to do that. The next best thing
would be to bind the plugin only to a certain packaging type, but again, no
dice.
    
    So you have to resort to profiles. You could have a profile activated by a
property, but that doesn't work when the property is set in a child pom
(instead of on the command line). So you're left with checking for some file
on disk, which is my approach: I checked for "src/main/scala". If that's too
generic, we can use a tag file instead (and create that file in every project
where we want the plugin to be activated).
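    
    For reference, a sketch of that file-based activation (the profile id and
plugin are illustrative; the tag-file variant would just use a different path
in "exists"):
    
        <!-- parent pom.xml -->
        <profiles>
          <profile>
            <id>scala-sources</id>
            <activation>
              <file>
                <!-- Checked against the basedir of the module being built,
                     so the profile only activates for modules that actually
                     have Scala sources. -->
                <exists>src/main/scala</exists>
              </file>
            </activation>
            <build>
              <plugins>
                <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-shade-plugin</artifactId>
                  <executions>
                    <execution>
                      <phase>package</phase>
                      <goals><goal>shade</goal></goals>
                    </execution>
                  </executions>
                </plugin>
              </plugins>
            </build>
          </profile>
        </profiles>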

