[
https://issues.apache.org/jira/browse/FALCON-165?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13812508#comment-13812508
]
Venkatesh Seetharam commented on FALCON-165:
--------------------------------------------
Thanks [~arpitgupta] for proposing this. We had a hallway conversation on
Friday, and Arpit rightly suggested that Hadoop and other components in the
ecosystem ship a single package/tarball from which users can start the
different services. This would simplify Bigtop integration as well.
I haven't looked at the patch yet, but it would help if you could outline a few things:
* How this works in standalone mode. My observation is that the server and bin
tarballs are pretty much the same, which might help here.
* Since the dependencies are common across falcon and prism, do we add them
all to the classpath?
* How do we expect this to work with both Jetty and Tomcat?
> simplify packaging to create one package instead of client, falcon and prism
> packages
> -------------------------------------------------------------------------------------
>
> Key: FALCON-165
> URL: https://issues.apache.org/jira/browse/FALCON-165
> Project: Falcon
> Issue Type: Improvement
> Reporter: Arpit Gupta
> Assignee: Arpit Gupta
> Attachments: FALCON-165-1383504173.patch, FALCON-165-1383504576.patch
>
>
> Currently we create multiple packages: client, falcon server, and prism server.
> However, our scripts already add only specific files to the classpath based
> on which server is being run, so I think we can also move to a single package.
> This patch introduces a new Java system property, falcon.domain, which is set
> to falcon or prism depending on which service is being started.
> The user can therefore set all the startup and runtime properties for
> both falcon and prism in the same file, and the appropriate configs will be
> picked up based on which service is being started.
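To illustrate the mechanism described above, here is a minimal sketch of how domain-scoped property resolution could work, assuming domain-prefixed keys in one shared startup.properties (e.g. *.config.store.uri, falcon.config.store.uri, prism.config.store.uri). The DomainProperties class name and the conf/startup.properties path are hypothetical and not Falcon's actual implementation:

{code:java}
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

/**
 * Hypothetical sketch: keys in a single shared startup.properties carry a
 * "falcon.", "prism." or "*." prefix (the latter applying to all domains),
 * and the running service picks its own view of the configuration based on
 * the -Dfalcon.domain=falcon|prism system property.
 */
public class DomainProperties {

    public static Properties load(String path) throws IOException {
        String domain = System.getProperty("falcon.domain", "falcon");

        Properties raw = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            raw.load(in);
        }

        Properties scoped = new Properties();
        for (String key : raw.stringPropertyNames()) {
            int dot = key.indexOf('.');
            if (dot < 0) {
                continue; // ignore keys without a domain prefix
            }
            String prefix = key.substring(0, dot);
            String bare = key.substring(dot + 1);
            if (prefix.equals(domain)) {
                // domain-specific keys always win over "*" keys
                scoped.setProperty(bare, raw.getProperty(key));
            } else if (prefix.equals("*") && !scoped.containsKey(bare)) {
                // "*" keys act as defaults for every domain
                scoped.setProperty(bare, raw.getProperty(key));
            }
        }
        return scoped;
    }

    public static void main(String[] args) throws IOException {
        Properties props = load("conf/startup.properties");
        System.out.println("domain = " + System.getProperty("falcon.domain", "falcon"));
        props.list(System.out);
    }
}
{code}

With this scheme, starting the same package with -Dfalcon.domain=prism instead of -Dfalcon.domain=falcon yields the prism view of the same properties file, which is what the description above proposes.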