Hi,

Noted. I have created a documentation JIRA[1] to include this information in the DAS
documentation.

[1] https://wso2.org/jira/browse/DOCUMENTATION-2592

Regards,

Nayomi Dayarathne

*Software Engineer-QA*
Mobile : +94 (0) 775246619

*[email protected]*

On Tue, Oct 20, 2015 at 11:25 AM, Anjana Fernando <[email protected]> wrote:

> Hi Nayomi,
>
> The current approach is an acceptable solution, in cases where copying the
> DAS pack to a target Spark node is not a problem. Having a script or similar
> tool to compile the list of only the jars required for it would just be a
> small convenience mechanism, and is not really that important. Some users
> may actually prefer copying the pack right away, rather than running
> another command.
>
> Cheers,
> Anjana.
>
> On Mon, Oct 19, 2015 at 10:24 PM, Nayomi Dayarathne <[email protected]>
> wrote:
>
>> Hi all,
>>
>> Given the current situation in DAS, there must be a DAS distribution on
>> every Spark node of an external Spark cluster in order for DAS to work with
>> it. That way, the external Spark cluster is able to access the jar files it
>> needs to work with DAS.
>>
>> Since this is not an ideal approach, I have already reported an
>> improvement JIRA[1] regarding it.
>>
>> Therefore, we want to know whether we are going to implement a solution
>> according to the reported JIRA[1], or whether there is a better solution
>> for this.
>>
>>
>> [1] https://wso2.org/jira/browse/DAS-197
>>
>>
>> Regards,
>>
>> Nayomi Dayarathne
>>
>> *Software Engineer-QA*
>> Mobile : +94 (0) 775246619
>>
>> *[email protected]*
>>
>
>
>
> --
> *Anjana Fernando*
> Senior Technical Lead
> WSO2 Inc. | http://wso2.com
> lean . enterprise . middleware
>
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev