[GitHub] [airavata-mft-portal] akhil-8607 opened a new pull request #6: UI update

2020-04-16 Thread GitBox
akhil-8607 opened a new pull request #6: UI update
URL: https://github.com/apache/airavata-mft-portal/pull/6
 
 
Updated the static UI according to
https://issues.apache.org/jira/browse/AIRAVATA-3314.

More improvements to be done in the CSS.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


Re: Apache Airavata MFT - AWS/GCS support

2020-04-16 Thread Aravind Ramalingam
Hello,

Wouldn't the whole file have to be present in this example, converted into
a single stream, and uploaded at once?
We had understood that MFT expects a chunk-by-chunk upload, without
having to have the entire file present.

Thank you
Aravind Ramalingam

> On Apr 17, 2020, at 00:07, DImuthu Upeksha  wrote:
> 
> 
> Aravind,
> 
> Streaming is supported in the GCS Java client. Have a look here [8].
> 
> [8] 
> https://github.com/GoogleCloudPlatform/java-docs-samples/blob/master/storage/json-api/src/main/java/StorageSample.java#L104
> 
> Thanks
> Dimuthu
> 
>> On Thu, Apr 16, 2020 at 9:56 PM Aravind Ramalingam  wrote:
>> Hello Dimuthu,
>> 
>> As a follow-up, we explored GCS in detail. We are faced with a small dilemma.
>> We found that although GCS has Java support, the functionality does not
>> seem to extend to stream-based upload and download.
>> The documentation says streaming is currently done with the gsutil
>> command-line tool [7], so we are unsure whether we can proceed with the
>> GCS integration.
>> 
>> Could you please give us any suggestions? We were also wondering whether
>> we could take up Box integration or another provider if GCS proves
>> infeasible for now.
>> 
>> [7] https://cloud.google.com/storage/docs/streaming 
>> 
>> Thank you
>> Aravind Ramalingam
>> 
>>> On Thu, Apr 16, 2020 at 12:45 AM Aravind Ramalingam  
>>> wrote:
>>> Hello Dimuthu,
>>> 
>>> We had just started looking into Azure and GCS. Since Azure is done, we will 
>>> take up and explore GCS.
>>> 
>>> Thank you for the update.
>>> 
>>> Thank you
>>> Aravind Ramalingam
>>> 
> On Apr 16, 2020, at 00:30, DImuthu Upeksha  wrote:
> 
> Aravind,
> 
> I'm not sure whether you have made any progress on Azure transport yet. I
> got a chance to look into that [6]. Let me know if you are working on GCS
> or any other so that I can plan ahead. Next I will be focusing on Box
> transport.
> 
> [6]
> https://github.com/apache/airavata-mft/commit/013ed494eb958990d0a6f90186a53103e1237bcd
> 
> Thanks
> Dimuthu
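(For reference, a streaming upload on the Azure side with the azure-storage-blob v12
Java client looks roughly like the sketch below; the connection string, container, and
blob names are placeholders, and this is an illustration rather than the code in [6].)

import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AzureBlobStreamingUploadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder container/blob names; credentials come from an environment variable.
        BlobClient blob = new BlobServiceClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .buildClient()
                .getBlobContainerClient("my-container")
                .getBlobClient("my-object");

        Path source = Paths.get("/tmp/source.dat");
        // Upload straight from the stream; the client chunks the data internally.
        try (InputStream in = Files.newInputStream(source)) {
            blob.upload(in, Files.size(source), true); // overwrite = true
        }
    }
}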
 
> On Mon, Apr 6, 2020 at 5:19 PM Aravind Ramalingam  
> wrote:
> Hi  Dimuthu,
> 
> Thank you for the update. We will look into it and get an idea of how the
> system works.
> We were hoping to try an implementation for GCS; we will also look into
> Azure.
> 
> Thank you
> Aravind Ramalingam
> 
>> On Mon, Apr 6, 2020 at 4:44 PM DImuthu Upeksha 
>>  wrote:
>> Aravind,
>> 
>> Here [2] is the complete commit for the S3 transport implementation, but
>> don't be confused by the amount of change, as it includes both the
>> transport implementation and the service backend implementations. If you
>> need to implement a new transport, you need to implement a Receiver, a
>> Sender, and a MetadataCollector like this [3]. Then you need to add
>> support for that resource to the Resource service and the Secret service
>> [4] [5]. You can do the same for Azure. A sample SCP -> S3 transfer
>> request is shown below. Hope that helps.
>> 
>> String sourceId = "remote-ssh-resource";
>> String sourceToken = "local-ssh-cred";
>> String sourceType = "SCP";
>> String destId = "s3-file";
>> String destToken = "s3-cred";
>> String destType = "S3";
>> 
>> TransferApiRequest request = TransferApiRequest.newBuilder()
>>         .setSourceId(sourceId)
>>         .setSourceToken(sourceToken)
>>         .setSourceType(sourceType)
>>         .setDestinationId(destId)
>>         .setDestinationToken(destToken)
>>         .setDestinationType(destType)
>>         .setAffinityTransfer(false)
>>         .build();
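(For completeness, a submission call over the MFT gRPC API might look roughly like
the sketch below; the stub class, method name, and port are assumptions made for
illustration, so check the generated client classes in the repository for the real names.)

// Assumed names for illustration only (MFTApiServiceGrpc, submitTransfer, port 7004);
// consult the generated gRPC stubs in airavata-mft for the actual service API.
// Needs: io.grpc.ManagedChannel, io.grpc.ManagedChannelBuilder
ManagedChannel channel = ManagedChannelBuilder
        .forAddress("localhost", 7004)
        .usePlaintext()
        .build();
try {
    MFTApiServiceGrpc.MFTApiServiceBlockingStub client =
            MFTApiServiceGrpc.newBlockingStub(channel);
    TransferApiResponse response = client.submitTransfer(request); // 'request' from the snippet above
    System.out.println("Submitted transfer: " + response.getTransferId());
} finally {
    channel.shutdown();
}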
>> 
>> [2] 
>> https://github.com/apache/airavata-mft/commit/62fae3d0ab2921fa8bf0bea7970e233f842e6948
>> [3] 
>> https://github.com/apache/airavata-mft/tree/master/transport/s3-transport/src/main/java/org/apache/airavata/mft/transport/s3
>> [4] 
>> https://github.com/apache/airavata-mft/blob/master/services/resource-service/stub/src/main/proto/ResourceService.proto#L90
>> [5] 
>> https://github.com/apache/airavata-mft/blob/master/services/secret-service/stub/src/main/proto/SecretService.proto#L45
>> 
>> Thanks
>> Dimuthu
>> 
>> 
>>> On Sun, Apr 5, 2020 at 12:10 AM DImuthu Upeksha 
>>>  wrote:
>>> There is a working S3 transport in my local copy. I will commit it
>>> once I have tested it properly. You can follow the same pattern for any
>>> cloud provider that has clients with streaming IO. Streaming among
>>> different transfer protocols inside an Agent is discussed in the last
>>> part of this [1] document. Try to get the conceptual idea from that and
>>> reverse-engineer the SCP transport.
>>> 
>>> [1] 
>>> https://docs.google.com/document/d/1zrO4Z1dn7ENhm1RBdVCw-dDpWiebaZEWy66ceTWoOlo
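(A simplified way to picture that last point: a receiver thread and a sender side
connected by a bounded pipe, so data moves chunk by chunk and the whole file is never
held in memory. The sketch below is conceptual only and not MFT's actual Agent code.)

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class StreamingBridgeSketch {
    public static void main(String[] args) throws Exception {
        PipedOutputStream producerSide = new PipedOutputStream();
        // Bounded pipe: the receiver blocks when the buffer is full, so memory stays flat.
        PipedInputStream consumerSide = new PipedInputStream(producerSide, 4 * 1024 * 1024);

        // "Receiver" thread: pretend to pull chunks from the source protocol (e.g. SCP).
        Thread receiver = new Thread(() -> {
            byte[] chunk = new byte[64 * 1024];
            try {
                for (int i = 0; i < 16; i++) {
                    producerSide.write(chunk);      // push each chunk as soon as it arrives
                }
                producerSide.close();               // signals end-of-stream to the sender
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        receiver.start();

        // "Sender" side: stream the same bytes to the destination protocol (e.g. S3/GCS)
        // without ever holding the whole file in memory.
        byte[] buffer = new byte[64 * 1024];
        long total = 0;
        int read;
        while ((read = consumerSide.read(buffer)) != -1) {
            total += read;                          // here the bytes would go to the cloud client
        }
        consumerSide.close();
        receiver.join();
        System.out.println("Streamed " + total + " bytes end to end");
    }
}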
>>> 
>>> 

Re: Apache Airavata MFT - AWS/GCS support

2020-04-16 Thread DImuthu Upeksha
Aravind,

Streaming is supported in the GCS Java client. Have a look here [8].

[8]
https://github.com/GoogleCloudPlatform/java-docs-samples/blob/master/storage/json-api/src/main/java/StorageSample.java#L104
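
For illustration, a minimal streaming upload along the lines of [8], using the
google-cloud-storage Java client, might look like the sketch below (bucket, object
name, and source path are placeholders, not MFT code). Because the bytes go through
a WriteChannel in fixed-size buffers, the whole file never needs to be in memory,
which is the chunk-by-chunk behaviour asked about earlier in the thread.

import com.google.cloud.WriteChannel;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.Paths;

public class GcsStreamingUploadSketch {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of("my-bucket", "my-object")).build();

        // The WriteChannel uploads in chunks, so the source never has to fit in memory.
        try (InputStream in = Files.newInputStream(Paths.get("/tmp/source.dat"));
             WriteChannel writer = storage.writer(blobInfo)) {
            byte[] buffer = new byte[1024 * 1024];
            int read;
            while ((read = in.read(buffer)) != -1) {
                writer.write(ByteBuffer.wrap(buffer, 0, read));
            }
        }
    }
}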

Thanks
Dimuthu

Re: Apache Airavata MFT - AWS/GCS support

2020-04-16 Thread Aravind Ramalingam
Hello Dimuthu,

As a follow-up, we explored GCS in detail. We are faced with a small
dilemma. We found that although GCS has Java support, the functionality
does not seem to extend to stream-based upload and download.
The documentation says streaming is currently done with the gsutil
command-line tool [7], so we are unsure whether we can proceed with the
GCS integration.

Could you please give us any suggestions? We were also wondering whether
we could take up Box integration or another provider if GCS proves
infeasible for now.

[7] https://cloud.google.com/storage/docs/streaming

Thank you
Aravind Ramalingam

 On Sat, Apr 4, 2020 at 9:22 PM Aravind Ramalingam 
 wrote:

> Hello,
>
> We were looking at the existing code in the project. We could find
> implementations only for local copy and SCP.
> We were confused about how to go about an external provider like S3
> or Azure, since it would require integrating with their respective
> clients.
>
> Thank you
> Aravind Ramalingam
>
> > On Apr 4, 2020, at 21:15, Suresh Marru  wrote:
> >
> > Hi Aravind,
> >
> > I have to catch up with the code, but you may want to look at the S3
> implementation and extend it to Azure, GCP or other cloud services like
> Box, Dropbox and so on.
> >
> > There could be many use cases; here is one idea:
> >
> > * Compute a job on a supercomputer with SCP access and push the
> > outputs to cloud storage.
> >
> > Suresh
> >
> >> On Apr 4, 2020, at 8:09 PM, Aravind