Agreed.  You should provide some details on how you will integrate with 
Airavata: which components will you modify/create/extend?

From: Shameera Rathnayaka <[email protected]>
Date: Tuesday, March 22, 2016 at 2:29 PM
To: "[email protected]" <[email protected]>, marpierc <[email protected]>
Subject: Re: [GSoC Proposal] - Integrating Resource Information from Apache 
Mesos with Apache Airavata’s Job Management Modules

Hi Pankaj,

 The Airavata architecture hasn't changed at all since last year, apart from a few 
internal implementations that require only minor effort to understand. Since you 
were a GSoC student last year, we don't think you need much time (two weeks, 
according to your GSoC proposal) to understand the Airavata architecture. We 
especially expect a task-intensive proposal from you, so please revisit your 
milestones and deliverables. We would like to see something integrated with 
Airavata by your mid-term evaluation. You can use the community bonding period to 
explore the required background knowledge, such as Apache Mesos. Considering these 
points, you can add more comprehensive tasks as your milestones.

This proposal seems to be a continuation of your previous year's GSoC proposal. 
Explaining what you accomplished in the previous GSoC and how you will build on 
that work would help us understand the scope of this year's proposal.

Thanks,
Shameera.

On Mon, Mar 21, 2016 at 5:17 PM Pankaj Saha <[email protected]> wrote:
Hi Marlon,
Here is the link that I have created.

https://docs.google.com/document/d/1qtFvg4-usT4D_1TDNBsQDFQGZIkH99ideYQ1T3HU9nY/edit?usp=sharing
The draft has been created on the GSoC proposal site under the Apache Software 
Foundation, with the same title.


Thanks
Pankaj



On Mon, Mar 21, 2016 at 4:47 PM, Pierce, Marlon <[email protected]> wrote:
Hi Pankaj,

I have some comments, but it would be easier if you created a proposal draft on 
the GSoC site. The Google Doc option for your draft is better than pointing to 
the Airavata wiki. Please make sure you give comment and suggestion permissions.

Marlon


From: Pankaj Saha <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Monday, March 21, 2016 at 11:16 AM
To: dev <[email protected]>
Subject: [GSoC Proposal] - Integrating Resource Information from Apache Mesos 
with Apache Airavata’s Job Management Modules

Hi Dev Team,

Please review the following GSoC proposal that I plan to submit:
Title: Integrating Resource Information from Apache Mesos with Apache 
Airavata’s Job Management Modules

Abstract:
Apache Airavata provides science gateway computing capabilities across clustered 
environments for scientific users. It abstracts away the complexities of 
submitting jobs to HPC platforms and gives users an intuitive, elegant web-based 
interface for job submission. Apache Mesos is a distributed systems kernel that 
manages distributed computing resources as if they were a single computer. As 
Airavata is being extended to use Big Data and Cloud tools to launch jobs in 
cloud environments, it needs to retrieve resource and job execution information 
from the Big Data framework and feed it back to the Airavata portal that is 
accessible to the end user. In this project we will develop code and scripts, 
integrated with Airavata, that use the Mesos HTTP API to continuously fetch 
complete resource and scheduling information. Airavata can then use this 
information to dynamically monitor and improve its job submission strategy in 
cloud environments such as Jetstream.

Introduction:
Apache Mesos provides HTTP API endpoints for scheduler, executor, internal, and 
admin-related queries. To fetch information about a clustered environment managed 
by a Mesos master, the API can be queried over HTTP, for example with curl; a 
sketch of such a query follows below. The response to such a request is a 
well-formed JSON document. We will parse the JSON response and present the 
information in the desired format. The retrieved information will include current 
resource usage, resources available for further jobs, job status, time elapsed 
since a job started, and so on. Airavata, in turn, will use this information to 
track resource usage and job performance for each submission and to rapidly 
diagnose the health of submitted jobs.

We will use the observer pattern to continuously pull information from Cloud and 
Big Data resource managers, such as Apache Mesos, into Airavata; a rough sketch of 
this polling and notification loop is given below.
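As an illustration of that pattern (the interface and class names here are 
hypothetical and are not existing Airavata APIs), a polling monitor could notify 
registered listeners whenever a new cluster snapshot is fetched:

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical listener interface: Airavata-side components would implement
// this to react to fresh resource information.
interface ResourceInfoListener {
    void onResourceUpdate(String clusterStateJson);
}

// Polls a resource manager (e.g. the Mesos master) on a fixed interval and
// pushes each snapshot to every registered listener (observer pattern).
class ResourceManagerMonitor {

    // Hypothetical abstraction over the HTTP call shown earlier.
    interface ResourceFetcher {
        String fetchClusterState() throws Exception;
    }

    private final List<ResourceInfoListener> listeners = new CopyOnWriteArrayList<>();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private final ResourceFetcher fetcher;

    ResourceManagerMonitor(ResourceFetcher fetcher) {
        this.fetcher = fetcher;
    }

    void register(ResourceInfoListener listener) {
        listeners.add(listener);
    }

    void start(long intervalSeconds) {
        scheduler.scheduleAtFixedRate(() -> {
            try {
                String snapshot = fetcher.fetchClusterState();
                for (ResourceInfoListener l : listeners) {
                    l.onResourceUpdate(snapshot);
                }
            } catch (Exception e) {
                // In a real integration this would go through Airavata's logging.
                e.printStackTrace();
            }
        }, 0, intervalSeconds, TimeUnit.SECONDS);
    }

    void stop() {
        scheduler.shutdown();
    }
}

A fetcher built around the /master/state call sketched above could be plugged in, 
with Airavata's job management components registering as listeners.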

Any comments and suggestions would be very helpful.

Thanks
Pankaj


--
Shameera Rathnayaka
