Hi Tom,

Thank you for your quick reply.

I re-created the design in my draft proposal since I had misunderstood some
concepts about OODT. I would really appreciate it if you could have a look at
my draft proposal
<https://docs.google.com/document/d/1BP55CNevap-88Vd7gPzOd2trqvAOHb2e3_jfoazA9Q8>
again and advise me further.

Please allow me to bother you again with a couple of questions. :-)

   1. What is meant by "ingest hook" in your previous mail?
   2. I am not yet very familiar with the Workflow Engine and the Resource
   Manager, so for now I have added only the proposed Kafka messaging for the
   File Manager to my design diagram. Would that be enough until I catch up
   with those two components? (A rough sketch of the messaging I have in mind
   follows below.)
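
To make question 2 a bit more concrete, here is a very rough sketch of the
kind of producer call I imagine the File Manager making when a product is
ingested. The topic name, class name and payload format are only placeholders
I made up for the proposal, not existing OODT code:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

/**
 * Rough sketch only: the File Manager publishes an ingest event to Kafka so
 * that catalog backends (Lucene, Solr, custom) or an audit listener can
 * subscribe to it. Topic name, class name and payload are placeholders.
 */
public class IngestEventPublisher {

    private static final String TOPIC = "filemgr.ingest"; // placeholder topic

    private final Producer<String, String> producer;

    public IngestEventPublisher(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        this.producer = new KafkaProducer<>(props);
    }

    /** Publish one ingest event, keyed by product id so events for the same
     *  product stay ordered within a partition. */
    public void publishIngest(String productId, String metadataXml) {
        producer.send(new ProducerRecord<>(TOPIC, productId, metadataXml));
    }

    public void close() {
        producer.close();
    }
}

Please correct me if this is not the direction you had in mind.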

Thanks & Regards,




Eranga Heshan
*Undergraduate*
Computer Science & Engineering
University of Moratuwa
Mobile: +94 71 138 2686
Email: [email protected]
<https://www.facebook.com/erangaheshan>   <https://twitter.com/erangaheshan>
   <https://www.linkedin.com/in/erangaheshan>

On Fri, Mar 31, 2017 at 8:01 AM, Tom Barber <[email protected]> wrote:

> Hi Eranga
>
> Sorry for the delay on getting back to you.
>
> OODT 2.0 is basically what's on the development branch of the repository:
> https://github.com/apache/oodt
>
> As for communications: basically you have a File Manager Client and a File
> Manager Server, and the server currently speaks XMLRPC or Avro. In terms of
> communication the most obvious path would be:
> https://github.com/apache/oodt/blob/master/filemgr/src/main/java/org/apache/oodt/cas/filemgr/system/XmlRpcFileManager.java
> XML-encoded products come into the file manager, then, depending on what
> your output source would be, the metadata is written to either Lucene, Solr
> or something custom, e.g.:
> https://github.com/apache/oodt/blob/master/filemgr/src/main/java/org/apache/oodt/cas/filemgr/catalog/solr/SolrCatalogFactory.java
> Personally I'd be interested in seeing those calls managed by Kafka, so
> that, for example, multiple backends could register with an ingest hook, or
> you could have an audit listener trigger when Kafka messages are passed,
> for an audit log etc.
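
[Interleaving a note here for my own understanding: below is a very rough
sketch of how a catalog backend or an audit listener might register on such
an ingest topic as a Kafka consumer. The topic name, group ids and class are
placeholders I made up, nothing that exists in OODT today.]

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

/**
 * Sketch only: a catalog backend (or audit logger) subscribing to the ingest
 * topic. Each backend would use its own consumer group so that all of them
 * receive every ingest event independently. Names are placeholders.
 */
public class IngestListener {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "solr-catalog"); // e.g. "lucene-catalog", "audit-log"
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("filemgr.ingest"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (ConsumerRecord<String, String> record : records) {
                    // A catalog backend would write the metadata to its index
                    // here; an audit listener would append to the audit log.
                    System.out.printf("product=%s metadata=%s%n",
                            record.key(), record.value());
                }
            }
        }
    }
}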
>
> I think that diagram is pretty good. Basically, comms are mostly passed
> between components via RPC calls, but it would certainly be more effective
> for both inter-component calls and internal bus calls to be replaced by
> Kafka messaging, for example on the workflow engine, where objects are
> passed around on a trigger-type mechanism.
>
> Tom
>
>
> On Thu, Mar 30, 2017 at 6:21 PM, Eranga Heshan <[email protected]>
> wrote:
>
>> Hi Tom,
>>
>> I am currently looking into
>> http://oodt.apache.org/site_docs/cas-filemgr/development/developer.html
>> to get an abstract idea of how the OODT File Manager handles its internal
>> communications. I hope that will help me identify which components need to
>> be implemented with Kafka. If my approach is wrong, please help me correct
>> it.
>>
>> Further, I would really appreciate it if you could have a look at my
>> proposal and give your opinions on how to make it stronger.
>>
>> Thanks & Regards,
>>
>>
>
