Hi Arijit,
Mahout seq2sparse is a really great tool - hard to beat in a lot of ways. Mahout does have a distributed SPCA algorithm (I've put a rough sketch of what calling it from the Samsara Scala DSL can look like at the bottom of this mail). Maybe we should move this discussion to the Mahout dev list? We can get you more information there, and if you're interested in contributing that would be a great place to talk.

Look forward to talking to you...

Andy

________________________________
From: ARIJIT DAS <arijit.mcse...@gmail.com>
Sent: Tuesday, January 24, 2017 4:24:35 AM
To: dev@community.apache.org
Subject: Re: Hello

Andy,
We have done TF-IDF (term frequency, inverse document frequency) calculation from a big corpus using MapReduce. We were looking for a ready library for calculating eigenvalues, eigenvectors and, of course, principal component analysis. It would be great to call a function instead of writing one. I am interested in taking part in the development of Mahout... The advantage we have is a great thirst to learn new things; the disadvantage is that we don't have much infrastructure... everyone has his own laptop only.

With Regards
Arijit

On Mon, Jan 23, 2017 at 11:02 AM, Andrew Palumbo <ap....@outlook.com> wrote:

> Hello Arijit,
>
> That is very good to hear. Have you been working with the MapReduce Mahout
> algorithms? Have you used any of the new Samsara algorithms? This is
> where we've been focusing our work: around distributed linear algebra based
> frameworks and algorithms. We're actually just now working on a release
> which is centered around GPU bindings and native multithreading for
> distributed matrix operations. We will also be introducing a new pipeline
> based framework for algorithms.
>
> We are always looking for contributors, so you'd be more than welcome if
> you dropped by to say hi :).
>
> We do have a blog that we've been hoping to get started but have not had
> the bandwidth for lately. It's up but actually empty.
>
> Maybe we could discuss a blog entry about how you've used Mahout
> in your work? We are trying to get people to see what the new frameworks
> have to offer.
>
> Very nice to meet you,
>
> Andy
>
> ________________________________
> From: ARIJIT DAS <arijit.mcse...@gmail.com>
> Sent: Sunday, January 22, 2017 1:32:29 AM
> To: dev@community.apache.org
> Subject: Re: Hello
>
> Hi Andy,
> I am using Mahout for NLP, specifically for semantic searching in
> Indian languages, as part of my PhD work... it will be my pleasure to
> contribute to Mahout and also take part in the discussion... with huge
> data in HDFS, Mahout has the capability to help us with decision making... You
> may share the link of the Mahout-specific forum/blog as well (if required)... I will
> be obliged to contribute.
>
> With Regards
> Arijit
>
> On Sun, Jan 22, 2017 at 5:58 AM, Andrew Palumbo <ap....@outlook.com>
> wrote:
>
> > Hello,
> >
> > I sent an email yesterday regarding swag for Meetups, etc. to this list,
> > thinking that it was an Apache list devoted specifically to that
> > type of thing (I was just going off of a comment automatically sent to me
> > from the last quarter's board report review). I've noticed several emails,
> > and realized that I was wrong about this just being a place to ask for swag.
> >
> > So I just wanted to introduce myself and say hello. I'm the PMC chair of
> > Apache Mahout. The Mahout team is a die-hard group of all-volunteer
> > committers, devoted to distributed, shared-nothing machine learning,
> > centered currently around an abstract, engine-neutral set of distributed
> > linear algebra primitives. We're currently working on a release with GPU-
> > and native-multithreading-backed matrix operations, as well as on providing
> > more canned algorithms. We have no corporate backing, which makes it
> > challenging at times to keep up; most of us work after our day jobs. This
> > also makes us flexible and able to answer to no one but our own PMC when
> > deciding the best direction for the project to go.
> >
> > So I just wanted to introduce myself, and give you an overview of our
> > team. The email that I sent yesterday came off more as an order for Apache
> > swag.
> >
> > So, hello, and have a good weekend, all,
> >
> > Andy
> >
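P.S. Since the question was about eigenvalue/eigenvector and PCA routines: here is a rough, untested sketch of what a distributed stochastic PCA call can look like with the Samsara Scala DSL. The helper names used (mahoutSparkContext, drmParallelize, dense, dspca) and the parameter defaults are written from memory, so please treat this as an illustration to check against the current Mahout docs rather than as the definitive API.

    // Sketch only - assumes Mahout 0.10+ (Samsara) with the Spark bindings on the classpath.
    import org.apache.mahout.math.scalabindings._
    import org.apache.mahout.math.drm._
    import org.apache.mahout.math.decompositions._
    import org.apache.mahout.sparkbindings._

    object DspcaSketch {
      def main(args: Array[String]): Unit = {
        // Mahout distributed context backed by Spark (helper from the Spark bindings).
        implicit val ctx = mahoutSparkContext("local[*]", "dspca-sketch")

        // A tiny in-memory matrix keeps the sketch self-contained; in practice you would
        // load your TF-IDF vectors from HDFS instead (drmDfsRead is the usual entry point).
        val drmA = drmParallelize(dense(
          (1.0, 2.0, 3.0),
          (2.0, 4.0, 1.0),
          (0.0, 1.0, 5.0),
          (3.0, 0.0, 2.0)), numPartitions = 2)

        // Distributed stochastic PCA: k components, p oversampling, q power iterations.
        val (drmU, drmV, s) = dspca(drmA, k = 2, p = 1, q = 1)

        // drmV holds the principal directions and s the corresponding singular values;
        // drmU is the projection of the (implicitly mean-centered) rows onto them.
        println(s)
      }
    }

The same decompositions package also exposes dssvd if you just want a distributed SVD (its right singular vectors are the eigenvectors of A'A) without the mean-centering that PCA implies.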