Re: Using spark MLlib without installing Spark
Decoupling MLlib and core is difficult... it is not intended that you run Spark core 1.5 with a Spark MLlib 1.6 snapshot. Core is more stabilized, while new algorithms keep getting added to MLlib, so sometimes you might be tempted to mix versions, but it is not recommended.

On Nov 21, 2015 8:04 PM, "Reynold Xin" <r...@databricks.com> wrote:
> You can use MLlib and Spark directly without "installing anything". Just
> run Spark in local mode.
Re: Using spark MLlib without installing Spark
You can even use it without Spark at all (besides local mode). For example, I have used the following algorithm in a web app: https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/classification/NaiveBayes.scala

Essentially, some algorithms (I haven't checked them all) have to run the same steps in each partition, so if you set aside the distribution-oriented (Spark-specific) parts of the code, there is a lot of reusable logic. You just have to use the API where it is public and conform to its input/output contract. There used to be some dependencies like Breeze in the API, which are hidden now (e.g. https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/linalg/Vectors.scala), but of course this is a hint, not a list of what is available for your use case. Mind, though, that this may not be the cleanest way to implement your use case, and it might sound like a hack ;)

As an alternative to Spark local mode, you could use a job server (https://github.com/spark-jobserver/spark-jobserver) to integrate with your app as a proxy for Spark, or have a Spark service there that responds back with results. From a design point of view it is best to separate the concerns, for several reasons: scaling, utilization, etc.

On Sun, Nov 22, 2015 at 5:03 AM, Reynold Xin <r...@databricks.com> wrote:
> You can use MLlib and Spark directly without "installing anything". Just
> run Spark in local mode.

--
Stavros Kontopoulos <http://www.typesafe.com>
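To make the idea above concrete: the per-class math that MLlib's multinomial NaiveBayes computes (smoothed log likelihoods plus log priors) can be re-implemented in plain Java with no Spark dependency at all. The class and method names below are made up for illustration; this is a sketch of the technique, not Spark's actual API:

```java
import java.util.*;

// Hypothetical standalone multinomial Naive Bayes: the same math MLlib's
// NaiveBayes runs per partition, minus all the distributed plumbing.
class TinyNaiveBayes {
    private final Map<Integer, double[]> logProb = new HashMap<>(); // label -> log P(feature|label)
    private final Map<Integer, Double> logPrior = new HashMap<>();  // label -> log P(label)

    // counts[i] is a term-count vector for document i; lambda is Laplace smoothing
    void fit(int[][] counts, int[] labels, double lambda) {
        Map<Integer, double[]> sums = new HashMap<>();
        Map<Integer, Integer> docs = new HashMap<>();
        int dims = counts[0].length;
        for (int i = 0; i < counts.length; i++) {
            double[] s = sums.computeIfAbsent(labels[i], k -> new double[dims]);
            for (int j = 0; j < dims; j++) s[j] += counts[i][j];
            docs.merge(labels[i], 1, Integer::sum);
        }
        for (Map.Entry<Integer, double[]> e : sums.entrySet()) {
            double total = 0;
            for (double v : e.getValue()) total += v;
            double[] lp = new double[dims];
            for (int j = 0; j < dims; j++)
                lp[j] = Math.log((e.getValue()[j] + lambda) / (total + lambda * dims));
            logProb.put(e.getKey(), lp);
            logPrior.put(e.getKey(), Math.log((double) docs.get(e.getKey()) / counts.length));
        }
    }

    // argmax over labels of log P(label) + sum_j x[j] * log P(feature j | label)
    int predict(int[] x) {
        int best = -1;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (Map.Entry<Integer, double[]> e : logProb.entrySet()) {
            double score = logPrior.get(e.getKey());
            for (int j = 0; j < x.length; j++) score += x[j] * e.getValue()[j];
            if (score > bestScore) { bestScore = score; best = e.getKey(); }
        }
        return best;
    }

    public static void main(String[] args) {
        TinyNaiveBayes nb = new TinyNaiveBayes();
        nb.fit(new int[][] {{3, 0}, {4, 1}, {0, 5}, {1, 4}}, new int[] {0, 0, 1, 1}, 1.0);
        System.out.println(nb.predict(new int[] {5, 0})); // prints 0
        System.out.println(nb.predict(new int[] {0, 6})); // prints 1
    }
}
```

Whether copying the math out like this beats just running Spark in local mode depends on your constraints; as noted, it can feel like a hack.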
Re: Using spark MLlib without installing Spark
Bowen,

One project to look at could be spark-notebook: https://github.com/andypetrella/spark-notebook
It uses Spark in the way you intend to use it.

Kind regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/

Confidentiality: This communication is intended for the above-named person and may be confidential and/or legally privileged. If it has come to you in error you must take no action based on it, nor must you copy or show it to anyone; please delete/destroy and inform the sender immediately.
Re: Using spark MLlib without installing Spark
Thanks Rad for the info. I looked into the repo and saw some .snb files using Spark MLlib. Can you give me a more specific place to look for how to invoke the MLlib functions? What if I just want to invoke some of the ML functions in my HelloWorld.java?
Re: Using spark MLlib without installing Spark
Bowen,

What Andy is doing in the notebook is a slightly different thing. He's using sbt to bring in all the Spark jars (core, mllib, repl, what have you); you could use Maven for that. He then creates a REPL and submits all the Spark code into it.
Pretty sure the Spark unit tests cover similar use cases, maybe not MLlib per se, but this kind of submission.

Kind regards,
Radek Gruchalski
ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/
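For reference, pulling the Spark jars in with sbt as described might look like the fragment below (a sketch; the 1.5.2 version is illustrative of the time of this thread, and `%%` appends the Scala version to the artifact name):

```scala
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.5.2",
  "org.apache.spark" %% "spark-mllib" % "1.5.2"
)
```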
Using spark MLlib without installing Spark
Hi folks,
I am a big fan of Spark's MLlib package. I have a Java web app where I want to run some ML jobs inside the web app. My question is: is there a way to just import the spark-core and spark-mllib jars to invoke my ML jobs without installing the entire Spark package? All the tutorials related to Spark seem to indicate installing Spark is a precondition for this.

Thanks,
Bowen
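Declaring the two artifacts as ordinary dependencies is indeed possible; a Maven fragment for a Java web app might look like this (a sketch; the 1.5.2 version and the `_2.10` Scala suffix are illustrative of the Spark releases current at the time of this thread):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.10</artifactId>
  <version>1.5.2</version>
</dependency>
```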
Re: Using spark MLlib without installing Spark
You can use MLlib and Spark directly without "installing anything". Just run Spark in local mode.
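A minimal sketch of that local-mode approach, assuming only the spark-core and spark-mllib jars are on the classpath (the `HelloMLlib` class and the toy data are made up for illustration; `NaiveBayes.train` and the other calls are the 1.x-era `spark.mllib` API):

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.classification.NaiveBayes;
import org.apache.spark.mllib.classification.NaiveBayesModel;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.mllib.regression.LabeledPoint;

public class HelloMLlib {
    public static void main(String[] args) {
        // "local[*]" runs Spark inside this JVM: no cluster, no installation,
        // just the spark-core and spark-mllib jars as dependencies.
        SparkConf conf = new SparkConf().setAppName("HelloMLlib").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<LabeledPoint> training = sc.parallelize(Arrays.asList(
                new LabeledPoint(0.0, Vectors.dense(3.0, 0.0)),
                new LabeledPoint(1.0, Vectors.dense(0.0, 4.0))));

        // lambda = 1.0 is Laplace smoothing
        NaiveBayesModel model = NaiveBayes.train(training.rdd(), 1.0);
        System.out.println(model.predict(Vectors.dense(5.0, 0.0)));

        sc.stop();
    }
}
```

The same `main` could live inside a web app's service layer, which is what the original question was after.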