That's confusing. It seems to me the breeze dependency has been compiled
with Java 6, since the MLlib tests passed fine for me with Java 6.
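
For reference, java.util.BitSet.valueOf(long[]) only appeared in Java 7, so any source calling it cannot compile against Java 6. The same behavior is easy to reproduce by hand if needed; a rough sketch (the BitSetCompat name is mine, not breeze's):

```scala
import java.util.BitSet

object BitSetCompat {
  // Java-6-compatible stand-in for java.util.BitSet.valueOf(Array[Long]),
  // which only exists since Java 7: bit i of the result is set exactly when
  // the i-th bit of the little-endian long[] word mask is set.
  def valueOf(words: Array[Long]): BitSet = {
    val bs = new BitSet(words.length * 64)
    for (w <- words.indices; b <- 0 until 64)
      if ((words(w) & (1L << b)) != 0L) bs.set(w * 64 + b)
    bs
  }
}
```

With something like that, breeze's java.util.BitSet.valueOf(bs.toBitMask) could become BitSetCompat.valueOf(bs.toBitMask) on a Java 6 toolchain.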


On Sun, Apr 6, 2014 at 12:00 PM, Debasish Das <debasish.da...@gmail.com> wrote:

> Hi Koert,
>
> How do I specify that in sbt ?
>
> Is this the correct way ?
>   javacOptions ++= Seq("-target", "1.6", "-source", "1.6")
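
Those flags look right for javac, but note they only cover the Java sources, and a -target flag alone does not catch references to Java-7-only APIs. A hedged sketch of the relevant build.sbt lines (the bootclasspath path is illustrative):

```scala
// build.sbt sketch -- emit Java-6 bytecode from both compilers.
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
scalacOptions += "-target:jvm-1.6"

// -target alone does not stop you referencing Java-7-only APIs such as
// BitSet.valueOf; to catch those at compile time, compile against a
// JDK 6 rt.jar as well (path below is illustrative):
// javacOptions ++= Seq("-bootclasspath", "/usr/lib/jvm/java-6/jre/lib/rt.jar")
```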
>
> The breeze project, for example, compiles fine with jdk7 but fails with
> jdk6. The function it fails on:
>
> [error] /home/debasish/github/breeze/src/main/scala/breeze/util/package.scala:200: value valueOf is not a member of object java.util.BitSet
> [error]       java.util.BitSet.valueOf(bs.toBitMask)
>
> is not available in jdk6...
>
> http://docs.oracle.com/javase/6/docs/api/java/util/BitSet.html
>
> I have no clue how target 1.6 solves the issue... are you saying jdk7
> will put in a function that's closest to java.util.BitSet.valueOf?
>
> Thanks.
> Deb
>
>
>
> On Sun, Apr 6, 2014 at 8:41 AM, Koert Kuipers <ko...@tresata.com> wrote:
>
> > Classes compiled with java7 run fine on java6 if you specified "-target
> > 1.6" (as long as they only use classes and methods that exist in java 6).
> > However, if that's the case, you should generally also be able to compile
> > them with java 6 just fine.
> >
> > Something compiled with java7 with "-target 1.7" will not run on java 6.
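
What the JVM actually checks at load time is the class-file major version (major 50 = Java 6, 51 = Java 7), so you can tell what a jar was built for by reading the header of any .class file inside it. A quick sketch:

```scala
import java.io.{DataInputStream, FileInputStream}

// Reads the class-file header: 4-byte magic 0xCAFEBABE, then the
// minor and major version shorts (major 50 = Java 6, 51 = Java 7).
object ClassVersion {
  def majorVersion(path: String): Int = {
    val in = new DataInputStream(new FileInputStream(path))
    try {
      require(in.readInt() == 0xCAFEBABE, "not a class file")
      in.readUnsignedShort() // skip minor version
      in.readUnsignedShort() // major version
    } finally in.close()
  }
}
```

For example, extract one class from the jar and call ClassVersion.majorVersion("Foo.class").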
> >
> >
> >
> > On Sat, Apr 5, 2014 at 9:10 PM, Debasish Das <debasish.da...@gmail.com> wrote:
> >
> > > With jdk7 I could compile it fine:
> > >
> > > java version "1.7.0_51"
> > > Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
> > > Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
> > >
> > > What happens if I, say, take the jar and try to deploy it on the ancient
> > > centos6 default jdk on the cluster?
> > >
> > > java -version
> > > java version "1.6.0_31"
> > > Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
> > > Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)
> > >
> > > Breeze compilation also fails with jdk6 and runs fine with jdk7, and the
> > > breeze jar is already included in spark mllib with Xiangrui's sparse
> > > vector checkin....
> > >
> > > Does that mean that classes compiled and generated using jdk7 will run
> > > fine on jre6?
> > >
> > > I am confused
> > >
> > >
> > > On Sat, Apr 5, 2014 at 3:09 PM, Sean Owen <so...@cloudera.com> wrote:
> > >
> > > > Will do. I'm just finishing a recompile to check for anything else
> like
> > > > this.
> > > >
> > > > The reason is that the tests ran with Java 7 (as lots of us do,
> > > > including me), so they used the Java 7 classpath and found the class.
> > > > It's possible to use Java 7 with the Java 6 -bootclasspath, or just
> > > > use Java 6.
> > > > --
> > > > Sean Owen | Director, Data Science | London
> > > >
> > > >
> > > > On Sat, Apr 5, 2014 at 11:06 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> > > > > If you want to submit a hot fix for this issue specifically, please
> > > > > do. I'm not sure why it didn't fail our build...
> > > > >
> > > > >
> > > > > On Sat, Apr 5, 2014 at 2:30 PM, Debasish Das <debasish.da...@gmail.com> wrote:
> > > > >
> > > > >> I verified this is happening for both CDH4.5 and 1.0.4... My deploy
> > > > >> environment is Java 6... so Java 7 compilation is not going to help...
> > > > >>
> > > > >> Is this the PR which caused it ?
> > > > >>
> > > > >> Andre Schumacher
> > > > >>
> > > > >>     fbebaed    Spark parquet improvements    A few improvements to the Parquet support for SQL queries: - Instead of files a ParquetRelation is now backed by a directory, which simplifies importing data from other sources - InsertIntoParquetTable operation now supports switching between overwriting or appending (at least in HiveQL) - tests now use the new API - Parquet logging can be set to WARNING level (Default) - Default compression for Parquet files (GZIP, as in parquet-mr) Author: Andre Schumacher &...    2 days ago    SPARK-1383
> > > > >>
> > > > >> I will go to a stable checkin before this
> > > > >>
> > > > >>
> > > > >>
> > > > >>
> > > > >> On Sat, Apr 5, 2014 at 2:22 PM, Debasish Das <debasish.da...@gmail.com> wrote:
> > > > >>
> > > > >> > I can compile with Java 7...let me try that...
> > > > >> >
> > > > >> >
> > > > >> > On Sat, Apr 5, 2014 at 2:19 PM, Sean Owen <so...@cloudera.com> wrote:
> > > > >> >
> > > > >> >> That method was added in Java 7. The project is on Java 6, so I think
> > > > >> >> this was just an inadvertent error in a recent PR (it was the 'Spark
> > > > >> >> parquet improvements' one).
> > > > >> >>
> > > > >> >> I'll open a hot-fix PR after looking for other stuff like this that
> > > > >> >> might have snuck in.
> > > > >> >> --
> > > > >> >> Sean Owen | Director, Data Science | London
> > > > >> >>
> > > > >> >>
> > > > >> >> On Sat, Apr 5, 2014 at 10:04 PM, Debasish Das <debasish.da...@gmail.com> wrote:
> > > > >> >> > I am synced with apache/spark master but am getting an error in
> > > > >> >> > spark/sql compilation...
> > > > >> >> >
> > > > >> >> > Is the master broken ?
> > > > >> >> >
> > > > >> >> > [info] Compiling 34 Scala sources to /home/debasish/spark_deploy/sql/core/target/scala-2.10/classes...
> > > > >> >> > [error] /home/debasish/spark_deploy/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetRelation.scala:106: value getGlobal is not a member of object java.util.logging.Logger
> > > > >> >> > [error]       logger.setParent(Logger.getGlobal)
> > > > >> >> > [error]                               ^
> > > > >> >> > [error] one error found
> > > > >> >> > [error] (sql/compile:compile) Compilation failed
> > > > >> >> > [error] Total time: 171 s, completed Apr 5, 2014 4:58:41 PM
> > > > >> >> >
> > > > >> >> > Thanks.
> > > > >> >> > Deb
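
(Side note: Logger.getGlobal is indeed Java-7-only, but the GLOBAL_LOGGER_NAME constant it resolves has been in java.util.logging since Java 6, so a Java-6-safe equivalent exists. A sketch; the "parquet" logger name here is just illustrative:)

```scala
import java.util.logging.Logger

object GlobalLoggerCompat {
  // Logger.getGlobal exists only since Java 7, but the constant
  // GLOBAL_LOGGER_NAME ("global") has been in java.util.logging
  // since Java 6, so this resolves the same logger on both runtimes.
  def globalLogger: Logger = Logger.getLogger(Logger.GLOBAL_LOGGER_NAME)

  def main(args: Array[String]): Unit = {
    val logger = Logger.getLogger("parquet") // illustrative logger name
    logger.setParent(globalLogger)
    println(logger.getParent.getName) // prints "global"
  }
}
```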
> > > > >> >>
> > > > >> >
> > > > >> >
> > > > >>
> > > >
> > >
> >
>
