Hi Arnab,

It is nice of you to take responsibility (add me too). Going through
around 5 checklists to verify each software artifact, each one with path
isolation, is a bit tricky.

We need a fresh perspective from a user (someone who is not a developer,
or whose environment has no pre-configured paths). :)

I will add simple instructions for testing each artifact individually as a
starting point. Ironically, it is the small bugs that escape!
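
Something along these lines could serve as a minimal smoke test for the bin
artifact (the paths, the one-line DML script, and the class name below are
just placeholders, not an agreed-on test):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class BinReleaseSmokeTest {
  public static void main(String[] args) throws IOException, InterruptedException {
    // args[0] = path to the unzipped bin release, e.g. .../systemds-2.2.0-bin
    Path script = Files.createTempFile("smoke", ".dml");
    Files.writeString(script, "print(sum(matrix(1, rows=10, cols=10)));\n");

    // Invoke the release's launcher script on the tiny script and make sure
    // it exits cleanly (this is where the missing-jar issue would surface).
    ProcessBuilder pb = new ProcessBuilder(args[0] + "/bin/systemds", script.toString());
    pb.environment().put("SYSTEMDS_ROOT", args[0]);
    pb.inheritIO();

    int exit = pb.start().waitFor();
    if (exit != 0)
      throw new IllegalStateException("bin release smoke test failed with exit code " + exit);
    System.out.println("bin release smoke test passed");
  }
}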

Best Regards,
Janardhan

On Wed, Nov 10, 2021 at 2:33 PM arnab phani <phaniar...@gmail.com> wrote:
>
> As the Release Manager, I take responsibility for the issue. I should
> have tested all the release artifacts individually.
> The fix is now merged into the main and branch-2.2.0 branches. I will cut
> release candidates for patch 2.2.1 this afternoon (CET).
>
> Regards,
> Arnab..
>
> On Tue, Nov 9, 2021 at 2:59 PM Matthias Boehm <mboe...@gmail.com> wrote:
>
> > hmm, I just tested it again and it runs fine on a Spark cluster setup,
> > which is how I tested the artifact. For forced single-node
> > computations (as used in bin/systemds or the Python API), it runs into
> > issues. So yes, let's cut, vote on, and release a fix that moves the code
> > change that pulls in the Spark dependencies into CommonThreadPool.
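> >
> > Roughly, the idea is that nothing on the always-executed single-node path
> > should statically reference a Spark/Scala class. As an illustration only
> > (the class and method names below are made up, not the actual change):
> >
> > import java.util.concurrent.ExecutorService;
> > import java.util.concurrent.Executors;
> >
> > public final class PoolFactory {
> >   // Illustrative sketch: the Spark-backed pool lives behind a separate
> >   // holder class, so its Spark/Scala dependencies are only resolved when
> >   // that branch actually executes -- never in forced single-node runs.
> >   public static ExecutorService createPool(boolean sparkMode) {
> >     if (sparkMode)
> >       return SparkPoolHolder.create();
> >     return Executors.newCachedThreadPool();
> >   }
> >
> >   private static final class SparkPoolHolder {
> >     static ExecutorService create() {
> >       // placeholder for the Spark-dependent pool construction
> >       return Executors.newWorkStealingPool();
> >     }
> >   }
> > }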
> >
> > Regards,
> > Matthias
> >
> > On 11/9/2021 1:35 PM, Baunsgaard, Sebastian wrote:
> > > Hi Devs,
> > >
> > > If you install the official release and follow our instructions, it fails
> > because somehow not all of the required Java packages are included.
> > > This error is not caught by our automated tests, since we don't have any
> > verification tests for our binary release assets.
> > >
> > > The Python release is fine, since it includes all the Java packages.
> > >
> > > I suggest that we make a new patch release 2.2.1 (forked from the
> > official release commit), where all required Java packages are included in
> > the bin release.
> > >
> > > Best regards
> > > Sebastian
> > >
> > > steps to reproduce:
> > >
> > >    1.  Download the release <https://systemds.apache.org/download>
> > https://www.apache.org/dyn/closer.lua/systemds/2.2.0/systemds-2.2.0-bin.zip
> > >    2.  Set up the environment ... (as our docs say)
> > >       *   export SYSTEMDS_ROOT="path/to/unzipped/bin"
> > >       *   export PATH=$SYSTEMDS_ROOT/bin:$PATH
> > >    3.  Run any script, e.g. 'systemds test.dml'
> > >
> > > The error looks as follows:
> > >
> > >
> > > Me:~/temp$ systemds test.dml
> > >
> > ###############################################################################
> > > #  SYSTEMDS_ROOT= ../systemds/systemds-2.2.0-bin
> > > #  SYSTEMDS_JAR_FILE=
> > ../systemds/systemds-2.2.0-bin/lib/systemds-2.2.0.jar
> > > #  SYSDS_EXEC_MODE= singlenode
> > > #  CONFIG_FILE=
> > > #  LOG4JPROP=
> > -Dlog4j.configuration=file:/home/baunsgaard/systemds/systemds-2.2.0-bin//conf/log4j.properties
> > > #  CLASSPATH=
> > ../systemds/systemds-2.2.0-bin/lib/systemds-2.2.0.jar:../systemds/systemds-2.2.0-bin/lib/*:../systemds/systemds-2.2.0-bin/target/lib/*
> > > #  HADOOP_HOME= /home/baunsgaard/systemds/systemds-2.2.0-bin/lib/hadoop
> > > #
> > > #  Running script test.dml locally with opts:
> > >
> > ###############################################################################
> > > Executing command:     java       -Xmx4g      -Xms4g      -Xmn400m
> > -cp
> > ../systemds/systemds-2.2.0-bin/lib/systemds-2.2.0.jar:../systemds/systemds-2.2.0-bin/lib/*:../systemds/systemds-2.2.0-bin/target/lib/*
> >  
> > -Dlog4j.configuration=file:/home/baunsgaard/systemds/systemds-2.2.0-bin//conf/log4j.properties
> >  org.apache.sysds.api.DMLScript   -f test.dml   -exec singlenode
> > >
> > > Exception in thread "main" java.lang.NoClassDefFoundError:
> > scala/Function0
> > >      at org.apache.sysds.lops.Checkpoint.<clinit>(Checkpoint.java:43)
> > >      at
> > org.apache.sysds.runtime.instructions.spark.utils.SparkUtils.<clinit>(SparkUtils.java:69)
> > >      at
> > org.apache.sysds.api.DMLScript.cleanupHadoopExecution(DMLScript.java:522)
> > >      at
> > org.apache.sysds.api.DMLScript.initHadoopExecution(DMLScript.java:494)
> > >      at org.apache.sysds.api.DMLScript.execute(DMLScript.java:402)
> > >      at org.apache.sysds.api.DMLScript.executeScript(DMLScript.java:274)
> > >      at org.apache.sysds.api.DMLScript.main(DMLScript.java:169)
> > > Caused by: java.lang.ClassNotFoundException: scala.Function0
> > >      at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
> > >      at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> > >      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
> > >      at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
> > >      ... 7 more
> > >
