I'm up for trying anything to solve this, so sure, happy to share. Sorry though, what's the "easy" way to share an image?
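One common route (my assumption of what "share your images" means here, not something settled in the thread) is to export the image to a tarball with `docker save` and re-import it with `docker load`; the image tag below is a placeholder, not the actual build tag:

```shell
# Sketch: export a local Docker image to a compressed tarball for transfer.
# "tinkerpop:build" is a placeholder tag; substitute the real image name.
docker save tinkerpop:build | gzip > tinkerpop-build.tar.gz

# transfer the file (scp, a file share, etc.), then on the receiving machine:
gunzip -c tinkerpop-build.tar.gz | docker load
```

`docker save`/`load` preserves the image layers and tags exactly, which is what you'd want when checking whether the failure is an image issue versus a local Docker environment issue.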
On Fri, Mar 2, 2018 at 3:03 PM, Robert Dale <[email protected]> wrote:

> is it possible to share your images and see if it's an image issue or a
> docker environment issue?
>
> Robert Dale
>
> On Wed, Feb 28, 2018 at 3:34 PM, Stephen Mallette <[email protected]> wrote:
>
>> I pushed the fix for master, so I don't have the console problem anymore.
>> So now, it's just the issue I guess I've always had.....
>>
>> On Wed, Feb 28, 2018 at 12:18 PM, Robert Dale <[email protected]> wrote:
>>
>>> That works for me.
>>>
>>> Robert Dale
>>>
>>> On Wed, Feb 28, 2018 at 11:23 AM, Florian Hockmann <[email protected]> wrote:
>>>
>>>> I also ran into the exact same problem in my feature branch for the
>>>> docker images, which is why I haven't added my own vote for the PR yet.
>>>> But it's good to see that it's really completely unrelated to my changes.
>>>>
>>>> Am 28.02.2018 um 17:15 schrieb Stephen Mallette:
>>>>
>>>>> I'm still a bust - same kind of error I keep having -
>>>>>
>>>>> * source: /usr/src/tinkerpop/docs/src/recipes/olap-spark-yarn.asciidoc
>>>>>   target: /usr/src/tinkerpop/target/postprocess-asciidoc/recipes/olap-spark-yarn.asciidoc
>>>>>   progress: [==================================================================>          ] 65%
>>>>> java.io.IOException: No input paths specified in job
>>>>> Type ':help' or ':h' for help.
>>>>> Display stack trace? [yN]
>>>>>
>>>>> Last 10 lines of /usr/src/tinkerpop/target/postprocess-asciidoc/recipes/olap-spark-yarn.asciidoc:
>>>>>
>>>>> gremlin> conf.setProperty('spark.executor.extraLibraryPath', "$hadoop/lib/native:$hadoop/lib/native/Linux-amd64-64")
>>>>> ==>null
>>>>> gremlin> conf.setProperty('gremlin.spark.persistContext', 'true')
>>>>> ==>null
>>>>> gremlin> graph = GraphFactory.open(conf)
>>>>> ==>hadoopgraph[gryoinputformat->gryooutputformat]
>>>>> gremlin> g = graph.traversal().withComputer(SparkGraphComputer)
>>>>> ==>graphtraversalsource[hadoopgraph[gryoinputformat->gryooutputformat], sparkgraphcomputer]
>>>>> gremlin> g.V().group().by(values('name')).by(both().count())
>>>>> gremlin>
>>>>>
>>>>> xargs: /usr/src/tinkerpop/docs/preprocessor/preprocess-file.sh: exited with status 255; aborting
>>>>>
>>>>> On Wed, Feb 28, 2018 at 10:46 AM, Robert Dale <[email protected]> wrote:
>>>>>
>>>>>> Yup, it's a step in the release docs. Once updated, master builds docs for me.
>>>>>>
>>>>>> Robert Dale
>>>>>>
>>>>>> On Wed, Feb 28, 2018 at 10:42 AM, Stephen Mallette <[email protected]> wrote:
>>>>>>
>>>>>>> ah - i forgot to do that step.....i'm running tp32 now to see if that
>>>>>>> works, but i'll fix that issue on master.
>>>>>>>
>>>>>>> On Wed, Feb 28, 2018 at 10:41 AM, Robert Dale <[email protected]> wrote:
>>>>>>>
>>>>>>>> Could it be that 'bin/gremlin.sh' is linked to a specific version? Does
>>>>>>>> this have to be updated every release?
>>>>>>>>
>>>>>>>> $ ll gremlin-console/bin/gremlin.sh
>>>>>>>> gremlin-console/bin/gremlin.sh -> ../target/apache-tinkerpop-gremlin-console-3.3.2-SNAPSHOT-standalone/bin/gremlin.sh
>>>>>>>>
>>>>>>>> Robert Dale
>>>>>>>>
>>>>>>>> On Wed, Feb 28, 2018 at 10:37 AM, Robert Dale <[email protected]> wrote:
>>>>>>>>
>>>>>>>>> I got it too. tp32: good. tp33: good. master: bad.
>>>>>>>>>
>>>>>>>>> Robert Dale
>>>>>>>>>
>>>>>>>>> On Wed, Feb 28, 2018 at 10:20 AM, Stephen Mallette <[email protected]> wrote:
>>>>>>>>>
>>>>>>>>>> I'm having no success generating docs with Docker. On master i'm currently getting:
>>>>>>>>>>
>>>>>>>>>> Starting namenodes on [localhost]
>>>>>>>>>> localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
>>>>>>>>>> localhost: starting namenode, logging to /usr/local/lib/hadoop-2.7.2/logs/hadoop-root-namenode-61cfd8f63c77.out
>>>>>>>>>> localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
>>>>>>>>>> localhost: starting datanode, logging to /usr/local/lib/hadoop-2.7.2/logs/hadoop-root-datanode-61cfd8f63c77.out
>>>>>>>>>> Starting secondary namenodes [0.0.0.0]
>>>>>>>>>> 0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
>>>>>>>>>> 0.0.0.0: starting secondarynamenode, logging to /usr/local/lib/hadoop-2.7.2/logs/hadoop-root-secondarynamenode-61cfd8f63c77.out
>>>>>>>>>> starting yarn daemons
>>>>>>>>>> starting resourcemanager, logging to /usr/local/lib/hadoop-2.7.2/logs/yarn-root-resourcemanager-61cfd8f63c77.out
>>>>>>>>>> localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
>>>>>>>>>> localhost: starting nodemanager, logging to /usr/local/lib/hadoop-2.7.2/logs/yarn-root-nodemanager-61cfd8f63c77.out
>>>>>>>>>> Gremlin REPL is not available. Cannot preprocess AsciiDoc files.
>>>>>>>>>> Untagged: tinkerpop:build-1519830232
>>>>>>>>>>
>>>>>>>>>> On that particular run, I'd just deleted all my Docker images (including
>>>>>>>>>> the linux image) and it still failed. Any clues as to what might be wrong?
>>>>>>>>>> am i the only person with a problem here?
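On the symlink point upthread: `gremlin-console/bin/gremlin.sh` points at a version-specific standalone build directory, so after a version bump it dangles until re-pointed. A minimal sketch of re-creating it (the `demo` directory below is a stand-in for the real repo layout, not from the thread):

```shell
# Sketch: re-point the version-specific gremlin.sh symlink after a version bump.
# "demo" stands in for the repo root; the target path mirrors the one in the thread.
mkdir -p demo/target/apache-tinkerpop-gremlin-console-3.3.2-SNAPSHOT-standalone/bin
touch demo/target/apache-tinkerpop-gremlin-console-3.3.2-SNAPSHOT-standalone/bin/gremlin.sh
cd demo

# -s symbolic, -f replace any existing link, -n don't follow an existing link to a directory
ln -sfn target/apache-tinkerpop-gremlin-console-3.3.2-SNAPSHOT-standalone/bin/gremlin.sh gremlin.sh
readlink gremlin.sh
```

`ln -sfn` replaces a stale link in place, which is handy when re-running this step across releases.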
