1) In the very first example under OLTP Hadoop-Gremlin, is that example now
missing a "traversal()" call?

The following is what worked for me, but it does not match the example in the
docs:

gremlin> graph=GraphFactory.open('/full/path/conf/hadoop/hadoop-gryo.properties')
==>hadoopgraph[gryoinputformat->gryooutputformat]
gremlin> g=graph.traversal()
==>graphtraversalsource[hadoopgraph[gryoinputformat->gryooutputformat], standard]
gremlin> g.V().count()
==>6

2) How do I verify that SparkGraphComputer is being used?

The example implies that I will see the following if using Spark:

==>hadoopgraph[gryoinputformat->gryooutputformat[sparkgraphcomputer]]

However, even though the following is set in the hadoop-gryo.properties file
used to open the graph:

gremlin.hadoop.defaultGraphComputer=SparkGraphComputer

what I actually see is this:

==>hadoopgraph[gryoinputformat->gryooutputformat]


3) Has the syntax of the :remote command changed?

I cannot get the :remote example shown for Spark to work.
The :remote command fails with:

gremlin> :remote connect tinkerpop.hadoop './conf/hadoop/hadoop-gryo.properties'
No such property: './conf/hadoop/hadoop-gryo.properties' for class: groovy.lang.Binding
Display stack trace? [yN] n

gremlin> :remote connect tinkerpop.hadoop "./conf/spark-gryo.properties"
No such property: "./conf/spark-gryo.properties" for class: groovy.lang.Binding
Display stack trace? [yN] n

I also tried a full path, and I created a bogus spark-gryo.properties file.

Can you please provide an example of the :remote command for this situation?


4) Can you please paste example contents of the spark-gryo.properties file
shown in the example?
Do I need both a hadoop-gryo.properties AND a spark-gryo.properties when
doing remote commands?

There did not appear to be a sample in the source or the build.
The few sample files I found included a lot of Giraph configuration, which I
am not using, and it isn't clear what is needed for Spark versus other
GraphComputers.
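For context, here is roughly what my hadoop-gryo.properties looks like now.
This is a trimmed sketch: the input location is a placeholder, and I am only
guessing which of the Spark-related lines are actually required.

gremlin.graph=org.apache.tinkerpop.gremlin.hadoop.structure.HadoopGraph
gremlin.hadoop.graphInputFormat=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoInputFormat
gremlin.hadoop.graphOutputFormat=org.apache.tinkerpop.gremlin.hadoop.structure.io.gryo.GryoOutputFormat
# Placeholder input/output locations:
gremlin.hadoop.inputLocation=tinkerpop-modern.kryo
gremlin.hadoop.outputLocation=output
gremlin.hadoop.jarsInDistributedCache=true
gremlin.hadoop.defaultGraphComputer=SparkGraphComputer
# Spark-specific settings -- unclear which of these are needed:
spark.master=local[4]
spark.serializer=org.apache.spark.serializer.KryoSerializer

Is a spark-gryo.properties just this minus the Hadoop input/output settings,
or something else entirely?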

5) If the Spark master has a URL like this: spark://test.machine.com:7077,
what is the appropriate value for spark.master in the
hadoop-gryo.properties and/or spark-gryo.properties?

I don't understand what spark.master=local[4] is supposed to mean.
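My guess is that local[4] is the standard Spark master URL meaning "run Spark
in-process with 4 worker threads", in which case I would expect to point at
the standalone cluster like this (untested guess):

# Untested guess: replace the in-process local[4] master with the
# standalone cluster's master URL:
spark.master=spark://test.machine.com:7077

Is that correct, or does SparkGraphComputer interpret spark.master differently?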

6) Can you please explain what this is doing in the examples:
        g.engine(computer)

"computer" never appears to be set in the examples, and this command
doesn't work in my environment.

gremlin> g.engine(computer)
No such property: computer for class: groovysh_evaluate
Display stack trace? [yN] n

What is "g.engine" doing that isn't already done via a config file?

7) Is the GA release candidate shipping with MapReduceGraphComputer undefined?
Its section in the docs just says:

COMING SOON


Thanks !
