Hi,

I have a very simple piece of code to try out the Ignite Spark integration.


import org.apache.ignite.spark.IgniteSparkSession;
import org.apache.spark.sql.AnalysisException;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

public class SparkTestJava {
    private static final String CONFIG = "examples/config/example-ignite.xml";
    private static final String CACHE_NAME = "testCache";
    private static final String TABLE_NAME = "table_access_master";

    public static void main(String[] args) throws AnalysisException {

        IgniteSparkSession igniteSession = IgniteSparkSession.builder()
                .appName("Spark test")
                .master("spark://192.168.1.25:7077")
                .config("spark.executor.instances", "1")
                .config("spark.cores.max", 2)
                .config("spark.submit.deployMode", "cluster")
                .config("spark.executor.memory", "3g")
                .config("spark.driver.cores", 2)
                .config("spark.executor.cores", 2)
                .config("spark.driver.memory", "1g")
                .igniteConfig("resources/node-config-spark.xml")
                .getOrCreate();

        System.out.println("List of available tables:");
        igniteSession.catalog().listTables().show();
        igniteSession.catalog().listColumns(TABLE_NAME).show();

        // Selecting data through the Spark SQL engine.
        Dataset<Row> df = igniteSession.sql("SELECT * FROM " + TABLE_NAME + " LIMIT 10");
        df.printSchema();
        df.show();
    }
}
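For reference, node-config-spark.xml is along these lines (a sketch of the client-mode config; the discovery address below is a placeholder matching my server node):

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">
    <bean class="org.apache.ignite.configuration.IgniteConfiguration">
        <!-- Spark executors join the existing Ignite cluster as clients. -->
        <property name="clientMode" value="true"/>
        <property name="discoverySpi">
            <bean class="org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi">
                <property name="ipFinder">
                    <bean class="org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder">
                        <property name="addresses">
                            <list>
                                <value>192.168.1.25:47500..47509</value>
                            </list>
                        </property>
                    </bean>
                </property>
            </bean>
        </property>
    </bean>
</beans>
```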

When I run it, I see the following error:

0/05/28 15:14:56 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.1.25, executor 0): class org.apache.ignite.IgniteIllegalStateException: Ignite instance with provided name doesn't exist. Did you call Ignition.start(..) to start an Ignite instance? [name=null]
at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1390)
at org.apache.ignite.internal.IgnitionEx.grid(IgnitionEx.java:1258)
at org.apache.ignite.Ignition.ignite(Ignition.java:489)
at org.apache.ignite.spark.impl.package$.ignite(package.scala:84)
at org.apache.ignite.spark.impl.IgniteRelationProvider$$anonfun$configProvider$1$2.apply(IgniteRelationProvider.scala:226)
at org.apache.ignite.spark.impl.IgniteRelationProvider$$anonfun$configProvider$1$2.apply(IgniteRelationProvider.scala:223)
at org.apache.ignite.spark.Once.apply(IgniteContext.scala:222)
at org.apache.ignite.spark.IgniteContext.ignite(IgniteContext.scala:144)
at org.apache.ignite.spark.impl.IgniteSQLDataFrameRDD.compute(IgniteSQLDataFrameRDD.scala:65)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Is there something I am missing?

regards
Mahesh
